WO2021173571A1 - Animal health evaluation system and method - Google Patents

Animal health evaluation system and method

Info

Publication number
WO2021173571A1
Authority
WO
WIPO (PCT)
Prior art keywords
movement
animal
score
sensor data
prescribed
Application number
PCT/US2021/019269
Other languages
French (fr)
Inventor
Danielle Simone Davyd MADELEY
Benedict Duncan Xavier LASCELLES
Original Assignee
Aniv8, Inc.
Application filed by Aniv8, Inc.
Publication of WO2021173571A1

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 29/00 - Other apparatus for animal husbandry
    • A01K 29/005 - Monitoring or measuring activity, e.g. detecting heat or mating
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1121 - Determining geometric values, e.g. centre of rotation or angular range of movement
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1123 - Discriminating type of movement, e.g. walking or running
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/45 - For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B 5/4504 - Bones
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/48 - Other medical applications
    • A61B 5/4824 - Touch or pain perception evaluation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 - Details of waveform analysis
    • A61B 5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2503/00 - Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/40 - Animals

Definitions

  • the present disclosure is directed to evaluating the health of animals and, in particular, to systems and computer-implemented methods for detecting and evaluating movement-related conditions in animals.
  • Osteoarthritis (OA), also known as degenerative joint disease (DJD), is defined as the progressive and permanent long-term deterioration of the cartilage surrounding joints.
  • Spontaneously occurring OA affects a large percentage of animals, including dogs and cats. For example, it is estimated that 30-40% of dogs and cats have OA and associated clinical signs due to pain.
  • a computer-implemented method for evaluating a movement-related condition in an animal comprises: at a computing device: receiving sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determining the two or more prescribed movement-related activities; slicing the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity; for each data slice, calculating one or more movement-related metrics; providing the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and receiving a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
  • the predicted movement score can be selected from a range of multiple movement scores.
  • the two or more movement scores can include the range of movement scores between the first movement score and the second movement score.
  • the movement base model may be a machine learning model created by utilising a supervised machine learning algorithm (such as a support vector machine (SVM) algorithm) on a set of training data, the training data including one or more movement-related metrics values calculated for multiple subjects and corresponding movement scores assigned to the multiple subjects.
  • a supervised machine learning algorithm such as a support vector machine (SVM) algorithm
  • the movement base model is preferably trained to generate a mapping function between the movement-related metrics and the two or more movement scores based on the set of training data, the movement base model configured to determine the predicted movement score by mapping the calculated movement-related metrics to a movement score of the two or more movement scores using the generated mapping function.
  • the step of determining the two or more prescribed movement-related activities preferably includes identifying the two or more prescribed movement-related activities from the sensor data using a trained activity model.
  • a processing unit (e.g., a hardware processing unit) is configured to: receive sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determine the two or more prescribed movement-related activities; slice the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity; for each data slice, calculate one or more movement-related metrics; provide the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and receive a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
  • the method may include receiving sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determining the two or more prescribed movement-related activities; and slicing the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity.
  • the computer-implemented method further includes providing the sliced sensor data to a machine learned movement base model that is trained to classify the sliced sensor data into two or more movement scores.
  • the movement scores may be associated with a movement-related condition of an animal, and a first movement score may indicate a movement- related condition in the animal and a second movement score may indicate a healthy animal.
  • the method further includes receiving a predicted movement score based on the machine learned movement base model that indicates the movement-related condition of the animal.
  • the sensor data may not be sliced based on prescribed activities. Instead, the sensor data received from the sensors may be provided directly to a machine learned movement base model that ingests the sensor data and determines a movement score for a given animal - the movement score indicating whether the animal has a movement-related condition or is healthy.
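As a concrete illustration of the claimed flow, a minimal end-to-end sketch follows; `detect_activities`, `compute_metrics` and `movement_base_model` are hypothetical stand-ins for the components described later in the disclosure, not identifiers from the patent:

```python
# Minimal sketch of the claimed evaluation pipeline (hypothetical names).
from typing import Callable, List, Tuple
import numpy as np

def evaluate_movement(
    sensor_data: np.ndarray,                  # (n_samples, n_channels) time series
    detect_activities: Callable[[np.ndarray], List[Tuple[int, int, str]]],
    compute_metrics: Callable[[np.ndarray, str], np.ndarray],
    movement_base_model,                      # trained classifier with .predict()
) -> int:
    """Return a predicted movement score for one animal."""
    # determine the prescribed movement-related activities in the data
    slices = detect_activities(sensor_data)
    # slice the data per activity instance and compute per-slice metrics
    features = [compute_metrics(sensor_data[start:end], activity)
                for start, end, activity in slices]
    # classify the calculated metrics into a movement score
    feature_vector = np.concatenate(features).reshape(1, -1)
    return int(movement_base_model.predict(feature_vector)[0])
```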
  • Fig. 1A is a schematic diagram of a diagnostic system according to some aspects of the disclosure.
  • Fig. 1B is a graphical representation of a subject on which the diagnostic system can be utilised.
  • Fig. 2 is a block diagram of a sensing unit according to some aspects of this disclosure.
  • FIG. 3 is a block diagram of a processing unit according to some aspects of the present disclosure.
  • Fig. 4 is a block diagram of an output unit according to some aspects of the present disclosure.
  • FIG. 5 is a flowchart illustrating an example method for generating a movement base model according to some embodiments of the present disclosure.
  • Fig. 6 is a flowchart illustrating an example method for generating a movement score according to some embodiments of the present disclosure.
  • Fig. 7 illustrates a portion of an example training dataset according to some embodiments of the present disclosure.
  • Fig. 8 illustrates an example dataset of a subject according to some embodiments of the present disclosure.
  • Fig. 9 illustrates an example processing device that may be used in connection with the present disclosure.
  • aspects of the present disclosure provide a diagnostic system and method that can predictably detect perturbations from an animal's normal movement.
  • the diagnostic system and method are agnostic to the weight, breed, limb length and lifestyle of the animal. They can be used to detect abnormality of the movements of animals of different breed types and sizes.
  • the disclosed systems and methods are robust against behavioural factors (such as shaking, scratching, etc.) and work across the whole range of animal behavioural phenotypes. The disclosed systems and methods can be used to diagnose medical conditions that affect an animal's movements, detect early onset of such conditions, and monitor the effectiveness of therapies for these conditions.
  • the disclosed diagnostic systems and methods calculate an overall movement score for an animal.
  • the overall movement score indicates the quality of movement of an animal and includes factors such as the ease and flow of movement and the power and efficiency of the movement. It was observed by the inventors that animals that suffer from pain or any other related conditions typically make stiff, jerky movements and/or move in a rather inefficient manner or with reduced power. Accordingly, a calculated overall movement score can indicate the amount of pain an animal may be suffering as a result of OA or other conditions that affect movement.
  • the presently disclosed diagnostic system determines certain metrics related to smoothness of motion, such as the spectral arc length of acceleration and velocity, and jerk in an animal's movements while the animal is performing certain activities. These metrics are usually considered good indicators of power and efficiency of movement, and also good indicators of smoothness of motion, movement flow and ease.
  • the measured values for these metrics are compared with a base model to arrive at a movement score for the animal.
  • the scores can be configured such that a high movement score indicates a healthy animal that does not have underlying conditions that affect the animal's movements, whereas a low movement score may indicate that the animal is suffering from some condition that affects the animal's quality of movement. It will be appreciated that the converse can also be easily configured - i.e., low scores indicating good health and high scores indicating some movement-related conditions in the subject.
  • Owners or veterinary doctors may use these movement scores to determine if the animal's health is improving when a particular treatment plan is implemented. For example, if the movement score is low, pain medication may be prescribed to the animal. Owners can then monitor the progress of their pet using the diagnostic system. An improvement in the movement score would indicate that the pain medication is working. Alternatively, if the movement scores do not improve and/or improve for a period and then start declining again, it may be that the pain medication has not worked or has stopped working and the owner may consider changing the pet's pain medication and/or taking the pet back to see a veterinary doctor.
  • FIG. 1A illustrates a diagnostic system 100 including a sensing unit 104, a processing unit 106, and an output unit 108. These units communicate with each other via one or more communication networks 110.
  • Fig. 1B is a schematic representation of a subject 102 on which the diagnostic system 100 can be utilized.
  • Fig. 1B includes a representation 200 of a subject 102 that shows various anatomical planes with respect to the subject/animal 102. These include a transverse (or axial) plane 112 that divides the body into cranial 114 and caudal 116 portions, a coronal (or dorsal) plane 118 that divides the body into dorsal 120 and ventral 122 portions, and a median plane 124 that divides the body into left and right portions.
  • the sensing unit 104 includes one or more sensors, which measure various movement-related parameters such as dynamic acceleration forces, tilt, rotational motion, angular velocity, etc.
  • the sensing unit 104 is associated with the subject 102.
  • the sensing unit 104 may be attached to or implanted in the subject using suitable means and may be positioned at one or more suitable positions.
  • the sensing unit 104 may be external to the subject 102 (for example, comprising a camera system to sense motion of subject 102, with processing means to determine movement-related parameters from the received image data).
  • the sensing unit 104 is attached to or integrated into a collar or harness worn by the subject 102.
  • the sensing unit 104 or portions thereof may be implanted in one or more parts of the animal's body.
  • where the sensing unit 104 includes sensors that are placed at a single position on the subject 102, the sensing unit 104 can be enclosed in a single housing.
  • the sensing unit 104 comprises multiple housings, each including one or more sensors.
  • in some embodiments, the sensing unit 104 includes three sensor assemblies, one of which is attached to a caudal-dorsal portion of the subject (as indicated by reference numeral 126 in Fig. 1B).
  • the sensing unit 104 includes a single sensor assembly attached to the subject's collar. It will be appreciated that these are merely examples and that in practice the number and position of the sensor assemblies may be modified without departing from the scope of the present disclosure.
  • the processing unit 106 is configured to maintain a baseline movement model and to process sensor data received from the sensing unit 104 to determine a movement score.
  • the sensing unit 104 and the processing unit 106 are part of a single device. Alternatively, these units may be separate devices/modules - e.g., the sensing unit 104 is attached to the subject 102 during use and the processing unit 106 is incorporated at a remote location - e.g., in a separate computer device executing in a veterinary hospital or clinic, or on the cloud.
  • the output unit 108 may be configured to receive the movement score from the processing unit 106 and provide this to a user of the diagnostic system 100.
  • the output unit 108 is installed and executed on a separate device, e.g., a mobile device of the user. In other embodiments, the output unit 108 is executed on the same device as the sensing unit 104 and/or the processing unit 106.
  • the sensor(s) sense various parameters associated with the subject's movements while the subject 102 engages in one or more activities. At some point, e.g., after expiry of a certain period, upon collection of a threshold amount of sensed data, or upon receiving a command from the processing unit 106, the sensing unit 104 communicates the sensed data to the processing unit 106 for further processing. Once the data is processed, the processing unit 106 communicates a movement score to the output unit 108. Each of these units will be described in detail in the following sections.
  • Fig. 2 illustrates an example sensing unit 104 according to some aspects of the present disclosure.
  • the sensing unit 104 includes one or more sensors 202 configured to measure the movement parameters and a network interface 204 configured to communicate the sensed data to the processing unit 106.
  • the sensors 202 and network interface 204 may be separate components or incorporated in a single device.
  • the sensing unit 104 includes an embedded controller 206, a memory 208 and a power source 210. These components are interconnected through a system bus 212.
  • the one or more sensors 202 includes an accelerometer 220 and a gyroscope 222.
  • An accelerometer is a device that measures acceleration. Accelerometers may be single or multiple axis accelerometers - i.e., they can measure acceleration along a single axis or multiple axes. In one example, the accelerometer 220 used in the present disclosure is a tri-axis accelerometer.
  • a gyroscope is typically a device used for measuring orientation and angular velocity. Just like accelerometers, gyroscopes can also measure orientation and angular velocity along a single axis or multiple axes.
  • the gyroscope 222 used in the present disclosure is a tri-axis gyroscope.
  • in total, six values may be measured by the sensors 202 (three linear acceleration values and three angular velocity values).
  • the sensors 202 may be configured to sense or measure these values at the same rate or at different rates. In one example, the sensing rate is 100 times per second.
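For illustration, one timestamped sample from such a tri-axis accelerometer and gyroscope pair, sensed at 100 Hz, could be represented as below; the field names and units are assumptions, not identifiers from the disclosure:

```python
# One timestamped sample from a tri-axis accelerometer and gyroscope,
# sensed 100 times per second (every 10 ms). Field names are illustrative.
from dataclasses import dataclass

@dataclass
class ImuSample:
    timestamp_ms: int   # time the measurement was taken
    ax: float           # linear acceleration, x axis (g)
    ay: float           # linear acceleration, y axis (g)
    az: float           # linear acceleration, z axis (g)
    gx: float           # angular velocity, x axis (deg/s)
    gy: float           # angular velocity, y axis (deg/s)
    gz: float           # angular velocity, z axis (deg/s)

SAMPLE_RATE_HZ = 100
SAMPLE_INTERVAL_MS = 1000 // SAMPLE_RATE_HZ   # 10 ms between samples
```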
  • depending on the metrics maintained by the processing unit 106 and utilized to calculate the movement score, one or both of these types of sensors may be utilized.
  • the network interface 204 is used to transmit data to and receive data from the processing system 106.
  • the data transmitted from the sensing unit 104 to the processing unit 106 includes sensor data captured by the sensors 202. Additionally, the sensing unit 104 may transmit information from other modules of the sensing unit - e.g., it may transmit information about its battery life to the processing unit 106, e.g., when the remaining battery life falls below a threshold amount and/or when the processing unit 106 requests information about the sensing unit's remaining battery life.
  • the data received from the processing unit 106 at the sensing unit 104 can include signals to activate or deactivate the sensors 202 (e.g., when sufficient amount of sensor data is collected and/or at certain times of the day to conserve battery power). Additionally, the data received from the processing unit 106 at the sensing unit 104 can include instructions to transfer the sensor data to the processing unit 106, instructions to delete sensor data from the sensing unit's memory 208, etc.
  • the network interface 204 includes a network interface controller (NIC) that allows the sensing unit 104 to access the network 110 and process low-level network information.
  • the NIC has a connector for accepting a cable, or an aerial for wireless transmission and reception, and the associated circuitry.
  • the memory 208 may be utilized to store recorded sensor data and configuration data for the sensing unit 104. Access to the memory 208 is controlled by the embedded controller 206.
  • the memory can include one or more memory chips or modules.
  • the memory 208 can be arranged into memory address blocks dedicated to certain purposes, e.g. storage of configuration parameters and storage of sensor data, these being the two of primary concern here.
  • the memory can also have a reserved, secure region that is not accessible by the user or erasable during firmware updates.
  • the power supply module 210 provides power to the various modules of the sensing unit 104 during operation.
  • the power supply module 210 includes a battery, such as a lithium polymer battery. Power is supplied to the battery via a power connection, which in this case can come from an external source.
  • the battery recharges itself using alternative power means, such as movement, solar energy etc.
  • the battery is charged via direct external power supplies - e.g., via wired or wireless charging.
  • the sensing unit 104 includes a circuit (not illustrated) connected to a power pin of a charging connector, such as a USB or mini USB connector.
  • the sensing unit 104 includes a wireless power receiver (not illustrated) which can wirelessly connect with a wireless transmitter.
  • the power which is delivered to the power supply module 210 in any of these manners may be converted to an appropriate voltage by a DC-DC converter, and charge may be delivered to the battery in accordance with a battery charge management scheme implemented by the battery charging circuit.
  • the embedded controller 206 is configured to communicate with the power supply module 210 to receive the charge state of the battery. Based on this information, the embedded controller 206 may be configured to perform a number of functions - e.g., the embedded controller 206 may enter a low power or sleep mode when the battery power is below a threshold level, and it may generate an alert when the battery power falls below another threshold level, etc.
  • the sensing unit 104 also includes a location detection module (not illustrated) such as a GPS receiver. In some aspects, instead of or in addition to a GPS receiver, the location detection module includes an ultra-wideband (UWB) positioning system.
  • UWB positioning systems are known to be able to pinpoint location in real time to within 20 centimetres or less.
  • the accelerometer or gyroscope is used in combination with the network interface 204 to accurately and reliably identify indoor locations of the subject.
  • any other positioning system such as Bluetooth that can provide accurate positioning information (with an accuracy of a few centimetres) indoors may be utilized without departing from the scope of the present disclosure.
  • the sensing unit 104 is depicted as a single block diagram. As described previously, in some embodiments, the entire sensing unit 104 is contained in a single housing. Alternatively, the sensing unit 104 may be distributed, having multiple housings coupled to each other through wires or wirelessly. For instance, the sensors 202 are individually mounted on the collar of the subject 102 whereas the other modules are mounted on a harness and positioned on the subject's back.
  • the processing unit 106 performs various operations. For example, and as described further below, the processing unit 106 operates to: receive sensor data from the sensing unit 104, process the received sensor data to identify one or more activities the subject may be engaged in, process the received sensor data to determine a movement score, maintain and train a movement base model, and communicate with the sensing unit 104 and the output unit 108.
  • the processing unit 106 is a computer processing system.
  • Fig. 3 provides one example of a suitable computer processing system that can be used as the processing unit 106.
  • the processing unit 106 includes a processor 302.
  • the processor 302 is in communication with computer readable system memory 308 (e.g. a read only memory storing a BIOS for basic system operations), computer readable volatile memory 310 (e.g. random access memory such as DRAM modules), and computer readable non-transient memory 312 (e.g., one or more hard disk drives or NVRAM flash modules, such as NAND flash or other flash technology).
  • Instructions and data for controlling the processor 302 are stored in the memory 308, 310, and 312.
  • Non-transient memory 312 provides data storage for a database 304 for storing data associated with the programs.
  • the database 304 may store an activity model and a movement base model.
  • the database 304 may also store user registration details and passwords for users accessing the system. It may also store information related to the subjects being assessed, such as history of movement scores calculated for the animal, animal breed and size, owner details, veterinary specialist details, etc.
  • the database 304 may alternatively be stored on external computer readable storage accessible by the processing unit 106 (via wired, wireless, direct or network connection).
  • the processing unit 106 also includes one or more communication interfaces 314. Communications interfaces 314 are operated to provide wired or wireless connection to the communication network(s) 110. Via the communication interface(s) 314 and network(s) 110, processing unit 106 can communicate with other computer systems and electronic devices connected to the network 110. Such systems include, for example, the sensing unit 104 and the output units 108. This communication enables the processing unit 106 to send control messages, such as activate and deactivate signals to the sensing unit 104, and data such as movement scores and other related information about a particular animal to the output unit 108. This communication also enables the processing unit 106 to receive data, such as sensed data from the sensing unit 104 and user identification details from the output device 108.
  • the processing unit 106 is configured by executing software.
  • Software in the form of programs or modules, is stored in non-transient memory 312 and includes computer readable instructions and data.
  • the instructions and data are read into system memory (e.g. 308) and executed by the processor 302 to cause the processing unit 106 to provide the various functions and perform the various operations described herein.
  • One such module is an activity detection module 316 that is configured to process data received from the sensing unit 104 and identify a particular activity the corresponding animal may be engaged in based on the data in certain embodiments. For example, the activity detection module 316 can access sensor data and/or location data to determine whether the animal is walking, trotting, climbing up or down stairs, standing or sitting.
  • the memory 312 also includes a calculator module 318, which is configured to calculate one or more movement-related metrics associated with a particular identified activity and to determine a movement score for a particular set of sensor data.
  • the processing unit 106 may include multiple computer systems (e.g. multiple computer servers) with, for example, a load balancer operating to direct traffic to/from a given computer system of the processing unit 106.
  • multiple computer systems may be utilized to perform separate functions of the processing unit 106.
  • the output unit 108 is, typically, a personal computing device owned by a user, such as a pet owner or a veterinary specialist.
  • the output unit 108 may, for example, be a mobile phone, a tablet, a watch, or any other portable electronic device capable of communicating with the processing unit 106 and displaying information.
  • the output unit 108 includes an embedded controller 402, having a processor 405 coupled to an internal storage module 409.
  • a diagnostic application 410 is installed and stored in the memory 409.
  • the diagnostic application 410 may provide client- side functionality of the diagnostic system 100.
  • the diagnostic application 410 may be a general web browser application (such as Chrome, Safari, Internet Explorer, Opera, or an alternative web browser application) which accesses the processing unit 106 via an appropriate uniform resource locator (URL) and communicates with the processing unit 106 via general world-wide-web protocols (e.g. http, https, ftp).
  • the diagnostic application 410 may be a specific application programmed to communicate with the processing unit 106 using defined application programming interface (API) calls.
  • when active, the diagnostic application 410 includes instructions which are executed to perform various functions. Example functions include: a) launching and running the diagnostic application 410; b) capturing or otherwise enabling entry of a user or animal identifier (i.e. an identifier that uniquely identifies a particular animal associated with the diagnostic system 100, e.g. microchip scan data); c) receiving movement scores from the processing unit 106 once the movement scores are calculated; d) retrieving history data associated with the animal identifier - the history data indicating the progress of the animal over time; e) generating alerts/notifications when movement scores indicate that the health of the animal has deteriorated; and f) generating alerts/notifications when the animal's movement scores drop below/exceed a threshold score, indicating severe movement-related issues.
  • the output unit 108 includes a display 414 (e.g. touch screen display, LCD display, LED display, or other display device).
  • the display 414 is configured to display an interface for the diagnostic application 410 in accordance with instructions received from the embedded controller 402, to which the display is connected.
  • An audio device 404 e.g. a speaker, headphones, or other audio device may also be provided, the audio device 404 being configured to output sound in accordance with instructions received from the embedded controller 402.
  • the output unit 108 may also include one or more user input devices.
  • the user input devices may include a touch sensitive panel physically associated with the display 414 to collectively form a touch-screen 412.
  • Other user input devices may also be provided, such as a microphone (not illustrated) for voice commands or a joystick/thumb wheel (not illustrated) for easily navigating menus.
  • the output unit 108 further includes a communication interface 408 that allows the output unit 108 to wirelessly communicate with the processing unit 106.
  • Examples of wireless connection include High Speed Packet Access (HSPA+), 4G Long-Term Evolution (LTE), Mobile WiMAX, Wi-Fi (including protocols based on the IEEE 802.11 family of standards), Infrared Data Association (IrDA), and Bluetooth®.
  • This section describes computer-implemented methods for generating and maintaining a movement base model and computer-implemented methods for displaying a movement score for an animal's movements.
  • the methods will be described with reference to flowcharts of Figs. 5 and 6, which illustrate processing performed by the diagnostic system 100.
  • method 500 depicts a process for generating and maintaining the movement base model
  • method 600 depicts a process performed by the various units of the diagnostic system 100 to generate and display the movement score.
  • the movement base model is a machine learned model that may be generated based on training data.
  • the method 500 commences at step 502, where sensor data is generated.
  • the sensor data is generated by attaching sensing units 104 on multiple subjects of different sexes, sizes, weights, breeds, and detailed health phenotypes.
  • once the subjects are selected and set up with the sensing units 104, they are made to perform certain activities - such as ascending or descending stairs, running, trotting, walking, and transitioning from a standing position to a sitting position and vice versa.
  • the various acceleration, orientation, and angular velocity measurements from the sensing units 104 are recorded while the subjects are performing each of these activities.
  • the sensor data is communicated to the processing unit 106.
  • the sensor data can include nine sensor values per instance per sensing unit 104, including three values related to acceleration in the 3 axes, three values related to angular velocity in the 3 axes, and three values related to orientation relative to magnetic north and gravity.
  • the sensor data may be time-stamped - i.e., the sensor values measured by the sensors may be associated with the time at which each value is sensed. For example, if the sensors 202 are configured to measure the movement parameters every 10 milliseconds, the sensor values may be separated by 10ms intervals and may include a timestamp for the date and time the measurement was taken.
  • this received sensor data is processed.
  • the sensor data is sliced based on activities.
  • the processing unit 106 may be configured to first identify activities the corresponding subjects were engaged in based on the received sensor data. In certain embodiments, this is done manually. For example, the subjects are recorded while they are engaged in the various activities of interest. The recording is then synchronized with the sensor data and a person manually selects start and end times for identified activities of interest and then labels the corresponding sensor data based on the identified activity.
  • the activities are identified automatically by the processing unit 106 (and in particular, the activity detection module 316) utilizing machine- learning models such as convolutional neural networks (CNN).
  • Convolutional neural networks are a class of artificial neural networks (ANN). These networks are built of interconnected artificial nodes, called ‘neurons', that generally mimic a biological neural network. Typically, the network includes a set of adaptive weights, i.e., numerical parameters that are tuned by training the network to perform certain complex functions. CNNs expand upon ANNs to include several convolution and pooling layers to reduce the total dimensionality and complexity of the network. Training an ANN/CNN essentially means selecting one model from a set of allowed models that minimizes a cost criterion. There are numerous algorithms available for training neural network models; most of them can be viewed as a straightforward application of optimization theory and statistical estimation.
  • the activity detection module 316 trains the CNN to identify one or more preselected activities from received sensor data and identify which portions of the sensor data correspond to the identified activities.
  • the CNN is trained by first generating an appropriate amount (such as several hundred hours) of sensor and corresponding recording data of different types, breeds, sizes and/or phenotypic health status of animals engaged in the pre-selected activities. Subsequently, the sensor data is tagged, i.e., each set of sensor values is labelled based on the corresponding identified activity (e.g., from the recording). Next, the labelled data is fed to the CNN, which is trained to estimate the activity label of sensor data based on the values of the sensor data. During the training process, sensor values may be fed to the processing unit 106 and based on the weights of the neural networks, an activity type is predicted.
  • if the predicted activity type is incorrect, the CNN changes its weights to be more likely to produce the correct output. This process is repeated numerous times with multiple sensor values, until the CNN can correctly determine the output most of the time. It will be appreciated that the more the process is repeated, the more accurate the CNN will become.
  • the activity detection module 316 can also train the CNN to ignore other activities from the sensor data, for example, an animal sleeping, resting, sitting, etc.
  • sensor data corresponding to the immaterial activities can be labelled as ‘irrelevant' during the training process.
  • once the CNN is fed a sufficient number of such sensor values, it is able to calculate the appropriate weights to effectively classify sensor values associated with immaterial activities as irrelevant sensor data and may be trained to discard such data. Training of CNN models is fairly well known in the art and is not described in more detail here.
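As a rough illustration of such an activity model, a small 1-D CNN over windows of the six IMU channels might look as follows in PyTorch; the topology, sizes and the use of a dedicated 'irrelevant' class are assumptions, since the disclosure does not specify an architecture:

```python
# Minimal 1-D CNN for activity labelling, sketched in PyTorch.
import torch
import torch.nn as nn

class ActivityCnn(nn.Module):
    def __init__(self, n_channels: int = 6, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                 # pooling reduces dimensionality
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # one output class can be reserved for 'irrelevant' activity
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, window_length) windows of IMU samples
        return self.classifier(self.features(x).squeeze(-1))

model = ActivityCnn()
logits = model(torch.randn(8, 6, 200))       # 8 windows of 2 s at 100 Hz
```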
  • the activities to be identified may be preselected based on the type of animal being evaluated. For example, if the diagnostic system 100 is configured to diagnose the condition of dogs, one or more of the following activities are pre-selected for training the CNN - walking, ascending and descending stairs, transitioning from a sitting to a standing position, and running. Alternatively, if the diagnostic system 100 is configured to diagnose the condition of cats, one or more of the following activities are pre-selected for training the CNN - half- jumping down, jumping across, jumping down, jumping up, trotting, and walking. In certain embodiments, the diagnostic system 100 may be configured to detect movement-related conditions in both cats and dogs.
  • the activity detection module 316 may utilize two different CNNs - one configured to identify the types of activities for dogs and the other configured to identify the types of activities for cats.
  • the animal category type is provided to the activity detection module 316 before step 506 commences and it can then select the appropriate CNN to use at this step.
  • once the activity detection module 316 has a trained activity model, it is ready for use in the diagnostic system 100. It receives sensor data from the various sensing units 104, labels the sensor data and slices the sensor data based on the identified activities. For example, if the activity detection module 316 is given 10 minutes' worth of sensor data of a subject alternately ascending and descending stairs, it is configured to slice the sensor data into the two identified activities and further into the various instances of each activity, such that each slice of data corresponds to a particular instance of a particular activity being performed. The activity detection module 316 may also discard sensor data that does not correspond to any of these identified activities - e.g., sensor data collected when the subject is resting between subsequent instances of ascending and/or descending the stairs.
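Once per-sample activity labels are available, the slicing step could be sketched as below; the label strings are illustrative assumptions:

```python
# Sketch: group per-sample activity labels into contiguous slices and
# drop spans labelled 'irrelevant'.
from itertools import groupby

def slice_by_activity(labels):
    """labels: per-sample activity strings -> [(start, end, activity)]."""
    slices, pos = [], 0
    for activity, run in groupby(labels):
        length = len(list(run))
        if activity != "irrelevant":
            slices.append((pos, pos + length, activity))
        pos += length
    return slices

print(slice_by_activity(
    ["stairs_up"] * 4 + ["irrelevant"] * 2 + ["stairs_down"] * 3))
# [(0, 4, 'stairs_up'), (6, 9, 'stairs_down')]
```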
  • sensor data sliced by activity and instances is obtained.
  • metadata associated with the subject may also be appended to the sliced sensor data.
  • This metadata may include, e.g., pathology, activity type, repeat number for each activity, etc.
  • one or more movement-related metrics are calculated.
  • the processing unit 106 (and more particularly the calculator module 318) calculates one or more of nine movement metrics for each data slice. These include: maximum acceleration (along the median plane); maximum acceleration (absolute magnitude); maximum angular velocity (along the median plane); maximum angular velocity (absolute magnitude); Spectral Arc Length (SPARC) of acceleration (absolute magnitude); SPARC of angular velocity (absolute magnitude); Dimensionless Jerk (DLJ); Log Dimensionless Jerk (LDLJ); and Root Mean Square (RMS) Jerk.
  • Maximum acceleration in the median plane is an analogue for ground reaction force normalized for subject weight. It shows the strongest difference during maximal effort events (e.g. jumping and landing).
  • This metric can be measured in g-force using a single axis of the accelerometer 220 on a scale of 0-4g.
  • in one example, the maximum acceleration value is selected as the acceleration value along the z axis that has the maximum value in that particular time slice. This value can be represented as $a_{max} = \max_t a_z(t)$, where $a$ is linear acceleration.
  • Maximum acceleration can be considered an analogue for the normalized power of a subject.
  • This metric is generally robust with regards to rotation of the sensing unit 104 in the anatomical axes of the subject 102 and can be measured as the magnitude of all three axes of a tri-axial accelerometer (each on a scale of 0-4g).
  • in this example, the maximum acceleration value is selected as the maximum, over that particular time slice, of the magnitude of the accelerometer's three axis readings. This value can be represented as $a_{max} = \max_t \lVert \mathbf{a}(t) \rVert$, where $\mathbf{a}$ is the linear acceleration vector.
  • Maximum angular velocity in the median plane can be considered a replacement for rate of joint motion, which has been shown to be a valuable metric for determining extent of OA in human subjects. In particular, this metric has shown to be valuable in sit-to-stand and stand- to-sit activities.
  • the value can be measured using a single axis of the gyroscope 222 in degrees per second. If the gyroscope is attached in the cranial or caudal positions on the subject 102, the x-axis can be used. Alternatively, if the gyroscope is attached to the collar of the subject 102, the y-axis of the gyroscope data can be used.
  • in one example, the maximum angular velocity (or rate of turn around the medial-lateral axis) is selected as the angular velocity value along the y axis that has the maximum value in that particular time slice. This value can be represented as $\omega_{max} = \max_t \omega_y(t)$, where $\omega$ is angular velocity.
  • Maximum angular velocity is robust against sensor positioning. This value can be measured as the magnitude of all three axes of the gyroscope 222 (in degrees per second). This allows movements of the animal in any plane to be considered, e.g. rotation of the hips in the transverse plane.
  • in this example, the maximum angular velocity value is selected as the maximum, over that particular time slice, of the magnitude of the gyroscope's three axis readings. This value can be represented as $\omega_{max} = \max_t \lVert \boldsymbol{\omega}(t) \rVert$, where $\boldsymbol{\omega}$ is the angular velocity vector.
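Taken together, the four maxima could be computed over one data slice as follows; the (ax, ay, az, gx, gy, gz) column layout and the axis choices are assumptions for illustration, following the z-axis and collar-mounted y-axis examples above:

```python
# The four maxima over one data slice, as numpy expressions.
import numpy as np

def max_metrics(slice_: np.ndarray) -> dict:
    """slice_: (n_samples, 6) array, columns = (ax, ay, az, gx, gy, gz)."""
    acc, gyr = slice_[:, 0:3], slice_[:, 3:6]
    return {
        "max_acc_median_plane": np.max(acc[:, 2]),            # z axis only
        "max_acc_magnitude": np.max(np.linalg.norm(acc, axis=1)),
        "max_angvel_median_plane": np.max(gyr[:, 1]),         # y axis (collar mount)
        "max_angvel_magnitude": np.max(np.linalg.norm(gyr, axis=1)),
    }
```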
  • Spectral Arc Length (SPARC) is a measure of movement smoothness.
  • This metric relies on changes in the Fourier spectrum of movements to quantify smoothness.
  • This metric addresses the defects of classic gait smoothness metrics such as Harmonic Ratio and newer motion smoothness metrics such as dimensionless jerk.
  • SPARC can be calculated using magnitude of acceleration and magnitude of angular velocity as its input. SPARC is unitless and the values often cannot be correlated to specific outcomes. Instead, the values for healthy and unhealthy subjects are activity specific and are measured for each activity. However, the metric is considered robust against non-kinematic factors such as limb length, movement amplitude, sensor positioning and noise.
  • This metric is calculated based on sensor data from all three axes of the accelerometer 220 to obtain acceleration-related SPARC values and from all three axes of the gyroscope 222 to obtain velocity-related SPARC values.
  • the equation for calculating SPARC for a particular instance of an activity is given by equation (1): $\text{SPARC} = -\int_0^{\omega_c} \sqrt{\left(\frac{1}{\omega_c}\right)^2 + \left(\frac{d\hat{V}(\omega)}{d\omega}\right)^2}\, d\omega$ (1), where $\hat{V}(\omega) = V(\omega)/V(0)$ is the real Fourier spectrum normalised by the DC power, and $\omega_c$ is the minimum of the max cut-off frequency and the last frequency whose normalised magnitude remains above the cut-off threshold.
  • SPARC is calculated with $v(t)$ as either the magnitude of linear acceleration or the magnitude of angular velocity.
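For illustration, a discretised sketch of equation (1) in Python follows, modelled on the published SPARC algorithm; the padding level, the 10 Hz maximum cut-off and the 0.05 amplitude threshold are assumed defaults, not values from the disclosure:

```python
# Discretised sketch of the spectral arc length (equation (1)).
import numpy as np

def sparc(v: np.ndarray, fs: float, padlevel: int = 4,
          fc: float = 10.0, amp_th: float = 0.05) -> float:
    """Spectral arc length of a movement profile v(t) sampled at fs Hz."""
    nfft = int(2 ** (np.ceil(np.log2(len(v))) + padlevel))
    f = np.arange(nfft) * fs / nfft            # frequency axis
    Mf = np.abs(np.fft.fft(v, nfft))
    Mf = Mf / np.max(Mf)                       # normalise the magnitude spectrum
    sel = f <= fc                              # apply the max cut-off frequency
    f, Mf = f[sel], Mf[sel]
    above = np.nonzero(Mf >= amp_th)[0]        # adaptive cut-off: keep spectrum up
    f, Mf = f[: above[-1] + 1], Mf[: above[-1] + 1]   # to last freq above threshold
    # arc length of the normalised spectrum
    df = np.diff(f / (f[-1] - f[0]))
    dM = np.diff(Mf)
    return float(-np.sum(np.sqrt(df ** 2 + dM ** 2)))
```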
  • Dimensionless Jerk (DLJ) and Log Dimensionless Jerk (LDLJ) are metrics based on jerk - the rate of change of acceleration.
  • Both metrics are unitless, having been normalized against movement amplitude. Both these values can be calculated based on sensor data measured using all 3 axes of the accelerometer 220. Equations for calculating DLJ and LDLJ for a particular instance of an activity are given by equations (4)-(6): $v(t) = \lVert \mathbf{a}(t) \rVert$ (4); $\text{DLJ} = -\frac{(t_2 - t_1)^3}{v_{peak}^2} \int_{t_1}^{t_2} \left| \frac{d^2 v(t)}{dt^2} \right|^2 dt$ (5); and $\text{LDLJ} = -\ln \lvert \text{DLJ} \rvert$ (6), where $t_1$ and $t_2$ bound the activity instance and $v_{peak} = \max_t v(t)$.
  • Root Mean Square (RMS) Jerk is not generally considered a metric of motion smoothness. It has the advantage of being interpretable; however, the values are not normalized against movement amplitude or subject phenotype.
  • RMS Jerk can be calculated using all 3 axes of the accelerometer 220 and is represented in units of m s⁻³. Equations for calculating RMS Jerk for a particular instance of an activity are given by equations (7)-(10), where $\mathbf{j}(t) = d\mathbf{a}(t)/dt$ is jerk and $\mathbf{a}$ is the acceleration vector: $\text{RMS Jerk} = \sqrt{\frac{1}{t_2 - t_1} \int_{t_1}^{t_2} \lVert \mathbf{j}(t) \rVert^2 \, dt}$.
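A sketch of the jerk-based metrics follows, using the standard dimensionless-jerk forms; applying them to the acceleration-magnitude profile (rather than a velocity profile) is an assumption, since the disclosure's exact equations are reconstructed above:

```python
# Jerk-based metrics for a movement profile, sketched with numpy.
import numpy as np

def dimensionless_jerk(v: np.ndarray, fs: float) -> float:
    """DLJ of a movement profile v(t) sampled at fs Hz (equation (5) form)."""
    dt = 1.0 / fs
    duration = len(v) * dt
    d2v = np.gradient(np.gradient(v, dt), dt)   # second time derivative of v
    return float(-(duration ** 3 / np.max(v) ** 2) * np.sum(d2v ** 2) * dt)

def log_dimensionless_jerk(v: np.ndarray, fs: float) -> float:
    """LDLJ = -ln|DLJ| (equation (6) form)."""
    return float(-np.log(abs(dimensionless_jerk(v, fs))))

def rms_jerk(acc_mag: np.ndarray, fs: float) -> float:
    """RMS jerk in m s^-3 from the acceleration-magnitude profile."""
    jerk = np.gradient(acc_mag, 1.0 / fs)       # rate of change of acceleration
    return float(np.sqrt(np.mean(jerk ** 2)))
```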
  • while the processing unit 106 computes values for these nine metrics for each data slice in the examples above, this may not necessarily be required in all implementations of the method. Instead, the processing unit 106 may compute a subset of the metrics or any other movement-related metrics without departing from the scope of the present disclosure. For example, in some implementations, the dimensionless jerk values and/or the RMS jerk values may not be calculated. Further, it may not be necessary to calculate all the metrics for all the activities. Instead, in some examples, the SPARC values may be computed only for activities such as ascending and descending stairs and may not be calculated for activities such as sit-to-stand or stand-to-sit.
  • Once one or more movement metrics are calculated for each data slice, the method proceeds to step 510, where a training data set is generated and stored.
  • the training data set is stored in a tabular fashion, where each row of data in the dataset corresponds to a single subject performing one instance of each pre-selected activity and includes the calculated metrics for a single instance of each activity, and corresponding metadata associated with the subject. Further, for each subject, and therefore for each row of data in the training dataset, a movement score is provided. This movement score is manually computed for each subject in the training data set based on information collected by veterinary specialists to define the phenotype. This movement score indicates the overall movement-related health of the subject. The higher the score, the better the condition of the subject and vice versa.
  • Fig. 7 shows a portion of an example training dataset in tabular form. It will be appreciated that this is merely an example form and that the training dataset may be stored in any other form without departing from the scope of the present disclosure.
  • Fig. 7 shows the training dataset for 8 subjects only. It will be appreciated that this is not the case in real implementations, where the training dataset can include data collected from hundreds if not thousands of subjects. Further, because of limited space on the drawing sheet, the metrics values collected for each activity are shown one below the other. The first row under each activity corresponds to a first subject and so on. In reality, however, these metric values are not stored in this fashion; rather, the metrics values computed for one instance of each activity for one subject are stored together and considered a single training “record”.
  • each row of training data is in relation to a given subject and for a given activity - namely, trotting, walking, ascending/descending stairs, and transitioning from a standing to a sitting position or vice versa. Further, each row includes the values of the nine metrics described above for each of the activities. Further, each record includes a movement score that was manually computed for the corresponding subject. In one example, the movement scores range from 1 to 6, where a movement score of 6 indicates a subject with severe movement-related issues whereas a score of 1 indicates a subject with excellent overall movement-related health.
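For illustration, one training record of the kind shown in Fig. 7 could be assembled as below; the metric and activity names are placeholders, and 9 metrics across 4 activities gives the 36 features referred to later:

```python
# Sketch: one training record = 9 metrics x 4 activities = 36 features,
# plus the manually assigned movement score (1-6). Names illustrative.
METRICS = [
    "max_acc_median", "max_acc_mag", "max_angvel_median", "max_angvel_mag",
    "sparc_acc", "sparc_angvel", "dlj", "ldlj", "rms_jerk",
]
ACTIVITIES = ["trot", "walk", "stairs", "sit_stand"]

def build_record(per_activity_metrics: dict, movement_score: int) -> list:
    """per_activity_metrics: {activity: {metric: value}} -> flat row of 37 values."""
    row = [per_activity_metrics[a][m] for a in ACTIVITIES for m in METRICS]
    return row + [movement_score]    # last column is the training label
```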
  • a movement condition model is created using this training data set.
  • the calculator module 318 may clean the training dataset. For example, it may substitute any missing data in the training dataset. In some cases, sensors may not measure accurately, and/or the calculator module 318 may not be able to calculate a metric value because of insufficient data. In such cases, some metrics values may be missing in the training dataset. To ensure that the movement condition model is not generated taking these missing values into consideration, the missing values in the training dataset are substituted. In one example, this may be done by calculating a mean value based on the other values computed for that metric and subject combination.
  • the calculator module 318 normalizes the calculated metrics values - i.e., it adjusts the metrics values measured on different scales to a notionally common scale. This prevents the model from assigning unnecessarily high weight to certain metrics. In one example, all the calculated metrics values may be normalized to fall between 0 and 1.
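A minimal sketch of the described cleaning and normalisation steps, using scikit-learn's imputer and scaler; this is one possible toolchain, not one mandated by the disclosure, and the imputer below averages per metric column rather than per metric-and-subject combination:

```python
# Mean-impute missing metric values, then min-max normalise to [0, 1].
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import MinMaxScaler
from sklearn.pipeline import make_pipeline

X = np.array([[1.2, np.nan, 300.0],
              [0.8,  4.1,   250.0]])        # toy metric matrix with a gap

prep = make_pipeline(SimpleImputer(strategy="mean"),  # fill missing values
                     MinMaxScaler())                  # scale each metric to [0, 1]
X_clean = prep.fit_transform(X)
```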
  • the processing unit 106 utilizes a supervised machine learning algorithm to create the movement condition model.
  • the aim of the training algorithm is to learn a mapping function between the input variables/features (metrics values in this case) and an output variable (the movement score in this case).
  • the goal of the algorithm is to approximate the mapping function such that when new sensor values are provided to the model, the model can predict the movement score for that data.
  • a support vector machine (SVM) algorithm may be utilized.
  • naive Bayes, k-nearest neighbour, decision trees algorithms or artificial neural networks (ANNs) may be utilized.
  • Support-vector machines are supervised machine learning models with associated learning algorithms that analyse data used for classification and regression analysis.
  • when an SVM is used for regression analysis, it is referred to as support vector regression (SVR).
  • each record may include multiple features or dimensions and therefore the space in which the records are mapped may be a multi-dimensional space and the shape identified by SVR may be a complex shape.
  • each record includes 36 features or dimensions and the training dataset includes an expected real value in the range 1-6.
  • the SVM model includes a 36-dimensional space in which each record is mapped according to the movement score associated with that record.
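A minimal sketch of fitting such a model with scikit-learn's support vector regression over 36-dimensional records; the kernel choice and the synthetic data are illustrative assumptions, not values from the disclosure:

```python
# Fit the movement base model as SVR over 36-feature records, predicting
# a movement score in the range 1-6.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X_train = rng.random((200, 36))              # 200 normalised training records
y_train = rng.integers(1, 7, 200)            # manually assigned scores 1-6

movement_base_model = SVR(kernel="rbf").fit(X_train, y_train)
predicted_score = float(movement_base_model.predict(rng.random((1, 36)))[0])
```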
  • the training data set, in addition to movement scores, also includes an indication for each subject that indicates whether the subject is healthy (i.e., does not exhibit any movement-related conditions) or unhealthy (i.e., exhibits some movement-related conditions).
  • the learning algorithm may also be configured to perform a binary classification of the sensor values as corresponding to healthy or unhealthy subjects.
  • method 600 describes a method for determining a movement score of a subject, such as a subject 102.
  • the diagnostic system 100 may be utilized by a veterinary specialist to diagnose the condition of a subject.
  • the diagnostic tool may be utilized by a pet owner.
  • the veterinary specialist or the owner may have to commence the diagnostic process - at least initially.
  • the veterinary specialist or the owner (commonly referred to as a user hereafter) may be required to install the diagnostic application 410 on the output device 108.
  • the user may have to associate the sensing unit 104 with the diagnostic application 410 and/or with the subject being diagnosed.
  • when the user registers the diagnostic application, the user creates a profile for the subject being diagnosed. This profile may include, e.g., the breed, sex, age, and name of the subject.
  • the diagnostic application 410 forwards these details to the processing unit 106 - which creates a record for the subject in the database 304 and assigns a unique identifier to the registered subject. This unique identifier is communicated to the diagnostic application 410 for storage.
  • each sensing unit 104 may be identifiable by a unique identifier associated with the sensing unit 104. This identifier may be stored in the memory 208 of the sensing units.
  • when a sensing unit 104 is initialised, it sends its unique identifier to the diagnostic application 410 (e.g., via wireless means). The diagnostic application 410 forwards this identifier to the processing unit 106, which stores the sensing unit's identifier in association with an identifier of the subject associated with the diagnostic application 410. In this manner, the processing unit 106 may be aware of all the sensing units 104 that may be active at any given time and the corresponding subjects these sensing units 104 are monitoring.
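The described identifier handshake amounts to the processing unit maintaining a mapping from sensing-unit identifiers to subject identifiers; a toy sketch, with all names hypothetical:

```python
# The processing unit keeps a map from sensing-unit ID to subject ID.
active_units: dict[str, str] = {}            # sensing_unit_id -> subject_id

def register_sensing_unit(unit_id: str, subject_id: str) -> None:
    active_units[unit_id] = subject_id       # unit is now recorded as active

def subject_for(unit_id: str) -> str:
    return active_units[unit_id]             # used when sensor data arrives

register_sensing_unit("SU-001", "subject-42")
assert subject_for("SU-001") == "subject-42"
```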
  • each of the sensing units may function autonomously - they may each transmit their unique identifiers to the diagnostic application 410 such that the processing unit 106 can store each of the multiple sensing unit's identifiers in association with the identifier of the subject associated with the diagnostic application.
  • the multiple sensing units 104 may be configured to elect a master sensing unit 104 amongst themselves. The master sensing unit 104 then communicates with the diagnostic application 410 and the processing unit 106 whereas the other sensing units 104 communicate their sensor data to the master sensing unit 104.
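A minimal sketch of this identifier bookkeeping, assuming a simple in-memory mapping on the processing-unit side; all identifiers and names are hypothetical:

```python
# Hypothetical sketch: the processing unit keeps a map from sensing-unit IDs
# to subject IDs so incoming sensor data can be filed against the right animal.
active_units: dict[str, str] = {}  # sensing-unit ID -> subject ID

def register_unit(unit_id: str, subject_id: str) -> None:
    """Called when a sensing unit initialises and announces its identifier."""
    active_units[unit_id] = subject_id

def subject_for(unit_id: str) -> str:
    """Look up which subject a batch of sensor data belongs to."""
    return active_units[unit_id]

register_unit("unit-A1", "subject-42")  # e.g., a collar-mounted unit
register_unit("unit-B2", "subject-42")  # e.g., a second unit on the same dog
assert subject_for("unit-B2") == "subject-42"
```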
  • a sensing unit 104 may be initialized (as described above) and attached to the subject 102.
  • Method 600 commences at step 602, where the sensing unit 104 monitors one or more movement-related parameters and stores the monitored data in the memory of the sensing unit 104. If the sensing unit 104 includes an accelerometer 220, the accelerometer measures proper acceleration during the period the accelerometer is active. Similarly, if a gyroscope is utilized, the gyroscope 222 measures the angular velocity and orientation of the gyroscope while the gyroscope is active.
  • if these sensors are tri-axis sensors, they measure acceleration along the x, y, and z axes and angular velocity and orientation about the x, y, and z axes, respectively. Data measured by these sensors 202 is stored in the memory 208.
  • if the sensing unit 104 includes additional modules, such as a location module, data collected by those modules is also stored in the memory 208.
  • the processing unit 106 receives the stored sensor data.
  • the sensing unit 104 senses data for a period of time (e.g., 1-2 days). Once this period has expired, the sensing unit 104 may stop sensing operations. Thereafter, it communicates the stored sensor data to the processing unit 106.
  • the sensing unit 104 may transfer sensor data to the processing unit 106 in near real time (e.g., if a network 110 is available between the sensing unit 104 and processing unit 106), at specific intervals (e.g., once every hour) or when certain conditions are met (e.g., the sensing unit 104 and processing unit 106 are connected via a network 110).
  • the data may either be pushed by the sensing unit 104 automatically or may be communicated to the processing unit 106 in response to receiving a request for the data. In either case, when transmitting the data to the processing unit 106, the sensing unit 104 also communicates the unique identifier of the sensing unit to the processing unit 106.
  • the processing unit 106 looks up the sensing unit's identifier in the records of active sensing units, retrieves the corresponding identifier for the subject associated with that sensing unit, and then stores the received sensor data in association with the unique identifier of the subject being diagnosed.
  • the processing unit 106 may also process the sensor data to identify one or more activities the subject 102 may be engaged in when the sensor data was recorded.
  • the activity detection module 316 is also configured to slice the sensor data at this step, such that each data slice corresponds to a particular instance of an activity. Sensor data that does not correspond to a pre-selected activity may be discarded at this step. Examples of activities include, e.g., walking, trotting/running, ascending or descending stairs, standing, sitting, etc. As described previously, in certain embodiments, the activities can be manually identified.
  • the activity detection module 316 may identify the various different activities based on received sensor data (i.e., accelerometer and/or gyroscope data). Further, the activities can be recognized, e.g., using one or more machine learning techniques and activity models as described above with respect to Fig. 5.
  • the processing unit 106 divides the sensor data into chunks, e.g., corresponding to a second of measured activity, and then utilizes the activity model to classify the activity occurring in each data chunk - e.g., the convolved acceleration and/or angular velocity values of the sensed data in a data chunk multiplied by the trained weights of the models give an output value for the likelihood of each activity. If multiple activities are returned as possible, a threshold is applied to select the most likely, or the data is discarded as inconclusive. For example, if the closest match is to the sensor values in the walk entry, that data chunk in the sensed dataset is classified as walking.
  • the processing unit 106 may also compare the identified activity of a data chunk with the identified activity of neighbouring chunks via majority vote, recurrent neural networks, long short-term memory (LSTM) networks, or similar. This can be used to rule out outliers. Data chunks that cannot be classified under any of the selected activities may be discarded. Once data chunks are classified in this manner, adjacent data chunks that correspond to the same activity are combined into a data slice.
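For illustration, the sketch below applies the majority-vote option named above to per-chunk labels and then merges adjacent chunks into slices; the labels and window size are assumptions (a recurrent network or LSTM could replace the voting step):

```python
# Hedged sketch of the chunking flow described above: smooth per-chunk
# activity labels by majority vote over neighbours, then merge adjacent
# same-activity chunks into (activity, start, end) slices.
from collections import Counter

def smooth_labels(labels: list[str], window: int = 3) -> list[str]:
    """Majority vote over a sliding window to rule out outlier chunks."""
    smoothed = []
    for i in range(len(labels)):
        lo, hi = max(0, i - window // 2), min(len(labels), i + window // 2 + 1)
        smoothed.append(Counter(labels[lo:hi]).most_common(1)[0][0])
    return smoothed

def merge_into_slices(labels: list[str]) -> list[tuple[str, int, int]]:
    """Combine adjacent chunks with the same label into data slices."""
    slices, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            slices.append((labels[start], start, i))
            start = i
    return slices

labels = ["walk", "walk", "trot", "walk", "walk", "sit", "sit"]
print(merge_into_slices(smooth_labels(labels)))  # outlier "trot" voted away
```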
  • the sensor data collected by the sensing unit 104 may be time stamped. In such cases, at step 506, the processing unit 106 may classify the data slices into distinct instances of the activity based on the timestamps.
  • the data slices labelled under the descending stairs activity can be divided into two separate data slices - one for the activity lasting 5 seconds and the other for the same activity lasting 6 seconds.
  • the labelled sliced sensor data is then forwarded to the calculator module 320.
  • the processing unit 106 determines whether a threshold number of activities and a threshold number of instances of each activity have been collected.
  • the threshold number of activities is four and the threshold number of instances of each activity is 12. It will be appreciated that these threshold numbers are configurable and may be changed based on the particular implementation. For highly accurate movement base models, fewer instances of activities may be sufficient, whereas for movement base models that have lower accuracy, a greater number of instances of each activity may be required.
  • if, at step 608, it is determined that a threshold number of activities and/or a threshold number of separate instances of an activity are not detected, the method proceeds to step 610, where the processing unit 106 may generate a suitable error message and in some cases may instruct the sensing unit 104 to continue measuring and storing sensor data.
  • the error message may indicate that insufficient sensor data is available to calculate a movement score.
  • if, at step 608, it is determined that the threshold number of activities and the threshold number of separate instances per activity are detected and collected, the method proceeds to step 612, where one or more movement-related metrics are calculated for each data slice.
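A minimal sketch of the sufficiency check at step 608, using the example thresholds given above (four activities, 12 instances of each); the labels are hypothetical:

```python
# Hedged sketch of the step-608 check: proceed to metric calculation only if
# enough distinct activities and enough instances of each were collected.
from collections import Counter

def have_enough_data(slice_labels: list[str],
                     min_activities: int = 4,
                     min_instances: int = 12) -> bool:
    counts = Counter(slice_labels)
    return (len(counts) >= min_activities
            and all(n >= min_instances for n in counts.values()))

if not have_enough_data(["walk"] * 12 + ["trot"] * 12):
    print("Insufficient sensor data is available to calculate a movement score.")
```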
  • the calculator module 320 calculates the same nine movement metrics that were calculated at step 508 of method 500.
  • maximum acceleration (along the dorsal-ventral axis); maximum acceleration (absolute magnitude); maximum angular velocity (along the median plane); maximum angular velocity (absolute magnitude); Spectral Arc Length (SPARC) of acceleration (absolute magnitude); SPARC of angular velocity (absolute magnitude); Dimensionless Jerk (DLJ); Log Dimensionless Jerk (LDLJ); and Root Mean Square (RMS) Jerk.
  • the calculator module 320 computes values for these nine metrics for each data slice. This may not be the case in all implementations of the method - instead, in other implementations, fewer or other movement metrics may be computed for a data slice. For example, in some implementations, the dimensionless jerk values and/or the RMS jerk values are not calculated. Further, it may not be necessary to calculate all the metrics for all the activities. Instead, in some examples, the SPARC values are computed only for activities such as ascending and descending stairs and may not be calculated for activities such as sit-to-stand or stand-to-sit. Further, although the calculator module 320 in this example method computes the metrics values for multiple slices related to the same activity, this may not be necessary in other examples, where the calculator module 320 computes the metrics values for a single data slice related to an activity.
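For illustration, the sketch below computes a few of these metrics over a single data slice. The SPARC implementation follows the published spectral arc length definition, and the jerk normalisation is one common convention; the patent does not reproduce its exact formulas in this excerpt, so both should be read as assumptions:

```python
# Hedged sketch of a few of the movement metrics named above, computed over
# one data slice (acceleration magnitude trace, fixed sampling rate).
import numpy as np

def sparc(signal: np.ndarray, fs: float, fc: float = 10.0) -> float:
    """Spectral Arc Length: values closer to zero indicate smoother movement."""
    n = 2 ** (int(np.ceil(np.log2(len(signal)))) + 4)   # zero-pad for resolution
    spectrum = np.abs(np.fft.rfft(signal, n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    sel = freqs <= fc
    mag = spectrum[sel] / spectrum[sel].max()            # normalised magnitude
    f_norm = freqs[sel] / fc                             # normalised frequency
    return float(-np.sum(np.hypot(np.diff(f_norm), np.diff(mag))))

def slice_metrics(acc_mag: np.ndarray, fs: float) -> dict:
    """Metrics for one slice; jerk normalisation is an assumed convention."""
    jerk = np.gradient(acc_mag, 1.0 / fs)                # d(acceleration)/dt
    duration = len(acc_mag) / fs
    dlj = -(duration ** 3 / np.max(np.abs(acc_mag)) ** 2) * np.sum(jerk ** 2) / fs
    return {
        "max_acc": float(np.max(np.abs(acc_mag))),
        "sparc_acc": sparc(acc_mag, fs),
        "rms_jerk": float(np.sqrt(np.mean(jerk ** 2))),
        "dimensionless_jerk": float(dlj),
        "log_dimensionless_jerk": float(-np.log(abs(dlj))),
    }

fs = 100.0                                               # e.g., 100 samples/s
t = np.arange(0, 3, 1 / fs)
acc = np.sin(2 * np.pi * 1.5 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
print(slice_metrics(acc, fs))
```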
  • Fig. 8 illustrates an example output of step 612 in tabular form.
  • This table 800 shows the nine metrics values calculated for the same four activities as in the training data set. Further, each activity includes 4 instances in this example. It will be appreciated that this is just an example and the number of instances can be increased/decreased in real implementations.
  • the metrics values collected for each activity are shown one below the other.
  • the first row under each activity corresponds to a first instance of that activity and so on.
  • these metric values may not be stored in this fashion, but the metrics values computed for one instance of each activity are stored together and considered a single record.
  • the size of this record is similar to the size of each record in the training dataset and includes the same metrics and activities.
  • the record includes 36 metric values.
  • sensor data from one sensing unit 104 is utilized. If multiple sensing units 104 are applied - e.g., one on the collar, one in the cranial region and one in the caudal region, separate metrics values are computed based on the sensor data from each of the three sensing units 104. In such cases, the number of metrics values per record is 108.
  • the method determines a movement score for the animal being evaluated.
  • the processing unit 106 maintains a movement base model.
  • the output from step 612 is fed to the model and the model predicts a movement score for the output. This will be described with reference to the training set depicted in Fig. 7 and the output depicted in Fig. 8 for clarity.
  • each record of the output shown in Fig. 8 is fed to the movement base model which predicts a movement score for the record using the same function that was used to create the movement base model.
  • the calculator module uses the function shown in equation 11 above to determine a predicted movement score for the record.
  • x is the input record (including the 36 points) and a is the set of support vectors that constitutes the movement base model. This process is repeated for each record in the output - i.e., a movement score is predicted for each record using the function shown in equation 11.
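Equation 11 itself is not reproduced in this excerpt; for a kernelised SVR the prediction for a new record x typically takes the form f(x) = Σ_i α_i K(x_i, x) + b over the support vectors x_i. The sketch below evaluates that form manually from a fitted scikit-learn model (synthetic data, illustrative only) and checks it against the library's own prediction:

```python
# Hedged sketch: manual evaluation of an RBF-kernel SVR decision function
# from its support vectors, dual coefficients, and intercept.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X, y = rng.normal(size=(120, 36)), rng.uniform(1, 6, size=120)
model = SVR(kernel="rbf", gamma="scale").fit(X, y)
gamma = 1.0 / (X.shape[1] * X.var())   # how sklearn resolves gamma="scale"

def predict_manually(x: np.ndarray) -> float:
    """f(x) = sum_i alpha_i * K(sv_i, x) + b, with an RBF kernel K."""
    k = np.exp(-gamma * np.sum((model.support_vectors_ - x) ** 2, axis=1))
    return float((model.dual_coef_ @ k)[0] + model.intercept_[0])

x_new = rng.normal(size=36)
assert np.isclose(predict_manually(x_new), model.predict(x_new[None, :])[0])
```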
  • the movement scores predicted for each record can then be aggregated based on a calculated mean or median of the individual movement scores. Further, in some examples, a confidence interval may be calculated based on the individual scores. This allows the calculator module 320 to handle outliers caused by unexpected kinematic events such as the subject tripping, shaking, or slowing down mid-activity due to distraction, etc. Further, in some examples, the mean/median and confidence intervals can be computed between different sets of data for the same subject - e.g., data collected at different times of the day and/or week.
  • this can be done using a rolling moving average or an exponentially-weighted moving average (with a window length possibly up to 24h) to show the progression of movement scores over time.
  • the progression of movement scores over this window can also be used to stabilise the predicted movement scores against energy and fatigue levels in the subject changing over the day. This can also account for differences in the animals' diurnal cycle and/or instances where the subject may be experiencing Frenetic Random Activity Periods (FRAPS).
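A sketch of this aggregation and smoothing, with illustrative scores and an assumed smoothing factor; the median handles outlier records (e.g., a trip or shake) and the exponentially-weighted average tracks progression over time:

```python
# Hedged sketch: robust aggregation of per-record scores, plus an EWMA over
# successive aggregated scores. Scores and alpha are illustrative only.
import numpy as np

record_scores = np.array([4.2, 4.0, 1.3, 4.4, 4.1])   # 1.3: e.g., dog tripped
aggregate = float(np.median(record_scores))            # robust aggregate score

def ewma(scores: list[float], alpha: float = 0.2) -> list[float]:
    """Exponentially-weighted moving average over successive movement scores."""
    out = [scores[0]]
    for s in scores[1:]:
        out.append(alpha * s + (1 - alpha) * out[-1])
    return out

daily_scores = [3.1, 3.3, 3.0, 3.6, 3.8, 4.0, 4.2]     # one score per day
print(aggregate, ewma(daily_scores))
```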
  • the movement score is stored against the identifier of the subject along with a date and time stamp of when the movement score was generated. The movement score can then be forwarded to the diagnostic application 410 associated with the subject at step 616.
  • the diagnostic application 410 may display the movement score on a display of the output unit 108.
  • the processing unit 106 pushes the movement score to the diagnostic application 410 when the score is generated.
  • the diagnostic application 410 requests the processing unit 106 to forward the latest movement score for the subject associated with the diagnostic application 410.
  • the diagnostic application 410 maintains a database of movement scores received for the subject over time. It may compare the received movement score with one or more immediately preceding scores and generate one or more insights based on the movement score. For example, if the movement score has improved from the previously generated score and the user of the application has input new medication details in the diagnostic application 410, the diagnostic application 410 may display the movement score and a message stating, e.g., “[the subject] appears to be doing much better - looks like the new medication is taking effect.”
  • the diagnostic application 410 may display the movement score and a message stating, e.g., “[the subject's] condition appears to be deteriorating. May wish to book an appointment with the Vet”.
  • the diagnostic application 410 may be configured to display a chart of the movement scores for the subject over a period of time. It will be appreciated that these are just some example ways in which the movement score may be displayed to a user of the diagnostic application 410 and that any other known techniques for displaying movement scores may be used (based on the particular application) without departing from the scope of the present disclosure.
  • the output unit 108 is described as a mobile phone or tablet (or other portable device) owned by a pet owner or a veterinary specialist, with the diagnostic application 410 installed and stored on the portable device. Further, the subject being diagnosed is registered with the diagnostic system 100 through the portable device and once registered, the pet owner or a veterinary specialist does not need to re-register or login each time the diagnostic process of Fig. 6 needs to be performed.
  • the functions of the output unit 108 are performed by the processing unit 106.
  • the diagnostic application 410 is installed on the processing unit 106 and the processing unit 106 includes a user input device (such as keyboard or keypad) and a display.
  • the display of the processing unit 106 displays a login page, where the user can enter their login details such as a user name and password. If the user is using the diagnostic system 100 for the first time, the user first registers a subject with the diagnostic system.
  • the activity detection module 316 is part of the sensing unit 104 and not part of the processing unit 106.
  • the sensor data may be assessed in real time and if the activity detection module 316 determines that the activity the subject is currently engaged in is immaterial, the activity detection module sends a signal to the embedded controller 206, which instructs the sensors 202 to enter a sleep mode for a predetermined period of time.
  • the sensing unit 104 includes both an accelerometer 220 and a gyroscope 222
  • the embedded controller 206 may instruct the gyroscope 222 to enter a sleep mode.
  • the accelerometer 220 may be instructed to measure data at a reduced sampling rate.
  • the activity detection module 316 sends a signal to the embedded controller 206, which instructs the sensors 202 to exit the sleep/reduced sample rate modes.
  • the activity detection module 316 may instruct the embedded controller 206 to stop storing the sensor values.
  • the activity detection module 316 may reverse the instruction - causing the embedded controller to start storing sensor values again. In this way, storage space in the sensing unit 104 is utilized in an efficient manner, such that only relevant sensor values are stored.
  • the activity detection module 316 may maintain a count of the number of activities identified in the sensor data in real time and a count of the number of instances of each activity identified and recorded.
  • the activity detection module 316 can instruct the embedded controller 206 to enter a sleep/inactive mode. In this mode, the sensing unit shuts off all operations and stops sensing data - thereby again conserving power, preserving the battery life of the sensing unit, and reducing the amount of data recorded and stored in the memory 208.
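The sketch below summarises this on-device behaviour as a single decision function; the mode names, the set of relevant activities, and the thresholds are hypothetical:

```python
# Hypothetical sketch of the on-device power policy described above: sleep the
# gyroscope and reduce accelerometer sampling during immaterial activities,
# and shut down entirely once enough activity instances have been recorded.
from collections import Counter

RELEVANT = {"walk", "trot", "stairs_up", "stairs_down"}
MIN_ACTIVITIES, MIN_INSTANCES = 4, 12

def sensor_policy(current_activity: str, instance_counts: Counter) -> str:
    done = (len(instance_counts) >= MIN_ACTIVITIES and
            all(instance_counts[a] >= MIN_INSTANCES for a in RELEVANT))
    if done:
        return "shutdown"                 # conserve power, stop storing data
    if current_activity not in RELEVANT:
        return "gyro_sleep_low_rate_acc"  # sleep gyro, reduce accel sampling
    return "full_sampling"

print(sensor_policy("lying", Counter({"walk": 12})))
```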
  • movement-related metrics need not be computed. Instead, sensed activity data may be directly fed to the movement base model. In such cases, the movement base model is trained based on sensor activity data and not metrics data. For instance, in such cases, in method 500, step 508 may be omitted. Instead, the training data set at step 510 may include activity based sensor data and associated movement scores. Further, the movement base model learns a mapping function between the input variables/features (sensed activity data in this case) and an output variable (the movement score). Further, the goal of the machine learning algorithm is to approximate the mapping function such that when new sensor values are provided to the machine learned model, the model can predict the movement score for that data.
  • step 612 may be omitted.
  • the movement score for the animal being evaluated may be determined by providing sensed activity data to the movement model, which applies the approximated mapping function to generate a condition score for the animal based on the sensed activity data.
  • FIG. 9 illustrates an example schematic of a processing device 900 suitable for implementing aspects of the disclosed technology, including one or more of the modules and/or applications described above. That is, as noted above, the processing unit 106, output unit 108, and/or sensing unit 104 may each be implemented as described in the foregoing or according to the processing device 900 shown in Fig. 9.
  • the processing device 900 includes one or more processor unit(s) 902, memory 904, a display 906, and other interfaces 908 (e.g., buttons).
  • the memory 904 generally includes both volatile memory (e.g., RAM) and non-volatile memory (e.g., flash memory).
  • An operating system 910 such as the Microsoft Windows® operating system, the Apple macOS operating system, or the Linux operating system, resides in the memory 904 and is executed by the processor unit(s) 902, although it should be understood that other operating systems may be employed.
  • One or more applications 912 are loaded in the memory 904 and executed on the operating system 910 by the processor unit(s) 902.
  • Applications 912 may receive input from various local input devices such as a microphone 934, input accessory 935 (e.g., keypad, mouse, stylus, touchpad, joystick, instrument mounted input, or the like). Additionally, the applications 912 may receive input from one or more remote devices such as remotely-located smart devices by communicating with such devices over a wired or wireless network using one or more communication transceivers 930 and an antenna 938 to provide network connectivity (e.g., a mobile phone network, Wi-Fi®, Bluetooth®).
  • the processing device 900 may also include various other components, such as a positioning system (e.g., a global positioning satellite transceiver), one or more accelerometers, one or more cameras, an audio interface (e.g., the microphone 934, an audio amplifier and speaker and/or audio jack), and storage devices 928. Other configurations may also be employed.
  • the processing device 900 further includes a power supply 916, which is powered by one or more batteries or other power sources and which provides power to other components of the processing device 900.
  • the power supply 916 may also be connected to an external power source (not shown) that overrides or recharges the built-in batteries or other power sources.
  • an activity detection module 950, calculator module 952, and/or diagnostic application 954 as described above may be embodied by instructions stored in the memory 904 and/or the storage devices 928 and processed by the processor unit(s) 902.
  • the memory 904 may be the memory of a host device or of an accessory that couples to the host.
  • the processing device 900 may include a variety of tangible processor-readable storage media and intangible processor-readable communication signals.
  • Tangible processor-readable storage can be embodied by any available media that can be accessed by the processing device 900 and includes both volatile and non-volatile storage media, removable and non-removable storage media.
  • Tangible processor-readable storage media excludes intangible communications signals and includes volatile and non-volatile, removable and non-removable storage media implemented in any method or technology for storage of information such as processor-readable instructions, data structures, program modules or other data.
  • Tangible processor-readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store the desired information and which can be accessed by the processing device 900.
  • intangible processor-readable communication signals may embody processor-readable instructions, data structures, program modules or other data resident in a modulated data signal, such as a carrier wave or other signal transport mechanism.
  • modulated data signal means an intangible communications signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • intangible communication signals include signals traveling through wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.
  • Some implementations may comprise an article of manufacture.
  • An article of manufacture may comprise a tangible storage medium to store logic. Examples of a storage medium may include one or more types of processor-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
  • Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, operation segments, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
  • an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described implementations.
  • the executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like.
  • the executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain operation segment.
  • the instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.

Abstract

Systems and methods for evaluating a movement-related condition in an animal are disclosed. The method includes: receiving sensor data from sensors attached to the animal, the sensors measuring movement-related parameters while the animal is engaged in movement-related activities; identifying the movement-related activities and slicing the sensor data based on the identified movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity; for each data slice, calculating movement-related metrics; providing the calculated movement-related metrics for each data slice to a movement base model that is trained to classify the movement-related metrics into movement scores, wherein the movement scores are associated with a movement-related condition of an animal; and receiving a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.

Description

ANIMAL HEALTH EVALUATION SYSTEM AND METHOD
TECHNICAL FIELD
[0001] The present disclosure is directed to evaluating the health of animals and, in particular, to systems and computer-implemented methods for detecting and evaluating movement-related conditions in animals.
BACKGROUND
[0002] The developments described in this section are known to the inventors. However, unless otherwise indicated, it should not be assumed that any of the developments described in this section qualify as prior art merely by virtue of their inclusion in this section, or that those developments are known to a person of ordinary skill in the art.
[0003] Osteoarthritis (OA), also known as degenerative joint disease (DJD), is defined as the progressive and permanent long-term deterioration of the cartilage surrounding joints. Spontaneously occurring OA affects a large percentage of animals, including dogs and cats. For example, it is estimated that 30-40% of dogs and cats have OA and associated clinical signs due to pain.
[0004] Historically, detection of OA-pain and treatment-related improvements in pain and disability required owner-based questionnaires (e.g., owner-observable changes in the activities the animal performs or the way it moves), or activity data using a simple summed output (total activity or steps). Other previously known techniques for detecting OA-pain in animals included analysing the animal's gait using force plates or pressure-sensitive walkways.
However, these techniques involve specialized equipment and trained operators, and are not appropriate for the identification and early detection of OA-pain in all types and sizes of animals.
[0005] Accordingly, there is a need for a valid, reliable, objective diagnostic system and method that can predictably detect OA-pain in animals of different breed types and sizes, and can both be used to diagnose OA-pain, especially to detect the early onset of OA-pain, and to monitor the efficacy of therapies.
SUMMARY
[0006] According to a first aspect of the present disclosure there is provided a computer-implemented method for evaluating a movement-related condition in an animal. The method comprises: at a computing device: receiving sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determining the two or more prescribed movement-related activities; slicing the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity; for each data slice, calculating one or more movement-related metrics; providing the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and receiving a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
[0007] The predicted movement score can be selected from a range of multiple movement scores. For example, the two or more movement scores can include the range of movement scores between the first movement score and the second movement score.
[0008] The movement base model may be a machine learning model created by utilising a supervised machine learning algorithm (such as a support vector machine (SVM) algorithm) on a set of training data, the training data including one or more movement-related metrics values calculated for multiple subjects and corresponding movement scores assigned to the multiple subjects.
[0009] The movement base model is preferably trained to generate a mapping function between the movement-related metrics and the two or more movement scores based on the set of training data, the movement base model configured to determine the predicted movement score by mapping the calculated movement-related metrics to a movement score of the two or more movement scores using the generated mapping function.
[0010] The step of determining the two or more prescribed movement-related activities preferably includes identifying the two or more prescribed movement-related activities from the sensor data using a trained activity model.
[0011] According to a second aspect of the present disclosure there is provided a system for evaluating a movement-related condition in an animal, the system comprising a processing unit (e.g., a hardware processing unit) configured to: receive sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determine the two or more prescribed movement-related activities; slice the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity; for each data slice, calculate one or more movement-related metrics; provide the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and receive a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
[0012] In further embodiments of the present disclosure, computer-implemented methods and systems for evaluating a movement-related condition in an animal are disclosed. The method may include receiving sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determining the two or more prescribed movement-related activities; and slicing the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity. The computer-implemented method further includes providing the sliced sensor data to a machine learned movement base model that is trained to classify the sliced sensor data into two or more movement scores. The movement scores may be associated with a movement-related condition of an animal, and a first movement score may indicate a movement-related condition in the animal and a second movement score may indicate a healthy animal.
The method further includes receiving a predicted movement score based on the machine learned movement base model that indicates the movement-related condition of the animal.
[0013] In still other embodiments, the sensor data may not be sliced based on prescribed activities. Instead, the sensor data received from the sensors may be provided directly to a machine learned movement base model that ingests the sensor data and determines a movement score for a given animal - the movement score indicating whether the animal has a movement-related condition or is healthy.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] In the drawings:
[0015] Fig. 1A is a schematic diagram of a diagnostic system according to some aspects of the disclosure.
[0016] Fig. 1B is a graphical representation of a subject on which the diagnostic system can be utilised.
[0017] Fig. 2 is a block diagram of a sensing unit according to some aspects of this disclosure.
[0018] Fig. 3 is a block diagram of a processing unit according to some aspects of the present disclosure.
[0019] Fig. 4 is a block diagram of an output unit according to some aspects of the present disclosure.
[0020] Fig. 5 is a flowchart illustrating an example method for generating a movement base model according to some embodiments of the present disclosure.
[0021] Fig. 6 is a flowchart illustrating an example method for generating a movement score according to some embodiments of the present disclosure.
[0022] Fig. 7 illustrates a portion of an example training dataset according to some embodiments of the present disclosure.
[0023] Fig. 8 illustrates an example dataset of a subject according to some embodiments of the present disclosure.
[0024] Fig. 9 illustrates an example processing device that may be used in connection with the present disclosure.
[0025] While the invention as claimed is amenable to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and are described in detail. It should be understood, however, that the drawings and detailed description are not intended to limit the invention to the particular form disclosed. The intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims.
DETAILED DESCRIPTION
[0026] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the claimed invention. It will be apparent, however, that the claimed invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the description.
Overview
[0027] Animals often appear more stoic than humans and often do not complain or demonstrate pain but rather attempt to make adjustments to accommodate their distress.
Further, because animals have limited communication abilities, it is often difficult for pet owners to determine that their pets are suffering until the underlying condition has significantly progressed.
[0028] As described previously, conventional techniques for detecting pain conditions (such as OA-pain) that affect movement in animals rely either on pet owners being able to observe changes in their pets or on highly technical tests performed in a veterinary setting, the results of which may not reflect normal movement in the home environment, and therefore may not be able to accurately diagnose an animal's condition. Further, the success of these conventional techniques varies based on the weight, breed and limb-length of the animal as well as the technical expertise of the operators.
[0029] To address one or more of these issues, aspects of the present disclosure present a diagnostic system and method that can predictably detect perturbations from an animal's normal movement. In some embodiments, the diagnostic system and method is agnostic to the weight, breed, limb-length and lifestyle of the animal. It can be used to detect abnormalities in the movements of animals of different breed types and sizes. Further, in some embodiments, the disclosed systems and methods are robust against behavioural factors (such as shaking, scratching, etc.) and work across the whole range of animal behavioural phenotypes. The disclosed systems and methods can be used to diagnose medical conditions that affect an animal's movements, to detect the early onset of such conditions, and to monitor the effectiveness of therapies for these conditions.
[0030] In particular, the disclosed diagnostic systems and methods calculate an overall movement score for an animal. As used in this disclosure, the overall movement score indicates the quality of movement of an animal and includes factors such as the ease and flow of movement and the power and efficiency of the movement. It was observed by the inventors that animals that suffer from pain or any other related conditions typically make stiff, jerky movements and/or move in a rather inefficient manner or with reduced power. Accordingly, a calculated overall movement score can indicate the amount of pain an animal may be suffering as a result of OA or other conditions that affect movement.
[0031] In order to calculate this movement score, in certain embodiments, the presently disclosed diagnostic system determines certain metrics related to smoothness of motion, including metrics such as the spectral arc length of acceleration and velocity, and jerk in an animal's movements while the animal is performing certain activities. These metrics are usually considered good indicators of power and efficiency of movement, and also good indicators of smoothness of motion, movement flow and ease. The measured values for these metrics are compared with a base model to arrive at a movement score for the animal. The scores can be configured such that a high movement score indicates a healthy animal that does not have underlying conditions that affect the animal's movements, whereas a low movement score may indicate that the animal is suffering from some condition that affects the animal's quality of movement. It will be appreciated that the converse can also easily be configured - i.e., low scores indicating good health and high scores indicating some movement-related conditions in the subject.
[0032] Owners or veterinary doctors may use these movement scores to determine if the animal's health is improving when a particular treatment plan is implemented. For example, if the movement score is low, pain medication may be prescribed to the animal. Owners can then monitor the progress of their pet using the diagnostic system. An improvement in the movement score would indicate that the pain medication is working. Alternatively, if the movement scores do not improve and/or improve for a period and then start declining again, it may be that the pain medication has not worked or has stopped working and the owner may consider changing the pet's pain medication and/or taking the pet back to see a veterinary doctor.
[0033] These and other aspects and technical advantages of the present disclosure are described in detail in the following sections with reference to dogs. However, it will be appreciated that this is merely an example and that the diagnostic systems and methods described herein can just as easily be implemented for other animals, such as cats, without departing from the scope of the present disclosure.
Example diagnostic system
[0034] Fig. 1A illustrates a diagnostic system 100 including a sensing unit 104, a processing unit 106, and an output unit 108. These units communicate with each other via one or more communication networks 110.
[0035] Fig. 1B is a schematic representation of a subject 102 on which the diagnostic system 100 can be utilized. Fig. 1B includes a representation 200 of a subject 102 that shows various anatomical planes with respect to the subject/animal 102. These include a transverse (or axial) plane 112 that divides the body into cranial 114 and caudal portions 116, a coronal (or dorsal) plane 118 that divides the body into dorsal 120 and ventral portions 122, and a median plane 124 that divides the body into left and right portions.
[0036] The sensing unit 104 includes one or more sensors, which measure various movement-related parameters such as dynamic acceleration forces, tilt, rotational motion, angular velocity, etc. The sensing unit 104 is associated with the subject 102. For instance, the sensing unit 104 may be attached to or implanted in the subject using suitable means and may be positioned at one or more suitable positions. Alternatively, the sensing unit 104 may be external to the subject 102 (for example, comprising a camera system to sense motion of subject 102, with processing means to determine movement-related parameters from the received image data). In some examples, the sensing unit 104 is attached to or integrated into a collar or harness worn by the subject 102. In other examples, the sensing unit 104 or portions thereof may be implanted in one or more parts of the animal's body. In case the sensing unit 104 includes sensors that are placed at a single position on the subject 102, the sensing unit 104 can be enclosed in a single housing. Alternatively, if the sensors are positioned in multiple positions on or inside the subject 102, the sensing unit 104 comprises multiple housings, each including one or more sensors.
[0037] In one embodiment, the sensing unit 104 includes three sensor assemblies - one attached to a caudal-dorsal portion of the subject (as indicated by reference numeral 126 in Fig. 1B), one attached to a cranial-dorsal portion of the subject 102 (as indicated by reference numeral 128 in Fig. 1B) and one attached at the subject's collar (as shown in Fig. 1B). In another embodiment, the sensing unit 104 includes a single sensor assembly attached to the subject's collar. It will be appreciated that these are merely examples and that in practice the number and position of the sensor assemblies may be modified without departing from the scope of the present disclosure.
[0038] The processing unit 106 is configured to maintain a baseline movement model and to process sensor data received from the sensing unit 104 to determine a movement score. In some embodiments, the sensing unit 104 and the processing unit 106 are part of a single device. Alternatively, these units may be separate devices/modules - e.g., the sensing unit 104 is attached to the subject 102 during use and the processing unit 106 is incorporated at a remote location - e.g., in a separate computer device executing in a veterinary hospital or clinic, or on the cloud.
[0039] The output unit 108 may be configured to receive the movement score from the processing unit 106 and provide this to a user of the diagnostic system 100. In some embodiments, the output unit 108 is installed and executed on a separate device, e.g., a mobile device of the user. In other embodiments, the output unit 108 is executed on the same device as the sensing unit 104 and/or the processing unit 106.
[0040] During use, the sensor(s) sense various parameters associated with the subject's movements while the subject 102 engages in one or more activities. At some point, e.g., after expiry of a certain period, upon collection of a threshold amount of sensed data, or upon receiving a command from the processing unit 106, the sensing unit 104 communicates the sensed data to the processing unit 106 for further processing. Once the data is processed, the processing unit 106 communicates a movement score to the output unit 108. Each of these units will be described in detail in the following sections.
Sensing unit
[0041] Fig. 2 illustrates an example sensing unit 104 according to some aspects of the present disclosure. As described previously, the main function of the sensing unit 104 is to sense/measure one or more parameters associated with movement of the subject 102 to which the sensing unit is attached. To this end, in its simplest form, the sensing unit 104 includes one or more sensors 202 configured to measure the movement parameters and a network interface 204 configured to communicate the sensed data to the processing unit 106. The sensors 202 and network interface 204 may be separate components or incorporated in a single device. In addition to these components, the sensing unit 104 includes an embedded controller 206, a memory 208 and a power source 210. These components are interconnected through a system bus 212.
[0042] In the example of Fig. 2, the one or more sensors 202 includes an accelerometer 220 and a gyroscope 222. An accelerometer is a device that measures acceleration. Accelerometers may be single or multiple axis accelerometers - i.e., they can measure acceleration along a single axis or multiple axes. In one example, the accelerometer 220 used in the present disclosure is a tri-axis accelerometer. A gyroscope is typically a device used for measuring orientation and angular velocity. Just like accelerometers, gyroscopes can also measure orientation and angular velocity along a single axis or multiple axes. In one example, the gyroscope 222 used in the present disclosure is a tri-axis gyroscope. In case both the accelerometer 220 and gyroscope 222 are tri-axial, in some examples, 6 values may be measured by the sensors 202 (3 linear acceleration values and 3 angular velocity values). Further, the sensors 202 may be configured to sense or measure these values at the same rate or at different rates. In one example, the sensing rate is 100 times per second. Depending on the metrics calculated by the processing unit 106 and utilized to calculate the movement score, one or both of these types of sensors may be utilized.
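As a concrete illustration of the data such a configuration produces, each sample could carry six sensed values plus a timestamp, captured, e.g., 100 times per second; the field names below are assumptions, not the patent's data format:

```python
# Hypothetical per-sample record for a tri-axis accelerometer plus tri-axis
# gyroscope sampled at 100 Hz: six sensed values and a timestamp.
from dataclasses import dataclass

SAMPLE_RATE_HZ = 100

@dataclass
class ImuSample:
    t: float    # seconds since recording started
    ax: float   # linear acceleration, x axis (m/s^2)
    ay: float   # linear acceleration, y axis
    az: float   # linear acceleration, z axis
    gx: float   # angular velocity, x axis (rad/s)
    gy: float   # angular velocity, y axis
    gz: float   # angular velocity, z axis

sample = ImuSample(t=0.01, ax=0.1, ay=9.8, az=0.3, gx=1.2, gy=-0.4, gz=0.0)
```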
[0043] The network interface 204 is used to transmit data to and receive data from the processing unit 106. In certain embodiments, the data transmitted from the sensing unit 104 to the processing unit 106 includes sensor data captured by the sensors 202. Additionally, the sensing unit 104 may transmit information from other modules of the sensing unit - e.g., it may transmit information about its battery life to the processing unit 106, e.g., when the battery life reduces below a threshold amount and/or when the processing unit 106 requests information about the sensing unit's remaining battery life. The data received from the processing unit 106 at the sensing unit 104 can include signals to activate or deactivate the sensors 202 (e.g., when a sufficient amount of sensor data is collected and/or at certain times of the day to conserve battery power). Additionally, the data received from the processing unit 106 at the sensing unit 104 can include instructions to transfer the sensor data to the processing unit 106, instructions to delete sensor data from the sensing unit's memory 208, etc.
[0044] In order to facilitate communications, the network interface 204 includes a network interface controller (NIC) that allows the sensing unit 104 to access the network 110 and process low-level network information. For example, the NIC has a connector for accepting a cable, or an aerial for wireless transmission and reception, and the associated circuitry.
[0045] The memory 208 may be utilized to store recorded sensor data and configuration data for the sensing unit 104. Access to the memory 208 is controlled by the embedded controller 206. The memory can include one or more memory chips or modules. The memory 208 can be arranged into memory address blocks dedicated to certain purposes, e.g. storage of configuration parameters and storage of sensor data, these being the two of primary concern here. The memory can also have a reserved, secure region that is not accessible by the user or erasable during firmware update.
[0046] The power supply module 210 provides power to the various modules of the sensing unit 104 during operation. In some examples, the power supply module 210 includes a battery, such as a lithium polymer battery. Power is supplied to the battery via a power connection, which in this case can come from an external source. In some embodiments, the battery recharges itself using alternative power means, such as movement, solar energy, etc. In other embodiments, the battery is charged via direct external power supplies - e.g., via wired or wireless charging. In case the battery is charged by wired means, the sensing unit 104 includes a circuit (not illustrated) connected to a power pin of a charging connector, such as a USB or mini USB connector. In case the battery is charged by wireless means, the sensing unit 104 includes a wireless power receiver (not illustrated) which can wirelessly connect with a wireless transmitter. The power which is delivered to the power supply module 210 in any of these manners may be converted to an appropriate voltage by a DC-DC converter and charge may be delivered to the battery in accordance with a battery charge management scheme implemented by the battery charging circuit.
[0047] In some embodiments, the embedded controller 206 is configured to communicate with the power supply module 210 to receive the charge state of the battery. Based on this information, the embedded controller 206 may be configured to perform a number of functions - e.g., the embedded controller 206 may enter a low power or sleep mode when the battery power is below a threshold level, it may generate an alert when the battery power reduces below another threshold level, etc.
[0048] In certain embodiments, the sensing unit 104 also includes a location detection module (not illustrated) such as a GPS receiver. In some aspects, instead of or in addition to a GPS receiver, the location detection module includes an ultra-wideband (UWB) positioning system. UWB positioning systems are known to be able to pinpoint location in real time to within 20 centimetres or less. In certain embodiments, the accelerometer or gyroscope are used in combination with the network interface 204 to accurately and reliably identify indoor locations of the subject. In addition or alternatively, any other positioning system such as Bluetooth that can provide accurate positioning information (with an accuracy of a few centimetres) indoors may be utilized without departing from the scope of the present disclosure.
[0049] In Fig. 2, the sensing unit 104 is depicted as a single block diagram. As described previously, in some embodiments, the entire sensing unit 104 is contained in a single housing. Alternatively, the sensing unit 104 may be distributed, having multiple housings coupled to each other through wires or wirelessly. For instance, the sensors 202 are individually mounted on the collar of the subject 102 whereas the other modules are mounted on a harness and positioned on the subject's back.
Processing unit
[0050] The processing unit 106 performs various operations. For example, and as described further below, the processing unit 106 operates to: receive sensor data from the sensing unit 104, process the received sensor data to identify one or more activities the subject may be engaged in, process the received sensor data to determine a movement score, maintain and train a movement base model, and communicate with the sensing unit 104 and the output unit 108.
[0051] The processing unit 106 is a computer processing system. Fig. 3 provides one example of a suitable computer processing system that can be used as the processing unit 106.
[0052] As depicted in Fig. 3, the processing unit 106 includes a processor 302. Through a communication bus 306, the processor 302 is in communication with computer readable system memory 308 (e.g. a read only memory storing a BIOS for basic system operations), computer readable volatile memory 310 (e.g. random access memory such as DRAM modules), and computer readable non-transient memory 312 (e.g., one or more hard disk drives, such as NVRAM Flash modules, e.g., NAND flash or other flash technology). Instructions and data for controlling the processor 302 are stored in the memory 308, 310, and 312.
[0053] Non-transient memory 312 provides data storage for a database 304 for storing data associated with the programs. For instance, the database 304 may store an activity model and a movement base model. The database 304 may also store user registration details and passwords for users accessing the system. It may also store information related to the subjects being assessed, such as history of movement scores calculated for the animal, animal breed and size, owner details, veterinary specialist details, etc. The database 304 may alternatively be stored on external computer readable storage accessible by the processing unit 106 (via wired, wireless, direct or network connection).
[0054] The processing unit 106 also includes one or more communication interfaces 314. Communications interfaces 314 are operated to provide wired or wireless connection to the communication network(s) 110. Via the communication interface(s) 314 and network(s) 110, processing unit 106 can communicate with other computer systems and electronic devices connected to the network 110. Such systems include, for example, the sensing unit 104 and the output units 108. This communication enables the processing unit 106 to send control messages, such as activate and deactivate signals, to the sensing unit 104 and data such as movement scores and other related information about a particular animal to the output unit 108. This communication also enables the processing unit 106 to receive data, such as sensed data from the sensing unit 104 and user identification details from the output device 108.
[0055] The processing unit 106 is configured by executing software. Software, in the form of programs or modules, is stored in non-transient memory 312 and includes computer readable instructions and data. The instructions and data are read into system memory (e.g. 308) and executed by the processor 302 to cause the processing unit 106 to provide the various functions and perform the various operations described herein.
[0056] One such module is an activity detection module 316 that, in certain embodiments, is configured to process data received from the sensing unit 104 and identify a particular activity the corresponding animal may be engaged in based on that data. For example, the activity detection module 316 can access sensor data and/or location data to determine whether the animal is walking, trotting, climbing up or down stairs, standing or sitting. The memory 312 also includes a calculator module 320, which is configured to calculate one or more movement-related metrics associated with a particular identified activity and to determine a movement score for a particular set of sensor data. These and other such programs/modules of the processing unit 106 will be described in detail with reference to method Figures 5 and 6.
[0057] It will be appreciated that alternative system architectures to that described above and illustrated in Fig. 3 are possible, such as the architecture shown in Fig. 9 and described in greater detail below. Furthermore, in order to handle the required processing and communication load (and provide redundancy), the processing unit 106 may include multiple computer systems (e.g. multiple computer servers) with, for example, a load balancer operating to direct traffic to/from a given computer system of the processing unit 106. Alternatively, multiple computer systems may be utilized to perform separate functions of the processing unit 106.
Output unit
[0058] The output unit 108 is, typically, a personal computing device owned by a user, such as a pet owner or a veterinary specialist. The output unit 108 may, for example, be a mobile phone, a tablet, a watch, or any other portable electronic device capable of communicating with the processing unit 106 and displaying information.
[0059] One example of a portable electronic device suitable for use as an output unit 108 is shown in Fig. 4. In this example, the output unit 108 includes an embedded controller 402, having a processor 405 coupled to an internal storage module 409. A diagnostic application 410 is installed and stored in the memory 409. The diagnostic application 410 may provide client-side functionality of the diagnostic system 100. The diagnostic application 410 may be a general web browser application (such as Chrome, Safari, Internet Explorer, Opera, or an alternative web browser application) which accesses the processing unit 106 via an appropriate uniform resource locator (URL) and communicates with the processing unit 106 via general world-wide-web protocols (e.g. http, https, ftp). Alternatively, the diagnostic application 410 may be a specific application programmed to communicate with the processing unit 106 using defined application programming interface (API) calls.
[0060] When active, the diagnostic application 410 includes instructions which are executed to perform various functions. Example functions include:
a) launching and running the diagnostic application 410;
b) capturing or otherwise enabling entry of a user or animal identifier (i.e. an identifier that uniquely identifies a particular animal associated with the diagnostic system 100, e.g. microchip scan data);
c) receiving movement scores from the processing unit 106 once the movement scores are calculated;
d) retrieving history data associated with the animal identifier - the history data indicating the progress of the animal over time;
e) generating alerts/notifications when movement scores indicate that the health of the animal has deteriorated; and
f) generating alerts/notifications when the animal's movement scores drop below/exceed a threshold score, indicating severe movement-related issues.
[0061] To help perform these functions, the output unit 108 includes a display 414 (e.g. touch screen display, LCD display, LED display, or other display device). The display 414 is configured to display an interface for the diagnostic application 410 in accordance with instructions received from the embedded controller 402, to which the display is connected. An audio device 404 (e.g. a speaker, headphones, or other audio device) may also be provided, the audio device 404 being configured to output sound in accordance with instructions received from the embedded controller 402.
[0062] The output unit 108 may also include one or more user input devices. In some implementations, the user input devices may include a touch sensitive panel physically associated with the display 414 to collectively form a touch-screen 412. Other user input devices may also be provided, such as a microphone (not illustrated) for voice commands or a joystick/thumb wheel (not illustrated) for easily navigating menus.
[0063] The output unit 108 further includes a communication interface 408 that allows the output unit 108 to wirelessly communicate with the processing unit 106. Examples of wireless connection include High Speed Packet Access (HSPA+), 4G Long-Term Evolution (LTE), Mobile WiMAX, Wi-Fi (including protocols based on the IEEE 802.11 family of standards), Infrared Data Association (IrDA), Bluetooth®, and the like.
Example processes
[0064] This section describes computer-implemented methods for generating and maintaining a movement base model and computer-implemented methods for displaying a movement score for an animal's movements. The methods will be described with reference to flowcharts of Figs. 5 and 6, which illustrate processing performed by the diagnostic system 100. In particular, method 500 depicts a process for generating and maintaining the movement base model whereas method 600 depicts a process performed by the various units of the diagnostic system 100 to generate and display the movement score.
Process for generating and maintaining movement base model
[0065] The movement base model is a machine learned model that may be generated based on training data. The method 500 commences at step 502, where sensor data is generated. In one embodiment, the sensor data is generated by attaching sensing units 104 to multiple subjects of different sexes, sizes, weights, breeds, and detailed health phenotypes. The larger the training data set, the more accurate the model. Therefore, it may be beneficial to include data from as many subjects as possible when obtaining the initial training data. Once the subjects are selected and set up with the sensing units 104, they are made to perform certain activities, such as ascending or descending stairs, running, trotting, walking, and transitioning from a standing position to a sitting position and vice versa. The various acceleration, orientation, and angular velocity measurements from the sensing units 104 are recorded while the subjects are performing each of these activities.
[0066] Next, at step 504, the sensor data is communicated to the processing unit 106. If the sensors 202 include a tri-axis accelerometer 220 and a tri-axis gyroscope 222, the sensor data can include nine sensor values per instance per sensing unit 104: three values related to acceleration in the three axes, three values related to angular velocity in the three axes, and three values related to orientation relative to magnetic north and gravity. Further, the sensor data may be time-stamped - i.e., each sensor value measured by the sensors may be associated with the time at which it was sensed. For example, if the sensors 202 are configured to measure the movement parameters every 10 milliseconds, the sensor values may be separated by 10 ms intervals and may include a timestamp for the date and time the measurement was taken.
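By way of illustration only, the following sketch shows one possible in-memory layout for such time-stamped, nine-value sensor records. It is written in Python; the field names and the SensorSample structure are assumptions for illustration and are not prescribed by this disclosure.

    # Illustrative layout of time-stamped sensor records: one nine-value
    # record per sampling instant per sensing unit (10 ms apart here).
    from dataclasses import dataclass

    @dataclass
    class SensorSample:
        timestamp_ms: int   # time at which the values were sensed
        accel: tuple        # (ax, ay, az) in g, tri-axis accelerometer
        gyro: tuple         # (wx, wy, wz) in deg/s, tri-axis gyroscope
        orient: tuple       # (ox, oy, oz) relative to magnetic north/gravity

    # Ten samples at a 10 ms period span 100 ms of movement:
    samples = [
        SensorSample(timestamp_ms=t, accel=(0.0, 0.0, 1.0),
                     gyro=(0.0, 0.0, 0.0), orient=(0.0, 0.0, 0.0))
        for t in range(0, 100, 10)
    ]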
[0067] At step 506, this received sensor data is processed. In particular, the sensor data is sliced based on activities. To this end, the processing unit 106 may be configured to first identify activities the corresponding subjects were engaged in based on the received sensor data. In certain embodiments, this is done manually. For example, the subjects are recorded while they are engaged in the various activities of interest. The recording is then synchronized with the sensor data and a person manually selects start and end times for identified activities of interest and then labels the corresponding sensor data based on the identified activity.
[0068] In another preferred embodiment, the activities are identified automatically by the processing unit 106 (and in particular, the activity detection module 316) utilizing machine-learning models such as convolutional neural networks (CNNs). Convolutional networks are a class of artificial neural networks (ANNs). These networks are built of interconnected artificial nodes, called 'neurons', that generally mimic a biological neural network. Typically, the network includes a set of adaptive weights, i.e., numerical parameters that are tuned by training the network to perform certain complex functions. CNNs expand upon ANNs by including several convolution and pooling layers to reduce the total dimensionality and complexity of the network. Training an ANN/CNN essentially means selecting one model from a set of allowed models that minimizes a cost criterion. There are numerous algorithms available for training neural network models; most of them can be viewed as a straightforward application of optimization theory and statistical estimation.
[0069] In the present example, the activity detection module 316 trains the CNN to identify one or more preselected activities from received sensor data and identify which portions of the sensor data correspond to the identified activities.
[0070] To be able to identify the pre-selected activities, the CNN is trained by first generating an appropriate amount (such as several hundred hours) of sensor data and corresponding recording data for animals of different types, breeds, sizes, and/or phenotypic health statuses engaged in the pre-selected activities. Subsequently, the sensor data is tagged, i.e., each set of sensor values is labelled based on the corresponding identified activity (e.g., from the recording). Next, the labelled data is fed to the CNN, which is trained to estimate the activity label of sensor data based on the values of the sensor data. During the training process, sensor values may be fed to the processing unit 106 and, based on the weights of the neural network, an activity type is predicted. If the output is incorrect, the CNN changes its weights to be more likely to produce the correct output. This process is repeated numerous times with multiple sensor values, until the CNN can correctly determine the output most of the time. It will be appreciated that the more the process is repeated, the more accurate the CNN will become.
[0071] In addition to identifying one of the pre-selected activities, the activity detection module 316 can also train the CNN to ignore other activities in the sensor data, for example, an animal sleeping, resting, sitting, etc. To that end, sensor data corresponding to the immaterial activities can be labelled as 'irrelevant' during the training process. Once the CNN has been fed a sufficient number of such sensor values, it is able to calculate the appropriate weights to effectively classify sensor values associated with immaterial activities as irrelevant sensor data, and it may be trained to discard such data. Training of CNN models is fairly well known in the art and is not described in more detail here.
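By way of illustration only, the following is a minimal sketch of a one-dimensional CNN activity classifier of the kind described above, written in Python with PyTorch. The window length (100 samples), layer sizes, learning rate, and activity labels (including the 'irrelevant' class) are assumptions for illustration, not parameters taken from this disclosure.

    import torch
    from torch import nn

    ACTIVITIES = ["walk", "run", "stairs_up", "stairs_down",
                  "sit_to_stand", "irrelevant"]   # 'irrelevant' absorbs
                                                  # immaterial activity
    model = nn.Sequential(
        nn.Conv1d(9, 16, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
        nn.Conv1d(16, 32, kernel_size=5), nn.ReLU(),
        nn.AdaptiveAvgPool1d(1), nn.Flatten(),    # pool over time, classify
        nn.Linear(32, len(ACTIVITIES)),
    )

    def train_step(batch, labels, optimiser, loss_fn=nn.CrossEntropyLoss()):
        """One weight update: predict, compare with the manual label,
        and adjust the weights toward the correct output ([0070])."""
        optimiser.zero_grad()
        loss = loss_fn(model(batch), labels)
        loss.backward()
        optimiser.step()
        return loss.item()

    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    batch = torch.randn(8, 9, 100)                # 8 labelled windows,
    labels = torch.randint(0, len(ACTIVITIES), (8,))  # 9 channels x 100 samples
    train_step(batch, labels, optimiser)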
[0072] The activities to be identified may be preselected based on the type of animal being evaluated. For example, if the diagnostic system 100 is configured to diagnose the condition of dogs, one or more of the following activities are pre-selected for training the CNN: walking, ascending and descending stairs, transitioning from a sitting to a standing position, and running. Alternatively, if the diagnostic system 100 is configured to diagnose the condition of cats, one or more of the following activities are pre-selected for training the CNN: half-jumping down, jumping across, jumping down, jumping up, trotting, and walking. In certain embodiments, the diagnostic system 100 may be configured to detect movement-related conditions in both cats and dogs. In this case, the activity detection module 316 may utilize two different CNNs - one configured to identify the types of activities for dogs and the other configured to identify the types of activities for cats. In this case, the animal category type is provided to the activity detection module 316 before step 506 commences, and the module can then select the appropriate CNN to use at this step.
[0073] Once the activity detection module 316 has a trained activity model, it is ready for use in the diagnostic system 100. It receives sensor data from the various sensing units 104, labels the sensor data, and slices the sensor data based on the identified activities. For example, if the activity detection module 316 is given 10 minutes' worth of sensor data of a subject alternately ascending and descending stairs, it is configured to slice the sensor data into the two identified activities and further into the various instances of each activity, such that each slice of data corresponds to a particular instance of a particular activity being performed. The activity detection module 316 may also discard sensor data that does not correspond to any of these identified activities - e.g., sensor data collected when the subject is resting between subsequent instances of ascending and/or descending the stairs.
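By way of illustration only, a minimal Python sketch of this per-instance slicing: contiguous runs of windows carrying the same activity label become separate instances, and windows labelled 'irrelevant' are discarded. The labels and function name are assumptions for illustration.

    from itertools import groupby

    def slice_by_activity(labels):
        """labels: one activity label per fixed-length window, in time
        order. Returns (activity, start_index, end_index) per instance,
        discarding windows labelled 'irrelevant' (e.g. resting between
        repetitions)."""
        slices, i = [], 0
        for activity, run in groupby(labels):
            n = len(list(run))
            if activity != "irrelevant":
                slices.append((activity, i, i + n))
            i += n
        return slices

    labels = ["stairs_up"] * 5 + ["irrelevant"] * 3 + ["stairs_down"] * 6
    print(slice_by_activity(labels))
    # [('stairs_up', 0, 5), ('stairs_down', 8, 14)]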
[0074] At the end of method step 506, sensor data sliced by activity and instances is obtained. Once the activities have been identified, metadata associated with the subject may also be appended to the sliced sensor data. This metadata may include, e.g., pathology, activity type, repeat number for each activity, etc.
[0075] Next, at step 508, for each slice of sensor data, one or more movement-related metrics are calculated. In one example, the processing unit 106 (and more particularly the calculator module 318) calculates one or more of nine movement metrics for each data slice. These include: maximum acceleration (along the median plane); maximum acceleration (absolute magnitude); maximum angular velocity (along the median plane); maximum angular velocity (absolute magnitude); Spectral Arc Length (SPARC) of acceleration (absolute magnitude); SPARC of angular velocity (absolute magnitude); Dimensionless Jerk (DLJ); Log Dimensionless Jerk (LDLJ); and Root Mean Square (RMS) Jerk.
[0076] Maximum acceleration in the median plane is an analogue for ground reaction force normalized for subject weight. It shows the strongest difference during maximal effort events (e.g. jumping and landing). This metric can be measured in g-force using a single axis of the accelerometer 220 on a scale of 0-4g. The maximum acceleration value is selected as the acceleration value along the z axis that has the maximum value in that particular time slice. This value can be represented as:
$$a_{\max} = \max_{t}\, a_z(t)$$
where $a_z(t)$ is the linear acceleration along the z axis at time $t$.
[0077] Maximum acceleration (absolute magnitude) can be considered an analogue for the normalized power of a subject. This metric is generally robust with regards to rotation of the sensing unit 104 in the anatomical axes of the subject 102 and can be measured as the magnitude of all three axes of a tri-axial accelerometer (each on a scale of 0-4g). The maximum acceleration value is selected as the acceleration value from any of the three axis readings of the accelerometer that has the maximum value in that particular time slice. This value can be represented as:
$$a_{\max} = \max_{t}\, \lVert \mathbf{a}(t) \rVert$$
where $\mathbf{a}(t)$ is the linear acceleration vector.
[0078] Maximum angular velocity in the median plane can be considered a replacement for rate of joint motion, which has been shown to be a valuable metric for determining the extent of OA in human subjects. In particular, this metric has been shown to be valuable in sit-to-stand and stand-to-sit activities. The value can be measured using a single axis of the gyroscope 222 in degrees per second. If the gyroscope is attached in the cranial or caudal positions on the subject 102, the x axis can be used. Alternatively, if the gyroscope is attached to the collar of the subject 102, the y axis of the gyroscope data can be used. The maximum angular velocity, or rate of turn around the medial-lateral axis, is selected as the angular velocity value along the y axis that has the maximum value in that particular time slice. This value can be represented as:
$$\omega_{\max} = \max_{t}\, \omega_y(t)$$
where $\omega_y(t)$ is the angular velocity about the y axis.
[0079] Maximum angular velocity (absolute magnitude) is robust against sensor positioning. This value can be measured as the magnitude of all three axes of the gyroscope 222 (in degrees per second). This allows movements of the animal in any plane to be considered, e.g. rotation of the hips in the transverse plane. The maximum angular velocity value is selected as the angular velocity value from any of the three axis readings of the gyroscope that has the maximum value in that particular time slice. This value can be represented as:
$$\omega_{\max} = \max_{t}\, \lVert \boldsymbol{\omega}(t) \rVert$$
where $\boldsymbol{\omega}(t)$ is the angular velocity vector.
[0080] Spectral Arc Length (SPARC) is a metric related to smoothness of motion. It relies on changes in the Fourier spectrum of movements to quantify smoothness, and it addresses the defects of classic gait smoothness metrics such as the Harmonic Ratio and of newer motion smoothness metrics such as dimensionless jerk. SPARC can be calculated using the magnitude of acceleration and the magnitude of angular velocity as its input. SPARC is unitless, and its values often cannot be correlated to specific outcomes. Instead, the values for healthy and unhealthy subjects are activity specific and are measured for each activity. However, the metric is considered robust against non-kinematic factors such as limb length, movement amplitude, sensor positioning, and noise. This metric is calculated based on sensor data from all three axes of the accelerometer 220 to obtain acceleration-related SPARC values and from all three axes of the gyroscope 222 to obtain velocity-related SPARC values. The equation for calculating SPARC for a particular instance of an activity is given by equation (1):
$$\mathrm{SPARC} = -\int_{0}^{\omega_c} \sqrt{\left(\frac{1}{\omega_c}\right)^{2} + \left(\frac{d\hat{V}(\omega)}{d\omega}\right)^{2}}\; d\omega \tag{1}$$
where $\hat{V}(\omega)$ is the magnitude of the real Fourier spectrum of the movement signal $v(t)$, normalised by the DC power:
$$\hat{V}(\omega) = \frac{V(\omega)}{V(0)} \tag{2}$$
and $\omega_c$ is the minimum of the maximum cut-off frequency $\omega_c^{\max}$ and the last frequency whose magnitude is below the cut-off threshold $\bar{V}$:
$$\omega_c = \min\left\{\omega_c^{\max},\; \min\{\omega : \hat{V}(r) < \bar{V}\;\; \forall\, r > \omega\}\right\} \tag{3}$$
SPARC is calculated with $v(t)$ as both the magnitude of linear acceleration, $\lVert \mathbf{a}(t) \rVert$, and the magnitude of angular velocity, $\lVert \boldsymbol{\omega}(t) \rVert$.
[0081] Dimensionless Jerk (DLJ) and Log Dimensionless Jerk (LDLJ) are the other validated metrics related to smoothness of motion. Generally speaking, jerk is the rate of change of acceleration. LDLJ differs from DLJ by focusing on the physiological range. Both metrics are unitless, having been normalized against movement amplitude. Both values can be calculated based on sensor data measured using all three axes of the accelerometer 220. Equations for calculating DLJ and LDLJ for a particular instance of an activity are given by equations (4)-(6):
$$\mathrm{DLJ} = -\frac{t}{a_{\mathrm{peak}}^{2}} \int_{0}^{t} \left\lVert \frac{d\mathbf{a}(\tau)}{d\tau} \right\rVert^{2} d\tau \tag{4}$$
$$a_{\mathrm{peak}} = \max_{\tau}\, \lVert \mathbf{a}(\tau) \rVert \tag{5}$$
$$\mathrm{LDLJ} = -\ln \lvert \mathrm{DLJ} \rvert \tag{6}$$
where $\mathbf{a}$ is the acceleration vector and $t$ is the length of $\mathbf{a}$ in seconds.
[0082] Root Mean Square (RMS) Jerk is not generally considered a metric of motion smoothness. It has the advantage of being interpretable; however, the values are not normalized against movement amplitude or subject phenotype. RMS Jerk can be calculated using all three axes of the accelerometer 220 and is represented in units of m·s⁻³. Equations for calculating RMS Jerk for a particular instance of an activity are given by equations (7) and (8):
$$\mathbf{j}(\tau) = \frac{d\mathbf{a}(\tau)}{d\tau} \tag{7}$$
$$\mathrm{RMS\ Jerk} = \sqrt{\frac{1}{t} \int_{0}^{t} \lVert \mathbf{j}(\tau) \rVert^{2}\, d\tau} \tag{8}$$
where $\mathbf{j}$ is jerk and $\mathbf{a}$ is the acceleration vector.
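By way of illustration only, the following Python/numpy sketch computes three of the per-slice metrics (maximum acceleration magnitude, RMS jerk, and LDLJ). The original equations were not fully recoverable from the published text, so the finite-difference jerk estimate and the duration/peak-acceleration normalisation used here should be treated as assumptions; the sampling rate and synthetic test signal are likewise illustrative.

    import numpy as np

    def movement_metrics(a, fs):
        """a: (N, 3) acceleration slice in m/s^2; fs: sampling rate in Hz."""
        duration = len(a) / fs                    # slice length in seconds
        a_mag = np.linalg.norm(a, axis=1)
        jerk = np.diff(a, axis=0) * fs            # da/dt, finite difference
        jerk_sq = np.sum(jerk ** 2, axis=1)       # ||j||^2 per sample
        int_jerk_sq = np.trapz(jerk_sq, dx=1.0 / fs)
        dlj = -(duration / a_mag.max() ** 2) * int_jerk_sq
        return {
            "max_accel_mag": a_mag.max(),
            "rms_jerk": np.sqrt(np.mean(jerk_sq)),
            "ldlj": -np.log(abs(dlj)),
        }

    fs = 100.0
    t = np.arange(0, 2, 1 / fs)                   # 2 s synthetic slice
    a = np.stack([np.sin(2 * np.pi * t), np.zeros_like(t),
                  9.81 + 0.1 * np.cos(2 * np.pi * t)], axis=1)
    print(movement_metrics(a, fs))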
[0083] Although in the present disclosure the processing unit 106 computes values for these nine metrics for each data slice, this may not necessarily be required in all implementations of the method. Instead, the processing unit 106 may compute a subset of the metrics, or any other movement-related metrics, without departing from the scope of the present disclosure. For example, in some implementations, the dimensionless jerk values and/or the RMS jerk values may not be calculated. Further, it may not be necessary to calculate all the metrics for all the activities. Instead, in some examples, the SPARC values may be computed only for activities such as ascending and descending stairs and may not be calculated for activities such as sit-to-stand or stand-to-sit.

[0084] Once one or more movement metrics are calculated for each data slice, the method proceeds to step 510, where a training data set is generated and stored.
[0085] In one embodiment, the training data set is stored in a tabular fashion, where each row of data in the dataset corresponds to a single subject performing one instance of each pre-selected activity and includes the calculated metrics for a single instance of each activity, together with corresponding metadata associated with the subject. Further, for each subject, and therefore for each row of data in the training dataset, a movement score is provided. This movement score is manually computed for each subject in the training data set based on information collected by veterinary specialists to define the phenotype. The movement score indicates the overall movement-related health of the subject - in the example scoring scheme described below, the lower the score, the better the condition of the subject, and vice versa.
[0086] Fig. 7 shows a portion of an example training dataset in tabular form. It will be appreciated that this is merely an example form and that the training dataset may be stored in any other form without departing from the scope of the present disclosure. Fig. 7 shows the training dataset for 8 subjects only. It will be appreciated that this is not the case in real implementations, where the training dataset can include data collected from hundreds if not thousands of subjects. Further, because of limited space on the drawing sheet, the metrics values collected for each activity are shown one below the other. The first row under each activity corresponds to a first subject, and so on. In reality, however, these metric values are not stored in this fashion; instead, the metrics values computed for one instance of each activity for one subject are stored together and considered a single training "record".
[0087] As shown in Fig. 7, each row of training data is in relation to a given subject and a given activity - namely, trotting, walking, ascending/descending stairs, and transitioning from a standing to a sitting position or vice versa. Further, each row includes the values of the nine metrics described above for each of the activities. Further, each record includes a movement score that was manually computed for the corresponding subject. In one example, the movement scores range from 1 to 6, where a movement score of 6 indicates a subject with severe movement-related issues whereas a score of 1 indicates a subject with excellent overall movement-related health.
[0088] Returning to Fig. 5, at step 512, a movement condition model is created using this training data set. Before the model is created, the calculator module 318 may clean the training dataset. For example, it may substitute any missing data in the training dataset. In some cases, sensors may not measure accurately, and/or the calculator module 318 may not be able to calculate a metric value because of insufficient data. In such cases, some metrics values may be missing from the training dataset. To ensure that the movement condition model is not generated taking these missing values into consideration, the missing values in the training dataset are substituted. In one example, this may be done by calculating a mean value based on the other values computed for that metric and subject combination. In addition, the calculator module 318 normalizes the calculated metrics values - i.e., it adjusts metrics values measured on different scales to a notionally common scale. This prevents the model from assigning unnecessarily high weight to certain metrics. In one example, all the calculated metrics values may be normalized to fall between 0 and 1.
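By way of illustration only, a minimal Python/numpy sketch of this cleaning step, assuming mean imputation per metric column and min-max normalisation to the range 0-1; the example data and the guard against constant columns are illustrative assumptions.

    import numpy as np

    def clean_and_normalise(X):
        """X: (records, metrics) training matrix, NaN for missing values."""
        col_mean = np.nanmean(X, axis=0)
        X = np.where(np.isnan(X), col_mean, X)    # substitute missing values
        lo, hi = X.min(axis=0), X.max(axis=0)
        span = np.where(hi > lo, hi - lo, 1.0)    # guard constant columns
        return (X - lo) / span                    # common 0-1 scale

    X = np.array([[1.0, 10.0], [np.nan, 30.0], [3.0, 20.0]])
    print(clean_and_normalise(X))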
[0089] Once the training dataset is cleaned and normalized, the movement condition model is created. As the training dataset includes the various metrics values and activity types and the corresponding movement scores, the processing unit 106 utilizes a supervised machine learning algorithm to create the movement condition model. In particular, the aim of the training algorithm is to learn a mapping function between the input variables/features (metrics values in this case) and an output variable (the movement score in this case). Further, the goal of the algorithm is to approximate the mapping function such that when new sensor values are provided to the model, the model can predict the movement score for that data. In one example, if the dataset is small, a support vector machine (SVM) algorithm may be utilized. In another example, if the dataset is quite large, naive Bayes, k-nearest neighbour, decision tree, or artificial neural network (ANN) algorithms may be utilized.
[0090] Support-vector machines (SVMs) are supervised machine learning models with associated learning algorithms that analyse data used for classification and regression analysis. When an SVM is used for regression analysis, it is referred to as support-vector regression (SVR). Given a set of training records, each marked as having a given movement score, an SVR training algorithm builds a model that is a representation of the records as points in a space, mapped so that the records belonging to the separate movement scores are divided by a clear gap that is as wide as possible. Further, SVR may identify a shape bounded by support vectors within which the metrics values associated with a particular movement score lie. New records can then be assigned to one of the movement scores by mapping the new records into that same space, with their score predicted based on the function:
$$f(\mathbf{x}) = \sum_{i} \alpha_i\, K(\mathbf{x}, \mathbf{a}_i) + b \tag{11}$$
where $\mathbf{x}$ is the input (feature) vector, the $\mathbf{a}_i$ are the support vectors, the $\alpha_i$ are learned coefficients, $K$ is the kernel function, and $b$ is a bias term.
[0091] In an SVM, each record may include multiple features or dimensions, and therefore the space in which the records are mapped may be multi-dimensional and the shape identified by SVR may be complex. In the example shown in Fig. 7, each record includes 36 features or dimensions, and the training dataset includes an expected real value in the range 1-6. In this case, the SVM model includes a 36-dimensional space in which each record is mapped under the movement score associated with that record.
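By way of illustration only, the following sketch shows an SVR fit/predict flow of the kind described above, using Python with scikit-learn. The kernel, regularisation parameters, and the randomly generated 36-feature records and 1-6 scores are assumptions for illustration, not data from this disclosure.

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X_train = rng.random((200, 36))                  # normalised metric records
    y_train = rng.integers(1, 7, 200).astype(float)  # manually assigned scores

    model = SVR(kernel="rbf", C=1.0, epsilon=0.1)    # movement base model
    model.fit(X_train, y_train)

    new_record = rng.random((1, 36))   # one instance of each activity
    print(model.predict(new_record))   # predicted movement score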
[0092] In this manner, a movement base model is created. In some embodiments, in addition to movement scores, the training data set also includes an indication for each subject that indicates whether the subject is healthy (i.e., does not exhibit any movement-related conditions) or unhealthy (i.e., exhibits some movement-related conditions). In such cases, the learning algorithm may also be configured to perform a binary classification of the sensor values as corresponding to healthy or unhealthy subjects.
Process for diagnosing a subject
[0093] As described previously, method 600 describes a method for determining a movement score of a subject, such as a subject 102. Generally speaking, the diagnostic system 100 may be utilized by a veterinary specialist to diagnose the condition of a subject. Alternatively, the diagnostic tool may be utilized by a pet owner. In either case, the veterinary specialist or the owner may have to commence the diagnostic process - at least initially. In particular, the veterinary specialist or the owner (commonly referred to as a user hereafter) may be required to install the diagnostic application 410 on the output device 108. Further, the user may have to associate the sensing unit 104 with the diagnostic application 410 and/or with the subject being diagnosed. In one example, when the user registers the diagnostic application, the user creates a profile for the subject being diagnosed. This profile may include, e.g., the breed, sex, age, and name of the subject. Once the profile is created, the diagnostic application 410 forwards these details to the processing unit 106 - which creates a record for the subject in the database 304 and assigns a unique identifier to the registered subject. This unique identifier is communicated to the diagnostic application 410 for storage.
[0094] Further, each sensing unit 104 may be identifiable by a unique identifier associated with the sensing unit 104. This identifier may be stored in the memory 208 of the sensing units.
In some embodiments, when a sensing unit 104 is initialised, it sends its unique identifier to the diagnostic application 410 (e.g., via wireless means). The diagnostic application 410 forwards this identifier to the processing unit 106, which stores the sensing unit's identifier in association with an identifier of the subject associated with the diagnostic application 410. In this manner, the processing unit 106 may be aware of all the sensing units 104 that may be active at any given time and the corresponding subjects these sensing units 104 are monitoring.
[0095] In case multiple sensing units 104 are employed to monitor a subject, each of the sensing units may function autonomously - they may each transmit their unique identifiers to the diagnostic application 410 such that the processing unit 106 can store each of the multiple sensing unit's identifiers in association with the identifier of the subject associated with the diagnostic application. Alternatively, the multiple sensing units 104 may be configured to elect a master sensing unit 104 amongst themselves. The master sensing unit 104 then communicates with the diagnostic application 410 and the processing unit 106 whereas the other sensing units 104 communicate their sensor data to the master sensing unit 104.
[0096] During operation, a sensing unit 104 may be initialized (as described above) and attached to the subject 102. Method 600 commences at step 602, where the sensing unit 104 monitors one or more movement-related parameters and stores the monitored data in the memory of the sensing unit 104. If the sensing unit 104 includes an accelerometer 220, the accelerometer measures proper acceleration during the period the accelerometer is active. Similarly, if a gyroscope is utilized, the gyroscope 222 measures the angular velocity and orientation of the gyroscope while the gyroscope is active. In case these sensors are tri-axis sensors, they measure acceleration along the x, y, and z axes and angular velocity and orientation along the x, y, and z axes respectively. Data measured by these sensors 202 is stored in the memory 208.
[0097] In cases where the sensing unit 104 includes additional modules, such as a location module, data collected by the additional modules is also stored in the memory 208.
[0098] At step 604, the processing unit 106 receives the stored sensor data. In some embodiments, the sensing unit 104 senses data for a period of time (e.g., 1-2 days). Once this period has expired, the sensing unit 104 may stop sensing operations. Thereafter, it communicates the stored sensor data to the processing unit 106. In other embodiments, the sensing unit 104 may transfer sensor data to the processing unit 106 in near real time (e.g., if a network 110 is available between the sensing unit 104 and processing unit 106), at specific intervals (e.g., once every hour), or when certain conditions are met (e.g., the sensing unit 104 and processing unit 106 are connected via a network 110). The data may either be pushed by the sensing unit 104 automatically or may be communicated to the processing unit 106 in response to receiving a request for the data. In either case, when transmitting the data to the processing unit 106, the sensing unit 104 also communicates the unique identifier of the sensing unit to the processing unit 106.
[0099] At step 605, the processing unit 106 looks up the sensing unit's identifier in the records of active sensing units, retrieves the corresponding identifier for the subject associated with that sensing unit, and then stores the received sensor data in association with the unique identifier of the subject being diagnosed.
[00100] Next, at step 606, the processing unit 106 (and in particular the activity detection module 316) may also process the sensor data to identify one or more activities the subject 102 may be engaged in when the sensor data was recorded. The activity detection module 316 is also configured to slice the sensor data at this step, such that each data slice corresponds to a particular instance of an activity. Sensor data that does not correspond to a pre-selected activity may be discarded at this step. Examples of activities include, e.g., walking, trotting/running, ascending or descending stairs, standing, sitting, etc. As described previously, in certain embodiments, the activities can be manually identified. However, in more preferred embodiments, the activity detection module 316 may identify the various different activities based on received sensor data (i.e., accelerometer and/or gyroscope data). Further, the activities can be recognized, e.g., using one or more machine learning techniques and activity models as described above with respect to Fig. 5.
[00101] In one example, the processing unit 106 divides the sensor data into chunks, e.g., corresponding to a second of measured activity, and then utilizes the activity model to classify the activity occurring in each data chunk - e.g., the convolved acceleration and/or angular velocity values of the sensed data in a data chunk, multiplied by the trained weights of the model, give an output value for the likelihood of each activity. If multiple activities are returned as possible, a threshold is applied to select the most likely, or the data is discarded as inconclusive. For example, if the closest match is to sensor values in the walk entry, that data chunk in the sensed dataset is classified as walking. The processing unit 106 may also compare the identified activity of a data chunk with the identified activity of neighbouring chunks via majority vote, recurrent neural networks, long short-term memory (LSTM) networks, or similar. This can be used to rule out outliers. Data chunks that cannot be classified under any of the selected activities may be discarded. Once data chunks are classified in this manner, adjacent data chunks that correspond to the same activity are combined into a data slice. In certain embodiments, the sensor data collected by the sensing unit 104 may be time stamped. In such cases, at step 606, the processing unit 106 may classify the data slices into distinct instances of the activity based on the timestamps. For example, if the animal being diagnosed descended stairs once at 8 AM for 5 seconds and then descended stairs again at 8:10 AM for 10 seconds, the data slices labelled under the descending stairs activity can be divided into two separate data slices - one for the activity lasting 5 seconds and the other for the same activity lasting 10 seconds.
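By way of illustration only, a minimal Python sketch of the per-chunk classification and majority-vote smoothing described above. The likelihood threshold, window size, and example likelihoods are assumptions for illustration.

    from collections import Counter

    def classify_chunks(chunk_probs, threshold=0.6):
        """chunk_probs: per chunk, a dict of activity -> likelihood."""
        labels = []
        for probs in chunk_probs:
            activity, p = max(probs.items(), key=lambda kv: kv[1])
            labels.append(activity if p >= threshold else None)  # inconclusive
        return labels

    def majority_smooth(labels, window=3):
        """Replace each label by the majority among its neighbours,
        ruling out isolated outlier classifications."""
        half = window // 2
        smoothed = []
        for i in range(len(labels)):
            neighbourhood = [l for l in labels[max(0, i - half):i + half + 1]
                             if l is not None]
            smoothed.append(Counter(neighbourhood).most_common(1)[0][0]
                            if neighbourhood else None)
        return smoothed

    probs = [{"walk": 0.9, "trot": 0.1}, {"walk": 0.5, "trot": 0.5},
             {"trot": 0.8, "walk": 0.2}, {"trot": 0.7, "walk": 0.3}]
    print(majority_smooth(classify_chunks(probs)))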
[00102] The labelled sliced sensor data is then forwarded to the calculator module 320.
[00103] At step 608, the processing unit 106 determines whether a threshold number of activities and a threshold number of instances of each activity have been collected. In one example, the threshold number of activities is four and the threshold number of instances of each activity is 12. It will be appreciated that these threshold numbers are configurable and may be changed based on the particular implementation. For highly accurate movement base models, fewer instances of each activity may be sufficient, whereas for movement base models that have lower accuracy, a greater number of instances of each activity may be required.
[00104] If at step 608, it is determined that a threshold number of activities and/or a threshold number of separate instances of an activity are not detected, the method proceeds to step 610, where the processing unit 106 may generate a suitable error message and in some cases may instruct the sensing unit 104 to continue measuring and storing sensor data. The error message may indicate that insufficient sensor data is available to calculate a movement score.
[00105] Alternatively, if at step 608 it is determined that the threshold number of activities and the threshold number of separate instances per activity are detected and collected, the method proceeds to step 612, where one or more movement-related metrics are calculated for each data slice. In one embodiment, the calculator module 320 calculates the same nine movement metrics that were calculated at step 508 of method 500. These include: maximum acceleration (along the dorsal-ventral axis); maximum acceleration (absolute magnitude); maximum angular velocity (along the median plane); maximum angular velocity (absolute magnitude); Spectral Arc Length (SPARC) of acceleration (absolute magnitude); SPARC of angular velocity (absolute magnitude); Dimensionless Jerk (DLJ); Log Dimensionless Jerk (LDLJ); and Root Mean Square (RMS) Jerk.
[00106] Although in the present example the calculator module 320 computes values for these nine metrics for each data slice, this may not be the case in all implementations of the method; in other implementations, fewer or other movement metrics may be computed for a data slice. For example, in some implementations, the dimensionless jerk values and/or the RMS jerk values are not calculated. Further, it may not be necessary to calculate all the metrics for all the activities. Instead, in some examples, the SPARC values are computed only for activities such as ascending and descending stairs and may not be calculated for activities such as sit-to-stand or stand-to-sit. Further, although the calculator module 320 in this example method computes the metrics values for multiple slices related to the same activity, that may not be necessary in other examples, where the calculator module 320 computes the metrics values for a single data slice related to an activity.
[00107] Fig. 8 illustrates an example output of step 612 in tabular form. This table 800 shows the nine metrics values calculated for the same four activities as in the training data set. Further, each activity includes 4 instances in this example. It will be appreciated this is just an example and the number of instances can be increased/decreased in real implementations.
Further, because of limited space on the drawing sheet, the metrics values collected for each activity are shown one below the other. The first row under each activity corresponds to a first instance of that activity and so on. In reality, however, these metric values may not be stored in this fashion, but the metrics values computed for one instance of each activity are stored together and considered a single record. It will be appreciated that the size of this record is similar to the size of each record in the training dataset and includes the same metrics and activities. In particular, in this example, the record includes 36 metric values.
[00108] Further, in the example datasets shown in Figs. 7 and 8, sensor data from one sensing unit 104 is utilized. If multiple sensing units 104 are applied - e.g., one on the collar, one in the cranial region and one in the caudal region, separate metrics values are computed based on the sensor data from each of the three sensing units 104. In such cases, the number of metrics values per record is 108.
[00109] Next, at step 614, the method determines a movement score for the animal being evaluated. As described previously, the processing unit 106 maintains a movement base model. In one embodiment, the output from step 612 is fed to the model and the model predicts a movement score for that output. For clarity, this will be described with reference to the training set depicted in Fig. 7 and the output depicted in Fig. 8. In this example, each record of the output shown in Fig. 8 is fed to the movement base model, which predicts a movement score for the record using the same function that was used to create the movement base model. In case SVR is used, the calculator module uses the function shown in equation (11) above to determine a predicted movement score for the record. In this case, $\mathbf{x}$ is the input record (including the 36 metric values) and the $\mathbf{a}_i$ are the support vectors that constitute the movement base model. This process is repeated for each record in the output - i.e., a movement score is predicted for each record using the function shown in equation (11).
[00110] The movement scores predicted for each record can then be aggregated based on a calculated mean or median of the individual movement scores. Further, in some examples, a confidence interval may be calculated based on the individual scores. This allows the calculator module 318 to handle outliers caused by unexpected kinematic events such as the subject tripping, shaking, or slowing down mid-activity due to distraction. Further, in some examples, the mean/median and confidence intervals can be computed between different sets of data for the same subject - e.g., data collected at different times of the day and/or week. In one example, this can be done using a rolling moving average or exponentially weighted moving average (with a window length possibly up to 24 h) to show the progression of movement scores over time. The progression of movement scores over this window can also be used to stabilise the predicted movement scores against energy and fatigue levels in the subject changing over the day. This can also account for differences in the animal's diurnal cycle and/or instances where the subject may be experiencing Frenetic Random Activity Periods (FRAPs).
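By way of illustration only, a minimal Python sketch of this aggregation: a median with a simple confidence interval across per-record predictions, followed by an exponentially weighted moving average across sessions. The smoothing constant and example scores are assumptions for illustration.

    import numpy as np

    def aggregate(scores):
        """Median of per-record predictions plus a simple 95% interval;
        the interval width flags outliers such as a mid-activity trip."""
        scores = np.asarray(scores, dtype=float)
        median = np.median(scores)
        half_width = 1.96 * scores.std(ddof=1) / np.sqrt(len(scores))
        return median, (median - half_width, median + half_width)

    def ewma(session_scores, alpha=0.3):
        """Exponentially weighted moving average across sessions,
        smoothing diurnal energy/fatigue swings."""
        out, level = [], session_scores[0]
        for s in session_scores:
            level = alpha * s + (1 - alpha) * level
            out.append(level)
        return out

    per_record = [3.2, 3.0, 3.4, 5.9, 3.1]   # 5.9: subject tripped
    print(aggregate(per_record))
    print(ewma([3.1, 3.0, 3.3, 3.2, 2.8]))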
[00111] In case a movement score can be generated with high confidence at this step, the movement score is stored against the identifier of the subject along with a date and time stamp of when the movement score was generated. The movement score can then be forwarded to the diagnostic application 410 associated with the subject at step 616.
[00112] Once the diagnostic application 410 receives the movement score, at step 618, it may display the movement score on a display of the output unit 108. In some examples, the processing unit 106 pushes the movement score to the diagnostic application 410 when the score is generated. Alternatively, when a user of the output unit 108 activates the diagnostic application 410 on the output unit 108, the diagnostic application 410 requests the processing unit 106 to forward the latest movement score for the subject associated with the diagnostic application 410.
[00113] In some examples, the diagnostic application 410 maintains a database of movement scores received for the subject over time. It may compare the received movement score with one or more immediately preceding scores and generate one or more insights based on the movement score. For example, if the movement score has improved from the previously generated score and the user of the application has input new medication details in the diagnostic application 410, the diagnostic application 410 may display the movement score and a message stating, e.g., "[the subject] appears to be doing much better - looks like the new medication is taking effect." Similarly, if the subject's movement score has declined since the last score was calculated, the diagnostic application 410 may display the movement score and a message stating, e.g., "[the subject's] condition appears to be deteriorating. You may wish to book an appointment with the vet." Alternatively or in addition, the diagnostic application 410 may be configured to display a chart of the movement scores for the subject over a period of time. It will be appreciated that these are just some example ways in which the movement score may be displayed to a user of the diagnostic application 410 and that any other known techniques for displaying movement scores may be used (based on the particular application) without departing from the scope of the present disclosure.
Alternative embodiments
[00114] In the embodiments described above, the output unit 108 is described as a mobile phone or tablet (or other portable device) owned by a pet owner or a veterinary specialist, with the diagnostic application 410 installed and stored on the portable device. Further, the subject being diagnosed is registered with the diagnostic system 100 through the portable device and once registered, the pet owner or a veterinary specialist does not need to re-register or login each time the diagnostic process of Fig. 6 needs to be performed.
[00115] In an alternative embodiment, the functions of the output unit 108 are performed by the processing unit 106. In this case, the diagnostic application 410 is installed on the processing unit 106 and the processing unit 106 includes a user input device (such as keyboard or keypad) and a display. When a user accesses the diagnostic application 410 on the processing unit 106, the display of the processing unit 106 displays a login page, where the user can enter their login details such as a user name and password. If the user is using the diagnostic system 100 for the first time, the user first registers a subject with the diagnostic system.
[00116] In yet another alternative embodiment, the activity detection module 316 is part of the sensing unit 104 and not part of the processing unit 106. In this case, the sensor data may be assessed in real time, and if the activity detection module 316 determines that the activity the subject is currently engaged in is immaterial, the activity detection module 316 sends a signal to the embedded controller 206, which instructs the sensors 202 to enter a sleep mode for a predetermined period of time. For example, if the sensing unit 104 includes both an accelerometer 220 and a gyroscope 222, the embedded controller 206 may instruct the gyroscope 222 to enter a sleep mode. The accelerometer 220 may be instructed to measure data at a reduced sampling rate. This way, battery life of the sensing unit 104 can be extended and preserved. If, during this state, the values sensed by the accelerometer 220 change, the activity detection module 316 sends a signal to the embedded controller 206, which instructs the sensors 202 to exit the sleep/reduced sample rate modes. In another example, if the activity detection module 316 determines that the sensor data being assessed relates to an immaterial activity, the activity detection module 316 may instruct the embedded controller 206 to stop storing the sensor values. When relevant activity is identified, the activity detection module 316 may reverse the instruction - causing the embedded controller to start storing sensor values again. In this way, storage space in the sensing unit 104 is utilized in an efficient manner, such that only relevant sensor values are stored.
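By way of illustration only, a minimal Python sketch of this on-device duty cycling. The sensor driver calls (sleep, wake, set_sample_rate_hz) and the sampling rates are hypothetical stand-ins for whatever embedded API the sensing unit actually exposes; the stub class exists only to make the sketch executable.

    def on_activity_classified(activity, gyro, accel):
        """Duty-cycle the sensors based on the on-device classification:
        immaterial activity -> gyroscope sleeps, accelerometer slows down;
        relevant activity -> both return to full measurement rate."""
        if activity == "irrelevant":
            gyro.sleep()                    # stop the power-hungry sensor
            accel.set_sample_rate_hz(10)    # reduced rate, enough to detect
        else:                               # a return to relevant movement
            gyro.wake()
            accel.set_sample_rate_hz(100)   # full measurement rate

    class _StubSensor:                      # stand-in for the embedded driver
        def sleep(self): print("sleep")
        def wake(self): print("wake")
        def set_sample_rate_hz(self, hz): print(f"rate={hz}")

    on_activity_classified("irrelevant", _StubSensor(), _StubSensor())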
[00117] Another advantage of incorporating the activity detection module 316 in the sensing unit 104 relates to efficiency. The activity detection module 316 may maintain, in real time, a count of the number of activities identified in the sensor data and a count of the number of instances of each activity identified and recorded. When the threshold number of activities has been detected and the threshold number of instances of each activity has been identified, the activity detection module 316 can instruct the embedded controller 206 to enter a sleep/inactive mode. In this mode the sensing unit shuts off all operations and stops sensing data - thereby again conserving power, preserving battery life of the sensing unit, and reducing the amount of data recorded and stored in the memory 208.
[00118] Further still, as described previously, in some embodiments, movement-related metrics need not be computed. Instead, sensed activity data may be directly fed to the movement base model. In such cases, the movement base model is trained based on sensor activity data and not metrics data. For instance, in such cases, in method 500, step 508 may be omitted. Instead, the training data set at step 510 may include activity based sensor data and associated movement scores. Further, the movement base model learns a mapping function between the input variables/features (sensed activity data in this case) and an output variable (the movement score). Further, the goal of the machine learning algorithm is to approximate the mapping function such that when new sensor values are provided to the machine learned model, the model can predict the movement score for that data. In such cases, during diagnosis, again, step 612 may be omitted. Instead, the movement score for the animal being evaluated may be determined by providing sensed activity data to the movement model, which applies the approximated mapping function to generate a condition score for the animal based on the sensed activity data.
[00119] FIG. 9 illustrates an example schematic of a processing device 900 suitable for implementing aspects of the disclosed technology including one or more of the modules and/or applications described above. That is, as noted above, the processing device 900 comprising the processing unit 106, output unit 108, and/or sensing unit 104 may comprise any of the foregoing description or may be according to the processing device shown in Fig. 9. The processing device 900 includes one or more processor unit(s) 902, memory 904, a display 906, and other interfaces 908 (e.g., buttons). The memory 904 generally includes both volatile memory (e.g., RAM) and non-volatile memory (e.g., flash memory). An operating system 910, such as the Microsoft Windows® operating system, the Apple macOS operating system, or the Linux operating system, resides in the memory 904 and is executed by the processor unit(s) 902, although it should be understood that other operating systems may be employed.
[00120] One or more applications 912 are loaded in the memory 904 and executed on the operating system 910 by the processor unit(s) 902. Applications 912 may receive input from various local input devices such as a microphone 934 or an input accessory 935 (e.g., keypad, mouse, stylus, touchpad, joystick, instrument-mounted input, or the like). Additionally, the applications 912 may receive input from one or more remote devices, such as remotely located smart devices, by communicating with such devices over a wired or wireless network using one or more communication transceivers 930 and an antenna 938 to provide network connectivity (e.g., a mobile phone network, Wi-Fi®, Bluetooth®). The processing device 900 may also include various other components, such as a positioning system (e.g., a global positioning satellite transceiver), one or more accelerometers, one or more cameras, an audio interface (e.g., the microphone 934, an audio amplifier and speaker and/or audio jack), and storage devices 928. Other configurations may also be employed.
[00121] The processing device 900 further includes a power supply 916, which is powered by one or more batteries or other power sources and which provides power to other components of the processing device 900. The power supply 916 may also be connected to an external power source (not shown) that overrides or recharges the built-in batteries or other power sources.
[00122] In an example implementation, an activity detection module 950, calculator module 952, and/or diagnostic application 954 as described above may be embodied by instructions stored in the memory 904 and/or the storage devices 928 and processed by the processor unit(s) 902. The memory 904 may be the memory of a host device or of an accessory that couples to the host.
[00123] The processing device 900 may include a variety of tangible processor-readable storage media and intangible processor-readable communication signals. Tangible processor- readable storage can be embodied by any available media that can be accessed by the processing device 900 and includes both volatile and non-volatile storage media, removable and non removable storage media. Tangible processor-readable storage media excludes intangible communications signals and includes volatile and non-volatile, removable and non-removable storage media implemented in any method or technology for storage of information such as processor-readable instructions, data structures, program modules or other data. Tangible processor-readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store the desired information and which can be accessed by the processing device 900. In contrast to tangible processor-readable storage media, intangible processor-readable communication signals may embody processor- readable instructions, data structures, program modules or other data resident in a modulated data signal, such as a carrier wave or other signal transport mechanism. The term "modulated data signal" means an intangible communications signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, intangible communication signals include signals traveling through wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.
[00124] Some implementations may comprise an article of manufacture. An article of manufacture may comprise a tangible storage medium to store logic. Examples of a storage medium may include one or more types of processor-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, operation segments, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. In one implementation, for example, an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described implementations. The executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. The executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain operation segment. The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
[00125] The flowcharts illustrated in the figures and described above define operations in particular orders to explain various features. In some cases the operations described and illustrated may be able to be performed in a different order to that shown/described, one or more operations may be combined into a single operation, a single operation may be divided into multiple separate operations, and/or the function(s) achieved by one or more of the described/illustrated operations may be achieved by one or more alternative operations. Still further, the functionality/processing of a given flowchart operation could potentially be performed by different systems or applications.
[00126] Unless otherwise stated, the terms “include” and "comprise" (and variations thereof such as “including”, “includes”, "comprising", "comprises", "comprised" and the like) are used inclusively and do not exclude further features, components, integers, steps, or elements.
[00127] It will be understood that the embodiments disclosed and defined in this specification extend to alternative combinations of two or more of the individual features mentioned in or evident from the text or drawings. All of these different combinations constitute alternative embodiments of the present disclosure.
[00128] The present specification describes various embodiments with reference to numerous specific details that may vary from implementation to implementation. No limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should be considered as a required or essential feature. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims

1. A computer-implemented method for evaluating a movement-related condition in an animal, the method comprising: at a computing device: receiving sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities; determining the two or more prescribed movement-related activities; slicing the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement- related activity; for each data slice, calculating one or more movement-related metrics; providing the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and receiving a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
2. The method of claim 1, wherein the movement base model is a model created by utilising a supervised machine learning algorithm on a set of training data, the training data including one or more movement-related metrics values calculated for multiple subjects and corresponding movement scores assigned to the multiple subjects.
3. The method of claim 2, wherein the movement base model is trained to generate a mapping function between the movement-related metrics and the two or more movement scores based on the set of training data; and wherein the movement base model determines the predicted movement score by mapping the calculated movement-related metrics to a movement score of the two or more movement scores using the generated mapping function.
4. The method of claim 3, wherein the supervised machine learning algorithm is a support vector machine (SVM) algorithm.
5. The method of claim 1, wherein determining the two or more prescribed movement-related activities includes identifying the two or more prescribed movement-related activities from the sensor data using a trained activity model.
6. The method of claim 5, wherein slicing the sensor data includes: identifying multiple instances of the two or more prescribed movement-related activities; and slicing the sensor data such that each slice corresponds to an instance of an identified movement-related activity.
7. The method of claim 1, wherein the movement-related metrics include at least one of spectral arc length acceleration, spectral arc length angular velocity, or dimensionless jerk measured in an animal's movements while performing the two or more prescribed movement-related activities.
8. The method of claim 7, wherein the one or more movement-related metrics indicate a level of smoothness of motion experienced by the animal when engaged in the corresponding movement-related activity.
9. The method of claim 1, wherein the sensor data includes linear acceleration measured along three axes and angular velocity measured along the three axes.
10. The method of claim 1, wherein the two or more movement scores include a range of movement scores between the first movement score and the second movement score.
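By way of illustration only, the slicing and per-slice metric calculation recited in claims 1-10 could be sketched as follows in Python. This is a minimal sketch under assumed conventions: the function names (slice_by_activity, evaluate), the per-sample activity labels, and the metric_fns/movement_base_model interfaces are assumptions of the sketch, not terms defined by this specification.

import numpy as np

def slice_by_activity(samples, labels):
    # Group consecutive samples that share an activity label, producing one
    # data slice per instance of a prescribed movement-related activity.
    slices, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            slices.append((labels[start], samples[start:i]))
            start = i
    return slices

def evaluate(samples, labels, metric_fns, movement_base_model):
    # One metric vector per data slice, passed to an already-trained model
    # that returns a predicted movement score.
    features = [[fn(segment) for fn in metric_fns]
                for _activity, segment in slice_by_activity(samples, labels)]
    return movement_base_model.predict(np.asarray(features))

Here movement_base_model stands in for any trained classifier exposing a predict() method, such as the SVM sketched after the claims.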
11. A system for evaluating a movement-related condition in an animal, the system comprising a processing unit configured to:
receive sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities;
determine the two or more prescribed movement-related activities;
slice the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity;
for each data slice, calculate one or more movement-related metrics;
provide the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and
receive a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
12. The system of claim 11, further comprising a sensing unit including the one or more sensors.
13. The system of claim 12, wherein the one or more sensors include a tri-axial accelerometer and a tri-axial gyroscope.
14. The system of claim 11, further comprising an output unit configured for receiving the predicted movement score from the processing unit and displaying the predicted movement score on a display of the output unit.
15. The system of claim 11, wherein the movement base model is a model created by utilizing a supervised machine learning algorithm on a set of training data, the training data including one or more movement-related metrics values calculated for multiple subjects and corresponding movement scores assigned to the multiple subjects.
16. The system of claim 15, wherein the movement base model is trained to generate a mapping function between the movement-related metrics and the two or more movement scores based on the set of training data; and wherein the movement base model determines the predicted movement score by mapping the calculated movement-related metrics to a movement score of the two or more movement scores using the generated mapping function.
17. The system of claim 15, wherein the supervised machine learning algorithm is a support vector machine (SVM) algorithm.
18. The system of claim 11, wherein the system includes an activity detection unit configured to determine the two or more prescribed movement-related activities by identifying the two or more prescribed movement-related activities from the sensor data, using a trained activity model.
19. The system of claim 18, wherein the activity detection unit is configured to slice the sensor data, and wherein, to slice the sensor data, the activity detection unit is configured to: identify multiple instances of the two or more prescribed movement-related activities; and slice the sensor data such that each slice corresponds to an instance of an identified prescribed movement-related activity.
20. The system of claim 11, wherein the movement-related metrics include at least one of spectral arc length acceleration, spectral arc length angular velocity, or dimensionless jerk measured in an animal's movements while performing the two or more prescribed movement-related activities.
21. The system of claim 11, wherein the one or more movement-related metrics indicate a level of smoothness of motion experienced by the animal when engaged in the corresponding movement-related activity.
22. The system of claim 11, wherein the two or more movement scores include a range of movement scores between the first movement score and the second movement score.
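Claims 20 and 21 (like claims 7 and 8) treat spectral arc length and dimensionless jerk as measures of movement smoothness. A minimal sketch of both metrics, following their general definitions in the movement-smoothness literature, is given below; the cut-off frequency, zero-padding and scaling choices are illustrative assumptions rather than values taken from this specification.

import numpy as np

def spectral_arc_length(signal, fs, fc=10.0):
    # Arc length of the normalised magnitude spectrum below cut-off fc;
    # smoother movement yields a shorter arc (a less negative value).
    n = max(len(signal), int(fs / 0.05))     # zero-pad for finer resolution
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mag = np.abs(np.fft.rfft(signal, n))
    mag = mag / mag.max()                    # normalise the spectrum
    sel = freqs <= fc
    f_norm = freqs[sel] / fc                 # normalised frequency axis
    return -np.sum(np.sqrt(np.diff(f_norm) ** 2 + np.diff(mag[sel]) ** 2))

def dimensionless_jerk(speed, fs):
    # Velocity-based dimensionless jerk; more negative means less smooth.
    dt = 1.0 / fs
    duration = len(speed) * dt
    jerk = np.gradient(np.gradient(speed, dt), dt)   # d^2(speed)/dt^2
    return -(duration ** 3 / np.max(speed) ** 2) * np.trapz(jerk ** 2, dx=dt)

Applied to an acceleration-magnitude signal, the first function yields a spectral-arc-length-of-acceleration style metric; applied to an angular-velocity magnitude, it yields the angular-velocity counterpart, consistent with the metric names recited in the claims.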
23. One or more tangible processor-readable storage media embodied with instructions for executing, on one or more processors and circuits of a device, a process for evaluating a movement-related condition in an animal, the process comprising:
receiving sensor data from one or more sensors associated with the animal, the one or more sensors measuring movement-related parameters while the animal is engaged in two or more prescribed movement-related activities;
determining the two or more prescribed movement-related activities;
slicing the sensor data based on the determined prescribed movement-related activities such that each data slice corresponds to a respective prescribed movement-related activity;
for each data slice, calculating one or more movement-related metrics;
providing the calculated one or more movement-related metrics for each data slice to a movement base model that is trained to classify movement-related metrics into two or more movement scores, wherein the movement scores are associated with a movement-related condition of an animal, and wherein a first movement score indicates a movement-related condition in the animal and a second movement score indicates a healthy animal; and
receiving a predicted movement score based on the movement base model, the predicted movement score indicating the movement-related condition of the animal.
24. The one or more tangible processor-readable storage media of claim 23, wherein the movement base model is a model created by utilizing a supervised machine learning algorithm on a set of training data, the training data including one or more movement-related metrics values calculated for multiple subjects and corresponding movement scores assigned to the multiple subjects.
25. The one or more tangible processor-readable storage media of claim 24, wherein the movement base model is trained to generate a mapping function between the movement-related metrics and the two or more movement scores based on the set of training data; and wherein the movement base model determines the predicted movement score by mapping the calculated movement-related metrics to a movement score of the two or more movement scores using the generated mapping function.
26. The one or more tangible processor-readable storage media of claim 25, wherein the supervised machine learning algorithm is a support vector machine (SVM) algorithm.
27. The one or more tangible processor-readable storage media of claim 23, wherein determining the two or more prescribed movement-related activities includes identifying the two or more prescribed movement-related activities from the sensor data using a trained activity model.
28. The one or more tangible processor-readable storage media of claim 27, wherein slicing the sensor data includes: identifying multiple instances of the two or more prescribed movement-related activities; and slicing the sensor data such that each slice corresponds to an instance of an identified movement-related activity.
29. The one or more tangible processor-readable storage media of claim 23, wherein the movement-related metrics include at least one of spectral arc length acceleration, spectral arc length angular velocity, or dimensionless jerk measured in an animal's movements while performing the two or more prescribed movement-related activities.
30. The one or more tangible processor-readable storage media of claim 29, wherein the one or more movement-related metrics indicate a level of smoothness of motion experienced by the animal when engaged in the corresponding movement-related activity.
31. The one or more tangible processor-readable storage media of claim 23, wherein the sensor data includes linear acceleration measured along three axes and angular velocity measured along the three axes.
32. The one or more tangible processor-readable storage media of claim 23, wherein the two or more movement scores include a range of movement scores between the first movement score and the second movement score.
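Claims 24-26 (like claims 2-4 and 15-17) describe creating the movement base model by supervised learning over metric values and assigned movement scores, optionally with an SVM. A minimal sketch of that training step, assuming a scikit-learn pipeline and entirely placeholder feature and score values (none of which come from this specification), might read:

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# One row of metric values per subject, e.g. [spectral arc length of
# acceleration, spectral arc length of angular velocity, dimensionless
# jerk]; one assigned movement score per subject (here 0 = healthy,
# 1 = movement-related condition). All values are placeholders.
X_train = np.array([[-2.1, -1.9,  -40.0],
                    [-4.3, -4.0, -150.0],
                    [-2.3, -2.2,  -55.0],
                    [-4.6, -4.4, -170.0]])
y_train = np.array([0, 1, 0, 1])

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)            # learns the metrics-to-score mapping

new_metrics = [[-3.8, -3.5, -120.0]]   # metric vector for a new animal
predicted_movement_score = model.predict(new_metrics)

The fitted pipeline plays the role of the generated mapping function: predict() maps a calculated metric vector onto one of the two or more movement scores.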
PCT/US2021/019269 2020-02-24 2021-02-23 Animal health evaluation system and method WO2021173571A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202062980749P 2020-02-24 2020-02-24
US62/980,749 2020-02-24

Publications (1)

Publication Number Publication Date
WO2021173571A1 (en) 2021-09-02

Family

ID=77490414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/019269 WO2021173571A1 (en) 2020-02-24 2021-02-23 Animal health evaluation system and method

Country Status (1)

Country Link
WO (1) WO2021173571A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090132446A1 (en) * 2003-08-29 2009-05-21 Milenova Boriana L Support Vector Machines Processing System
US20060000420A1 (en) * 2004-05-24 2006-01-05 Martin Davies Michael A Animal instrumentation
US20100321189A1 (en) * 2007-02-09 2010-12-23 David John Michael Gibson Monitoring and displaying activities
US20130328139A1 (en) * 2010-09-18 2013-12-12 Fairchild Semiconductor Corporation Micromachined monolithic 3-axis gyroscope with single drive

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230154623A1 (en) * 2021-11-17 2023-05-18 Fetch Insurance Services, Inc. Techniques for predicting diseases using simulations improved via machine learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21760325

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 18.10.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 21760325

Country of ref document: EP

Kind code of ref document: A1