US20230085511A1 - Method and system for heterogeneous event detection - Google Patents
- Publication number
- US20230085511A1 (application US18/046,968)
- Authority
- US
- United States
- Prior art keywords
- data
- time
- event
- sensor
- sensor data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B5/112—Gait analysis
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
- A43B17/00—Insoles for insertion, e.g. footbeds or inlays, for attachment to the shoe after the upper has been joined
- A43B3/34—Footwear characterised by the shape or the use with electrical or electronic arrangements
- A61B5/1036—Measuring load distribution, e.g. podologic studies
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/1118—Determining activity level
- A61B5/1123—Discriminating type of movement, e.g. walking or running
- A61B5/1126—Measuring movement of the entire body or parts thereof using a particular sensing technique
- A61B5/6807—Sensor mounted on worn items; Footwear
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
- A61B5/7253—Details of waveform analysis characterised by using transforms
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7282—Event detection, e.g. detecting unique waveforms indicative of a medical condition
- H04R25/50—Customised settings for obtaining desired overall acoustical characteristics
- H04R1/1041—Earpieces; mechanical or electronic switches, or control elements
- H04R2225/41—Detection or adaptation of hearing aid parameters or programs to listening situation, e.g. pub, forest
- H04R2225/61—Aspects relating to mechanical or electronic switches or control elements, e.g. functioning
- H04R2460/07—Use of position data from wide-area or local-area positioning systems in hearing devices, e.g. program or information selection
Definitions
- the present disclosure relates to heterogeneous event detection.
- Human motion, such as walking, running, and jumping, may be characterized as a series of separate events with generally predictable trends, such as plantar pressure at the end of a step and acceleration of a foot through a step. Counting such events within a period of time to monitor activity levels may have application in fitness, healthcare, or other contexts.
- wearable sensors, such as accelerometers, may be strapped onto an individual's wrist, foot, or core.
- pressure or other sensors may be fitted into an insole of a shoe. Time-based data from these sensors may be applied to event detection and characterization.
- sensor readings may not come in the form of simple waveforms. Activities, particularly those performed by people in motion, are not always regular, and automated analysis of the resulting data sets may not be straightforward. Identifying individual steps or other events may be complicated due to irregularity of motion and the signal-to-noise ratio that may accompany sensing and data transmission. Sensor data of a single step may include several local maxima and minima. Successive steps may differ in speed and pace, and in intensity of the landing (e.g. light or heavy).
- a person may perform heterogeneous activities, for instance, first walking for a number of steps, followed by running for a number of steps, tapping their feet, jumping, and resuming walking, walking on stairs or a ramp, or any number of other activities.
- step-detection systems include a processor programmed to analyze time-based sensor data series and identify a step based on peaks and troughs. Smoothing may be employed to eliminate noise effects, thresholds may be applied to discard small peaks or troughs, and enveloping may be applied for reducing data variation. Previous methods may apply a low pass filter or other approach with predetermined threshold parameters, which are in many cases set arbitrarily, and on an underlying assumption that the user is performing only an identified activity that will result in a consistent data profile on each occurrence.
- a low-pass Fourier transform or other filter is applied to sensor data series to capture the low-frequency component, which may result in a filtered data series that will show discrete events more definitively.
- a low-pass filter with a single cut-off frequency for an entire data set may not accurately detect events that vary in duration, events that vary in amplitude, events separated by variable amounts of time, or other heterogeneous activities.
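To make the contrast concrete, the conventional fixed-cutoff approach described above can be sketched in a few lines of NumPy. The 5 Hz cut-off, 100 Hz sampling rate, and test signal below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def lowpass_fixed_cutoff(x, fs, cutoff_hz):
    """Low-pass filter with one global cut-off frequency.

    Zeroes every FFT bin above cutoff_hz and inverts the transform.
    This is the kind of fixed-threshold filter the adaptive method is
    contrasted with: a single cut-off applied to the entire data set.
    """
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(x))

# A slow 1 Hz "step" rhythm plus 20 Hz jitter, sampled at 100 Hz.
fs = 100
t = np.arange(0, 4, 1.0 / fs)
signal = np.sin(2 * np.pi * 1 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)
smooth = lowpass_fixed_cutoff(signal, fs, cutoff_hz=5.0)
```

This works when every event has roughly the same duration; the point of the passage above is that one cut-off cannot serve events whose duration and spacing vary.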
- the method includes, and the system facilitates, acquiring data from sensors and processing the resulting sensor data to define events that are heterogeneous from occurrence to occurrence of the event.
- the method and system apply localized adaptive filtering through a local time-frequency transform, an inverse transform, and an adaptive filter mask.
- the adaptive filter mask is based on the time-frequency representation and is defined for time periods defining data windows, with reference to locally abundant frequencies.
- the method and system provide an adaptive approach that may be applied to detecting events that can be predicted in terms of trends in sensor data associated with the event, but that may not be consistent from one occurrence of an event to the next.
- trends such as plantar pressure following a step or other change in weight distribution on feet, acceleration or rotation during a step or other body movement, or changes in amplitude or frequency of a sound or collection of sounds may all be indicative of events that are generally predictable but that do not result in identical data on each occurrence.
- the events may be generally repetitive or recurring.
- the method and system may prompt a suggested change to improve performance of the event or outcome of the event.
- the method and system may update parameters of how the event is characterized or change parameters of how a device functions to change user experience in relation to the event or prepare for an outcome expected to follow the event.
- the method and system may be applied continuously, in real-time or in batch processing.
- the method and system may be applied using a pressure sensor, a gyroscope, an accelerometer, a thermometer, a humidity sensor or any suitable sensor or combination of sensors depending on the specific application.
- the method and system may be applied to detecting events that are steps or other defined body movements associated with various activities (e.g. walking, running, jumping, biking, skiing, swimming, martial arts, boxing, yoga, gymnastics, dancing, etc.), or portions of any such activities.
- the method and system may be applied to use on individuals, animals, robotics, unmanned or manned vehicles, or any suitable system.
- the method and system may be applied to optimize activities of a user or test subject, including by prompting changes (e.g. audio, visual or tactile alerts to change movement patterns), or by changing device function (e.g. by inflating or deflating bladders around an insole, changing stiffness of wrist or other joint braces, changing output from a hearing aid, etc.).
- Applying the localized adaptive filter to the sensor data may include segmenting the sensor data into data windows and converting the resulting data windows into a time-frequency representation using a transform such as the S-transform.
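For reference, a minimal FFT-based discrete Stockwell (S-) transform can be sketched as follows. This is a generic textbook formulation, not the patent's implementation; the zero-frequency row convention (the signal mean) is an assumption:

```python
import numpy as np

def stockwell_transform(x):
    """FFT-based discrete Stockwell (S-) transform.

    Returns an (N//2 + 1, N) complex array S in which row n holds the
    time-resolved content of frequency bin n: a frequency-dependent
    Gaussian window (wider for higher n) is applied to the shifted
    spectrum and inverted back to the time axis.
    """
    N = len(x)
    H = np.fft.fft(x)
    m = np.fft.fftfreq(N) * N            # signed FFT bin indices
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0, :] = np.mean(x)                 # zero-frequency row: signal mean
    for n in range(1, N // 2 + 1):
        window = np.exp(-2.0 * np.pi**2 * m**2 / n**2)
        S[n, :] = np.fft.ifft(np.roll(H, -n) * window)
    return S
```

A useful consistency check: summing each row of S over the time axis recovers the ordinary Fourier coefficient of that frequency.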
- in the time-frequency representation, the relative contributions of many frequencies of the sensor data profile can be represented for each time point.
- Adaptive localized filtering magnifies the most prominent frequencies at each time point and suppresses the least prominent frequencies in the resulting filtered data.
- the adaptive filtering may be based on a power of the magnitude of the time-frequency representation, resulting in greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies. The greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies may facilitate heterogeneous event detection.
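In symbols, one way to realize a mask "based on a power of the magnitude of the time-frequency representation" is the following; the per-time-point normalization and the exponent p are illustrative assumptions, since the disclosure states only that a power of the magnitude is used:

```latex
M(\tau, f) \;=\; \left( \frac{\lvert S(\tau, f) \rvert}{\max_{f'} \lvert S(\tau, f') \rvert} \right)^{p},
\qquad p \geq 1 .
```

A larger p widens the gap between prominent and weak frequencies; the filtered data is then recovered by applying the inverse transform to the product M(τ, f) · S(τ, f).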
- the adaptive localized filtering provides filtered data. Identifying events is easier from features of the filtered data than from features of the raw sensor data. Once the events are identified along the timeline of the filtered data, the events may be further characterized in either the filtered data or the sensor data. Characterization along the timeline of either the filtered data or the sensor data may include analysis of the derivative or the integral of the data.
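The segment-transform-mask-invert pipeline can be sketched end to end. Here a per-window FFT stands in for the local time-frequency transform of the disclosure, and the window length and magnitude exponent are illustrative assumptions:

```python
import numpy as np

def adaptive_localized_filter(x, window_len, power=2.0):
    """Localized adaptive filtering sketch.

    The data is segmented into windows; within each window the spectrum
    is weighted by a mask equal to a power of its own magnitude,
    normalized so the locally most abundant frequency passes unchanged.
    Prominent local frequencies are kept while weak ones are suppressed,
    with no globally fixed cut-off. A per-window FFT is used here as a
    simple stand-in for the S-transform.
    """
    out = np.empty_like(x, dtype=float)
    for start in range(0, len(x), window_len):
        w = x[start:start + window_len]
        spectrum = np.fft.rfft(w)
        mag = np.abs(spectrum)
        peak = mag.max()
        mask = (mag / peak) ** power if peak > 0 else np.zeros_like(mag)
        out[start:start + len(w)] = np.fft.irfft(spectrum * mask, n=len(w))
    return out
```

Because the mask is recomputed per window, a window dominated by a slow walking cadence and a later window dominated by a faster running cadence each keep their own locally abundant frequency.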
- a method and system for heterogeneous event detection are provided. Sensor data is obtained and divided into discrete data windows. Each data window is defined by and corresponds to a time period of the sensor data. A time-frequency representation over the time period is calculated for each data window. A filter mask is calculated based on the data window corresponding to the time-frequency representation. The filter mask is applied for reverting the time-frequency representation to a time representation, resulting in filtered data. Features, such as extrema or other inflection points, are identified in the filtered data. The features define events, and transforming the time-frequency representation back into the time domain emphasizes differences between more and less prominent frequencies, facilitating identification of heterogeneous events.
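Once filtered data is in hand, the feature-identification step reduces to locating extrema. A minimal sketch (one event per local maximum, e.g. one step per pressure peak, is an illustrative convention, not a rule from the disclosure):

```python
import numpy as np

def find_extrema(y):
    """Indices of local maxima and minima, found via sign changes in the
    first difference -- the 'features' from which events are defined.
    Flat plateaus (zero differences) are ignored in this sketch."""
    d = np.sign(np.diff(y))
    # +1 -> -1 sign change marks a local maximum; -1 -> +1 a local minimum
    maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
    minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
    return maxima, minima

def count_events(y):
    """Event count under the one-event-per-peak convention; adjacent
    minima can serve as the endpoints bounding each event."""
    maxima, _ = find_extrema(y)
    return len(maxima)
```

On raw sensor data a single step may contain several local maxima and minima; running the same extrema search on the filtered data is what makes each peak correspond to one event.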
- the method and system may be applied to body movements of people or animals, automaton movement, audio signals, light intensity, or any suitable time-dependent variable.
- a method for detecting heterogeneous events related to movement of an individual comprising: receiving sensor data of movement of the individual; defining a data window over a time period of the sensor data; calculating a time-frequency representation of the data window for providing a time-frequency representation corresponding to the time period; calculating a filter mask based on the time-frequency representation; filtering the time-frequency representation with the filter mask, providing filtered data; identifying features in the filtered data; identifying an event with reference to the features; and outputting the event.
- a method for detecting heterogeneous events comprising: receiving sensor data; defining a data window over a time period of the sensor data; calculating a time-frequency representation of the data window for providing a time-frequency representation corresponding to the time period; calculating a filter mask based on the time-frequency representation; filtering the time-frequency representation with the filter mask, providing filtered data; identifying features in the filtered data; identifying an event with reference to the features; and outputting the event.
- a system for detecting heterogeneous events comprising: a sensor for receiving sensor data; and a processor in communication with the sensor for receiving the sensor data; wherein the processor is configured to execute instructions for carrying out the methods described above.
- the sensor data comprises at least two different types of data; the sensor data comprises data of pressure, acceleration, rotation, seismic changes, temperature, humidity, or sound; receiving the sensor data is at a down-sampled rate to increase event detection speed; receiving the sensor data of a first data window is at a rate determined with reference to the filtered data of a second data window, the second data window preceding the first data window in time; the time period of the data window has a duration equal to additional data windows preceding or succeeding the data window; calculating a time-frequency representation of the data window comprises applying an S-transform to the sensor data within the data window; filtering the time-frequency representation with the filter mask comprises applying the filter mask to the time-frequency representation and applying the inverse S-transform to the product of the filter mask and the time-frequency representation; applying the filter mask to the S-transform comprises multiplying the time-frequency representation by the filter mask; calculating a time-frequency representation of the data window comprises applying a Gabor transform to the sensor data
- identifying the event comprises deriving an event count; identifying the event comprises defining the features as endpoints of the event; identifying the event comprises characterizing the sensor data between time points corresponding to the endpoints defined in the filtered data; characterizing the sensor data comprises deriving the sensor data; characterizing the sensor data comprises integrating the sensor data; identifying the event comprises characterizing the filtered data between the endpoints; characterizing the filtered data comprises deriving the filtered data; characterizing the filtered data comprises integrating the filtered data; identifying the event comprises defining a pulse width, a cycle, a ground contact time, a center of pressure, or a path of center of pressure of the sensor data or the filtered data; outputting the event comprises communicating data relating to the event to an individual; outputting the event comprises prompting an individual to change behavior; outputting the event comprises storing data relating to the event in a computer readable medium; outputting the event comprises outputting contribution to the sensor data from each of
- FIG. 1 is a block diagram of an event detection system
- FIG. 2 is a flowchart of a method for detecting heterogeneous events
- FIG. 3 is a flowchart of a method for detecting heterogeneous events
- FIG. 4A is an example pressure data series filtered with two Fourier transforms at two cut-off frequencies;
- FIG. 4B is the data series of FIG. 4A showing only one of the Fourier transforms and the sensor data;
- FIG. 4C is the data series of FIG. 4A showing only the other of the Fourier transforms and the sensor data;
- FIG. 5 is an S-transform time-frequency plot of the data series of FIG. 4A;
- FIG. 6A is the data series of FIG. 3 filtered with two adaptive, localized filters, one based on the magnitude of the S-transform shown in FIG. 5 and another based on the square of the magnitude of the S-transform shown in FIG. 5;
- FIG. 6B is the data series of FIG. 6A showing only the filter based on the magnitude of the S-transform shown in FIG. 5 and the sensor data;
- FIG. 6C is the data series of FIG. 6A showing only the filter based on the square of the magnitude of the S-transform shown in FIG. 5 and the sensor data;
- FIG. 6D is the data series of FIG. 6A showing only the filter based on the magnitude of the S-transform shown in FIG. 5;
- FIG. 6E is the data series of FIG. 6A showing only the filter based on the square of the magnitude of the S-transform shown in FIG. 5;
- FIG. 7A shows S-transform magnitudes and corresponding signals filtered with adaptive localized filtering based on the magnitude of the S-transform, from 0 to 250 seconds of the data shown in FIGS. 4A to 6E;
- FIG. 7B shows S-transform magnitudes and corresponding signals filtered with adaptive localized filtering based on the magnitude of the S-transform, from 50 to 300 seconds of the data shown in FIGS. 4A to 6E;
- FIG. 7C shows S-transform magnitudes and corresponding signals filtered with adaptive localized filtering based on the magnitude of the S-transform, from 100 to 350 seconds of the data shown in FIGS. 4A to 6E;
- FIG. 8 shows an event detection system
- FIG. 9 is a plot of idealized sensor data of the event detection system of FIG. 8;
- FIG. 10 is a plot of summation data based on the sensor data of FIG. 9;
- FIGS. 11a to 11f are plots of filtered data based on the sensor data of FIG. 9, shown together with the sensor data of FIG. 9;
- FIG. 12 is a plot of the first derivative of the sensor data of FIG. 9 showing features of an event
- FIG. 13 shows an event detection system
- FIG. 14 is a plot of idealized sensor data of the event detection system of FIG. 13;
- FIG. 15 shows an event detection system
- FIG. 16 is a plot of idealized sensor data of the event detection system of FIG. 15;
- FIG. 17 is a plot of idealized filtered sensor data of the event detection system of FIG. 15.
- FIG. 18 shows an event detection system
- FIG. 19 is a plot of idealized sensor data of the event detection system of FIG. 18;
- FIG. 20 shows an event detection system
- the method includes, and the system facilitates, acquiring data from sensors and processing the resulting sensor data to define events that are heterogeneous from occurrence to occurrence of the event.
- the method and system apply localized adaptive filtering through a local time-frequency transform, an inverse transform, and an adaptive filter mask.
- the adaptive filter mask is based on the time-frequency representation and is defined for time periods defining data windows, with reference to locally abundant frequencies.
- the method and system provide an adaptive approach that may be applied to detecting events in real-time or in batch processing.
- the filter mask is determined with respect to local conditions around an event, avoiding the need for arbitrary thresholds, which are applied in many other commonly used detection techniques.
- the method and system provide a robust and adaptive solution for detecting movements of people, animals, or automata, detecting sounds, or receiving data associated with any other type of event that may not be consistent from one occurrence of the event to the next.
- Each individual event may be characterized and characteristics of the event may be defined.
- the characteristics may be used in further analysis of the sensor data, including duty cycles of events and statistics over those characteristics. Event detection may be facilitated by identifying characteristics within the event other than the duration and boundaries of the event. Such characteristics may include the pulse width of each event, the contribution from multiple sensors during an event, and the duty cycle if the events are periodic. These characteristics describe the event and may be used to verify the event detection process.
- the characteristics may be provided to a user of the method and system as event data.
- the method and system may prompt a suggested change to improve performance of the event or outcome of the event.
- the method and system may update parameters of how the event is characterized or change parameters of how a device functions to change user experience in relation to the event or prepare for an outcome expected to follow the event.
- Applying the localized adaptive filter to the sensor data may include segmenting the sensor data into data windows and converting the resulting data windows into a time-frequency representation using a transform such as the S-transform, including a discretized S-transform.
- the relative contributions of many frequencies of the sensor data profile can be represented for each time point.
- Adaptive localized filtering magnifies the most prominent frequencies at each time point and suppresses the least prominent frequencies at each time point in the resulting filtered data.
- the adaptive filtering may be based on a power of the magnitude of the time-frequency representation, resulting in greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies. The greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies may facilitate heterogeneous event detection.
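- The power-of-magnitude weighting described above can be sketched briefly. The following is a minimal illustration rather than the patented method: a plain FFT stands in for the S-transform, and the test signal, window length and exponent are assumptions for demonstration only.

```python
import numpy as np

def adaptive_filter_window(window, power=2):
    """Weight each frequency bin by a power of its own (normalized)
    magnitude, so prominent frequencies pass and weak ones are suppressed.
    A plain FFT stands in here for the S-transform described above."""
    spectrum = np.fft.rfft(window)
    mag = np.abs(spectrum)
    mask = (mag / (mag.max() + 1e-12)) ** power  # non-binary, data-driven mask
    return np.fft.irfft(spectrum * mask, n=len(window))

# A noisy 8 Hz sine: the dominant frequency survives, broadband noise is suppressed.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
rng = np.random.default_rng(0)
noisy = np.sin(2 * np.pi * 8 * t) + 0.3 * rng.standard_normal(256)
filtered = adaptive_filter_window(noisy)
```

Because the mask is derived from the data itself, no fixed cutoff frequency needs to be chosen in advance.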
- an event is defined as a distinct waveform bound between two features along the timeline of the sensor data or filtered data, although as described above, locating the features is facilitated in the filtered data relative to locating the features in the sensor data.
- the features may include inflection points.
- the inflection points may be local extrema within the data, a derivative of the data, or an integral of the data.
- pulse width is the amount of time that the sensor data of the event is above a specified threshold, typically beyond a baseline level in a data series.
- the pulse width is defined as the timeline between the start and end times of an event. Sensor contribution can be studied within the pulse width. Definition of these characteristics facilitates detection and analysis of events, and of other aspects of the data set.
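- The pulse-width characteristic just described can be read directly off a data series. A minimal sketch, with the threshold and the idealized data assumed for illustration:

```python
import numpy as np

def pulse_width(data, times, threshold):
    """Return the time the series spends above a threshold: the
    'pulse width' characteristic described above."""
    above = np.flatnonzero(np.asarray(data) > threshold)
    if above.size == 0:
        return 0.0
    return times[above[-1]] - times[above[0]]

# Idealized square pulse from roughly 0.3 s to 0.7 s.
times = np.linspace(0.0, 1.0, 101)
data = np.where((times >= 0.3) & (times <= 0.7), 1.0, 0.0)
width = pulse_width(data, times, 0.5)  # roughly 0.4 s
```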
- Event detection acts as a preliminary step in analyzing input data.
- the information of interest in a data set includes dynamic events rather than the static moments.
- the event may be further analyzed. Analysis of the event and of features within the event is not possible without first detecting the event.
- An event of interest may be heterogeneous in nature, consisting of repetitive or non-repetitive occurrences. In the case of repetitive, recurring events, there may be high or low variability from event to event.
- the method and system may be applied to detecting events that are not consistent from one occurrence of the event to the next, and that may occur with inconsistent intervals between occurrences.
- the method and system may be applied to detecting events that are steps or other defined body movements associated with various activities (e.g. walking, running, jumping, biking, skiing, swimming, martial arts, boxing, yoga, gymnastics, dancing, etc.), or portions of any such activities.
- the method may include use of, and system may include, sensors that are worn on the feet, legs, wrists, arms, torso or other portions of the body, depending on the application.
- the sensors may include a pressure sensor, a gyroscope, an accelerometer, a seismograph, a thermometer, a humidity sensor, a microphone or other audio sensor, an optical sensor or any suitable sensor or combination of sensors depending on the specific application.
- Actuators or other output modules may drive changes in device function following detection.
- the method and system may be applied to optimize activities of a user or test subject, including by prompting changes (e.g. audio, visual or tactile alerts to change movement patterns), or by changing device function (e.g. by inflating or deflating bladders around an insole, changing stiffness of wrist or other joint braces, changing output from a hearing aid, etc.). These responses may facilitate improved performance of movements that are detected and characterized as an event. Particularly with application to body movements that are generally repetitive, coaching alerts and changes in device function may improve performance of the physical activity.
- Audio data may also be used in applications to improve device function, such as changing output from a hearing aid in response to changes in background noise or to audio data indicating that a known person is talking, or emphasizing or de-emphasizing certain frequencies in audio output to help a user hear higher or lower frequencies. Audio data may also prompt changes, for example to improve playing a musical instrument, singing, or speaking a language.
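- The frequency emphasis mentioned above for hearing-aid output can be illustrated with a toy equalizer. The band edges, gain and sampling rate below are assumptions, not values from this disclosure:

```python
import numpy as np

def emphasize_band(audio, rate, low_hz, high_hz, gain=2.0):
    """Boost one frequency band of an audio frame; a toy stand-in for
    the hearing-aid frequency adjustment described above."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / rate)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    spectrum[band] *= gain  # emphasize the chosen band only
    return np.fft.irfft(spectrum, n=len(audio))

rate = 1000
t = np.arange(rate) / rate  # one second of audio
audio = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 300 * t)
boosted = emphasize_band(audio, rate, 250, 350)  # doubles the 300 Hz tone
```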
- the method and system may be applied to gait analysis of a person, animal, or machine, by first detecting individual steps as events within a gait data set and further analyzing the features of each step event, and the statistics of the overall gait data set.
- a foot strike analysis may be applied to detecting the position on the foot where striking occurs during running or walking, providing coaching feedback that improves gait efficiency and reduces injury potential.
- a rate of pronation and supination may be detected within each step event, allowing for coaching to improve gait efficiency and reduce injury.
- events that are in turn defined by multiple discrete features may be identified, different types of events based on peaks may be characterized, and plots of the sensor data may be applied contextually to generate the time-frequency representation. These features may facilitate accurate definition of heterogeneous events.
- heterogeneous events may include changing speed, climbing stairs, walking on a ramp or other incline, tapping feet, or other events that may vary in unpredictable ways between footfalls or other events.
- Characteristics of each step may be defined, including the pulse width of the step, which may correlate to the ground contact time, and the center of pressure of the step.
- the sensor data may be represented by summations of multiple pressure sensors.
- the summations of the sensor data may be filtered by the heterogeneous event detection system, and the filtered data may be used to detect boundaries of step events based on features in the filtered data.
- the source data that is filtered to provide the filtered data may include pressure data from one or more sensors, and may also include data of acceleration, rotation, temperature, humidity or other data.
- the time bounds of the step events may be used for further analysis of the sensor data or other data.
- the pressure from each of the multiple sensors may be compared to one another for analysis.
- sensor data, filtered data or both for humidity, temperature or other aspects of the user's feet may be cross-referenced to the pressure data to more thoroughly characterize the event.
- with pressure data as the principal type of data, acceleration, rotation or tilt data may also be superimposed over the pulse width of the event to characterize the event other than by pressure.
- When characterizing a series of walking steps, three components may commonly be defined: the number of steps within the series, the ground contact time of each step, and the path of center of pressure of the series.
- the localized adaptive filtering and identification of features in the filtered data facilitates detecting the start time and end time of each individual step, and the number of steps in the series.
- the ground contact times may be calculated within the bounds of the start and end times for each step.
- the amount of time that a stance phase endures between swing phases defines a ground contact time.
- inflection points of the rising edge and falling edge of a step representing the onset and offset of the ground contact time respectively, may be detected by a number of means.
- ground contact onset and offset points may then be used as bounds in which to calculate centres of pressure for an individual step, to filter out the pressures that are recorded during the swing phase of the step. Any change in center of pressure measurements over multiple steps may yield the path of center of pressure for the walking data.
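- One simple reading of the inflection-point approach above is to take the steepest rising and falling points of a step's pressure pulse as contact onset and offset. The idealized Gaussian pressure pulse below is assumed for illustration:

```python
import numpy as np

def ground_contact_bounds(pressure, times):
    """Estimate contact onset/offset as the steepest rise (foot loading)
    and steepest fall (toe-off) of a single step's pressure pulse."""
    slope = np.gradient(pressure, times)
    onset = times[np.argmax(slope)]
    offset = times[np.argmin(slope)]
    return onset, offset, offset - onset  # onset, offset, ground contact time

# Idealized single step: a smooth pressure bump centred on the stance phase.
times = np.linspace(0.0, 1.0, 200)
pressure = np.exp(-((times - 0.5) / 0.12) ** 2)
onset, offset, gct = ground_contact_bounds(pressure, times)
```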
- Correlating the locations of the sensors relative to an individual's foot when measuring steps or other activities involving steps allows the sensor data recorded by each sensor to be used to define the center of pressure of the entire sensor system at any one instant in time using known center of pressure calculations. Similarly, the path of the center of pressure throughout each event may be determined throughout the entire pressure series. Additional characteristics of the sensor data may be calculated after detection of the step, including any events defined within the step, such as a heel strike event, a forefoot strike event, a ground contact with the ball of the foot event, and a toe-off event, each of which may be grouped into a ground contact event portion of the step event.
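- The known center of pressure calculations referred to above reduce to a pressure-weighted centroid over the sensor positions. The four-sensor insole layout below is hypothetical:

```python
import numpy as np

def center_of_pressure(positions, pressures):
    """Pressure-weighted centroid of the sensor array: the standard
    center-of-pressure calculation referred to above."""
    positions = np.asarray(positions, dtype=float)
    pressures = np.asarray(pressures, dtype=float)
    return (positions * pressures[:, None]).sum(axis=0) / pressures.sum()

# Hypothetical 4-sensor insole layout (x: medial-lateral, y: heel-toe, in cm).
positions = [(1.0, 2.0), (3.0, 2.0), (1.0, 10.0), (3.0, 10.0)]
heel_strike = center_of_pressure(positions, [5.0, 5.0, 0.5, 0.5])
toe_off = center_of_pressure(positions, [0.5, 0.5, 5.0, 5.0])
```

Evaluating this at each sample between the ground contact onset and offset traces the path of the center of pressure for a step.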
- the ground contact event portion may define a stance phase
- the remaining features of the data within a cycle may define a swing phase of each step.
- Identified features of the sensor data may be leveraged to analyze the sensor data for characteristics that may include the pulse width, the duty cycle, the ground contact time, the center of pressure of the step, the path of the center of pressure, and other statistics (e.g. mean, deviation, etc.) surrounding the features, and relating to various portions of the step event.
- characteristics of the step events may facilitate assessing the mean and deviation ground contact times, changes in center of pressure, changes in the path of the center of pressure, or other characteristics of a step event.
- These features of step events may be detected by pressure sensors, accelerometers or any other suitable system for detecting movement of a foot and contact of a sole with a walking surface.
- the method and system may include adjusting upstream or downstream aspects of the system's functionality to change how the data is processed. For example, if a series of jumps is detected or the pace of a run increases, the method may increase its sampling rate, while if the user appears to be sitting down, the sampling rate may decrease to save battery life. Functional responses of the system to these changes allow for low-power modes, elongating battery life, while still allowing for high frequency sampling during events in order to provide useful information for users in terms of coaching for avoiding injury, coaching for performance, research or any suitable application.
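- The low-power behaviour described above amounts to choosing a sampling rate from the detected activity. The activity labels and rates below are illustrative assumptions:

```python
def choose_sampling_rate(activity, base_hz=50):
    """Pick a sensor sampling rate from the detected activity: high rates
    for fast, high-impact events, low rates at rest to save battery.
    Activity labels and rates here are illustrative only."""
    rates = {"sitting": 5, "walking": base_hz, "running": 200, "jumping": 400}
    return rates.get(activity, base_hz)
```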
- the method and system may also prompt the user in response to an event.
- the prompt may be in the form of coaching for athletic performance, to avoid injury or for other reasons.
- the prompting may be communicated to the user in any manner that is appropriate to the application.
- Visual, audio or tactile feedback triggered by event detection may provide clear suggestions on how to improve performance or avoid injury.
- the feedback may be neuroplastic, such as for events detected at a region of interest that has limited or no sensation. Providing a tactile feedback to an area that does have sensation may train the user to intuitively recognize and react to that tactile feedback. For example, pressure, acceleration, rotation and temperature sensors may be placed on the foot of a neuropathic patient who has loss of feeling in their feet.
- the sensors and system may detect a step and convey a signal depicting the detection of the step to a transducer mounted on the patient's back as a vibrational signal that the patient can feel.
- the event may also include a spike in temperature that the patient may not be able to feel and recoil from.
- movements may be detected on the user's head, neck, hands, arms, torso, legs, feet or any suitable combination.
- sensors elsewhere on the body may include pressure, acceleration, rotation, temperature, humidity or any suitable sensor.
- the sensors may be grouped for processing data singularly from several sensors, and may be grouped according to the location of sensors on the user's body. In some cases, parallel sets of events are characterized on the same timeline. For example, data relating to pressure, acceleration, rotation, temperature and humidity on a user's arms and hands may be assessed through a first group of sensors, and data relating separately to pressure, acceleration, rotation, temperature and humidity on a user's legs and feet may be assessed through a second group of sensors.
- the first and second groups of sensors may provide data for coaching basketball, football, hockey, soccer, swimming, bicycling or any sport where performance optimization and stress injury may be avoided.
- the event may also include slipping on a slippery surface, detected by an accelerometer but which the patient may not be able to feel and respond to.
- the system may detect small, perhaps unnoticed events by pressure, acceleration and rotation measurements in the foot, leg, and torso. Detection and analysis of these events may provide some insights into fall probability, allowing for feedback to the user to change behaviour and prevent the fall, or to trigger an alarm or notify an emergency contact immediately preceding or following the fall.
- Where the method and system are directed to detecting background noise, spoken word, musical performance or other audio input data, events may be identified, different types of events based on peaks may be characterized, and plots of the sensor data may be applied contextually to generate the time-frequency representation. These features may facilitate accurate definition of heterogeneous events based on audio data.
- features such as signal amplitudes at various frequencies may be used to modulate output of the hearing aid to eliminate background noise or focus on a given person speaking.
- audio input data may characterize musical or language performance and provide coaching on that basis. Audio input data may also be included with step detection or other human performance applications to better characterize events and features. For step detection and musical coaching, both audio data and pressure, acceleration or rotation data may be referenced to avoid injury.
- Characteristics of words or passages of music may be defined, including the frequencies and amplitudes of the sounds to detect individual syllables or notes. Relationships between individual syllables or notes may also be characterized to define measures, choruses or other passage of music. Consistency of the words and notes, and consistency of tempo and rhythm, may be characterized. Similarly, background noise may be characterized by amplitudes and known repetitive sounds or white noise to identify an environment or social situation to adjust hearing aid output. Summations of the audio sensor data may be filtered by the heterogeneous event detection system, and the filtered data may be used to detect boundaries of verbal, musical, background noise, physical activity of the user, or other events based on features in the filtered data.
- the source data that is filtered to provide the filtered data may include audio data from one or more sensors, and may also include data of pressure, acceleration, rotation, temperature, humidity or other data depending on the application.
- the time bounds of the step events may be used for further analysis of the sensor data or other data.
- audio inputs from each of multiple sensors may be compared to one another for analysis, and both the source data and the filtered data may be assessed for identification of heterogeneous events.
- the performer's physical posture and the music may be assessed separately to both improve performance and avoid injury.
- FIG. 1 shows a system 10 that may be used to implement a method for defining heterogeneous events.
- the system 10 may include a data source 20 from which a processing module 30 receives and processes data, allowing detection and visualization of heterogeneous events. Downstream functionality of the system 10 in response to data after processing by the processing module 30 is directed by an event data module 40 , which may store processed data, communicate processed data to a user or change operating parameters of the system 10 in response to processed data.
- the data source 20 may include one or more sensors for receiving data of different types of stimulus.
- the data source 20 may also include stored data or simulated data for modelling and optimization.
- the data source 20 shown includes a first sensor 22 and a second sensor 24 .
- a data source may include only the first sensor, such as the data source 420 of FIG. 15 .
- the first sensor 22 receives a first stimulus 12 , resulting in the first data 26 .
- the second sensor 24 receives a second stimulus 14 , resulting in the second data 28 .
- the first sensor 22 and the second sensor 24 may be the same types of sensors for detecting the same types of data, or may be a combination of multiple types of sensors.
- the first sensor 22 , the second sensor 24 , or both may include a pressure sensor, a gyroscope, an accelerometer, a seismograph, a thermometer, a humidity sensor, or any suitable sensor or combination of sensors depending on the specific application.
- the first data 26 , the second data 28 , or both, and correspondingly the first stimulus 12 , the second stimulus 14 , or both, may include measured pressure, acceleration, rotation, seismic signals, temperature, humidity, or any other data.
- the data source 20 may include a shoe-insert, such as the data source 220 of FIG. 8 , the data source 320 of FIG. 13 or the data source 420 of FIG. 15 .
- a shoe insert facilitates measuring the applied pressure at specific portions of a user's foot during walking, running, jumping, biking, skiing, or other activities.
- the first sensor 22 and the second sensor 24 may be combined in a sensor array included in the shoe-insert, such as the first sensor 222 and the second sensor 224 of FIG. 8 .
- the data source may include one or more sensors on a glove, wristband, armband, elbow pad, headband, hairclip, torso harness, shirt, halter, belt, earpiece, ankle bracelet, leg band, knee pad, or any suitable location (not shown), any of which may be designed for an individual, an animal, an automaton, a prosthetic, or any suitable location for a given application.
- the data source may include one or more sensors on a component of a robotic system, an unmanned or manned vehicle, or any suitable system.
- the data source 20 may include a glove, such as the data source 620 of FIG. 20 , or any other suitable wearable data source.
- the data source 20 may also include an audio detection device such as the cochlear implant 620 of FIG. 18 .
- the data source 20 may also include a sensory enhancement device, such as the cochlear implant 620 of FIG. 18 , or a tactile output such as in the watch 320 of FIG. 13 .
- the processing module 30 receives sensor data 27 from the data source 20 as a time-varying dataset.
- the sensor data 27 may include the first data 26 , the second data 28 , or both.
- the processing module 30 may be on a smartphone, smartwatch, tablet, computer, or other static, portable or wearable device. Communication between the data source 20 and the processing module 30 may be through any wired or wireless connection.
- the processing module 30 may be included in a single unit with the data source 20 .
- the data source 20 may be connected to the processing module 30 directly or through an intermediary storage and transmission device 29 for providing temporary storage of the data, depending on the particular application of the system 10 .
- the processing module 30 applies an adaptive localized filter process 32 to the sensor data 27 based on parameters 34 , resulting in filtered data 35 .
- the parameters 34 include an adaptive localized filter that is calculated with reference to a transform of the sensor data 27 .
- the processing module 30 applies an event detection process 36 to the filtered data 35 to identify and characterize events, providing event data 37 .
- the adaptive localized filter process 32 may be applied to the first sensor data 26 and the second sensor data 28 simultaneously as a combined data set, or to the first sensor data 26 and the second sensor data 28 separately.
- the event data module 40 may include an output module 42 for communicating the event data 37 to a user or effecting a change to operation of the system 10 , a storage module 44 for storing the event data 37 , or both.
- the event data module 40 may be on a smartphone, smartwatch, tablet, computer, or other device. Communicating the event data 37 to a user may be through visualization on an optical display, vibration through a tactile display, audio communication, text message, or any suitable medium may be applied. Communication between the processing module 30 and the event data module 40 may be through any wired or wireless connection.
- the event data module 40 may be included in a single unit with the processing module 30 .
- FIG. 2 is a flowchart of a method 50 for detecting heterogeneous events.
- the method 50 includes a localized filtering method 60 and an event detection method 70 .
- the method 50 includes receiving sensor data 52 .
- Receiving sensor data 52 provides sensor data.
- the localized filtering method 60 is applied to the sensor data, providing filtered data.
- the event detection method 70 is applied to the filtered data, providing event data.
- Communicating the event data 54 , storing the event data 56 , or both, may follow applying the event detection method 70 .
- Progressing a moving time window by a pre-determined amount of time 58 follows, and the method 50 is repeated with receiving sensor data 52 , providing additional data corresponding to the timeline defined after shifting the moving window by a pre-determined amount of time 58 .
- the localized filtering method 60 is applied to the sensor data.
- the localized filtering method 60 includes selecting a data window 62 .
- the data window corresponds to a time period of the sensor data, which may include a plurality of data windows.
- Calculating a time-frequency representation 64 provides a time-frequency representation corresponding to each of the data windows.
- Calculating a filter mask 66 based on the time-frequency representation provides a filter mask.
- Localized filtering 68 is applied to the time-frequency representation using the filter mask, resulting in the filtered data. Applying each of calculating a time-frequency representation 64 , calculating a filter mask 66 , and localized filtering 68 to each of the plurality of data windows provides an adaptive filter for the data window to facilitate defining events in heterogeneous data.
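- The steps of the localized filtering method can be sketched as a moving-window loop. This is a simplified illustration only: a plain FFT stands in for the S-transform, the windows are non-overlapping, and the window length is an assumed parameter:

```python
import numpy as np

def localized_filter(sensor_data, window_len, power=2):
    """Segment the series into data windows, build a fresh mask for each
    window from a power of its spectrum magnitudes, filter, and invert
    back to the time domain (FFT standing in for the S-transform)."""
    sensor_data = np.asarray(sensor_data, dtype=float)
    filtered = np.empty_like(sensor_data)
    for start in range(0, len(sensor_data), window_len):
        window = sensor_data[start:start + window_len]
        spectrum = np.fft.rfft(window)
        mag = np.abs(spectrum)
        mask = (mag / (mag.max() + 1e-12)) ** power  # recalculated per window
        filtered[start:start + window_len] = np.fft.irfft(spectrum * mask,
                                                          n=len(window))
    return filtered
```

Because each window carries its own mask, a slow walk and a fast run within the same recording are each filtered with respect to their own locally prominent frequencies.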
- Selecting a data window 62 may generally be referred to as a moving-window technique. Selecting a data window 62 allows the remaining steps of the localized filtering method 60 to be applied to subsequent data windows, or precedent data windows, sequentially. As a result, the localized filtering method 60 may be applied to each of the data windows as the time period of each data window passes, and it is not necessary to wait for the entire dataset to be collected before applying the filtering method 60 to any data windows that have already been selected by selecting a data window 62 . Selecting a data window 62 facilitates identification of events in near real-time.
- Calculating a time-frequency representation 64 on each data window facilitates calculating a filter mask 66 for each data window.
- Calculating a time-frequency representation 64 on each data window may be based on an S-transform, Gabor transform, or other suitable transform.
- the S-transform or other localizable transforms may be applied for providing localized information about the sensor data. Providing the localized transform may have advantages over a Fourier transform, which is applied globally across a dataset as a simple frequency representation.
- Calculating a filter mask 66 based on the time-frequency representation provides a filter mask.
- the filter mask is determined and recalculated for each data window of the time-frequency representation with reference to the characteristics of the time-frequency representation of the data window in respect of which calculating a time-frequency representation 64 is carried out.
- the filter mask may remove irrelevant data and noise by applying a weighting value to each frequency for a given time point on the time-frequency representation.
- the filter mask may be based on the prevalence of the most and least abundant frequencies in the time-frequency representation over the time period.
- the filter mask may be non-binary, applying values other than 0 and 1 to each frequency in the time-frequency representation, depending on the magnitude of each frequency.
- previous low-pass filters include assigning a weight of 1 to frequency values below a threshold value, and a weight of 0 to frequency values above the threshold value.
- previous high-pass filters include assigning a weight of 1 to frequency values above a threshold value, and a weight of 0 to frequency values below the threshold value.
- with such binary filters, a single arbitrary threshold must be assigned without the step duration and step frequency being known.
- the step duration and frequency may in some cases be the information an event detection method is directed to defining, and application of a non-binary and localized filter mask may facilitate defining heterogeneous events, such as steps.
- An individual's gait may vary as the individual walks, runs, changes speed, climbs or descends stairs, climbs or descends a ramp or other incline, taps their feet, or makes other unpredictable actions that result in or affect an input of the sensor data.
- the non-binary and adaptive features of the filter mask remove a requirement to assign a single arbitrary frequency cutoff threshold before the step duration and resulting step frequency are known.
- the filter mask adapts to prominent frequencies in each data window. More prominent frequency values at a given time are assigned higher weighting values, while less prominent frequencies are assigned lower weighting values.
- a separate filter mask is calculated for the time-frequency representation corresponding to each data window.
- the filter mask applied to a particular data window is the filter mask that was calculated with reference to the particular data window.
- the square of the magnitude of the S-transform or other time-frequency representation at each frequency may be used as the filter mask.
- the more prominent frequencies in the dataset are assigned a higher mask value, and the less prominent frequencies are assigned a lower value, which may emphasize the more prominent frequencies and minimize the less prominent frequencies.
- Applying the square of the magnitude of the time-frequency representation may result in greater distinction between the contribution of more prominent and less prominent frequencies to the filtered data, compared with approaches in which the mask is directly proportional to the magnitude of the time-frequency representation.
- the difference in contribution between the more prominent and less prominent frequencies to the filtered data may be less pronounced compared with approaches in which the mask is directly proportional to the square of the magnitude of the time-frequency representation.
- the magnitude of the time-frequency representation and the square of the magnitude of the time-frequency representation are examples of values that may be applied. Other power relationships between the filter mask and the time-frequency representation magnitude may also be applied.
- the filter mask may be non-binary and whether based on the magnitude of the time-frequency representation, the square of the magnitude of the time-frequency representation, or another power value of the magnitude of the time-frequency representation, will vary according to the abundancies of the various frequencies of each time point in the data window in respect of which the time-frequency representation and the relevant filter masks are calculated.
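- The effect of the power relationship can be checked with simple arithmetic. With illustrative spectral magnitudes of 100 and 10, the filtered contribution is mask × magnitude, so:

```python
prominent, weak = 100.0, 10.0  # illustrative spectral magnitudes

# Mask proportional to |S|: each contribution scales with |S| squared.
linear_ratio = prominent**2 / weak**2    # 100x separation
# Mask proportional to |S| squared: each contribution scales with |S| cubed.
squared_ratio = prominent**3 / weak**3   # 1000x separation
```

The squared mask widens the gap between prominent and weak frequencies tenfold in this example, which is the greater divergence in contribution described above.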
- Localized filtering 68 is applied to the time-frequency representation using the filter mask, resulting in the filtered data.
- the localized filtering 68 may include multiplication of the filter mask with the time-frequency representation of the data window and converting the time-frequency representation back into the time-domain.
- the time-frequency representation may be converted back into the time-domain by an inverse S-transform, an inverse Gabor transform, or any suitable transform.
- the filtered data may then be further processed in the time domain, such as by the event detection method 70 .
- the inverse transform may be completed by any suitable approach, such as the time inverse transform described in M. Schimmel, J. Gallart, "The inverse S-transform in filters with time-frequency localization", IEEE Transactions on Signal Processing, Vol. 53, No. 11, 4417-4422, 2005.
- Where the magnitudes of the time-frequency representation are large, or result in large deviations between the resulting magnitudes of the time-frequency representation at more prominent frequencies compared with less prominent frequencies at a given time point in the time period, there may be advantages to normalizing the values against the greatest magnitude in the filtered data, after squaring or applying other power relationships to the magnitudes of the time-frequency representation.
- Such a normalization would result in each data point in the filtered data having a value varying between 0 and 1.
- the normalized values may result in a more recognizable plot of events for visualization by a user than would be the case where the localized filtering 68 is applied to the time-frequency transform values without normalization.
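- The normalization suggested above is a single division against the greatest magnitude; the sketch below assumes non-negative filtered data:

```python
import numpy as np

def normalize(filtered):
    """Scale non-negative filtered data into [0, 1] against its greatest
    value, for a more recognizable plot as suggested above."""
    filtered = np.asarray(filtered, dtype=float)
    peak = filtered.max()
    return filtered / peak if peak > 0 else filtered
```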
- the event detection method 70 may be applied to the filtered data regardless of whether the magnitudes of the time-frequency transform values (or the magnitudes elevated to a power) are normalized before localized filtering 68 is applied to the magnitudes of the time-frequency transform.
- the event detection method 70 is applied to the filtered data.
- the event detection method 70 includes identifying features 72 and counting events based on the features 74 .
- Identifying features 72 may be based on the event duration being greater than a defined minimum event duration, less than a defined maximum duration, or both. Identifying features 72 may be based on the event magnitude being greater than a defined minimum event magnitude, less than a defined maximum event magnitude, or both.
- Inflection points such as local minima, local maxima, or both may be used to detect the beginnings and ends of events.
- Counting events based on features 74 may be applied to each data window based on the criteria used for identifying features 72 based on previous inflection points. Depending on the difference in the magnitude of, or time elapsed between, local extrema, other inflection points or other features, some features may be determined to define endpoints of constituent events within an overall aggregated event including the constituent events, when counting events based on features 74 .
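- A minimal sketch of counting events between inflection points, with the duration and magnitude criteria as assumed parameters:

```python
import numpy as np

def count_events(filtered, times, min_duration=0.2, min_magnitude=0.1):
    """Treat interior local minima as candidate event boundaries and keep
    only excursions meeting duration and magnitude criteria, in the
    spirit of identifying features and counting events described above."""
    d = np.asarray(filtered, dtype=float)
    minima = [i for i in range(1, len(d) - 1)
              if d[i] <= d[i - 1] and d[i] <= d[i + 1]]
    boundaries = [0] + minima + [len(d) - 1]
    events = []
    for a, b in zip(boundaries[:-1], boundaries[1:]):
        duration = times[b] - times[a]
        magnitude = d[a:b + 1].max() - d[a:b + 1].min()
        if duration >= min_duration and magnitude >= min_magnitude:
            events.append((times[a], times[b]))
    return events

# Four idealized 'steps': rectified sine humps, each 0.5 s long.
times = np.linspace(0.0, 2.0, 401)
events = count_events(np.abs(np.sin(2 * np.pi * times)), times)
```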
- the method 50 may include communicating the event data to a user 54 , storing the event data 56 , or both. After communicating the event data 54 , storing the event data 56 , or both, progressing a moving time window by a pre-determined amount of time 58 may precede receiving sensor data 52 in an application of the method 50 to a subsequent data window. The corresponding sensor data for a subsequent data window is received and the localized filtering method 60 and the event detection method 70 are applied to the subsequent data window.
- Communicating the event data 54 may include communicating in real time with a user who is applying the method 50 . Communicating the event data 54 may involve a prompt to encourage a habit that is expected to improve performance, avoid injury or otherwise provide a benefit. Communicating the event data 54 may also be to a person who is not the source of the stimulus that leads to receiving sensor data 52 .
- Storing the event data 56 may include storing details of features and events defined by the event data. Following filtering and transformation back into the time domain, the previously recorded inflection points may be used as a reference point for defining events. The type of extrema to search for may be selected based on the previous data window. Where the previous data window included a maximum as an inflection point, then a minimum in the following data window may be the search target. Where the previous data window included a minimum as an inflection point, then a maximum in the following data window may be the search target.
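The alternating search target described above reduces to a small helper (an illustrative sketch):

```python
def next_search_target(previous_extremum):
    """Select the extremum type to search for in the next data window,
    alternating with the type recorded for the previous window."""
    if previous_extremum == "maximum":
        return "minimum"
    if previous_extremum == "minimum":
        return "maximum"
    raise ValueError("expected 'maximum' or 'minimum'")

target = next_search_target("maximum")  # a maximum was just found, so search for a minimum
```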
- FIG. 3 is a flowchart of a method 150 for detecting heterogeneous events.
- the method 150 includes the localized filtering method 160 and the event detection method 170 .
- the method 150 includes defining parameters 151 . In some applications, the parameters may already be set or may not be definable by a user or otherwise, and defining parameters 151 may be absent from a method for detecting heterogeneous events (e.g. the method 50 , etc.).
- the method 150 includes receiving sensor data 152 , resulting in the sensor data.
- the localized filtering method 160 is applied to the sensor data, resulting in the filtered data.
- the event detection method 170 is applied to the filtered data, resulting in the event data.
- Communicating the event data 154 , storing the event data 156 , or both, may follow applying the event detection method 170 .
- Progressing a moving time window by a pre-determined amount of time 158 follows, and the method 150 is repeated with receiving data 152 , providing additional data corresponding to the timeline defined following shifting the moving window by a pre-determined amount of time 158 .
- the localized filtering method 160 is applied to the sensor data.
- the localized filtering method 160 includes selecting a data window 162 .
- the data window corresponds to a time period of the sensor data, which may include a plurality of data windows.
- Calculating a time-frequency representation 164 provides a time-frequency representation corresponding to each of the data windows.
- Calculating a filter mask 166 based on the time-frequency representation provides a filter mask.
- Localized filtering 168 is applied to the time-frequency representation using the filter mask, resulting in the filtered data.
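The four steps above can be sketched as follows. This is a non-authoritative illustration: it uses SciPy's short-time Fourier transform in place of the S-transform named elsewhere in the disclosure, and the window length and power are assumed values:

```python
import numpy as np
from scipy.signal import stft, istft

def adaptive_localized_filter(x, fs, nperseg=64, power=2.0):
    """Select data windows, compute a time-frequency representation,
    build a filter mask from per-window normalized magnitudes raised
    to a power, apply the mask, and invert back to the time domain."""
    f, t, Z = stft(x, fs=fs, nperseg=nperseg)
    mag = np.abs(Z) ** power
    col_max = mag.max(axis=0, keepdims=True)
    col_max[col_max == 0] = 1.0        # avoid dividing empty columns
    mask = mag / col_max               # 1 at the most prominent frequency per window
    _, x_filt = istft(Z * mask, fs=fs, nperseg=nperseg)
    return x_filt[:len(x)]

# A slow event-like oscillation plus higher-frequency noise.
fs = 100.0
tt = np.arange(0, 4, 1 / fs)
clean = np.sin(2 * np.pi * 1.0 * tt)
noisy = clean + 0.3 * np.sin(2 * np.pi * 30.0 * tt)
filtered = adaptive_localized_filter(noisy, fs)
```

Because the mask is rebuilt from the local magnitudes in every window, the same code adapts to events of varying duration and amplitude without a fixed cut-off frequency.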
- Defining parameters 151 may take place for each data window. Defining parameters 151 may include defining upstream parameters such as the time scale of each data window, or any details of calculating the time-frequency representation 164 , calculating the filter mask 166 , or performing localized filtering 168 .
- the time scale may be selected with reference to the type of events that are expected. In applications where receiving sensor data 152 is directed to sensor data of an individual's gait, a normal step may take between 0.5 and 2 seconds. The minimum and maximum window time scales may be placed around this estimated duration accordingly, such that much shorter or much longer steps would not be considered. The minimum step time scale may be related to the time scale of the previous step.
- An individual step may be defined as lasting longer than 25% of the duration of the previous step; a shorter candidate step may be treated as part of the previous step.
- an individual step may be defined with reference to the force of a footfall on the previous step and any relationship from a baseline force, to distinguish sensor data of the individual taking distinct steps from sensor data of the individual shifting their weight without taking a step. Defining individual steps with reference to magnitude of footfall force may also facilitate defining jump landings, steps on stairs, steps on ramps, steps after taking on a significant load, or other heterogeneous events that are indicative of different types of steps or other activities that register force on a footfall.
- Defining parameters 151 may also include updating downstream user-experience functions of the system within which the method 150 is practiced. In applications that use an insole or other plantar pressure sensors, this may include inflating or deflating bladders around an insole, changing stiffness of wrist or other joint braces, changing output from a hearing aid, changing the pace of a metronome, or other functions.
- the method 50 and the method 150 may each be carried out with the system 10 .
- the sensor data 27 may be received by the processing module 30 from the data source 20 at a predetermined frequency, in some cases via temporary storage 29 .
- the localized filter process 32 may be applied to the sensor data 27 to apply the localized filtering method 60 .
- the event detection process 36 may be applied to the filtered data 35 to carry out the event detection method 70 .
- the event data 37 may be applied in communicating the event data 54 to the user through the output module 42 , in storing the event data 56 in the storage module 44 , or both.
- the sensor data 27 may generally be represented as a time-varying signal. Where the sensor data 27 includes pressure data, the pressure data may represent the application of pressure over a defined amount of time.
- the rate at which the sensor data 27 is sampled from the data source 20 may be adjusted according to the data set after the localized filtering method 60 has been applied to sensor data 27 . Some applications may benefit from greater resolution in the sensor data 27 while other applications may benefit from down-sampling to conserve bandwidth in the sensor data 27 or increase event detection speed.
- FIGS. 4 A to 7 C show simulated sensor data, time-frequency representations, and filtered data in a test application of the method 50 to simulated pressure sensor data that would be obtained in a system similar to the system 10 adapted for use on feet, similarly to the system 210 of FIG. 8 or the system 310 of FIG. 13 .
- the simulated sensor data was modified from empirical pressure data acquired with a single pressure sensor.
- the simulated sensor data was modified to increase heterogeneity of events to be characterized by application of the method 50 .
- the individual events represented in the simulated sensor data differ in their specific pressure-time profiles more than individual events tend to from empirical pressure data.
- the simulated sensor data corresponded to pressure sensor data representative of the types of events that may be seen when applying the method and system to an individual who is walking, to measure steps or other events related to the individual's gait (e.g. changing speed, climbing stairs, walking on a ramp or other incline, tapping feet, etc.).
- Two low-pass Fourier transforms were applied to the simulated sensor data, in each case resulting in a frequency representation.
- the two low-pass Fourier transforms used two different cut-off frequencies.
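A global low-pass Fourier filter of this kind can be sketched as below; the signal and the two cut-off frequencies are illustrative, not the values used to produce FIGS. 4 A to 4 C:

```python
import numpy as np

def fourier_lowpass(x, fs, cutoff_hz):
    """Zero all FFT coefficients above cutoff_hz and invert: a single
    cut-off applied to the entire data set, unlike the localized filter."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    spectrum[freqs > cutoff_hz] = 0
    return np.fft.irfft(spectrum, n=len(x))

fs = 50.0
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 1 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)
low = fourier_lowpass(x, fs, cutoff_hz=5.0)    # first (lower) cut-off removes the 10 Hz term
high = fourier_lowpass(x, fs, cutoff_hz=15.0)  # second cut-off keeps both components
```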
- a first adaptive localized filter was based on the magnitude of an S-transform of the simulated sensor data.
- a second adaptive localized filter was based on the square of the magnitude of the S-transform of the simulated sensor data.
- the adaptive filter provides more accurate event detection than the filter based on a Fourier transform.
- a previous filter based on low-pass Fourier transform identified two events as one, while the adaptive filter correctly identified the two separate events.
- FIGS. 4 A, 4 B, and 4 C show the simulated sensor data series (solid lines) as received from sensors, and filtered data obtained using a low-pass Fourier transform approach, which applies to the entire data set, as opposed to the localized and adaptive approach of the method 50 .
- FIG. 4 A shows filtered data obtained by applying a low-pass Fourier transform-based filter with a first cut-off frequency to the simulated sensor data (dashed lines).
- FIG. 4 A also shows filtered data obtained by applying a low-pass Fourier transform-based filter with a second cut-off frequency to the simulated sensor data (dotted lines).
- FIG. 4 B shows only the simulated sensor data (solid lines) and the filtered data at the first cut-off frequency (dashed lines).
- FIG. 4 C shows only the simulated sensor data (solid lines) and the filtered data at the second cut-off frequency (dotted lines).
- the first cut-off frequency is lower than the second cut-off frequency. As can be seen at between about 750 and about 950 seconds, two peaks in the simulated sensor data were interpreted as one event in the filtered data at the first cut-off frequency (dashed lines).
- FIG. 5 is a time-frequency representation plot of the same simulated sensor data of FIG. 4 A after a localized S-transform of the simulated sensor data, as in the calculating a time-frequency representation 64 portion of the method 50 .
- a filter mask is determined for each data window of the sensor data with reference to the time-frequency plot of FIG. 5 corresponding to the respective time period as in the method 50 in the calculating a filter mask 66 portion of the localized filtering method 60 .
- the respective filter masks are applied to the time-frequency representation of FIG. 5 when filtering the data during localized filtering 68 in the method 50 .
- the filter mask for each data window in time is determined with reference to the magnitudes of each frequency at the data window of the time-frequency representation.
- FIGS. 6 A, 6 B, and 6 C show the simulated sensor data of FIG. 4 A (solid line in each of FIGS. 6 A, 6 B, and 6 C ) and the resulting filtered signal using an adaptive, localized filtering method including features from the method 50 as shown in FIG. 2 .
- FIG. 6 A also shows filtered data obtained by applying a filter mask proportional to the time-frequency representation magnitude (dashed lines).
- FIG. 6 A also shows filtered data obtained by applying a filter mask proportional to the square of the magnitude of the time-frequency representation (dotted lines).
- FIG. 6 B shows only the simulated sensor data (solid lines) and the filtered data based on the magnitude of the time-frequency representation (dashed lines).
- FIG. 6 C shows only the simulated sensor data (solid lines) and the filtered data based on the square of the magnitude of the time-frequency representation (dotted lines).
- FIG. 6 D shows only the filtered data based on the magnitude of the time-frequency representation (solid lines).
- FIG. 6 E shows only the filtered data based on the square of the magnitude of the time-frequency representation (solid lines).
- FIGS. 7 A, 7 B, and 7 C show a series of time-frequency representation magnitude plots of the simulated sensor data after a localized S-transform of the simulated sensor data (bottom).
- FIGS. 7 A, 7 B, and 7 C also show plots (top) of corresponding sensor data (solid lines), filtered signal using an adaptive localized filtering technique in the current data window based on the square of the magnitude of the time-frequency representation (dashed lines; the same plots shown in FIGS. 6 A, 6 C, and 6 E ).
- FIG. 7 A shows sensor data and filtered data from 0 to 250 seconds.
- FIG. 7 B shows sensor data and filtered data from 50 to 300 seconds.
- FIG. 7 C shows sensor data and filtered data from 100 to 350 seconds.
- FIGS. 6 A, 6 B, and 6 C show source data filtered in accordance with the localized filtering method 60 .
- FIGS. 4 A, 4 B, and 4 C show application of a Fourier transform to the global dataset.
- the localized filtering method 60 provides a filtered time-representation of the pressure data that highlights the most prominent events or steps, and minimizes the noisy, least important components of the signal.
- the adaptive localized filter provides filtered data more closely approximating the simulated sensor data in several portions of the pressure-time curve.
- the peak at about 50 seconds is more closely approximated by the adaptive localized filter.
- a hard “heel first” strike followed by contact at the ball of the foot may result in the double peak in the simulated sensor data as shown around 150 seconds in FIGS. 6 A to 6 E .
- the adaptive localized filter correctly interpreted this event as a single step. While the global Fourier transform filters applied in FIGS. 4 A to 4 C also defined the step around 150 seconds as a single step, in other cases a simple high-pass or low-pass filter may not provide the same accuracy.
- the troughs at about 225 seconds are deeper and further from the simulated sensor data in the global Fourier transform filter compared with the adaptive localized filter.
- the adaptive localized filter more closely follows the simulated sensor data contours than the global Fourier transform filter.
- Each of the adaptive localized filters applied show nine events, as most simply shown in FIGS. 6 D and 6 E .
- the nine events would be counted by application of the event detection method 70 .
- the global Fourier transform with the first (lower) cut-off frequency misinterpreted two events as one between about 750 and about 950 seconds.
- the increased accuracy in characterizing heterogeneous events may be facilitated by application of adaptive localized filtering based on the magnitude (or the magnitude elevated to a power) of a localized time-frequency representation determined within a progressing window of the simulated sensor data.
- the simulated sensor data show a well-defined baseline, which facilitates application of the global Fourier transform.
- the global Fourier transform may be more likely to result in inaccurate filtering of the time-frequency representation and inaccurate event detection.
- the adaptive localized filtering method may provide additional advantages for applications with a drifting baseline.
- FIG. 8 shows a schematic of an event detection system 210 .
- the data source 220 includes the first pressure sensor 222 and the second pressure sensor 224 on a footwear insert for a subject individual's foot.
- the first pressure sensor 222 may be located at a location corresponding to a heel of a foot on the data source 220 .
- the second pressure sensor 224 may be located at a location corresponding to a ball of a foot on the data source 220 .
- Sensor data 227 collected by the pressure sensor system 220 may then be transmitted by an intermediary storage and transmission device 229 , and analyzed by the processing module 230 inside a laptop computer.
- the laptop computer also includes an event data module 240 .
- the event data module 240 may communicate the event data 237 to the communication module 242 , store the event data 237 in a storage module 244 , or both.
- FIG. 9 is a plot of idealized sensor data 227 obtained with the event detection system 210 .
- the first sensor data 226 (dotted lines) from the first sensor 222 at the heel of a foot, and the second sensor data 228 (dashed lines) from the second sensor 224 at the forefoot, also called the ball of the foot, where the metatarsals are located, are both shown on the plot.
- the sensor data 227 may be obtained by applying the method 50 or the method 150 using the system 220 .
- FIG. 10 is a plot of summation data based on the sensor data 227 of FIG. 9 showing the idealized sensor data 227 as a summation of the first sensor data 226 and second sensor data 228 .
- various events may be identified, characterized and bound by points of inflection within the sensor data 227 .
- An entire step event 290 may be considered an event, which is bound in the time dimensions by inflection points that deviate from the baseline reading.
- smaller component events of the step event 290 include a heel strike 291 that may be recorded by the first pressure sensor 222 at the heel, and a forefoot strike 292 that may be recorded by the second pressure sensor 224 at the forefoot. Both the heel strike 291 and the forefoot strike 292 are defined by apparent inflection points within the sensor data 227 , which has a drifting baseline.
- FIG. 11 a is a plot of filtered data 235 (solid lines) based on the sensor data 227 of FIG. 9 (dashed lines).
- the step event 290 is shown in the filtered data as a single event without the first event 291 and the second event 292 .
- the bounds of the step event 290 can be identified with greater accuracy than the step event 290 was identified in FIG. 10 based on the sensor data.
- the drifting baseline is filtered out of the sensor data 227 .
- the encompassing event is an entire step event 290 , defined by the inflection points of the filtered data 235 , in this case, minima in the filtered data 235 that correspond to non-extrema inflection points in the sensor data 227 .
- FIGS. 11 b and 11 c show the sensor data 227 from the heel (dashed lines in 11 b ) and the forefoot (dashed lines in 11 c ) superimposed over the filtered data 235 .
- FIGS. 11 d and 11 e respectively show the sensor data at the heel and at the forefoot (dashed lines), and filtered data 235 at the heel (solid line in FIG. 11 d ) and the forefoot (solid line in FIG. 11 e ).
- FIG. 11 f shows the filtered data 235 at the heel (solid line), the filtered data 235 at the forefoot (dashed line) and the combined sensor data 227 (dotted line).
- the bounds of the step event 290 are defined at the intersections of the filtered data 235 at the heel (solid line) and the filtered data 235 at the forefoot (dashed line), which correspond to the same points as identified in the summed filtered data 235 prepared from the sensor data 227 of both combined first sensors 222 and second sensor 224 .
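Finding the intersections of two filtered signals can be sketched as a sign-change search on their difference (an illustrative helper, not the exact procedure of the disclosure):

```python
import numpy as np

def crossing_indices(heel, forefoot):
    """Indices where the heel and forefoot filtered signals cross,
    used here as candidate bounds of a step event."""
    diff = np.asarray(heel) - np.asarray(forefoot)
    sign_change = np.diff(np.sign(diff)) != 0
    return np.nonzero(sign_change)[0]

t = np.linspace(0, 1, 100)
heel = np.cos(2 * np.pi * t)      # pressure leading early in the step
forefoot = np.sin(2 * np.pi * t)  # pressure peaking later in the step
bounds = crossing_indices(heel, forefoot)  # two crossings, near t = 0.125 and t = 0.625
```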
- FIG. 12 shows the idealized sensor data 227 (dashed lines) and the derivative of the sensor data 227 (solid line) superimposed with the event bounds of the step event 290 for defining the event data 237 .
- a ground contact event 293 may be defined using the first derivative of the sensor data 227 (or filtered data of the first derivative of the sensor data 227 ).
- Where the bounds of the step event 290 are defined using the filtered data 235 of the first derivative of the sensor data 227 , similarly to the filtered data 235 of the sensor data 227 in FIGS. 11 a to 11 f , the ground contact event 293 within the step event 290 may be identified in the first derivative of the sensor data 227 .
- the ground contact event 293 may include a heel strike event 294 , a ground contact with the ball of the foot event 295 , and a toe-off event 296 .
- Each of the heel strike event 294 , ground contact with the ball of the foot event 295 , and toe-off event 296 is bound by inflection points in the first derivative of the sensor data 227 .
- the inflection points bounding the heel strike event 294 include a local maximum and an inflection point in the derived sensor data 227 .
- the inflection points bounding the ground contact with the ball of the foot event 295 include two inflection points in the derived sensor data 227 .
- the inflection points bounding the toe-off event 296 include an inflection point and a local minimum in the derived sensor data 227 .
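A simplified stand-in for locating sub-events in the first derivative is sketched below; real gait data would first be filtered, and the largest-slope criteria here are assumptions rather than the inflection-point rules of the disclosure:

```python
import numpy as np

def strike_and_toe_off(pressure, dt):
    """Locate a heel-strike candidate (largest positive slope) and a
    toe-off candidate (largest negative slope) in the first derivative
    of a pressure trace."""
    d = np.gradient(pressure, dt)
    return int(np.argmax(d)), int(np.argmin(d))

# One idealized step: pressure rises, plateaus, then falls.
dt = 0.01
t = np.arange(0, 1, dt)
pressure = np.clip(np.sin(np.pi * t), 0, 0.9)
strike_idx, toe_off_idx = strike_and_toe_off(pressure, dt)  # early strike, late toe-off
```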
- FIG. 13 is a schematic of an event detection system 310 .
- the data source 320 includes the first pressure sensor 322 at the heel of an insole, and a second pressure sensor 324 at the forefoot of the insole corresponding to the ball of a foot. Sensor data 327 collected by the data source 320 may then be analyzed by the processing module 330 on the instrumented insole.
- the event data 337 is then communicated to the event data module 340 inside a smart watch.
- the event data module 340 may transmit the event data 337 to the communication module 342 , store the event data 337 in a storage module 344 , or both.
- FIG. 14 depicts idealized sensor data 327 from the event detection system 310 .
- the sensor data 327 includes the first sensor data 326 from the first sensor 322 , and the second sensor data 328 from the second sensor 324 .
- a prompt 346 is provided as voice coaching from the communication module 342 of the event data module 340 .
- the prompt 346 offers suggestions to the user to improve their activity efficiency, mitigate injury, or provide other benefits.
- the prompt 346 is a suggestion to move the foot strike zone away from the heel and towards the forefoot in order to minimize joint injury.
- Two prompts 346 are required before the user changes their behaviour and the heel pressure shown in the second sensor data 328 is decreased.
- a confirmation 348 is communicated to the user as vibration from the watch, approving of the behavioural change.
- the below table shows the amplitudes of the heel strikes before and after the prompts 346 .
- FIG. 15 shows a schematic of an event detection system 410 .
- the data source 420 includes an accelerometer 422 on a footwear insert for a subject individual's foot.
- the accelerometer 422 could similarly be attached to another part of the shoe, such as on the outside of the tongue, or to another part of the body such as the ankle, shin, thigh, hip, chest, back or shoulder.
- Acceleration data 427 collected by the accelerometer 422 may then be transmitted by an intermediary storage and transmission device 429 , and analyzed by a processing module 430 inside a laptop computer, smart phone, smart watch, server computer or other processing device.
- the processing device includes an event data module 440 .
- the event data module 440 may communicate the event data 437 to the communication module 442 , store the event data 437 to a storage module 444 , or both.
- FIG. 16 is an example plot of sensor data 427 from the accelerometer 422 of FIG. 15 .
- the sensor data 427 represents a user jumping and landing back on the ground.
- Events 490 may be distinguished within the sensor data 427 , corresponding to takeoff and landing of the user, which elicit spikes in acceleration.
- the events 490 illustrate non-repetitive events.
- FIG. 17 is a plot of filtered data 435 (solid lines) based on the sensor data 427 of FIG. 16 (dashed lines).
- Two jumping events 497 a and 497 b are characterized by the bounds of the filtered data 435 through the adaptive localized filter process (e.g. the localized filtering method 60 or the localized filtering method 160 ) by identifying inflection points within the filtered data.
- the events are respectively the accelerations associated with the takeoff and landing of a jumping motion.
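Takeoff and landing spikes of this kind can be segmented with a simple threshold sketch (the threshold and data are illustrative, not taken from the figures):

```python
import numpy as np

def spike_events(accel, threshold):
    """Group consecutive above-threshold samples into events and
    return (start, end) index pairs for each spike."""
    above = np.abs(accel) > threshold
    edges = np.diff(above.astype(int))
    starts = np.nonzero(edges == 1)[0] + 1
    ends = np.nonzero(edges == -1)[0] + 1
    if above[0]:
        starts = np.insert(starts, 0, 0)
    if above[-1]:
        ends = np.append(ends, len(accel))
    return [(int(s), int(e)) for s, e in zip(starts, ends)]

accel = np.zeros(100)
accel[20:25] = 12.0   # takeoff spike
accel[60:64] = -15.0  # landing spike
events = spike_events(accel, threshold=5.0)  # [(20, 25), (60, 64)]
```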
- FIG. 18 is a schematic of an event detection system 510 .
- the system is a hearing aid, with built-in capability to take in environmental noise, and select an appropriate filter in order to provide a suitable signal to the user's ear.
- the data source 520 includes a microphone 522 that records noises 512 from the user's environment as sound signals 527 .
- the sound signal 527 is then transmitted by an intermediary storage and transmission device 529 to a processing module 530 within the hearing aid.
- the processing module 530 produces event data which is passed to the event data module 540 where it is communicated to the communication module 542 .
- the communication module may then pass along a modified noise signal to the user, filtered according to the event data.
- FIG. 19 is an example plot of sensor data 527 based on the hearing aid event detection system 510 of FIG. 18 .
- the sensor data 527 represents environmental noise (e.g. talking, music, etc.).
- Events 590 may be distinguished within the sensor data 527 and passed on to an event data module 540 as event data.
- FIG. 20 is a schematic of an event detection system 610 .
- An accelerometer 622 sits at the base of the palm of an instrumented glove 621 .
- Sensor data 627 collected by the accelerometer 622 is then transmitted by an intermediary storage and transmission device 629 , and analyzed by a processing module 630 inside a processing device, in this case a smart watch.
- the processed event data 637 is communicated to the event data module 640 , where it may be passed to a communication module 642 or stored by a storage module 644 .
- the communication module may communicate an event, such as wrist flexion large enough to cause high internal pressures in the carpal tunnel and resulting nerve damage, to a user through audio, vibratory, or visual cues.
- the glove 621 includes variable stiffness material 649 in a wrist portion 623 of the glove 621 for correcting carpal tunnel inducing behavior, and the stiffness may be adjusted as a parameter of the system 610 such as at defining parameters 151 of the method 150 .
- the accelerometer-as-a-level may also be referred to as “inclination sensing” using the accelerometer 622 .
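Inclination sensing from a static accelerometer reading reduces to a tilt-angle computation (a generic sketch, not specific to the glove 621 or the accelerometer 622):

```python
import math

def inclination_deg(ax, ay, az):
    """Tilt of the sensor's z-axis from vertical, in degrees, from a
    static accelerometer reading (accelerometer-as-a-level)."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

flat = inclination_deg(0.0, 0.0, 9.81)    # device lying flat: 0 degrees
tilted = inclination_deg(9.81, 0.0, 0.0)  # on its side: 90 degrees
```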
- Embodiments of the disclosure can be represented as a computer program product stored in a machine-readable medium (also referred to as a computer-readable medium, a processor-readable medium, or a computer usable medium having a computer-readable program code embodied therein).
- the machine-readable medium can be any suitable tangible, non-transitory medium, including magnetic, optical, or electrical storage medium including a diskette, compact disk read only memory (CD-ROM), memory device (volatile or non-volatile), or similar storage mechanism.
- the machine-readable medium can contain various sets of instructions, code sequences, configuration information, or other data, which, when executed, cause a processor to perform steps in a method according to an embodiment of the disclosure.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/623,475 filed on Dec. 17, 2019 and entitled “METHOD AND SYSTEM FOR HETEROGENEOUS EVENT DETECTION”, which is a national stage entry of International Patent Application Number PCT/CA2018/050802 filed Jun. 28, 2018 and entitled “METHOD AND SYSTEM FOR HETEROGENEOUS EVENT DETECTION”, which claims the benefit of priority of U.S. Provisional Patent Application No. 62/526,080 filed Jun. 28, 2017 and of U.S. Provisional Patent Application No. 62/574,013, filed Oct. 18, 2017, each entitled “METHOD AND SYSTEM FOR HETEROGENEOUS EVENT DETECTION”. Each of the aforementioned applications is incorporated herein by reference in its entirety.
- The present disclosure relates to heterogeneous event detection.
- Human motion, such as walking, running, and jumping, may be characterized as a series of separate events with generally predictable trends, such as plantar pressure at the end of a step and acceleration of a foot through a step. Counting such events within a period of time to monitor activity levels may have application in fitness, healthcare, or other contexts. To count such events, wearable sensors, such as accelerometers, may be strapped onto an individual's wrist, foot, or core. Similarly, pressure or other sensors may be fitted into an insole of a shoe. Time-based data from these sensors may be applied to event detection and characterization.
- In some cases, sensor readings may not come in the form of simple waveforms. Activities, particularly those performed by people in motion, are not always regular, and automated analysis of the resulting data sets may not be straightforward. Identifying individual steps or other events may be complicated due to irregularity of motion and the signal-to-noise ratio that may accompany sensing and data transmission. Sensor data of a single step may include several local maxima and minima. Successive steps may differ in speed and pace of the steps, and in intensity of the landing (e.g. light, heavy). Moreover, a person may perform heterogeneous activities, for instance, first walking for a number of steps, followed by running for a number of steps, tapping their feet, jumping, and resuming walking, walking on stairs or a ramp, or any number of other activities.
- Many step-detection systems include a processor programmed to analyze time-based sensor data series and identify a step based on peaks and troughs. Smoothing may be employed to eliminate noise effects, thresholds may be applied to discard small peaks or troughs, and enveloping may be applied for reducing data variation. Previous methods may apply a low pass filter or other approach with predetermined threshold parameters, which are in many cases set arbitrarily, and on an underlying assumption that the user is performing only an identified activity that will result in a consistent data profile on each occurrence.
- In view of the shortcomings of some previous approaches to event detection, it is desirable to provide an event detection method for heterogeneous events. In previous approaches, a low-pass Fourier transform or other filter is applied to sensor data series to capture the low-frequency component, which may result in a filtered data series that will show discrete events more definitively. However, a low-pass filter with a single cut-off frequency for an entire data set may not accurately detect events that vary in duration, events that vary in amplitude, events separated by variable amounts of time, or other heterogeneous activities.
- Herein provided are a method and system for heterogeneous event detection. The method includes, and the system facilitates, acquiring data from sensors and processing the resulting sensor data to define events that are heterogeneous from occurrence to occurrence of the event. The method and system apply localized adaptive filtering through a local time-frequency transform, an inverse transform, and an adaptive filter mask. The adaptive filter mask is based on the time-frequency representation and is defined for time periods defining data windows, with reference to locally abundant frequencies. The method and system provide an adaptive approach that may be applied to detecting events that can be predicted in terms of trends in sensor data associated with the event, but that may not be consistent from one occurrence of an event to the next.
- In applications directed to human movement or interpretation of audio data, trends such as plantar pressure following a step or other change in weight distribution on feet, acceleration or rotation during a step or other body movement, or changes in amplitude or frequency of a sound or collection of sounds may all be indicative of events that are generally predictable but that do not result in identical data on each occurrence. The events may be generally repetitive or recurring. In response to the event, the method and system may prompt a suggested change to improve performance of the event or outcome of the event. In response to the event, the method and system may update parameters of how the event is characterized or change parameters of how a device functions to change user experience in relation to the event or prepare for an outcome expected to follow the event.
- The method and system may be applied continuously, in real-time or in batch processing. The method and system may be applied using a pressure sensor, a gyroscope, an accelerometer, a thermometer, a humidity sensor or any suitable sensor or combination of sensors depending on the specific application. The method and system may be applied to detecting events that are steps or other defined body movements associated with various activities (e.g. walking, running, jumping, biking, skiing, swimming, martial arts, boxing, yoga, gymnastics, dancing, etc.), or portions of any such activities. The method and system may be applied to use on individuals, animals, robotics, unmanned or manned vehicles, or any suitable system. The method and system may be applied to optimize activities of a user or test subject, including by prompting changes (e.g. audio, visual or tactile alerts to change movement patterns), or by changing device function (e.g. by inflating or deflating bladders around an insole, changing stiffness of wrist or other joint braces, changing output from a hearing aid, etc.).
- Applying the localized adaptive filter to the sensor data may include segmenting the sensor data into data windows and converting the resulting data windows into a time-frequency representation using a transform such as the S-transform. In the time-frequency representation, the relative contributions of many frequencies of the sensor data profile can be represented for each time point. Adaptive localized filtering magnifies the most prominent frequencies at each time point and suppresses the least prominent frequencies in the resulting filtered data. The adaptive filtering may be based on a power of the magnitude of the time-frequency representation, resulting in greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies. The greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies may facilitate heterogeneous event detection.
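As an illustrative sketch only (numpy-only, with hypothetical parameters), the pipeline described above can be approximated with a Gaussian-windowed, Gabor-style transform standing in for the S-transform: each time slice is masked by its own normalized magnitude raised to a power, then inverted, so locally dominant frequencies pass and locally weak frequencies are suppressed.

```python
import numpy as np

def adaptive_localized_filter(x, sigma=16.0, power=2):
    """Gaussian-windowed local spectrum at every sample; each slice is masked
    by its normalized |S|**power and inverted, keeping the centre sample."""
    n = len(x)
    idx = np.arange(n)
    y = np.empty(n)
    for tau in range(n):
        w = np.exp(-0.5 * ((idx - tau) / sigma) ** 2)   # local window at time tau
        S = np.fft.rfft(x * w)                          # local spectrum
        mag = np.abs(S) ** power
        mask = mag / (mag.max() + 1e-12)                # emphasize dominant frequencies
        y[tau] = np.fft.irfft(S * mask, n=n)[tau]       # centre sample of inverse
    return y

# Demo: a tone (the "event" trend) with a weaker high-frequency interferer.
n = 256
idx = np.arange(n)
clean = np.sin(2 * np.pi * 8 * idx / n)
x = clean + 0.2 * np.sin(2 * np.pi * 60 * idx / n)
y = adaptive_localized_filter(x)
```

With `power=2` the interferer's mask value is roughly the square of its relative magnitude, so the filtered output tracks the dominant local trend much more closely than the raw signal does, without any hand-set cutoff frequency.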
- The adaptive localized filtering provides filtered data. Identifying events is facilitated by features of the filtered data as compared with features of the sensor data. Once the events are identified along the timeline of the filtered data, the events may be further characterized in either the filtered data or the sensor data. Characterization along the timeline of either the filtered data or the sensor data may include analysis of the derivative or the integral of the data.
- In a first aspect, herein provided is a method and system for heterogeneous event detection. Sensor data is obtained and divided into discrete data windows. Each data window is defined by and corresponds to a time period of the sensor data. A time-frequency representation over the time period is calculated for each data window. A filter mask is calculated based on the data window corresponding to the time-frequency representation. The filter mask is applied for reverting the time-frequency representation to a time representation, resulting in filtered data. Features, such as extrema or other inflection points, are identified in the filtered data. The features define events, and transforming the time-frequency representation back into the time domain emphasizes differences between more and less prominent frequencies, facilitating identification of heterogeneous events. The method and system may be applied to body movements of people or animals, automaton movement, audio signals, light intensity, or any suitable time-dependent variable.
- In a further aspect, herein provided is a method for detecting heterogeneous events related to movement of an individual comprising: receiving sensor data of movement of the individual; defining a data window over a time period of the sensor data; calculating a time-frequency representation of the data window, providing a time-frequency representation corresponding to the time period; calculating a filter mask based on the time-frequency representation; filtering the time-frequency representation with the filter mask, providing filtered data; identifying features in the filtered data; identifying an event with reference to the features; and outputting the event.
- In a further aspect, herein provided is a method for detecting heterogeneous events comprising: receiving sensor data; defining a data window over a time period of the sensor data; calculating a time-frequency representation of the data window, providing a time-frequency representation corresponding to the time period; calculating a filter mask based on the time-frequency representation; filtering the time-frequency representation with the filter mask, providing filtered data; identifying features in the filtered data; identifying an event with reference to the features; and outputting the event.
- In a further aspect, herein provided is a system for detecting heterogeneous events comprising: a sensor for receiving sensor data; and a processor in communication with the sensor for receiving the sensor data; wherein the processor is configured to execute instructions for carrying out the methods described above.
- In some embodiments of the methods and systems, the sensor data comprises at least two different types of data; the sensor data comprises data of pressure, acceleration, rotation, seismic changes, temperature, humidity, or sound; receiving the sensor data is at a down-sampled rate to increase event detection speed; receiving the sensor data of a first data window is at a rate determined with reference to the filtered data of a second data window, the second data window preceding the first data window in time; the time period of the data window has a duration equal to that of additional data windows preceding or succeeding the data window; calculating a time-frequency representation of the data window comprises applying an S-transform to the sensor data within the data window; filtering the time-frequency representation with the filter mask comprises applying the filter mask to the time-frequency representation and applying the inverse S-transform to the product of the filter mask and the time-frequency representation; applying the filter mask to the S-transform comprises multiplying the time-frequency representation by the filter mask; calculating a time-frequency representation of the data window comprises applying a Gabor transform to the sensor data within the data window; filtering the time-frequency representation with the filter mask comprises applying the filter mask to the time-frequency representation and applying the inverse Gabor transform to the product of the filter mask and the time-frequency representation; applying the filter mask to the Gabor transform comprises multiplying the time-frequency representation by the filter mask; the filter mask comprises non-binary values proportional to a positive exponent of the magnitude of the time-frequency transform, wherein more prominent frequencies will be emphasized, and less prominent frequencies will be de-emphasized; the exponent is 1; the exponent is 2; filtering the time-frequency representation comprises normalizing all values in the time-frequency representation; the features comprise inflection points; at least one inflection point of the inflection points comprises a local extremum; identifying the inflection points comprises identifying at least one inflection point of the inflection points within a first data window based on at least one inflection point of a second data window, the second data window preceding the first data window in time; identifying the event comprises deriving an event count; identifying the event comprises defining the features as endpoints of the event; identifying the event comprises characterizing the sensor data between time points corresponding to the endpoints defined in the filtered data; characterizing the sensor data comprises deriving the sensor data; characterizing the sensor data comprises integrating the sensor data; identifying the event comprises characterizing the filtered data between the endpoints; characterizing the filtered data comprises deriving the filtered data; characterizing the filtered data comprises integrating the filtered data; identifying the event comprises defining a pulse width, a cycle, a ground contact time, a center of pressure, or a path of center of pressure of the sensor data or the filtered data; outputting the event comprises communicating data relating to the event to an individual; outputting the event comprises prompting an individual to change behavior; outputting the event comprises storing data relating to the event in a computer readable medium; outputting the event comprises outputting the contribution to the sensor data from each of a plurality of sensors; or any of the foregoing.
- Other aspects and features of the present disclosure will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments in conjunction with the accompanying figures.
- Embodiments of the present disclosure will now be described, by way of example only, with reference to the attached figures, in which reference numerals having a common final two digits refer to corresponding features across figures (e.g. the method 50, method 150, etc.): -
FIG. 1 is a block diagram of an event detection system; -
FIG. 2 is a flowchart of a method for detecting heterogeneous events; -
FIG. 3 is a flowchart of a method for detecting heterogeneous events; -
FIG. 4A is an example pressure data series filtered with two Fourier transforms at two cut-off frequencies; -
FIG. 4B is the data series of FIG. 4A showing only one of the Fourier transforms and the sensor data; -
FIG. 4C is the data series of FIG. 4A showing only the other of the Fourier transforms and the sensor data; -
FIG. 5 is an S-transform time-frequency plot of the data series of FIG. 4A; -
FIG. 6A is the data series of FIG. 4A filtered with two adaptive, localized filters, one based on the magnitude of the S-transform shown in FIG. 5 and another based on the square of the magnitude of the S-transform shown in FIG. 5; and -
FIG. 6B is the data series of FIG. 6A showing only the filter based on the magnitude of the S-transform shown in FIG. 5 and the sensor data; -
FIG. 6C is the data series of FIG. 6A showing only the filter based on the square of the magnitude of the S-transform shown in FIG. 5 and the sensor data; -
FIG. 6D is the data series of FIG. 6A showing only the filter based on the magnitude of the S-transform shown in FIG. 5; -
FIG. 6E is the data series of FIG. 6A showing only the filter based on the square of the magnitude of the S-transform shown in FIG. 5; -
FIG. 7A shows S-transform magnitudes and corresponding signals filtered with adaptive localized filtering based on the magnitude of the S-transform, from 0 to 250 seconds of the data shown in FIGS. 4A to 6E; -
FIG. 7B shows S-transform magnitudes and corresponding signals filtered with adaptive localized filtering based on the magnitude of the S-transform, from 50 to 300 seconds of the data shown in FIGS. 4A to 6E; -
FIG. 7C shows S-transform magnitudes and corresponding signals filtered with adaptive localized filtering based on the magnitude of the S-transform, from 100 to 350 seconds of the data shown in FIGS. 4A to 6E; -
FIG. 8 shows an event detection system; -
FIG. 9 is a plot of idealized sensor data of the event detection system of FIG. 8; -
FIG. 10 is a plot of summation data based on the sensor data of FIG. 9; -
FIGS. 11a to 11f are plots of filtered data based on the sensor data of FIG. 9, together with the sensor data of FIG. 9; -
FIG. 12 is a plot of the first derivative of the sensor data of FIG. 9 showing features of an event; -
FIG. 13 shows an event detection system; -
FIG. 14 is a plot of idealized sensor data of the event detection system of FIG. 13; -
FIG. 15 shows an event detection system; -
FIG. 16 is a plot of idealized sensor data of the event detection system of FIG. 15; -
FIG. 17 is a plot of idealized filtered sensor data of the event detection system of FIG. 15; -
FIG. 18 shows an event detection system; -
FIG. 19 is a plot of idealized sensor data of the event detection system of FIG. 18; and -
FIG. 20 shows an event detection system. - Herein provided are a method and system for heterogeneous event detection. The method includes, and the system facilitates, acquiring data from sensors and processing the resulting sensor data to define events that are heterogeneous from occurrence to occurrence of the event. The method and system apply localized adaptive filtering through a local time-frequency transform, an inverse transform, and an adaptive filter mask. The adaptive filter mask is based on the time-frequency representation and is defined for time periods defining data windows, with reference to locally abundant frequencies. The method and system provide an adaptive approach that may be applied to detecting events in real-time or in batch processing. The filter mask is determined with respect to local conditions around an event, avoiding the need for the arbitrary thresholds applied in many other commonly used detection techniques. The method and system provide a robust and adaptive solution for detecting movements of people, animals or automata, detecting sounds, or processing any other received data associated with any type of event that may not be consistent from one occurrence of the event to the next.
- Each individual event may be characterized and characteristics of the event may be defined. The characteristics may be used to analyze the sensor data for characteristics that may include duty cycles of events and statistics surrounding the characteristics. Event detection may be facilitated by identifying characteristics within the event other than the duration and boundaries of the event. Such characteristics may include the pulse width of each event, the contribution from multiple sensors during an event, and the duty cycle if the events are periodic. These characteristics describe the event, and may be used to verify the event detection process. The characteristics may be provided to a user of the method and system as event data. In response to the event, the method and system may prompt a suggested change to improve performance of the event or outcome of the event. In response to the event and the event data, the method and system may update parameters of how the event is characterized or change parameters of how a device functions to change user experience in relation to the event or prepare for an outcome expected to follow the event.
- Applying the localized adaptive filter to the sensor data may include segmenting the sensor data into data windows and converting the resulting data windows into a time-frequency representation using a transform such as the S-transform, including a discretized S-transform. In the time-frequency representation, the relative contributions of many frequencies of the sensor data profile can be represented for each time point. Adaptive localized filtering magnifies the most prominent frequencies at each time point and suppresses the least prominent frequencies at each time point in the resulting filtered data. The adaptive filtering may be based on a power of the magnitude of the time-frequency representation, resulting in greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies. The greater divergence in contribution from more prominent frequencies as compared with less prominent frequencies may facilitate heterogeneous event detection.
- Generally, an event is defined as a distinct waveform bound between two features along the timeline of the sensor data or filtered data, although as described above, locating the features is facilitated in the filtered data relative to locating the features in the sensor data. The features may include inflection points. The inflection points may be local extrema within the data, a derivative of the data, or an integral of the data. Once an event has been delineated in the filtered data, further processing of the event within its boundaries may be performed with reference to the filtered data, the sensor data or both. A portion of the event that occurs above a predefined threshold in respect of sensor data from one or more sensors may define a shape or other characteristic of the event. One method of describing the event is by its pulse width, which is the amount of time that the sensor data of the event is above a specified threshold, typically beyond a baseline level in a data series; the pulse width thus spans the start and end times of the event. Sensor contribution can be studied within the pulse width. Definition of these characteristics facilitates detection and analysis of events, and of other aspects of the data set.
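The pulse-width definition above can be sketched as follows (a minimal illustration, not the disclosed implementation; the signal and threshold are hypothetical): each run of samples above the threshold is reported as one event bounded by its start and end times.

```python
import numpy as np

def pulse_widths(data, times, threshold):
    """Return (start_time, end_time) for each run of samples above `threshold`."""
    above = data > threshold
    edges = np.diff(above.astype(int))          # +1 where a run starts, -1 where it ends
    starts = np.flatnonzero(edges == 1) + 1     # first sample of each run
    ends = np.flatnonzero(edges == -1) + 1      # first sample after each run
    if above[0]:                                # run already in progress at t[0]
        starts = np.r_[0, starts]
    if above[-1]:                               # run still in progress at t[-1]
        ends = np.r_[ends, len(data)]
    return [(times[s], times[e - 1]) for s, e in zip(starts, ends)]

t = np.arange(0.0, 10.0, 0.01)
# two pressure "events" of different widths above a small baseline
p = 0.1 + np.where(np.abs(t - 2.0) < 0.5, 1.0, 0.0) + np.where(np.abs(t - 6.0) < 1.0, 1.0, 0.0)
events = pulse_widths(p, t, threshold=0.5)
```

Here the two detected events have widths of roughly 1 s and 2 s, illustrating how heterogeneous durations fall out of the same threshold-crossing definition.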
- Event detection acts as a preliminary step in analyzing input data. Generally, the information of interest in a data set includes dynamic events rather than static moments. Once an event is detected and delineated, the event may be further analyzed. Analysis of the event and of features within the event is not possible without first detecting the event. An event of interest may be heterogeneous in nature, consisting of repetitive or non-repetitive occurrences. In the case of repetitive, reoccurring events, there may be high or low variability from event to event.
- The method and system may be applied to detecting events that are not consistent from one occurrence of the event to the next, and that may occur with inconsistent intervals between occurrences. The method and system may be applied to detecting events that are steps or other defined body movements associated with various activities (e.g. walking, running, jumping, biking, skiing, swimming, martial arts, boxing, yoga, gymnastics, dancing, etc.), or portions of any such activities. The method may include use of, and system may include, sensors that are worn on the feet, legs, wrists, arms, torso or other portions of the body, depending on the application. The sensors may include a pressure sensor, a gyroscope, an accelerometer, a seismograph, a thermometer, a humidity sensor, a microphone or other audio sensor, an optical sensor or any suitable sensor or combination of sensors depending on the specific application. Actuators or other output modules may drive changes in device function following detection.
- The method and system may be applied to optimize activities of a user or test subject, including by prompting changes (e.g. audio, visual or tactile alerts to change movement patterns), or by changing device function (e.g. by inflating or deflating bladders around an insole, changing stiffness of wrist or other joint braces, changing output from a hearing aid, etc.). These responses may facilitate improved performance of movements that are detected and characterized as an event. Particularly with application to body movements that are generally repetitive, coaching alerts and changes in device function may improve performance of the physical activity. Audio data may also be used in applications to improve device function, such as changing output from a hearing aid in response to changes in background noise, or in response to audio data indicating that a known person is talking, emphasizing or de-emphasizing certain frequencies in audio output to help a user hear higher or lower frequencies. Audio data may also prompt changes, for example to improve playing a musical instrument, singing, or speaking a language.
- The method and system may be applied to gait analysis of a person, animal, or machine, by first detecting individual steps as events within a gait data set and further analyzing the features of each step event, and the statistics of the overall gait data set. A foot strike analysis may be applied to detecting the position on the foot where striking occurs during running or walking, to provide coaching feedback for improving gait efficiency and reducing injury potential. A rate of pronation and supination may be detected within each step event, allowing for coaching to improve gait efficiency and reduce injury.
- In applications directed to detecting steps, footfalls or other body movements of an individual, events that are in turn defined by multiple discrete features may be identified, different types of events based on peaks may be characterized, and plots of the sensor data may be applied contextually to generate the time-frequency representation. These features may facilitate accurate definition of heterogeneous events. In the context of detecting steps or other footfalls of an individual, such heterogeneous events may include changing speed, climbing stairs, walking on a ramp or other incline, tapping feet, or other events that may vary in unpredictable ways between footfalls or other events.
- Characteristics of each step may be defined, including the pulse width of the step, which may correlate to the ground contact time, and the center of pressure of the step. The sensor data may be represented by summations of multiple pressure sensors. The summations of the sensor data may be filtered by the heterogeneous event detection system, and the filtered data may be used to detect boundaries of step events based on features in the filtered data. The source data that is filtered to provide the filtered data may include pressure data from one or more sensors, and may also include data of acceleration, rotation, temperature, humidity or other data.
- Once the boundaries of the step events have been identified in the filtered data, the time bounds of the step events may be used for further analysis of the sensor data or other data. Within the identified time bounds, the pressure from each of the multiple sensors may be compared to one another for analysis. Within the identified time bounds, sensor data, filtered data or both for humidity, temperature or other aspects of the user's feet may be cross-referenced to the pressure data to more thoroughly characterize the event. Similarly, with pressure data as the principal type of data, acceleration, rotation or tilt data may also be superimposed over the pulse width of the event to characterize the event other than by pressure.
- When characterizing a series of walking steps, three components may commonly be defined: the number of steps within the series, the ground contact time of each step, and the path of center of pressure of the series. The localized adaptive filtering and identification of features in the filtered data facilitates detecting the start time and end time of each individual step, and the number of steps in the series. The ground contact times may be calculated within the bounds of the start and end times for each step. Within a step, the amount of time that a stance phase endures between swing phases defines a ground contact time. Between the start and end times of a step, inflection points of the rising edge and falling edge of a step, representing the onset and offset of the ground contact time respectively, may be detected by a number of means. The ground contact onset and offset points may then be used as bounds in which to calculate centres of pressure for an individual step, to filter out the pressures that are recorded during the swing phase of the step. Any change in center of pressure measurements over multiple steps may yield the path of center of pressure for the walking data.
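One way to locate the ground-contact onset and offset within a step's bounds, as described above, is via the steepest rising and falling points of the pressure curve. The following is an illustrative sketch only (the single-step pressure curve is synthetic and the derivative-extremum criterion is one of the "number of means" mentioned):

```python
import numpy as np

def ground_contact_time(p, t):
    """Onset = steepest rise (foot loading), offset = steepest fall (unloading),
    taken as the extrema of the first derivative of pressure within one step."""
    dp = np.gradient(p, t)
    onset = t[np.argmax(dp)]
    offset = t[np.argmin(dp)]
    return offset - onset, onset, offset

t = np.arange(0.0, 4.0, 0.01)
# hypothetical single-step pressure: loading at ~1 s, unloading at ~3 s
p = 1.0 / (1.0 + np.exp(-10.0 * (t - 1.0))) - 1.0 / (1.0 + np.exp(-10.0 * (t - 3.0)))
gct, onset, offset = ground_contact_time(p, t)
```

For this synthetic step the recovered ground contact time is close to 2 s; repeating the calculation per detected step yields the per-step ground contact times used in the gait statistics.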
- Correlating the locations of the sensors relative to an individual's foot when measuring steps or other activities involving steps allows the sensor data recorded by each sensor to be used to define the center of pressure of the entire sensor system at any one instant in time using known center of pressure calculations. Similarly, the path of the center of pressure throughout each event may be determined throughout the entire pressure series. Additional characteristics of the sensor data may be calculated after detection of the step, including any events defined within the step, such as a heel strike event, a forefoot strike event, a ground contact with the ball of the foot event, and a toe-off event, each of which may be grouped into a ground contact event portion of the step event. The ground contact event portion may define stance phase, and the remaining features of the data within a cycle may define a swing phase of each step.
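The known center-of-pressure calculation referenced above is a pressure-weighted mean of the sensor positions at each instant. A minimal sketch (sensor positions and pressure values are hypothetical):

```python
import numpy as np

def center_of_pressure(pressures, positions):
    """Pressure-weighted mean of sensor positions per sample.
    pressures: (n_samples, n_sensors); positions: (n_sensors, 2) foot coords in cm."""
    total = pressures.sum(axis=1, keepdims=True)
    return (pressures[:, :, None] * positions[None, :, :]).sum(axis=1) / total

# four hypothetical insole sensors: heel, midfoot, ball, toe (x = medial-lateral, y = fore-aft)
positions = np.array([[2.0, 4.0], [4.0, 12.0], [5.0, 20.0], [4.5, 25.0]])
# pressure rolling heel-to-toe over one step
pressures = np.array([
    [10.0, 1.0, 0.0, 0.0],   # heel strike
    [4.0, 4.0, 2.0, 0.0],    # midstance
    [0.0, 1.0, 8.0, 3.0],    # forefoot loading
    [0.0, 0.0, 2.0, 8.0],    # toe-off
])
cop_path = center_of_pressure(pressures, positions)
```

Tracking `cop_path` across samples gives the path of the center of pressure for the step; here the fore-aft coordinate advances monotonically from heel toward toe, as expected for the heel-to-toe roll.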
- Identified features of the sensor data may be leveraged to analyze the sensor data for characteristics that may include the pulse width, the duty cycle, the ground contact time, the center of pressure of the step, the path of the center of pressure, and other statistics (e.g. mean, deviation, etc.) surrounding the features, and relating to various portions of the step event. Such characteristics of the step events may facilitate assessing the mean and deviation of ground contact times, changes in center of pressure, changes in the path of the center of pressure, or other characteristics of a step event. These features of step events may be detected by pressure sensors, accelerometers or any other suitable system for detecting movement of a foot and contact of a sole with a walking surface.
- After step detection and characterization of heterogeneous steps, the method and system may include adjusting upstream or downstream aspects of the system's functionality to change how the data is processed. For example, if a series of jumps is detected or the pace of a run increases, the method may increase its sampling rate, while if the user appears to be sitting down, the sampling rate may decrease to save battery life. Functional responses of the system to these changes allow for low-power modes, extending battery life, while still allowing for high-frequency sampling during events in order to provide useful information for users in terms of coaching for avoiding injury, coaching for performance, research or any suitable application.
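A sampling-rate policy of the kind described above might be sketched as follows (a hypothetical policy, not from the disclosure; the 20x multiplier, baseline, and ceiling are illustrative assumptions):

```python
def choose_sampling_rate(event_rate_hz, base_hz=50, max_hz=400):
    """Hypothetical policy: sample at ~20x the detected event rate,
    never below a low-power baseline, never above a hardware maximum."""
    return int(min(max(base_hz, 20 * event_rate_hz), max_hz))
```

For instance, a detected event rate of 0 Hz (sitting) keeps the low-power baseline, ~3 Hz stepping raises the rate modestly, and rapid jumping saturates at the hardware ceiling.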
- The method and system may also prompt the user in response to an event. The prompt may be in the form of coaching for athletic performance, to avoid injury or for other reasons. The prompting may be communicated to the user in any manner that is appropriate to the application. Visual, audio or tactile feedback triggered by event detection may provide clear suggestions on how to improve performance or avoid injury. The feedback may be neuroplastic, such as for events detected at a region of interest that has limited or no sensation. Providing a tactile feedback to an area that does have sensation may train the user to intuitively recognize and react to that tactile feedback. For example, pressure, acceleration, rotation and temperature sensors may be placed on the foot of a neuropathic patient who has loss of feeling in their feet. The sensors and system may detect a step and convey a signal depicting the detection of the step to a transducer mounted on the patient's back as a vibrational signal that the patient can feel. The event may also include a spike in temperature that the patient may not be able to feel and recoil from.
- In some applications, movements may be detected on the user's head, neck, hands, arms, torso, legs, feet or any suitable combination. As with sensors on the feet, sensors elsewhere on the body may include pressure, acceleration, rotation, temperature, humidity or any suitable sensor. The sensors may be grouped for processing data singularly from several sensors, and may be grouped according to the location of sensors on the user's body. In some cases, parallel sets of events are characterized on the same timeline. For example, data relating to pressure, acceleration, rotation, temperature and humidity on a user's arms and hands may be assessed through a first group of sensors, and data relating to pressure, acceleration, rotation, temperature and humidity on a user's legs and feet may be assessed separately through a second group of sensors. The first and second groups of sensors may provide data for coaching basketball, football, hockey, soccer, swimming, bicycling or any sport where performance may be optimized and stress injury avoided.
- In some applications, the event may also include slipping on a slippery surface, detected by an accelerometer but which the patient may not be able to feel and respond to. To provide fall prediction and prevention, the system may detect small, perhaps unnoticed events by pressure, acceleration and rotation measurements in the foot, leg, and torso. Detection and analysis of these events may provide some insights into fall probability, allowing for feedback to the user to change behaviour and prevent the fall, or to trigger an alarm or notify an emergency contact immediately preceding or following the fall.
- Where the method and system are directed to detecting background noise, spoken word, musical performance or other audio input data, events may be identified, different types of events based on peaks may be characterized, and plots of the sensor data may be applied contextually to generate the time-frequency representation. These features may facilitate accurate definition of heterogeneous events based on audio data. In the context of a hearing aid application, features such as signal amplitudes at various frequencies may be used to modulate output of the hearing aid to eliminate background noise or focus on a given person speaking. In educational applications, audio input data may characterize musical or language performance and provide coaching on that basis. Audio input data may also be included with step detection or other human performance applications to better characterize events and features. For step detection and musical coaching, both audio data and pressure, acceleration or rotation data may be referenced to avoid injury.
- Characteristics of words or passages of music may be defined, including the frequencies and amplitudes of the sounds to detect individual syllables or notes. Relationships between individual syllables or notes may also be characterized to define measures, choruses or other passages of music. Consistency of the words and notes, and consistency of tempo and rhythm, may be characterized. Similarly, background noise may be characterized by amplitudes and known repetitive sounds or white noise to identify an environment or social situation to adjust hearing aid output. Summations of the audio sensor data may be filtered by the heterogeneous event detection system, and the filtered data may be used to detect boundaries of verbal, musical, background noise, physical activity of the user, or other events based on features in the filtered data. The source data that is filtered to provide the filtered data may include audio data from one or more sensors, and may also include data of pressure, acceleration, rotation, temperature, humidity or other data depending on the application.
- Once the boundaries of spoken or musical events have been identified in the filtered data, the time bounds of the events may be used for further analysis of the sensor data or other data. Within the identified time bounds, audio inputs from each of multiple sensors may be compared to one another for analysis, and both the source data and the filtered data may be assessed for identification of heterogeneous events. In the case of musical coaching, the performer's physical posture and the music may be assessed separately to both improve performance and avoid injury.
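The flow described above — sum the sensor data, filter it, locate event boundaries in the filtered data, then analyze the source data within those bounds — can be sketched as follows. This is a minimal illustration only; the `localized_filter` and `find_event_bounds` helper names, the threshold, and the toy data are assumptions, not taken from the specification.

```python
import numpy as np

def localized_filter(data):
    # Stand-in for the adaptive localized filtering method (hypothetical;
    # the real method is described later with reference to FIG. 2).
    return data

def find_event_bounds(filtered, threshold):
    # Event boundaries: where the filtered data rises above, then falls
    # back to, the baseline threshold.
    active = filtered > threshold
    edges = np.flatnonzero(np.diff(active.astype(int)))
    return list(zip(edges[::2], edges[1::2]))

# Summation of audio (or pressure) data from two sensors.
sensor_a = np.array([0, 0, 2, 3, 2, 0, 0, 1, 4, 1, 0, 0], dtype=float)
sensor_b = np.array([0, 0, 1, 1, 1, 0, 0, 0, 2, 1, 0, 0], dtype=float)
summed = sensor_a + sensor_b

filtered = localized_filter(summed)
bounds = find_event_bounds(filtered, threshold=0.5)
# Within each (start, end) pair, the source data from each sensor can
# then be compared and analyzed separately.
```
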
- FIG. 1 shows a system 10 that may be used to implement a method for defining heterogeneous events. The system 10 may include a data source 20 from which a processing module 30 receives and processes data, allowing detection and visualization of heterogeneous events. Downstream functionality of the system 10 in response to data after processing by the processing module 30 is directed by an event data module 40, which may store processed data, communicate processed data to a user or change operating parameters of the system 10 in response to processed data. - The
data source 20 may include one or more sensors for receiving data of different types of stimulus. The data source 20 may also include stored data or simulated data for modelling and optimization. The data source 20 shown includes a first sensor 22 and a second sensor 24. A data source may include only the first sensor, such as the data source 420 of FIG. 15. The first sensor 22 receives a first stimulus 12, resulting in the first data 26. The second sensor 24 receives a second stimulus 14, resulting in the second data 28. The first sensor 22 and the second sensor 24 may be the same types of sensors for detecting the same types of data, or may be a combination of multiple types of sensors. The first sensor 22, the second sensor 24, or both, may include a pressure sensor, a gyroscope, an accelerometer, a seismograph, a thermometer, a humidity sensor, or any suitable sensor or combination of sensors depending on the specific application. The first data 26, the second data 28, or both, and correspondingly the first stimulus 12, the second stimulus 14, or both, may include measured pressure, acceleration, rotation, seismic signals, temperature, humidity, or any other data. - The
data source 20 may include a shoe-insert, such as the data source 220 of FIG. 8, the data source 320 of FIG. 13 or the data source 420 of FIG. 15. A shoe insert facilitates measuring the applied pressure at specific portions of a user's foot during walking, running, jumping, biking, skiing, or other activities. The first sensor 22 and the second sensor 24 may be combined in a sensor array included in the shoe-insert, such as the first sensor 222 and the second sensor 224 of FIG. 8. The data source may include one or more sensors on a glove, wristband, armband, elbow pad, headband, hairclip, torso harness, shirt, halter, belt, earpiece, ankle bracelet, leg band, knee pad, or any suitable location (not shown), any of which may be designed for an individual, an animal, an automaton, a prosthetic, or any suitable location for a given application. The data source may include one or more sensors on a component of a robotic system, an unmanned or manned vehicle, or any other suitable system. The data source 20 may include a glove, such as the data source 620 of FIG. 20, or any other suitable wearable data source. The data source 20 may also include an audio detection device such as the cochlear implant 620 of FIG. 18. The data source 20 may also include a sensory enhancement device, such as the cochlear implant 620 of FIG. 18, or a tactile output such as in the watch 320 of FIG. 13. - The
processing module 30 receives sensor data 27 from the data source 20 as a time-varying dataset. The sensor data 27 may include the first data 26, the second data 28, or both. The processing module 30 may be on a smartphone, smartwatch, tablet, computer, or other static, portable or wearable device. Communication between the data source 20 and the processing module 30 may be through any wired or wireless connection. The processing module 30 may be included in a single unit with the data source 20. The data source 20 may be connected to the processing module 30 directly or through an intermediary storage and transmission device 29 for providing temporary storage of the data, depending on the particular application of the system 10. - The
processing module 30 applies an adaptive localized filter process 32 to the sensor data 27 based on parameters 34, resulting in filtered data 35. The parameters 34 include an adaptive localized filter that is calculated with reference to a transform of the sensor data 27. The processing module 30 applies an event detection process 36 to the filtered data 35 to identify and characterize events, providing event data 37. The adaptive localized filter process 32 may be applied to the first sensor data 26 and the second sensor data 28 simultaneously as a combined data set, or to the first sensor data 26 and the second sensor data 28 separately. - Feedback or a summary of the events is provided to the
event data module 40. The event data module 40 may include an output module 42 for communicating the event data 37 to a user or effecting a change to operation of the system 10, a storage module 44 for storing the event data 37, or both. The event data module 40 may be on a smartphone, smartwatch, tablet, computer, or other device. Communicating the event data 37 to a user may be through visualization on an optical display, vibration through a tactile display, audio communication, text message, or any other suitable medium. Communication between the processing module 30 and the event data module 40 may be through any wired or wireless connection. The event data module 40 may be included in a single unit with the processing module 30. -
FIG. 2 is a flowchart of a method 50 for detecting heterogeneous events. The method 50 includes a localized filtering method 60 and an event detection method 70. The method 50 includes receiving sensor data 52. Receiving sensor data 52 provides sensor data. The localized filtering method 60 is applied to the sensor data, providing filtered data. The event detection method 70 is applied to the filtered data, providing event data. Communicating the event data 54, storing the event data 56, or both, may follow applying the event detection method 70. Progressing a moving time window by a pre-determined amount of time 58 follows, and the method 50 is repeated with receiving data 52, providing additional data corresponding to the timeline defined following shifting the moving window by a pre-determined amount of time 58. - The
localized filtering method 60 is applied to the sensor data. The localized filtering method 60 includes selecting a data window 62. The data window corresponds to a time period of the sensor data, which may include a plurality of data windows. Calculating a time-frequency representation 64 provides a time-frequency representation corresponding to each of the data windows. Calculating a filter mask 66 based on the time-frequency representation provides a filter mask. Localized filtering 68 is applied to the time-frequency representation using the filter mask, resulting in the filtered data. Applying each of calculating a time-frequency representation 64, calculating a filter mask 66, and localized filtering 68 to each of the plurality of data windows provides an adaptive filter for the data window to facilitate defining events in heterogeneous data. - Selecting a
data window 62 may generally be referred to as a moving-window technique. Selecting a data window 62 allows the remaining steps of the localized filtering method 60 to be applied to subsequent data windows, or precedent data windows, sequentially. As a result, the localized filtering method 60 may be applied to each of the data windows as the time period of each data window passes, and it is not necessary to wait for the entire dataset to be collected before applying the filtering method 60 to any data windows that have already been selected by selecting a data window 62. Selecting a data window 62 facilitates identification of events in near real-time. - Calculating a time-
frequency representation 64 on each data window facilitates calculating a filter mask 66 for each data window. Calculating a time-frequency representation 64 on each data window may be based on an S-transform, Gabor transform, or other suitable transform. The S-transform or other localizable transforms may be applied for providing localized information about the sensor data. Providing the localized transform may have advantages over a Fourier transform, which is globally applied across a dataset as a simple frequency representation. - Calculating a
filter mask 66 based on the time-frequency representation provides a filter mask. The filter mask is determined and recalculated for each data window of the time-frequency representation with reference to the characteristics of the time-frequency representation of the data window in respect of which calculating a time-frequency representation 64 is carried out. - The filter mask may remove irrelevant data and noise by applying a weighting value to each frequency for a given time point on the time-frequency representation. The filter mask may be based on the prevalence of the most and least abundant frequencies in the time-frequency representation over the time period. The filter mask may be non-binary, applying values other than 0 and 1 to each frequency in the time-frequency representation, depending on the magnitude of each frequency. In contrast, previous low-pass filters include assigning a weight of 1 to frequency values below a threshold value, and a weight of 0 to frequency values above the threshold value. Similarly, previous high-pass filters include assigning a weight of 1 to frequency values above a threshold value, and a weight of 0 to frequency values below the threshold value. In such previous approaches for detecting steps in an individual's gait, a single arbitrary threshold may be assigned without the step duration and step frequency being known. The step duration and frequency may in some cases be the information an event detection method is directed to defining, and application of a non-binary and localized filter mask may facilitate defining heterogeneous events, such as steps. An individual's gait may vary as the individual walks, runs, changes speed, climbs or descends stairs, climbs or descends a ramp or other incline, taps their feet, or makes other unpredictable actions that result in or affect an input of the sensor data.
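The contrast between a previous binary low-pass mask and the non-binary weighting described above can be illustrated numerically. The frequency axis, cutoff and magnitudes below are illustrative assumptions, not values from the specification:

```python
import numpy as np

freqs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # Hz, illustrative axis
magnitudes = np.array([0.2, 1.0, 0.3, 0.05, 0.6])  # |TFR| at one time point

# Previous approach: binary low-pass mask around an arbitrary cutoff,
# which must be chosen before the step duration and frequency are known.
cutoff = 3.0
binary_mask = (freqs < cutoff).astype(float)       # weight 1 below, 0 above

# Non-binary mask: each frequency keeps a weight that follows its
# prominence at this time point, so no cutoff needs to be assumed.
nonbinary_mask = magnitudes / magnitudes.max()
```

The non-binary mask preserves a strong 8 Hz component that the arbitrary cutoff would have discarded entirely, which is the behavior the adaptive approach relies on for unpredictable gaits.
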
- The non-binary and adaptive features of the filter mask remove a requirement to assign a single arbitrary frequency cutoff threshold before the step duration and resulting step frequency are known. The filter mask adapts to prominent frequencies in each data window. More prominent frequency values at a given time are assigned higher weighting values, while less prominent frequencies are assigned lower weighting values. A separate filter mask is calculated for the time-frequency representation corresponding to each data window. The filter mask applied to a particular data window is the filter mask that was calculated with reference to the particular data window.
- The square of the magnitude of the S-transform or other time-frequency representation at each frequency may be used as the filter mask. As a result, the more prominent frequencies in the dataset are assigned a higher mask value, and the less prominent frequencies are assigned a lower value, which may emphasize the more prominent frequencies and minimize the less prominent frequencies. Applying the square of the magnitude of the time-frequency representation may result in greater distinction between the contribution of more prominent and less prominent frequencies to the filtered data, compared with approaches in which the mask is directly proportional to the magnitude of the time-frequency representation. Where the mask is directly proportional to the magnitude of the time-frequency representation, the difference in contribution between the more prominent and less prominent frequencies to the filtered data may be less pronounced compared with approaches in which the mask is directly proportional to the square of the magnitude of the time-frequency representation. The magnitude of the time-frequency representation and the square of the magnitude of the time-frequency representation are examples of values that may be applied. Other power relationships between the filter mask and the time-frequency representation magnitude may also be applied.
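The effect of the chosen power relationship on the distinction between prominent and weak frequencies can be seen with two sample magnitudes (the values 1.0 and 0.5 are illustrative assumptions):

```python
# A prominent frequency with magnitude 1.0 and a weaker one with 0.5.
prominent, weak = 1.0, 0.5

# Mask proportional to the magnitude: the weak frequency keeps half
# the weight of the prominent one.
ratio_linear = weak / prominent            # 0.5

# Mask proportional to the square of the magnitude: the weak frequency
# keeps only a quarter of the weight, a more pronounced distinction.
ratio_squared = weak**2 / prominent**2     # 0.25

# Higher powers would separate the contributions even further.
ratio_cubed = weak**3 / prominent**3       # 0.125
```
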
- The filter mask may be non-binary and, whether based on the magnitude of the time-frequency representation, the square of the magnitude of the time-frequency representation, or another power value of the magnitude of the time-frequency representation, will vary according to the abundances of the various frequencies at each time point in the data window in respect of which the time-frequency representation and the relevant filter masks are calculated.
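A compact sketch of one data window's worth of this localized filtering follows. SciPy does not provide an S-transform, so the short-time Fourier transform (`scipy.signal.stft`/`istft`) stands in here for the S-transform or Gabor transform named above; the sampling rate, window length and test signal are assumptions for illustration:

```python
import numpy as np
from scipy.signal import stft, istft

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 8, 1 / fs)
# Step-like 1 Hz component plus a weaker 20 Hz component standing in for noise.
x = np.maximum(np.sin(2 * np.pi * 1.0 * t), 0) + 0.2 * np.sin(2 * np.pi * 20.0 * t)

# Localized time-frequency representation of the data window.
freqs, times, Z = stft(x, fs=fs, nperseg=128)

# Non-binary filter mask: square of the magnitude, scaled to [0, 1].
mask = np.abs(Z) ** 2
mask /= mask.max()

# Localized filtering: multiply the mask with the representation, then
# convert back into the time domain by the inverse transform.
_, filtered = istft(Z * mask, fs=fs, nperseg=128)
filtered = filtered[: x.size]
```

Because the mask follows the prominence of each frequency, the dominant step-like component survives the filtering while the weaker 20 Hz component is strongly attenuated, without any cutoff frequency having been chosen in advance.
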
- Localized
filtering 68 is applied to the time-frequency representation using the filter mask, resulting in the filtered data. The localized filtering 68 may include multiplication of the filter equation with the time-frequency representation of the data window and converting the time-frequency representation back into the time-domain. The time-frequency representation may be converted back into the time-domain by an inverse S-transform, an inverse Gabor transform, or any suitable transform. The filtered data may then be further processed in the time domain, such as by the event detection method 70. The inverse transform may be completed by any suitable approach, such as the time inverse transform described in M. Schimmel, J. Gallart, "The inverse S-transform in filters with time-frequency localization", IEEE Transactions on Signal Processing, Vol. 53, No. 11, 4417-4422, 2005. - Where the magnitudes of the time-frequency representation are large, or result in large deviations between the resulting magnitudes of the time-frequency representation at more prominent frequencies compared with less prominent frequencies at a given time point in the time period, there may be advantages to normalizing the values against the greatest magnitude in the filtered data, after squaring or applying other power relationships to the magnitudes of the time-frequency representation. Such a normalization would result in each data point in the filtered data having a value varying between 0 and 1. The normalized values may result in a more recognizable plot of events for visualization by a user than would be the case where the localized
filtering 68 is applied to the time-frequency transform values without normalization. However, the event detection method 70 may be applied to the filtered data regardless of whether the magnitudes of the time-frequency transform values (or the magnitudes elevated to a power) are normalized before localized filtering 68 is applied to the magnitudes of the time-frequency transform. - The
event detection method 70 is applied to the filtered data. The event detection method 70 includes identifying features 72 and counting events based on the features 74. - Identifying features 72 may be based on the event duration being greater than a defined minimum event duration, less than a defined maximum duration, or both. Identifying features 72 may be based on the event magnitude being greater than a defined minimum event magnitude, less than a defined maximum event magnitude, or both.
- Inflection points, such as local minima, local maxima, or both, may be used to detect the beginnings and ends of events.
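Identifying features by magnitude and duration bounds, locating event boundaries from local extrema, and counting events can be sketched with `scipy.signal.find_peaks`. The thresholds, the merge rule, and the synthetic signal are illustrative assumptions, not values from the specification:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                       # assumed sampling rate, Hz
t = np.arange(0, 5, 1 / fs)
filtered = np.maximum(np.sin(2 * np.pi * 1.0 * t), 0.0)  # five step-like events

# Identify features: local maxima exceeding a minimum event magnitude
# and a minimum event duration (width, in samples).
peaks, _ = find_peaks(filtered, height=0.5, width=10)

# Local minima of the signal (maxima of its negation) bound the events.
troughs, _ = find_peaks(-filtered)

# Count events, merging maxima that fall too close together in time into
# a single aggregated event containing constituent events.
min_separation = int(0.25 * fs)                  # 0.25 s, assumed
count = 0
last_peak = None
for p in peaks:
    if last_peak is None or p - last_peak >= min_separation:
        count += 1                               # begins a new event
    last_peak = p
```

A double peak from a hard heel strike followed by a forefoot strike would, under the merge rule, be counted as a single aggregated step rather than two events.
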
- Counting events based on
features 74 may be applied to each data window based on the criteria used for identifying features 72 based on previous inflection points. Depending on the difference in the magnitude of, or time elapsed between, local extrema, other inflection points or other features, some features may be determined to define endpoints of constituent events within an overall aggregated event including the constituent events, when counting events based on features 74. - After the events are counted within a given data window, the
method 50 may include communicating the event data to a user 54, storing the event data 56, or both. After communicating the event data 54, storing the event data 56, or both, progressing a moving time window by a pre-determined amount of time 58 may precede receiving sensor data 52 in an application of the method 50 to a subsequent data window. The corresponding sensor data for a subsequent data window is received and the localized filtering method 60 and the event detection method 70 are applied to the subsequent data window. - Communicating the
event data 54 may include communicating in real time with a user who is applying the method 50. Communicating the event data 54 may involve a prompt to encourage a habit that is expected to improve performance, avoid injury or otherwise provide a benefit. Communicating the event data 54 may also be to a person who is not the source of the stimulus that leads to receiving sensor data 52. - Storing the
event data 56 may include storing details of features and events defined by the event data. Following filtering and transformation back into the time domain, the previously recorded inflection points may be used as a reference point for defining events. The type of extrema to search for may be selected based on the previous data window. Where the previous data window included a maximum as an inflection point, then a minimum in the following data window may be the search target. Where the previous data window included a minimum as an inflection point, then a maximum in the following data window may be the search target. -
FIG. 3 is a flowchart of a method 150 for detecting heterogeneous events. The method 150 includes the localized filtering method 160 and the event detection method 170. The method 150 includes defining parameters 151. In some applications, the parameters may already be set or may not be definable by a user or otherwise, and defining parameters 151 may be absent from a method for detecting heterogeneous events (e.g. the method 50, etc.). The method 150 includes receiving sensor data 152, resulting in the sensor data. The localized filtering method 160 is applied to the sensor data, resulting in the filtered data. The event detection method 170 is applied to the filtered data, resulting in the event data. Communicating the event data 154, storing the event data 156, or both, may follow applying the event detection method 170. Progressing a moving time window by a pre-determined amount of time 158 follows, and the method 150 is repeated with receiving data 152, providing additional data corresponding to the timeline defined following shifting the moving window by a pre-determined amount of time 158. - The
localized filtering method 160 is applied to the sensor data. The localized filtering method 160 includes selecting a data window 162. The data window corresponds to a time period of the sensor data, which may include a plurality of data windows. Calculating a time-frequency representation 164 provides a time-frequency representation corresponding to each of the data windows. Calculating a filter mask 166 based on the time-frequency representation provides a filter mask. Localized filtering 168 is applied to the time-frequency representation using the filter mask, resulting in the filtered data. Applying each of calculating a time-frequency representation 164, calculating a filter mask 166, and localized filtering 168 to each of the plurality of data windows provides an adaptive filter for the data window, to facilitate defining events in heterogeneous data. - Defining
parameters 151 may take place for each data window. Defining parameters 151 may include defining upstream parameters such as the time scale of each data window, or any details of calculating the time-frequency representation 164, calculating the filter mask 166, or performing localized filtering 168. The time scale may be selected with reference to the type of events that are expected. In applications where receiving sensor data 152 is directed to sensor data of an individual's gait, a normal step may take between 0.5 and 2 seconds. The minimum and maximum window time scales may be placed around this estimated duration accordingly, such that much shorter or much longer steps would not be considered. The minimum step time scale may be related to the time scale of the previous step. An individual step may be defined as longer than 25% of the duration of the previous step, otherwise it will be treated as part of the previous step. Similarly, an individual step may be defined with reference to the force of a footfall on the previous step and any relationship from a baseline force, to distinguish sensor data of the individual taking distinct steps from sensor data of the individual shifting their weight without taking a step. Defining individual steps with reference to magnitude of footfall force may also facilitate defining jump landings, steps on stairs, steps on ramps, steps after taking on a significant load, or other heterogeneous events that are indicative of different types of steps or other activities that register force on a footfall. - Defining
parameters 151 may also include updating downstream user-experience functions of the system within which the method 150 is practiced. In applications that use an insole or other plantar pressure sensors, this may include inflating or deflating bladders around an insole, changing stiffness of wrist or other joint braces, changing output from a hearing aid, changing the pace of a metronome, or other functions. - The
method 50 and the method 150 may each be carried out with the system 10. The sensor data 27 may be received by the processing module 30 from the data source 20 at a predetermined frequency, in some cases via temporary storage 29. The localized filter process 32 may be applied to the sensor data 27 to apply the localized filtering method 60. The event detection process 36 may be applied to the filtered data 35 to carry out the event detection method 70. The event data 37 may be applied in communicating the event data 54 to the user through the output module 42, in storing the event data 56 in the storage module 44, or both. The sensor data 27 may generally be represented as a time-varying signal. Where the sensor data 27 includes pressure data, the pressure data may represent the application of pressure over a defined amount of time. - The rate at which the
sensor data 27 is sampled from the data source 20 may be adjusted according to the data set after the localized filtering method 60 has been applied to the sensor data 27. Some applications may benefit from greater resolution in the sensor data 27 while other applications may benefit from down-sampling to conserve bandwidth in the sensor data 27 or increase event detection speed. -
FIGS. 4A to 7C show simulated sensor data, time-frequency representations, and filtered data in a test application of the method 50 to simulated pressure sensor data that would be obtained in a system similar to the system 10 adapted for use on feet, similarly to the system 210 of FIG. 8 or the system 310 of FIG. 13. - The simulated sensor data was modified from empirical pressure data acquired with a single pressure sensor. The simulated sensor data was modified to increase heterogeneity of events to be characterized by application of the
method 50. The individual events represented in the simulated sensor data differ in their specific pressure-time profiles more than individual events tend to differ in empirical pressure data. The simulated sensor data corresponded to pressure sensor data of the types of events that may be seen when applying the method and system to an individual who is walking to measure steps or other events related to the individual's gait (e.g. changing speed, climbing stairs, walking on a ramp or other incline, tapping feet, etc.).
- Two S-transforms were applied to the simulated sensor data, in each case resulting in a time-frequency representation. A first adaptive localized filter was based on the magnitude of an S-transform of the simulated sensor data. A second adaptive localized filter was based on the square of the magnitude of the S-transform of the simulated sensor data.
- Normalization was not applied to the filtered data. As further described below with reference to
FIGS. 4A to 7C , the adaptive filter provides more accurate event detection than the filter based on a Fourier transform. In one case, a previous filter based on low-pass Fourier transform identified two events as one, while the adaptive filter correctly identified the two separate events. -
FIGS. 4A, 4B, and 4C show the simulated sensor data series (solid lines) as received from sensors, and filtered data obtained using a low-pass Fourier transform approach, which applies to the entire data set, as opposed to the localized and adaptive approach of the method 50. FIG. 4A shows filtered data obtained by applying a low-pass Fourier transform-based filter with a first cut-off frequency to the simulated sensor data (dashed lines). FIG. 4A also shows filtered data obtained by applying a low-pass Fourier transform filter with a second cut-off frequency to the simulated sensor data (dotted lines). FIG. 4B shows only the simulated sensor data (solid lines) and the filtered data at the first cut-off frequency (dashed lines). FIG. 4C shows only the simulated sensor data (solid lines) and the filtered data at the second cut-off frequency (dotted lines). - The first cut-off frequency is lower than the second cut-off frequency. As can be seen at between about 750 and about 950 seconds, two peaks in the simulated sensor data were interpreted as one event in the filtered data at the first cut-off frequency (dashed lines).
-
FIG. 5 is a time-frequency representation plot of the same simulated sensor data of FIG. 4A after a localized S-transform of the simulated sensor data, as in the calculating a time-frequency representation 64 portion of the method 50. A filter mask is determined for each data window of the sensor data with reference to the time-frequency plot of FIG. 5 corresponding to the respective time period, as in the method 50 in the calculating a filter mask 66 portion of the localized filtering method 60. The respective filter masks are applied to the time-frequency representation of FIG. 5 when filtering the data during localized filtering 68 in the method 50. The filter mask for each data window in time is determined with reference to the magnitudes of each frequency at the data window of the time-frequency representation. -
FIGS. 6A, 6B, and 6C show the simulated sensor data of FIG. 4A (solid line in each of FIGS. 6A, 6B, and 6C) and the resulting filtered signal using an adaptive, localized filtering method including features from the method 50 as shown in FIG. 2. FIG. 6A also shows filtered data obtained by applying a filter mask proportional to the time-frequency representation magnitude (dashed lines). FIG. 6A also shows filtered data obtained by applying a filter mask proportional to the square of the magnitude of the time-frequency representation (dotted lines). FIG. 6B shows only the simulated sensor data (solid lines) and the filtered data based on the magnitude of the time-frequency representation (dashed lines). FIG. 6C shows only the simulated sensor data (solid lines) and the filtered data based on the square of the magnitude of the time-frequency representation (dotted lines). -
FIG. 6D shows only the filtered data based on the magnitude of the time-frequency representation (solid lines). FIG. 6E shows only the filtered data based on the square of the magnitude of the time-frequency representation (solid lines). - As can be seen at between about 750 and about 950 seconds, the two peaks in the simulated sensor data that were interpreted as one event in the filtered data at the first cut-off frequency of the Fourier transform (dashed lines in
FIGS. 4A and 4B) were interpreted as separate events when applying the adaptive localized filter based on the localized time-frequency representations. -
FIGS. 7A, 7B, and 7C show a series of time-frequency representation magnitude plots of the simulated sensor data after a localized S-transform of the simulated sensor data (bottom). FIGS. 7A, 7B, and 7C also show plots (top) of the corresponding sensor data (solid lines) and the filtered signal using an adaptive localized filtering technique in the current data window based on the square of the magnitude of the time-frequency representation (dashed lines; the same plots shown in FIGS. 6A, 6C, and 6E). FIG. 7A shows sensor data and filtered data from 0 to 250 seconds, FIG. 7B shows sensor data and filtered data from 50 to 300 seconds, and FIG. 7C shows sensor data and filtered data from 100 to 350 seconds. - This example application shows source data filtered in accordance with the
localized filtering method 60 in FIGS. 6A, 6B, and 6C. In contrast, FIGS. 4A, 4B, and 4C show application of a Fourier transform to the global dataset. As shown in FIGS. 7A, 7B, and 7C, the localized filtering method 60 provides a filtered time-representation of the pressure data that highlights the most prominent events or steps, and minimizes the noisy, least important components of the signal. - As shown by comparing the plots of the filtered data obtained by filtering the source data using the global filter (
FIGS. 4A to 4C) with plots of the filtered data obtained by filtering the source data using the adaptive localized filter (FIGS. 6A to 6E), the adaptive localized filter provides filtered data more closely approximating the simulated sensor data in several portions of the pressure-time curve. Several specific examples of the adaptive filter more closely approximating the simulated sensor data follow below. - The peak at about 50 seconds is more closely approximated by the adaptive localized filter.
- When detecting events that are steps taken by an individual, a hard “heel first” strike, followed by contact at the ball of the foot may result in the double peak in the simulated sensor data as shown around 150 seconds in
FIGS. 6A to 6E . The adaptive localized filter correctly interpreted this event as a single step. While the global Fourier transform filters applied inFIGS. 4A to 4C also defined the step around 150 seconds as a single step, in other cases a simple high-pass or low-pass filter may not provide the same accuracy. The troughs at about 225 seconds are deeper and further from the simulated sensor data in the global Fourier transform filter compared with the adaptive localized filter. - Between about 600 and about 1000 seconds, the adaptive localized filter more closely follows the simulated sensor data contours than the global Fourier transform filter. As can be seen at between about 750 and about 950 seconds, two peaks in the simulated sensor data were interpreted as one event in the filtered data at the first cut-off frequency (dashed lines).
- Each of the adaptive localized filters applied show nine events, as most simply shown in
FIGS. 6D and 6E . The nine events would be counted by application of theevent detection method 70. In contrast, the global Fourier transform with the first (lower) cut-off frequency misinterpreted two events as one between about 750 and about 950 seconds. The increased accuracy in characterizing heterogeneous events may be facilitated by application of adaptive localized filtering based on the magnitude (or the magnitude elevated to a power) of a localized time-frequency representation determined within a progressing window of the simulated sensor data. - The simulated sensor data show a well-defined baseline, which facilitates application of the global Fourier transform. In other applications with a drifting baseline, the global Fourier transform may be more likely to result in inaccurate filtering of the time-frequency representation and inaccurate event detection. Similarly, the adaptive localized filtering method may provide additional advantages for applications with a drifting baseline.
-
FIG. 8 shows a schematic of an event detection system 210. The data source 220 includes the first pressure sensor 222 and the second pressure sensor 224 on a footwear insert for a subject individual's foot. In such applications, the first pressure sensor 222 may be located at a location corresponding to a heel of a foot on the data source 220. The second pressure sensor 224 may be located at a location corresponding to a ball of a foot on the data source 220. Sensor data 227 collected by the pressure sensor system 220 may then be transmitted by an intermediary storage and transmission device 229, and analyzed by the processing module 230 inside a laptop computer. The laptop computer also includes an event data module 240. The event data module 240 may communicate the event data 237 to the communication module 242, store the event data 237 in a storage module 244, or both. -
FIG. 9 is a plot of idealized sensor data 227 obtained with the event detection system 210. The first sensor data 226 (dotted lines) from the first sensor 222 at the heel of a foot, and the second sensor data 228 (dashed lines) from the second sensor 224 at the forefoot, also called the ball of the foot, where the metatarsals are located, are both shown on the plot. The sensor data 227 may be obtained by applying the method 50 or the method 150 using the system 220. -
FIG. 10 is a plot of summation data based on the sensor data 227 of FIG. 9, showing the idealized sensor data 227 as a summation of the first sensor data 226 and the second sensor data 228. Within the sensor data 227, various events may be identified, characterized, and bound by points of inflection. An entire step event 290 may be considered an event, which is bound in the time dimension by inflection points that deviate from the baseline reading. However, smaller component events of the step event 290 include a heel strike 291 that may be recorded by the first pressure sensor 222 at the heel, and a forefoot strike 292 that may be recorded by the second pressure sensor 224 at the forefoot. Both the heel strike 291 and the forefoot strike 292 are defined by apparent inflection points within the sensor data 227, which has a drifting baseline. -
FIG. 11 a is a plot of filtered data 235 (solid lines) based on the sensor data 227 of FIG. 9 (dashed lines). The step event 290 is shown in the filtered data as a single event without the first event 291 and the second event 292. Using the filtered data, the bounds of the step event 290 can be identified with greater accuracy than they were in FIG. 10 based on the sensor data. The drifting baseline is filtered out of the sensor data 227. In this example, the encompassing event is an entire step event 290, defined by the inflection points of the filtered data 235, in this case, minima in the filtered data 235 that correspond to non-extrema inflection points in the sensor data 227. -
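Bounding an encompassing event at minima of the filtered data can be illustrated with a short sketch. The minimum-separation parameter is a hypothetical value; the disclosure defines bounds at inflection points generally, with minima standing in here as in the step event 290 example.

```python
import numpy as np
from scipy.signal import find_peaks

def step_bounds(filtered, min_separation=10):
    """Bound encompassing events (e.g. step events) at local minima of
    the filtered data; consecutive minima delimit one event each."""
    minima, _ = find_peaks(-np.asarray(filtered), distance=min_separation)
    return list(zip(minima[:-1], minima[1:]))
```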
FIGS. 11 b and 11 c show the sensor data 227 from the heel (dashed lines in 11 b) and the forefoot (dashed lines in 11 c) superimposed over the filtered data 235. -
FIGS. 11 d and 11 e respectively show the sensor data at the heel and at the forefoot (dashed lines), and filtered data 235 at the heel (solid line in FIG. 11 d) and the forefoot (solid line in FIG. 11 e). -
FIG. 11 f shows the filtered data 235 at the heel (solid line), the filtered data 235 at the forefoot (dashed line) and the combined sensor data 227 (dotted line). The bounds of the step event 290 are defined at the intersections of the filtered data 235 at the heel (solid line) and the filtered data 235 at the forefoot (dashed line), which correspond to the same points as identified in the summed filtered data 235 prepared from the sensor data 227 of the combined first sensor 222 and second sensor 224. -
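Defining the bounds of the step event 290 at the intersections of the heel and forefoot filtered traces amounts to locating sign changes in their difference. A sketch, with the function name chosen for illustration:

```python
import numpy as np

def crossing_indices(heel, forefoot):
    """Indices where two filtered traces intersect: the sign of their
    difference changes at each intersection."""
    diff = np.asarray(heel) - np.asarray(forefoot)
    return np.nonzero(np.diff(np.sign(diff)) != 0)[0]
```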
FIG. 12 shows the idealized sensor data 227 (dashed lines) and the derivative of the sensor data 227 (solid line) superimposed with the event bounds of the step event 290 for defining the event data 237. Within the step event 290, a ground contact event 293 may be defined using the first derivative of the sensor data 227 (or the filtered data of the first derivative of the sensor data 227). Once the bounds of the step event 290 are defined using the filtered data 235 of the first derivative of the sensor data 227, similarly to the filtered data 235 of the sensor data 227 in FIGS. 11 a to 11 f, the ground contact event 293 within the step event 290 may be identified in the first derivative of the sensor data 227. - Underlying the
step event 290 is the ground contact event 293. The ground contact event 293 may include a heel strike event 294, a ground contact with the ball of the foot event 295, and a toe-off event 296. Each of the heel strike event 294, the ground contact with the ball of the foot event 295, and the toe-off event 296 is bound by inflection points in the first derivative of the sensor data 227. The inflection points bounding the heel strike event 294 include a local maximum and an inflection point in the derived sensor data 227. The inflection points bounding the ground contact with the ball of the foot event 295 include two inflection points in the derived sensor data 227. The inflection points bounding the toe-off event 296 include an inflection point and a local minimum in the derived sensor data 227. - The
idealized sensor data 227 show a well-defined baseline, which facilitates application of a global Fourier transform. In other applications with a drifting baseline, the global Fourier transform may be more likely to result in inaccurate filtering of the time-frequency representation and inaccurate event detection. Similarly, the adaptive localized filtering method may provide additional advantages for applications with a drifting baseline. -
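Locating the extrema and inflection points of the first derivative, which bound the heel strike, ball contact, and toe-off sub-events described above, can be sketched numerically. The finite-difference derivatives and tolerance-free sign-change tests are illustrative assumptions, not the disclosed method:

```python
import numpy as np

def derivative_landmarks(x):
    """Find landmarks in the first derivative of sensor data: local
    extrema (zero crossings of the second derivative) and inflection
    points (zero crossings of the third derivative)."""
    d1 = np.gradient(np.asarray(x, dtype=float))
    d2 = np.gradient(d1)
    d3 = np.gradient(d2)
    extrema = np.nonzero(np.diff(np.sign(d2)) != 0)[0]
    inflections = np.nonzero(np.diff(np.sign(d3)) != 0)[0]
    return extrema, inflections
```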
FIG. 13 is a schematic of an event detection system 310. The data source 320 includes the first pressure sensor 322 at the heel of an insole, and a second pressure sensor 324 at the forefoot of the insole corresponding to the ball of a foot. Sensor data 327 collected by the data source 320 may then be analyzed by the processing module 330 on the instrumented insole. The event data 337 is then communicated to the event data module 340 inside a smart watch. The event data module 340 may transmit the event data 337 to the communication module 342, store the event data 337 in a storage module 344, or both. -
FIG. 14 depicts idealized sensor data 327 from the event detection system 310. The sensor data 327 includes the first sensor data 326 from the first sensor 322, and the second sensor data 328 from the second sensor 324. Based on the first sensor data 326 and the second sensor data 328, after the bounds of steps are defined by the filtered data, a prompt 346 is provided as voice coaching from the communication module 342 of the event data module 340. The prompt 346 offers suggestions to the user to improve their activity efficiency, mitigate injury, or address other concerns. In this example, the prompt 346 is a suggestion to move the foot strike zone away from the heel and towards the forefoot in order to minimize joint injury. Two prompts 346 are required before the user changes their behaviour and the heel pressure shown in the second sensor data 328 is decreased. Subsequently, a confirmation 348 is communicated to the user as a vibration from the watch, approving of the behavioural change. The table below shows the amplitudes of the heel strikes before and after the prompts 346. -
Time | Amplitude
---|---
27.80 | 93.20
28.52 | 92.45
29.35 | 73.54
30.13 | 84.09
30.88 | 71.35
31.65 | 97.75
32.39 | 43.74
33.14 | 47.90
-
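As a rough illustration of how the event data module might quantify the behavioural change, the amplitudes in the table can be averaged before and after the prompts. Splitting after the sixth reading is an assumption based on the described timing:

```python
# Heel strike (time, amplitude) readings from the table above.
readings = [(27.80, 93.20), (28.52, 92.45), (29.35, 73.54), (30.13, 84.09),
            (30.88, 71.35), (31.65, 97.75), (32.39, 43.74), (33.14, 47.90)]

def mean_amplitude(rows):
    return sum(amplitude for _, amplitude in rows) / len(rows)

# Assumed split: the behaviour changes after the sixth heel strike.
before, after = readings[:6], readings[6:]
reduction = 1.0 - mean_amplitude(after) / mean_amplitude(before)  # ~46% lower
```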
FIG. 15 shows a schematic of an event detection system 410. The data source 420 includes an accelerometer 422 on a footwear insert for a subject individual's foot. The accelerometer 422 could similarly be attached to another part of the shoe, such as on the outside of the tongue, or to another part of the body such as the ankle, shin, thigh, hip, chest, back or shoulder. Acceleration data 427 collected by the accelerometer 422 may then be transmitted by an intermediary storage and transmission device 429, and analyzed by a processing module 430 inside a laptop computer, smart phone, smart watch, server computer or other processing device. The processing device includes an event data module 440. The event data module 440 may communicate the event data 437 to the communication module 442, store the event data 437 in a storage module 444, or both. -
FIG. 16 is an example plot of sensor data 427 from the accelerometer 422 of FIG. 15. The sensor data 427 represents a user jumping and landing back on the ground. Events 490 may be distinguished within the sensor data 427, corresponding to the takeoff and landing of the user, which elicit spikes in acceleration. The events 490 illustrate non-repetitive events. -
FIG. 17 is a plot of filtered data 435 (solid lines) based on the sensor data 427 of FIG. 16 (dashed lines). Two jumping events may be identified in the filtered data 435 through the adaptive localized filter process (e.g. the localized filtering method 60 or the localized filtering method 160) by identifying inflection points within the filtered data. In this case, the events are respectively the accelerations associated with the takeoff and landing of a jumping motion. -
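Distinguishing non-repetitive events such as the takeoff and landing spikes from background motion can be sketched with a simple threshold detector. The threshold and refractory interval are hypothetical values, and this stands in for, rather than reproduces, the adaptive localized filter process:

```python
import numpy as np

def detect_spikes(accel, thresh=2.0, refractory=20):
    """Flag samples whose magnitude exceeds a threshold, with a
    refractory interval so one spike is not counted twice."""
    events, last = [], -refractory
    for i, a in enumerate(np.abs(np.asarray(accel))):
        if a >= thresh and i - last >= refractory:
            events.append(i)
            last = i
    return events
```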
FIG. 18 is a schematic of an event detection system 510. In this embodiment, the system is a hearing aid, with built-in capability to take in environmental noise and select an appropriate filter in order to provide a suitable signal to the user's ear. The data source 520 includes a microphone 522 that records noises 512 from the user's environment as sound signals 527. The sound signal 527 is then transmitted by an intermediary storage and transmission device 529 to a processing module 530 within the hearing aid. The processing module 530 produces event data which is passed to the event data module 540, where it is communicated to the communication module 542. The communication module may then pass along a modified noise signal to the user, filtered according to the event data. -
FIG. 19 is an example plot of sensor data 527 based on the hearing aid event detection system 510 of FIG. 18. The sensor data 527 represents environmental noise (e.g. talking, music, etc.). Events 590 may be distinguished within the sensor data 527 and passed on to an event data module 540 as event data. -
FIG. 20 is a schematic of an event detection system 610. An accelerometer 622 sits at the base of the palm of an instrumented glove 621. Sensor data 627 collected by the accelerometer 622 is then transmitted by an intermediary storage and transmission device 629, and analyzed by a processing module 630, in this case inside a smart watch. The processed event data 637 is communicated to the event data module 640, where it may be passed to a communication module 642 or stored by a storage module 644. The communication module may communicate an event, such as wrist flexion large enough to cause high internal pressures in the carpal tunnel and resulting nerve damage, to a user through audio, vibratory, or visual cues. The glove 621 includes variable stiffness material 649 in a wrist portion 623 of the glove 621 for correcting carpal tunnel inducing behavior, and the stiffness may be adjusted as a parameter of the system 610, such as at defining parameters 151 of the method 150. Using the accelerometer 622 as a level in this way may also be referred to as "inclination sensing". - In the preceding description, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the embodiments. However, it will be apparent to one skilled in the art that these specific details are not required. In other instances, well-known electrical structures and circuits are shown in block diagram form in order not to obscure the understanding. For example, specific details are not provided as to whether the embodiments described herein are implemented as a software routine, hardware circuit, firmware, or a combination thereof.
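The "inclination sensing" use of the accelerometer 622 is conventionally computed from a static reading, with gravity as the reference. A standard tilt formula, offered as one plausible reading of the accelerometer-as-a-level idea (the axis naming is an assumption):

```python
import math

def inclination_deg(ax, ay, az):
    """Tilt of the sensor's z-axis away from gravity, in degrees, from a
    static 3-axis accelerometer reading (units cancel)."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))
```

With the glove at rest, inclination_deg(0, 0, 1) reads 0 degrees (flat) and inclination_deg(1, 0, 0) reads 90 degrees, so a threshold on this angle could trigger the audio, vibratory, or visual cue for excessive wrist flexion.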
- Embodiments of the disclosure can be represented as a computer program product stored in a machine-readable medium (also referred to as a computer-readable medium, a processor-readable medium, or a computer usable medium having a computer-readable program code embodied therein). The machine-readable medium can be any suitable tangible, non-transitory medium, including magnetic, optical, or electrical storage medium including a diskette, compact disk read only memory (CD-ROM), memory device (volatile or non-volatile), or similar storage mechanism. The machine-readable medium can contain various sets of instructions, code sequences, configuration information, or other data, which, when executed, cause a processor to perform steps in a method according to an embodiment of the disclosure. Those of ordinary skill in the art will appreciate that other instructions and operations necessary to implement the described implementations can also be stored on the machine-readable medium. The instructions stored on the machine-readable medium can be executed by a processor or other suitable processing device, and can interface with circuitry to perform the described tasks.
- The above-described embodiments are intended to be examples only. Alterations, modifications and variations can be effected to the particular embodiments by those of skill in the art. The scope of the claims should not be limited by the particular embodiments set forth herein, but should be construed in a manner consistent with the specification as a whole.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/046,968 US20230085511A1 (en) | 2017-06-28 | 2022-10-17 | Method and system for heterogeneous event detection |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762526080P | 2017-06-28 | 2017-06-28 | |
US201762574013P | 2017-10-18 | 2017-10-18 | |
PCT/CA2018/050802 WO2019000100A1 (en) | 2017-06-28 | 2018-06-28 | Method and system for heterogeneous event detection |
US201916623475A | 2019-12-17 | 2019-12-17 | |
US18/046,968 US20230085511A1 (en) | 2017-06-28 | 2022-10-17 | Method and system for heterogeneous event detection |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA2018/050802 Continuation WO2019000100A1 (en) | 2017-06-28 | 2018-06-28 | Method and system for heterogeneous event detection |
US16/623,475 Continuation US11504030B2 (en) | 2017-06-28 | 2018-06-28 | Method and system for heterogeneous event detection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230085511A1 true US20230085511A1 (en) | 2023-03-16 |
Family
ID=64740796
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/623,475 Active 2039-06-08 US11504030B2 (en) | 2017-06-28 | 2018-06-28 | Method and system for heterogeneous event detection |
US18/046,968 Pending US20230085511A1 (en) | 2017-06-28 | 2022-10-17 | Method and system for heterogeneous event detection |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/623,475 Active 2039-06-08 US11504030B2 (en) | 2017-06-28 | 2018-06-28 | Method and system for heterogeneous event detection |
Country Status (3)
Country | Link |
---|---|
US (2) | US11504030B2 (en) |
CA (1) | CA3067460A1 (en) |
WO (1) | WO2019000100A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11350877B2 (en) * | 2018-09-24 | 2022-06-07 | Arizona Board Of Regents On Behalf Of Arizona State University | Smart shoes with adaptive sampling for rehabilitation and health monitoring |
US11693423B2 (en) * | 2018-12-19 | 2023-07-04 | Waymo Llc | Model for excluding vehicle from sensor field of view |
LU101071B1 (en) * | 2018-12-21 | 2020-06-24 | Luxembourg Inst Science & Tech List | Gait analysis data treatment |
US11168984B2 (en) * | 2019-02-08 | 2021-11-09 | The Boeing Company | Celestial navigation system and method |
WO2020186353A1 (en) * | 2019-03-18 | 2020-09-24 | Healthtech Connex Inc. | System and method for automatic evoked potential measurement |
US11304650B1 (en) * | 2019-03-20 | 2022-04-19 | University Of South Florida | Systems and methods for heel-to-shin testing |
JP7177994B2 (en) * | 2019-05-29 | 2022-11-29 | 日本電気株式会社 | Information processing device, walking environment determination device, walking environment determination system, information processing method and program |
FR3100431B1 (en) * | 2019-09-09 | 2021-08-20 | Livestep | Connected sole and associated method of fall detection and gait analysis |
CN116548928B (en) * | 2023-07-11 | 2023-09-08 | 西安浩阳志德医疗科技有限公司 | Nursing service system based on internet |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9418705B2 (en) * | 2010-08-26 | 2016-08-16 | Blast Motion Inc. | Sensor and media event detection system |
US20130085700A1 (en) * | 2011-09-30 | 2013-04-04 | Apple Inc. | Techniques for improved pedometer readings |
US9459118B2 (en) * | 2013-06-07 | 2016-10-04 | Apple Inc. | Adjusting step count to compensate for arm swing |
US20160029968A1 (en) * | 2014-08-04 | 2016-02-04 | Analog Devices, Inc. | Tracking slow varying frequency in a noisy environment and applications in healthcare |
-
2018
- 2018-06-28 CA CA3067460A patent/CA3067460A1/en active Pending
- 2018-06-28 US US16/623,475 patent/US11504030B2/en active Active
- 2018-06-28 WO PCT/CA2018/050802 patent/WO2019000100A1/en active Application Filing
-
2022
- 2022-10-17 US US18/046,968 patent/US20230085511A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US11504030B2 (en) | 2022-11-22 |
US20200178849A1 (en) | 2020-06-11 |
WO2019000100A1 (en) | 2019-01-03 |
CA3067460A1 (en) | 2019-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230085511A1 (en) | Method and system for heterogeneous event detection | |
US10413250B2 (en) | Method and apparatus for generating assessments using physical activity and biometric parameters | |
EP3337401B1 (en) | Method and system for adjusting audio signals based on motion deviation | |
CN104969035B (en) | Step detection method and system based on inertia harmonic wave | |
US11589781B2 (en) | Assessing diseases by analyzing gait measurements | |
US20150260514A1 (en) | Method to determine physical properties of the ground | |
Hemmatpour et al. | Nonlinear Predictive Threshold Model for Real‐Time Abnormal Gait Detection | |
Kong et al. | Comparison of gait event detection from shanks and feet in single-task and multi-task walking of healthy older adults | |
Walker et al. | A continuous patient activity monitor: validation and relation to disability | |
Dehzangi et al. | Activity detection using fusion of multi-pressure sensors in insoles | |
Kuusik et al. | Comparative study of four instrumented mobility analysis tests on neurological disease patients | |
Qin et al. | A smart phone based gait monitor system | |
EP3603505A1 (en) | Information processing system, information processing device, and information processing method | |
Garudadri et al. | Improved gait speed calculation via modulation spectral analysis of noisy accelerometer data | |
US20240237922A1 (en) | Estimation device, estimation system, estimation method, and recording medium | |
Rao et al. | Analysis of joints for tracking fitness and monitoring progress in physiotherapy | |
Liu | The development of a body-worn sensor-based system for fall risk assessment | |
Perumal | Gait and Tremor Monitoring System for Patients with Parkinson’s Disease Using Wearable Sensors | |
Alam | Parkinson's Symptoms quantification using wearable sensors | |
Krauss et al. | ActiSmile, a portable biofeedback device on physical activity | |
Perez Leon | A Smartphone-based System for Clinical Gait Assessment | |
Leon | A smartphone-based system for clinical gait assessment | |
Schinckus et al. | Spatiotemporal analysis of cane-assisted gait in post-stroke patients using inertial measurement units |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ORPYX MEDICAL TECHNOLOGIES INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHENG, CHUN HING;PURDY, MICHAEL TODD;STEVENS, TRAVIS MICHAEL;REEL/FRAME:061437/0781 Effective date: 20181114 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: PERCEPTIVE CREDIT HOLDINGS IV, LP, NEW YORK Free format text: SECURITY AGREEMENT;ASSIGNOR:ORPYX MEDICAL TECHNOLOGIES INC.;REEL/FRAME:068420/0048 Effective date: 20240716 |