EP3431002B1 - Radio frequency based monitoring of a user's activity - Google Patents

Radio frequency based monitoring of a user's activity

Info

Publication number
EP3431002B1
Authority
EP
European Patent Office
Prior art keywords
data
sensor data
user
pseudo
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17182239.8A
Other languages
German (de)
English (en)
Other versions
EP3431002A1 (fr)
Inventor
Fatih SUNOR
Brook EATON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy
Priority to EP17182239.8A: EP3431002B1 (fr)
Priority to US16/631,802: US20200170538A1 (en)
Priority to PCT/IB2018/055202: WO2019016659A1 (fr)
Publication of EP3431002A1 (fr)
Application granted
Publication of EP3431002B1 (fr)
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/0507 ...using microwaves or terahertz waves
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A61B5/1126 ...using a particular sensing technique
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 ...involving training the classification device
    • A61B5/7271 Specific aspects of physiological measurement analysis
    • A61B5/7278 Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70 ...for mining of medical data, e.g. analysing previous cases of other patients

Definitions

  • the subject matter described herein relates to health and fitness monitors.
  • a health and fitness activity tracker, such as a body-worn device or wearable, may be used to determine how many steps the wearer has taken over a given period of time, how many stairs the wearer has climbed over a given period of time, how high the user has jumped over a given period of time, how active the wearer is over a given period of time, how many calories the user is expending, and/or the like.
  • US 2017/0097413 describes apparatuses and techniques for radar-enabled sensor fusion.
  • a radar field is provided and reflection signals that correspond to a target in the radar field are received.
  • the reflection signals are transformed to provide radar data, from which a radar feature indicating a physical characteristic of the target is extracted.
  • a sensor is activated to provide supplemental sensor data associated with the physical characteristic.
  • the radar feature is then augmented with the supplemental sensor data to enhance the radar feature, such as by increasing an accuracy or resolution of the radar feature.
  • the radio frequency based data may be generated from radio frequency signals reflected from at least the user.
  • the sensor may be coupled to, and/or included in, a user equipment associated with the user, wherein the sensor and/or the user equipment is worn by the user or not worn by the user.
  • the pseudo sensor data and the native sensor data may provide a stream of data to the application to enable tracking the activity of the user.
  • the sensor may not be able to generate the native sensor data when the sensor is not worn by the user and/or the user equipment is being charged.
  • the pseudo sensor data may be derived in response to an indication that the user equipment and/or the sensor is not providing native sensor data for the user.
  • the application may include a health and fitness tracker application.
  • the user equipment may include a health and fitness tracker.
  • the machine learning model may include a neural network, a linear regression model, a regression neural network, and/or a regression learning technique.
  • the machine learning model may be configured by machine learning based on at least reference radio frequency based data and reference sensor data collected from at least one reference user performing activities comprising walking, running, jumping, gesturing, standing, and/or sitting.
  • the native sensor data may include accelerometer data, gyroscope data, and/or barometer data.
  • the pseudo sensor data may include pseudo accelerometer data, pseudo gyroscope data, and/or pseudo barometer data.
  • the radio frequency based data may be generated from radar signals reflected from at least the user.
  • the health and fitness tracker may, from time to time, not be useable for activity tracking.
  • the health and fitness tracker may not be providing data indicative of a user's activity due to a dead battery, communication link loss, removal from the user's body of the health and fitness tracker, and/or for other reasons.
  • the user's activity may not be tracked, which may be problematic for some users.
  • a radio frequency based tracker to track a user while, for example, the health and fitness tracker is not providing data indicative of a user's activity.
  • the health and fitness tracker may include, or be coupled to, at least one sensor that monitors a user's activity, such as walking, running, and/or other types of activity. But when the sensor is no longer able to provide data representative of the user's activity due to, for example, the health and fitness tracker and/or the sensor being removed from the user's body and/or for other reasons, the radio frequency (RF) based tracker may be used to track the user's activity, in accordance with some example embodiments.
  • the RF based tracker may transmit one or more radio frequency signals, and may receive the reflected returns resulting from the transmissions to enable tracking of the user's activity.
  • the RF based tracker may include a radar to track the user's activity.
  • the radar may be implemented as an ultra-wideband (UWB) radar, although other types of radios and/or radars may be used as well.
  • the RF based tracker may track the user's activity using RF signals, such as UWB signals or other types of signals, received as returns from the user being tracked.
  • the RF based tracker including the UWB radar may transmit UWB signals and then may receive UWB signals as they reflect back from the user.
  • These UWB signals may carry or provide information that can be used to determine the location or distance of the user (including portions, such as body parts) with high-resolution over time.
  • the received RF signals may be detected and/or digitized to form RF based data, and this RF based data may be binned based on time of arrival to determine the location and/or position (and thus activity) of the user over time.
  • the received RF signals (or RF based data representative of the RF signals) essentially paint a picture of the user's activity over time.
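The time-of-arrival binning described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the sample format, bin width, and function names are assumptions.

```python
# Sketch of time-of-arrival binning for digitized radar returns.
# Each return is a (time_of_arrival_s, amplitude) pair; the two-way
# travel time maps to a range bin of width BIN_WIDTH_M.
C = 299_792_458.0          # speed of light, m/s
BIN_WIDTH_M = 0.1          # 10 cm range resolution (illustrative)

def bin_returns(returns, num_bins=100):
    """Accumulate return amplitudes into range bins by time of arrival."""
    bins = [0.0] * num_bins
    for toa_s, amplitude in returns:
        distance_m = C * toa_s / 2.0          # halve the two-way travel time
        idx = int(distance_m / BIN_WIDTH_M)
        if 0 <= idx < num_bins:
            bins[idx] += amplitude
    return bins

# A return arriving after ~20 ns corresponds to a target ~3 m away.
frame = bin_returns([(20e-9, 1.0)])
```

A sequence of such frames over time forms the RF based data from which the user's position, and thus activity, can be inferred.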
  • this RF based data may not be readily processed by health and fitness tracker applications having application programming interfaces (APIs) accustomed to operating with, for example, sensor data, such as x-y-z accelerometer data or other types of sensor data, indicative of the user's activity.
  • the RF based data indicative of the user's activity over time may be transformed into pseudo sensor data, such as pseudo accelerometer data or other types of pseudo sensor data.
  • the pseudo sensor data may be considered pseudo in the sense that the pseudo sensor data is derived from the RF based data.
  • sensors such as accelerometers and/or the like, may natively generate sensor data that can be provided to an application, such as a health and fitness tracker application, to enable tracking the user's activity.
  • the pseudo sensor data is derived from RF based data provided by the RF based tracker disclosed herein.
  • the form of the pseudo sensor data is similar to the native sensor data, such that the pseudo sensor data can be readily processed by, for example, the health and fitness tracker application having an API accustomed to operating with native sensor data.
  • the RF based tracker includes a machine learning (ML) model that is trained, or configured, to transform the RF based data indicative of the user's activity into pseudo sensor data, such as pseudo accelerometer data and/or other types of sensor data, which may be representative of the user's activity.
  • FIG. 1 depicts an example of a system 100, in accordance with some example embodiments.
  • a health and fitness tracker 104 is being worn by a user 102.
  • the health and fitness tracker 104 may generate sensor data (also referred to herein as "native sensor data").
  • This native sensor data may be generated by at least one sensor associated with, or located at, the health and fitness tracker 104.
  • a sensor may comprise an accelerometer that generates sensor data, such as accelerometer data in the x, y, and z axis.
  • This sensor data may be representative of (e.g., indicative of, or provide a measurement of, etc.) the user's activity (or lack of activity) such as running, jumping, walking, gesturing, and/or other activities performed by the user.
  • the sensor may comprise a transducer, a pressure transducer, and/or a gyroscope, the output of which may be indicative of the user's activity.
  • the sensor may comprise a barometer, which may also indicate the user's activity (for example, change in elevation due to jumping or climbing).
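As a concrete illustration of the barometer case, a pressure reading can be related to altitude via the standard international barometric formula, so a drop in the reading indicates a gain in elevation. The formula is standard physics, not part of the patent, and the function name is illustrative.

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Convert a barometric pressure reading (hPa) to altitude (m)
    using the international barometric formula; a change in the
    reading indicates a change in elevation."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** 0.1903)

# Climbing one floor (~3 m) lowers the reading by roughly 0.36 hPa.
delta = pressure_to_altitude_m(1012.89) - pressure_to_altitude_m(1013.25)
```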
  • the health and fitness tracker (which may include, or be coupled, via wired and/or wireless connection(s), to at least one sensor such as an accelerometer and/or the like) may measure, generate, collect, and/or monitor one or more parameters associated with the activity of a user, such as miles walked, stairs climbed, calories burned, heart rate, breathing rate, and/or other activities associated with the user.
  • a user equipment such as the health and fitness tracker and/or the at least one sensor, may be worn by a user.
  • the health and fitness tracker and/or the at least one sensor may be implemented as a smartwatch, a user equipment, and/or other type of device or wearable.
  • the user equipment may not be a wearable device worn by a user.
  • the health and fitness tracker and/or the at least one sensor may be implemented in a sheet, a pillow, and/or in other ways so as not to be considered a wearable.
  • the health and fitness tracker 104 may correspond to a smartwatch, for example.
  • An example of a health and fitness tracker is the Nokia Steel HR, which tracks a user's activity, although other types of health and fitness trackers may be used as well.
  • the smartwatch may include a sensor that generates the sensor data natively.
  • the native sensor data may be processed by an application or service, such as a health and fitness tracker application, at the smartwatch or at another device that is coupled (via a wireless or wired connection(s)) to the smartwatch.
  • the other device may include a smartphone, a cloud server, and/or other types of user equipment and/or processors.
  • the sensor may be a standalone sensor.
  • user 102 may wear or carry at least one sensor, such as an accelerometer, a barometer, and/or the like.
  • the sensors may provide the sensor data to another device, such as a user equipment (for example, a smartwatch, a smart phone, and/or the like) and/or a remote server (for example, a cloud server coupled to the Internet).
  • the cloud server may provide the application or service, such as the health and fitness tracker application providing activity tracking and other services.
  • FIG. 2 shows an example of sensor data, such as accelerometer data over time for the x-axis 202, y-axis 204, and z-axis 206, in accordance with some example embodiments.
  • the sensor may be coupled to, and/or included in, a user equipment such as the health and fitness tracker 104 (or a sensor) associated with the user.
  • the activity of the user may cause the health and fitness tracker 104 (or the associated sensor) to generate sensor data, from which the user's activity (e.g., steps walked, stairs climbed, miles run, calories burned, and/or other types of activity) can be determined.
  • the health and fitness tracker and/or associated sensor may natively generate sensor data, which in this example is x, y, and z accelerometer data, representative of the user's activity.
  • This native sensor data may be provided to an application, such as a health and fitness tracker application or the application's API, to determine, as noted, the user's activity such as steps walked, stairs climbed, miles run, calories burned, and/or other types of activity.
  • This application may be located at the health and fitness tracker 104 and/or at another device such as a remote device, an Internet coupled cloud server, a user equipment (for example, a smartphone), and/or the like.
  • the user 102 may, as noted, remove the health and fitness tracker 104 (and/or remove the sensor).
  • the RF based tracker 110 (which may include one or more radios such as one or more UWB radars, for example) may track the user's 102 activity.
  • the user may be in a room, and as the user walks around the room, the RF based tracker may receive RF signal returns from the user.
  • the received RF signals may be decoded and/or digitized into RF based data.
  • the pseudo sensor data may be derived using a machine learning model, in accordance with some example embodiments.
  • although the previous example illustrates native sensor data not being generated due to the removal of the health and fitness tracker 104 (and/or the associated sensor), the native sensor data may not be generated, as noted, for other reasons.
  • FIG. 3A depicts an example of RF based data 305 and 310 received at the RF based tracker 110 in response to transmitted RF signals directed toward the user 102, in accordance with some example embodiments.
  • the vertical line 390 represents a point in time during which the user 102 performs an activity.
  • the sensor data 202, 204, 206 indicates the activity at 210A-C, which corresponds in time to the RF based data activity 315A-B.
  • the RF based data is obtained from the UWB radar signals.
  • other types of RF based signals may be used as well to provide RF based data representative of the user's position and/or location as a way to assess the user's activity.
  • other types of RF signals in accordance with WiFi, Bluetooth, millimeter wave (MMW), and/or other types of RF signals may be transmitted and/or received.
  • RF based data may be generated representative of the user's activity. This RF based data may then be processed, as noted herein, to form pseudo sensor data, in accordance with some example embodiments.
  • the UWB radar may, when compared to other types of radars, have a low power spectral density and transmit pulses having very short durations (for example, less than 1 nanosecond).
  • the low power of UWB may enable safe use with human subjects, when compared to higher-powered radars.
  • this RF based data may, as noted, be transformed into pseudo sensor data, such as pseudo accelerometer data, in accordance with some example embodiments.
  • this transformation may, as noted, be performed by a machine learning (ML) model trained, or configured, to transform the RF based data returned from the user, such as user 102, into the pseudo sensor data, in accordance with some example embodiments.
  • the sensor data may take other forms as noted above, in which case the RF based tracker 110 may include a machine learning model configured to transform the RF based data to the other forms of pseudo sensor data.
  • FIG. 3B depicts an example of a system 300 including the health and fitness tracker 104 including a sensor 399, such as an accelerometer and/or other type of sensor, generating native sensor data 380A-B, 382A-B, and 384A-B while the health and fitness tracker 104 including the sensor is being worn 360A-B by user 102, in accordance with some example embodiments.
  • the RF based tracker 110 may provide at 370 pseudo sensor data 390, 392, 394, in accordance with some example embodiments.
  • the pseudo sensor data may fill in at least a portion of the gap at 370 in the native sensor data to enable a more continuous (or nearly continuous) data stream representative of the user's activity.
  • the native sensor data in the form of, for example, accelerometer data 380A-B, 382A-B, and 384A-B along with the pseudo sensor data 390, 392, and 394 may be provided to an application 390, such as a health and fitness tracker application, to enable tracking the user's activity.
  • the application 390 may, as noted above, be located at the health and fitness tracker 104 and/or located at another device such as a remote device, an Internet coupled cloud server, a user equipment (for example, a smartphone, smartwatch, or tablet), and/or the like.
  • the health and fitness tracker 104 including the sensor 399 may generate the native sensor data 380A-B, 382A-B, and 384A-B, which contains information from which the user's activity can be determined.
  • the application 390 may determine the quantity of steps walked by the user over time from native sensor data 380A, 382A, and 384A, which in this example is accelerometer data indicative of steps walked over time. Even when the native sensor data is not available, the application 390 can determine the steps walked using the pseudo sensor data 390, 392, and 394.
  • When the native sensor data resumes, the application may determine the steps walked from the remaining stream of native sensor data 380B, 382B, and 384B. In this way, the user's activity (which in this example corresponds to steps walked by the user) can be determined in a more continuous manner using native sensor data and pseudo sensor data, when compared to not using the pseudo sensor data, in accordance with some example embodiments.
  • the "stream" of native sensor data may be considered continuous in the sense that the sensor 399 provides the native sensor data from time to time, such as at intervals of 0.25 second, 0.5 second, 1 second, 2 seconds, 30 seconds, 1 minute, and/or the like.
  • the sensor 399 may generate sensor data at these intervals, and this sensor data may still be considered continuous.
  • this time may be considered a gap in the continuous stream of sensor data.
  • the gap may be defined by a predefined time where there is no sensor data, such as 2 minutes (although other times may be used as well), while in some embodiments, the gap may be defined by a message or trigger signal indicating that the sensor 399 is not able to generate sensor data.
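A gap defined by a predefined quiet time, as described above, could be detected with a simple scan over the sample timestamps. The 2-minute threshold comes from the example above; the function name and data layout are illustrative assumptions.

```python
GAP_THRESHOLD_S = 120.0  # predefined gap of 2 minutes, per the example above

def find_gaps(timestamps, threshold_s=GAP_THRESHOLD_S):
    """Return (start, end) intervals where consecutive native sensor
    samples are separated by more than threshold_s seconds."""
    gaps = []
    for earlier, later in zip(timestamps, timestamps[1:]):
        if later - earlier > threshold_s:
            gaps.append((earlier, later))
    return gaps

# Samples every second, then a 10-minute outage, then samples resume:
# find_gaps reports the single interval with no native sensor data.
ts = [0.0, 1.0, 2.0, 602.0, 603.0]
gaps = find_gaps(ts)
```

Each reported interval is a candidate span for the RF based tracker to fill with pseudo sensor data; alternatively, as the text notes, a trigger message from the sensor could mark the gap explicitly.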
  • FIG. 4 depicts another example of a system 400, in accordance with some example embodiments. Unlike system 100, system 400 shows that the user 102 has docked the health and fitness tracker 104 (or sensor coupled to, or included in the tracker 104) at the RF based tracker 110 for charging or for some other reason.
  • FIG. 4 depicts that the RF based tracker 110 may operate cooperatively with other RF based trackers 412 and 414.
  • the other RF based trackers 412 and 414 may cooperatively generate and gather RF based data for the user 102.
  • This collected RF based data may be processed by the RF based trackers 110, 412, and/or 414 to derive the pseudo sensor data, in accordance with some example embodiments.
  • Alternatively, this collected RF based data may be forwarded to one of the RF based trackers 110, 412, and/or 414 for processing into the pseudo sensor data, such as pseudo accelerometer data, in accordance with some example embodiments.
  • Moreover, the RF based data collected by one or more of the RF based trackers 110, 412, and/or 414 may be forwarded to another processor, such as a cloud server, a smart phone, and/or another processor-based device, to transform the RF based data into the pseudo sensor data, in accordance with some example embodiments.
  • the machine learning model may be trained, or configured, in a training phase.
  • the RF based tracker 110 may collect reference RF based data from a user, such as a reference wearer.
  • This reference wearer may be wearing the health and fitness tracker 104 and/or a sensor such as sensor 399, which generate reference sensor data, such as reference accelerometer data.
  • the reference wearer may perform one or more activities, such as sitting, walking, jumping, running, hand waving, and/or other types of activities. From this activity, the reference RF based data and the reference sensor data may be collected and then provided to a machine learning model.
  • the machine learning model refers to a transform, which may be configured or learned.
  • Examples of machine learning models include a neural network, a linear regression model, a regression neural network, a regression technique, and/or other types of artificial intelligence technologies configured to learn how to transform time sequences of RF based data corresponding to the reference wearer's activity to pseudo sensor data, such as pseudo x-y-z accelerometer data or other types of pseudo sensor data.
  • the learning may be performed in a supervised and/or unsupervised manner.
  • the reference sensor data and reference RF based data may be collected from a plurality of reference wearers as well.
  • FIG. 5A depicts an example of a machine learning model 599 in a training phase, in accordance with some example embodiments.
  • the reference RF based data 510 may be provided at the input to the machine learning model, and reference sensor data 530 may be provided at the output.
  • the machine learning model 599 (which in this example comprises a neural network) may include one or more layers, such as hidden layer(s) 540 configured to learn a configuration that provides the output 530 given the input 510.
  • the machine learning model 599 may iterate through the reference RF based data input 510 given the reference sensor data at the output until model 599 learns a hidden layer 540 configuration (e.g., a set of weights and/or other parameters).
  • the machine learning model 599 may be implemented as a neural network including one or more multi-layer perceptrons, such as multi-layer perceptron regressors.
  • the one or more multi-layer perceptrons may learn by optimizing the squared loss (e.g., the square of the difference between 510 and 530) using gradient descent, such as a stochastic gradient descent, limited memory Broyden-Fletcher-Goldfarb-Shanno (LMBFGS), and/or the like.
  • the multi-layer perceptron regressor may train iteratively through the reference data at the input 510 and output 530: at each time step, the partial derivatives of the noted loss function (with respect to the model parameters) are computed to update the perceptron's parameters.
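The iterative training just described can be sketched in miniature. For a runnable illustration this uses a linear regression model (one of the model types named earlier) with a single input feature, trained by stochastic gradient descent on the squared loss; a deployed system would instead use, for example, a multi-layer perceptron regressor over full RF frames. All data and names here are synthetic and illustrative.

```python
import random

def train_linear(rf_features, ref_sensor, lr=0.01, epochs=300):
    """Fit y = w*x + b by stochastic gradient descent on the squared
    loss, mirroring the iterative parameter updates described above."""
    w, b = 0.0, 0.0
    pairs = list(zip(rf_features, ref_sensor))
    for _ in range(epochs):
        random.shuffle(pairs)            # stochastic: random visiting order
        for x, y in pairs:
            err = (w * x + b) - y        # prediction error for this sample
            w -= lr * 2.0 * err * x      # partial derivative of loss w.r.t. w
            b -= lr * 2.0 * err          # partial derivative of loss w.r.t. b
    return w, b

# Synthetic reference data: pretend each RF frame reduces to a single
# feature, and the reference accelerometer reading is ~3*feature + 1.
random.seed(0)                           # reproducible shuffling
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 4.0, 7.0, 10.0, 13.0]
w, b = train_linear(xs, ys)
```

After training, the learned (w, b) play the role of the hidden layer configuration: given a new RF-derived feature, w*x + b yields a pseudo sensor value.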
  • once trained, the machine learning model can be used operationally to derive pseudo accelerometer data from RF based data.
  • FIG. 5B depicts an example of a machine learning model 500 trained, or configured, to take RF based data 590, and provide the pseudo sensor data 592, such as pseudo accelerometer data (or other types of pseudo sensor data), in accordance with some example embodiments.
  • the trained machine learning model may serve as a transform that can be used generally to derive pseudo sensor data, such as pseudo accelerometer data, from the radar data.
  • This pseudo sensor data can fill in at least a portion of any gaps, which may correspond to, for example, times that the tracker 104 and/or sensor 399 are not providing native sensor data.
  • the RF based tracker 110 including the machine learning model 500 may receive as an input the native RF based data 590, and output pseudo sensor data 592, such as pseudo accelerometer data.
  • the pseudo sensor data can fill in at least a portion of a gap (as shown at FIG. 3B at 370, for example) to enable providing a more continuous, or nearly continuous, stream of data for health and fitness tracking by the application 390, the health and fitness tracker 104, and/or the like.
  • FIG. 5C plots native sensor data 568A-C and, for comparison, pseudo sensor data 566A-C, in accordance with some example embodiments.
  • the pseudo sensor data 566A-C may be generated as noted using the trained, or configured, machine learning model 500.
  • the pseudo sensor data 566A-C may be provided to a health and fitness tracker application, such as application 390 which may count steps or monitor and/or track other activity associated with a user.
  • the health and fitness tracker application may yield, for example, a step count that is the same as, or similar to, the step count obtained using native sensor data 568A-C.
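The step-count comparison above can be illustrated with a toy detector: count one step per excursion of the acceleration magnitude above a threshold. The detector and its threshold are assumptions for illustration, not the application's actual algorithm; the point is that similar native and pseudo traces yield the same count.

```python
def count_steps(magnitudes, threshold=1.2):
    """Count steps as above-threshold excursions of the acceleration
    magnitude (illustrative detector, not the patent's method).
    One count per rising edge of an excursion."""
    steps, above = 0, False
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1          # rising edge of a new excursion
            above = True
        elif m <= threshold:
            above = False       # excursion ended; arm for the next one
    return steps

# Native and pseudo magnitude traces with the same three excursions.
native = [1.0, 1.5, 1.0, 1.6, 1.0, 1.4, 1.0]
pseudo = [1.0, 1.4, 1.0, 1.5, 1.1, 1.3, 1.0]
```

Running the detector on both traces gives identical step counts even though the sample values differ slightly, matching the behavior described for the pseudo sensor data.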
  • FIG. 6 depicts an example of a process 600 for generating a machine learning model during a training phase, in accordance with some example embodiments.
  • the description of FIG. 6 also refers to FIGs. 1, 2, and 3.
  • the RF based tracker 110 may collect RF based reference data, in accordance with some example embodiments.
  • the RF based tracker 110 may, as noted, transmit RF signals, and then receive RF signals (which may be reflected, for example, back from a reference user 102).
  • the received RF signals may be decoded and/or digitized into RF based data, as noted above. Since the RF based data in this example is obtained from a reference user, the RF based data is referred to as reference RF based data.
  • the reference RF based data may correlate in time to sensor data, such as accelerometer data, natively generated by the health and fitness tracker 104 and/or sensor 399. Since the sensor data in this example is obtained from a reference user, this sensor data is referred to as reference sensor data.
  • the RF based tracker 110 may receive, from the health and fitness tracker 104 (and/or sensor 399), the reference sensor data.
  • the reference RF based data at time t1 and the reference sensor data at time t1 correlate to the same activity, such as a jump or a step.
  • the radio frequency based tracker 110 may generate, based on the reference radar data and reference sensor data, a machine learning model, in accordance with some example embodiments.
  • the machine learning model 599 may iterate through the reference RF based data input 510 given the reference sensor data at the output 530 until the model 599 learns a hidden layer 540 configuration (e.g., a set of weights and/or other parameters).
  • the machine learning model may be provided, at 608, to the RF based tracker 110 for operational use, in accordance with some example embodiments.
  • the RF based tracker 110 may use the machine learning model (see, e.g., FIG. 5B at 500) to transform RF based data (which may be received when the wearer has taken off the activity tracker 104 or at other times native sensor data is not available or being generated) into pseudo sensor data, such as pseudo accelerometer data and/or other types of pseudo sensor data.
  • This pseudo sensor data can be used, as noted, to fill in at least a portion of a gap in the user's native sensor data and thus provide a continuous, or nearly continuous, stream of data for activity tracking even when the native sensor data is not being generated or is not available (e.g., when the user is not wearing the activity tracker/sensor or for other reasons as noted).
  • FIG. 7 depicts an example of a process 700 for the RF based tracker during an operational phase, in accordance with some example embodiments. The description of FIG. 7 also refers to FIGs. 1, 3B, and 5B.
  • the RF based tracker 110 may collect RF based data, in accordance with some example embodiments.
  • the health and fitness tracker 104 (or sensor 399 therein) may not be generating actual, native sensor data, such as accelerometer data, for a variety of reasons including the health and fitness tracker 104 and/or sensor 399 not being worn by user 102.
  • the RF based tracker 110 may collect the RF based data for user 102 (an example of which is shown at FIG. 3A ).
  • the RF based data may correspond to decoded and/or detected returns, as noted above, from an RF signal transmitted towards the user 102.
  • the RF signals sent towards or received from the user 102 may comprise a variety of types of radio frequency signals including UWB, WiFi, MMW, Bluetooth, and/or other types of signals.
  • the health and fitness tracker 104 may, in accordance with some example embodiments, trigger the RF based tracker 110 to track the user 102 and initiate transmission and reception of RF signals.
  • the health and fitness tracker 104 may trigger the RF based tracker 110 to track the user 102 in response to the health and fitness tracker 104 (and/or sensor 399) being docked (for example, for charging) at the RF based tracker 110.
  • the health and fitness tracker 104 may trigger the RF based tracker 110 to track the user in response to the health and fitness tracker 104 transmitting an indication, such as by transmitting a trigger signal or sending a message, to the RF based tracker 110.
  • the RF based tracker 110 may generate pseudo sensor data without a trigger or even when the health and fitness tracker (and/or sensor) is able to generate native sensor data.
  • the RF based tracker 110 may process the RF based data into pseudo sensor data such as pseudo accelerometer data and/or other types of pseudo sensor data, in accordance with some example embodiments.
  • the machine learning model 500 may derive (e.g., transform) the pseudo sensor data such as pseudo accelerometer data from the RF based data representative of the user's activity over time.
  • the pseudo sensor data such as the pseudo accelerometer data may be provided to an application 390 such as a health and fitness tracker application, in accordance with some example embodiments.
  • the RF-based tracker 110 (or other processor-based device) may provide the pseudo sensor data, via a wireless and/or wired interface, to the application 390 at a device, such as the health and fitness tracker 104 or other device.
  • the application 390 may have an application programming interface configured to receive the pseudo sensor data (along with any native sensor data that may have been generated directly by the health and fitness tracker's 104 sensor), and to generate activity information, such as calories burned, steps walked, stairs climbed, miles walked, heart rate, breathing rate, and/or other types of activity to a user, such as user 102.
  • FIG. 8 depicts an example of an RF based tracker 110, in accordance with some example embodiments.
  • the RF based tracker 110 may include at least one processor 820 and at least one memory 840 including program code which when executed by the at least one processor 820 causes the operations disclosed herein with respect to the RF based tracker including, for example, deriving, from radio frequency based data, pseudo sensor data representative of at least an activity of a user, the pseudo sensor data derived based on at least a machine learning model configured to transform the radio frequency based data into the pseudo sensor data; and providing the pseudo sensor data.
  • the RF based tracker 110 may include at least one antenna 807 coupled to an RF transceiver 805.
  • the RF transceiver 805 may transmit and/or receive signals such as RF signals (e.g., radar signals and/or other types of RF signals), which can be used to enable tracking user 102.
  • the RF transceiver and/or processor 820 may also control transmission and/or reception of the RF signals. Alternatively or additionally, the RF transceiver and/or processor 820 may generate (for example, decode and/or digitize) RF based data from received RF signals.
  • the RF-based tracker 110 may include a charger 890, which may be used to charge the activity tracker 104.
  • the RF based tracker 110 may send RF signals, via RF transceiver 805 and antenna 807, to enable tracking the user's 102 activity.
  • the RF-based tracker 110 may include a machine learning model 815, which can be used to derive pseudo sensor data 817 from RF based data.
  • FIG. 9 illustrates a block diagram of an apparatus 10, in accordance with some example embodiments.
  • the apparatus 10 may represent a user equipment, such as a smart phone and/or other processor based device, or may represent another type of wireless device which may serve as a standalone health and fitness tracker 104.
  • the apparatus may include at least one sensor 399, such as an accelerometer, a pressure transducer, a barometer, and/or the like.
  • a sensor may couple to apparatus 10, and together they may provide the health and fitness tracker 104.
  • a health and fitness tracker refers to a device that at least monitors, measures, determines, tracks, and/or collects one or more parameters associated with a user's activity. Examples of these parameters include health and/or fitness related parameters, such as steps walked, stairs climbed, miles/kilometers walked, heart rate, breathing rate, calories burned, sleep patterns, weight management, and/or the like.
  • the apparatus 10 may include at least one antenna 12 in communication with a transmitter 14 and a receiver 16. Alternatively, the transmit and receive antennas may be separate.
  • the apparatus 10 may also include a processor 20 configured to provide signals to and receive signals from the transmitter and receiver, respectively, and to control the functioning of the apparatus.
  • Processor 20 may be configured to control the functioning of the transmitter and receiver by effecting control signaling via electrical leads to the transmitter and receiver.
  • processor 20 may be configured to control other elements of apparatus 10 by effecting control signaling via electrical leads connecting processor 20 to the other elements, such as a display or a memory.
  • the processor 20 may, for example, be embodied in a variety of ways including circuitry, at least one processing core, one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits (for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), and/or the like), or some combination thereof. Accordingly, although illustrated in FIG. 9 as a single processor, in some example embodiments the processor 20 may comprise a plurality of processors or processing cores.
  • the apparatus 10 may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like.
  • Signals sent and received by the processor 20 may include signaling information in accordance with an air interface standard of an applicable cellular system, and/or any number of different wireline or wireless networking techniques, comprising but not limited to Wi-Fi, wireless local access network (WLAN) techniques, such as Institute of Electrical and Electronics Engineers (IEEE) 802.11, 802.16, 802.3, ADSL, DOCSIS, and/or the like.
  • these signals may include speech data, user generated data, user requested data, and/or the like.
  • the apparatus 10 and/or a cellular modem therein may be capable of operating in accordance with various first generation (1G) communication protocols, second generation (2G or 2.5G) communication protocols, third-generation (3G) communication protocols, fourth-generation (4G) communication protocols, fifth-generation (5G) communication protocols, and/or Internet Protocol Multimedia Subsystem (IMS) communication protocols (for example, session initiation protocol (SIP)), and/or the like.
  • the apparatus 10 may be capable of operating in accordance with 2G wireless communication protocols, such as IS-136 (Time Division Multiple Access, TDMA), GSM (Global System for Mobile communications), IS-95 (Code Division Multiple Access, CDMA), and/or the like.
  • the apparatus 10 may be capable of operating in accordance with 2.5G wireless communication protocols General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), and/or the like. Further, for example, the apparatus 10 may be capable of operating in accordance with 3G wireless communication protocols, such as Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), and/or the like. The apparatus 10 may be additionally capable of operating in accordance with 3.9G wireless communication protocols, such as Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), and/or the like. Additionally, for example, the apparatus 10 may be capable of operating in accordance with 4G wireless communication protocols, such as LTE Advanced, 5G, and/or the like as well as similar wireless communication protocols that may be subsequently developed.
  • the processor 20 may include circuitry for implementing audio/video and logic functions of apparatus 10.
  • the processor 20 may comprise a digital signal processor device, a microprocessor device, an analog-to-digital converter, a digital-to-analog converter, and/or the like. Control and signal processing functions of the apparatus 10 may be allocated between these devices according to their respective capabilities.
  • the processor 20 may additionally comprise an internal voice coder (VC) 20a, an internal data modem (DM) 20b, and/or the like.
  • the processor 20 may include functionality to operate one or more software programs, which may be stored in memory.
  • processor 20 and stored software instructions may be configured to cause apparatus 10 to perform actions.
  • processor 20 may be capable of operating a connectivity program, such as a web browser.
  • the connectivity program may allow the apparatus 10 to transmit and receive web content, such as location-based content, according to a protocol, such as wireless application protocol, WAP, hypertext transfer protocol, HTTP, and/or the like.
  • Apparatus 10 may also comprise a user interface including, for example, an earphone or speaker 24, a ringer 22, a microphone 26, a display 28, a user input interface, and/or the like, which may be operationally coupled to the processor 20.
  • the display 28 may, as noted above, include a touch sensitive display, where a user may touch and/or gesture to make selections, enter values, and/or the like.
  • the processor 20 may also include user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as the speaker 24, the ringer 22, the microphone 26, the display 28, and/or the like.
  • the processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more elements of the user interface through computer program instructions, for example, software and/or firmware, stored on a memory accessible to the processor 20, for example, volatile memory 40, non-volatile memory 42, and/or the like.
  • the apparatus 10 may include a battery for powering various circuits related to the mobile terminal, for example, a circuit to provide mechanical vibration as a detectable output.
  • the user input interface may comprise devices allowing the apparatus 10 to receive data, such as a keypad 30 (which can be a virtual keyboard presented on display 28 or an externally coupled keyboard) and/or other input devices.
  • apparatus 10 may also include one or more mechanisms for sharing and/or obtaining data.
  • the apparatus 10 may include a short-range radio frequency (RF) transceiver and/or interrogator 64, so data may be shared with and/or obtained from electronic devices in accordance with RF techniques.
  • the apparatus 10 may include other short-range transceivers, such as an infrared (IR) transceiver 66, a Bluetooth™ (BT) transceiver 68 operating using Bluetooth™ wireless technology, a wireless universal serial bus (USB) transceiver 70, a Bluetooth™ Low Energy transceiver, a ZigBee transceiver, an ANT transceiver, a cellular device-to-device transceiver, a wireless local area link transceiver, and/or any other short-range radio technology.
  • Apparatus 10 and, in particular, the short-range transceiver may be capable of transmitting data to and/or receiving data from electronic devices within the proximity of the apparatus, such as within 10 meters, for example.
  • the apparatus 10 including the Wi-Fi or wireless local area networking modem may also be capable of transmitting and/or receiving data from electronic devices according to various wireless networking techniques, including 6LoWpan, Wi-Fi, Wi-Fi low power, WLAN techniques such as IEEE 802.11 techniques, IEEE 802.15 techniques, IEEE 802.16 techniques, and/or the like.
  • the apparatus 10 may comprise memory, such as a subscriber identity module (SIM) 38, a removable user identity module (R-UIM), an eUICC, a UICC, and/or the like, which may store information elements related to a mobile subscriber.
  • the apparatus 10 may include volatile memory 40 and/or non-volatile memory 42.
  • volatile memory 40 may include Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like.
  • Non-volatile memory 42, which may be embedded and/or removable, may include, for example, read-only memory, flash memory, magnetic storage devices, for example, hard disks, floppy disk drives, magnetic tape, optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like. Like volatile memory 40, non-volatile memory 42 may include a cache area for temporary storage of data. At least part of the volatile and/or non-volatile memory may be embedded in processor 20.
  • the memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the apparatus for performing operations disclosed herein including, for example, deriving, from radio frequency based data, pseudo sensor data representative of at least an activity of a user, the pseudo sensor data derived based on at least a machine learning model configured to transform the radio frequency based data into the pseudo sensor data; and providing the pseudo sensor data.
  • the memories may comprise an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying apparatus 10.
  • the processor 20 may be configured using computer code stored at memory 40 and/or 42 to control and/or provide one or more aspects disclosed herein (see, for example, process 600, 700, and/or other operations disclosed herein).
  • the processor 20 may be configured using computer code stored at memory 40 and/or 42 to at least derive, from radio frequency based data, pseudo sensor data representative of at least an activity of a user, the pseudo sensor data derived based on at least a machine learning model configured to transform the radio frequency based data into the pseudo sensor data, and to provide the pseudo sensor data.
  • the processor 20 may be configured using computer code stored at memory 40 and/or 42 to at least collect reference RF based data and reference sensor data from a reference wearer and then generate a machine learning model configured to derive pseudo accelerometer data from RF based data.
  • a "computer-readable medium" may be any non-transitory media that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer or data processor circuitry, with examples depicted at FIG. 9.
  • the computer-readable medium may comprise a non-transitory computer-readable storage medium that may be any media that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
  • the user's activity may include body functions, such as heart rate, breathing rate, sleep patterns, weight management, and/or other functions.
  • an RF signal may be transmitted towards a user, and the reflected RF signals that are returned may be processed (e.g., binned, etc.) to determine breathing rate (which can be determined as a change in distance caused by the expanding chest cavity).
  • the RF signals may include information indicating heart rate, for example.
  • the activity in terms of breathing rate and/or heart rate may also be determined.
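As a toy numeric illustration of the breathing-rate idea above (chest motion observed as a small periodic change in the measured distance), the dominant frequency of a simulated range signal can be read off its spectrum. The sampling rate, oscillation amplitude, and observation window are invented for the example:

```python
import numpy as np

fs = 20.0                          # assumed range-sample rate (Hz)
t = np.arange(0, 32, 1 / fs)       # 32 s observation window
# Simulated distance to the chest: 2 m plus a 5 mm oscillation at 0.25 Hz,
# i.e. 15 breaths per minute.
distance = 2.0 + 0.005 * np.sin(2 * np.pi * 0.25 * t)

# The spectral peak of the de-meaned range signal gives the breathing rate.
spectrum = np.abs(np.fft.rfft(distance - distance.mean()))
freqs = np.fft.rfftfreq(distance.size, d=1 / fs)
breaths_per_minute = 60.0 * freqs[spectrum.argmax()]
print(round(breaths_per_minute))   # → 15
```

A heart-rate estimate could be obtained analogously from a higher-frequency component of the returns, though real RF data would require clutter removal and range binning first.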
  • the pseudo sensor data may be generated alone, without any accompanying native sensor data.
  • the base stations and user equipment (or one or more components therein) and/or the processes described herein can be implemented using one or more of the following: a processor executing program code, an application-specific integrated circuit (ASIC), a digital signal processor (DSP), an embedded processor, a field programmable gate array (FPGA), and/or combinations thereof.
  • These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • These computer programs (also known as programs, software, software applications, applications, components, program code, or code) may include machine instructions for a programmable processor.
  • computer-readable medium refers to any computer program product, machine-readable medium, computer-readable storage medium, apparatus and/or device (for example, magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions.
  • systems are also described herein that may include a processor and a memory coupled to the processor.
  • the memory may include one or more programs that cause the processor to perform one or more of the operations described herein.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Artificial Intelligence (AREA)
  • Physiology (AREA)
  • Signal Processing (AREA)
  • Dentistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Claims (13)

  1. A computer-implemented method, comprising:
    deriving (710), from radio frequency based data, pseudo sensor data (370, 592) representative of at least an activity of a user, the pseudo sensor data being derived based on at least a machine learning model (500) configured to transform the radio frequency based data (590) into the pseudo sensor data, wherein the pseudo sensor data represents data derived from the radio frequency based data rather than data obtained directly from a sensor that generates native sensor data indicative of the activity of the user; and
    providing the pseudo sensor data and native sensor data (360A, 360B) to an application to enable the application to process the pseudo sensor data and the native sensor data in order to enable tracking of the activity of the user, wherein the native sensor data is obtained directly from a sensor (104, 399) that generates native sensor data indicative of the activity of the user,
    wherein the pseudo sensor data is derived in response to at least one gap in the native sensor data, and wherein the at least one gap is caused at least in part by the sensor being unable to generate the native sensor data representative of the activity of the user.
  2. The computer-implemented method according to claim 1, wherein the radio frequency based data is generated from radio frequency signals reflected off at least the user.
  3. The computer-implemented method according to claim 1 or claim 2, wherein the sensor is coupled to a user equipment and/or included in a user equipment associated with the user, wherein the sensor and/or the user equipment is or are worn, or not worn, by the user.
  4. The computer-implemented method according to any one of the preceding claims, wherein the pseudo sensor data (370, 592) and the native sensor data (360A, 360B) provide a stream of data to the application to enable tracking of the activity of the user.
  5. The computer-implemented method according to any one of the preceding claims, wherein the sensor is unable to generate the native sensor data when the sensor is not being worn by the user and/or the user equipment is being charged.
  6. The computer-implemented method according to any one of the preceding claims, wherein the pseudo sensor data is derived in response to an indication that the user equipment and/or the sensor is not, or are not, providing native sensor data for the user.
  7. The computer-implemented method according to any one of the preceding claims, wherein the application comprises a health and fitness tracking application, and wherein the user equipment comprises a health and fitness tracker.
  8. The computer-implemented method according to any one of the preceding claims, wherein the machine learning model (500) comprises a neural network, a linear regression model, a regression neural network, and/or a regression learning technique, and wherein the machine learning model is configured by machine learning based on at least reference radio frequency based data and reference sensor data collected from at least one reference user performing activities comprising walking, running, jumping, gesturing, standing, and/or sitting.
  9. The computer-implemented method according to any one of the preceding claims, wherein the native sensor data (360A, 360B) comprises accelerometer data, gyroscope data, and/or barometer data, wherein the pseudo sensor data comprises pseudo accelerometer data, pseudo gyroscope data, and/or pseudo barometer data, and wherein the radio frequency based data is generated from radar signals reflected off at least the user.
  10. A non-transitory computer-readable storage medium comprising program code which, when executed, causes a method according to any one of claims 1 to 9 to be performed.
  11. An apparatus comprising:
    means for deriving, from radio frequency based data, pseudo sensor data representative of at least an activity of a user, the pseudo sensor data being derived based on at least a machine learning model configured to transform the radio frequency based data into the pseudo sensor data, wherein the pseudo sensor data represents data derived from the radio frequency based data rather than data obtained directly from a sensor that generates native sensor data indicative of the activity of the user; and
    means for providing the pseudo sensor data and native sensor data to an application to enable the application to process the pseudo sensor data and the native sensor data in order to enable tracking of the activity of the user, wherein the native sensor data is obtained directly from a sensor that generates native sensor data indicative of the activity of the user,
    wherein the pseudo sensor data is derived in response to at least one gap in the native sensor data, and wherein the at least one gap is caused at least in part by the sensor being unable to generate the native sensor data representative of the activity of the user.
  12. The apparatus according to claim 11, further comprising means for performing a method according to any one of claims 1 to 9.
  13. The apparatus according to claim 11, the apparatus comprising at least one processor and at least one memory containing program code which, when executed, causes a method according to any one of claims 1 to 9 to be performed.
EP17182239.8A 2017-07-20 2017-07-20 RF based monitoring of user activity Active EP3431002B1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP17182239.8A EP3431002B1 (fr) 2017-07-20 2017-07-20 RF based monitoring of user activity
US16/631,802 US20200170538A1 (en) 2017-07-20 2018-07-13 RF Based Monitoring Of User Activity
PCT/IB2018/055202 WO2019016659A1 (fr) 2017-07-20 2018-07-13 RF based monitoring of user activity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP17182239.8A EP3431002B1 (fr) 2017-07-20 2017-07-20 RF based monitoring of user activity

Publications (2)

Publication Number Publication Date
EP3431002A1 EP3431002A1 (fr) 2019-01-23
EP3431002B1 true EP3431002B1 (fr) 2021-11-03

Family

ID=59409169

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17182239.8A Active EP3431002B1 (fr) 2017-07-20 2017-07-20 RF based monitoring of user activity

Country Status (3)

Country Link
US (1) US20200170538A1 (fr)
EP (1) EP3431002B1 (fr)
WO (1) WO2019016659A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220240855A1 (en) * 2019-08-09 2022-08-04 Prevayl Innovations Limited Wearable device and method
CN111583700A (zh) * 2020-06-01 2020-08-25 珠海格力电器股份有限公司 Positioning method, apparatus, device, system, storage medium and garage navigation method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6238338B1 (en) * 1999-07-19 2001-05-29 Altec, Inc. Biosignal monitoring system and method
WO2014210210A1 (fr) * 2013-06-25 2014-12-31 Lark Technologies, Inc. Procédé pour classer un mouvement d'utilisateur
US10817065B1 (en) * 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
US20170156594A1 (en) * 2015-12-07 2017-06-08 Bodymedia, Inc. Systems, methods, and devices to determine and predict physilogical states of individuals and to administer therapy, reports, notifications, and the like therefor
US10433768B2 (en) * 2015-12-21 2019-10-08 Amer Sports Digital Services Oy Activity intensity level determination
US11364013B2 (en) * 2017-01-05 2022-06-21 Koninklijke Philips N.V. Ultrasound imaging system with a neural network for image formation and tissue characterization
US9989622B1 (en) * 2017-03-16 2018-06-05 Cognitive Systems Corp. Controlling radio states for motion detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
WO2019016659A1 (fr) 2019-01-24
EP3431002A1 (fr) 2019-01-23
US20200170538A1 (en) 2020-06-04

Similar Documents

Publication Publication Date Title
US11500056B2 (en) Method, apparatus, and system for wireless tracking with graph-based particle filtering
EP2681900B1 Distance measurement with body motion capture
US9584975B2 (en) Techniques for determining movements based on sensor measurements from a plurality of mobile devices co-located with a person
US11755886B2 (en) Passive positioning with radio frequency sensing labels
CN109945855A Fine-grained position data collection
US20150185042A1 (en) Dynamic computation of distance of travel on wearable devices
US20150185045A1 (en) Dynamic calibration of relationships of motion units
EP3431002B1 RF based monitoring of user activity
EP3971812A1 Method for providing a clothing fitting service using a 3D avatar, and system therefor
KR20160062479A Method for data scheduling and power control and electronic device thereof
GB2548176A (en) Activity intensity level determination
CN109077710A Method, apparatus and system for adaptive heart rate estimation
Horng et al. The smart fall detection mechanism for healthcare under free-living conditions
Vecchio et al. Posture recognition using the interdistances between wearable devices
Hernandez et al. Wi-PT: Wireless sensing based low-cost physical rehabilitation tracking
Milosevic et al. Wireless MEMS for wearable sensor networks
KR20150057803A Interface system based on a multi-sensor wearable device, and method thereof
Courtay et al. Zyggie: A Wireless Body Area Network platform for indoor positioning and motion tracking
Shenoy Sensor information processing for wearable iot devices
CN114912065A Movement distance calculation method and apparatus, wearable device, and medium
Guizar et al. Impact of MAC scheduling on positioning accuracy for motion capture with ultra wideband body area networks
Zanaj et al. A wearable fall detection system based on lora lpwan technology
US20190142307A1 (en) Sensor data management
FI129882B (en) Sensor data management
KR101752387B1 (ko) Mobile terminal for abnormal activity detection, and system including same

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190301

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NOKIA TECHNOLOGIES OY

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200420

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210608

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1443256

Country of ref document: AT

Kind code of ref document: T

Effective date: 20211115

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017048604

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20211103

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1443256

Country of ref document: AT

Kind code of ref document: T

Effective date: 20211103

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220203

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220303

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220303

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220203

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220204

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017048604

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20220804

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20220720

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220720

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220731

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220731

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220720

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220731

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230527

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220720

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230531

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20170720

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211103