GB2579998A - Sensor Based context management - Google Patents


Info

Publication number
GB2579998A
Authority
GB
United Kingdom
Prior art keywords
sensor data
type
activity
machine
type sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2004037.4A
Other versions
GB2579998B (en)
GB202004037D0 (en)
Inventor
Eriksson Timo
Martikka Mikko
Lindman Erik
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suunto Oy
Original Assignee
Suunto Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Suunto Oy filed Critical Suunto Oy
Priority to GB2004037.4A priority Critical patent/GB2579998B/en
Priority claimed from GB1522525.3A external-priority patent/GB2545668B/en
Publication of GB202004037D0 publication Critical patent/GB202004037D0/en
Publication of GB2579998A publication Critical patent/GB2579998A/en
Application granted granted Critical
Publication of GB2579998B publication Critical patent/GB2579998B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • A61B2562/0204 Acoustic sensors
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B2562/0271 Thermal or temperature sensors
    • A61B2562/029 Humidity sensors
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/1112 Global tracking of patients, e.g. by using GPS
    • A61B5/1118 Determining activity level
    • A61B5/14542 Measuring characteristics of blood in vivo for measuring blood gases
    • A63B2024/0071 Distinction between different activities, movements, or kind of sports performed
    • A63B2220/40 Acceleration
    • A63B2220/803 Motion sensors

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Dentistry (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Telephonic Communication Services (AREA)

Abstract

An apparatus comprises: a memory to store sensor data and at least one processing core that compiles a message based on the sensor data. The message is transmitted from the apparatus to a server, causing the apparatus to receive from the server a machine readable instruction that comprises reference data of a predetermined activity type. An estimated activity type is derived by comparing the sensor data to the reference data. The sensor data is preferably acceleration sensor data; additional sensor data may include sound or vibration data. A server, methods and a computer program are also disclosed.

Description

SENSOR BASED CONTEXT MANAGEMENT
FIELD
[0001] The present invention relates to identification of user activity based on sensor information.
BACKGROUND
[0002] User sessions, such as training sessions, may be recorded, for example in notebooks, spreadsheets or other suitable media. Recorded training sessions enable more systematic training, and progress toward set goals can be assessed and tracked from the records so produced. Such records may be stored for future reference, for example to assess progress an individual is making as a result of the training. An activity session may comprise a training session or another kind of session.
[0003] Personal devices, such as, for example, smart watches, smartphones or smart jewellery, may be configured to produce recorded sessions of user activity. Such recorded sessions may be useful in managing physical training, child safety or in professional uses. Recorded sessions, or more generally sensor-based activity management, may be of varying type, such as, for example, running, walking, skiing, canoeing, wandering, or assisting the elderly.
[0004] Recorded sessions may be viewed using a personal computer, for example, wherein recordings may be copied from a personal device to the personal computer. Files on a personal computer may be protected using passwords and/or encryption, for example.
[0005] Personal devices may be furnished with sensors, which may be used, for example, in determining a location of the personal device. For example, a satellite positioning sensor may receive positioning information from a satellite constellation, and deduce therefrom where the personal device is located. A recorded training session may comprise a route determined by repeatedly determining the location of the personal device during the training session. Such a route may be later observed using a personal computer, for example.
[0006] Alternatively to a satellite positioning sensor, a personal device may be configured to determine its location using, for example, a cellular network based location determining method, wherein a cellular network is used to assist determination of the location. For example, a cell that maintains attachment of the personal device to the cellular network may have a known location, providing a location estimate of the personal device owing to the attachment and a finite geographic extent of the cell.
SUMMARY OF THE INVENTION
[0007] The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.
[0008] According to a first aspect of the present invention, there is provided an apparatus comprising a memory configured to store first-type sensor data, at least one processing core configured to compile a message based at least partly on the first-type sensor data, to cause the message to be transmitted from the apparatus, to cause receiving in the apparatus of a machine readable instruction, and to derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
[0009] Various embodiments of the first aspect may comprise at least one feature from the following bulleted list:
* the machine readable instruction comprises at least one of the following: an executable program, an executable script and a set of at least two machine-readable characteristics, wherein each of the characteristics characterizes sensor data produced during a predefined activity type
* the at least one processing core is configured to derive the estimated activity type at least in part by comparing, using the machine readable instruction, the first-type sensor data, or a processed form of the first-type sensor data, to reference data
* the first-type sensor data comprises acceleration sensor data
* the memory is further configured to store second-type sensor data, and the at least one processing core is configured to derive the estimated activity type, using the machine readable instruction, based at least in part on the second-type sensor data
* the second-type sensor data is of a different type than the first-type sensor data
* the second-type sensor data comprises at least one of sound sensor data, microphone-derived data and vibration sensor data
* the at least one processing core is configured to derive the estimated activity type at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type
* the at least one processing core is configured to present the estimated activity type to a user for verification
* the at least one processing core is configured to cause the memory to store, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type
* the at least one processing core is configured to cause the memory to delete the machine readable instruction responsive to a determination that an activity session has ended.
[0010] According to a second aspect of the present invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to receive a message from a user device, the message comprising information characterizing first-type sensor data, determine, based at least partly on the first-type sensor data, an activity context, and transmit to the user device a machine-readable instruction configured to cause activity type determination in the activity context.
[0011] According to a third aspect of the present invention, there is provided a method, comprising storing first-type sensor data in an apparatus, compiling a message based at least partly on the first-type sensor data, causing the message to be transmitted from the apparatus, causing receiving in the apparatus of a machine readable instruction, and deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
[0012] Various embodiments of the third aspect may comprise at least one feature corresponding to a feature in the preceding bulleted list laid out in connection with the first aspect.
[0013] According to a fourth aspect of the present invention, there is provided a method, comprising receiving a message from a user device, the message comprising information characterizing first-type sensor data, determining, based at least partly on the first-type sensor data, an activity context, and transmitting to the user device a machine-readable instruction configured to cause activity type determination in the activity context.
[0014] According to a fifth aspect of the present invention, there is provided an apparatus comprising means for storing first-type sensor data in an apparatus, means for compiling a message based at least partly on the first-type sensor data, means for causing the message to be transmitted from the apparatus, means for causing receiving in the apparatus of a machine readable instruction, and means for deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
[0015] According to a sixth aspect of the present invention, there is provided a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least store first-type sensor data, compile a message based at least partly on the first-type sensor data, cause the message to be transmitted from the apparatus, cause receiving in the apparatus of a machine readable instruction, and derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
[0016] According to a seventh aspect of the present invention, there is provided a computer program configured to cause a method in accordance with at least one of the third and fourth aspects to be performed.
According to an eighth aspect of the present invention, there is provided an apparatus for identification of user activity comprising: a memory configured to store first-type sensor data relating to an activity; at least one processing core configured to: compile a message based at least partly on the first-type sensor data, cause the message to be transmitted from the apparatus to a server external to the apparatus, cause receiving in the apparatus from the server a response to the message as a machine readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and derive an estimated activity type, using the reference data specific to the context, based at least partly on sensor data by comparing the sensor data to the reference data specific to the context.
Various embodiments of the eighth aspect may comprise at least one feature from the following bulleted list:
* the machine readable instruction comprises at least one of the following: an executable program and an executable script
* the at least one processing core is configured to derive the estimated activity type at least in part by comparing, using the machine readable instruction, the first-type sensor data, or a processed form of the first-type sensor data, to reference data
* the first-type sensor data comprises acceleration sensor data
* the memory is further configured to store second-type sensor data, and the at least one processing core is configured to derive the estimated activity type, using the machine readable instruction, based at least in part on the second-type sensor data
* the second-type sensor data is of a different type than the first-type sensor data
* the second-type sensor data comprises at least one of: sound sensor data, microphone-derived data and vibration sensor data
* the at least one processing core is configured to derive the estimated activity type at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type
* the at least one processing core is configured to present the estimated activity type to a user for verification
* the at least one processing core is configured to cause the memory to store, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type
* the at least one processing core is configured to cause the memory to delete the machine readable instruction responsive to a determination that an activity session has ended.
According to a ninth aspect of the present invention, there is provided a server comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the server at least to: receive a message from a user device, the message comprising information characterizing first-type sensor data relating to an activity; determine, based at least partly on the first-type sensor data, an activity context, and transmit to the user device a response comprising a machine-readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the user device is operating, and wherein the machine-readable instruction is configured to cause activity type determination in the activity context.
According to a tenth aspect of the present invention, there is provided a method carried out at an apparatus for identification of user activity, comprising: storing, in an apparatus, first-type sensor data relating to an activity; compiling a message based at least partly on the first-type sensor data; causing the message to be transmitted from the apparatus to a server external to the apparatus; causing receiving in the apparatus from the server a response to the message as a machine readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
Various embodiments of the tenth aspect may comprise at least one feature from the following bulleted list:
* the machine readable instruction comprises at least one of the following: an executable program and an executable script
* the estimated activity type is derived at least in part by comparing, using the machine readable instruction, the first-type sensor data, or a processed form of the first-type sensor data, to reference data
* the first-type sensor data comprises acceleration sensor data
* the method further comprises storing second-type sensor data, and the estimated activity type is derived, using the machine readable instruction, based at least in part on the second-type sensor data
* the second-type sensor data is of a different type than the first-type sensor data
* the second-type sensor data comprises at least one of sound sensor data, microphone-derived data and vibration sensor data
* the estimated activity type is derived at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type
* the method further comprises presenting the estimated activity type to a user for verification
* the method further comprises storing, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type
* the method further comprises deleting the machine readable instruction responsive to a determination that an activity session has ended.
According to an eleventh aspect of the present invention, there is provided a method carried out at a server, comprising: receiving a message from a user device, the message comprising information characterizing first-type sensor data relating to an activity; determining, based at least partly on the first-type sensor data, an activity context, and transmitting to the user device a machine-readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and wherein the machine-readable instruction is configured to cause activity type determination in the activity context.
According to a twelfth aspect of the present invention, there is provided a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus for identification of user activity to at least: store first-type sensor data relating to an activity; compile a message based at least partly on the first-type sensor data; cause the message to be transmitted from the apparatus to a server external to the apparatus; cause receiving in the apparatus from the server a response to the message as a machine readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
According to another aspect of the present invention, there is provided a computer program configured to cause a method in accordance with at least one of the tenth and eleventh aspects to be performed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] FIGURE 1 illustrates an example system in accordance with at least some embodiments of the present invention;
[0018] FIGURE 2 illustrates an example multisensorial time series;
[0019] FIGURE 2B illustrates a second example multisensorial time series;
[0020] FIGURE 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention;
[0021] FIGURE 4 illustrates signalling in accordance with at least some embodiments of the present invention, and
[0022] FIGURE 5 is a flow graph of a method in accordance with at least some embodiments of the present invention.
EMBODIMENTS
[0023] Employing sensor data to determine an activity type enhances usability of personal devices. For example, a user may be partly disabled, rendering unassisted use of a device more difficult. Employing sensor data from more than one sensor may further enhance the accuracy of activity type estimation. To empower a reduced-capability user device to determine an activity type, a machine-readable instruction may be selected for, and provided to, the user device by a back-end server. The selecting may be based on sensor data collected by the user device, such that the machine-readable instruction enables the user device to derive an estimated activity type in the user context in which the user device is present. Thus the derivation is partly performed in the back-end server, enabling the user device to provide a good estimation using only limited processor, energy and/or memory resources.
[0024] FIGURE 1 illustrates an example system in accordance with at least some embodiments of the present invention. The system comprises device 110, which may comprise, for example, a smart watch, digital watch, smartphone, phablet device, tablet device, or another type of suitable device. Device 110 may comprise a display, which may comprise a touchscreen display, for example. The display may be limited in size; an example of a limited-size display is a display worn on a wrist. Device 110 may be powered, for example, by a rechargeable battery.
[0025] Device 110 may be communicatively coupled with a communications network. For example, in FIGURE 1 device 110 is coupled, via wireless link 112, with base station 120. Base station 120 may comprise a cellular or non-cellular base station, wherein a non-cellular base station may be referred to as an access point. Examples of cellular technologies include wideband code division multiple access, WCDMA, and long term evolution, LTE, while examples of non-cellular technologies include wireless local area network, WLAN, and worldwide interoperability for microwave access, WiMAX. Base station 120 may be coupled with network node 130 via connection 123. Connection 123 may be a wire-line connection, for example. Network node 130 may comprise, for example, a controller or gateway device. Network node 130 may interface, via connection 134, with network 140, which may comprise, for example, the Internet or a corporate network. Network 140 may be coupled with further networks via connection 141. In some embodiments, device 110 is not configured to couple with base station 120. Network 140 may comprise, or be communicatively coupled with, a back-end server, for example.
[0026] Device 110 may be configured to receive, from satellite constellation 150, satellite positioning information via satellite link 151. The satellite constellation may comprise, for example, the global positioning system, GPS, or the Galileo constellation. Satellite constellation 150 may comprise more than one satellite, although only one satellite is illustrated in FIGURE 1 for the sake of clarity. Likewise, receiving the positioning information over satellite link 151 may comprise receiving data from more than one satellite.
[0027] Alternatively or additionally to receiving data from a satellite constellation, device 110 may obtain positioning information by interacting with a network in which base station 120 is comprised. For example, cellular networks may employ various ways to position a device, such as trilateration, multilateration or positioning based on an identity of a base station with which attachment is possible or ongoing. Likewise a non-cellular base station, or access point, may know its own location and provide it to device 110, enabling device 110 to position itself within communication range of this access point.
[0028] Device 110 may be configured to obtain a current time from satellite constellation 150, base station 120 or by requesting it from a user, for example. Once device 110 has the current time and an estimate of its location, device 110 may consult a look-up table, for example, to determine a time remaining until sunset or sunrise, for example. Device 110 may likewise gain knowledge of the time of year.
[0029] Device 110 may comprise, or be coupled with, at least one sensor, such as, for example, an acceleration sensor, moisture sensor, temperature sensor, heart rate sensor or a blood oxygen level sensor. Device 110 may be configured to produce and store, using the at least one sensor, sensor data, for example in a time series that comprises a plurality of samples taken in a time sequence.
[0030] Device 110 may be configured to provide an activity session. An activity session may be associated with an activity type. Examples of activity types include rowing, paddling, cycling, jogging, walking, hunting, swimming and paragliding. In a simple form, an activity session may comprise device 110 storing sensor data produced with sensors comprised in device 110, or in another device with which device 110 is associated or paired. An activity session may be determined to have started and ended at certain points in time, such that the determination takes place afterward or concurrently with the starting and/or ending. In other words, device 110 may store sensor data to enable subsequent identification of activity sessions based at least partly on the stored sensor data.
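Purely as an illustrative sketch (the structure and field names below are invented, not part of the disclosure), a time series of the kind described might be buffered per sensor type as timestamped samples:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Sample:
    t: float                   # timestamp, seconds since session start
    value: Tuple[float, ...]   # e.g. (ax, ay, az) for an acceleration sensor

@dataclass
class SensorTimeSeries:
    sensor_type: str           # e.g. "acceleration", "moisture"
    samples: List[Sample] = field(default_factory=list)

    def append(self, t: float, value: Tuple[float, ...]) -> None:
        self.samples.append(Sample(t, value))

# A session recording keeps one series per sensor type.
recording = {
    "acceleration": SensorTimeSeries("acceleration"),
    "moisture": SensorTimeSeries("moisture"),
}
recording["acceleration"].append(0.02, (0.1, 9.8, 0.3))
```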
[0031] An activity session in device 110 may enhance the utility a user can obtain from the activity. For example, where the activity involves movement outdoors, the activity session may provide a recording of the activity. Device 110 may, in some embodiments, provide the user with contextual information. Such contextual information may comprise, for example, locally relevant weather information received via base station 120. Such contextual information may comprise at least one of the following: a rain warning, a temperature warning, an indication of time remaining before sunset, an indication of a nearby service that is relevant to the activity, a security warning, an indication of nearby users and an indication of a nearby location where several other users have taken photographs. Contextual information may be presented during an activity session.
[0032] A recording of an activity session may comprise information on at least one of the following: a route taken during the activity session, a metabolic rate or metabolic effect of the activity session, a time the activity session lasted, a quantity of energy consumed during the activity session, a sound recording obtained during the activity session and an elevation map along the length of the route taken during the activity session. A route may be determined based on positioning information, for example. Metabolic effect and consumed energy may be determined, at least partly, based on information concerning the user that device 110 has access to. A recording may be stored in device 110, an auxiliary device, or in a server or data cloud storage service. A recording stored in a server or cloud may be encrypted prior to transmission to the server or cloud, to protect privacy of the user. A recording may be produced even if the user has not indicated an activity session has started, since a beginning and ending of an activity session may be determined after the session has ended, for example based, at least partly, on sensor data.
[0033] Device 110 may have access to a backhaul communications link to provide indications relating to ongoing activity. For example, search and rescue services may be given access to information on joggers in a certain area of a forest, to enable their rescue if a chemical leak, for example, makes the forest unsafe for humans. Alternatively or additionally, further users may be enabled to receive information on ongoing activity sessions. Such further users may be pre-configured as users who are cleared to receive such information, with non-cleared users not being provided such information, for example. As a specific example, users on a friend list may be able to obtain information on an ongoing activity session. The friend list may be maintained in a social media service, for example. The information on the ongoing activity session may be provided from device 110 as periodic updates, for example.
[0034] After an activity has ended, device 110 may have stored therein, or in a memory to which device 110 has access, sensor data. The stored sensor data may be stored as a time series that spans the activity session as well as time preceding and/or succeeding the activity session. The beginning and ending points in time of the activity session may be selected from the time series by the user. The beginning and ending points may be preselected by device 110, or a personal computer, for a user to accept or reject, the pre-selecting being based on changes in the time series. For example, where, in the time series, acceleration sensor data begins to indicate more active movements of device 110, a beginning point of an activity session may be pre-selected. Such a change may correspond to a time in the time series when the user stopped driving a car and began jogging, for example. Likewise, a phase in the time series where the more active movements end may be pre-selected as an ending point of the activity session.
[0035] Sensor data may comprise information from more than one sensor, wherein the more than one sensor may comprise sensors of at least two distinct types. For example, sensor data may comprise acceleration sensor data and air pressure sensor data. Further examples are sound volume sensor data, moisture sensor data and electromagnetic sensor data. In general sensor data from a sensor of a first type may be referred to as first-type sensor data, and sensor data from a sensor of a second type may be referred to as second-type sensor data. A type of a sensor may be defined by a physical property the sensor is configured to measure. For example, all temperature sensors may be considered temperature-type sensors, regardless of the physical principle used to measure temperature in the temperature sensor.
[0036] Pre-selecting the beginning and/or ending points in the time series may comprise detecting that sensor data characteristics of more than one sensor type changes at approximately the same phase of the time series. Using more than one type of sensor data may enhance the accuracy of beginning and/or ending time point pre-selection, in case the sensors in question are affected by the activity performed during the activity session.
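A minimal single-sensor sketch of such pre-selection, assuming that "more active movements" show up as increased short-window variance of the acceleration magnitude (the window length and threshold below are illustrative only, not from the disclosure):

```python
import numpy as np

def preselect_boundaries(acc_mag: np.ndarray, fs: float,
                         win_s: float = 5.0, thresh: float = 2.0):
    """Pre-select (begin_idx, end_idx) as the span where the windowed
    variance of acceleration magnitude exceeds `thresh`; None if never."""
    win = max(1, int(win_s * fs))
    n = len(acc_mag) - win
    var = np.array([acc_mag[i:i + win].var() for i in range(max(n, 0))])
    active = np.flatnonzero(var > thresh)
    if active.size == 0:
        return None
    return int(active[0]), int(active[-1] + win)
```

The user would then accept or reject the proposed span, as described above.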
[0037] An activity type may be determined based, at least partly, on the sensor data. This determining may take place when the activity is occurring, or afterwards, when analysing the sensor data. The activity type may be determined by device 110 or by a personal computer that has access to the sensor data, for example, or a server that is provided access to the sensor data. Where a server is given access to the sensor data, the sensor data may be anonymized. The determination of the activity type may comprise comparing the sensor data to reference data. The reference data may comprise reference datasets, each reference dataset being associated with an activity type. The determination may comprise determining the reference dataset that most resembles the sensor data, for example in a least-squares sense. Alternatively to the sensor data itself, a processed form of the sensor data may be compared to the reference data. The processed form may comprise, for example, a frequency spectrum obtained from the sensor data. Alternatively, the processed form may comprise a set of local minima and/or maxima from the sensor data time series. The determined activity type may be selected as the activity type associated with the reference dataset that most resembles the processed or original sensor data.
[0038] Different activity types may be associated with different characteristic frequencies. For example, acceleration sensor data may reflect a higher characteristic frequency when the user has been running, as opposed to walking. Thus the determination of activity type may be based, in some embodiments, at least partly, on deciding which reference dataset has a characteristic frequency that most closely matches a characteristic frequency of a section of the sensor-derived information time series under investigation. Alternatively or in addition, acceleration sensor data may be employed to determine a characteristic movement amplitude.
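A hedged sketch of frequency-based matching, assuming the processed form of the sensor data is a frequency spectrum and that each reference dataset reduces to a single characteristic frequency (the reference values are invented):

```python
import numpy as np

def characteristic_frequency(signal: np.ndarray, fs: float) -> float:
    """Dominant frequency (Hz) of a zero-meaned signal, via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(freqs[spectrum.argmax()])

# Illustrative characteristic frequencies per activity type.
REFERENCE_HZ = {"walking": 1.8, "running": 2.8, "cycling": 1.3}

def estimate_activity(signal: np.ndarray, fs: float) -> str:
    """Pick the reference whose characteristic frequency is nearest."""
    f = characteristic_frequency(signal, fs)
    return min(REFERENCE_HZ, key=lambda a: abs(REFERENCE_HZ[a] - f))
```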
[0039] Where device 110 is configured to store a time series of more than one type of sensor data, plural sensor data types may be employed in determining the activity type. The reference data may comprise reference datasets that are multi-sensorial in nature, in such a way that each reference dataset comprises data that may be compared to each sensor data type that is available. For example, where device 110 is configured to compile a time series of acceleration and sound sensor data types, the reference data may comprise reference datasets, each reference dataset corresponding to an activity type, wherein each reference dataset comprises data that may be compared with the acceleration data and data that may be compared with the sound data. The determined activity type may be determined as the activity type that is associated with the multi-sensorial reference dataset that most closely matches the sensor data stored by device 110. Again, original or processed sensor data may be compared to the reference datasets. Where device 110 comprises, for example, a smartphone, it may comprise plural sensors to accomplish the smartphone function. Examples of such sensors comprise microphones to enable voice calls and cameras to enable video calls. Furthermore a radio receiver may, in some cases, be configurable to measure electric or magnetic field properties. Device 110 may comprise a radio receiver, in general, where device 110 is furnished with a wireless communication capability.
[0040] A first example of multi-sensorial activity type determination is hunting, wherein device 110 stores first-type sensor data that comprises acceleration sensor data and second-type sensor data that comprises sound data. The reference data would comprise a hunting reference dataset, which would comprise acceleration reference data and sound reference data, to enable comparison with sensor data stored by device 110. Hunting may involve periods of low sound and low acceleration and intermittent combinations of loud, short sound and a low-amplitude high-frequency acceleration corresponding to a gunshot sound and kick.
[0041] A second example of multi-sensorial activity type determination is swimming, wherein device 110 stores first-type sensor data that comprises moisture sensor data and second-type sensor data that comprises magnetic field data from a compass sensor. The reference data would comprise a swimming reference dataset, which would comprise moisture reference data and magnetic field reference data, to enable comparison with sensor data stored by device 110. Swimming may involve high moisture due to being immersed in water, and elliptical movements of an arm, to which device 110 may be attached, which may be detectable as periodically varying magnetic field data. In other words, the direction of the Earth's magnetic field may vary from the point of view of the magnetic compass sensor in a periodic way in the time series.
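As a rough sketch of the multi-sensorial comparison described above, a least-squares distance can be summed over whichever sensor types appear in both the observed data and a reference dataset; the feature names and reference values below are invented for illustration:

```python
import numpy as np

def multi_sensor_distance(features: dict, reference: dict) -> float:
    """Sum of squared differences over sensor types shared by the
    observed feature dict and one reference dataset."""
    shared = features.keys() & reference.keys()
    return sum(
        float(np.sum((np.asarray(features[k]) - np.asarray(reference[k])) ** 2))
        for k in shared
    )

# Invented multi-sensorial reference datasets, one per activity type.
REFERENCES = {
    "swimming": {"moisture": 0.9, "mag_period_s": 1.4},
    "cycling":  {"moisture": 0.1, "mag_period_s": 30.0},
}

def classify(features: dict) -> str:
    """Activity type whose reference dataset most resembles the data."""
    return min(REFERENCES,
               key=lambda a: multi_sensor_distance(features, REFERENCES[a]))
```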
[0042] Overall, a determined, or derived, activity type may be considered an estimated activity type until the user has confirmed the determination is correct. In some embodiments, a few, for example two or three, most likely activity types may be presented to the user as estimated activity types for the user to choose the correct activity type from. Using two or more types of sensor data increases the likelihood that the estimated activity type is correct.
[0043] A context process may be employed in deriving an estimated activity type based on sensor data. A context process may comprise first determining a context in which the sensor data has been produced. For example, the context process may comprise using the sensor data to determine the context, such as a user context, and then deriving an activity type within that context. For example, a context may comprise outdoor activity, and deriving an estimated activity type may comprise first determining, based on the sensor data, the user is in an outdoor context, selecting an outdoor-context machine readable instruction, and using the machine-readable instruction to differentiate between different outdoor-context activity types, such as jogging and orienteering. As another example, a context may comprise indoor activity, and deriving an estimated activity type may comprise first determining, based on the sensor data, the user is in an indoor context, selecting an indoor-context machine readable instruction, and using this machine-readable instruction to differentiate between different indoor activity types, such as 100 meter runs and wrestling.
[0044] The machine readable instruction may comprise, for example, a script, such as an executable or compilable script, an executable computer program, a software plugin or a non-executable computer-readable descriptor that enables device 110 to differentiate between at least two activity types within the determined context. The machine readable instruction may comprise indications as to which type or types of sensor data, and in which format, are to be used in deriving the activity type using the machine readable instruction.
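A non-executable descriptor of this kind could, purely hypothetically (every field name below is invented), indicate which sensor inputs to use, in what format, and the per-activity reference characteristics:

```python
# Hypothetical outdoor-context descriptor delivered by the server.
OUTDOOR_INSTRUCTION = {
    "context": "outdoor",
    "inputs": [
        {"sensor": "acceleration", "format": "magnitude", "rate_hz": 50},
    ],
    "characteristics": [
        {"activity": "jogging",      "dominant_freq_hz": 2.6, "amplitude_g": 1.2},
        {"activity": "orienteering", "dominant_freq_hz": 2.2, "amplitude_g": 1.0},
    ],
}
```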
[0045] Determining an outdoor context may comprise determining the sensor data indicates a wide range of geographic movement, indicating the user has roamed outdoors. Determining an indoor context may comprise determining the sensor data indicates a narrower range of geographic movement, indicating the user has remained within a small range during the activity session. Where temperature-type sensor data is available, a lower temperature may be associated with an outdoor activity and a higher temperature may be associated with an indoor activity. The temperature may be indicative of this in particular where the user is in a geographic area where winter, autumn or spring conditions cause an outdoor temperature to be lower than an indoor temperature. The geographic area may be available in positioning data.
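One plausible realisation of the geographic-range test, sketched with an equirectangular approximation and an invented threshold:

```python
import math

def roaming_range_m(lats, lons):
    """Rough diagonal of the bounding box of a position trace, in metres
    (equirectangular approximation; sufficient as a coarse cue)."""
    lat0 = math.radians(sum(lats) / len(lats))
    dlat = (max(lats) - min(lats)) * 111_320.0
    dlon = (max(lons) - min(lons)) * 111_320.0 * math.cos(lat0)
    return math.hypot(dlat, dlon)

def context_from_positions(lats, lons, threshold_m=100.0):
    """Wide roaming suggests an outdoor context, narrow an indoor one."""
    return "outdoor" if roaming_range_m(lats, lons) > threshold_m else "indoor"
```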
[0046] Therefore, in some embodiments, deriving an estimated activity type is a two-phase process, first comprising determining a context based on the sensor data, and then deriving an estimated activity type within that context, using a machine-readable instruction specific to that context. Selecting the context and/or the activity type within the context may comprise comparing sensor data, or processed sensor data, to reference data. The two-phase process may employ two types of reference data, context-type reference data and activity-type reference data, respectively.
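The two-phase derivation might be sketched as below, with both phases implemented as nearest-reference selection in a least-squares sense (a simplification; the disclosure leaves the matching method open):

```python
import numpy as np

def nearest(features: dict, references: dict) -> str:
    """Label of the reference whose shared features are closest to
    `features` in a least-squares sense."""
    def dist(ref: dict) -> float:
        return sum(float(np.sum((np.asarray(features[k]) -
                                 np.asarray(ref[k])) ** 2))
                   for k in features.keys() & ref.keys())
    return min(references, key=lambda name: dist(references[name]))

def derive_activity_type(features: dict, context_refs: dict,
                         activity_refs_by_context: dict) -> str:
    """Phase 1: select a context from context-type reference data.
    Phase 2: select an activity type from that context's
    activity-type reference data."""
    context = nearest(features, context_refs)
    return nearest(features, activity_refs_by_context[context])
```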
[0047] The context process may adaptively learn, based on previous activity sessions recorded by a plurality of users, how to more accurately determine contexts and/or activity types. The determining of context may be based on the context-type reference data, for example, the context-type reference data being adaptively updated in dependence of the previous sessions recorded by the plurality of users. Adapting the context-type reference data may take place in a server, for example, the server being configured to provide updated context-type reference data to devices such as device 110, or a personal computer associated therewith. A server may have access to information from the plurality of users, and high processing capability, and thus be more advantageously placed to update the context-type reference data than device 110, for example.
[0048] The two-phase process described above may be performed in a distributed fashion, wherein a user device, such as device 110, initially obtains sensor data of at least one, and in some embodiments at least two, types. This sensor data is used to compile a message, the message comprising the sensor data, at least in part, in raw or processed format, the message being transmitted from the user device to a back-end server. The server may use the sensor data in the message to determine a context in which device 110 seems to be. This determining may be based on reference data, for example. The server may then provide a machine-readable instruction to the user device, to thereby enable the user device to derive an activity type within the context. This deriving may also be based, at least partly, on reference data. The reference data used in the server need not be the same as the reference data used in the user device.
[0049] Selection of the machine readable instruction in the server may be based, in addition to the sensor data, on capabilities of device 110. In particular, the machine readable instruction may be selected so that sensor data device 110 is capable of producing may be accepted as input to the machine readable instruction. To enable this selecting, the message transmitted from device 110 to the server may comprise an indication concerning device 110. Examples of such an indication include a model and make of device 110, a serial number of device 110 and an indication of sensor types disposed in device 110. Alternatively to being comprised in the same message, device 110 may provide the indication to the server in another message.
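A sketch of such a message, with invented field names, bundling processed first-type sensor data with an indication of the device's sensing capabilities so the server can select a compatible instruction:

```python
import json

def compile_context_message(model: str, serial: str,
                            sensor_types: list, features: dict) -> bytes:
    """Message from the user device to the back-end server."""
    return json.dumps({
        "device": {"model": model, "serial": serial,
                   "sensors": sensor_types},
        "features": features,  # raw or processed first-type sensor data
    }).encode("utf-8")

msg = compile_context_message(
    "watch-x1", "SN123", ["acceleration", "moisture"],
    {"acc_dominant_freq_hz": 2.7, "roaming_range_m": 450.0},
)
```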
[0050] An advantage of the distributed two-phase process is that the user device need not be enabled to detect a great range of potential activity types with different characteristics. For example, the size of reference data used in the user device may be reduced by performing context detection in a server. The machine readable instruction may comprise, at least in part, the reference data used in the user device. As the size of reference data may be thus reduced, the user device may be built with less memory and/or the activity type derivation may consume less memory in the user device.
[0051] The machine readable instruction may enable detecting events within the context. An example is detecting a number of shots fired when hunting. Further examples include a number of times a down-hill slope has been skied down, a number of laps run on a track, a number of golf strokes played and, where applicable, initial velocities of golf balls immediately following a stroke. Further examples of detecting events are detecting a number of steps during running or strokes during swimming.
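Event detection within a context can be as simple as counting well-separated peaks in a suitable signal; the sketch below (thresholds illustrative) would count steps in an acceleration-magnitude trace or shots in a sound envelope:

```python
import numpy as np

def count_events(signal: np.ndarray, threshold: float, min_gap: int) -> int:
    """Count local maxima above `threshold` spaced at least `min_gap`
    samples apart, e.g. steps, strokes or gunshots."""
    count, last = 0, -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last >= min_gap):
            count += 1
            last = i
    return count
```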
[0052] While an activity type is in general selectable within a context, an activity type may itself be seen as a context within which a further selection may be possible. For example, an initial context may comprise indoor activity, wherein an activity type may be identified as watersports. Within that activity type, swimming may be derived as an activity type. Further, breaststroke may be derived as an activity type where swimming is seen as a context. Therefore, while the terms "context" and "activity type" are employed herein for convenience, what is relevant is the hierarchical relationship between the two.
[0053] A user interface of device 110 may be modified based on the context or activity type. For example, where hunting is derived as an activity type, a user interface may display the number of detected shots. As another example, when swimming the number of laps, or the distance covered, may be displayed. In some embodiments, user interface adaptations have a hierarchical behaviour, such that initially, for example, an outdoor user interface is activated as a response to a determination that the current context is an outdoor context. When an activity type is identified within the context, the user interface may be further adapted to serve that specific activity type. In some embodiments, device 110 is configured to receive from a server user interface information to enable device 110 to adapt to a great number of situations, without a need to store all the user interface adaptations in device 110 beforehand.
[0054] In some embodiments, a server may transmit a request message to device 110, the request message being configured to cause device 110 to perform at least one measurement, such as a capturing of sensor data, for example, and to return a result of the measurement to the server. This way, the server may be enabled to detect what is occurring in the surroundings of device 110.
[0055] The machine readable instructions may be adapted by the server. For example, a user who first obtains a device 110 may initially be provided, responsive to the messages sent from device 110, with machine readable instructions that reflect an average user population. Thereafter, as the user engages in activity sessions, the machine readable instructions may be adapted to more accurately reflect use by this particular user. For example, limb length may affect periodical properties of sensor data captured while the user is swimming. To enable the adapting, the server may request sensor data from device 110, for example periodically, and compare sensor data so obtained to the machine readable instructions, to hone the instructions for future use with this particular user.
[0056] FIGURE 2 illustrates an example multisensorial time series. On the upper axis, 201, is illustrated a moisture sensor time series 210 while the lower axis, 202, illustrates a time series 220 of deviation of magnetic north from an axis of device 110.
[0057] Moisture time series 210 displays an initial portion of low moisture, followed by a rapid increase of moisture that then remains at a relatively constant, elevated, level before beginning to decline, at a lower rate than the increase, as device 110 dries.
[0058] Magnetic deviation time series 220 displays an initial, erratic sequence of deviation changes owing to movement of the user as he operates a locker room lock, for example, followed by a period of approximately periodic movements, before an erratic sequence begins once more. The wavelength of the periodically repeating motion has been exaggerated in FIGURE 2 to render the illustration clearer.
[0059] A swimming activity type may be determined as an estimated activity type, beginning from point 203 and ending in point 205 of the time series, based on a comparison with a reference dataset comprised in a machine readable instruction associated with a water sports context, for example. Via the reference dataset, swimming as an activity type may be associated with simultaneous high moisture and periodic movements.
[0060] FIGURE 2B illustrates a second example multisensorial time series. In FIGURE 2B, like numbering denotes like elements as in FIGURE 2. Unlike in FIGURE 2, not one but two activity sessions are determined in the time series of FIGURE 2B. Namely, a cycling session is determined to start at beginning point 207 and to end at point 203, when the swimming session begins. Thus the compound activity session may relate to triathlon, for example. In cycling, moisture remains low, and magnetic deviation changes only slowly, for example as the user cycles in a velodrome.
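A rough sketch of how the segments of FIGURE 2B might be labelled, assuming swimming shows as high moisture together with periodic magnetic deviation and cycling as low moisture with slowly varying deviation (all thresholds invented):

```python
import numpy as np

def is_periodic(segment: np.ndarray, fs: float,
                min_period_s: float = 0.5, min_corr: float = 0.6) -> bool:
    """Rough periodicity cue: a strong normalised autocorrelation peak
    at a lag of at least `min_period_s` seconds."""
    x = segment - segment.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    if ac[0] <= 0:
        return False
    ac = ac / ac[0]
    start = int(min_period_s * fs)
    return start < len(ac) and bool(np.max(ac[start:]) > min_corr)

def label_segment(moisture_level: float, mag_deviation: np.ndarray,
                  fs: float) -> str:
    """High moisture plus periodic deviation -> swimming; else cycling."""
    if moisture_level > 0.5 and is_periodic(mag_deviation, fs):
        return "swimming"
    return "cycling"
```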
[0061] FIGURE 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention. Illustrated is device 300, which may comprise, for example, a mobile communication device such as device 110 of FIGURE 1 or FIGURE 2. Comprised in device 300 is processor 310, which may comprise, for example, a single- or multi-core processor, wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. Processor 310 may comprise more than one processor. A processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Steamroller processing core produced by Advanced Micro Devices Corporation. Processor 310 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor. Processor 310 may comprise at least one application-specific integrated circuit, ASIC. Processor 310 may comprise at least one field-programmable gate array, FPGA. Processor 310 may be means for performing method steps in device 300. Processor 310 may be configured, at least in part by computer instructions, to perform actions.
[0062] Device 300 may comprise memory 320. Memory 320 may comprise random-access memory and/or permanent memory. Memory 320 may comprise at least one RAM chip. Memory 320 may comprise solid-state, magnetic, optical and/or holographic memory, for example. Memory 320 may be at least in part accessible to processor 310. Memory 320 may be at least in part comprised in processor 310. Memory 320 may be means for storing information. Memory 320 may comprise computer instructions that processor 310 is configured to execute. When computer instructions configured to cause processor 310 to perform certain actions are stored in memory 320, and device 300 overall is configured to run under the direction of processor 310 using computer instructions from memory 320, processor 310 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 320 may be at least in part external to device 300 but accessible to device 300.
[0063] Device 300 may comprise a transmitter 330. Device 300 may comprise a receiver 340. Transmitter 330 and receiver 340 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard.
Transmitter 330 may comprise more than one transmitter. Receiver 340 may comprise more than one receiver. Transmitter 330 and/or receiver 340 may be configured to operate in accordance with global system for mobile communication, GSM, wideband code division multiple access, WCDMA, long term evolution, LTE, IS-95, wireless local area network, WLAN, Ethernet and/or worldwide interoperability for microwave access, WiMAX, standards, for example.
[0064] Device 300 may comprise a near-field communication, NFC, transceiver 350.
NFC transceiver 350 may support at least one NFC technology, such as NFC, Bluetooth, Wibree or similar technologies.
[0065] Device 300 may comprise user interface, UI, 360. UI 360 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 300 to vibrate, a speaker and a microphone. A user may be able to operate device 300 via UI 360, for example to manage activity sessions.
[0066] Device 300 may comprise or be arranged to accept a user identity module 370. User identity module 370 may comprise, for example, a subscriber identity module, SIM, card installable in device 300. A user identity module 370 may comprise information identifying a subscription of a user of device 300. A user identity module 370 may comprise cryptographic information usable to verify the identity of a user of device 300 and/or to facilitate encryption of communicated information and billing of the user of device 300 for communication effected via device 300.
[0067] Processor 310 may be furnished with a transmitter arranged to output information from processor 310, via electrical leads internal to device 300, to other devices comprised in device 300. Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 320 for storage therein. Alternatively to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise processor 310 may comprise a receiver arranged to receive information in processor 310, via electrical leads internal to device 300, from other devices comprised in device 300. Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 340 for processing in processor 310. Alternatively to a serial bus, the receiver may comprise a parallel bus receiver.
[0068] Device 300 may comprise further devices not illustrated in FIGURE 3. For example, where device 300 comprises a smartphone, it may comprise at least one digital camera. Some devices 300 may comprise a back-facing camera and a front-facing camera, wherein the back-facing camera may be intended for digital photography and the front-facing camera for video telephony. Device 300 may comprise a fingerprint sensor arranged to authenticate, at least in part, a user of device 300. In some embodiments, device 300 lacks at least one device described above. For example, some devices 300 may lack an NFC transceiver 350 and/or user identity module 370.
[0069] Processor 310, memory 320, transmitter 330, receiver 340, NFC transceiver 350, UI 360 and/or user identity module 370 may be interconnected by electrical leads internal to device 300 in a multitude of different ways. For example, each of the aforementioned devices may be separately connected to a master bus internal to device 300, to allow for the devices to exchange information. However, as the skilled person will appreciate, this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the present invention.
[0070] FIGURE 4 illustrates signalling in accordance with at least some embodiments of the present invention. On the vertical axes are disposed, on the left, device 110 of FIGURE 1, and on the right, a server SRV. Time advances from the top toward the bottom. Initially, in phase 410, device 110 obtains sensor data from at least one, and in some embodiments from at least two, sensors. The sensor data may comprise first-type sensor data and, in some embodiments, also second-type sensor data. The sensor or sensors may be comprised in device 110, for example. The sensor data may be stored in a time series, for example at a sampling frequency of 1 Hz, 10 Hz, 1 kHz or indeed another sampling frequency. The sampling interval need not be the same in first-type sensor data and second-type sensor data.
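A minimal sketch of such multi-rate storage, assuming illustrative names (SensorSeries, interval_s) not taken from the disclosure, might look as follows:

```python
# Sketch of storing first-type and second-type sensor data as time series with
# independent sampling rates; class and field names are assumptions for illustration.
import time
from collections import deque
from typing import Optional

class SensorSeries:
    """Fixed-capacity time series of (timestamp, value) samples."""

    def __init__(self, rate_hz: float, capacity: int = 100_000):
        self.interval_s = 1.0 / rate_hz         # nominal spacing between samples
        self.samples = deque(maxlen=capacity)   # oldest samples are discarded

    def append(self, value: float, timestamp: Optional[float] = None) -> None:
        self.samples.append((time.time() if timestamp is None else timestamp, value))

# First-type data (e.g. acceleration) at 10 Hz and second-type data
# (e.g. moisture) at 1 Hz -- the sampling need not match between types.
acceleration = SensorSeries(rate_hz=10.0)
moisture = SensorSeries(rate_hz=1.0)
for i in range(10):
    acceleration.append(value=9.81, timestamp=i * acceleration.interval_s)
moisture.append(value=0.05, timestamp=0.0)
```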
[0071] Phase 410 may comprise one or more activity sessions of at least one activity type. Where multiple activity sessions are present, they may be of the same activity type or different activity types. The user need not, in at least some embodiments, indicate to device 110 that activity sessions are ongoing. During phase 410, device 110 may, but in some embodiments need not, identify activity types or sessions. The time series compiled during phase 410 may last 10 or 24 hours, for example. As a specific example, the time series may last from the previous time sensor data was downloaded from device 110 to another device, such as, for example, personal computer PC1.
[0072] In phase 420, the sensor data is provided, at least partly, in raw or processed format to server SRV. This phase may further comprise providing to server SRV optional activity and/or event reference data. The providing may proceed via base station 120, for example. The time series may be encrypted during downloading to protect the user's privacy.
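As one hedged example of the encryption mentioned in paragraph [0072], the following sketch protects the time series with the Fernet scheme of the Python cryptography package; the choice of scheme and the key handling are assumptions for illustration rather than the mechanism of the embodiments.

```python
# Illustrative encryption of the time series before upload; Fernet is an
# assumed choice, and in practice the key would be shared with server SRV.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # stand-in for a key provisioned to both parties
cipher = Fernet(key)

time_series = {"moisture": [0.05, 0.9, 0.9], "sampling_hz": 1.0}
payload = cipher.encrypt(json.dumps(time_series).encode("utf-8"))

# payload can now be sent via base station 120; the server decrypts with:
restored = json.loads(cipher.decrypt(payload).decode("utf-8"))
assert restored == time_series
```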
[0073] In phase 430, server SRV may determine, based at least partly on the sensor data in the message of phase 420, a context and an associated machine readable instruction. Where activity and/or event reference data is provided in phase 420, that data may be employed in phase 430.
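One simplified, hypothetical realization of the determination of phase 430 is sketched below; the context table, summary fields and thresholds are invented for illustration.

```python
# Hedged sketch of phase 430: the server maps a coarse summary of the uploaded
# sensor data to a context and selects the machine readable instruction to
# return in phase 440. All names and thresholds are assumptions.
def determine_context(mean_moisture: float, mean_speed_mps: float) -> str:
    """Pick an activity context from a crude summary of the sensor data."""
    if mean_moisture > 0.8:
        return "water_sports"
    if mean_speed_mps > 5.0:
        return "cycling"
    return "general"

# Each context carries reference data characterizing its predefined activity
# types; the identifiers are hypothetical.
INSTRUCTIONS = {
    "water_sports": {"activities": ["swimming", "kayaking"], "reference": "wet_periodic_v1"},
    "cycling": {"activities": ["road", "velodrome"], "reference": "low_moisture_smooth_v1"},
    "general": {"activities": ["walking", "running"], "reference": "gait_v1"},
}

context = determine_context(mean_moisture=0.9, mean_speed_mps=0.7)
instruction = INSTRUCTIONS[context]  # transmitted to device 110 in phase 440
print(context, instruction["activities"])  # water_sports ['swimming', 'kayaking']
```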
[0074] In phase 440 the machine readable instruction determined in phase 430 is provided to device 110, enabling, in phase 450, a derivation of an estimated activity type within that context, based on sensor data. The derivation of phase 450 may be based on sensor data that was included in the message of phase 420, or device 110 may capture new sensor data and use it, with the machine readable instruction, in the derivation of phase 450.
[0075] FIGURE 5 is a flow graph of a method in accordance with at least some embodiments of the present invention. The phases of the illustrated method may be performed in device 110, an auxiliary device or a personal computer, for example, or in a control device configured to control the functioning thereof, when installed therein.
[0076] Phase 510 comprises storing first-type sensor data in an apparatus. Phase 520 comprises compiling a message based at least partly on the first-type sensor data. Phase 530 comprises causing the message to be transmitted from the apparatus. Phase 540 comprises causing receiving in the apparatus of a machine readable instruction. Finally, phase 550 comprises deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data. The message of phase 520 may comprise activity and/or event reference data.
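Tying the phases together, the following Python sketch walks through phases 510-550 on the device side, with the transport abstracted to a send/receive pair and a plain dictionary standing in for the machine readable instruction; this is an assumption-laden sketch, not the disclosed wire format.

```python
# Hypothetical end-to-end walk through phases 510-550 on the device side.
def run_identification(sensor_data: list, send, receive) -> str:
    stored = list(sensor_data)                        # phase 510: store first-type data
    message = {"summary": sum(stored) / len(stored)}  # phase 520: compile message
    send(message)                                     # phase 530: transmit to server
    instruction = receive()                           # phase 540: receive instruction
    # Phase 550: derive the estimated activity type by comparing against the
    # reference characteristics carried in the instruction.
    for activity, (low, high) in instruction["reference"].items():
        if low <= message["summary"] <= high:
            return activity
    return "unknown"

# Loopback example with a canned server response:
reply = {"reference": {"swimming": (0.7, 1.0), "cycling": (0.0, 0.3)}}
print(run_identification([0.8, 0.9, 0.85], send=lambda m: None, receive=lambda: reply))
# -> "swimming" (the mean of 0.85 falls in the swimming band)
```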
[0077] It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.
[0078] Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.
[0079] As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.
[0080] Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the preceding description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
[0081] While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
[0082] The verbs "to comprise" and "to include" are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in depending claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of "a" or "an", that is, a singular form, throughout this document does not exclude a plurality.
INDUSTRIAL APPLICABILITY
[0083] At least some embodiments of the present invention find industrial application in facilitating analysis of sensor data.
ACRONYMS LIST
GPS Global Positioning System
LTE Long Term Evolution
NFC Near-Field Communication
WCDMA Wideband Code Division Multiple Access
WiMAX Worldwide Interoperability for Microwave Access
WLAN Wireless Local Area Network
REFERENCE SIGNS LIST
110 Device
120 Base Station
Network Node
Network
Satellite Constellation
201, 202 Axes in FIGURE 2
203, 205, 207 Activity session endpoints in FIGURE 2 and FIGURE 2B
210, 220 Sensor data time series in FIGUREs 2 and 2B
310-370 Structure illustrated in FIGURE 3
410-450 Phases of the method of FIGURE 4
510-550 Phases of the method of FIGURE 5

Claims (15)

CLAIMS:

1. An apparatus for identification of user activity comprising: a memory configured to store first-type sensor data relating to an activity; at least one processing core configured to: compile a message based at least partly on the first-type sensor data, cause the message to be transmitted from the apparatus to a server external to the apparatus, cause receiving in the apparatus from the server a response to the message as a machine readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and derive an estimated activity type, using the reference data specific to the context, based at least partly on sensor data by comparing the sensor data to the reference data specific to the context.
2. The apparatus according to claim 1, wherein the machine readable instruction comprises at least one of the following: an executable program and an executable script.
3. The apparatus according to claim 1 or 2, wherein the at least one processing core is configured to derive the estimated activity type at least in part by comparing, using the machine readable instruction, the first-type sensor data, or a processed form of the first-type sensor data, to reference data.
4. The apparatus according to any of claims 1-3, wherein the first-type sensor data comprises acceleration sensor data.
5. The apparatus according to any of claims 1-4, wherein the memory is further configured to store second-type sensor data, and wherein the at least one processing core is configured to derive the estimated activity type, using the machine readable instruction, based at least in part on the second-type sensor data.
6. The apparatus according to claim 5, wherein the second-type sensor data is of a different type than the first-type sensor data.
7. The apparatus according to any of claims 5-6, wherein the second-type sensor data comprises at least one of sound sensor data, microphone-derived data and vibration sensor data.
8. The apparatus according to any of claims 5-7, wherein the at least one processing core is configured to derive the estimated activity type at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type.
9. The apparatus according to any of claims 1-8, wherein the at least one processing core is configured to present the estimated activity type to a user for verification.
10. The apparatus according to any of claims 1-9, wherein the at least one processing core is configured to cause the memory to store, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type.
11. The apparatus according to any of claims 1-10, wherein the at least one processing core is configured to cause the memory to delete the machine readable instruction responsive to a determination that an activity session has ended.
12. A server comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the server at least to: receive a message from a user device, the message comprising information characterizing first-type sensor data relating to an activity; determine, based at least partly on the first-type sensor data, an activity context, and transmit to the user device a response comprising a machine-readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and wherein the machine-readable instruction is configured to cause activity type determination in the activity context.
13. A method carried out at an apparatus for identification of user activity, comprising: storing, in an apparatus, first-type sensor data relating to an activity; compiling a message based at least partly on the first-type sensor data; causing the message to be transmitted from the apparatus to a server external to the apparatus; causing receiving in the apparatus from the server a response to the message as a machine readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
14. The method according to claim 13, wherein the machine readable instruction comprises at least one of the following: an executable program and an executable script.
15. The method according to claim 13 or 14, wherein the estimated activity type is derived at least in part by comparing, using the machine readable instruction, the first-type sensor data, or a processed form of the first-type sensor data, to reference data.
16. The method according to any of claims 13-15, wherein the first-type sensor data comprises acceleration sensor data.
17. The method according to any of claims 13-16, further comprising storing second-type sensor data and wherein the estimated activity type is derived, using the machine readable instruction, based at least in part on the second-type sensor data.
18. The method according to claim 17, wherein the second-type sensor data is of a different type than the first-type sensor data.
19. The method according to any of claims 17-18, wherein the second-type sensor data comprises at least one of: sound sensor data, microphone-derived data and vibration sensor data.
20. The method according to any of claims 17-19, wherein the estimated activity type is derived at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type.
21. The method according to any of claims 13-20, further comprising presenting the estimated activity type to a user for verification.
22. The method according to any of claims 13-21, further comprising storing, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type.
23. The method according to any of claims 13-22, further comprising deleting the machine readable instruction responsive to a determination that an activity session has ended.
24. A method carried out at a server, comprising: receiving a message from a user device, the message comprising information characterizing first-type sensor data relating to an activity; determining, based at least partly on the first-type sensor data, an activity context, and transmitting to the user device a machine-readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and wherein the machine-readable instruction is configured to cause activity type determination in the activity context.
26. A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus for identification of user activity to at least: store first-type sensor data relating to an activity; compile a message based at least partly on the first-type sensor data; cause the message to be transmitted from the apparatus to a server external to the apparatus; cause receiving in the apparatus from the server a response to the message as a machine readable instruction comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the apparatus is operating, and derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
27. A computer program configured to cause a method in accordance with at least one of claims 13-24 to be performed.
GB2004037.4A 2015-12-21 2015-12-21 Sensor Based context management Active GB2579998B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2004037.4A GB2579998B (en) 2015-12-21 2015-12-21 Sensor Based context management

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1522525.3A GB2545668B (en) 2015-12-21 2015-12-21 Sensor based context management
GB2004037.4A GB2579998B (en) 2015-12-21 2015-12-21 Sensor Based context management

Publications (3)

Publication Number Publication Date
GB202004037D0 GB202004037D0 (en) 2020-05-06
GB2579998A true GB2579998A (en) 2020-07-08
GB2579998B GB2579998B (en) 2021-02-10

Family

ID=70546831

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2004037.4A Active GB2579998B (en) 2015-12-21 2015-12-21 Sensor Based context management

Country Status (1)

Country Link
GB (1) GB2579998B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011123932A1 (en) * 2010-04-06 2011-10-13 Nelson Greenberg Virtual exerciser device
US20150326709A1 (en) * 2013-06-28 2015-11-12 Facebook, Inc. User Activity Tracking System


Also Published As

Publication number Publication date
GB2579998B (en) 2021-02-10
GB202004037D0 (en) 2020-05-06

Similar Documents

Publication Publication Date Title
US11607144B2 (en) Sensor based context management
US10433768B2 (en) Activity intensity level determination
US10373059B2 (en) Thematic mag based activity type prediction
US10327673B2 (en) Activity intensity level determination
US10288443B2 (en) Thematic map based route optimization
US10856776B2 (en) Activity intensity level determination
CN103189758B (en) Based on the environment sensing of audio frequency
US9143920B2 (en) Fine grain position data collection
WO2012035190A1 (en) Method and apparatus for maintaining access pont information
WO2013048542A1 (en) Flexible architecture for location based crowdsourcing of contextual data
CN105144693A (en) Always-on camera sampling strategies
US20170272902A1 (en) Handling sensor information
US11210299B2 (en) Apparatus and method for presenting thematic maps
US11587484B2 (en) Method for controlling a display
US20190142307A1 (en) Sensor data management
GB2579998A (en) Sensor Based context management
US11215457B2 (en) Thematic map based route optimization
FI129844B (en) Method for controlling a display
FI129882B (en) Sensor data management
GB2544982A (en) Thematic map based route optimization
FI127825B (en) Thematic map based route optimization
FI126908B (en) Activity type prediction based on thematic map

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20201210 AND 20201216

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40028122

Country of ref document: HK

732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20220908 AND 20220914