WO2017001848A1 - A sensing system for detecting the behavior of a user - Google Patents
A sensing system for detecting the behavior of a user
- Publication number
- WO2017001848A1 (PCT application PCT/GB2016/051952)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensing device
- user
- mobile device
- sensor
- information
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
- A61B5/6815—Ear
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7246—Details of waveform analysis using correlation, e.g. template matching or determination of similarity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7275—Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1112—Global tracking of patients, e.g. by using GPS
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/112—Gait analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4088—Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4884—Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
Abstract
A system for detecting a behavior of a user, the system comprising at least one wearable sensing device and a mobile device wherein the at least one sensing device is arranged to communicate with the mobile device.
Description
A SENSING SYSTEM FOR DETECTING THE BEHAVIOR OF A USER
The present disclosure relates to a system for detecting challenging behaviors, enabling a customizable learning experience, and providing context-aware, intelligent daily assistance for people with learning disabilities.
In addition to general health problems, people with learning disabilities are known to have a higher incidence of dementia, respiratory diseases, gastrointestinal cancer, ADHD/hyperkinesis and conduct disorders, epilepsy, physical and sensory impairments, dysphagia, and poor oral health, and tend to be prone to injuries, accidents and falls. Often, due to a lack of expressive skills, people with learning disabilities are more likely to have undiagnosed long-term conditions, which leads to a high risk of premature death. With the aim of improving the care of people with learning disabilities, a new wearable sensing system is provided which comprises a new miniaturized wearable sensor, for example one that may be worn either as a wrist-worn or ear-worn sensor, and a seamlessly integrated mobile app that adapts to individual care needs.
Aspects of the present disclosure are set forth in the accompanying independent claims. Optional features of embodiments are set out in the dependent claims. Optional features of embodiments of the disclosure are also set out in the following clauses:
Clause 1. A system, which comprises one or more wearable sensing devices and a mobile device/computer, for providing a personalized learning experience and/or assisting/monitoring the daily activities of a user with learning disabilities. For example, the user may be a child.
Clause 2. A system, according to clause 1, wherein the wearable sensing devices can be worn on or attached to any part of the body and comprise a combination of different sensors, which may include motion sensors (e.g. accelerometers, gyroscopes), physiological sensors (e.g. heart rate, galvanic skin response), position sensors (e.g. GPS, altitude), and/or ambient sensors (e.g. temperature, humidity). The devices have wireless connectivity, memory storage (for storing the sensing data), and computing capabilities for detecting the relevant movements, behaviors, and activities. The sensing information is transmitted and displayed via the mobile device/computer.
Clause 3. A system, according to clause 1 or 2, wherein the wearable devices can capture the normal and challenging behaviors of people with learning disabilities, and behavior profiles of the user can be derived to identify precursors of challenging behaviors. Such information is provided to the teachers/carers/parents/therapists, enabling early intervention.
Clause 4. A system, according to any of clauses 1-3, wherein the wearable sensor can capture the gait and activity patterns of the user based on the acceleration patterns. This information is used in the software application to encourage an active and healthy lifestyle.
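The acceleration-based gait capture referred to in Clause 4 can be illustrated with a minimal step-detection sketch. The threshold-crossing approach, the 1.2 g threshold, and the synthetic samples below are illustrative assumptions, not details taken from the disclosure.

```python
# Minimal step-detection sketch over 3-axis accelerometer samples.
# The 1.2 g threshold is an illustrative assumption.
import math

def count_steps(samples, threshold=1.2):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold (in g). `samples` is a list of (x, y, z) tuples."""
    steps = 0
    above = False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps

# Two synthetic "steps": magnitude spikes above 1.2 g separated by rest
walk = [(0, 0, 1.0), (0, 0, 1.5), (0, 0, 1.0), (0, 0, 1.6), (0, 0, 1.0)]
# count_steps(walk) -> 2
```

A real implementation would also filter out gravity and require a plausible inter-step interval; this sketch only shows the thresholding idea.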
Clause 5. A system, according to any preceding clause, wherein the software application on the smartphone/computer allows health professionals, teachers, carers, and parents to create their own health education/monitoring Apps.
Clause 6. A system, according to any preceding clause, wherein the software uses the built-in or attached camera of the computer/smartphone to capture a drawing or photo. A visual timetable, a matching game, a sorting task or an information arrangement game can be created by dragging and dropping the visual objects in the captured image.
Clause 7. A system, according to any preceding clause, wherein the software is able to analyse the sensor data to determine how closely a visual timetable is followed (e.g. based on the sensor data/activities detected in Clauses 2 and 3).
Clause 8. A system, according to any preceding clause, which can interface with other location sensing devices, such as GPS and iBeacon (Bluetooth Low Energy (BLE) beacons), to identify the location of the user.
Clause 9. A system, according to clause 8, which can use the location information to capture challenging behaviors, such as wandering around the corridor at night.
Clause 10. A system, according to clause 8 or 9, which can use the location information to create interactive games to help people with learning difficulties (LD) learn and carry out daily activities. For example, ordinary objects, such as a fridge, can be tagged with an iBeacon, and the interaction with the fridge can be used as a point in the game.
Specific embodiments are described below by way of example only and with reference to the accompanying drawings in which:
Figure 1 is a schematic diagram of the system;
Figure 2 illustrates the wearable sensor device design;
Figure 3 shows how the user can draw visual assets for the app using pen and paper;
Figure 4 illustrates how a user can use the camera of the smartphone/mobile device to take a photo of the drawing of the visual assets;
Figure 5 depicts how a user can add logic to the app by cutting out (by drawing a rectangle around the visual asset) and drag-and-drop the captured visual assets (i.e. the photo of the drawing) to form a puzzle game;
Figure 6 demonstrates how a user can test and use the app, where the cut out visual assets can be dragged-and-dropped to void spaces, similar to a puzzle game;
Figure 7 illustrates some example health Apps created.
System Design
Devices for people with learning disabilities need to be robust, lightweight, easy to use, and resilient against potential misuse. The new sensing device 2 is designed based on Bluetooth Low Energy (BLE) and integrated with inertial, physiological, ambient, and position sensors. With BLE, the new device can seamlessly be integrated with mobile devices. Figure 1 depicts the design of the system.
As shown in Figure 1, the miniaturised sensing platform can be packaged as a wrist-worn or ear-worn sensor 2 for the user (e.g. a person with a learning disability, for example a child). The sensor will be integrated with a smartphone/mobile device 3 via the wireless (BLE) link 6. Apart from acting as a data aggregator, the smartphone/mobile device 3 will also work as a gateway to forward the information gathered to the cloud server 4 for behavioural profiling and long-term trend analysis. The carer/parent/teacher/therapist, on the other hand, can use the associated mobile app 5 to monitor the progress of the user and introduce events/instructions/games to teach and help the user learn to carry out daily tasks. In addition, by using the BLE link 6, the wearable device will be able to integrate with ambient sensors 1 and enable indoor localisation, as shown in Figure 1.
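The gateway role of the mobile device 3 can be sketched as a buffer that collects readings arriving over the BLE link 6 and forwards them in batches to the cloud server 4. The batch size, the reading format, and the `upload` callback below are illustrative assumptions; a real app would use a platform BLE stack and an HTTPS client.

```python
# Sketch of the data-aggregator/gateway role of the mobile device 3.
# `upload` stands in for a cloud transport; all names are illustrative.

class Gateway:
    def __init__(self, upload, batch_size=4):
        self.upload = upload          # callable taking a list of readings
        self.batch_size = batch_size
        self.buffer = []

    def on_ble_notification(self, reading):
        """Called for each reading received from the wearable sensor 2;
        forwards a batch to the cloud once enough readings accumulate."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            self.upload(self.buffer)
            self.buffer = []

sent = []
gw = Gateway(upload=sent.append, batch_size=2)
for r in [{"hr": 72}, {"hr": 75}, {"hr": 74}]:
    gw.on_ble_notification(r)
# sent now holds one batch of two readings; one reading remains buffered
```

Batching keeps the radio and network usage low on the phone, which matters for an always-on monitoring system.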
Wearable Sensing Device
Figure 2 illustrates the main components of the wearable device 2.
As shown in Figure 2, the device 2 mainly consists of the following:
• An ultra-low power microcontroller 7
• Wireless (BLE) transceiver 8
• A rechargeable battery 9
• A battery charging circuit 10
o A battery charging circuit 10 is designed into the sensor node to protect the battery 9 and enable safe battery charging. The user can charge the battery 9 by plugging it into a conventional 5V charger.
• LEDs /vibration motor 11
o To provide feedback to the user, the device 2 is embedded with a few bright LEDs and/or a vibration motor.
• Touch sensor control 12
o A touch sensor will be incorporated in the sensor node as the control interface; as such, the user can control the device 2 by simply touching it.
• Sensors and Interface 13
o To detect challenging behaviors, different sensors, such as inertial sensors and temperature sensors, may be required, and various interfaces are incorporated in the device 2 to facilitate sensor integration.
o Different sensors, such as motion sensors (inertial sensors), physiological sensors (heart rate), ambient (temperature), positional (GPS) sensors, etc., can be integrated into the device. By default, each device 2 consists of an inertial measurement unit, a humidity sensor, and a skin temperature sensor.
• On-node memory 14
o A low power flash memory is integrated into the device 2 for data storage.
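Samples from the default sensor set (IMU, humidity, skin temperature) could be stored in the on-node flash memory 14 as fixed-size binary records. The field layout below is an illustrative assumption for the sketch, not the device's actual storage format.

```python
# Illustrative fixed-size sample record for the device 2: timestamp,
# IMU axes in milli-g, and humidity/temperature scaled by 10 to fit
# 16-bit integers. The layout is an assumption, not the real format.
import struct

RECORD = "<Ihhhhh"  # t(ms), ax, ay, az (milli-g), humidity*10 (%), temp*10 (C)

def pack_sample(t_ms, ax, ay, az, humidity, temp_c):
    return struct.pack(RECORD, t_ms, ax, ay, az,
                       round(humidity * 10), round(temp_c * 10))

def unpack_sample(buf):
    t, ax, ay, az, h, tc = struct.unpack(RECORD, buf)
    return t, ax, ay, az, h / 10, tc / 10

buf = pack_sample(1000, 12, -5, 980, 45.5, 31.2)
# unpack_sample(buf) recovers (1000, 12, -5, 980, 45.5, 31.2); len(buf) == 14
```

A compact fixed-size record like this makes it easy to append samples to low-power flash and to stream them over BLE later.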
Customisable Health /Education Application
A software application (e.g. a mobile App) is designed and created to allow health professionals, teachers, carers, and parents to create their own health education/monitoring Apps. The programme operates with simple drag-and-drop actions. Within a 3-20 minute timeframe, a custom app can be created to help a person with a particular health need. The following steps describe the process of creating the custom app:
Step (1) - Draw visual assets for the app using pen and paper (Figure 3)
The software application allows the user to create their own healthcare app using pen and paper. Apps can take the form of a visual timetable (as used widely in special needs schools), a matching game, a sorting task or an information arrangement game. Example apps are provided on the home screen to give the user an understanding of the types of apps that can be created. In the case of a tooth-brushing app, the user can draw a puzzle that teaches the key aspects of dental hygiene.
Step (2) - Take a photo of the assets using the in-software viewfinder (Figure 4)
The software features a virtual viewfinder with which the user can align the paper-based design and take a photo to import the assets into the software. Assets are imported at HD resolution to take advantage of the high resolution screens on modern tablet computers.
Step (3) - Add logic to the app to define user interaction (Figure 5)
The user adds logic to their application by delineating rectangular tiles on the visual assets and repositioning them to new locations. The software stores the original and final location of each tile, enabling the software to determine whether a tile has been moved to the correct location.
Step (4) - Test or use the app (Figure 6)
Tapping on the play button allows the user to test their app. The app enters a full-screen mode to reduce visual distraction. The user must rearrange the tiles to their correct positions. For example, in the hand-washing app, the user must put the steps involved with hand-washing in the correct order. Placing tiles in the incorrect order/position causes the tile position to be reset.
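The tile logic of Steps (3) and (4) can be sketched as follows: each tile remembers its correct slot, a drop is accepted only when it matches, and a wrong drop resets the tile, as in Figure 6. The class and the hand-washing slot names are illustrative.

```python
# Sketch of the drag-and-drop tile logic: the software records each
# tile's correct position, accepts a drop only when it matches, and
# resets the tile otherwise. Names are illustrative assumptions.

class Tile:
    def __init__(self, name, correct_slot):
        self.name = name
        self.correct_slot = correct_slot
        self.placed_slot = None   # None = still in the tray

    def drop_on(self, slot):
        """Return True and keep the tile if the slot is correct,
        otherwise reset the tile to the tray."""
        if slot == self.correct_slot:
            self.placed_slot = slot
            return True
        self.placed_slot = None
        return False

def puzzle_solved(tiles):
    return all(t.placed_slot == t.correct_slot for t in tiles)

steps = [Tile("wet hands", 0), Tile("apply soap", 1), Tile("rinse", 2)]
steps[0].drop_on(1)   # wrong slot: the tile resets to the tray
steps[0].drop_on(0)   # correct slot: the tile stays in place
```

Storing only (correct slot, current slot) per tile is what lets the same engine drive visual timetables, matching games and sorting tasks alike.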
Step (5) - Connect the app to wearable and ambient sensors
The wearable sensor devices can be connected to each tile of the app. This can be used for creating interactive visual timetables of health-related activities. For example, the tooth-brushing sensor event can be attached to the tooth-brushing visual asset in a morning routine app.
Step (6) - Provide monitoring of health activities
The software is able to analyse the sensor data to determine how closely a visual timetable is followed. In the case of the tooth-brushing app, it is possible to determine the duration of brushing and whether a suitable brushing action was used.
Example Health Apps
In addition to allowing users to create their own custom applications, a library of example health-related apps has been created in the system (Figure 7). These include: a tooth-brushing app that teaches the theory of dental hygiene and tests the person's performance in practice; a hand-washing app that teaches the order in which the different aspects of hand-washing should be performed; an obesity awareness app that allows the person to identify healthy and unhealthy foods; a morning routine app that encourages the person to wash their hands before eating, to eat a healthy breakfast and to brush their teeth after eating; and an exercise app that tracks the person's level of activity. The person may, for example, be a child.
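One way the tooth-brushing app's performance monitoring could work is to count time windows of vigorous wrist motion in the accelerometer stream. This is a sketch under stated assumptions, not the patented analysis: the 50 Hz sample rate, window size and energy threshold are all illustrative values.

```python
import math


def brushing_duration(samples, rate_hz=50, window=50, threshold=0.5):
    """Estimate seconds of active brushing from wrist accelerometer data.

    samples: list of (ax, ay, az) tuples in units of g. A window counts
    as "brushing" when the mean deviation of the acceleration magnitude
    from 1 g (gravity at rest) exceeds `threshold` - a rough proxy for
    the vigorous periodic motion of a brushing action.
    """
    active_windows = 0
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        energy = sum(abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)
                     for ax, ay, az in chunk) / window
        if energy > threshold:
            active_windows += 1
    return active_windows * window / rate_hz
```

A real classifier would also look at the frequency content of the motion to judge whether a suitable brushing action was used; this sketch only recovers the duration component.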
Location-aware interactive learning
With the BLE transceiver 8 and the interface with the mobile device/smartphone/computer 3, the wearable device 2 can detect iBeacon 1 signals (i.e. other wireless transceivers in proximity) and GPS signals to identify the location of the user. With the location information, interactive learning tools and games can be developed using the software application. For instance, from the location information, the sensor can detect whether the user has been to the bathroom in the morning to brush his/her teeth, or whether the person is wandering around the corridor at night. Such information can be fed to carers/teachers/parents/therapists so that they can design targeted training game/exercise apps using the software application (mobile app). Furthermore, the location information can also be used to help the person navigate. For instance, the wearable device 2 will show a message or vibrate if the person should get off at the next bus stop when he/she is travelling to school.
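Detecting which room the wearer is in from iBeacon signals could be sketched with the standard log-distance path-loss model. This is an assumed approach, not detail from the patent: `tx_power` is the beacon's calibrated RSSI at 1 m, and the path-loss exponent `n` is an assumed indoor value.

```python
def estimate_distance(rssi, tx_power=-59, n=2.0):
    """Rough distance in metres from a single RSSI reading, using the
    log-distance path-loss model: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power - rssi) / (10 * n))


def nearest_beacon(readings):
    """readings: dict mapping a beacon's room name (e.g. 'bathroom')
    to its observed RSSI in dBm. Returns the room whose beacon the
    wearer is estimated to be closest to."""
    return min(readings, key=lambda name: estimate_distance(readings[name]))
```

Single RSSI readings are noisy in practice, so a deployed system would smooth over several readings before deciding, for example, that the wearer has entered the bathroom.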
Through integration with the iBeacon 1 and other ambient sensors 1, the system enables the development of interactive games that help people with learning disabilities learn, or remind them, to carry out daily activities. In other words, the living environment can be converted into a gaming environment. Every object (such as a washing machine) can be tagged, the interaction with each object can be captured, and each interaction can be counted as a scoring point in a game.
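The "environment as game" idea above can be reduced to a very small scoring sketch. The tagged objects and point values here are hypothetical examples, not part of the patent:

```python
# Hypothetical tagged objects and the points awarded for interacting
# with each one (values are illustrative only).
TAGGED_OBJECTS = {
    "toothbrush": 10,
    "soap_dispenser": 5,
    "washing_machine": 15,
}


def score_interactions(events):
    """events: iterable of tagged-object names detected by the ambient
    sensors, in the order the wearer interacted with them. Objects the
    game does not know about score nothing."""
    return sum(TAGGED_OBJECTS.get(obj, 0) for obj in events)
```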
It will be understood that the above description is of specific embodiments by way of example only and that many modifications and alterations will be within the skilled person's reach and are intended to be covered by the scope of the appended claims.
Claims
1. A system for detecting a behavior of a user, the system comprising at least one wearable sensing device and a mobile device wherein the at least one sensing device is arranged to communicate with the mobile device.
2. A system according to claim 1, wherein the at least one wearable sensing device can be worn on and/or attached to any part of the user's body.
3. A system according to claim 1 or 2, wherein the at least one sensing device comprises at least one of: a motion sensor, for example an accelerometer and/or gyroscope; a physiological sensor, for example a heart rate and/or galvanic skin response sensor; a position sensor, for example a GPS and/or an altitude sensor; and/or an ambient sensor, for example a temperature and/or a humidity sensor.
4. A system according to any preceding claim, wherein the at least one sensing device and the mobile device communicate via a wireless communication means.
5. A system according to any preceding claim, wherein the system further comprises at least one location sensing device arranged to communicate with the mobile device wherein the at least one location sensing device is arranged to determine the location of the user.
6. A system according to claim 5, wherein the at least one location sensing device is a GPS and/or Bluetooth beacon.
7. A system according to any preceding claim, wherein the mobile device comprises a processor for determining a behavior of a user based on information received from the at least one sensing device and/or the at least one or a location sensing device.
8. A system according to any preceding claim, wherein the mobile device is arranged to communicate with an external computing device.
9. A system according to claim 8, wherein the external computing device is arranged to determine a behavior of a user.
10. A system according to claim 7 or 9, wherein a behavior is detected by comparing information with known sensing patterns indicative of the behavior, wherein the information is received from the at least one sensing device and/or the at least one or a location sensing device.
11. A system according to any preceding claim, wherein the mobile device comprises memory storage for storing the information received from the at least one sensing device and/or the at least one or a location sensing device.
12. A system according to any preceding claim, wherein the mobile device comprises a display unit for displaying information relating to the information received from the at least one sensing device and/or the at least one or a location sensing device and/or the or a behavior detected.
13. A system according to any preceding claim, wherein information received from the at least one sensing device and/or the at least one or a location sensing device may be used to identify precursors of a given behavior.
14. A system according to any preceding claim, wherein the at least one wearable sensor comprises a gait sensor arranged to capture the gait of the user.
15. A system according to claim 14, wherein the gait sensor comprises an accelerometer.
16. A system according to any preceding claim, wherein the mobile device comprises functionality to enable a user to create a health education and/or monitoring application which may be run on the mobile device.
17. A system according to claim 16, wherein the application may comprise a visual timetable, a game, a sorting task and/or an information arrangement game.
18. A system according to claim 16 or 17, wherein the mobile device comprises a camera and wherein information obtained from the camera is used as an input to create the application.
19. A system according to any of claims 16-18, wherein information obtained from the at least one sensing device and/or the at least one or a location sensing device is used as an input to the application.
20. A system according to claim 19, wherein the application is arranged to display information on the mobile device dependent on the information received from the at least one sensing device and/or the at least one or a location sensing device.
21. A computer program product comprising coded instructions which, when run on a processor, implement a system according to any preceding claim.
22. A tangible computer readable medium comprising a computer program product according to claim 21.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB1511483.8A GB201511483D0 (en) | 2015-06-30 | 2015-06-30 | A sensing system |
GB1511483.8 | 2015-06-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017001848A1 true WO2017001848A1 (en) | 2017-01-05 |
Family
ID=53872463
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2016/051952 WO2017001848A1 (en) | 2015-06-30 | 2016-06-29 | A sensing system for detecting the behavior of a user |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB201511483D0 (en) |
WO (1) | WO2017001848A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107625518A (en) * | 2017-10-16 | 2018-01-26 | 朱彦臻 | A soothing educational device for use by the elderly, and a method of using the device |
CN111436941A (en) * | 2020-03-23 | 2020-07-24 | 广东艾诗凯奇智能科技有限公司 | Reminding system and method for potential disease prevention and server |
CN111436941B (en) * | 2020-03-23 | 2021-12-07 | 未来穿戴技术有限公司 | Reminding system and method for potential disease prevention and server |
US11315681B2 (en) | 2015-10-07 | 2022-04-26 | Smith & Nephew, Inc. | Reduced pressure therapy device operation and authorization monitoring |
US11783943B2 (en) | 2015-10-07 | 2023-10-10 | Smith & Nephew, Inc. | Reduced pressure therapy device operation and authorization monitoring |
US11602461B2 (en) | 2016-05-13 | 2023-03-14 | Smith & Nephew, Inc. | Automatic wound coupling detection in negative pressure wound therapy systems |
US11369730B2 (en) | 2016-09-29 | 2022-06-28 | Smith & Nephew, Inc. | Construction and protection of components in negative pressure wound therapy systems |
US11712508B2 (en) | 2017-07-10 | 2023-08-01 | Smith & Nephew, Inc. | Systems and methods for directly interacting with communications module of wound therapy apparatus |
US11793924B2 (en) | 2018-12-19 | 2023-10-24 | T.J.Smith And Nephew, Limited | Systems and methods for delivering prescribed wound therapy |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130310658A1 (en) * | 2012-04-26 | 2013-11-21 | Nathan W. Ricks | Activity Measurement Systems |
US20140081090A1 (en) * | 2010-06-07 | 2014-03-20 | Affectiva, Inc. | Provision of atypical brain activity alerts |
US20140085050A1 (en) * | 2012-09-25 | 2014-03-27 | Aliphcom | Validation of biometric identification used to authenticate identity of a user of wearable sensors |
- 2015-06-30: GB application GBGB1511483.8A, publication GB201511483D0 (not active, ceased)
- 2016-06-29: PCT application PCT/GB2016/051952, publication WO2017001848A1 (active, application filing)
Also Published As
Publication number | Publication date |
---|---|
GB201511483D0 (en) | 2015-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017001848A1 (en) | A sensing system for detecting the behavior of a user | |
US20220291820A1 (en) | Sedentary Notification Management System for Portable Biometric Devices | |
JP6934929B2 (en) | Multi-function smart mobility aid and how to use | |
Hamm et al. | Fall prevention intervention technologies: A conceptual framework and survey of the state of the art | |
US10376739B2 (en) | Balance testing and training system and method | |
US9311789B1 (en) | Systems and methods for sensorimotor rehabilitation | |
US20170243508A1 (en) | Generation of sedentary time information by activity tracking device | |
Bharucha et al. | Intelligent assistive technology applications to dementia care: current capabilities, limitations, and future challenges | |
Ejupi et al. | A kinect and inertial sensor-based system for the self-assessment of fall risk: A home-based study in older people | |
JP2015058096A (en) | Exercise support device, exercise support method, and exercise support program | |
US10080530B2 (en) | Periodic inactivity alerts and achievement messages | |
US20130130213A1 (en) | Activity monitor and analyzer with voice direction for exercise | |
KR101367801B1 (en) | Finger exercising apparatus and method for assisting exercise of finger | |
CN108272436A (en) | The interdependent user interface management of unit state | |
US20170239523A1 (en) | Live presentation of detailed activity captured by activity tracking device | |
CN203084647U (en) | Human motion information interaction and display system | |
Yared et al. | Ambient technology to assist elderly people in indoor risks | |
CN106779614A (en) | Study monitoring method, device and wearable device for wearable device | |
Hänsel et al. | Wearable computing for health and fitness: exploring the relationship between data and human behaviour | |
Nawaz et al. | Designing smart home technology for fall prevention in older people | |
US20170140662A1 (en) | Wearable computing device for youth and developmentally disabled | |
KR20160065601A (en) | Exercise therapy rehabilitation system for developmental disabilities | |
Vogiatzaki et al. | Maintaining mental wellbeing of elderly at home | |
Subramonyam | SIGCHI: magic mirror-embodied interactions for the quantified self | |
US20200297265A1 (en) | Screening for and monitoring a condition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16742381 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16742381 Country of ref document: EP Kind code of ref document: A1 |