WO2017016941A1 - Wearable device, method and computer program product - Google Patents

Wearable device, method and computer program product Download PDF

Info

Publication number
WO2017016941A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
user
wearer
support information
activity
Prior art date
Application number
PCT/EP2016/067224
Other languages
French (fr)
Inventor
Radu Serban Jasinschi
Murtaza Bulut
Caifeng Shan
Ronaldus Maria Aarts
Original Assignee
Koninklijke Philips N.V.
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2017016941A1 publication Critical patent/WO2017016941A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/486 Bio-feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G09B19/0038 Sports
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02 Operational features
    • A61B2560/0242 Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076 Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4088 Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/7445 Display arrangements, e.g. multiple display units

Definitions

  • the present invention relates to a wearable device comprising at least one feedback module and a processor for controlling the at least one feedback module to provide support information with respect to a first activity that a wearer of the wearable device intends to perform.
  • the present invention further relates to a method for providing support information on a wearable device.
  • the present invention further relates to a computer program product for implementing such a method when executed on a processor of such a wearable device.
  • Wearable devices are a new class of electronic systems that can provide data acquisition through a variety of unobtrusive sensors that may be worn by a user.
  • the sensors gather information, for example, about the environment, the user's activity, or the user's health status.
  • there are challenges related to the coordination, computation, communication, privacy, security, and presentation of the collected data.
  • analysis of the data is needed to make the data gathered by the sensors useful and relevant to end-users.
  • additional sources of information may be used to supplement the data gathered by the sensors.
  • the many challenges that wearable technology presents require new designs in hardware and software.
  • current wearable devices for supporting cognitively impaired persons are systems based on single sensors and/or operations.
  • GPS-enabled insoles may allow people with early signs of dementia, such as Alzheimer's disease (AD), to be tracked during their (outdoor) walks based on a GPS chip embedded in the insoles of a shoe.
  • Philips Lifeline® Help button: this system consists of a small pendant worn around the user's neck with a button that is activated if he/she, e.g., falls inside the home; this information is transmitted to a service center which provides assistance to him/her as quickly as possible.
  • this personal help button is merely a device for use in emergencies; it does not assist the wearer in daily activities and is not suitable for AD patients.
  • the major problem for AD patients is that they are heavily dependent on others and cannot perform a task independently. This situation puts an enormous burden on caregivers, who have to be attentive 100% of the time, resulting in caregiver burnout.
  • giving the patient tools to perform certain activities can also be beneficial, as this can increase their quality of life and perhaps slow down the progression of the disease.
  • a wearable device for a cognitively impaired user is provided, comprising at least one feedback module for providing feedback information to the cognitively impaired user; one or more sensors for sensing data from the cognitively impaired user; and a processor adapted to initialize the feedback module; identify a first movement of the cognitively impaired user based on the data sensed by the one or more sensors; and control the at least one feedback module to provide support information for supporting a second movement that the cognitively impaired user of the wearable device intends to perform subsequently after the first movement; wherein the support information for supporting the second movement is determined based on the identified first movement of the cognitively impaired user.
  • the present invention is based on the idea that, using a wearable device with smart sensing, computation, and user feedback, the video/text support information will be displayed on a display module of the wearable device and/or the audio support information will be provided by a speaker, as an assistive tool for the cognitively impaired user.
  • This wearable device has the ability to determine and guide the execution of tasks by cognitively impaired people.
  • the sensing components may include, e.g., a light (infrared) sensor or radar to determine the appearance, shape, and distance of objects in a 3-D scene, and a sound sensor to determine audio patterns produced by the wearer and/or in the environment.
  • the computation may use a combination of the sensory data to identify the current movement or current activity of the cognitively impaired user and to determine step-by-step task-execution support information for the cognitively impaired user.
  • the wearer may get feedback from the wearable device indicating whether a task or activity executed by the cognitively impaired user, which may involve a specific movement such as holding a cup, was performed correctly or not.
  • the system may provide video/audio guidance information to support the wearer to correct such mistakes.
  • the wearer may perform a task independently without any support from caregivers.
  • the feedback module is a display module; and the processor is adapted to initialize the display module; control the at least one display module to provide visual support information to the cognitively impaired user.
  • the feedback module can be a speaker for providing audio support information to the cognitively impaired user.
  • the one or more sensors comprise at least one of an onward imaging sensor for capturing one or more images indicative of the current view of the wearer and identifying one or more objects in the captured one or more images; an audio sensor for capturing sounds present in the environment or in the proximity of the user; a temperature sensor for measuring the temperature of the wearer; a light sensor for measuring the light in the environment or in the proximity of the wearer; and a motion sensor, such as a gyroscope, or an accelerometer, for identifying the current physical activity of the wearer.
  • the processor is adapted to identify the first movement of the cognitively impaired user by evaluating the changes of each type of sensed data based on predefined rules comprising thresholds for each type of sensed data.
  • evaluating each type of sensed data allows the current activity or movement performed by the cognitively impaired user to be analyzed more reliably and predictably, by evaluating the changes of each type of sensed data based on predefined rules comprising thresholds for each type of sensed data; for example, rules for evaluating whether the changes of all types of sensed data have reached the corresponding thresholds, which indicate which activity or movement the cognitively impaired user is currently performing, how the activity or movement is performed (e.g. whether the performed activity or movement meets one or more predefined thresholds or not), or whether the cognitively impaired user has performed an activity or movement improperly and would have to redo the improperly performed activity or movement, etc.
  • the rules can be defined by learning from sensed data.
  • the wearable device will be trained prior to deployment with all types of sensed data as mentioned above that are derived from the cognitively impaired user. Alternatively, representative data derived from other subjects can be used. Later, when the personal data is available (i.e. when the cognitively impaired user starts to use the wearable device), the training models will be adapted accordingly.
  • the wearable device may be trained by Deep Learning (DL) algorithms. This is because a DL algorithm can provide reasonably good performance on each type of sensed data and can continuously and automatically learn when new sensed data becomes available. It is understood that other types of machine learning methods may be used as well, such as Random Forest classifiers. Examples of relevant machine learning algorithms can be found, for instance, in Kevin P. Murphy, Machine Learning: A Probabilistic Perspective.
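  • As a minimal sketch of this train-then-personalize idea (not part of the patent; the library choice, feature layout and the weighting of personal samples are illustrative assumptions), a Random Forest classifier could be fitted on representative data from other subjects and later refitted once the wearer's own data becomes available:
```python
# Hypothetical sketch: train an activity classifier on representative data
# from other subjects, then adapt it once the wearer's own data is available.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_initial_model(generic_features, generic_labels):
    # Deployment-time model trained on data from representative subjects.
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(generic_features, generic_labels)
    return model

def personalize(model, generic_features, generic_labels,
                personal_features, personal_labels):
    # Simplest adaptation: refit on the combined pool, weighting the
    # wearer's own samples more heavily than the generic ones.
    X = np.vstack([generic_features, personal_features])
    y = np.concatenate([generic_labels, personal_labels])
    weights = np.concatenate([np.ones(len(generic_labels)),
                              5.0 * np.ones(len(personal_labels))])
    model.fit(X, y, sample_weight=weights)
    return model
```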
  • the threshold of the temperature sensor data for determining whether the user is performing movement A may be between 36.2 °C and 36.6 °C; the threshold of the audio sensor data for determining whether the user is performing movement A is to verify whether at least the keyword "XX" or the equivalent keyword "XY" is detected or intercepted in the audio/acoustic data; the threshold of the imaging sensor for determining whether the user is performing movement A is to verify whether at least object "M" is detected in the captured images or video data, etc.
  • a more reliable decision regarding whether the user is performing movement A can be made if the thresholds for all types of sensed data are met.
  • the classifiers can be better trained. It is appreciated that, in simple situations, individual sensor data can be used to determine thresholds for such individual sensor data.
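  • As a minimal, hypothetical sketch of the rule evaluation described above (the per-sensor rules and values merely mirror the illustrative example for "movement A"; the function names are assumptions), the decision could require every sensor rule to be satisfied simultaneously:
```python
# Hypothetical sketch: combine per-sensor rules with thresholds to decide
# whether the wearer is currently performing "movement A".

def temperature_ok(temp_c, low=36.2, high=36.6):
    # Rule for the temperature sensor: the reading must fall inside the band.
    return low <= temp_c <= high

def keywords_ok(transcript, keywords=("XX", "XY")):
    # Rule for the audio sensor: at least one expected keyword is detected.
    return any(k in transcript for k in keywords)

def objects_ok(detected_objects, required=("M",)):
    # Rule for the imaging sensor: all required objects appear in the frame.
    return all(obj in detected_objects for obj in required)

def is_movement_a(temp_c, transcript, detected_objects):
    # A more reliable decision is made only when every sensor rule is met.
    return (temperature_ok(temp_c)
            and keywords_ok(transcript)
            and objects_ok(detected_objects))

if __name__ == "__main__":
    print(is_movement_a(36.4, "user said XX near the machine", {"M", "cup"}))  # True
```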
  • the processor is adapted to identify the second activity of the wearer based on signals acquired by the one or more sensors.
  • the processor is further adapted to display a list of a plurality of activities that a wearer of the wearable device intends to perform; limit the plurality of activities based on the identified activity that the wearer of the wearable device has performed.
  • the processor is further adapted to rank the plurality of activities intended to be performed by the wearer, and wherein the ranking is based on the second activity that the wearer has performed, or based on a predefined wearer's profile.
  • one or more sensors further comprise a sensor adapted for measuring the wearer's physiological and/or psychological state.
  • the audio signal measured by the microphone can be used to assess the physiological and/or psychological state of the person.
  • emotion and mood of the user can be estimated.
  • the movement data of the user can also be used. It is expected that, depending on the psychological state, the user will move differently while performing a certain movement or activity. For example, when the cognitively impaired user is stressed or angry, one can expect sharper, faster, or rougher movements, and possibly a different ordering (or errors, which can be measured by observing repetitions) while performing a movement or activity.
  • the psychological state is assessed mainly by observing how the user's movements have changed with respect to the way the activity or movement is performed.
  • the physiological and/or psychological state of the cognitively impaired user can be derived from how the user responds to the suggestions provided by the system. When the user is in a good mood, we can expect higher compliance than when the user is in a negative mood. As stated above, if speech data and audio data are available, they can also be analyzed together with the task performance behavior to infer the user's psychological and physiological state.
  • the support information is predefined on a step-by-step basis, and wherein support information related to a next step of activity is provided only if the current step of activity is completely performed by the wearer.
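  • A minimal sketch of this step-by-step gating, assuming an ordered list of predefined steps and a sensor-driven completion check (all names are illustrative, not from the patent):
```python
# Hypothetical sketch: release the instruction for the next step only after
# the current step has been detected as completed by the wearer.

def guide_activity(steps, show_instruction, step_completed, poll):
    """steps: ordered list of step identifiers.
    show_instruction(step): displays/speaks the support information.
    step_completed(step): returns True once sensors confirm completion.
    poll(): waits briefly between sensor checks (e.g. time.sleep)."""
    for step in steps:
        show_instruction(step)
        while not step_completed(step):
            # Keep the current step active; never advance early.
            poll()

if __name__ == "__main__":
    import time
    done = {"find jar": 2, "open jar": 1}  # fake countdowns until "completed"
    def completed(step):
        done[step] -= 1
        return done[step] <= 0
    guide_activity(["find jar", "open jar"],
                   show_instruction=lambda s: print("Instruction:", s),
                   step_completed=completed,
                   poll=lambda: time.sleep(0.01))
```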
  • the processor is further adapted to change the predefined support information in response to the wearer's physiological and/or psychological state.
  • the processor is further adapted to change the predefined support information by one or more of the following options:
  • the speed of providing support information can be adapted based on the wearer's physiological and/or psychological state, so the wearer can better follow the support information, such as instructions.
  • in this way, the user will have the highest chance of understanding and following the support information. For example, when it is detected that the user cannot follow the instruction, e.g. due to confusion or other reasons, it may be preferable to provide such support information in a compact format or in a format that is easy to visualize and follow.
  • the wearable device further comprises a transmitter adapted to receive signals and/or data from one or more devices, wherein the one or more devices are in the environment or in the proximity of the wearer.
  • the wearable device further comprises a speaker
  • the processor is further configured for controlling the speaker to provide support information with respect to the first activity that a wearer of the wearable device intends to perform.
  • a method of displaying information on the wearable device, comprising initializing the feedback module; providing support information with respect to a first activity that a wearer of the wearable device intends to perform; wherein the support information with respect to the first activity is determined based on an identified second activity that the wearer is performing, and wherein the second activity is performed by the wearer prior to the first activity.
  • a computer program product comprising a computer program code for, when executed on the processor of the wearable device according to one or more of the above embodiments, implementing the steps of the method according to one or more of the above embodiments.
  • a computer program product may be made available to the wearable device in any suitable form, e.g. as a software application (app) available in an app store, and may be used to configure the wearable device such that the wearable device may implement the aforementioned method.
  • Fig. 1 schematically depicts a wearable device according to an embodiment of the present invention;
  • Fig. 2 schematically depicts an aspect of the wearable device of Fig. 1 according to an embodiment of the present invention;
  • Fig. 3 schematically depicts an embodiment of the wearable device of Fig. 1;
  • Figs. 4A-4B schematically depict some aspects of an embodiment of the wearable device of Fig. 1 according to an embodiment of the present invention;
  • Fig. 5 shows a flow chart of a method according to an embodiment of the present invention.
  • a wearable device is a device that can be worn on a user and provides the user with computing functionality.
  • the wearable device may be configured to perform specific computing tasks as specified in a software application (app) that may be retrieved from the Internet or another computer-readable medium.
  • the wearable device may be worn on the head of a wearer.
  • Non-limiting examples of such wearable devices include smart headgear, e.g. eyeglasses, goggles, a helmet, a hat, a visor, a headband, a Google Glass, or any other device that can be supported on or from the wearer's head, and so on.
  • the wearable device can be worn on the wrist of the wearer.
  • Non-limiting examples of such wearable devices include smart watch, bandage, etc.
  • a feedback module may be a display module for providing video/text support information to the wearer, and/or a speaker for providing audio/speech support information to the wearer.
  • the wearable device may comprise one or more sensors.
  • the sensors may comprise at least one of an onward imaging sensor for capturing one or more images indicative of the current view of the wearer and identifying one or more objects in the captured one or more images; an audio sensor for capturing the sounds present in the environment or in the proximity of the user; a temperature sensor for measuring the temperature of the wearer; a light sensor for measuring the light in the environment or in the proximity of the wearer, or a motion sensor, such as a gyroscope, or an accelerometer, for identifying the current physical activity of the wearer.
  • the current activity of the wearer can be detected.
  • the above-mentioned one or more sensors may be integral to an external device, such as a smartphone, a tablet, etc., another wearable device, a surveillance device such as an alarm camera mounted in a room, or a vital sign monitoring device such as a device including a vital signs camera used for remote photoplethysmography, which is communicatively coupled via a wired or wireless connection to the wearable device.
  • the data derived from the one or more of sensors may be directly derived from a remote data source such as the Internet/Cloud.
  • FIG. 1 schematically depicts an embodiment of a wearable device 100.
  • Fig. 2 schematically depicts a block diagram of an embodiment of the wearable device 100, further highlighting the functionality of the wearable device 100 in terms of functional blocks, at least some of which may be optional functionality.
  • the wearable device 100 is depicted as smart glasses, but it should be understood that the wearable device 100 may take any suitable shape as previously explained.
  • the wearable device 100 comprises an image sensor 114 for capturing an image indicative of the current view of the wearer and a sound sensor 116, e.g. a microphone, for detecting a sound present in the environment or in the proximity of the wearer of the wearable device 100.
  • Such captured images or video data or signal and/or the captured audio or acoustic data may be used for analyzing and identifying the activity of the wearer.
  • the activity identification may be achieved, for instance, by detecting the objects in the captured images or video data and comparing the identified objects against a Look-Up-Table (LUT) defining the types of support information for each wearer's activity, e.g. a list of objects corresponding to a specific activity in which the needed types of support information is predefined. If the detected objects in the captured images or video data are identical to the same list of objects in the LUT corresponding to a specific activity, then such activity is identified.
  • the user's activity can also be identified by analyzing the movement patterns of the detected object in the video, for example, the wearer's hand movements.
  • Common video analytics methods such as pattern recognition algorithms, may be used for detecting the objects or the movement patterns of the detected object in the captured images or video data. For instance, the movement pattern of the wearer's hand can be derived, by tracking the hand in the video to derive the trajectory of the motion.
  • visual tracking techniques in the literature can be used to track the detected object(s). For example, a publication titled "Object Tracking: A Survey" by Yilmaz et al., ACM Computing Surveys, Vol. 38, Issue 4, 2006, describes state-of-the-art object tracking methods.
  • the wearer's activity identification may be achieved, for instance, by detecting the audio or acoustic data and identifying the current activities of the wearer based on intercepting or detecting the keywords in the audio/acoustic data, such as "coffee machine", “cup”, “water boiler”, or analyzing and/or identifying the source of the audio/acoustic data.
  • the audio/acoustic data may be generated by one or more further systems 200 in the environment or in the proximity of the wearer or are currently used by the wearer, such as the sound of making coffee produced by the coffee machine.
  • the source of the audio/acoustic data may be identified as coffee machine.
  • the keywords detected in the audio/acoustic data, or the source of the audio/acoustic data, may be compared against a list of keywords in the LUT. Each activity in the LUT may link to one or more keywords. If the detected keywords are identical to the keywords in the LUT corresponding to an activity, then that activity is identified.
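  • A minimal sketch of the look-up-table matching described above; the activities, object lists and keywords in the table are illustrative assumptions, not taken from the patent:
```python
# Hypothetical sketch: identify the wearer's activity by matching detected
# objects and detected keywords against a look-up table (LUT).

ACTIVITY_LUT = {
    "make coffee": {"objects": {"coffee jar", "cup", "water boiler"},
                    "keywords": {"coffee machine", "cup", "water boiler"}},
    "brush teeth": {"objects": {"toothbrush", "toothpaste"},
                    "keywords": {"toothbrush"}},
}

def identify_activity(detected_objects, detected_keywords):
    for activity, entry in ACTIVITY_LUT.items():
        # An activity is identified when the detected objects cover the LUT
        # entry, or when any of its keywords is intercepted in the audio.
        if entry["objects"] <= set(detected_objects):
            return activity
        if entry["keywords"] & set(detected_keywords):
            return activity
    return None

print(identify_activity({"cup", "coffee jar", "water boiler"}, set()))  # make coffee
print(identify_activity(set(), {"toothbrush"}))                         # brush teeth
```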
  • the activity identification may also be achieved by receiving data acquired from the one or more further systems 200 that are in the environment or in the proximity of the wearer, or that are currently being used by the wearer.
  • the received data may include the identification information of the devices that the wearer is using, such as the types or versions of the one or more further systems 200, so as to precisely identify the activity of the wearer, i.e. which further system 200 the wearer is currently using, such as a coffee machine.
  • Such data may be received from the further systems 200 via a Near Field Communication (NFC) communication channel.
  • the wearable device 100 may comprise a temperature sensor 118 for measuring the temperature of the wearer; a light sensor 120 for measuring the light in the environment or in the proximity of the wearer; and a motion sensor 124, such as a gyroscope or an accelerometer, for identifying the current physical activity of the wearer, such as running, sleeping, etc.
  • the wearable device 100 may comprise at least one display module 106, under control of a discrete display controller (not shown).
  • the display controller may be implemented by a processor 110 of the wearable device 100, as shown in Fig. 2.
  • the display module 106 may be a transparent or see-through display module.
  • the display module may be a two-dimensional or a three-dimensional display module.
  • the at least one display module 106 is typically arranged to cover the field of view of the wearer when the wearable device 100 is worn by the wearer, such that the wearer of the wearable device 100 may observe the field of view through an image displayed on the at least one display module 106.
  • the wearable device 100 comprises a pair of transparent display modules 106 including a first display module that can be observed by the right eye of the wearer and a second display module that can be observed by the left eye of the wearer.
  • the at least one display module 106 may be a single display module covering both eyes of the wearer.
  • the at least one display module 106 may be provided in any suitable form, such as a transparent lens portion as shown in Fig. 1.
  • the wearable device 100 may comprise a pair of such lens portions, i.e. one for each eye as explained above.
  • the one or more transparent lens portions are dimensioned such that substantially the entire field of view of the wearer is obtained through the one or more transparent lens portions.
  • the at least one display module 106 may be shaped as a lens to be mounted in a frame 125 of the wearable device 100 or a component housing 135 of the wearable device 100.
  • the display module 106 may be controlled by the processor 110 to generate video/text support information to the wearer based on the detected current activity of the wearer.
  • the processor 110 may be adapted to display, in the display module 106, a list of all the potential activities that the wearer may intend to perform, and to rank all the potential activities based on possibility factors analyzed by the processor 110. Such possibility factors may be determined based on a predefined wearer's profile stored in the data storage 112.
  • the wearer's profile may include the user's personal history, such as habits, lifestyle patterns, etc. Based on the detected current activity of the wearer, the less relevant activities will be deleted from the list by the processor 110 until only the most relevant activity is shown in the display module 106.
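  • A minimal sketch of this rank-and-narrow behaviour, assuming each candidate activity carries a possibility factor derived from the wearer's profile and a set of locations where it can be performed (all names and values are illustrative):
```python
# Hypothetical sketch: rank candidate activities by a possibility factor from
# the wearer's profile and narrow the list as the current context is sensed.

ACTIVITIES = [
    {"name": "make coffee", "possibility": 0.8, "locations": {"kitchen"}},
    {"name": "use toilet",  "possibility": 0.2, "locations": {"bathroom"}},
    {"name": "cook meal",   "possibility": 0.3, "locations": {"kitchen"}},
]

def ranked_list(activities):
    # Most likely activity first, as it would be shown on the display module.
    return sorted(activities, key=lambda a: a["possibility"], reverse=True)

def narrow_by_location(activities, sensed_location):
    # Drop the less relevant activities once the wearer's location is known.
    return [a for a in activities if sensed_location in a["locations"]]

shortlist = ranked_list(narrow_by_location(ACTIVITIES, "kitchen"))
for a in shortlist:
    print(f'{a["name"]}: {a["possibility"]:.0%}')
```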
  • the wearable device 100 may further include a user interface 108 for allowing a user to select one activity from the displayed list of all potential activities. This may be used, e.g., in situations where two potential activities have equal possibility factors.
  • the user interface 108 may further allow the user to insert a new activity into the list if such new activity is not mentioned in the list. Such new activity may be stored in the data storage 112. Consequently, the support information with respect to the most relevant activity or the selected activity will be displayed in the display module 106.
  • the processor 110 may be adapted to directly control the display module 106 to display the support information with respect to the most relevant activity.
  • the wearable device 100 may further comprise a speaker 144.
  • the speaker 144 may be controlled by the processor 110 to generate audio/speech support information to the wearer based on the detected current activity of the wearer.
  • the support information displayed on the display module 106 may include the information for supporting the wearer to perform a specific task.
  • the support information for supporting the wearer to make a coffee may include information on the locations of the coffee jar, the spoon, and the coffee cup, and text or images/videos showing how to open the jar, how to use the spoon to get the coffee, how to pour the water from the water boiler, etc.
  • Such assistance information may be acquired from the one or more further devices 200 connected or communicating with the wearable device 100 via a transmitter, such that the support information may be transmitted to the wearable device in real time.
  • the support information may be stored in a remote database or in the data storage 112 of the wearable device 100.
  • the support information displayed on the display module 106 may be controlled by the processor 110 to be automatically hidden, blurred or partly removed from the display module when a specific activity of the wearer is identified. For instance, when the wearer of the wearable device is performing a task that needs full attention or is engaged in an activity during which she/he should not be distracted, all the support information displayed on the display module will be automatically hidden or blurred by the processor 110 in order to avoid any distraction to the wearer.
  • the hidden or removed assistance information may be provided by the wearable device as a subtle audio or tactile stimulus.
  • the wearer assistance information displayed on said display module may include the duration of an operation or activity that the wearer is performing for a specific activity, e.g. using one or more further devices 200.
  • This has the advantage that it enables the wearer to monitor, hands-free, the time spent on his/her current activity.
  • the information indicating the duration of such activity may be highlighted, such as using different colors/brightness or different font/format, in order to alert the wearer.
  • a frame 125 of the wearable device 100 may have any suitable shape and may be made of any suitable material, e.g. a metal, metal alloy, plastics material or combination thereof.
  • Several components of the wearable device 100 may be mounted in the frame 125, such as in the component housing 135 forming part of the frame 125.
  • the component housing 135 may have any suitable shape, preferably an ergonomic shape that allows the wearable device 100 to be worn by its wearer in a comfortable manner.
  • the functioning of at least part of the wearable device 100 may be controlled by the processor 110 that executes instructions, i.e. computer program code, stored in a non-transitory computer readable medium, such as data storage 112.
  • processor 110 in combination with processor-readable instructions stored in the data storage 112 may function as a controller of the wearable device 100.
  • the data storage 112 may store data that is associated with the generation of support information on the at least one display module 106.
  • the wearable device 100 may be adapted to wirelessly communicate with a remote system, e.g. the further system 200 as shown in Fig. 2.
  • the wearable device 100 may include a wireless communication interface 102 for wirelessly communicating with a remote target such as the remote further system 200. Any suitable wireless communication protocol may be used for the wireless communication between the wearable device 100 and the remote system 200, e.g., an infrared link, Zigbee, Bluetooth, a wireless local area network protocol such as in accordance with the IEEE 802.11 standards, a 2G, 3G or 4G telecommunication protocol, and so on.
  • the remote further system 200 may for instance be controlled to provide the wearer of the wearable device 100 with feedback information and/or instructions, as will be further explained below.
  • the suitable wireless communication protocol may be, for instance, Near Field Communication (NFC).
  • the wearable device 100 may optionally comprise a further wireless communication interface 104 for wirelessly communicating with a further remote system, e.g. a wireless LAN, through which the wearable device 100 may access a remote data source such as the Internet, for instance to store data such as user preferences, user specific information, and so on.
  • the wearable device 100 may include one wireless communication interface that is able to communicate with the remote further system 200 and a further remote target such as the further network.
  • the processor 110 may further be adapted to control wireless communication interface 102 and, if present, wireless communication interface 104.
  • the further system 200 may be one or more devices in the environment or in the proximity of the wearer. More specifically, the further system 200 may be one or more devices currently used by the wearer. For instance, the further system 200 may be a coffee machine or a water boiler used by the wearer in the morning after waking up.
  • the processor 110 may be further adapted to determine whether a specific activity has been completed or correctly performed by the wearer. If not, the processor 110 may be adapted to repeatedly provide the support information to the wearer. For example, if the activity is to find the coffee jar, then the activity will be determined to be completed or correctly performed by the wearer when the user has found the jar, which may be identified by detecting the objects in the captured images or video data and comparing the identified objects against a LUT, or by receiving information from the coffee jar confirming that the coffee jar, and not another device, is held by the wearer. Alternatively, the physiological signal of the wearer may be used in combination with information related to how the activity is being performed. For example, if the activity of finding the coffee jar takes longer than a predetermined duration and, consequently, the wearer's physiological state, as measured by a sensor, has changed to nervous or stressed, then this activity will be determined to be not successfully performed.
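  • A minimal sketch of such a completion check, combining the object-based verification with the duration/physiology fallback (the time budget, state names and return values are assumptions for illustration):
```python
# Hypothetical sketch: decide whether the "find the coffee jar" step was
# performed successfully, or should be treated as failed and re-supported.

MAX_DURATION_S = 60          # assumed time budget for this step
STRESSED_STATES = {"nervous", "stressed"}

def step_outcome(held_object, elapsed_s, physio_state):
    if held_object == "coffee jar":
        return "completed"
    if elapsed_s > MAX_DURATION_S and physio_state in STRESSED_STATES:
        # Took too long and the wearer became stressed: not successful,
        # so the support information is provided again (possibly adapted).
        return "failed"
    return "in progress"

print(step_outcome("coffee jar", 20, "calm"))   # completed
print(step_outcome(None, 75, "stressed"))       # failed
```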
  • the wearable device 100 may comprise a sensor 122 for measuring the wearer's physiological and/or psychological state, such as an anxiety/stress level of the wearer.
  • sensor 122 may be an extra image sensor for measuring the facial information of the wearer, or a sensor for measuring the heartbeat of the wearer.
  • facial expression analysis algorithms can be used to assess users' emotion and mood.
  • the sensor 122 may be a vital signs camera for measuring the vital signs of the cognitively impaired user, such as Heart Rate (HR) or Heart Rate Variability (HRV).
  • the measured data can be used to assess the physiological and/or psychological state of the person as well.
  • Examples of measured data are HRV low-frequency and high-frequency variations and their balance.
  • high LF (low frequency) HRV is usually indicative of a restless state, in comparison to high HF (high frequency) HRV.
  • the collected physiological data can be used to train classifiers of the machine learning method that will be used to detect the user physiological and/or psychological states that are desired to be monitored.
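  • A minimal sketch of computing the LF/HF balance mentioned above from a series of RR intervals, which could then serve as one feature for such classifiers; the band limits, resampling rate and synthetic data are standard assumptions rather than details from the patent:
```python
# Hypothetical sketch: estimate the LF/HF balance of heart rate variability
# from RR intervals (in seconds); a high LF share is taken here as a crude
# indicator of a restless/stressed state.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_ratio(rr_intervals_s, fs=4.0):
    t = np.cumsum(rr_intervals_s)                  # beat times
    t_even = np.arange(t[0], t[-1], 1.0 / fs)      # resample on an even grid
    rr_even = interp1d(t, rr_intervals_s)(t_even)
    f, psd = welch(rr_even - rr_even.mean(), fs=fs,
                   nperseg=min(256, len(rr_even)))
    lf = np.trapz(psd[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])
    hf = np.trapz(psd[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])
    return lf / hf

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rr = 0.8 + 0.05 * rng.standard_normal(300)     # ~300 synthetic beats
    print("LF/HF:", round(lf_hf_ratio(rr), 2))
```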
  • the processor 110 may be adapted to change the support information in response to the wearer's physiological and/or psychological state measured by the sensor. Such changes may include: 1) stopping providing support information to the wearer; 2) providing additional support information to the wearer; 3) changing the transmission speed of the support information provided to the wearer; and/or 4) changing the format of the support information provided to the wearer. For instance, if a wearer has done a wrong action, the anxiety/stress level of the wearer may be dramatically increased. If the value of the anxiety/stress level is above a predefined maximum value, then the support information will be stopped by the processor 110.
  • the support information may be repeatedly provided to the wearer at a slower speed in order to relax the wearer, or audio/video support information in addition to the text information may be provided to the wearer. Based on the audio/video support information, the wearer may have the feeling that he/she is currently doing the task together with someone.
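  • A minimal sketch of the four adaptation options listed above, driven by a normalized anxiety/stress score; the thresholds and setting names are illustrative assumptions:
```python
# Hypothetical sketch: adapt how the support information is delivered based
# on the wearer's measured anxiety/stress level (0.0 = calm, 1.0 = maximum).

MAX_STRESS = 0.9   # above this, stop providing support information
HIGH_STRESS = 0.6  # above this, slow down and enrich the support information

def adapt_support(stress_level, base_settings):
    settings = dict(base_settings)
    if stress_level > MAX_STRESS:
        settings["enabled"] = False                  # option 1: stop support
    elif stress_level > HIGH_STRESS:
        settings["extra_guidance"] = True            # option 2: add audio/video
        settings["speed"] = 0.5 * settings["speed"]  # option 3: slow down
        settings["format"] = "video+audio"           # option 4: change format
    return settings

print(adapt_support(0.7, {"enabled": True, "speed": 1.0,
                          "extra_guidance": False, "format": "text"}))
```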
  • the wearable device 100 may be arranged to detect a user instruction and to trigger an operation in response to the detected user instruction, e.g. using at least one further sensor including the motion sensor 124 in case the user instruction is a head motion, or by using the image sensor 114 or a camera to capture an image of a gesture-based instruction made by the wearer.
  • Other suitable sensors for such gesture or motion capturing will be apparent to the skilled person.
  • the processor 110 may be arranged to recognize a gesture or motion made by its wearer from the captured sensor data and to interpret the recognized gesture or motion as an instruction, for instance to identify a task performed by the wearer of the wearable device 100, e.g., reading, computing, and so on.
  • Non-limiting examples of such a motion for instance include a turn or nod of the wearer's head.
  • Non-limiting examples of such a gesture for instance include a hand or finger gesture in the field of view through the wearable device 100, which may be detected in an image captured with the image sensor 114.
  • the at least one further sensor may include the sound sensor 116 to detect a spoken instruction, wherein the processor 110 may be communicatively coupled to the further sensor in order to process the sensor data and detect the spoken instruction.
  • the at least one further sensor may additionally or alternatively include an input sensor, e.g. a button or the like for facilitating the wearer of the wearable device 100 to select the user instruction from a list of options.
  • Such list of options for instance may be displayed on the at least one transparent display module 106 of the wearable device 100, when present.
  • the wearable device 100 may further include the user interface 108 for receiving input from the user.
  • User interface 108 may include, for example, a touchpad, a keypad, buttons, a microphone, and/or other input devices.
  • the processor 110 may control at least some of the functioning of wearable device 100 based on input received through user interface 108.
  • the at least one further sensor may define or form part of the user interface 108.
  • While Fig. 2 shows various components of the wearable device 100, i.e., wireless communication interfaces 102 and 104, user interface 108, processor 110, data storage 112, image sensor 114, sound sensor 116, the temperature sensor 118, the light sensor 120, the sensor 122 for measuring the wearer's physiological and/or psychological state, and the motion sensor 124, as being separate from the at least one display module 106, one or more of these components may be mounted on or integrated into the at least one display module 106.
  • the image sensor 114 may be mounted on a see-through display module 106
  • the user interface 108 could be provided as a touchpad on a see-through display module 106
  • processor 110 and data storage 112 may make up a computing system in a see-through display module 106
  • the other components of wearable device 100 could be similarly integrated into a see-through display module 106.
  • the wearable device 100 may be provided in the form of separate devices that can be worn on any part of the body or carried by the wearer, apart from at least the one display module 106, which typically will be mounted on the head.
  • the separate devices that make up wearable device 100 may be communicatively coupled together in either a wired or wireless fashion.
  • Fig. 3 schematically depicts an embodiment of the wearable device.
  • the video/audio information as well as the physical activity information of the wearer are sensed by the image sensor 114, the audio sensor 116 and the motion sensor 124 respectively and are sent together with other sensed data to the processor 110.
  • the processor 110 can be located either in a server or the cloud, or in the wearable device 100.
  • the processor 110 further receives the wearer's profile data from a database or data storage 112. Such data may include the wearer's hobbies, daily routines, etc. Based on the received sensed information together with the wearer's profile data, the processor 110 may recognize the activity that the user is currently performing.
  • the processor 110 provides a sequential set of instructions as the support information to the wearer to support the wearer to perform an activity.
  • the instructions may be adjusted on-the-fly during execution of the task based on the performance of the wearer.
  • Figs. 4A-4B schematically depict an embodiment of the wearable device, further showing an example of the wearer's support information displayed on the display module 106 for the wearer's activity of making coffee.
  • In Fig. 4A, an aspect of an embodiment of the wearable device 100 related to recognizing the activity to be performed is disclosed.
  • the wearable device 100 is worn and initialized by the wearer. Consequently, the step of listing possible activities 201 is started.
  • the wearer's daily routine and personal hobby information are transmitted to the processor 110. Based on such information, a list of possible activities that the wearer normally does in the morning is displayed in the display module 106, such as "use toilet", "brush teeth", "make coffee", "eat food", "cook meal", etc.
  • the procedure of determining the activity that the wearer intends to perform, based on the sensed wearer's behavior and position 211, is started. This procedure may comprise several sub-steps:
  • In step 213, when it has been sensed by the one or more sensors 114, 116, 118, 120, 122, 124 that the wearer is walking into the kitchen, the list of activities is narrowed by the processor 110 to only include the activities that the wearer can perform in the kitchen.
  • a probability factor 170 of each activity is also displayed simultaneously.
  • Such a probability factor can be learned by the system from the wearer's profile, such as the wearer's habits and history. For example, after the wearer has worn the wearable device 100 for many days, the wearable device may automatically learn the habits and life patterns of the wearer, e.g. after the wearer wakes up in the morning, there is an 80% probability that the wearer will go to the kitchen for a cup of coffee and a 20% probability that the wearer will go to the toilet. In this way, the wearable device 100 may be a self-learning system, which is personalized for individual users.
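  • A minimal sketch of how such probability factors could be learned from the wearer's own history by simple counting (the logged events are illustrative):
```python
# Hypothetical sketch: learn "after waking up, where does the wearer go?"
# probabilities by counting the wearer's own history over many days.
from collections import Counter

def learn_probabilities(observed_destinations):
    counts = Counter(observed_destinations)
    total = sum(counts.values())
    return {place: n / total for place, n in counts.items()}

# e.g. 10 mornings logged by the wearable device
history = ["kitchen"] * 8 + ["toilet"] * 2
print(learn_probabilities(history))   # {'kitchen': 0.8, 'toilet': 0.2}
```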
  • In step 215, when it has been sensed by the one or more sensors 114, 116, 118, 120, 122, 124 that the wearer is picking up a water boiler 200, the list of activities is narrowed by the processor 110 to only include the activities that can be performed with the water boiler. As in the previous step 213, based on the wearer's habits and history, it is determined by the processor 110 that it is highly likely (90% probability) that the wearer is going to prepare a coffee. Accordingly, support information related to "Coffee Preparation Task Assistance" is initiated and displayed in the display module 106.
  • In step 217, when it has been sensed by the one or more sensors 114, 116, 118, 120, 122, 124 that the wearer is picking up a Nescafe® jar, it is determined by the processor 110 that it is now 100% certain that the wearer will prepare a coffee.
  • If the wearer's desired/potential activity is not displayed in the display module 106 in step 213, namely such activity cannot be performed in the kitchen, the wearer may be allowed to input the name of the desired activity via the user interface 108. Consequently, the wearer will be guided by the wearable device 100 to the room where the task needs to be performed.
  • In Fig. 4B, an aspect of an embodiment of the wearable device 100 related to recognizing and tracking the individual steps during the activity performance and guiding the wearer based on the activity-related parameters is disclosed.
  • the location of the Nescafe coffee jar is displayed to the wearer in step 231. Meanwhile, the wearer's reactions to this information are observed by the one or more sensors 114, 116, 120, 122, 124, such as whether the wearer's vital signs, e.g. heartbeat, are unchanged, and whether the wearer extends his hand toward the jar and picks it up.
  • In step 233, after it is verified by the processor 110 that the jar held by the wearer is indeed a coffee jar, the support information with respect to how to open the coffee jar is displayed.
  • the verification can be done by capturing the image containing the content information of the jar by the image sensor 114 or can be done via laser scanners, or a barcode scanner in the wearable device 100. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the coffee jar is opened or not by the wearer. The next instructions will be provided to the wearer only if the coffee jar is opened.
  • In step 235, after it is verified by the processor 110 that the coffee jar is opened by the wearer, the instruction related to the location of the spoons is displayed to the wearer. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the spoon is picked up by the wearer or not. The next instructions will be provided to the wearer only if the spoon is picked up by the wearer.
  • In step 237, after it is verified by the processor 110 that the spoon is picked up by the wearer, the instruction related to the location of the cup, preferably the wearer's favorite cup, is displayed to the wearer. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the cup is picked up by the wearer or not. The next instructions will be provided to the wearer only if the cup is picked up by the wearer.
  • In step 239, after it is verified by the processor 110 that the cup is picked up by the wearer, the instruction related to how to use the spoon to get the coffee and how to put the coffee into the cup is displayed to the wearer. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the coffee is indeed put into the cup by the wearer or not. The next instructions will be provided to the wearer only if the coffee has been put into the cup by the wearer. In step 241, after it is verified by the processor 110 that the coffee has been put into the cup by the wearer, the instruction related to how to pour the water is displayed to the wearer.
  • the instruction may also relate to the duration of pouring the water to the cup.
  • Such information is based on the measurement of the volume of the cup and the current pouring speed, which is sensed/calculated from video/audio data derived from the image sensor 114 and the audio sensor 116.
  • One example of calculating the pouring speed based on the video/audio data can be found in Rafa Absar et al., "Usability of Non-Speech Sounds in User Interfaces", Proceedings of the 14th International Conference on Auditory Display.
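  • As a minimal arithmetic sketch (the values are illustrative): with the cup volume estimated from the video and the pouring rate estimated from the audio/video data, the remaining pouring time follows directly:
```python
# Hypothetical sketch: compute how long the wearer should keep pouring,
# given an estimated cup volume and an estimated pouring rate.

def remaining_pour_time_s(cup_volume_ml, poured_ml, pour_rate_ml_per_s,
                          fill_fraction=0.9):
    # Stop slightly below the rim (fill_fraction) to help prevent overflow.
    target_ml = fill_fraction * cup_volume_ml
    remaining_ml = max(0.0, target_ml - poured_ml)
    return remaining_ml / pour_rate_ml_per_s

print(round(remaining_pour_time_s(250, 50, 20), 1), "seconds")  # ~8.8 s
```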
  • In step 243, after it is verified by the processor 110 that the cup is full of water, based on the determined duration of the pouring as well as audio analysis of the sounds that the boiler makes, an alert message may be shown to the wearer to stop pouring water in order to prevent burns.
  • the wearer may then be instructed by the wearable device 100 to stir and wait for the coffee to cool down.
  • support information in any of the abovementioned steps is not limited to text information. Any other medium of information, such as video/audio information, may also be provided to the wearer for the purpose of supporting the wearer to perform an activity.
  • a warning message may also be provided to the wearer if the wearer does not follow the instruction correctly, or if the wearer performs some activities that leave one or more devices in a wrong status. For instance, in step 235, after the wearer reacts to the instruction, opens the drawer and picks up the spoon, the drawer may not be closed due to the wearer's neglect. Once this is detected by the one or more sensors 114, 116, 118, 120, 122, 124, it may be determined by the processor 110 that the position of the drawer is not in the correct status, which may create a dangerous situation. Subsequently, an alert may be generated by the processor 110 and shown to the user to inform him/her to close the drawer. Such warning information may be continuously provided until the dangerous situation is removed.
  • the psychological and physiological state changes of the wearer are observed in each step, and in between two steps where a new instruction is introduced to the wearer. This is done with the purpose of understanding whether there is increased anxiety/stress (perhaps as a result of not understanding the task or instructions), tiredness, lack of focus, or particular disabilities (e.g. hand tremors). Consequently, the instructions are adapted accordingly.
  • In step 231, it may be detected that the wearer is not able to find the coffee jar correctly.
  • the detection can be done automatically by observing, e.g., the time it takes and the wearer's increased heart rate, decreased Heart Rate Variability (HRV), increased skin conductance response, and respiration rate. Accordingly, the location of the coffee jar is repeatedly displayed to the wearer. Meanwhile, additional support may be provided to the wearer beyond showing information. For instance, a light beam 180 may be automatically provided by the wearable device 100 in the direction of the coffee jar in order to better show the location to the wearer.
  • the wearable device may be adapted to communicate with other further devices 200 in order to provide improved support information.
  • a connection can be established between the water boiler and the wearable device 100 such that the wearable device 100 may identify that the wearer is currently using the water boiler.
  • both the coffee jar and the wearable device 100 may comprise an NFC chip such that all needed information related to the coffee jar, such as the expiration date, will be provided to the wearable device over the NFC communication channel.
  • the wearable device 100 may communicate with the water boiler and may control when the water boiler will pour the water and when it will stop pouring, based on analysis of the cup size and pouring speed. This may avoid hot water overflowing.
  • the processor 110 may implement a method 400 for displaying support information, as shown in the flow chart of Fig. 5.
  • the method 400 commences in step 401, after the wearable device 100 is switched on, by initializing the display module 106, after which the method 400 progresses to step 403 in which support information with respect to a first activity that a wearer of the wearable device intends to perform is displayed; wherein the support information with respect to the first activity is determined based on an identified second activity that the wearer has performed.
  • aspects of the present invention may be embodied as a wearable device, method or computer program product. Aspects of the present invention may take the form of a computer program product embodied in one or more computer-readable medium(s) having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • Such a system, apparatus or device may be accessible over any suitable network connection; for instance, the system, apparatus or device may be accessible over a network for retrieval of the computer readable program code over the network.
  • a network may for instance be the Internet, a mobile communications network or the like.
  • the computer readable storage medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out the methods of the present invention by execution on the processor 110 may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the processor 110 as a stand-alone software package, e.g. an app, or may be executed partly on the processor 1 10 and partly on a remote server.
  • the remote server may be connected to the head-mountable computing device 100 through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer, e.g. through the Internet using an Internet Service Provider.
  • LAN local area network
  • WAN wide area network
  • Internet Service Provider e.g. AT&T, MCI, Sprint, EarthLink, MSN, GTE, etc.
  • the computer program instructions may be loaded onto the processor 110 to cause a series of operational steps to be performed on the processor 110, to produce a computer-implemented process such that the instructions which execute on the processor 110 provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • the computer program product may form part of a head-mountable computing device 100, e.g. may be installed on the head-mountable computing device 100.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Hospice & Palliative Care (AREA)
  • General Business, Economics & Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable device is provided. The wearable device comprises at least one feedback module; one or more sensors; and a processor adapted to initialize the feedback module and control the at least one feedback module to provide support information with respect to a first activity that a wearer of the wearable device intends to perform; wherein the support information with respect to the first activity is determined based on an identified second activity that the wearer has performed, wherein the second activity is performed by the wearer prior to the first activity.

Description

Wearable device, method and computer program product
FIELD OF THE INVENTION
The present invention relates to a wearable device comprising at least one feedback module and a processor for controlling the at least one feedback module to provide support information with respect to a first activity that a wearer of the wearable device intends to perform.
The present invention further relates to a method for providing support information on a wearable device.
The present invention further relates to a computer program product for implementing such a method when executed on a processor of such a wearable device.
BACKGROUND OF THE INVENTION
Wearable devices are a new class of electronic systems that can provide data acquisition through a variety of unobtrusive sensors that may be worn by a user. The sensors gather information, for example, about the environment, the user's activity, or the user's health status. However, there are significant challenges related to the coordination, computation, communication, privacy, security, and presentation of the collected data. In addition, analysis of the data is needed to make the data gathered by the sensors useful and relevant to end-users. In some cases, additional sources of information may be used to supplement the data gathered by the sensors. The many challenges that wearable technology presents require new designs in hardware and software.
Current wearable devices for supporting cognitively impaired persons are systems based on single sensors and/or operations. For example, GPS-enabled insoles may allow people with early signs of dementia such as Alzheimer's disease (AD) to be tracked during their (outdoor) walks based on a GPS chip embedded in the insoles of a shoe. Another example is the Philips Lifeline® Help button; this system consists of a small pendant worn around the user's neck with a button that is activated if he/she, e.g., falls inside the home; this information is transmitted to a service center, which provides assistance as quickly as possible. However, this personal help button is merely a device for emergencies, not a tool to assist the wearer in daily activities, and it is not suitable for AD patients. The major problem for AD patients is that they are heavily dependent on others and cannot perform a task independently. This situation puts an enormous burden on caregivers, who have to be attentive 100% of the time, resulting in caregiver burnout. Moreover, giving patients tools to perform certain activities can also be beneficial, as this can increase their quality of life and may slow down the progression of the disease.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide a wearable device for displaying support information to a cognitively impaired user of the wearable device.
It is further an object of the present invention to provide a method for displaying support information using such a wearable device.
It is further an object of the present invention to provide a computer program product for implementing such a method when executed on a processor of such a wearable device.
According to an aspect, there is provided a wearable device for a cognitively impaired user, comprising at least one feedback module for providing feedback information to the cognitively impaired user; one or more sensors for sensing data from the cognitively impaired user; and a processor adapted to initialize the feedback module; identify a first movement of the cognitively impaired user based on the data sensed by the one or more sensors; and control the at least one feedback module to provide support information for supporting a second movement that the cognitively impaired user of the wearable device intends to perform after the first movement; wherein the support information for supporting the second movement is determined based on the identified first movement of the cognitively impaired user.
The present invention is based on the idea that, using a wearable device with smart sensing, computation, and user feedback, video/text support information will be displayed on a display module of the wearable device and/or audio support information will be provided by a speaker as an assistive tool for the cognitively impaired user. This wearable device has the ability to determine and guide the execution of tasks by cognitively impaired people. The sensing components may include, e.g., a light (infrared) sensor or radar to determine the appearance, shape and distance of objects in a 3-D scene, and a sound sensor to determine audio patterns produced by the wearer and/or in the environment. The computation may use a combination of the sensory data to identify the current movement or current activity of the cognitively impaired user and to determine step-by-step task-execution support information for the cognitively impaired user. The wearer may get feedback from the wearable device indicating whether a task or activity involving a specific movement, such as holding a cup, has been executed correctly or not. In case the wearer has made mistakes when performing such a task or activity, the system may provide video/audio guidance information to support the wearer in correcting those mistakes.
Therefore, the wearer may perform a task independently without any support from caregivers.
In an embodiment, the feedback module is a display module; and the processor is adapted to initialize the display module; control the at least one display module to provide visual support information to the cognitively impaired user. Alternatively, the feedback module can be a speaker for providing audio support information to the cognitively impaired user.
In an embodiment, the one or more sensors comprise at least one of an onward imaging sensor for capturing one or more images indicative of the current view of the wearer and identifying one or more objects in the captured one or more images; an audio sensor for capturing sounds present in the environment or in the proximity of the user; a temperature sensor for measuring the temperature of the wearer; a light sensor for measuring the light in the environment or in the proximity of the wearer; and a motion sensor, such as a gyroscope or an accelerometer, for identifying the current physical activity of the wearer. The processor is adapted to identify the first movement of the cognitively impaired user by evaluating the changes of each type of sensed data based on predefined rules comprising thresholds for each type of sensed data.
The intention of taking into account each type of sensed data as mentioned above is to more reliably and predictably analyze the current activity or movement performed by the cognitively impaired user by evaluating the changes of each type of sensed data based on predefined rules comprising thresholds for each type of sensed data; for example, rules for evaluating whether the changes of all types of sensed data have reached the corresponding thresholds that indicate which activity or movement the cognitively impaired user is currently performing, how the activity or movement is performed (e.g. whether the performed activity or movement meets one or more predefined thresholds or not), or whether the cognitively impaired user has performed an activity or movement improperly and would have to redo it, etc. The rules can be defined by learning from sensed data. The wearable device will be trained prior to deployment with all the types of sensed data mentioned above, derived from the cognitively impaired user. Alternatively, representative data derived from other subjects can be used. Later, when personal data is available (i.e. when the cognitively impaired user starts to use the wearable device), the training models will be adapted accordingly. Preferably, the wearable device may be trained by Deep Learning (DL) algorithms, because DL algorithms can provide reasonably good performance on each type of sensed data and can continuously and automatically learn when new sensed data becomes available. It is understood that other types of machine learning methods may be used as well, such as Random Forest classifiers. Examples of relevant machine learning algorithms can be found, for instance, in Kevin P. Murphy, Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series).
For instance, by learning from sensed data derived from a cognitively impaired user, the threshold of the temperature sensor data for determining whether the user is performing movement A may be between 36.2 °C and 36.6 °C; the threshold of the audio sensor data for determining whether the user is performing movement A may be to verify whether at least the keyword "XX" or the equivalent keyword "XY" is detected or intercepted in the audio/acoustic data; and the threshold of the imaging sensor for determining whether the user is performing movement A may be to verify whether at least object "M" is detected in the captured images or video data, etc. A more reliable decision regarding whether the user is performing movement A can be made if all the thresholds for each type of sensed data are met.
In other words, it is desired that all types of the sensor signals are used together. By using all available types of signals and data, the classifiers can be better trained. It is appreciated that, in simple situations, individual sensor data can be used to determine thresholds for such individual sensor data.
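By way of non-limiting illustration, the following minimal sketch (in Python) shows one possible way to combine such per-sensor thresholds into a single rule for "movement A" as described above; the field names, keywords and numeric ranges are hypothetical placeholders, and in practice the thresholds would be learned from the wearer's own sensed data.

def movement_a_detected(sensed):
    """Return True only if every per-sensor rule for 'movement A' is met."""
    temperature_ok = 36.2 <= sensed["temperature_c"] <= 36.6     # temperature band
    audio_ok = any(kw in sensed["audio_keywords"] for kw in ("XX", "XY"))  # keyword rule
    image_ok = "M" in sensed["detected_objects"]                 # object rule
    return temperature_ok and audio_ok and image_ok

# Example reading with hypothetical values: all three rules are met, so True is printed.
sample = {
    "temperature_c": 36.4,
    "audio_keywords": {"XX", "water boiler"},
    "detected_objects": {"M", "cup"},
}
print(movement_a_detected(sample))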
In an embodiment, the processor is adapted to identify the second activity of the wearer based on signals acquired by the one or more sensors.
In an embodiment, the processor is further adapted to display a list of a plurality of activities that a wearer of the wearable device intends to perform; limit the plurality of activities based on the identified activity that the wearer of the wearable device has performed.
In an embodiment, the processor is further adapted to rank the plurality of activities intended to be performed by the wearer, and wherein the ranking is based on the second activity that the wearer has performed, or based on a predefined wearer's profile.
In an embodiment, the one or more sensors further comprise a sensor adapted for measuring the wearer's physiological and/or psychological state. From the sensors mentioned above, the audio signal measured by the microphone can be used to assess the physiological and/or psychological state of the person. Using speech analysis, the emotion and mood of the user can be estimated. In addition, the movement data of the user can also be used. It is expected that, depending on the psychological state, the user will move differently while performing a certain movement or activity. For example, when the cognitively impaired user is stressed or angry, one can expect sharper, faster, or rougher movements, and possibly a different ordering (or errors, which can be measured by observing repetitions) while performing a movement or activity. As another example, when the person is calm and content, it is expected that the movement or activity is performed smoothly, with fewer errors and repetitions. In summary, the psychological state is assessed mainly by observing how the user's movements have changed with respect to the way the activity or movement is performed. In yet another example, the physiological and/or psychological state of the cognitively impaired user can be derived from how the user responds to the suggestions provided by the system. When the user is in a good mood, one can expect higher compliance than when the user is in a negative mood. As stated above, if speech data and audio data are available, they can also be analyzed together with the task performance behavior to infer the user's psychological and physiological state.
In an embodiment, the support information is predefined on a step-by-step basis, and wherein support information related to a next step of activity is provided only if the current step of activity is completely performed by the wearer.
In an embodiment, the processor is further adapted to change the predefined support information in response to the wearer's physiological and/or psychological state measured by the sensor.
In an embodiment, the processor is further adapted to change the predefined support information by one or more of the following options:
- stopping providing support information to the wearer;
- providing additional support information to the wearer;
- changing the transmission speed of the support information provided to the wearer; and
- changing the format of the support information provided to the wearer.
Therefore, the speed of providing support information can be adapted based on the wearer's physiological and/or psychological state, so the wearer can better follow the support information, such as instructions. Advantageously, the user will have the highest chance of understanding and following the support information. For example, when it is detected that the user cannot follow the instruction, e.g. due to confusion or other reasons, it can be preferable to provide such support information in a compact format or in a format that is easy to visualize and follow.
In an embodiment, wherein the wearable device further comprises a transmitter adapted to receive signals and/or data from one or more devices, wherein the one or more devices are in the environment or in the proximity of the wearer.
In an embodiment, wherein the wearable device further comprises a speaker, and wherein the processor is further configured for controlling the speaker to provide support information with respect to the first activity that a wearer of the wearable device intends to perform.
According to another aspect, there is provided a method of displaying information on the wearable device according to one or more of the above embodiments, the method comprising initializing the feedback module; and providing support information with respect to a first activity that a wearer of the wearable device intends to perform; wherein the support information with respect to the first activity is determined based on an identified second activity that the wearer is performing, and wherein the second activity is performed by the wearer prior to the first activity.
In accordance with yet another aspect, there is provided a computer program product comprising a computer program code for, when executed on the processor of the wearable device according to one or more of the above embodiments, implementing the steps of the method according to one or more of the above embodiments. Such a computer program product may be made available to the wearable device in any suitable form, e.g. as a software application (app) available in an app store, and may be used to configure the wearable device such that the wearable device may implement the aforementioned method.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention are described in more detail and by way of non-limiting examples with reference to the accompanying drawings, wherein
Fig. 1 schematically depicts a wearable device according to an embodiment of the present invention;
Fig. 2 schematically depicts an aspect of the wearable device of Fig 1 according to an embodiment of the present invention;
Fig. 3 schematically depicts an embodiment of the wearable device of Fig. 1;
Figs. 4A-4B schematically depict some aspects of an embodiment of the wearable device of Fig. 1 according to an embodiment of the present invention; and
Fig. 5 shows a flow chart of methods according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
It should be understood that the figures are merely schematic and are not drawn to scale. It should also be understood that the same reference numerals are used throughout the figures to indicate the same or similar parts.
In the context of the present application, a wearable device is a device that can be worn by a user and provides the user with computing functionality. The wearable device may be configured to perform specific computing tasks as specified in a software application (app) that may be retrieved from the Internet or another computer-readable medium. The wearable device may be worn on the head of a wearer. Non-limiting examples of such wearable devices include smart headgear, e.g. eyeglasses, goggles, a helmet, a hat, a visor, a headband, Google Glass, or any other device that can be supported on or from the wearer's head, and so on. Alternatively, the wearable device can be worn on the wrist of the wearer. Non-limiting examples of such wearable devices include a smart watch, a bandage, etc.
In the context of the present application, a feedback module may be a display module for providing video/text support information to the wearer, and/or a speaker for providing audio/speech support information to the wearer.
In the context of the present application, the wearable device may comprise one or more sensors. The sensors may comprise at least one of an onward imaging sensor for capturing one or more images indicative of the current view of the wearer and identifying one or more objects in the captured one or more images; an audio sensor for capturing the sounds present in the environment or in the proximity of the user; a temperature sensor for measuring the temperature of the wearer; a light sensor for measuring the light in the environment or in the proximity of the wearer; or a motion sensor, such as a gyroscope or an accelerometer, for identifying the current physical activity of the wearer. By analyzing the sensed data derived from the first type of sensor, the current activity of the wearer can be detected. Alternatively, the above-mentioned one or more sensors may be integral to an external device, such as a smartphone, a tablet, etc., another wearable device, a surveillance device such as an alarm camera mounted in a room, or a vital sign monitoring device such as a device including a vital signs camera used for remote photoplethysmography, and may be communicatively coupled via a wired or wireless connection to the wearable device. Alternatively, the data derived from the one or more sensors may be directly derived from a remote data source such as the Internet/Cloud.
FIG. 1 schematically depicts an embodiment of a wearable device 100. Fig. 2 schematically depicts a block diagram of an embodiment of the wearable device 100, further highlighting the functionality of the wearable device 100 in terms of functional blocks, at least some of which may be optional functionality. By way of non-limiting example, the wearable device 100 is depicted as smart glasses, but it should be understood that the wearable device 100 may take any suitable shape as previously explained.
The wearable device 100 comprises an image sensor 114 for capturing an image indicative of the current view of the wearer and a sound sensor 116, e.g. a microphone, for detecting a sound present in the environment or in the proximity of the wearer of the wearable device 100. Such captured images or video data or signal and/or the captured audio or acoustic data may be used for analyzing and identifying the activity of the wearer.
The activity identification may be achieved, for instance, by detecting the objects in the captured images or video data and comparing the identified objects against a Look-Up-Table (LUT) defining the types of support information for each wearer's activity, e.g. a list of objects corresponding to a specific activity for which the needed types of support information are predefined. If the detected objects in the captured images or video data are identical to the list of objects in the LUT corresponding to a specific activity, then such activity is identified. In addition to object detection, the user's activity can also be identified by analyzing the movement patterns of the detected object in the video, for example, the wearer's hand movements. Common video analytics methods, such as pattern recognition algorithms, may be used for detecting the objects or the movement patterns of the detected object in the captured images or video data. For instance, the movement pattern of the wearer's hand can be derived by tracking the hand in the video to derive the trajectory of the motion. There are various visual tracking techniques in the literature that can be used to track the detected object(s). For example, a publication titled "Object Tracking: A Survey" by Yilmaz et al., ACM Computing Surveys, Vol. 38, Issue 4, 2006, describes state-of-the-art object tracking methods.
Alternatively, the wearer's activity identification may be achieved, for instance, by detecting the audio or acoustic data and identifying the current activities of the wearer based on intercepting or detecting keywords in the audio/acoustic data, such as "coffee machine", "cup", "water boiler", or by analyzing and/or identifying the source of the audio/acoustic data. In one embodiment, the audio/acoustic data may be generated by one or more further systems 200 in the environment or in the proximity of the wearer, or that are currently used by the wearer, such as the sound of making coffee produced by the coffee machine. The source of the audio/acoustic data may then be identified as the coffee machine. The detected keywords in the audio/acoustic data or the source of the audio/acoustic data may be compared against a list of keywords in the LUT. Each activity in the LUT may link to one or more keywords. If the detected keywords are identical to the keywords in the LUT corresponding to an activity, then such activity is identified.
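By way of non-limiting illustration, the following sketch indicates how the LUT matching described above might be implemented; the LUT entries are invented examples, and the match is simplified to a subset test for the detected objects and a keyword-overlap test for the audio data.

# Illustrative LUT: each activity is linked to objects and keywords (example entries only).
ACTIVITY_LUT = {
    "make coffee": {"objects": {"coffee jar", "cup", "water boiler"},
                    "keywords": {"coffee machine", "water boiler"}},
    "brush teeth": {"objects": {"toothbrush", "toothpaste"},
                    "keywords": {"tap"}},
}

def identify_activity(detected_objects, detected_keywords):
    """Return the first activity whose LUT entry is matched by the sensed data, else None."""
    for activity, entry in ACTIVITY_LUT.items():
        object_match = entry["objects"] <= detected_objects       # all listed objects seen
        keyword_match = bool(entry["keywords"] & detected_keywords)  # any listed keyword heard
        if object_match or keyword_match:
            return activity
    return None

# All objects listed for "make coffee" were detected, so that activity is identified.
print(identify_activity({"coffee jar", "cup", "water boiler", "spoon"}, set()))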
The activity identification may also be achieved by receiving data acquired from the one or more further systems 200 in the environment or in the proximity of the wearer, or that are currently used by the wearer. The received data may include the identification information of the devices that the wearer is using, such as the types or the versions of the one or more further systems 200, so as to precisely identify the activity of the wearer, i.e. which further system 200 the wearer is currently using, such as a coffee machine. Such data may be received from the further systems 200 via a Near Field Communication (NFC) communication channel.
The wearable device 100 may comprise a temperature sensor 118 for measuring the temperature of the wearer; a light sensor 120 for measuring the light in the environment or in the proximity of the wearer; and a motion sensor 124, such as a gyroscope or an accelerometer, for identifying the current physical activity of the wearer, such as running, sleeping, etc. By analyzing the sensed temperature and/or the light in the environment or in the proximity of the wearer, the current activity of the wearer can be detected.
The wearable device 100 may comprise at least one display module 106, under control of a discrete display controller (not shown). Alternatively, the display controller may be implemented by a processor 110 of the wearable device 100, as shown in Fig. 2. The display module 106 may be a transparent or see-through display module. The display module may be a two-dimensional or a three-dimensional display module.
When present, the at least one display module 106 is typically arranged to cover the field of view of the wearer when the wearable device 100 is worn by the wearer, such that the wearer of the wearable device 100 may observe the field of view through an image displayed on the at least one display module 106. In an embodiment, the wearable device 100 comprises a pair of transparent display modules 106 including a first display module that can be observed by the right eye of the wearer and a second display module that can be observed by the left eye of the wearer. Alternatively, the at least one display module 106 may be a single display module covering both eyes of the wearer. The at least one display module 106 may be provided in any suitable form, such as a transparent lens portion, as shown in Fig. 1, onto which an image is projected as is well-known per se. Alternatively, the wearable device 100 may comprise a pair of such lens portions, i.e. one for each eye as explained above. The one or more transparent lens portions are dimensioned such that substantially the entire field of view of the wearer is obtained through the one or more transparent lens portions. For instance, the at least one display module 106 may be shaped as a lens to be mounted in a frame 125 of the wearable device 100 or a component housing 135 of the wearable device 100.
The display module 106 may be controlled by the processor 110 to generate video/text support information for the wearer based on the detected current activity of the wearer. The processor 110 may be adapted to display on the display module 106 a list of all the potential activities that the wearer may intend to perform and to rank all the potential activities based on possibility factors analyzed by the processor 110. Such possibility factors may be determined based on a predefined wearer's profile stored in the data storage 112. The wearer's profile may include the user's personal history, such as habits, lifestyle patterns, etc. Based on the detected current activity of the wearer, the less relevant activities will be deleted from the list by the processor 110 until only the most relevant activity is shown on the display module 106. Alternatively, the wearable device 100 may further include a user interface 108 for allowing a user to select one activity from the displayed list of all potential activities. This may be used, e.g., in the situation that two potential activities have equal possibility factors. Alternatively, the user interface 108 may further allow the user to insert a new activity into the list if such new activity is not mentioned in the list. Such a new activity may be stored in the data storage 112. Consequently, the support information with respect to the most relevant activity or the selected activity will be displayed on the display module 106. It is understood that the processor 110 may be adapted to directly control the display module 106 to display the support information with respect to the most relevant activity.
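By way of non-limiting illustration, a possible sketch of the ranking and narrowing of the candidate activity list is given below; the activities and possibility factors are hypothetical values standing in for a learned wearer profile.

def rank_activities(profile_probabilities):
    """Sort candidate activities by their possibility factor, highest first."""
    return sorted(profile_probabilities.items(), key=lambda kv: kv[1], reverse=True)

def narrow_to_location(ranked, activities_possible_here):
    """Keep only the activities that can be performed at the sensed location."""
    return [(activity, p) for activity, p in ranked if activity in activities_possible_here]

# Hypothetical profile: possibility factors learned from the wearer's habits and history.
profile = {"make coffee": 0.8, "use toilet": 0.2, "brush teeth": 0.4, "cook meal": 0.3}
ranked = rank_activities(profile)
in_kitchen = narrow_to_location(ranked, {"make coffee", "cook meal"})
print(in_kitchen)  # [('make coffee', 0.8), ('cook meal', 0.3)]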
In an embodiment, the wearable device 100 may further comprise a speaker 144. The speaker 144 may be controlled by the processor 110 to generate audio/speech support information to the wearer based on the detected current activity of the wearer.
The support information displayed on the display module 106 may include the information for supporting the wearer in performing a specific task. For instance, the support information for supporting the wearer in making a coffee may include the locations of the coffee jar, the spoon, and the coffee cup, and the text or images/videos showing how to open the jar, how to use the spoon to get the coffee, and how to pour the water from the water boiler, etc. Such assistance information may be acquired from the one or more further devices 200 connected or communicating with the wearable device via a transmitter, such that the support information may be transmitted to the wearable device in real time. Alternatively, the support information may be stored in a remote database or in the data storage 112 of the wearable device 100. Alternatively, the support information displayed on the display module 106 may be controlled by the processor 110 to be automatically hidden, blurred or partly removed from the display module when a specific activity of the wearer is identified. For instance, when the wearer of the wearable device is performing a task that needs full attention from the wearer, or is engaging in an activity during which she/he should not be distracted, all the support information displayed on the display module will be automatically hidden or blurred by the processor 110 in order to avoid any distraction to the wearer.
Alternatively, the hidden or removed assistance information may be provided by the wearable device as a subtle audio or tactile stimulus.
In an embodiment, the wearer assistance information displayed on said display module may include the duration of an operation or activity that the wearer is performing for a specific activity, e.g. using one or more further devices 200. This has the advantage that it enables the wearer to monitor, hands-free, the time spent on his/her current activity. In addition, if the duration of performing a specific activity is longer than a pre-defined threshold, the information indicating the duration of such activity may be highlighted, such as by using different colors/brightness or a different font/format, in order to alert the wearer.
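By way of non-limiting illustration, the duration display with a highlight threshold could be sketched as follows; the threshold value is a placeholder, and the boolean flag stands in for whatever color/brightness/font change the display module supports.

def duration_overlay(elapsed_s, threshold_s):
    """Build the duration text and flag whether it should be highlighted."""
    text = f"{elapsed_s:.0f} s spent on current activity"
    highlight = elapsed_s > threshold_s   # e.g. switch colour, brightness or font
    return text, highlight

print(duration_overlay(130, 120))  # ('130 s spent on current activity', True)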
It will be understood that a frame 125 of the wearable device 100 may have any suitable shape and may be made of any suitable material, e.g. a metal, metal alloy, plastics material or combination thereof. Several components of the wearable device 100 may be mounted in the frame 125, such as in the component housing 135 forming part of the frame 125. The component housing 135 may have any suitable shape, preferably an ergonomic shape that allows the wearable device 100 to be worn by its wearer in a comfortable manner.
The functioning of at least part of the wearable device 100 may be controlled by the processor 110 that executes instructions, i.e. computer program code, stored in a non-transitory computer readable medium, such as data storage 112. Thus, processor 110 in combination with processor-readable instructions stored in the data storage 112 may function as a controller of the wearable device 100. In addition to instructions that may be executed by the processor 110, the data storage 112 may store data that is associated with the generation of support information on the at least one display module 106.
In an embodiment, the wearable device 100 may be adapted to wirelessly communicate with a remote system, e.g. the further system 200 as shown in Fig. 2. To this end, the wearable device 100 may include a wireless communication interface 102 for wirelessly communicating with a remote target such as the remote further system 200. Any suitable wireless communication protocol may be used for any of the wireless communication between the wearable device 100 and the remote system 200, e.g., an infrared link, Zigbee, Bluetooth, a wireless local area network protocol such as in accordance with the IEEE 802.11 standards, a 2G, 3G or 4G telecommunication protocol, and so on. The remote further system 200 may for instance be controlled to provide the wearer of the wearable device 100 with feedback information and/or instructions, as will be further explained below. Alternatively, the suitable wireless communication protocol may be Near Field Communication (NFC), and both the wearable device 100 and the further system 200 may comprise a passive or active Near Field Communication (NFC) component for enabling the NFC communication.
The wearable device 100 may optionally comprise a further wireless communication interface 104 for wirelessly communicating with a further remote system, e.g. a wireless LAN, through which the wearable device 100 may access a remote data source such as the Internet, for instance to store data such as user preferences, user specific information, and so on. Alternatively, the wearable device 100 may include one wireless communication interface that is able to communicate with the remote further system 200 and a further remote target such as the further network. The processor 110 may further be adapted to control wireless communication interface 102 and, if present, wireless communication interface 104.
The further system 200 may be one or more devices in the environment or in the proximity of the wearer. More specifically, the further system 200 may be one or more devices currently used by the wearer. For instance, the further system 200 may be a coffee machine or a water boiler used by the wearer in the morning after waking up.
The processor 110 may be further adapted to determine whether a specific activity has been completed or correctly performed by the wearer. If not, the processor 110 may be adapted to repeatedly provide the support information to the wearer. For example, if the activity is to find the coffee jar, then the activity will be determined to be completed or correctly performed by the wearer when the user has found the jar, which may be identified by detecting the objects in the captured images or video data and comparing the identified objects against a LUT, or by receiving information from the coffee jar confirming that the coffee jar, and not another device, is held by the wearer. Alternatively, the physiological signal of the wearer may be used in combination with the information related to how the activity is being performed. For example, if the activity of finding the coffee jar takes longer than a predetermined duration for this activity and, consequently, the wearer's physiological state has changed to nervous or stressed, as measured by a sensor, then this activity will be determined to be not successfully performed.
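By way of non-limiting illustration, the following sketch combines the object-based check with the duration and physiological cues described above; the threshold values and field names are assumptions for the example only.

def step_completed(target_object, detected_objects, elapsed_s, max_duration_s, stressed):
    """Return True if the step is done, False if it failed, None while still in progress."""
    if target_object in detected_objects:
        return True                      # e.g. the coffee jar has been found/held
    if elapsed_s > max_duration_s and stressed:
        return False                     # overran the expected duration while stressed
    return None                          # keep showing (or repeating) the support information

# Hypothetical reading: the jar has not been found after 95 s and the wearer is stressed.
print(step_completed("coffee jar", {"cup"}, elapsed_s=95, max_duration_s=60, stressed=True))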
The wearable device 100 may comprise a sensor 122 for measuring the wearer's physiological and/or psychological state, such as an anxiety/stress level of the wearer. Such a sensor 122 may be an extra image sensor for measuring the facial information of the wearer, or a sensor for measuring the heartbeat of the wearer. Once the facial information of the user has been monitored, facial expression analysis algorithms can be used to assess the user's emotion and mood. There are several facial expression analysis algorithms that can be used; one example is the Paul Ekman Group's Facial Action Coding System for emotion extraction (https://www.paulekman.com/product-category/facs/). It is appreciated that other methods are also available which are based on training a classifier with emotional faces. In another example, the sensor 122 may be a vital signs camera for measuring the vital signs of the cognitively impaired user, such as Heart Rate (HR). Once the HR and HR Variability (HRV) of the user have been measured, the measured data can be used to assess the physiological and/or psychological state of the person as well. Examples of measured data are HRV low-frequency and high-frequency variations and their balance. For example, high LF (low frequency) HRV is usually indicative of a restless state in comparison to high HF (high frequency) HRV. One can also look at the ratio between LF-HRV and HF-HRV. Similar to the facial analysis methods, the collected physiological data can be used to train classifiers of the machine learning method that will be used to detect the user's physiological and/or psychological states that are to be monitored.
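By way of non-limiting illustration, the LF/HF balance mentioned above can be estimated from a series of RR intervals roughly as follows; the resampling rate and band edges are common choices, the input data here is synthetic, and a practical implementation would use a more careful spectral estimate.

import numpy as np

def lf_hf_ratio(rr_intervals, fs=4.0):
    """Estimate the LF/HF power ratio from RR intervals given in seconds."""
    t = np.cumsum(rr_intervals)                     # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)         # uniform resampling grid
    tachogram = np.interp(grid, t, rr_intervals)    # evenly sampled RR series
    tachogram = tachogram - tachogram.mean()
    psd = np.abs(np.fft.rfft(tachogram)) ** 2
    freqs = np.fft.rfftfreq(len(tachogram), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()   # low-frequency band power
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()   # high-frequency band power
    return lf / max(hf, 1e-12)

# Synthetic RR data with variation injected at roughly 0.125 Hz, i.e. in the LF band,
# so the ratio printed here is expected to be well above 1.
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(300))
print(lf_hf_ratio(rr))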
The processor 110 may be adapted to change the support information in response to the wearer's physiological and/or psychological state measured by the sensor. Such changes may include: 1) stopping providing support information to the wearer; 2) providing additional support information to the wearer; 3) changing the transmission speed of the support information provided to the wearer; and/or 4) changing the format of the support information provided to the wearer. For instance, if a wearer has done a wrong action, the anxiety/stress level of the wearer may be dramatically increased. If the value of the anxiety/stress level is above a predefined maximum value, then the support information will be stopped by the processor 110. Alternatively, if the value of the anxiety/stress level is below the predefined maximum value, but still above a certain predefined value, then the support information may be repeatedly provided to the wearer at a slower speed in order to relax the wearer, or audio/video support information in addition to the text information may be provided to the wearer. Based on the audio/video support information, the wearer may have the feeling that he/she is currently doing the task together with someone.
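By way of non-limiting illustration, the adaptation of the support information to the measured anxiety/stress level could be sketched as follows, with an invented numeric stress scale and placeholder thresholds:

STRESS_MAX = 0.8       # above this value, stop providing support information
STRESS_ELEVATED = 0.5  # above this value, repeat more slowly and add richer guidance

def adapt_support(stress_level, support):
    """Adjust the support information settings to the measured stress level."""
    if stress_level > STRESS_MAX:
        return None                                      # stop providing support
    if stress_level > STRESS_ELEVATED:
        support["playback_speed"] = 0.5                  # repeat at a slower speed
        support["formats"] = ["text", "audio", "video"]  # add audio/video guidance
        support["repeat"] = True
    return support

print(adapt_support(0.6, {"playback_speed": 1.0, "formats": ["text"], "repeat": False}))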
In an embodiment, the wearable device 100 may be arranged to detect a user instruction and to trigger an operation in response to the detected user instruction, e.g. using at least one further sensor including the motion sensor 124 in case the user instruction is a head motion, or by using the image sensor 114 or a camera to capture an image of a gesture-based instruction made by the wearer. Other suitable sensors for such gesture or motion capturing will be apparent to the skilled person. The processor 110 may be arranged to recognize a gesture or motion made by its wearer from the captured sensor data and to interpret the recognized gesture or motion as an instruction, for instance to identify a task performed by the wearer of the wearable device 100, e.g., reading, computing, and so on. Non-limiting examples of such a motion for instance include a turn or nod of the wearer's head. Non-limiting examples of such a gesture for instance include a hand or finger gesture in the field of view through the wearable device 100, which may be detected in an image captured with the image sensor 114.
Alternatively or additionally, the at least one further sensor may include the sound sensor 116 to detect a spoken instruction, wherein the processor 110 may be communicatively coupled to the further sensor in order to process the sensor data and detect the spoken instruction.
The at least one further sensor may additionally or alternatively include an input sensor, e.g. a button or the like for facilitating the wearer of the wearable device 100 to select the user instruction from a list of options. Such list of options for instance may be displayed on the at least one transparent display module 106 of the wearable device 100, when present.
The wearable device 100 may further include the user interface 108 for receiving input from the user. User interface 108 may include, for example, a touchpad, a keypad, buttons, a microphone, and/or other input devices. The processor 110 may control at least some of the functioning of wearable device 100 based on input received through user interface 108. In some embodiments, the at least one further sensor may define or form part of the user interface 108.
Although Fig. 2 shows various components of wearable device 100, i.e., wireless communication interfaces 102 and 104, user interface 108, processor 110, data storage 112, image sensor 114, sound sensor 116, the temperature sensor 118, the light sensor 120, the sensor 122 for measuring the wearer's physiological and/or psychological state, the motion sensor 124, as being separate from the at least one display module 106, one or more of these components may be mounted on or integrated into the at least one display module 106. For example, the image sensor 114 may be mounted on a see-through display module 106, the user interface 108 could be provided as a touchpad on a see-through display module 106, processor 110 and data storage 112 may make up a computing system in a see-through display module 106, and the other components of wearable device 100 could be similarly integrated into a see-through display module 106.
Alternatively, the wearable device 100 may be provided in the form of separate devices that can be worn on any part of the body or carried by the wearer, apart from at least the one display module 106, which typically will be mounted on the head. The separate devices that make up wearable device 100 may be communicatively coupled together in either a wired or wireless fashion.
Fig. 3 schematically depicts an embodiment of the wearable device. In operation, the video/audio information as well as the physical activity information of the wearer are sensed by the image sensor 114, the audio sensor 116 and the motion sensor 124, respectively, and are sent together with other sensed data to the processor 110. The processor 110 can be located either in a server or the cloud, or in the wearable device 100. The processor 110 further receives the wearer's profile data from a database or data storage 112. Such data may include the wearer's hobbies, daily routines, etc. Based on the received sensed information together with the wearer's profile data, the processor 110 may recognize the current activity that the user is currently performing. Finally, based on the recognized current activity, the processor 110 provides a sequential set of instructions as the support information to the wearer to support the wearer in performing an activity. The instructions may be adjusted on-the-fly during execution of the task based on the performance of the wearer.
Figs. 4A-4B schematically depict an embodiment of the wearable device, further showing an example of the wearer's support information displayed on the display module 106 for the wearer's activity of making coffee. In Fig. 4A, an aspect of an embodiment of the wearable device 100 related to recognizing the activity to be performed is disclosed. After the wearer wakes up, the wearable device 100 is worn and initialized by the wearer. Consequently, the step of listing possible activities 201 is started. The wearer's daily routine and personal hobby information are transmitted to the processor 110. Based on such information, a list of possible activities that the wearer normally does in the morning is displayed on the display module 106, such as "use toilet", "brush teeth", "make coffee", "eat food", "cook meal", etc. Afterwards, the procedure of determining the activity that the wearer intends to perform based on the sensed wearer's behavior and position 211 is started. This procedure may comprise several sub-steps:
In step 213, when it has been sensed by the one or more sensors 114, 116, 118, 120, 122, 124 that the wearer is walking into the kitchen, the list of activities is narrowed by the processor 110 to only include the activities that the wearer can perform in the kitchen.
Optionally, a probability factor 170 of each activity is also displayed simultaneously. Such a probability factor can be learned by the system from the wearer's profile, such as the wearer's habits and history. For example, after the wearer has worn the wearable device 100 for many days, the wearable device may automatically learn the habits and life patterns of the wearer, e.g. after the wearer wakes up in the morning, there is an 80% probability that the wearer will go to the kitchen for a cup of coffee, and a 20% probability that the wearer will go to the toilet. In this way, the wearable device 100 may be a self-learning system, which is personalized for individual users.
In step 215, when it has been sensed by the one or more sensors 114, 116, 118, 120, 122, 124 that the wearer is picking up a water boiler 200, the list of activities is narrowed by the processor 110 to only include the activities that can be performed with the water boiler. Like the previous step 213, based on the wearer's habits and history, it is determined by the processor 110 that it is highly likely (90% probability) that the wearer is going to prepare a coffee. Accordingly, support information related to "Coffee Preparation Task Assistance" is initiated and displayed on the display module 106.
In step 217, when it has been sensed by the one or more sensors 114, 116, 118, 120, 122, 124 that the wearer is picking up a Nescafe® jar, it is determined by the processor 110 that it is now 100% certain that the wearer will prepare a coffee.
Note that if the wearer's desired/potential activity is not displayed on the display module 106 in step 213, namely if such activity cannot be performed in the kitchen, the wearer may be allowed to input the name of the desired activity via the user interface 108. Consequently, the wearer will be guided by the wearable device 100 to the room where the task needs to be performed.
In Fig. 4B, an aspect of an embodiment of the wearable device 100 related to recognizing and tracking the individual steps during the activity performance and guiding the wearer based on the activity-related parameters is disclosed.
After the "Coffee Preparation task Assistance" is initiated in step 215, the location of the Nescafe coffee jar is displayed to the wearer in step 231. Meanwhile, the wearer's reactions to this information are observed by the one or more sensors 114, 116, 120, 122, 124, such as whether the wearer's vital signs, e.g. heartbeat, are unchanged, and whether the wearer extends his hand toward the jar and picks it up.
In step 233, after it is verified by the processor 110 that the jar held by the wearer is indeed a coffee jar, the support information with respect to how to open the coffee jar is displayed. The verification can be done by capturing an image containing the content information of the jar with the image sensor 114, or via laser scanners or a barcode scanner in the wearable device 100. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the coffee jar is opened by the wearer or not. The next instructions will be provided to the wearer only if the coffee jar is opened.
In step 235, after it is verified by the processor 110 that the coffee jar has been opened by the wearer, the instruction related to the location of the spoons is displayed to the wearer. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the spoon is picked up by the wearer or not. The next instructions will be provided to the wearer only if the spoon is picked up by the wearer.
In step 237, after it is verified by the processor 110 that the spoon has been picked up by the wearer, the instruction related to the location of the cup, preferably the wearer's favorite cup, is displayed to the wearer. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the cup is picked up by the wearer or not. The next instructions will be provided to the wearer only if the cup is picked up by the wearer.
In step 239, after it is verified by the processor 110 that the cup has been picked up by the wearer, the instruction related to how to use the spoon to get the coffee and how to put the coffee into the cup is displayed to the wearer. Meanwhile, the one or more sensors 114, 116, 118, 120, 122, 124 further track whether the coffee is indeed put into the cup by the wearer or not. The next instructions will be provided to the wearer only if the coffee has been put into the cup by the wearer. In step 241, after it is verified by the processor 110 that the coffee has been put into the cup by the wearer, the instruction related to how to pour the water is displayed to the wearer.
Alternatively, the instruction may also relate to the duration of pouring the water into the cup. Such information is based on the measurement of the volume of the cup and the current pouring speed, which is sensed/calculated from the video/audio data derived from the image sensor 114 and the audio sensor 116. One example of calculating the pouring speed based on the video/audio data can be found in Rafa Absar et al., "Usability of Non-Speech Sounds in User Interfaces", Proceedings of the 14th International Conference on Auditory Display, Paris, France, June 24-27, 2008.
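By way of non-limiting illustration, once the cup volume and the current pouring speed have been estimated, the remaining pouring duration reduces to a simple division; the numbers below are only illustrative.

def pouring_duration_s(cup_volume_ml, pouring_speed_ml_per_s, fill_fraction=0.9):
    """Time to fill the cup to a safe level at the current pouring speed."""
    return (cup_volume_ml * fill_fraction) / pouring_speed_ml_per_s

print(pouring_duration_s(250, 30))  # 7.5 s for a 250 ml cup filled to 90% at 30 ml/s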
In step 243, after it is verified by the processor 110 that the cup is full of water, based on the determined duration of the pouring as well as audio analysis of the sounds that the boiler makes, an alert message may be shown to the wearer to stop pouring water in order to prevent burns. Optionally, the wearer may be instructed by the wearable device 100 to stir and wait for the coffee to cool down.
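By way of non-limiting illustration, the step-by-step guidance of steps 231-243 can be viewed as a small state machine that only advances when the completion condition of the current step is verified; the step names, condition fields and final alert below paraphrase the example above and are not an exhaustive implementation.

# Each step pairs an instruction with a completion check on the sensed state (illustrative only).
COFFEE_STEPS = [
    ("show location of coffee jar",    lambda s: "coffee jar" in s["held_objects"]),
    ("show how to open the jar",       lambda s: s["jar_open"]),
    ("show location of the spoons",    lambda s: "spoon" in s["held_objects"]),
    ("show location of the cup",       lambda s: "cup" in s["held_objects"]),
    ("show how to put coffee in cup",  lambda s: s["coffee_in_cup"]),
    ("show how to pour the water",     lambda s: s["cup_full"]),
]

def next_instruction(step_index, sensed_state):
    """Return the instruction to display, advancing only on verified completion."""
    instruction, done = COFFEE_STEPS[step_index]
    if done(sensed_state):
        step_index += 1
        if step_index == len(COFFEE_STEPS):
            return step_index, "alert: stop pouring, stir and let the coffee cool down"
        instruction = COFFEE_STEPS[step_index][0]
    return step_index, instruction

# The wearer is holding the coffee jar, so the guidance advances to opening the jar.
state = {"held_objects": {"coffee jar"}, "jar_open": False,
         "coffee_in_cup": False, "cup_full": False}
print(next_instruction(0, state))  # (1, 'show how to open the jar')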
Note that the support information in any of the abovementioned steps is not limited to text information. Any other medium, such as video/audio information, may also be provided to the wearer for the purpose of supporting the wearer in performing an activity.
Note that a warning message may also be provided to the wearer if the wearer does not follow the instruction correctly, or if the wearer performs some activities that leave one or more devices in a wrong status. For instance, in step 235, after the wearer reacts to the instruction, opens the drawer and picks up the spoon, the drawer may not be closed due to the neglect of the wearer. Once this is detected by the one or more sensors 114, 116, 118, 120, 122, 124, it may be determined by the processor 110 that the position of the drawer is not in the correct status, which may create a dangerous situation. Subsequently, an alert may be generated by the processor 110 and shown to the user to inform him/her to close the drawer. Such warning information may be continuously provided until the dangerous situation is removed.
In addition to monitoring and analyzing each step of the activities performed by the wearer, the psychological and physiological state changes of the wearer in each step, and between two steps where a new instruction is introduced to the wearer, are observed. This is done with the purpose of understanding whether there is increased anxiety/stress (maybe as a result of not understanding the task or instructions), tiredness, lack of focus, or particular disabilities (hand tremors). Consequently, the instructions are adapted accordingly.
For example, in step 231, it may be detected that the wearer is not able to find the coffee jar correctly. The detection can be done automatically by observing, e.g., the time it takes and the increased heart rate, decreased Heart Rate Variability (HRV), increased skin conductance response, and respiration rate of the wearer. Accordingly, the location of the coffee jar is repeatedly displayed to the wearer. Meanwhile, additional support may be provided to the wearer other than showing information. For instance, a light beam 180 may be automatically provided to the wearer by the wearable device 100 in the direction of the coffee jar in order to better show the location to the wearer.
Note that the wearable device may be adapted to communicate with other further devices 200 in order to provide improved support information. For instance, in step 215, a connection can be established between the water boiler and the wearable device 100 such that the wearable device 100 may identify that the wearer is currently using the water boiler. Also, in step 233, both the coffee jar and the wearable device 100 may comprise an NFC chip such that all needed information related to the coffee jar, such as the expiration date, will be provided to the wearable device over the NFC communication channel. Another example is that, in step 241, the wearable device 100 may communicate with the water boiler and may control when the water boiler will pour the water and when it will stop pouring the water, based on analysis of the cup size and pouring speed. This may avoid hot water overflow.
The processor 110 may implement a method 400 for displaying support information as shown in the flow chart of Fig. 5. The method 400 commences in step 401, after the wearable device 100 is switched on, by initializing the display module 106, after which the method 400 progresses to step 403, in which support information with respect to a first activity that a wearer of the wearable device intends to perform is displayed; wherein the support information with respect to the first activity is determined based on an identified second activity that the wearer has performed.
Aspects of the present invention may be embodied as a wearable device, method or computer program product. Aspects of the present invention may take the form of a computer program product embodied in one or more computer-readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Such a system, apparatus or device may be accessible over any suitable network connection; for instance, the system, apparatus or device may be accessible over a network for retrieval of the computer readable program code over the network. Such a network may for instance be the Internet, a mobile communications network or the like. More specific examples (a non-exhaustive list) of the computer readable storage medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out the methods of the present invention by execution on the processor 110 may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the processor 110 as a stand-alone software package, e.g. an app, or may be executed partly on the processor 110 and partly on a remote server. In the latter scenario, the remote server may be connected to the head-mountable computing device 100 through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer, e.g. through the Internet using an Internet Service Provider.
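As a non-limiting illustration of such a split between local and remote execution, part of the analysis may be offloaded when a network connection is available; the endpoint URL and the requests-based call below are assumptions made for the sketch, not a prescribed interface.

import requests  # third-party HTTP client, assumed available on the device

REMOTE_ANALYSIS_URL = "https://example.org/api/analyze"  # placeholder endpoint

def analyze_scene(image_bytes: bytes, network_available: bool) -> dict:
    """Run a reduced analysis on the device itself, or offload the heavier
    analysis to a remote server when a connection is available."""
    if network_available:
        response = requests.post(REMOTE_ANALYSIS_URL, data=image_bytes, timeout=5)
        response.raise_for_status()
        return response.json()
    # Fallback: minimal on-device result (stubbed here).
    return {"objects": [], "source": "on-device"}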
Aspects of the present invention are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions to be executed in whole or in part on the processor 110 of the head-mountable computing device 100, such that the instructions create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable medium that can direct the head-mountable computing device 100 to function in a particular manner.
The computer program instructions may be loaded onto the processor 110 to cause a series of operational steps to be performed on the processor 110, to produce a computer-implemented process such that the instructions which execute on the processor 110 provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. The computer program product may form part of a head-mountable computing device 100, e.g. may be installed on the head-mountable computing device 100.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps other than those listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements. In the device claim enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. A wearable device for a cognitively impaired user, comprising:
at least one feedback module for providing feedback information to the cognitively impaired user;
one or more sensors for sensing data from the cognitively impaired user; and
a processor adapted to:
initialize the feedback module;
identify a first movement of the cognitively impaired user based on the data sensed by the one or more sensors;
control the at least one feedback module to provide support information for supporting a second movement that the cognitively impaired user of the wearable device intends to perform subsequently after the first movement;
wherein the support information for supporting the second movement is determined based on the identified first movement of the cognitively impaired user.
2. The wearable device according to claim 1, wherein the feedback module is a display module; and the processor is adapted to:
initialize the display module;
control the at least one display module to provide support information with respect to a first activity that the cognitively impaired user of the wearable device intends to perform.
3. The wearable device according to claim 1 or 2, wherein the one or more sensors comprise at least one of:
an imaging sensor for capturing one or more images indicative of the current view of the user and identifying one or more objects in the captured one or more images;
an audio sensor for capturing the sounds present in the environment or in the proximity of the user;
a temperature sensor for measuring the temperature of the user;
a light sensor for measuring the light in the environment or in the proximity of the user; or
a motion sensor, such as a gyroscope or an accelerometer, for identifying the current physical activity of the user,
wherein the processor is adapted to identify the first movement of the cognitively impaired user by evaluating the changes in each type of sensed data based on predefined rules comprising thresholds for each type of sensed data.
4. The wearable device according to claim 2, wherein the processor is further adapted to:
display a list of a plurality of activities that a user of the wearable device intends to perform;
limit the plurality of activities displayed on the display module based on the identified second activity that the user of the wearable device is performing.
5. The wearable device according to claim 4, wherein the processor is further adapted to rank the plurality of activities intended to be performed by the user, and wherein the ranking is based on the second activity that the wearer is performing, or based on a predefined user profile.
6. The wearable device according to any of claims 1-5, wherein the support information is predefined on a step-by-step basis, and wherein support information related to a next step of activity is provided only if the current step of activity is completely performed by the user.
7. The wearable device according to claim 1, wherein the one or more sensors further comprise a sensor adapted for measuring the user's physiological and/or psychological state.
8. The wearable device according to claim 7, wherein the processor is further adapted to change the predefined support information in response to the user's physiological and/or psychological state measured by the sensor.
9. The wearable device according to claim 8, wherein the processor is further adapted to change the predefined support information by one or more of the following options:
stopping providing support information to the user;
providing additional support information to the user;
changing the transmission speed of the support information provided to the user;
changing the format of the support information provided to the user.
10. The wearable device according to any of claims 1-9, wherein the wearable device further comprises a transmitter adapted to receive signals and/or data from one or more devices, wherein the one or more devices are in the environment or in the proximity of the user.
11. The wearable device according to claim 1, wherein the wearable device further comprises a speaker, and wherein the processor is further configured for controlling the speaker to provide support information with respect to the first activity that the user of the wearable device intends to perform.
12. A method of providing support information on the wearable device of any of claims 1-11, the method comprising:
initializing a feedback module;
sensing data from the cognitively impaired user by one or more sensors attached to the wearable device;
identifying a first movement of the cognitively impaired user based on the data sensed by the one or more sensors;
providing support information with respect to a second movement that a wearer of the wearable device intends to perform subsequently after the first movement via the feedback module;
wherein the support information with respect to the second movement is determined based on the identified first movement of the cognitively impaired user.
13. The method of displaying information on the wearable device according to claim 12, wherein the method further comprises:
changing the support information in response to the user's physiological and/or psychological state measured by a sensor of the wearable device.
14. A computer program product comprising program code means for causing a computer to carry out the steps of the method as claimed in claim 12 when said computer program is carried out on the computer.
PCT/EP2016/067224 2015-07-29 2016-07-20 Wearable device, method and computer program product WO2017016941A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15178797.5 2015-07-29
EP15178797 2015-07-29

Publications (1)

Publication Number Publication Date
WO2017016941A1 true WO2017016941A1 (en) 2017-02-02

Family

ID=53794035

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/067224 WO2017016941A1 (en) 2015-07-29 2016-07-20 Wearable device, method and computer program product

Country Status (1)

Country Link
WO (1) WO2017016941A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014116826A1 (en) * 2013-01-24 2014-07-31 The Trustees Of Columbia University In The City Of New York Mobile, neurally-assisted personal assistant
US20140222462A1 (en) * 2013-02-07 2014-08-07 Ian Shakil System and Method for Augmenting Healthcare Provider Performance
WO2014140830A1 (en) * 2013-03-15 2014-09-18 Orcam Technologies Ltd. Apparatus and method for hierarchical object identification using a camera on glasses
US20140347265A1 (en) * 2013-03-15 2014-11-27 Interaxon Inc. Wearable computing apparatus and method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3360466A1 (en) * 2017-02-08 2018-08-15 Koninklijke Philips N.V. A method and apparatus for monitoring a subject
WO2018146184A1 (en) * 2017-02-08 2018-08-16 Koninklijke Philips N.V. A method and apparatus for monitoring a subject
US20210287014A1 (en) * 2019-09-17 2021-09-16 Battelle Memorial Institute Activity assistance system
WO2022060432A1 (en) * 2019-09-17 2022-03-24 Battelle Memorial Institute Activity assistance system
US11798272B2 (en) 2019-09-17 2023-10-24 Battelle Memorial Institute Activity assistance system
EP3876025A1 (en) * 2020-03-02 2021-09-08 Siemens Aktiengesellschaft Obstacle detection and collision warning assembly
WO2021175641A1 (en) * 2020-03-02 2021-09-10 Siemens Aktiengesellschaft Arrangement for obstacle detection and collision warning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16741922; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16741922; Country of ref document: EP; Kind code of ref document: A1)