WO2023217745A1 - A system and method for assessing a health status of a user based on interactions with lighting control interfaces - Google Patents


Info

Publication number
WO2023217745A1
WO2023217745A1 (PCT/EP2023/062209; EP2023062209W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
lighting control
health status
indicative
speed
Prior art date
Application number
PCT/EP2023/062209
Other languages
French (fr)
Inventor
Berent Willem MEERBEEK
Dzmitry Viktorovich Aliakseyeu
Original Assignee
Signify Holding B.V.
Priority date
Filing date
Publication date
Application filed by Signify Holding B.V.
Publication of WO2023217745A1


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6889 Rooms
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the invention relates to a method for assessing a health status of a user based on interactions with lighting control interfaces.
  • the invention further relates to a system for assessing a health status of a user.
  • the invention further relates to a computer program for assessing a health status of a user.
  • the inventors have realized that when the sensorimotor skills of a user decline (e.g., decrease in tactile acuity, motor slowing, deterioration of haptic performance and fine manipulative movements), it typically takes the user longer to perform certain functions. Requesting a user to perform such predetermined functions related to sensorimotor abilities may have a negative impact on the user and may result in false measurements because the user is aware that his/her sensorimotor skills are being tested. It is therefore an object to provide an unobtrusive way to identify a health status of a user.
  • the object is achieved by a method for assessing a health status of a user based on interactions with a plurality of lighting control interfaces, wherein the lighting control interfaces are configured to control one or more lighting devices based on the interactions, the method comprising: receiving first data indicative of a first lighting control action by the user at a first physical location; receiving second data indicative of a consecutive second lighting control action by the user at a second physical location; determining, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action; identifying the health status of the user by comparing the time duration to one or more reference time durations indicative of respective health statuses.
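The claimed sequence of steps (receive two action events, determine the elapsed time, compare against labelled references) can be sketched in a few lines of Python. All names and reference values below are hypothetical; the patent does not prescribe an implementation:

```python
from datetime import datetime

def assess_health_status(first_action_ts, second_action_ts, reference_durations):
    """Compare the duration between two consecutive lighting control
    actions against reference durations labelled with health statuses."""
    duration = (second_action_ts - first_action_ts).total_seconds()
    # Pick the health status whose reference duration is closest.
    status, _ = min(reference_durations.items(),
                    key=lambda kv: abs(kv[1] - duration))
    return status

references = {"typical": 30.0, "slowed mobility": 90.0}  # seconds, illustrative
t1 = datetime(2023, 5, 9, 22, 0, 0)   # lights off, living room
t2 = datetime(2023, 5, 9, 22, 0, 40)  # lights on, bedroom
print(assess_health_status(t1, t2, references))  # -> typical
```

In practice the reference durations could come from population data or from the user's own history, as discussed below.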
  • the lighting system is a system that receives multiple lighting control actions during a day, and it may therefore be used to monitor the health status of a user.
  • Certain sequences of interactions with lighting control interfaces typically follow each other. For example, actions on lighting control interfaces in adjacent rooms may be paired (e.g., switching the lights off in the living room may be followed by switching the lights on in the bedroom within a period of time). Only consecutive interactions within a predetermined time period may be considered, as only such consecutive actions of the user are relevant for identifying the health status of the user.
  • it may be relevant to monitor the time duration between switching the lights off in the living room and switching the lights on in the bedroom if this duration is less than a predetermined time period, say 15 minutes, as this duration may be indicative of the movement of the user between the first (living room) and the second (bedroom) physical location.
  • it may not be relevant to monitor the time duration between switching the lights off in the living room and switching the lights on in the bedroom if this duration is more than the predetermined time period, as this may imply that the user performed other actions between interacting with the lighting control interfaces; thus, this time duration may not be indicative of the duration of movement of the user between the first and the second physical locations.
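A minimal sketch of this pairing rule, assuming a hypothetical event log of (timestamp, location) tuples and the 15-minute window mentioned above:

```python
from datetime import datetime, timedelta

MAX_GAP = timedelta(minutes=15)  # predetermined time period (illustrative)

def paired_durations(actions):
    """Yield durations (seconds) between consecutive lighting control
    actions at different locations that fall within the predetermined
    time period; longer gaps likely include unrelated activity and are
    discarded."""
    for (t1, loc1), (t2, loc2) in zip(actions, actions[1:]):
        if loc1 != loc2 and t2 - t1 <= MAX_GAP:
            yield (t2 - t1).total_seconds()

actions = [
    (datetime(2023, 5, 9, 22, 0, 0), "living room"),   # lights off
    (datetime(2023, 5, 9, 22, 0, 45), "bedroom"),      # lights on
    (datetime(2023, 5, 10, 7, 30, 0), "bathroom"),     # next morning: gap too long
]
print(list(paired_durations(actions)))  # -> [45.0]
```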
  • the time elapsed between consecutive interactions of the user with lighting control interfaces can be determined and thus, the speed of movement of the user may be identified based on the elapsed time and distance between the physical locations.
  • the speed of movement is indicative of the health status of the user, i.e., the speed of movement may be indicative of the sensorimotor skills of the user and/or a vision impairment.
  • the health status of the user can be identified by comparing the time elapsed between the consecutive lighting control actions (or the speed of movement of the user) to one or more reference time durations (or reference speed of movements) indicative of health status (e.g., reference speed of movements of the general population, reference speed of movements of a sample population with similar characteristics to the user, historic data (past speed of movements) of the user, etc.).
  • the one or more reference speed of movements may be past speed of movements (or respective past time durations), determined over an observation period, between past first lighting control actions and respective past second lighting control actions of the user, and the identifying of the health status of the user may comprise: identifying the health status of the user by determining a deviation of the speed of movement (or time duration) from the past speed of movements (or past time durations respectively).
  • the past speed of movements (or respective time durations for the same distance) may be stored as reference speed of movements (or reference time durations respectively) indicative of health statuses.
  • An increase in the time elapsed between consecutive lighting control actions may, for instance, be indicative of an increase in movement duration of the user, over a period of time.
  • Information about the lighting level may further be used to determine or distinguish the type of impairment. For example, certain vision impairments, e.g., cataract, macular degeneration, etc., are prominent at low light levels. For example, an increase in the time elapsed between consecutive interactions of the user with the lighting control interfaces that is present only at low light levels may be indicative of a vision impairment rather than a decline in sensorimotor skills.
  • the method may include generating a prediction for a future time duration, wherein the prediction is based at least in part on the past time durations as input into a predictive model.
  • by predicting a future time duration based on past time durations of the user, a future health status of the user may be identified by comparing the future time duration to one or more reference time durations indicative of respective health statuses.
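As an illustration of such a predictive model, one simple choice (not mandated by the text) is a least-squares linear trend fitted to the past durations and extrapolated one step ahead:

```python
def predict_future_duration(past):
    """Fit a linear trend to past durations (one value per observation)
    and extrapolate one step ahead -- a minimal stand-in for the
    predictive model described above."""
    n = len(past)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(past) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, past))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope * n + intercept  # predicted duration at the next step

# Durations (seconds) of the same bedroom-to-bathroom movement over weeks.
past = [30.0, 31.0, 33.0, 34.0, 36.0]
predicted = predict_future_duration(past)
print(round(predicted, 1))  # -> 37.3
```

The predicted value could then be compared against the same reference durations used for the current health status.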
  • the health status of the user may further be identified based on a distance between the first physical location and the second physical location.
  • the speed of movement of the user may be determined. Determining the speed of movement over a prolonged observation period may be used to identify the development of motor slowing by the user.
  • the distance between the first physical location and the second physical location may be determined based on a floor plan designating said first and second physical locations and distance measurements between the first and second physical locations.
  • the distance and/or approximate location of lighting control interfaces can be estimated based on a floorplan (e.g., provided by the user) and/or based on the rooms to which they are assigned (kitchen, living room, bedroom, e.g., as picked by the user during commissioning in an app). For example, if the first lighting control interface is located on the ground floor of a building and the second lighting control interface is located on a different floor, the time duration between the first and second lighting control action is indicative of the ability of the user to climb stairs (speed of movement while climbing stairs). A decline in the ability of the user to climb the stairs may be indicative of sensorimotor skill decline and/or vision impairment.
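A sketch of the speed computation from floor-plan distances; the distance table and location names below are illustrative only:

```python
# Illustrative floor-plan distances (metres) between lighting control
# interface locations; in practice these could be derived from a
# user-provided floor plan or from room assignments made during
# commissioning.
DISTANCES = {("living room", "bedroom"): 8.0,
             ("ground floor", "first floor"): 12.0}  # includes the stairs

def speed_of_movement(loc1, loc2, duration_s):
    """Speed (m/s) of the user moving between two interface locations,
    i.e. floor-plan distance divided by the elapsed time."""
    distance = DISTANCES.get((loc1, loc2)) or DISTANCES.get((loc2, loc1))
    return distance / duration_s

print(speed_of_movement("living room", "bedroom", 10.0))  # -> 0.8
```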
  • the distance between the first physical location and the second physical location may be determined based on communication signals transmitted between devices located in the first and second physical location.
  • for example, the distance may be determined using RF signal strength analysis, e.g., Zigbee RSSI values, Zigbee neighbor tables, or Bluetooth beaconing/proximity.
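One common way to turn such RF measurements into a distance estimate is the log-distance path-loss model; the constants in the sketch below are illustrative, and real Zigbee/Bluetooth deployments would need per-environment calibration:

```python
def distance_from_rssi(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate distance (m) from received signal strength using the
    log-distance path-loss model: RSSI = TxPower - 10 * n * log10(d).
    tx_power_dbm is the expected RSSI at 1 m; both constants are
    environment-dependent and purely illustrative here."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

# A 20 dB drop below the 1 m reference corresponds to ~10 m with n = 2.
print(round(distance_from_rssi(-79.0), 1))  # -> 10.0
```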
  • the position or relative location may further be used to identify the distance between the physical location of the first lighting control interface at the moment in time of the first lighting control action and the physical location of the second lighting control interface at the moment in time of the second lighting control action by the user.
  • the method may comprise dynamically determining the distance between the first and the second physical location at time instances that the user interacts with the first and the second lighting control interface.
  • a lighting control interface may be a mobile device that dynamically changes location over time. By dynamically (at different moments in time) determining the distance between the first and the second physical location of the lighting control interfaces at the time instance that the user interacts with the first and respectively the second lighting control interface, the speed of movement of the user may be dynamically determined even when the lighting control interface dynamically changes location over time.
  • the method may further comprise selecting the first lighting control interface and the second lighting control interface from a plurality of lighting control interfaces based on information indicative of the locations of the lighting control interfaces.
  • a lighting system may comprise three or more lighting control interfaces.
  • a lighting system may receive interactions with lighting control interfaces from multiple users.
  • the health status of the user may further be identified based on a user profile.
  • Data indicative of a user profile may for example comprise profile-specific labels/tags of personal lighting control interfaces, profile-specific patterns of activity indicative of different users, types of lighting control interfaces used by different users (e.g., a user only uses a mobile device, or a user only uses a rotary dimmer), or data indicative of the personal health condition of different users (e.g., Parkinson's disease, Alzheimer's disease, etc.).
  • the health status of the user may be identified for that particular user based on the user profile. For example, only interactions with profile-specific personal lighting control interfaces (e.g., lighting control interfaces labeled after a user) may be considered to determine the time duration between consecutive lighting control actions for that user.
  • the health status of the user may further be identified based on context data, wherein the context data may be data indicative of an activity of the user from a set of predetermined activities indicative of health statuses.
  • the user may interact with a first lighting control interface to select a cooking scene and as a result the predetermined user activity “cooking” may be inferred to occur.
  • the user may further interact with a second lighting control interface to select a different scene (e.g., dinner scene).
  • the time duration of the predetermined cooking activity can be inferred based on the time elapsed between the first and the consecutive second lighting control action. If the user develops sensorimotor skill decline, it may take longer to cook dinner, and this can be observed from data over time.
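The activity-duration inference described above might look as follows; the scene names and the mapping from scenes to activities are hypothetical:

```python
from datetime import datetime

# Scenes that mark the start and end of a predetermined activity
# (illustrative mapping; actual scene names depend on the installation).
ACTIVITY_BOUNDS = {"cooking": ("cooking scene", "dinner scene")}

def activity_duration(activity, events):
    """Infer how long an activity took (minutes) from the time elapsed
    between the scene selection that starts it and the one that ends it."""
    start_scene, end_scene = ACTIVITY_BOUNDS[activity]
    start = next(t for t, s in events if s == start_scene)
    end = next(t for t, s in events if s == end_scene)
    return (end - start).total_seconds() / 60

events = [(datetime(2023, 5, 9, 18, 0), "cooking scene"),
          (datetime(2023, 5, 9, 18, 45), "dinner scene")]
print(activity_duration("cooking", events))  # -> 45.0
```

A gradual increase of this duration over weeks could then feed into the same deviation analysis as the movement durations.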
  • context data may, for example, relate to activities such as cooking, ironing, taking a shower, etc.
  • the method may comprise determining a change in the health status of the user and providing a notification via a user interface indicative of the change in the health status. For example, if there is a change in the health status of the user, e.g., a decline or improvement in the sensorimotor skills of the user, the user or a caregiver may be notified by a message.
  • the method may comprise outputting aggregations of this decline (or improvement) data, for example, on a dashboard with key sensorimotor skill indicators in a numeric or graphical form. Such a dashboard could be presented in a digital user interface (e.g., a web user interface).
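A sketch of the kind of aggregation such a dashboard could display, here a per-week mean duration and the relative change across the observation period (all numbers illustrative):

```python
from statistics import mean

def weekly_indicator(durations_by_week):
    """Aggregate per-movement durations into one key indicator per week,
    plus the percentage change between the first and last week -- the
    kind of numeric summary a dashboard could display."""
    means = {week: round(mean(ds), 1) for week, ds in durations_by_week.items()}
    weeks = sorted(means)
    change = (means[weeks[-1]] - means[weeks[0]]) / means[weeks[0]]
    return means, round(100 * change, 1)

data = {"2023-W18": [30.0, 32.0], "2023-W19": [34.0, 36.0]}
print(weekly_indicator(data))  # -> ({'2023-W18': 31.0, '2023-W19': 35.0}, 12.9)
```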
  • the method may further include controlling the one or more lighting devices in response to the change in the health status of the user.
  • by controlling the one or more lighting devices in response to the change in the health status of the user, the lighting system may better match the health status of the user.
  • the object is achieved by a system for assessing a health status of a user based on interactions with lighting control interfaces, wherein the lighting control interfaces are configured to control one or more lighting devices based on the interactions, comprising: a first lighting control interface configured to generate first data indicative of a first lighting control action by the user at a first physical location; a second lighting control interface configured to generate second data indicative of a consecutive second lighting control action by the user at a second physical location; a controller configured to receive the first and second data, and further configured to determine, based on the first and second data, a time duration between the first lighting control action and the consecutive second lighting control action; and identify the health status of the user by comparing the time duration to one or more reference time durations indicative of health statuses.
  • the object is achieved by a computer program that is configured to perform a method for assessing the health status of a user based on interactions with lighting control interfaces configured to control one or more lighting devices, the method comprising: receiving first data indicative of a first lighting control action by the user at a first physical location; receiving second data indicative of a consecutive second lighting control action by the user at a second physical location; determining, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action; identifying the health status of the user by comparing the time duration to one or more reference time durations indicative of health statuses.
  • the system, method and computer program product may have similar and/or identical embodiments and advantages as the above-mentioned method.
  • Fig. 1 shows schematically a block diagram of a system for assessing a health status of a user
  • Fig. 2 shows schematically a flow diagram of a method for assessing a health status of a user
  • Fig. 3 shows an example interaction with a first and second lighting control interface
  • Fig. 4 shows an example interaction with a first and second lighting control interface
  • Fig. 5 shows schematically a second embodiment of a method for assessing a health status of a user.
  • Fig. 1 shows an example of a system 100 for assessing a health status of a user based on interactions with lighting control interfaces 102, 104.
  • the system 100 comprises a first lighting control interface 102 configured to control one or more lighting devices 132 and a second lighting control interface 104 configured to control one or more lighting devices 134.
  • the lighting control interfaces 102, 104 may for example be switches to turn lighting on/off, dim up/down, or set a lighting scene.
  • the lighting control interfaces 102, 104 may reside on a user device (e.g., mobile phone, tablet, etc.) configured to control the one or more lighting devices 132, 134.
  • the lighting control interfaces 102, 104 may also be in the form of remote-control devices, for example with buttons for selecting light characteristics, or buttons of a touch sensitive screen such as a touch sensitive screen of a smart phone, tablet, etc.
  • the lighting control interfaces 102, 104 may comprise drive/sense circuitry to detect a state change of occupancy, e.g., audio, PIR sensor, etc.
  • the lighting devices 132, 134 are configured to provide general illumination, such as ambient or functional illumination and may include light sources of different types (e.g., incandescent lamps, fluorescent lamps, and/or LED light sources) and can be of any type (e.g., table lamps, floor lamps, ceiling lamps etc.).
  • the system 100 further comprises at least one data processor or controller 106.
  • the system may further comprise at least one data repository or storage or memory 108 for storing computer program code instructions.
  • the controller 106 may be communicatively coupled to the cloud 120.
  • the controller 106 may be comprised in a central device (e.g., a smartphone, personal computer, a hub), a web-based portal, a combination of a central device and a web-based portal, etc.
  • the controller 106 may be communicatively coupled to a memory module 108.
  • the first lighting control interface 102 is configured to generate first data indicative of a first lighting control action by the user at a first physical location 152.
  • the lighting control action may include, but is not limited to, turning a lighting device on, turning a lighting device off, dimming the light, setting a lighting device to a light intensity level, selecting a color for a lighting device, selecting a lighting scene for a lighting device, or some other lighting control action.
  • the second lighting control interface 104 is configured to generate second data indicative of a consecutive second lighting control action by the user at a second physical location 154.
  • the lighting control interfaces may be any types of lighting control interfaces configured to receive user inputs indicative of respective lighting control actions.
  • the lighting control interfaces may, for example, be light switches, presence sensors, mobile devices, voice interfaces, etc.
  • the lighting control interfaces may be comprised in separate devices, or the lighting control interfaces may be an interface of a single device.
  • the first and second lighting control actions are provided by the user at respective (different) physical locations. These physical locations may, for example, be the location of lighting control interfaces in different rooms in a building, the location of lighting control interfaces in different areas in a room, the location of lighting control interfaces in different outdoor locations, etc.
  • a user may thus provide the first lighting control action at the first physical location 152 (e.g., a location of a lighting control interface in the bedroom), and the second lighting control action at the second physical location 154 (e.g., a location of a lighting control interface in the bathroom) to control the one or more lighting devices, which may be located at these locations.
  • the controller 106 is configured to receive 202 the first data and receive 204 the second data, and determine 206, based on the first and second data, a time duration between the first lighting control action and the consecutive second lighting control action.
  • the first data may for example comprise a timestamp of the moment that the first lighting control action has been provided
  • the second data may for example comprise a timestamp of the moment that the second lighting control action has been provided
  • the controller may be configured to determine the time duration based on the timestamps.
  • the controller 106 may be configured to receive inputs at the moments when the first and second lighting control actions have been provided, and determine the time duration based on a time difference between the moments.
  • the controller 106 is further configured to identify 208 the health status of the user by comparing the time duration to one or more reference time durations indicative of health statuses.
  • the time duration between the two consecutive interactions with the lighting control interfaces 102, 104 may be indicative of the time duration of the movement between the first 152 and the second physical location 154. Since the time duration of the movement between the first 152 and the second physical location 154 may be indicative of the sensorimotor skills of the user, the health status of the user may be identified from a comparison to the reference time durations of movement.
  • the one or more reference time durations may be sample reference time durations of movement based on similar sample user populations, for example reference time durations based on a similar user age group, gender, medical condition (e.g., Parkinson’s disease, vision impairment, etc.).
  • the one or more reference time durations may be past time durations, determined over an observation period, between past first lighting control actions and respective past second lighting control actions of the user.
  • Historic data of past time durations of the user may be stored in the memory module 108 as reference time durations indicative of health statuses.
  • the controller 106 may be further configured to identify the health status of the user by determining a deviation of the time duration from the past time durations.
  • the health status of the user may change over time, e.g., due to impaired vision, slowing (or acceleration) of movement, or decline (or improvement) in sensorimotor control and functioning with age (e.g., increased time to interact with the lighting control interfaces 102, 104 due to tremor); as a result, it may take longer (or shorter) for the user to move between the first 152 and the second physical location 154 and correspondingly to interact with the first and the consecutive second lighting control interface.
  • the time duration between the first and the consecutive second lighting control action may increase (or decrease) over time.
  • a variability of the time duration between consecutive lighting control actions over time may also be indicative of temporal movement variability that is associated with a decline in sensorimotor skills of the user.
  • Standard statistical inference analysis, time-series analysis (e.g., regression, moving average), or machine learning (e.g., Bayesian networks, Isolation Forest, regression trees, etc.) can be used to reveal such incidental/irregular variations (trends) in the health status of the user. Additionally, or alternatively, other techniques for anomaly detection can be used, including k-nearest neighbors, Support Vector Machines (SVM), and neural networks (e.g., LSTM). Determining deviations (e.g., by comparison, by determining variability, a moving average trend, etc.) of the time duration from the past time durations over time may be indicative of a change (improvement or decline) in the health status of the user over time.
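As a concrete example of the simpler end of these techniques, a z-score against a moving window of recent durations can flag strong deviations; the window size and threshold below are illustrative:

```python
from statistics import mean, stdev

def deviation_flags(durations, window=5, threshold=2.0):
    """Flag durations that deviate strongly from the moving average of
    the preceding window -- a z-score test against recent history, one
    of the simpler deviation-detection techniques mentioned above."""
    flags = []
    for i in range(window, len(durations)):
        history = durations[i - window:i]
        z = (durations[i] - mean(history)) / stdev(history)
        flags.append(abs(z) > threshold)
    return flags

# Stable movement durations followed by a sudden slowdown.
past = [30.0, 31.0, 29.0, 30.0, 31.0, 45.0]
print(deviation_flags(past))  # -> [True]
```

A flagged deviation could then trigger the notification or dashboard update described elsewhere in the text.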
  • a user 320 moves from a first area (e.g., a bedroom) to a second area (e.g., a bathroom). Before this movement, the user 320 may interact with the lighting control interface 302 located at physical location 312, to, for example, turn off the lighting device 332 located in the bedroom. Then, the user 320 may move to the other physical location 314, and interact with the lighting control interface 304 located there, to, for example, turn on the lighting device 334 located in the bathroom.
  • the controller 106 receives first and second data indicative of the first and respectively the second lighting control action by the user and determines, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action.
  • the time duration between the two consecutive interactions with the lighting control interfaces 302, 304 may be indicative of the time duration of the movement between the first 312 and the second physical location 314.
  • the controller 106 is configured to identify the health status of the user by comparing the time duration of the movement of the user to one or more reference time durations indicative of health statuses. For example, the controller 106 may compare the time duration of movement to past time durations of the same movement (e.g., past time durations, determined over a different observation period, between past first lighting control actions with lighting control interface 302 and respective past second lighting control actions with lighting control interface 304).
  • the controller 106 may further be configured to identify the health status of the user based on a distance between the first physical location 152 and the second physical location 154. Using the distance between the physical locations 152, 154 of the lighting control interfaces 102, 104 and the time elapsed between the first and the consecutive second lighting control action, the speed of movement of the user may be determined by the controller 106 (e.g., speed equals the distance between physical locations 152, 154 divided by the time elapsed between the first and the consecutive second lighting control action). The speed of movement or walking gait may be indicative of the health status of the user. For instance, a change in the speed of movement over time (deviation from reference past speeds of movement of the user) may be indicative of impaired mobility or impaired vision.
  • a change in the speed of movement may even be indicative of an onset or progression of a disease; for example, Parkinson's patients have difficulties generating rhythmic movements such as walking and show a progressive diminishing of the speed of movement.
  • a deviation of the average speed of movement or gait speed compared to a reference speed of movement for a representative user sample population with similar characteristics as the user (e.g., similar age, gender, etc.) may be indicative of the health status of the user.
  • a slow(er) walking speed or gait speed compared to the speed of movement of the user sample population may be indicative of and/or associated with dementia or cognitive impairment of the user.
  • the distance between the first physical location 152 and second physical location 154 may be determined based on a layout plan designating the physical locations and distance measurements between the physical locations.
  • the layout plan may relate to a spatial arrangement of the lighting control interfaces. For instance, for a building lighting system of a building, the layout plan may include a floor plan comprising the physical location of lighting control interfaces in a parking area, a floor, a corridor and/or a room, etc.
  • the controller 106 may be configured to determine the distance between the first physical location 152 and second physical location 154 based on the floor plan.
  • the controller 106 may be configured to determine the distance between the first physical location 152 and second physical location 154 based on communication signals transmitted from devices located in the first 152 and second physical location 154. For example, any method for determining the location of a device based on signals emitted from devices located in the first 152 and second physical location 154 (e.g., RF-based positioning, positioning based on optical signals, etc.) can be used to identify the physical locations 152, 154 of the lighting control interfaces 102, 104.
  • the physical location 152, 154 of the lighting control interfaces 102, 104 may be determined in respect to the relative distance of the lighting control interfaces 102, 104 from the lighting devices 132, 134, for example based on signals emitted from the lighting control interfaces 102, 104 to the lighting devices 132, 134.
  • the controller 106 may determine, based on the position or relative location of the lighting control interfaces 102, 104, the distance between the first 152 and the second physical location 154. It should be understood that techniques for determining the distance between two locations are known in the art, and will therefore not be discussed in further detail.
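As a non-limiting illustration of the layout-plan-based approach described above, the distance between two lighting control interfaces may be derived from coordinates stored in the plan; the coordinates and interface identifiers below are hypothetical:

```python
from math import hypot

# Sketch: deriving the distance between two lighting control interfaces
# from a layout plan that stores their (x, y) coordinates in meters.
LAYOUT_PLAN = {
    "interface_102": (0.0, 0.0),   # e.g., living room wall switch
    "interface_104": (4.0, 3.0),   # e.g., bedroom wall switch
}

def distance_between(interface_a, interface_b, plan=LAYOUT_PLAN):
    """Euclidean distance (meters) between two interfaces on the plan."""
    (xa, ya), (xb, yb) = plan[interface_a], plan[interface_b]
    return hypot(xb - xa, yb - ya)

d = distance_between("interface_102", "interface_104")  # 5.0 m
```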
  • the controller 106 may further be configured to dynamically determine the distance between the first 152 and the second physical location 154 at time instances that the user interacts with the first 102 and the second lighting control interface 104.
  • a lighting control interface may be a mobile device that dynamically changes location over time or the user may change the location of a lighting control interface (e.g., new commissioning).
  • the controller 106 may be configured to determine the location of the lighting control interfaces 102, 104 at all instances that the user interacts with the lighting control interfaces, at regularly scheduled events (e.g., every second or third interaction), only when the lighting control interface resides on a mobile device, etc.
  • the controller 106 may dynamically determine the distance between the first 152 and the second physical location 154 based on dynamically determining the physical locations 152, 154 of the first and second lighting control interfaces 102, 104.
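The RF-based positioning mentioned above may, for instance, rely on received signal strength. The sketch below uses a log-distance path-loss model, one common approach to RSSI-based distance estimation; the calibration constants (reference RSSI at 1 m, path-loss exponent) are illustrative assumptions and would need per-installation calibration in practice:

```python
# Sketch: RSSI-to-distance conversion via a log-distance path-loss model.
# Not the only possible RF positioning method; constants are illustrative.

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=2.0):
    """Estimate distance in meters from a received signal strength value."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# -40 dBm corresponds to roughly 1 m from the transmitter under this model;
# -60 dBm corresponds to roughly 10 m with a path-loss exponent of 2.
```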
  • Fig. 4 shows an example of a mobile device 430 that comprises two lighting control interfaces 402 and 404 configured to control the lighting devices 432 and 434 respectively.
  • a user 420 interacts with the first lighting control interface 402, for example to turn off the light of lighting device 432.
  • the system may determine the position 452 of the mobile device 430, for example based on signals communicated between the mobile device 430 and the lighting device 432 (RF-based indoor positioning, positioning based on optical signals, etc.) or based on signals communicated between the mobile device 430 and a central hub, server, etc.
  • the user interacts with the second lighting control interface 404 of the mobile device 430, for example to turn on the lights of lighting device 434.
  • the system may determine the new position 454 of the mobile device 430 based on signals communicated between the mobile device 430 and the lighting device 434 (RF-based indoor positioning, positioning based on optical signals, etc.) or based on signals communicated between the mobile device 430 and a central hub, server, etc.
  • the controller 106 may dynamically determine the distance between the first 452 and the second location 454 of the mobile device 430 based on the locations 452, 454 of the mobile device 430 at time instances that the user interacted with the lighting control interfaces 402, 404.
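The dynamic determination described above (positions of the mobile device at the two interaction time instances) may be sketched as follows; the coordinate representation is an assumption:

```python
from math import hypot

# Sketch: the user's speed of movement derived from the mobile device's
# positions at the time instances of the two consecutive interactions.

def movement_speed(pos_first, t_first, pos_second, t_second):
    """Speed (m/s) between two (x, y) positions (meters) recorded at
    interaction times t_first and t_second (seconds)."""
    dt = t_second - t_first
    if dt <= 0:
        raise ValueError("second interaction must follow the first")
    dx = pos_second[0] - pos_first[0]
    dy = pos_second[1] - pos_first[1]
    return hypot(dx, dy) / dt

# Device at (0, 0) when interface 402 is used and at (6, 8) twenty
# seconds later: 10 m covered in 20 s, i.e., 0.5 m/s.
```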
  • the controller 106 may further be configured to select the first lighting control interface 102 and the second lighting control interface 104 from the plurality of lighting control interfaces based on information indicative of the physical locations 152, 154 of the lighting control interfaces. Said information indicative of the physical locations of the lighting control interfaces 152, 154 may be derived from a floor plan, e.g., the controller 106 may be configured to receive a floor plan and determine the physical locations based on the floor plan.
  • a lighting system may comprise three or more lighting control interfaces. However, not all interactions with the lighting control interfaces need to be considered in determining the health status of the user. The selection on which lighting control interfaces may be considered for determining the health status of the user may be based on the physical locations of the lighting control interfaces.
  • a first lighting control interface 102 located at a ground floor and a second lighting control interface 104 located at the first floor are suitable for determining the ability of the user to climb stairs, indicative of the health status of the user, and thus may be considered for determining the health status of the user.
  • two lighting control interfaces that are adjacent to each other on the same wall may provide time durations not indicative of a health status and thus may not be considered for determining the health status of the user.
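The selection logic described above (excluding interface pairs whose separation is too small to yield meaningful travel times, such as adjacent switches on the same wall) may be sketched as follows; the 2 m threshold and the location format are assumptions:

```python
from itertools import combinations
from math import hypot

# Sketch: selecting which pairs of lighting control interfaces are suitable
# for health assessment, based on their physical separation.

def select_interface_pairs(locations, min_separation_m=2.0):
    """locations: dict mapping interface id -> (x, y) in meters.
    Returns pairs whose separation makes travel time meaningful."""
    pairs = []
    for a, b in combinations(sorted(locations), 2):
        d = hypot(locations[b][0] - locations[a][0],
                  locations[b][1] - locations[a][1])
        if d >= min_separation_m:
            pairs.append((a, b))
    return pairs

locs = {"hall": (0, 0), "hall_dimmer": (0.3, 0), "bedroom": (5, 4)}
pairs = select_interface_pairs(locs)
# Only pairs involving "bedroom" survive; the two adjacent hall switches
# (0.3 m apart) are excluded from health assessment.
```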
  • a lighting system may receive interactions with lighting control interfaces 102, 104 from multiple users.
  • the controller 106 may further be configured to receive data indicative of a user profile.
  • the data may for example comprise labeling (tagging) of personal lighting control interfaces (e.g., a mobile device belonging to a first user), labeling (tagging) of patterns of interactions with lighting control interfaces performed by a first user or respectively a second user (e.g., interaction with lighting control interface 102 followed by a consecutive interaction with lighting control interface 102 only performed by a first user), etc.
  • Information about the presence of multiple users may be received via, e.g., a user input.
  • the controller 106 may be further configured to identify/determine the health status of the user based on the user profile. For example, only interactions with profile-specific lighting control interfaces (lighting control interfaces labeled after a user) may be considered to determine the time duration between the consecutive interactions with lighting control interfaces for an exemplary user. In another example, only interactions with a rotary switch may be considered to determine the time duration between the consecutive interactions with lighting control interfaces for a second exemplary user, etc. Additionally, or alternatively, the health status of the user may be identified by comparing the time duration between the consecutive interactions with lighting control interfaces to one or more profile-specific reference time durations indicative of respective health statuses, e.g., reference time durations for a sample user population with Parkinson’s disease, Alzheimer’s disease, etc.
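The profile-based filtering described above (considering only interactions attributed to a particular user before computing durations) may be sketched as follows; the event field names and user tags are hypothetical:

```python
# Sketch: attributing interactions to a user via profile tags before
# computing time durations between consecutive interactions.

def durations_for_user(interactions, user_id):
    """interactions: list of dicts with 'user' and 'timestamp' (seconds).
    Returns durations between consecutive interactions of one user."""
    times = sorted(i["timestamp"] for i in interactions
                   if i["user"] == user_id)
    return [t2 - t1 for t1, t2 in zip(times, times[1:])]

log = [
    {"user": "alice", "timestamp": 0},
    {"user": "bob",   "timestamp": 10},
    {"user": "alice", "timestamp": 30},
]
# durations_for_user(log, "alice") yields [30]; bob's interaction at t=10
# is ignored when assessing alice.
```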
  • the controller 106 may further be configured to identify the health status of the user based on context data, wherein the context data may be data indicative of an activity of the user from a set of predetermined activities indicative of health statuses.
  • the time duration of predetermined user activities (e.g., cooking, having dinner, ironing, taking a shower, etc.) may be determined and used to identify the health status of the user.
  • Context data may be extracted based on the labeling of lighting control interfaces by the user (e.g., labeling of living room lights, a staircase sensor, etc.) or based on the selection of predefined (labeled) lighting scenes by the user; for example, a lighting control interface may include multiple lighting scene selection buttons that may be pressed to selectively cause implementation of various lighting scenes (e.g., dinner, cooking, etc.).
  • the user may interact with a first lighting control interface 102 to select a cooking lighting scene, and as a result the predetermined user activity “cooking” may be determined to occur (e.g., by the controller 106).
  • the user may further interact with a second lighting control interface 104 to select a different scene (e.g., a dinner scene).
  • the time duration of the predetermined cooking activity may be determined (e.g., by the controller 106) based on the time elapsed between the first and the consecutive second lighting control action. If the user experiences sensorimotor skill decline, it may take longer to cook dinner (e.g., due to the development of tremor or muscle rigidity with age, etc.), and this can be observed from data over time. Additionally or alternatively, the user may interact with a first lighting control interface 102 to select a dinner lighting scene and as a result the predetermined user activity “having dinner” may be determined to occur (e.g., by the controller 106). The user may further interact with a second lighting control interface 104 to select a different scene (e.g., select a “watching TV” lighting scene).
  • the time duration of the predetermined having dinner activity can be inferred based on the time elapsed between the first and the consecutive second lighting control action.
  • a longer duration of the dinner activity may be indicative of a decline in the health status of the user, for example people may develop tremor with age which might impair the ability to hold cutlery.
  • the health status of the user may be identified using time-of-day information, which may be used to determine the activity of the user.
  • an interaction of the user with the first lighting control interface 102 (labeled as “bedroom”) to turn on the lighting device 132 and a consecutive second interaction of the user with the second lighting control interface 104 (labeled as “bathroom”) to turn on the lighting device 134, may be combined with time-of-day information (e.g., after midnight) to extract a “midnight toilet visit” activity.
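The activity inference described above (mapping labeled scene selections to predetermined activities and timing the interval until the next, different scene selection) may be sketched as follows; the scene-to-activity mapping is an assumption:

```python
# Sketch: inferring a predetermined activity and its duration from labeled
# lighting scene selections. Mapping and scene names are illustrative.
SCENE_ACTIVITY = {"cooking": "cooking", "dinner": "having dinner"}

def activity_duration(first_scene, t_first, second_scene, t_second):
    """Duration (s) of the activity started by the first scene selection
    and ended by the consecutive selection of a different scene.
    Returns (None, None) if no predetermined activity is recognized."""
    activity = SCENE_ACTIVITY.get(first_scene)
    if activity is None or second_scene == first_scene:
        return None, None
    return activity, t_second - t_first

# Selecting "cooking" at t=0 and "dinner" at t=2700 implies 45 minutes of
# the "cooking" activity; a lengthening trend over weeks may indicate
# sensorimotor skill decline.
```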
  • Fig. 5 shows a method 500 of assessing a health status of a user based on interactions with lighting control interfaces.
  • the method 500 may be performed by the controller 106 of Fig. 1, for example.
  • the method may comprise determining 510 (e.g., by the controller 106) a change in the health status of the user and providing 512 a notification via a user interface indicative of the change in the health status. For example, if there is a change in the health status of the user, e.g., a decline or improvement in sensorimotor skills of the user, the user or a caregiver may be notified by a message.
  • the method 500 may comprise outputting 512 (e.g., by the controller 106) aggregations of this decline (or improvement), for example on a dashboard with key sensorimotor skill indicators in a numeric or graphical form. Such a dashboard could be presented in a digital user interface (e.g., a web user interface).
  • the method 500 may comprise controlling 514 (e.g., by the controller 106) the one or more lighting devices 132, 134 in response to the change in the health status of the user.
  • the controller 106 may adjust the lighting control on the devices 132, 134 based on the detection of the development of motor slowing of the user (e.g., if people move slower from one room to another, the lighting devices 132, 134 may be configured to use a longer time before switching off after no motion has been detected). Additionally or alternatively, lighting control parameters, such as the hold time and/or sensitivity of a motion sensor connected to the lighting devices 132, 134, may be increased in response to a change in sensorimotor user skills.
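The hold-time adjustment described above may be sketched as follows; the inverse-speed scaling rule and the 600 s cap are assumptions for illustration:

```python
# Sketch: lengthening the hold time of a motion-sensor-driven light when
# motor slowing is detected, so lights do not switch off while a slower
# user is still moving through the space.

def adjusted_hold_time(base_hold_s, baseline_speed_mps, current_speed_mps,
                       max_hold_s=600):
    """Scale hold time inversely with the user's current speed of movement,
    never going below the base hold time or above the cap."""
    if current_speed_mps <= 0:
        return max_hold_s
    scaled = base_hold_s * (baseline_speed_mps / current_speed_mps)
    return min(max(scaled, base_hold_s), max_hold_s)

# A user moving at half the baseline speed gets double the hold time;
# a user moving faster than baseline keeps the base hold time.
```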
  • the system 100 can also incorporate a predictive element, which uses the past time durations, determined over a different observation period, between past first lighting control actions and respective past second lighting control actions of the user to predict the health status of the user in the future.
  • the controller 106 may use a regression algorithm to predict a future time duration between consecutive user interactions with lighting control interfaces of the user.
  • the regression algorithm may, for example, be a long short-term memory (LSTM) neural network, or another time series model, e.g., an autoregressive integrated moving average (ARIMA) model.
  • a future health status of the user may be identified based on comparing the future time duration to one or more reference time durations indicative of respective health statuses.
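As a non-limiting stand-in for the LSTM/ARIMA models mentioned above, a simple least-squares trend over past durations can already extrapolate a future time duration; the fitted linear model is an illustrative simplification, not the claimed predictive element:

```python
# Sketch: predicting a future time duration between consecutive
# interactions by fitting duration = a + b * index via ordinary least
# squares over the observation period, then extrapolating.

def predict_future_duration(past_durations, steps_ahead=1):
    """Extrapolate the linear trend of past durations (seconds)."""
    n = len(past_durations)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(past_durations) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(xs, past_durations))
    b = sxy / sxx if sxx else 0.0   # slope: drift per observation
    a = mean_y - b * mean_x          # intercept
    return a + b * (n - 1 + steps_ahead)

# Durations drifting upward (10, 12, 14, 16 s) extrapolate to 18 s next,
# which could then be compared against reference time durations.
```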
  • the method 200, 500 may be executed by computer program code of a computer program product when the computer program product is run on a processing unit of a computing device, such as the controller 106 of the system 100.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb "comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • aspects of the invention may be implemented in a computer program product, which may be a collection of computer program instructions stored on a computer readable storage device which may be executed by a computer.
  • the instructions of the present invention may be in any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs) or Java classes.
  • the instructions can be provided as complete executable programs, partial executable programs, as modifications to existing programs (e.g., updates) or extensions for existing programs (e.g., plugins).
  • parts of the processing of the present invention may be distributed over multiple computers or processors or even the ‘cloud’.
  • Storage media suitable for storing computer program instructions include all forms of nonvolatile memory, including but not limited to EPROM, EEPROM and flash memory devices, magnetic disks such as the internal and external hard disk drives, removable disks and CD-ROM disks.
  • the computer program product may be distributed on such a storage medium, or may be offered for download through HTTP, FTP, email or through a server connected to a network such as the Internet.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Data Mining & Analysis (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

A method and system for assessing a health status of a user based on interactions with a plurality of lighting control interfaces is disclosed. The lighting control interfaces are configured to control one or more lighting devices based on the interactions. The method comprises receiving first data indicative of a first lighting control action by the user at a first physical location, receiving second data indicative of a consecutive second lighting control action by the user at a second physical location, determining, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action and identifying the health status of the user by comparing the time duration to one or more reference time durations indicative of respective health statuses.

Description

A system and method for assessing a health status of a user based on interactions with lighting control interfaces
FIELD OF THE INVENTION
The invention relates to a method for assessing a health status of a user based on interactions with lighting control interfaces. The invention further relates to a system for assessing a health status of a user. The invention further relates to a computer program for assessing a health status of a user.
BACKGROUND
With advanced age comes a decline in health status, e.g., impaired vision, slowing of movement, decline in sensorimotor control and functioning, etc. This health status decline has a negative impact on the ability of older adults to perform functional activities of daily living and maintain their independence. Often, this decline is a slow process, and it is difficult to self-diagnose by the elderly person or by a caregiver.
SUMMARY OF THE INVENTION
The inventors have realized that when the sensorimotor skills of a user decline (e.g., decrease in tactile acuity, motor slowing, deterioration of haptic performance and fine manipulative movements), it typically takes the user longer to perform certain functions. Requesting a user to perform such predetermined functions related to sensorimotor abilities may have a negative impact on the user and may result in false measurements because the user is aware that his/her sensorimotor skills are being tested. It is therefore an object to provide an unobtrusive way to identify a health status of a user.
According to a first aspect, the object is achieved by a method for assessing a health status of a user based on interactions with a plurality of lighting control interfaces, wherein the lighting control interfaces are configured to control one or more lighting devices based on the interactions, the method comprising: receiving first data indicative of a first lighting control action by the user at a first physical location; receiving second data indicative of a consecutive second lighting control action by the user at a second physical location; determining, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action; identifying the health status of the user by comparing the time duration to one or more reference time durations indicative of respective health statuses.
The inventors have realized that the lighting system is a system that receives multiple lighting control actions during a day, and that it may therefore be used to monitor the health status of a user. Certain sequences of interactions with lighting control interfaces typically follow each other. For example, actions on lighting control interfaces in adjacent rooms may be paired (e.g., switching the lights off in the living room may be followed by switching the lights on in the bedroom within a period of time). Only consecutive interactions within a predetermined time period may be considered, as only such consecutive actions of the user are relevant for identifying the health status of the user. For example, it may be relevant to monitor the time duration between switching the lights off in the living room and switching the lights on in the bedroom if this duration is less than a predetermined time period, say 15 minutes, as this duration may be indicative of the movement of the user between the first (living room) and the second (bedroom) physical location. However, it may not be relevant to monitor this time duration if it is more than the predetermined time period, as this may imply that the user performed other actions between interacting with the lighting control interfaces; thus, this time duration may not be indicative of the duration of movement of the user between the first and the second physical locations. The time elapsed between consecutive interactions of the user with lighting control interfaces can be determined and, thus, the speed of movement of the user may be identified based on the elapsed time and the distance between the physical locations. The speed of movement is indicative of the health status of the user, i.e., the speed of movement may be indicative of sensorimotor skills of the user and/or vision impairment.
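The pairing rule described above (keeping only consecutive actions at different locations within the predetermined time period) may be sketched as follows; the event representation and the 15-minute window are illustrative:

```python
# Sketch: filtering consecutive lighting control actions so that only
# transitions plausibly reflecting movement between two locations are
# kept for health assessment.

def paired_durations(events, max_gap_s=15 * 60):
    """events: list of (timestamp_s, location), sorted by time.
    Returns (location_a, location_b, duration_s) for consecutive actions
    at different locations within the predetermined time window."""
    pairs = []
    for (t1, loc1), (t2, loc2) in zip(events, events[1:]):
        gap = t2 - t1
        if loc1 != loc2 and gap <= max_gap_s:
            pairs.append((loc1, loc2, gap))
    return pairs

events = [(0, "living room"), (40, "bedroom"), (4000, "bathroom")]
# Only the living room -> bedroom transition (40 s) qualifies; the gap to
# the bathroom action exceeds the 15-minute window and is discarded.
```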
Thus, the health status of the user can be identified by comparing the time elapsed between the consecutive lighting control actions (or the speed of movement of the user) to one or more reference time durations (or reference speed of movements) indicative of health status (e.g., reference speed of movements of the general population, reference speed of movements of a sample population with similar characteristics to the user, historic data (past speed of movements) of the user, etc.). By identifying the health status of the user based on the interaction of the user with lighting control interfaces, an unobtrusive way for identifying a health status of a user is provided. The one or more reference speed of movements (or time durations for the same distance) may be past speed of movements (or respective past time durations), determined over an observation period, between past first lighting control actions and respective past second lighting control actions of the user, and the identifying of the health status of the user may comprise: identifying the health status of the user by determining a deviation of the speed of movement (or time duration) from the past speed of movements (or past time durations respectively). The past speed of movements (or respective time durations for the same distance) may be stored as reference speed of movements (or reference time durations respectively) indicative of health statuses. An increase in the time elapsed between consecutive lighting control actions may, for instance, be indicative of an increase in movement duration of the user, over a period of time. By comparing the time elapsed between consecutive interactions of the user with the lighting control interfaces over a prolonged observation period, changes (e.g., decline or improvement in sensorimotor skills and/or vision impairment, etc.) in the health status of the user over time may be determined. 
Information about the lighting level may further be used to determine or distinguish the type of impairment. For example, certain vision impairments, e.g., cataract, macular degeneration, etc., are prominent at low light levels. For example, an increase in the time elapsed between consecutive interactions of the user with the lighting control interfaces that is present only at low light levels may be indicative of a vision impairment rather than a decline in sensorimotor skills.
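The light-level stratification described above may be sketched as follows; the 50 lux cut-off and the 1.5 ratio threshold are assumptions for illustration only:

```python
# Sketch: stratifying durations by light level to help distinguish vision
# impairment (degradation mainly at low light) from general sensorimotor
# decline (degradation at all light levels).

def low_light_degradation(samples, lux_threshold=50.0):
    """samples: list of (duration_s, lux). Returns mean low-light duration
    divided by mean bright-light duration, or None if a stratum is empty."""
    low = [d for d, lux in samples if lux < lux_threshold]
    bright = [d for d, lux in samples if lux >= lux_threshold]
    if not low or not bright:
        return None
    return (sum(low) / len(low)) / (sum(bright) / len(bright))

def suggests_vision_impairment(samples, ratio_threshold=1.5):
    """True when durations degrade markedly only at low light."""
    ratio = low_light_degradation(samples)
    return ratio is not None and ratio >= ratio_threshold

# Durations twice as long in the dark as in bright light yield ratio 2.0.
```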
The method may include generating a prediction for a future time duration, wherein the prediction is based at least in part on the past time durations as input into a predictive model. By inferring a future time duration based on past time durations of the user, a future health status of the user may be identified by comparing the future time duration to one or more reference time durations indicative of respective health statuses.
The health status of the user may further be identified based on a distance between the first physical location and the second physical location. Using the distance and/or approximate location of lighting control interfaces and the time elapsed between the first lighting control action performed at the first physical location and the consecutive second lighting control action performed at the second physical location, the speed of movement of the user may be determined. Determining the speed of movement over a prolonged observation period may be used to identify the development of motor slowing by the user. The distance between the first physical location and the second physical location may be determined based on a floor plan designating said first and second physical locations and distance measurements between the first and second physical locations. The distance and/or approximate location of lighting control interfaces can be estimated based on a floor plan (e.g., provided by the user) and/or based on the rooms to which they are assigned (kitchen, living room, bedroom, e.g., as picked by the user during commissioning in an app). For example, if the first lighting control interface is located on the ground floor of a building and the second lighting control interface is located on a different floor, the time duration between the first and second lighting control action is indicative of the ability of the user to climb stairs (speed of movement while climbing stairs). A decline in the ability of the user to climb the stairs may be indicative of sensorimotor skills decline and/or vision impairment.
The distance between the first physical location and the second physical location may be determined based on communication signals transmitted between devices located in the first and second physical location. For example, RF signal strength analysis (e.g., Zigbee RSSI values, Zigbee neighbor tables, or Bluetooth beaconing/proximity, etc.), can be used to identify the position of lighting control interfaces or the relative location of lighting control interfaces in respect to the lighting devices. The position or relative location may further be used to identify the distance between the physical location of the first lighting control interface at the moment in time of the first lighting control action and the physical location of the second lighting control interface at the moment in time of the second lighting control action by the user.
The method may comprise dynamically determining the distance between the first and the second physical location at time instances that the user interacts with the first and the second lighting control interface. For example, a lighting control interface may be a mobile device that dynamically changes location over time. By dynamically (at different moments in time) determining the distance between the first and the second physical location of the lighting control interfaces at the time instance that the user interacts with the first and respectively the second lighting control interface, the speed of movement of the user may be dynamically determined even when the lighting control interface dynamically changes location over time.
The method may further comprise selecting the first lighting control interface and the second lighting control interface from a plurality of lighting control interfaces based on information indicative of the locations of the lighting control interfaces. A lighting system may comprise three or more lighting control interfaces. By selecting the first and second lighting control interfaces based on information indicative of the locations of the lighting control interfaces, the most appropriate pairs of interactions of the user with the lighting control interfaces are considered for determining the health status of the user.
A lighting system may receive interactions with lighting control interfaces from multiple users. The health status of the user may further be identified based on a user profile. Data indicative of a user profile may for example comprise profile-specific labels/tags of personal lighting control interfaces, profile-specific patterns of activity indicative of different users, types of lighting control interfaces used by different users (e.g., a user only uses a mobile device, or a user only uses a rotary dimmer), or data indicative of the personal health condition of different users (e.g., Parkinson’s disease, Alzheimer’s disease, etc.), etc. By receiving data indicative of a user profile, the health status of the user may be identified for that particular user based on the user profile. For example, only interactions with profile-specific personal lighting control interfaces (e.g., lighting control interfaces labeled after a user) may be considered to determine the time duration between consecutive lighting control actions for said user.
The health status of the user may further be identified based on context data, wherein the context data may be data indicative of an activity of the user from a set of predetermined activities indicative of health statuses. For example, the user may interact with a first lighting control interface to select a cooking scene and as a result the predetermined user activity “cooking” may be inferred to occur. The user may further interact with a second lighting control interface to select a different scene (e.g., dinner scene). The time duration of the predetermined cooking activity can be inferred based on the time elapsed between the first and the consecutive second lighting control action. If the user develops sensorimotor skill decline, it may take longer to cook dinner, and this can be observed from data over time. By associating context-data to predetermined user activities (e.g., cooking, ironing, taking a shower, etc.), the trend of the predetermined activity over a prolonged observation period may be determined.
The method may comprise determining a change in the health status of the user and providing a notification via a user interface indicative of the change in the health status. For example, if there is a change in the health status of the user, e.g., a decline or improvement in sensorimotor skills of the user, the user or a caregiver may be notified by a message. Alternatively or additionally, the method may comprise outputting aggregations of this decline (or improvement), for example on a dashboard with key sensorimotor skill indicators in a numeric or graphical form. Such a dashboard could be presented in a digital user interface (e.g., a web user interface).
The method may further include controlling the one or more lighting devices in response to the change in the health status of the user. By controlling the one or more lighting devices in response to the change in the health status of the user, the lighting system may better match the health status of the user.
According to a second aspect, the object is achieved by a system for assessing a health status of a user based on interactions with lighting control interfaces, wherein the lighting control interfaces are configured to control one or more lighting devices based on the interactions, comprising: a first lighting control interface configured to generate first data indicative of a first lighting control action by the user at a first physical location; a second lighting control interface configured to generate second data indicative of a consecutive second lighting control action by the user at a second physical location; a controller configured to receive the first and second data, and further configured to determine, based on the first and second data, a time duration between the first lighting control action and the consecutive second lighting control action; and identify the health status of the user by comparing the time duration to one or more reference time durations indicative of health statuses.
According to a third aspect, the object is achieved by a computer program that is configured to perform a method for assessing the health status of a user based on interactions with lighting control interfaces configured to control one or more lighting devices, the method comprising: receiving first data indicative of a first lighting control action by the user at a first physical location; receiving second data indicative of a consecutive second lighting control action by the user at a second physical location; determining, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action; identifying the health status of the user by comparing the time duration to one or more reference time durations indicative of health statuses. It should be understood that the system, method and computer program product may have similar and/or identical embodiments and advantages as the above-mentioned method.
BRIEF DESCRIPTION OF THE DRAWINGS
The above, as well as additional objects, features and advantages of the disclosed systems, devices and methods will be better understood through the following illustrative and non-limiting detailed description of embodiments of devices and methods, with reference to the appended drawings, in which:
Fig. 1 shows schematically a block diagram of a system for assessing a health status of a user;
Fig. 2 shows schematically a flow diagram of a method for assessing a health status of a user;
Fig. 3 shows an example interaction with a first and second lighting control interface;
Fig. 4 shows an example interaction with a first and second lighting control interface;
Fig. 5 shows schematically a second embodiment of a method for assessing a health status of a user.
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
DETAILED DESCRIPTION
Fig. 1 shows an example of a system 100 for assessing a health status of a user based on interactions with lighting control interfaces 102, 104. The system 100 comprises a first lighting control interface 102 configured to control one or more lighting devices 132 and a second lighting control interface 104 configured to control one or more lighting devices 134. The lighting control interfaces 102, 104 may for example be switches to turn lighting on/off, dim up/down, or set a lighting scene. The lighting control interfaces 102, 104 may reside on a user device (e.g., mobile phone, tablet, etc.) configured to control the one or more lighting devices 132, 134. The lighting control interfaces 102, 104 may also be in the form of remote-control devices, for example with buttons for selecting light characteristics, or buttons of a touch sensitive screen such as a touch sensitive screen of a smart phone, tablet, etc. The lighting control interfaces 102, 104 may comprise drive/sense circuitry to detect a state change of occupancy, e.g., audio, PIR sensor, etc.
The lighting devices 132, 134 are configured to provide general illumination, such as ambient or functional illumination and may include light sources of different types (e.g., incandescent lamps, fluorescent lamps, and/or LED light sources) and can be of any type (e.g., table lamps, floor lamps, ceiling lamps etc.).
The system 100 further comprises at least one data processor or controller 106. The system may further comprise at least one data repository or storage or memory 108 for storing computer program code instructions. The controller 106 may be communicatively coupled to the cloud 120. However, it should be noted that the controller 106 may be comprised in a central device (e.g., a smartphone, personal computer, a hub), a web-based portal, a combination of a central device and a web-based portal, etc. The controller 106 may be communicatively coupled to a memory module 108.
The first lighting control interface 102 is configured to generate first data indicative of a first lighting control action by the user at a first physical location 152. The lighting control action may include, but is not limited to, turning a lighting device on, turning a lighting device off, dimming the light, setting a lighting device to a light intensity level, selecting a color for a lighting device, selecting a lighting scene for a lighting device, or some other lighting control action. The second lighting control interface 104 is configured to generate second data indicative of a consecutive second lighting control action by the user at a second physical location 154. The lighting control interfaces may be any types of lighting control interfaces configured to receive user inputs indicative of respective lighting control actions. The lighting control interfaces may, for example, be light switches, presence sensors, mobile devices, voice interfaces, etc. The lighting control interfaces may be comprised in separate devices, or the lighting control interfaces may be interfaces of a single device.
The first and second lighting control actions are provided by the user at respective (different) physical locations. These physical locations may, for example, be the location of lighting control interfaces in different rooms in a building, the location of lighting control interfaces in different areas in a room, the location of lighting control interfaces in different outdoor locations, etc. A user may thus provide the first lighting control action at the first physical location 152 (e.g., a location of a lighting control interface in the bedroom), and the second lighting control action at the second physical location 154 (e.g., a location of a lighting control interface in the bathroom) to control the one or more lighting devices, which may be located at these locations.

Fig. 2 illustrates, in a flow chart, an example of steps of a method 200 for assessing a health status of a user based on interactions with lighting control interfaces, in accordance with the present disclosure. The controller 106 is configured to receive 202 the first data and receive 204 the second data, and determine 206, based on the first and second data, a time duration between the first lighting control action and the consecutive second lighting control action. The first data may for example comprise a timestamp of the moment that the first lighting control action has been provided, and the second data may for example comprise a timestamp of the moment that the second lighting control action has been provided, and the controller may be configured to determine the time duration based on the timestamps. Alternatively, the controller 106 may be configured to receive inputs at the moments when the first and second lighting control actions have been provided, and determine the time duration based on a time difference between the moments.
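The timestamp-based variant above can be illustrated with a minimal Python sketch; the function name and the use of ISO-8601 timestamp strings are assumptions for illustration, not part of the disclosure.

```python
from datetime import datetime

def time_between_actions(first_timestamp: str, second_timestamp: str) -> float:
    """Duration in seconds between two consecutive lighting control
    actions, given their timestamps as ISO-8601 strings."""
    t1 = datetime.fromisoformat(first_timestamp)
    t2 = datetime.fromisoformat(second_timestamp)
    return (t2 - t1).total_seconds()

# Example: the user switches off the bedroom light, then switches on
# the bathroom light 42 seconds later.
duration = time_between_actions("2023-05-09T23:10:00", "2023-05-09T23:10:42")
```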
The controller 106 is further configured to identify 208 the health status of the user by comparing the time duration to one or more reference time durations indicative of health statuses. The time duration between the two consecutive interactions with the lighting control interfaces 102, 104 may be indicative of the time duration of the movement between the first 152 and the second physical location 154. Since the time duration of the movement between the first 152 and the second physical location 154 may be indicative of the sensorimotor skills of the user, the health status of the user may be identified from a comparison to the reference time durations of movement. The one or more reference time durations may be sample reference time durations of movement based on similar sample user populations, for example reference time durations based on a similar user age group, gender, medical condition (e.g., Parkinson’s disease, vision impairment, etc.).
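The comparison to reference time durations could be sketched as a nearest-reference lookup; the table mapping health-status labels to typical durations for a comparable user population is a hypothetical example, and all labels and values are illustrative.

```python
def identify_health_status(duration_s: float, references: dict) -> str:
    """Return the label of the reference duration closest to the
    observed duration. `references` maps a health-status label to a
    typical duration (seconds) for a comparable user population."""
    return min(references, key=lambda label: abs(references[label] - duration_s))

# Hypothetical reference durations for the bedroom-to-bathroom movement
references = {
    "unimpaired": 30.0,
    "mildly impaired mobility": 60.0,
    "severely impaired mobility": 120.0,
}
status = identify_health_status(55.0, references)  # closest reference wins
```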
The one or more reference time durations may be past time durations, determined over an observation period, between past first lighting control actions and respective past second lighting control actions of the user. Historic data of past time durations of the user may be stored in the memory module 108 as reference time durations indicative of health statuses. The controller 106 may be further configured to identify the health status of the user by determining a deviation of the time duration from the past time durations. Since the health status of the user may change over time, e.g., due to impaired vision, slowing (or acceleration) of movement, decline (or improvement) in sensorimotor control and functioning with age (e.g., increased time to interact with the lighting control interfaces 102, 104 due to tremor), it may take longer (or shorter) for the user to move between the first 152 and the second physical location 154 and correspondingly to interact with the first and the consecutive second lighting control interface. As a result, the time duration between the first and the consecutive second lighting control action may increase (or decrease) over time. A variability of the time duration between consecutive lighting control actions over time may also be indicative of temporal movement variability that is associated with a decline in sensorimotor skills of the user. Standard statistical inference analysis, time-series analysis (e.g., regression, moving average) or machine learning (e.g., Bayesian network, Isolation Forest, regression tree, etc.) can be used to reveal such incidental/irregular variations (trends) in the health status of the user. Additionally or alternatively, other techniques for anomaly detection can be used, including k-nearest neighbor, Support Vector Machine (SVM), and neural networks (e.g., LSTM). Determining deviations (e.g., by comparison, by determining variability, a moving average trend, etc.) of the time duration from the past time durations over time may be indicative of a change (improvement or decline) in the health status of the user over time.
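A simple deviation test against the user's own history, of the kind described above, might look like the following plain z-score sketch; the three-sigma threshold is an illustrative assumption, not the disclosed method, and more elaborate detectors (Isolation Forest, LSTM, etc.) could be substituted.

```python
from statistics import mean, stdev

def deviation_from_history(duration_s: float, past_durations: list) -> float:
    """Z-score of a new duration against the user's own history."""
    mu, sigma = mean(past_durations), stdev(past_durations)
    return (duration_s - mu) / sigma if sigma > 0 else 0.0

def is_anomalous(duration_s: float, past_durations: list, threshold: float = 3.0) -> bool:
    """Flag durations that deviate strongly from the historic pattern."""
    return abs(deviation_from_history(duration_s, past_durations)) > threshold

# Hypothetical history of bedroom-to-bathroom durations (seconds)
past = [30, 31, 29, 32, 30, 28, 31, 30]
is_anomalous(65, past)  # far outside the usual range
```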
In the example shown in Fig. 3, a user 320 moves from a first area (e.g., bedroom) to a second area (e.g., bathroom). Following this movement, the user 320 may interact with the lighting control interface 302 located at physical location 312, to, for example, turn off the lighting device 332 located in the bedroom. Then, the user 320 may move to another physical location 314, and interact with the lighting control interface 304 located at the other physical location 314, to, for example, turn on the lighting device 334 located in the bathroom. The controller 106 receives first and second data indicative of the first and respectively the second lighting control action by the user and determines, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action. The time duration between the two consecutive interactions with the lighting control interfaces 302, 304 may be indicative of the time duration of the movement between the first 312 and the second physical location 314. The controller 106 is configured to identify the health status of the user by comparing the time duration of the movement of the user to one or more reference time durations indicative of health statuses. For example, the controller 106 may compare the time duration of movement to past time durations of the same movement (e.g., past time durations, determined over a different observation period, between past first lighting control actions with lighting control interface 302 and respective past second lighting control actions with lighting control interface 304).
The controller 106 may further be configured to identify the health status of the user based on a distance between the first physical location 152 and the second physical location 154. Using the distance between the physical locations 152, 154 of the lighting control interfaces 102, 104 and the time elapsed between the first and the consecutive second lighting control action, the speed of movement of the user may be determined by the controller 106 (e.g., speed equals the distance between physical locations 152, 154 divided by the time elapsed between the first and the consecutive second lighting control action). The speed of movement or walking gait may be indicative of the health status of the user. For instance, a change in the speed of movement over time (a deviation from reference past speeds of movement of the user) may be indicative of impaired mobility or impaired vision. A change in the speed of movement may even be indicative of an onset or progression of a disease; for example, Parkinson’s patients have difficulties in generating rhythmic movements such as walking and show a progressive decline in the speed of movement. A deviation of the average speed of movement or gait speed compared to a reference speed of movement for a representative user sample population with similar characteristics as the user (e.g., similar age, gender, etc.) may be indicative of the health status of the user. For example, a slower walking speed or gait speed compared to the speed of movement of the user sample population may be indicative of and/or associated with dementia or cognitive impairment of the user.
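The speed computation and the comparison against a population reference could be sketched as follows; the tolerance value and reference speed are illustrative assumptions.

```python
def speed_of_movement(distance_m: float, duration_s: float) -> float:
    """Average speed (m/s) implied by the distance between the two
    interface locations and the time between the two actions."""
    return distance_m / duration_s

def flags_decline(speed_mps: float, reference_mps: float, tolerance: float = 0.2) -> bool:
    """True when the observed speed falls more than `tolerance` m/s
    below the reference speed for a comparable population."""
    return speed_mps < reference_mps - tolerance

# 8 m between the bedroom and bathroom interfaces, 10 s between actions
speed = speed_of_movement(8.0, 10.0)  # 0.8 m/s
```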
The distance between the first physical location 152 and second physical location 154 may be determined based on a layout plan designating the physical locations and distance measurements between the physical locations. The layout plan may relate to a spatial arrangement of the lighting control interfaces. For instance, for a building lighting system of a building, the layout plan may include a floor plan comprising the physical location of lighting control interfaces in a parking area, a floor, a corridor and/or a room, etc. The controller 106 may be configured to determine the distance between the first physical location 152 and second physical location 154 based on the floor plan.
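A floor-plan-based distance lookup might be sketched as below, assuming interface positions are available as 2D coordinates on the plan. Note this gives the straight-line distance; an actual walking path between rooms may be longer, so the coordinate table and names are illustrative assumptions only.

```python
from math import hypot

# Hypothetical floor plan: interface id -> (x, y) coordinates in metres
floor_plan = {
    "bedroom_switch": (1.0, 2.0),
    "bathroom_switch": (5.0, 5.0),
}

def distance_between(a: str, b: str) -> float:
    """Straight-line distance (m) between two interface locations."""
    (x1, y1), (x2, y2) = floor_plan[a], floor_plan[b]
    return hypot(x2 - x1, y2 - y1)

distance_between("bedroom_switch", "bathroom_switch")  # 5.0 m
```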
The controller 106 may be configured to determine the distance between the first physical location 152 and second physical location 154 based on communication signals transmitted from devices located in the first 152 and second physical location 154. For example, any method for determining the location of a device based on signals emitted from devices located in the first 152 and second physical location 154 (e.g., RF-based positioning, positioning based on optical signals, etc.) can be used to identify the physical locations 152, 154 of the lighting control interfaces 102, 104. It should be noted that the physical locations 152, 154 of the lighting control interfaces 102, 104 may be determined with respect to the relative distance of the lighting control interfaces 102, 104 from the lighting devices 132, 134, for example based on signals emitted from the lighting control interfaces 102, 104 to the lighting devices 132, 134. The controller 106 may determine, based on the position or relative location of the lighting control interfaces 102, 104, the distance between the first 152 and the second physical location 154. It should be understood that techniques for determining the distance between two locations are known in the art, and will therefore not be discussed in further detail.
The controller 106 may further be configured to dynamically determine the distance between the first 152 and the second physical location 154 at time instances that the user interacts with the first 102 and the second lighting control interface 104. For example, a lighting control interface may be a mobile device that dynamically changes location over time or the user may change the location of a lighting control interface (e.g., new commissioning). The controller 106 may be configured to determine the location of the lighting control interfaces 102, 104 at all instances that the user interacts with the lighting control interfaces, at regularly scheduled events (e.g., every second, or third interaction), only when the lighting control interface resides on a mobile device, etc. The controller 106 may dynamically determine the distance between the first 152 and the second physical location 154 based on dynamically determining the physical locations 152, 154 of the first and second lighting control interfaces 102, 104.
Fig. 4 shows an example of a mobile device 430 that comprises two lighting control interfaces 402 and 404 configured to control the lighting devices 432 and 434 respectively. A user 420 interacts with the first lighting control interface 402, for example to turn off the light of lighting device 432. At the time instance that the user interacts with the first lighting control interface 402 and performs the first lighting control action, the system may determine the position 452 of the mobile device 430, for example based on signals communicated between the mobile device 430 and the lighting device 432 (RF-based indoor positioning, positioning based on optical signals, etc.) or based on signals communicated between the mobile device 430 and a central hub, server, etc. At a further time instance, the user interacts with the second lighting control interface 404 of the mobile device 430, for example to turn on the light of lighting device 434. At the time instance that the user interacts with the second lighting control interface 404 of the mobile device 430 and performs the second lighting control action, the system may determine the new position 454 of the mobile device 430 based on signals communicated between the mobile device 430 and the lighting device 434 (RF-based indoor positioning, positioning based on optical signals, etc.) or based on signals communicated between the mobile device 430 and a central hub, server, etc. The controller 106 may dynamically determine the distance between the first 452 and the second location 454 of the mobile device 430 based on the locations 452, 454 of the mobile device 430 at the time instances that the user interacted with the lighting control interfaces 402, 404.
The controller 106 may further be configured to select the first lighting control interface 102 and the second lighting control interface 104 from the plurality of lighting control interfaces based on information indicative of the physical locations 152, 154 of the lighting control interfaces. Said information indicative of the physical locations 152, 154 of the lighting control interfaces may be derived from a floor plan, e.g., the controller 106 may be configured to receive a floor plan and determine the physical locations based on the floor plan. A lighting system may comprise three or more lighting control interfaces. However, not all interactions with the lighting control interfaces need to be considered in determining the health status of the user. The selection of which lighting control interfaces are considered for determining the health status of the user may be based on the physical locations of the lighting control interfaces. For example, a first lighting control interface 102 located on the ground floor and a second lighting control interface 104 located on the first floor are suitable for determining the ability of the user to climb stairs, which is indicative of the health status of the user, and thus may be considered for determining the health status of the user. However, two lighting control interfaces that are adjacent to each other on the same wall may provide time durations that are not indicative of a health status and thus may not be considered for determining the health status of the user.
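The selection of interface pairs by physical location could be sketched as a simple distance filter; the 2 m cut-off and the coordinate table are illustrative assumptions.

```python
from itertools import combinations
from math import hypot

def select_interface_pairs(locations: dict, min_distance_m: float = 2.0) -> list:
    """Keep only pairs of lighting control interfaces far enough apart
    for the travel time between them to be informative."""
    pairs = []
    for a, b in combinations(locations, 2):
        (x1, y1), (x2, y2) = locations[a], locations[b]
        if hypot(x2 - x1, y2 - y1) >= min_distance_m:
            pairs.append((a, b))
    return pairs

# Two switches 0.3 m apart on the same wall are excluded; the switch
# on the landing, 6 m away, forms informative pairs with both.
locations = {"hall": (0, 0), "hall_2": (0.3, 0), "landing": (0, 6)}
pairs = select_interface_pairs(locations)
```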
A lighting system may receive interactions with lighting control interfaces 102, 104 from multiple users. The controller 106 may further be configured to receive data indicative of a user profile. The data may for example comprise labeling (tagging) of personal lighting control interfaces (e.g., a mobile device belonging to a first user), labeling (tagging) of patterns of interactions with lighting control interfaces performed by a first user or respectively a second user (e.g., interaction with lighting control interface 102 followed by a consecutive interaction with lighting control interface 102 only performed by a first user), etc. Information about the presence of multiple users may be received via, e.g., a user input.
The controller 106 may be further configured to identify/determine the health status of the user based on the user profile. For example, only interactions with profile-specific lighting control interfaces (lighting control interfaces labeled after a user) may be considered to determine the time duration between the consecutive interactions with lighting control interfaces for an exemplary user. In another example, only interactions with a rotary switch may be considered to determine the time duration between the consecutive interactions with lighting control interfaces for a second exemplary user, etc. Additionally, or alternatively, the health status of the user may be identified by comparing the time duration between the consecutive interactions with lighting control interfaces to one or more profile-specific reference time durations indicative of respective health statuses, e.g., reference time durations for a sample user population with Parkinson’s disease, Alzheimer’s disease, etc.
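Restricting the analysis to one user's interactions, as described above, might be sketched as a simple filter over a labeled event log; the log shape and user labels are illustrative assumptions.

```python
# Hypothetical event log: (timestamp, interface id, user profile label)
events = [
    ("2023-05-09T23:10:00", "bedroom_switch", "alice"),
    ("2023-05-09T23:10:20", "hall_switch", "bob"),
    ("2023-05-09T23:10:42", "bathroom_switch", "alice"),
]

def events_for_user(events: list, user: str) -> list:
    """Keep only interactions attributed to one user profile, so that
    time durations are not computed across different occupants."""
    return [e for e in events if e[2] == user]

alice_events = events_for_user(events, "alice")
```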
The controller 106 may further be configured to identify the health status of the user based on context data, wherein the context data may be data indicative of an activity of the user from a set of predetermined activities indicative of health statuses. The time duration of predetermined user activities (e.g., cooking, having dinner, ironing, taking a shower, etc.) may be associated with the health status of the user. Context data may be extracted based on the labeling of lighting control interfaces by the user (e.g., labeling of living room lights, a staircase sensor, etc.), based on the selection of predefined (labeled) lighting scenes by the user (e.g., a lighting control interface may include multiple lighting scene selection buttons that may be pressed to selectively cause implementation of various lighting scenes, such as dinner or cooking), etc. For example, the user may interact with a first lighting control interface 102 to select a cooking lighting scene, and as a result the predetermined user activity “cooking” may be determined to occur (e.g., by the controller 106). The user may further interact with a second lighting control interface 104 to select a different scene (e.g., a dinner scene). The time duration of the predetermined cooking activity may be determined (e.g., by the controller 106) based on the time elapsed between the first and the consecutive second lighting control action. If the user experiences sensorimotor skill decline, it may take longer to cook dinner (e.g., due to the development of tremor or muscle rigidity with age, etc.), and this can be observed from data over time. Additionally or alternatively, the user may interact with a first lighting control interface 102 to select a dinner lighting scene and as a result the predetermined user activity “having dinner” may be determined to occur (e.g., by the controller 106).
The user may further interact with a second lighting control interface 104 to select a different scene (e.g., select a “watching TV” lighting scene). The time duration of the predetermined having-dinner activity can be inferred based on the time elapsed between the first and the consecutive second lighting control action. A longer duration of the dinner activity may be indicative of a decline in the health status of the user; for example, people may develop tremor with age, which might impair the ability to hold cutlery. By observing the trend of the predetermined user activities (e.g., cooking, having dinner, ironing, taking a shower, etc.) over a prolonged observation period, the health status of the user may be identified. Time-of-day information may be used to determine the activity of the user. For example, an interaction of the user with the first lighting control interface 102 (labeled as “bedroom”) to turn on the lighting device 132 and a consecutive second interaction of the user with the second lighting control interface 104 (labeled as “bathroom”) to turn on the lighting device 134, may be combined with time-of-day information (e.g., after midnight) to extract a “midnight toilet visit” activity.
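Inferring activity durations from consecutive scene selections could be sketched as follows; the log format, timestamps, and scene labels are illustrative assumptions.

```python
from datetime import datetime

# Hypothetical log of scene selections: (timestamp, scene label)
log = [
    ("2023-05-09T18:05:00", "cooking"),
    ("2023-05-09T18:50:00", "dinner"),
    ("2023-05-09T20:10:00", "watching TV"),
]

def activity_durations(log: list) -> dict:
    """Infer activity durations in minutes: each selected scene is
    assumed to run until the next scene selection."""
    out = {}
    for (t1, scene), (t2, _next_scene) in zip(log, log[1:]):
        dt = datetime.fromisoformat(t2) - datetime.fromisoformat(t1)
        out[scene] = dt.total_seconds() / 60
    return out

durations = activity_durations(log)
```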
Fig. 5 shows a method 500 of assessing a health status of a user based on interactions with lighting control interfaces. In this method 500, additional (optional) steps have been added to the method of Fig. 2. The method 500 may be performed by the controller 106 of Fig. 1, for example. The method may comprise determining 510 (e.g., by the controller 106) a change in the health status of the user and providing 512 a notification via a user interface indicative of the change in the health status. For example, if there is a change in the health status of the user, e.g., a decline or improvement in the sensorimotor skills of the user, the user or a caregiver may be notified by a message. Alternatively or additionally, the method 500 may comprise outputting 512 (e.g., by the controller 106) aggregations of this decline (or improvement), for example on a dashboard with key sensorimotor skill indicators in a numeric or graphical form. Such a dashboard could be presented in a digital user interface (e.g., a web user interface). Optionally, the method 500 may comprise controlling 514 (e.g., by the controller 106) the one or more lighting devices 132, 134 in response to the change in the health status of the user. For instance, the controller 106 may adjust the lighting control on the devices 132, 134 based on the detection of the development of motor slowing of the user (e.g., if people move slower from one room to another, the lighting devices 132, 134 may be configured to use a longer hold time before switching off after no motion has been detected). Additionally or alternatively, lighting control parameters, such as the hold time and/or sensitivity of a motion sensor connected to the lighting devices 132, 134, may be increased in response to a change in sensorimotor user skills.
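The hold-time adjustment in step 514 could be sketched as a simple scaling rule; the proportional-scaling policy and parameter names are illustrative assumptions, not the disclosed control law.

```python
def adjusted_hold_time(base_hold_s: float, speed_ratio: float) -> float:
    """Scale a motion-sensor hold time when the user's current speed of
    movement has dropped relative to their historical speed.
    `speed_ratio` = current speed / historical speed (< 1 means slower)."""
    if speed_ratio >= 1.0:
        return base_hold_s
    return base_hold_s / speed_ratio  # slower user -> longer hold time

adjusted_hold_time(30.0, 0.5)  # user moves at half speed
```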
The system 100 can also incorporate a predictive element, which uses the past time durations, determined over a different observation period, between past first lighting control actions and respective past second lighting control actions of the user to predict the health status of the user in the future. In one example, the controller 106 may use a regression algorithm to predict a future time duration between consecutive user interactions with lighting control interfaces of the user. Alternatively or additionally, a long short-term memory (LSTM) neural network or another time-series model (e.g., an autoregressive integrated moving average (ARIMA) model) can be designed to predict a future time duration of the user based on past time durations between consecutive lighting control actions of the user. A future health status of the user may be identified based on comparing the future time duration to one or more reference time durations indicative of respective health statuses.
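The regression-based prediction could be sketched with a plain least-squares line fit over the sequence of past durations; this pure-Python sketch is illustrative and stands in for the more elaborate LSTM or ARIMA models mentioned above.

```python
def predict_next_duration(past_durations: list) -> float:
    """Fit duration = a + b * index by ordinary least squares over the
    observed sequence, then extrapolate one step ahead."""
    n = len(past_durations)
    xs = range(n)
    x_mean = (n - 1) / 2
    y_mean = sum(past_durations) / n
    b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, past_durations)) \
        / sum((x - x_mean) ** 2 for x in xs)
    a = y_mean - b * x_mean
    return a + b * n

# Durations trending upward by 2 s per observation
predict_next_duration([30, 32, 34, 36])  # extrapolates the trend
```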
The method 200, 500 may be executed by computer program code of a computer program product when the computer program product is run on a processing unit of a computing device, such as the controller 106 of the system 100.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Aspects of the invention may be implemented in a computer program product, which may be a collection of computer program instructions stored on a computer readable storage device which may be executed by a computer. The instructions of the present invention may be in any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs) or Java classes. The instructions can be provided as complete executable programs, partial executable programs, as modifications to existing programs (e.g., updates) or extensions for existing programs (e.g., plugins). Moreover, parts of the processing of the present invention may be distributed over multiple computers or processors or even the ‘cloud’.
Storage media suitable for storing computer program instructions include all forms of nonvolatile memory, including but not limited to EPROM, EEPROM and flash memory devices, magnetic disks such as the internal and external hard disk drives, removable disks and CD-ROM disks. The computer program product may be distributed on such a storage medium, or may be offered for download through HTTP, FTP, email or through a server connected to a network such as the Internet.

CLAIMS:
1. A method for assessing a health status of a user based on interactions with a plurality of lighting control interfaces, the health status being indicative of sensorimotor skills and/or vision impairment of the user, wherein the lighting control interfaces are configured to control one or more lighting devices based on the interactions, the method comprising: receiving first data indicative of a first lighting control action by the user at a first physical location; receiving second data indicative of a consecutive second lighting control action by the user at a second physical location, within a predefined time period; determining, based on the first and the second data, a time duration between the first lighting control action and the consecutive second lighting control action; determining a distance between said first physical location and said second physical location based on a floor plan designating said first and second physical locations and distance measurements between the first and second physical locations; determining a speed of movement of the user based on said time duration and said distance, said speed of movement being indicative of the health status of the user; identifying the health status of the user by identifying a deviation of the speed of movement to one or more reference speed of movements indicative of respective health statuses.
2. The method according to claim 1, wherein the one or more reference speed of movements are past speed of movements, determined over an observation period, between past first lighting control actions and respective past second lighting control actions of the user, and wherein the identifying of the health status of the user comprises: identifying a trend in the health status of the user by determining a deviation of the speed of movement from the past speed of movements.
3. The method according to claim 2, wherein the method comprises: capturing the past speed of movements of the user by receiving sets of first and respective second data over a period of time; determining time durations between first lighting control actions and second lighting control actions; determining speeds of movements based on time durations and distance and storing the past speed of movements as the one or more reference speed of movements.
4. The method according to claims 2 or 3, further including: generating a prediction for a future speed of movement, wherein the prediction is based at least in part on the past speed of movements as input into a predictive model; and identifying a future health status of the user by comparing the future speed of movement to one or more reference speed of movements indicative of respective health statuses.
5. The method according to claim 1, wherein the method comprises determining the distance between said first physical location and said second physical location further based on communication signals transmitted from devices located in the first and second physical location.
6. The method according to claim 1, wherein the method comprises dynamically determining the distance between the first and the second physical location at time instances that the user interacts with the first and the second lighting control interface.
7. The method according to any of the preceding claims, wherein the plurality of lighting control interfaces comprises three or more lighting control interfaces, wherein the first lighting control action is an interaction with a first lighting control interface and the second lighting control action is an interaction with a second lighting control interface, wherein the method comprises: selecting the first lighting control interface and the second lighting control interface from the plurality of lighting control interfaces based on the physical locations of the lighting control interfaces.
8. The method according to any of the preceding claims, wherein the method comprises: receiving data indicative of a user profile of the user; and identifying the health status of the user further based on the user profile.
9. The method according to any of the preceding claims, wherein the health status of the user is further identified based on context data, wherein the context data is data indicative of an activity of the user from a set of predetermined activities indicative of health statuses.
10. The method according to claim 2, further including: determining a change in the health status of the user and providing a notification via a user interface indicative of the change in the health status.
11. The method according to claim 2, further including: determining a change in the health status of the user and controlling the one or more lighting devices in response to the change in the health status of the user.
12. A computer program product for a computing device, the computer program product comprising computer program code to perform the method of any preceding claim when the computer program product is run on a processing unit of the computing device.
13. A system for assessing a health status of a user based on interactions with a plurality of lighting control interfaces, the health status being indicative of sensorimotor skills and/or vision impairment of the user, wherein the lighting control interfaces are configured to control one or more lighting devices based on the interactions, the system comprising:
a first lighting control interface configured to generate first data indicative of a first lighting control action by the user at a first physical location;
a second lighting control interface configured to generate second data indicative of a consecutive second lighting control action by the user at a second physical location, within a predetermined period of time; and
a controller configured to receive the first and second data, and further configured to:
determine, based on the first and second data, a time duration between the first lighting control action and the consecutive second lighting control action;
determine a distance between said first physical location and said second physical location based on a floor plan designating said first and second physical locations and distance measurements between the first and second physical locations;
determine a speed of movement of the user based on said time duration and said distance, said speed of movement being indicative of the health status of the user; and
identify the health status of the user by comparing the speed of movement to one or more reference speeds of movement indicative of respective health statuses.
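The controller steps recited in claim 13 (time duration between consecutive actions, floor-plan distance between the two interface locations, speed of movement, comparison against reference speeds) can be sketched as below. All identifiers, distances, thresholds, and status labels are illustrative assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ControlEvent:
    """A lighting control action reported by one interface."""
    interface_id: str
    timestamp: float  # seconds

# Hypothetical floor-plan distances between interface locations, in metres
FLOOR_PLAN_DISTANCES = {
    ("hall_switch", "kitchen_switch"): 6.0,
}

# Hypothetical reference speed bands: (lower bound in m/s, status label)
REFERENCE_SPEEDS = [
    (1.0, "typical mobility"),
    (0.6, "mildly reduced mobility"),
    (0.0, "strongly reduced mobility"),
]

def assess_health_status(first, second, max_gap_s=120.0):
    """Derive a speed of movement from two consecutive lighting control
    actions and map it onto a reference speed band."""
    duration = second.timestamp - first.timestamp
    if duration <= 0 or duration > max_gap_s:
        return None  # not consecutive within the predetermined period of time
    key = (first.interface_id, second.interface_id)
    distance = (FLOOR_PLAN_DISTANCES.get(key)
                or FLOOR_PLAN_DISTANCES.get(key[::-1]))
    if distance is None:
        return None  # no floor-plan distance known for this interface pair
    speed = distance / duration  # m/s
    for lower_bound, status in REFERENCE_SPEEDS:
        if speed >= lower_bound:
            return status
    return None
```

For example, actuating the hall switch and then the kitchen switch six seconds apart over an assumed six-metre path yields 1.0 m/s, which the sketch classifies into the highest band.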
PCT/EP2023/062209 2022-05-10 2023-05-09 A system and method for assessing a health status of a user based on interactions with lighting control interfaces WO2023217745A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP22172457.8 2022-05-10
EP22172457 2022-05-10

Publications (1)

Publication Number Publication Date
WO2023217745A1 (en) 2023-11-16

Family

ID=81975231

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/062209 WO2023217745A1 (en) 2022-05-10 2023-05-09 A system and method for assessing a health status of a user based on interactions with lighting control interfaces

Country Status (1)

Country Link
WO (1) WO2023217745A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182787A (en) * 2018-01-17 2018-06-19 深圳海斯凯医学技术有限公司 A kind of wired home health monitoring systems platform based on big data Internet of Things
US20190244508A1 (en) * 2016-10-20 2019-08-08 Signify Holding B.V. A system and method for monitoring activities of daily living of a person
WO2019199365A2 (en) * 2018-04-13 2019-10-17 BrainofT Inc. Utilizing context information of environment component regions for event/activity prediction
US20210057101A1 (en) * 2019-08-20 2021-02-25 Vinya Intelligence Inc. In-home remote monitoring systems and methods for predicting health status decline

Similar Documents

Publication Publication Date Title
US11846954B2 (en) Home and building automation system
US10978064B2 (en) Contextually relevant spoken device-to-device communication between IoT devices
JP6731506B2 (en) A smart home hazard detector that gives a non-alarm status signal at the right moment
CN109661856B (en) Illumination control method, system and storage medium
US20190349213A1 (en) Systems and Methods for Home Automation Control
CN107439056B (en) Illumination control device
Flores-Martin et al. Smart nursing homes: Self-management architecture based on iot and machine learning for rural areas
EP3198577B1 (en) A system for managing services
CN110140428B (en) Adaptive lighting automation
CN110599747A (en) User reminding method and device and intelligent doorbell system
US10609787B2 (en) Recommendation engine for a lighting system
Radziszewski et al. Designing calm and non-intrusive ambient assisted living system for monitoring nighttime wanderings
US11116060B2 (en) Presence simulation system and method
WO2023217745A1 (en) A system and method for assessing a health status of a user based on interactions with lighting control interfaces
Cunha et al. AmbLEDs collaborative healthcare for AAL systems
US20160062329A1 (en) Control method of presented information, control device of presented information, and speaker
KR20200036199A (en) Method and System of Care Service for The Cognitively Impaired
CN110462665B (en) Method and apparatus for monitoring use of a lighting system
CASAGRANDE Review on Assisted Living Technologies
Montanini Smartphone Applications for AAL and Well-being in the Home Environment
KR20200081745A (en) Method for controlling a smart emotional light device based on IoT by considering biological rhythm and life patterns of human
Vadillo Moreno et al. Deployment of a smart telecare system to carry out an intelligent health monitoring at home

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23725678

Country of ref document: EP

Kind code of ref document: A1