WO2015125142A1 - Methods and systems for personalized sensory sensitivity simulation and alerting

Publication number: WO2015125142A1
Authority: WIPO (PCT)
Prior art keywords: sensory, defensiveness, condition, analysis, scene
Application number: PCT/IL2015/050188
Other languages: French (fr)
Inventors: Dafna Miriam SHOMRONI LESS, Yahel ATSMON
Original assignee: Shomroni Less Dafna Miriam
Application filed by Shomroni Less Dafna Miriam
Publication of WO2015125142A1

Classifications

    • A — HUMAN NECESSITIES › A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE › A61B — DIAGNOSIS; SURGERY; IDENTIFICATION › A61B 5/00 — Measuring for diagnostic purposes; Identification of persons
        • A61B 5/4005 — Detecting, measuring or recording for evaluating the nervous system, for evaluating the sensory system
        • A61B 5/7264 — Signal processing: classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
        • A61B 5/021 — Measuring pressure in heart or blood vessels
        • A61B 5/02438 — Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
        • A61B 5/11 — Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
        • A61B 5/14539 — Measuring characteristics of blood or body fluids in vivo, for measuring pH
        • A61B 5/14542 — Measuring characteristics of blood in vivo, for measuring blood gases
        • A61B 5/165 — Evaluating the state of mind, e.g. depression, anxiety
        • A61B 5/318 — Heart-related electrical modalities, e.g. electrocardiography [ECG]
        • A61B 5/681 — Wristwatch-type devices
    • G — PHYSICS › G16 — ICT SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS › G16H — HEALTHCARE INFORMATICS
        • G16H 50/20 — ICT specially adapted for computer-aided medical diagnosis, e.g. based on medical expert systems

Abstract

A method of presenting a sensory defensiveness condition stimulus indication on a mobile device. The method comprises setting a sensory profile defining at least one sensory defensiveness condition stimulus which triggers at least one sensory defensiveness condition of a target individual, capturing, using at least one sensor of a mobile device, scene data comprising at least one of an audio stream, a video stream and a kinesthetic stream of a scene in proximity to said mobile device, performing an analysis of said scene data to detect a presence or an absence of said at least one sensory defensiveness condition stimulus in said scene, generating a presentation which is indicative of said presence or said absence according to an outcome of said analysis, and presenting said presentation on said mobile device.

Description

METHODS AND SYSTEMS FOR PERSONALIZED SENSORY
SENSITIVITY SIMULATION AND ALERTING
BACKGROUND
The present invention, in some embodiments thereof, relates to sensory sensitivity simulation and, more specifically, but not exclusively, to methods and systems for personalized sensory sensitivity simulation and alerting.
Sensory processing disorder (SPD) is a condition that exists when sensory signals do not integrate to provide appropriate responses; the various types of sensory information are processed by multisensory integration. Different people experience a wide range of difficulties when processing input coming from a variety of senses. For example, some people find wool fabrics itchy and hard to wear while others do not, and some individuals might experience motion sickness while riding amusement park rides while their friends are having fun. However, sensory processing disorder is characterized by significant problems organizing sensation coming from the body and the environment, manifested by difficulties in performance in one or more of the main areas of life: productivity, leisure and play, or activities of daily living.
Sensory defensiveness is a subset of sensory processing disorder. Sensory defensiveness is a condition defined as having "a tendency to react negatively or with alarm to sensory input which is generally considered harmless or non-irritating" to neurotypical people, see Wilbarger, Patricia and Wilbarger, Julia. (1991). Sensory Defensiveness in Children Aged 2-12: An Intervention Guide for Parents and Other Caretakers.
Common symptoms of sensory defensiveness include intolerance of high-pitched noises, intolerance of chewing sounds, intolerance of overhead lights (especially fluorescent lighting); experiencing a feeling of being attacked upon being touched (especially from light touch or sudden touch); intolerance of certain types of fabrics in contact with the skin; becoming nauseated upon smelling something that does not smell bad to neurotypical individuals; difficulty maintaining eye-contact; severe intolerance of foods due to taste, texture, or temperature; and generally becoming overwhelmed when exposed to a lot of sensory stimuli at once.

SUMMARY
According to some embodiments of the present invention, there is provided a wearable device which detects a pre-atypical behaviour physiological pattern. The wearable device comprises a housing, a memory adapted to store a pre-atypical behaviour physiological pattern of a wearer triggered by at least one sensory defensiveness condition stimulus, a sensing circuit integrated into the housing and adapted to capture at least one physiological parameter of the wearer, and at least one processor adapted to perform an analysis of the at least one physiological parameter to detect a presence or an absence of the pre-atypical behaviour physiological pattern, generate a notification indicative of the presence or the absence according to an outcome of the analysis, and instruct a presentation of the notification.
Optionally, the housing is sized and shaped as a wrist worn device.
Optionally, the sensing circuit comprises a member of a group consisting of a human pulse detection module, a blood oxygen concentration detection module, a body surface humidity detection module, and a body temperature detection module.
Optionally, the wearable device further comprises at least one motion sensor; wherein the at least one physiological parameter comprises a limb motion pattern.
Optionally, the wearable device further comprises a display for displaying the notification.
Optionally, the wearable device further comprises a wireless unit for transmitting the notification to a mobile device.
According to some embodiments of the present invention, there is provided a method of detecting a pre-atypical behaviour physiological pattern using a wearable device. The method comprises storing a pre-atypical behaviour physiological pattern of a target individual triggered by at least one sensory defensiveness condition stimulus, capturing, using at least one sensor of a wearable device, at least one physiological parameter of the target individual, performing an analysis of the at least one physiological parameter to detect a presence or an absence of the pre-atypical behaviour physiological pattern, generating a notification indicative of the presence or the absence according to an outcome of the analysis, and instructing a presentation of the notification to at least one of the target individual and a third party.

Optionally, the pre-atypical behaviour physiological pattern is identified by monitoring a response of the target individual to the at least one sensory defensiveness condition stimulus.
Optionally, the pre-atypical behaviour physiological pattern is identified using a statistical classifier.
Optionally, the pre-atypical behaviour physiological pattern is identified according to a correlation with scene data comprising at least one of an audio stream, video stream and kinesthetic stream of a scene in proximity to the target individual.
Optionally, the at least one physiological parameter is selected from a group consisting of a skin surface temperature, a skin pH, a body temperature, a blood oxygen saturation, an electrocardiograph (ECG) pattern, a pulse rate, a respiration rate, and blood pressure.
Optionally, the pre-atypical behaviour physiological pattern is indicative of a sensory overload, an overstimulation, an emotional dysregulation (ED), or a physical state or pattern which precedes an atypical sensory-based behaviour.
Optionally, the performing comprises: capturing scene data comprising at least one of an audio stream, video stream and kinesthetic stream of a scene in proximity to the wearable device, correlating between the pre-atypical behaviour physiological pattern and the scene data, and determining whether to output the notification according to the correlating.
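By way of illustration only, the correlating and gating step described above could be sketched as follows; the function name and the ten-second coincidence window are assumptions for the sketch, not taken from this disclosure:

```python
def should_notify(pattern_times, stimulus_times, window=10.0):
    """Emit the notification only when a detected pre-atypical behaviour
    physiological pattern coincides, within `window` seconds, with a
    sensory defensiveness condition stimulus detected in the scene data.

    pattern_times / stimulus_times: detection timestamps in seconds.
    """
    return any(abs(p - s) <= window
               for p in pattern_times
               for s in stimulus_times)
```

For example, a pattern detected at t = 100 s would be reported if a scene stimulus was detected at t = 95 s, but suppressed if the nearest stimulus occurred at t = 50 s.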
According to some embodiments of the present invention, there is provided a method of presenting a sensory defensiveness condition stimulus indication on a mobile device. The method comprises setting a sensory profile defining at least one sensory defensiveness condition stimulus which triggers at least one sensory defensiveness condition of a target individual, capturing, using at least one sensor of a mobile device, scene data comprising at least one of an audio stream, a video stream and a kinesthetic stream of a scene in proximity to the mobile device, performing an analysis of the scene data to detect a presence or an absence of the at least one sensory defensiveness condition stimulus in the scene, generating a presentation which is indicative of the presence or the absence according to an outcome of the analysis, and presenting the presentation on the mobile device.

Optionally, the analysis comprises an image processing analysis applying a determination function on the video stream to identify an object triggering the at least one sensory defensiveness condition.
Optionally, the analysis comprises an image processing analysis applying a determination function on the video stream to identify a color triggering the at least one sensory defensiveness condition.
Optionally, the analysis comprises an image processing analysis applying a determination function on the video stream to identify a source of flickering light triggering the at least one sensory defensiveness condition.
Optionally, the analysis comprises an image processing analysis applying a determination function on the video stream to identify a presence of a crowd in the scene which triggers the at least one sensory defensiveness condition.
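As a minimal sketch of one such determination function for flickering light (all names and thresholds here are assumptions, not defined in this disclosure), a strong periodic component in mean frame brightness can be taken as evidence of a flickering source:

```python
import numpy as np

def detect_flicker(frames, fps=30.0, threshold_hz=3.0, min_amplitude=10.0):
    """Hypothetical determination function for the video stream: flag a
    flickering light source when mean frame brightness oscillates above
    `threshold_hz` with a peak-to-mean amplitude above `min_amplitude`.

    frames: iterable of 2-D grayscale arrays.
    """
    brightness = np.array([f.mean() for f in frames], dtype=float)
    brightness -= brightness.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(brightness))
    freqs = np.fft.rfftfreq(len(brightness), d=1.0 / fps)
    mask = freqs > threshold_hz
    # |rfft| of a sine of amplitude A over N samples peaks near A*N/2.
    limit = min_amplitude * len(brightness) / 2
    return bool(mask.any() and spectrum[mask].max() > limit)
```

A steadily lit scene yields a near-flat brightness trace and is not flagged, while a lamp pulsing at, say, 10 Hz produces a dominant spectral peak above the threshold.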
Optionally, the capturing, the performing, the generating, and the presenting are performed in response to a user triggering action.
Optionally, the capturing, the performing, the generating, and the presenting are iteratively performed; wherein the mobile device is a wearable device.
Optionally, the analysis comprises a sound processing analysis applying a determination function on the audio stream to identify a sound having a characteristic triggering the at least one sensory defensiveness condition.
More optionally, the characteristic is selected from a group consisting of a pattern, a frequency, an intensity, and a pitch.
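A determination function over the audio stream might, for instance, compare the dominant frequency and loudness of a buffer against per-individual limits; in this sketch the profile keys and thresholds are assumptions, not terms defined in this disclosure:

```python
import numpy as np

def sound_triggers(profile, samples, rate=16000):
    """Hypothetical audio determination function: True when the buffer's
    dominant frequency or RMS loudness exceeds limits from an assumed
    sensory-profile dict {'max_freq_hz': ..., 'max_rms': ...}.

    samples: 1-D float array of audio samples at `rate` Hz.
    """
    rms = float(np.sqrt(np.mean(samples ** 2)))          # loudness proxy
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    dominant = float(freqs[spectrum.argmax()])           # strongest pitch
    return dominant > profile["max_freq_hz"] or rms > profile["max_rms"]
```

Under a profile limiting pitch to 3 kHz, a 4 kHz tone would be flagged while a quiet 200 Hz tone would not.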
Optionally, the analysis comprises a motion processing analysis applying a determination function on the kinesthetic stream to identify a motion of the mobile device having a motion characteristic triggering the at least one sensory defensiveness condition.
Optionally, the generating comprises generating a simulation of an effect of the at least one sensory defensiveness condition stimulus on the target individual by processing the scene data.
More optionally, the intensity of the simulated effect of the at least one sensory defensiveness condition stimulus is correlated with a reference intensity defined in the sensory profile.

More optionally, the generating of a simulation comprises processing the video stream to identify an object in the video stream and to edit the video stream so that at least one visual characteristic of the object is adapted therein to emulate the effect of the at least one sensory defensiveness condition stimulus on the target individual.
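Editing a visual characteristic in this way could be sketched, for a colour stimulus, by exaggerating pixels near a profile-defined trigger colour and muting the rest of the frame; the function name, tolerance, and gain factors below are illustrative assumptions only:

```python
import numpy as np

def emphasize_trigger_color(frame, trigger_rgb, tol=30, boost=1.6, mute=0.6):
    """Illustrative frame edit: make pixels close to a trigger colour
    brighter (by `boost`) and dim the remainder (by `mute`), roughly
    emulating how the stimulus may dominate the individual's perception.

    frame: H x W x 3 uint8 RGB image; trigger_rgb: (r, g, b) tuple.
    """
    out = frame.astype(np.float32)
    target = np.array(trigger_rgb, dtype=np.float32)
    # Chebyshev distance per pixel to the trigger colour.
    dist = np.abs(out - target).max(axis=-1)
    mask = dist <= tol
    out[mask] = np.clip(out[mask] * boost, 0, 255)   # exaggerate triggers
    out[~mask] *= mute                               # mute everything else
    return out.astype(np.uint8)
```

Applied per frame of the captured video stream, this yields the edited clip described above.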
Optionally, the sensory profile defines an intensity level for the at least one sensory defensiveness condition stimulus; wherein the analyzing is performed while taking into account the intensity level.
Optionally, the setting comprises receiving from a user the at least one sensory defensiveness condition stimulus on the mobile device.
Optionally, the setting comprises extracting the at least one sensory defensiveness condition stimulus from a sensory profile form filled by a physician.
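The sensory profile set in this manner could be represented, purely as an illustrative sketch (field names and the 1–5 intensity scale are assumptions, not defined in this disclosure), as a per-individual list of stimulus entries with intensity thresholds:

```python
from dataclasses import dataclass, field

@dataclass
class StimulusEntry:
    kind: str          # e.g. "sound", "color", "flicker", "crowd"
    descriptor: str    # e.g. "high_pitch", "fluorescent"
    intensity: int     # assumed 1 (mild) .. 5 (severe) personal threshold

@dataclass
class SensoryProfile:
    individual: str
    stimuli: list = field(default_factory=list)

    def add_stimulus(self, kind, descriptor, intensity):
        # Entries may come from user input or a physician-filled form.
        self.stimuli.append(StimulusEntry(kind, descriptor, intensity))

    def triggers(self, kind, descriptor, observed_intensity):
        """True when an observed stimulus meets or exceeds a threshold."""
        return any(s.kind == kind and s.descriptor == descriptor
                   and observed_intensity >= s.intensity
                   for s in self.stimuli)
```

The analysis step can then consult `triggers(...)` so that, as described above, the intensity level defined in the profile is taken into account.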
According to some embodiments of the present invention, there is provided a mobile device which presents a sensory defensiveness condition indication. The mobile device comprises a processor, a memory which stores a sensory profile defining at least one sensory defensiveness condition stimulus which triggers at least one sensory defensiveness condition of a target individual, at least one sensor which captures scene data comprising at least one of an audio stream, a video stream and a kinesthetic stream of a scene in proximity to the mobile device, a detection module executed by the processor to perform an analysis of the scene data to detect a presence or an absence of the at least one sensory defensiveness condition stimulus in the scene, a presentation module executed by the processor to generate a presentation which is indicative of the presence or the absence according to an outcome of the analysis, and a presentation unit which presents the presentation on the mobile device.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
In the drawings:
FIG. 1 is a flowchart of a method of real time presentation of indications of sensory defensiveness conditions of a target individual or sensory defensiveness condition stimuli from a scene captured by sensors of or connected to a mobile device, according to some embodiments of the present invention;
FIG. 2 is a schematic illustration of a mobile device for real time simulation or indication of sensory defensiveness conditions of a target individual, according to some embodiments of the present invention;
FIG. 3A is an image of a scene which contains defensiveness condition stimuli of a certain target individual;
FIG. 3B is an image which depicts an effect of a defensiveness condition stimulus in the scene depicted in FIG. 3A, according to some embodiments of the present invention;
FIG. 3C is a set of exemplary frames depicting an effect of a defensiveness condition stimulus in the scene depicted in FIG. 3A, according to some embodiments of the present invention;
FIG. 4 is a schematic illustration of an exemplary sensor(s) enhanced wearable device that includes sensors and modules to gather real time physical data of a target individual (a wearer) so as to identify a physical reaction to stimuli in the scene wherein the target individual is located, according to some embodiments of the present invention;
FIG. 5 is a flowchart of a method of personalizing a reaction of a target individual to stimuli by identifying a pre-atypical behaviour physiological pattern using a wearable device, such as the device depicted in FIG. 4, according to some embodiments of the present invention;

FIG. 6 is a flowchart of a method of combining outputs of physiological parameter sensors with the outputs of sensors for detecting a pre-atypical behaviour physiological pattern indicative of sensory overload, according to some embodiments of the present invention; and
FIG. 7 is a process of identifying an atypical behaviour physiological pattern using statistical classifier(s) for classifying behaviour physiological patterns which are recorded using a wearable device, according to some embodiments of the present invention.

DETAILED DESCRIPTION
The present invention, in some embodiments thereof, relates to sensory sensitivity simulation and, more specifically, but not exclusively, to methods and systems for personalized sensory sensitivity simulation and alerting.
According to some embodiments of the present invention, there are provided methods and systems allowing a user to use his mobile device, for example a smartphone or a wearable computing device, to identify in a scene sensory defensiveness condition stimuli which trigger defensiveness conditions in a target individual, such as a non-neurotypical individual. In use, the methods and systems may be used for presenting to a user, such as a caregiver, indications about one or more sensory defensiveness condition stimuli in the scene, pointing out which objects or noises in the scene, though generally considered harmless or non-irritating, may trigger such conditions. In such a manner, the systems and methods guide the user as to which stimuli should be adjusted or removed from the scene to prevent a sensory defensiveness condition wherein the target individual reacts negatively or with alarm.
Optionally, the user is presented in real time with a simulation which emulates how the target individual experiences the scene. The simulation may be a video clip or a stream having images of the scene which are edited to emulate how the target individual sees certain objects, or to indicate which objects are perceived negatively. The simulation may include an audio stream playing sounds from the scene in an edited manner, indicating which sounds may trigger a sensory defensiveness condition. The simulation is optionally presented in real time, for example while the scene data is captured by the mobile device.

Optionally, the sensory defensiveness condition stimuli identified in the scene are stimuli defined in a sensory profile adapted for the target individual. For example, the sensory profile may define common scene sensory defensiveness condition stimuli such as high-pitched noises, chewing sounds, and overhead lights (especially fluorescent lighting). The presence or absence of the sensory defensiveness condition stimuli may be identified from the analysis of scene data captured by the mobile device, for example an audio stream, a video stream and a kinesthetic stream of the scene in proximity to the mobile device. The analysis may be performed by applying determination functions on the captured scene data.
The sensory profile may be defined according to the SENSORY PROFILE™ defined by Winnie Dunn, see Winnie Dunn, Sensory Profile: User's Manual, Psychological Corporation, 1999, ISBN 0761638016, 9780761638018; Catana E. Brown et al, Adolescent-Adult Sensory Profile: User's Manual, Communication Skill Builders/Therapy Skill Builders, 2002, ISBN 0761649719, 9780761649717; and Winnie Dunn, infant-toddler Sensory Profile: User's Manual, Therapy Skill Builders, Psychological Corporation, 2002, ISBN 0761649557, 9780761649557, which are incorporated herein by reference.
According to some embodiments of the present invention, there are provided methods and systems wherein a sensor(s) enhanced wearable device, such as a wristband, a smart watch, a sticker and/or the like, is used to gather real time physical data of a target individual so as to identify a physical reaction to stimuli in the scene wherein the target individual is located. This identification may be used to identify an atypical sensory-based behaviour, such as a sensory overload, overstimulation, emotional dysregulation (ED), and/or a physical state or pattern which precedes an atypical sensory-based behaviour. A pattern or values of one or more physical parameters indicative of an atypical sensory-based behaviour, or a pattern or values of one or more physical parameters which precede an atypical sensory-based behaviour, may be referred to herein for brevity as a pre-atypical behaviour physiological pattern.
In use, the sensor(s) enhanced wearable device may be used for monitoring a wearer and reacting to the detection of a pre-atypical behaviour physiological pattern by generating a notification to the target individual or his surroundings, for instance generating a haptic feedback, playing a relaxing tone or music, playing a recording of a caregiver (e.g. the voice of a talking or singing parent) and/or the like. Additionally or alternatively, the sensor(s) enhanced wearable device may generate, in response to the detection of the pre-atypical behaviour physiological pattern, a notification to a caregiver (e.g. a parent), for instance via a blinking LED or display and/or by sending a message to a client device of the parent, for instance an SMS, an IM message, an email or a client application notification.
Optionally, sensors embedded in the sensor(s) enhanced wearable device keep track of physiological parameters such as skin surface temperature, skin pH, and/or vital signs (e.g. body temperature, blood oxygen saturation, electrocardiograph (ECG) patterns, pulse rate, respiration rate, and blood pressure) of the target individual. The output of the sensors is analyzed, either locally or at a remote server, to detect a presence or an absence of a pre-atypical behaviour physiological pattern.
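One simple way such an analysis could be realized (a sketch under assumed names; the per-wearer baseline and the two-standard-deviation rule are illustrative choices, not taken from this disclosure) is to flag any vital sign that strays far from the wearer's baseline:

```python
def pre_atypical_pattern(readings, baseline, k=2.0):
    """Illustrative detector: flag a pre-atypical behaviour physiological
    pattern when a current reading deviates from the wearer's baseline by
    more than k standard deviations.

    readings: dict of parameter -> current value.
    baseline: dict of parameter -> (mean, std) learned for this wearer.
    Returns the subset of readings that are out of range (empty = absent).
    """
    flagged = {}
    for name, value in readings.items():
        mean, std = baseline[name]
        if std > 0 and abs(value - mean) > k * std:
            flagged[name] = value
    return flagged
```

A non-empty result corresponds to "presence" of the pattern and would trigger the notification path described above.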
Optionally, the presence or the absence of a pre-atypical behaviour physiological pattern and/or an atypical behaviour event is determined by an analysis of user motion as recorded by embedded motion sensors, such as accelerometers or gyroscopes, for example repetitive movements (waving hands, clapping, etc.) or a distinct physical posture (for example, holding the hands on the ears).
Additionally or alternatively, the sensor(s) enhanced wearable device includes a sensing circuit comprising a human pulse detection module, a blood oxygen concentration detection module, a body surface humidity detection module, and/or a body temperature detection module. Optionally, the sensor(s) enhanced wearable device also includes one or more accelerometers for identifying current motion patterns, one or more microprocessors for processing inputs from the sensors, and wireless communication circuitry for forwarding data to a target device, such as a mobile device.
Additionally or alternatively, the sensor(s) enhanced wearable device includes a sensing circuit for detecting scene sensory defensiveness condition stimuli, for example a camera, a microphone or a communication module for receiving data about the scene sensory defensiveness condition stimuli from a remote device, such as a mobile device or a stationary device that is installed nearby. In such embodiments, the pre-atypical behaviour physiological pattern includes indications of the scene. These might include, for example, loud environmental noise or a significant visual event.

Additionally or alternatively, the detection of a pre-atypical behaviour physiological pattern is determined using statistical classifiers generated based on a training set of a plurality of physiological parameter records. In such embodiments, the sensor(s) enhanced wearable device may be used to log physiological parameter changes and atypical behaviour events into a log record. This log record may then be forwarded to a remote server for updating the statistical classifiers.
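As a stand-in for such a statistical classifier (a minimal nearest-centroid sketch, not the classifier of this disclosure; the feature layout and labels are assumptions), logged physiological records labelled as preceding an atypical-behaviour event or not could be used as the training set:

```python
import numpy as np

class PatternClassifier:
    """Illustrative nearest-centroid classifier over logged physiological
    parameter records (label 1 = preceded an atypical behaviour event,
    label 0 = did not)."""

    def fit(self, records, labels):
        X = np.asarray(records, dtype=float)
        y = np.asarray(labels)
        # One centroid per class from the labelled training records.
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
        return self

    def predict(self, record):
        r = np.asarray(record, dtype=float)
        # Assign the class whose centroid is nearest in feature space.
        return min(self.centroids_,
                   key=lambda c: np.linalg.norm(r - self.centroids_[c]))
```

Retraining on an updated log record at the remote server then amounts to calling `fit` again on the enlarged training set.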
Additionally or alternatively, the detection of a pre-atypical behaviour physiological pattern is determined based on a personal profile of the target individual. In such embodiments, a personal profile is built for the target individual by analyzing a log record, for example as outlined above. The personal profile may be indicative of personal pre-atypical behaviour physiological patterns which have triggered atypical behaviours in the past.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention. The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks. The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Reference is now made to FIG. 1, which is a flowchart of a method 100 of real time presentation of indications of sensory defensiveness conditions of a target individual, or of sensory defensiveness condition stimuli, from a scene captured by sensors of or connected to a mobile device, according to some embodiments of the present invention. The method allows a caretaker or any other individual to receive an indication of stimuli in a scene which may trigger sensory defensiveness conditions in the target individual. As used herein, a mobile device may be a handheld device such as a Smartphone or a tablet, a wearable device such as a Smart watch or Smart glasses, and/or any other computing based device.
Reference is also made to FIG. 2, which is a schematic illustration of a mobile device 200, such as a Smartphone, for real time simulation or indication of sensory defensiveness condition stimuli triggering sensory defensiveness conditions of a target individual from a scene captured by sensors 203 of or connected to the mobile device, according to some embodiments of the present invention. The mobile device 200 includes a user interface module which generates a user interface, such as a graphical user interface (GUI), displayed on a display 201 (or any other presentation unit) and allows a user to provide a sensory profile defining sensory defensiveness condition stimuli which trigger sensory defensiveness condition(s) of an individual.
The mobile device 200 includes a personalization module 206 set to store one or more sensory defensiveness conditions and/or sensory defensiveness condition stimuli of one or more individuals. The personalization module 206 optionally generates a user interface (UI), such as a graphical UI (GUI), which is presented to a user to allow him to define one or more defensiveness condition stimuli. For instance, a list of defensiveness condition stimuli is presented to the user, allowing him to select a defensiveness condition stimulus and optionally to define a reaction intensity, for example using an intensity bar. Optionally, the user also defines ranges of the defensiveness condition stimuli, for example a dB range, a frequency range, a color range, and/or the like.
The mobile device 200 further includes a stimuli detection module 202 which is installed in the mobile device 200 and set to receive and analyze outputs of one or more sensors 203 which capture scene data, for example visual, audible, and/or kinesthetic streams, or any other fixations, of a scene in proximity to the mobile device, in response to a user input made, for example, using the above GUI. For example, the mobile device 200 includes or is connected to an imager, for example an integrated camera of the mobile device 200, such as a Smartphone, for capturing a visual stream, such as a video sequence, an image, and the like. In such embodiments, a video sequence may be captured and analyzed by the stimuli detection module 202 for detecting defensiveness condition stimuli. Additionally or alternatively, the mobile device 200 includes an audio recording device, such as an integrated microphone of the mobile device 200, such as a Smartphone, for capturing an audio stream, such as an audio recording. In such embodiments, an audio sequence is captured and analyzed by the stimuli detection module 202 for detecting defensiveness condition stimuli, for instance noises having a certain pattern (e.g. monotonic), pitch, or frequency and/or above a decibel (dB) threshold. Additionally or alternatively, the mobile device 200 includes a motion sensor, such as an accelerometer of the mobile device 200. In such embodiments, a motion reading is captured and analyzed by the stimuli detection module 202 for detecting defensiveness condition stimuli, for example an excessive motion above a threshold.
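By way of a non-limiting illustration, the excessive-motion detection described above may be sketched as follows. The function names, the fixed magnitude threshold, and the sample format are illustrative assumptions for explanatory purposes only and are not defined by the embodiment:

```python
import math

def acceleration_magnitude(reading):
    """Euclidean magnitude of a single (x, y, z) accelerometer sample."""
    x, y, z = reading
    return math.sqrt(x * x + y * y + z * z)

def detect_excessive_motion(readings, threshold=15.0):
    """Return True when any sample's magnitude exceeds the threshold.

    `threshold` (in m/s^2) is an illustrative value, not one defined by
    the embodiment; in practice it would come from the sensory profile.
    """
    return any(acceleration_magnitude(r) > threshold for r in readings)

calm = [(0.1, 0.2, 9.8), (0.0, 0.1, 9.8)]      # roughly gravity only
shaking = [(12.0, 9.0, 9.8), (0.1, 0.0, 9.8)]  # contains a sharp jolt
print(detect_excessive_motion(calm), detect_excessive_motion(shaking))
```

In this sketch a calm sequence stays below the threshold while a jolt exceeds it, which is the kind of binary output the stimuli detection module 202 would forward for presentation.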
In use, a detection module 207 installed in the mobile device 200 receives outputs of the sensor(s) 203 and analyzes them to detect a presence or an absence of one or more defensiveness condition stimuli which have been associated with a target individual in a scene around the mobile device 200. For example, the sensor(s) 203 image the surroundings in front of the mobile device 200 and/or record sounds around the mobile device 200 and forward the captured video and/or audio sequence to the detection module 207 for analysis.
The detection module 207 applies one or more detection functions to detect a presence or an absence of one or more defensiveness condition stimuli. Based on the output of the analysis, for instance upon a detection of a presence of such defensiveness condition stimuli, a presentation module 208 generates an indicative presentation displayed on the display 201 or using another presentation unit, such as a speaker. The presentation may be in the form of a notification, for example "no defensiveness stimuli have been found", "the surrounding noise is too high", "note the red colored items in the scene", etc. Such a presentation may be extracted from a database that allows matching between the identified defensiveness stimuli and records defining notifications to present. Additionally or alternatively, the presentation is in the form of a simulation of the effect of one or more detected defensiveness condition stimuli on the target individual. For instance, a video clip emulating noises and/or deformed visual objects may be generated by processing the captured audio/video stream and editing one or more audible and/or visual characteristics of sounds and/or objects therein. For example, the simulation may be playing recorded surrounding noises in an intensified manner, activating vibration elements of the mobile device 200 to emulate tactile pressure, and/or presenting processed images with visually edited objects of a scene in proximity to the mobile device 200. This allows creating a simulation of the effects of visual objects and sounds on the current reality perspective of the target individual. Optionally, the intensity of a simulated defensiveness condition stimulus is correlated with a reference intensity set in the sensory profile, for instance set by the user for this defensiveness condition stimulus, for example using the above personalization module 206.
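The matching between identified stimuli and notification records may be illustrated by the following minimal sketch. The stimulus keys, message strings, and in-memory dictionary are illustrative assumptions; the embodiment leaves the database structure open:

```python
# Hypothetical mapping between identified stimuli and notification records,
# standing in for the database consulted by the presentation module 208.
NOTIFICATIONS = {
    "loud_noise": "the surrounding noise is too high",
    "red_object": "note the red colored items in the scene",
    "bright_light": "note the bright light sources in the scene",
}

def build_presentation(detected_stimuli):
    """Return the notification list for a set of detected stimuli."""
    if not detected_stimuli:
        return ["no defensiveness stimuli have been found"]
    return [NOTIFICATIONS.get(s, "unrecognized stimulus: " + s)
            for s in detected_stimuli]

print(build_presentation([]))
print(build_presentation(["loud_noise", "red_object"]))
```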
For instance, FIG. 3A is an image of a scene wherein defensiveness condition stimuli of a certain target individual, namely sensitivity to the color red, sensitivity to monotonic noise, and sensitivity to bright light, are identified by the detection module 207. In this example, the presentation module 208 may generate, optionally in real time, an image, for example as shown in FIG. 3B, which is an image that depicts an effect of defensiveness condition stimuli in the scene depicted in FIG. 3A. In this image, red colored objects above a certain size are identified and smeared to emulate an effect on the target individual. Optionally, recorded background white noise is played intensified during the presentation. In another example, the presentation module 208 may generate, optionally in real time, a video sequence, for example as shown in FIG. 3C, which is a set of exemplary frames depicting an effect of defensiveness condition stimuli in the scene depicted in FIG. 3A. In this video sequence, the red colored objects above a certain size are identified and blurred and smeared to emulate an effect on the target individual. Optionally, recorded background white noise is played intensified during the presentation. In addition, bright light sources, such as the window, may be identified and marked (not shown). Optionally, in both examples, the smearing breadth is a function of the intensity set by the user.
Reference is now also made, once again, to FIG. 1 for describing an exemplary flow that allows a user to receive an indication or a simulation of defensiveness condition stimuli in a scene.
First, as shown at 101, a sensory profile defining sensory defensiveness condition(s) of a target individual is set, for example by user input made using a GUI presented on the mobile device 200, for instance as described above. The sensory profile defines one or more defensiveness condition stimuli of the target individual. Optionally, the sensory profile is extracted from a clinical test of the target individual and/or a clinical questionnaire given to a user, such as a caregiver, for example a parent or a physician. For example, the sensory profile may be extracted from a sensory profile form, such as described in SENSORY PROFILE™ by Winnie Dunn; see Winnie Dunn, Sensory Profile: User's Manual, Psychological Corporation, 1999, ISBN 0761638016, 9780761638018; Catana E. Brown et al., Adolescent-Adult Sensory Profile: User's Manual, Communication Skill Builders/Therapy Skill Builders, 2002, ISBN 0761649719, 9780761649717; and Winnie Dunn, Infant-Toddler Sensory Profile: User's Manual, Therapy Skill Builders, Psychological Corporation, 2002, ISBN 0761649557, 9780761649557, which are incorporated herein by reference. The sensory profile may be defined in a preliminary stage, for example when an application comprising the personalization module 206, the detection module 207, and the presentation module 208 is installed, for example an application downloaded from an application store such as Google Play™ or the App Store™. As shown at 102, in use, for example when the user triggers a defensiveness condition stimuli detection session, visual, audible, and/or kinesthetic recordings of a scene in proximity to the mobile device 200 are captured using the sensor(s) 203.
Now, as shown at 103, the captured data is analyzed to detect a presence or absence of one or more defensiveness condition stimuli in the scene. Optionally, a set of detection functions is applied according to the sensory profile. A detection function is optionally a function which is applied to the outputs of the sensors to detect a signal, an object, a motion pattern, a color, and/or any other defensiveness condition stimulus that can be extracted from the sensors. The functions are optionally locally stored in a memory 209 of the mobile device 200.
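The selection of detection functions according to the sensory profile may be sketched as a simple dispatch, for instance as follows. The function names, the scene-data dictionary, and the fixed thresholds are illustrative assumptions only; the embodiment does not prescribe a particular data structure:

```python
# Illustrative detection functions: each maps captured scene data to a
# detected-stimulus name or None. Thresholds are hypothetical.
def noise_function(scene):
    return "loud_noise" if scene.get("db_level", 0) > 80 else None

def color_function(scene):
    return "red_object" if "red" in scene.get("colors", []) else None

DETECTION_FUNCTIONS = {
    "auditory": noise_function,
    "visual": color_function,
}

def detect_stimuli(scene, sensory_profile):
    """Apply only the detection functions selected by the sensory profile."""
    results = []
    for sensitivity in sensory_profile:
        func = DETECTION_FUNCTIONS.get(sensitivity)
        if func:
            hit = func(scene)
            if hit:
                results.append(hit)
    return results

scene = {"db_level": 92, "colors": ["red", "blue"]}
print(detect_stimuli(scene, ["auditory", "visual"]))
```

A profile listing only "visual" sensitivities would skip the noise analysis entirely, which reflects the profile-driven selection described above.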
Exemplary detection functions that may be used for detecting auditory defensiveness conditions are:
a noise function which detects loud noises having a dB level above a threshold (e.g., vacuum cleaner, dog barking, or hair dryer noises); and
a background noise function which detects monotonic noises (e.g., fan, refrigerator).
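A minimal sketch of the noise function, assuming normalized audio samples and an illustrative dB threshold (neither the sample format nor the threshold value is specified by the embodiment):

```python
import math

def rms_dbfs(samples):
    """Root-mean-square level of normalized samples, in dB relative to full scale."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    return 20 * math.log10(rms)

def exceeds_noise_threshold(samples, threshold_dbfs=-20.0):
    """True when the captured audio frame is louder than the profile threshold.

    `threshold_dbfs` is an illustrative default; in practice it would be
    taken from the dB range set in the sensory profile.
    """
    return rms_dbfs(samples) > threshold_dbfs

loud = [0.8, -0.8] * 100     # near full scale, about -2 dBFS
quiet = [0.01, -0.01] * 100  # about -40 dBFS
print(exceeds_noise_threshold(loud), exceeds_noise_threshold(quiet))
```

The background noise function would differ mainly in looking at the variance of the level over time (monotonic noise being loud but nearly constant) rather than at a single frame.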
Exemplary detection functions that may be used for detecting visual defensiveness conditions are:
a bright light function which detects a bright light source from an analysis of a video sequence (e.g., sunlight through a car window); see, for example, Maciej Laskowski, Detection of Light Sources in Digital Photographs, Institute of Computer Graphics, Technical University of Szczecin, Szczecin, Poland, which is incorporated herein by reference;
a flickering light function which detects a flickering light source based on occurrence and shape properties from an analysis of a video sequence; see Rovamo J, Raninen A, 1996, "Modelling of human flicker detection at various light levels", Perception 25, ECVP Abstract Supplement, which is incorporated herein by reference;
a color function which detects objects with colors defined as triggering a defensiveness condition from an analysis of a video sequence; and a public function which detects the number of people in a scene from an analysis of a video sequence; see Thomas Klinger et al., Object Detection and Counting in Public Places, Laboratory for IEEE 1394 Industrial Solutions, Carinthia Tech Institute / School of Electronics, which is incorporated herein by reference.
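The color function may be sketched, in a heavily simplified form, as a per-frame test of how much of the image reads as the triggering color. The dominance ratio and the coverage fraction below are illustrative assumptions, not values taken from the embodiment, and real object-level detection would operate on segmented regions rather than raw pixels:

```python
def is_reddish(pixel, dominance=1.5):
    """True when the red channel clearly dominates green and blue.

    `dominance` is an illustrative ratio, not part of the embodiment.
    """
    r, g, b = pixel
    return r > dominance * max(g, b, 1)

def red_fraction(pixels):
    """Fraction of pixels in a frame that read as red."""
    if not pixels:
        return 0.0
    return sum(1 for p in pixels if is_reddish(p)) / len(pixels)

def color_stimulus_present(pixels, min_fraction=0.1):
    """Flag the frame when red coverage exceeds a profile-defined fraction."""
    return red_fraction(pixels) >= min_fraction

mostly_red = [(220, 40, 30)] * 30 + [(90, 90, 90)] * 70
mostly_gray = [(90, 90, 90)] * 100
print(color_stimulus_present(mostly_red), color_stimulus_present(mostly_gray))
```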
An exemplary detection function that may be used for detecting olfaction defensiveness conditions is an object function which detects predefined objects from an analysis of a video sequence, for example objects having a smell to which the target individual is sensitive.
Optionally, the detection functions are uploaded via a network 210, such as the Internet, from a central unit 211, such as a server, in response to the definition of the sensory profile by the user and/or to an analysis of a form as described above.
Now, as shown at 104, a presentation indicative of a presence or absence of one or more defensiveness condition stimuli in the scene is generated. The presentation may be a list of the detected defensiveness condition stimuli and/or a simulation indicative of the detected defensiveness condition stimuli, for example as described above. Optionally, a plurality of simulation filters, such as image conversion and/or audio conversion functions, are stored in the mobile device 200, for example in association with the detection functions. In such embodiments, simulation filters are selected and used to process a fixation captured from the scene, for instance an image, a video sequence, and/or an audio sequence captured from the scene. Each simulation filter, when applied, converts an object depicted in the fixation, for example an object having a certain color, hue, or shape, a light source, and/or the like, and/or a characteristic of the fixation, for example a dB level, brightness, a frequency, and/or the like.
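A simulation filter of the smearing kind may be sketched as a sliding-window average over pixel intensities, where the window breadth plays the role of the user-set intensity. This one-dimensional sketch is an illustrative assumption; the embodiment does not fix a particular filter kernel:

```python
def smear_filter(row, breadth=2):
    """Smear a row of pixel intensities by averaging over a sliding window.

    `breadth` corresponds to the user-set intensity from the sensory
    profile; the averaging window is an illustrative choice, not the
    filter defined by the embodiment.
    """
    smeared = []
    n = len(row)
    for i in range(n):
        lo, hi = max(0, i - breadth), min(n, i + breadth + 1)
        window = row[lo:hi]
        smeared.append(sum(window) / len(window))
    return smeared

sharp_edge = [0, 0, 0, 255, 255, 255]
print(smear_filter(sharp_edge, breadth=1))
```

A sharp intensity edge in the input is softened into a gradient, emulating the blurred perception the simulation is intended to convey; a larger breadth yields a stronger smear.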
As shown at 105, the presentation, for example the notification list and/or simulation, is presented to the user in real time, notifying him about defensiveness condition stimuli which are found in the scene. The method 100 may be implemented upon demand. In such embodiments, a user may initiate a process as recited in 102-105 to scan a scene in a room which the target individual is about to enter and/or to try to identify what is bothering a target individual. In such embodiments, 102-105 may be triggered by a user input to a GUI presented on the display 201 and/or by any other user interface. Additionally or alternatively, the method 100 may be iteratively performed, for example as shown at 106. In such embodiments, 102-105 may be iteratively performed to process data captured by wearable sensor(s), for instance of a wearable mobile device, such as Google Glass™ and/or the like, and/or by monitoring sensor(s) which are mounted in a room wherein the target individual is located or about to be located.
Reference is now made to FIG. 4, which is a schematic illustration of an exemplary sensor(s) enhanced wearable device 220, having a housing sized and shaped as a wristband, a smart watch, a sticker, and/or the like, that includes sensors and modules to gather real time physical data of a target individual (a wearer) so as to identify a physical reaction to stimuli in the scene wherein the target individual is located, according to some embodiments of the present invention. The exemplary sensor(s) enhanced wearable device 220 includes a wireless communication circuitry 231, a memory 224 for, inter alia, storing log records, processor(s) 227, such as microprocessor(s), for processing patterns as described below, a battery or any other power source 210, one or more sensing circuits 228, and optionally one or more motion sensors 223. Optionally, the sensor(s) enhanced wearable device 220 includes a presentation unit such as a curved display 229. FIG. 4 also depicts a wireless connected apparatus 250, for instance a mobile device such as 200.
In use, the processor(s) 227 identify an atypical sensory-based behaviour, such as a sensory overload, overstimulation, or emotional dysregulation (ED), and/or a physical state or pattern which precedes an atypical sensory-based behaviour, by analyzing the outputs of the motion sensors 223, for instance limb motions such as gestures, and/or of the sensing circuits 228.
In use, the sensor(s) enhanced wearable device 220 may be used, either alone or together with the client 200, for monitoring a wearer or surroundings to react to the detection of a pre-atypical behaviour physiological pattern by generating a notification to the target individual, for instance generating a haptic feedback, playing a relaxing tone or music, playing a recording of a caregiver (e.g. a voice of a talking or singing parent), and/or the like. Additionally or alternatively, the sensor(s) enhanced wearable device 220 may generate, in response to the detection of the pre-atypical behaviour physiological pattern and using the display unit 229 or the wireless communication circuitry 231, a notification to a caregiver (e.g. a parent), for instance a blinking LED or display and/or a message sent to a client of the parent, for instance an SMS, an IM message, an email, or a client application notification.
Optionally, the sensing circuits 228 keep track of physiological parameters such as skin surface temperature, skin pH, and/or vital signs (e.g. body temperature, blood oxygen saturation, electrocardiograph (ECG) patterns, pulse rate, respiration rate, blood pressure) of the target individual. The output of the sensing circuits 228 is analyzed, either locally or at a remote server, to detect a presence or an absence of a pre-atypical behaviour physiological pattern. The sensing circuits 228 may include a human pulse detection module, a blood oxygen concentration detection module, a body surface humidity detection module, and/or a body temperature detection module, for example as used in FitBit™, Microsoft Band™, LG G Watch R™, Olive™, Samsung Galaxy Gear™, Samsung Gear Live™, Basis Peak™, and/or Apple Watch™. Optionally, the motion sensors 223 are accelerometers used for identifying current motion patterns.
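The local analysis of the sensing-circuit output may be sketched as a simple baseline-delta rule over two parameters. The chosen parameters, field names, and threshold values are illustrative assumptions; as noted elsewhere in this description, the decision may instead be made by a trained statistical classifier:

```python
def pre_atypical_pattern(baseline, current, pulse_rise=20.0, temp_rise=0.5):
    """Flag a pre-atypical physiological pattern from baseline deltas.

    `pulse_rise` (beats per minute) and `temp_rise` (degrees Celsius) are
    illustrative thresholds; the embodiment leaves the concrete decision
    rule open.
    """
    return (current["pulse"] - baseline["pulse"] > pulse_rise
            and current["skin_temp"] - baseline["skin_temp"] > temp_rise)

baseline = {"pulse": 72.0, "skin_temp": 33.1}
calm = {"pulse": 75.0, "skin_temp": 33.2}
stressed = {"pulse": 98.0, "skin_temp": 33.9}
print(pre_atypical_pattern(baseline, calm),
      pre_atypical_pattern(baseline, stressed))
```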
Optionally, the processor(s) 227 detect a presence or an absence of a pre-atypical behaviour physiological pattern and/or an atypical behaviour event by an analysis of a user motion as recorded by the embedded motion sensors 223.
Additionally or alternatively, the sensing circuits 228 include one or more sensors, such as 203, for detecting scene sensory defensiveness condition stimuli, for example a camera, a microphone, or a communication module for receiving data about the scene sensory defensiveness condition stimuli from a remote device, such as a mobile device or a stationary device installed nearby. In such embodiments, the pre-atypical behaviour physiological pattern includes indications of the scene, for example loud environmental noise or a significant visual event.
Additionally or alternatively, the detection of a pre-atypical behaviour physiological pattern is determined using statistical classifiers generated based on a training set of a plurality of physiological parameters records. In such embodiments, the sensor(s) enhanced wearable device may be used to log physiological parameter changes and events of atypical behaviour events into a log record. This log record may then be forwarded to a remote server for updating the statistical classifiers.
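One simple form such a statistical classifier could take is a nearest-centroid rule over labeled physiological records, sketched below. The feature choice (pulse rate, respiration rate), the labels, and the toy training set are illustrative assumptions; the embodiment does not fix a particular learning algorithm:

```python
import math

def centroid(records):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(records)
    return [sum(r[i] for r in records) / n for i in range(len(records[0]))]

def classify(sample, training_set):
    """Nearest-centroid classification of a physiological record.

    `training_set` maps a label to its feature vectors. This stands in
    for the statistical classifier mentioned above, as a sketch only.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    centroids = {label: centroid(recs) for label, recs in training_set.items()}
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Features: (pulse rate, respiration rate) — illustrative toy data.
training = {
    "typical": [(70, 14), (74, 15), (68, 13)],
    "pre_atypical": [(105, 24), (110, 26), (98, 22)],
}
print(classify((102, 23), training))
```

Forwarded log records from many devices would enlarge such a training set, which is how the server-side update described above would refine the classifier over time.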
Additionally or alternatively, the detection of a pre-atypical behaviour physiological pattern is determined based on a personal profile of the target individual. In such embodiments, a personal profile is built for the target individual by analyzing a log record, for example as outlined above. The personal profile may be indicative of personal pre-atypical behaviour physiological patterns which have triggered atypical behaviours in the past.
Additionally or alternatively, the wearable device 220 includes a feedback unit 231 for generating a feedback in response to the detection of a pre-atypical behaviour physiological pattern. In such a manner, the wearer may learn that when the feedback is provided by the feedback unit 231, for instance visual, audible, and/or tactile feedback, he can practice a preventive measure to avoid a defensiveness condition and/or a sensory overload.
For example, reference is now made to FIG. 5, which is a flowchart of a method of personalizing a reaction of a target individual to stimuli by identifying a pre-atypical behaviour physiological pattern using a wearable device, such as the device depicted in FIG. 4, according to some embodiments of the present invention.
As shown at 400, the wearer is presented with visual, audible, and/or tactile stimuli, for example selected based on a testimony of a guardian and/or a self-testimony.
During the exposure to the visual, audible, and/or tactile stimuli, as shown at 402, the sensing circuits 228 of the wearable device 220 keep track of physiological parameters of the wearer.
As shown at 404, the outputs of the sensing circuits 228 are analyzed, either locally or at a remote server, to identify a pre-atypical behaviour physiological pattern that builds up during the exposure to the visual, audible, and/or tactile stimuli. For example, changes in a current stress level are measured.
Optionally, as shown at 406, a feedback to the stimuli is selected, either automatically or manually, to allow using the wearable device 220 to automatically provide a feedback upon the detection of the pre-atypical behaviour physiological pattern in future events. The feedback may have a certain identifiable type and amplitude, for example a unique tactile frequency.
Reference is now made to FIG. 6, which is a flowchart of a method of combining outputs of physiological parameter sensors, such as the sensing circuits 228 of the wearable device 220, with the outputs of sensors 203 which are described above, for detecting a pre-atypical behaviour physiological pattern indicative of sensory overload, according to some embodiments of the present invention. As shown at 601, 602, physiological parameters from wearable sensors (e.g. 228) and environmental parameters, such as visual, audible, and/or kinesthetic sequences indicative of defensiveness condition stimuli from sensors in the immediate environment (e.g. 203), are received. As shown at 603, 604, the outputs are analyzed to detect a behaviour physiological pattern, optionally a pre-atypical behaviour physiological pattern, which is indicative of a stress scenario, and defensiveness condition stimuli. The detection of the behaviour physiological pattern and the defensiveness condition stimuli is optionally performed as described above.
As shown at 605, the behaviour physiological pattern and the defensiveness condition stimuli are correlated by a correlation function, for instance based on time of detection and/or the sensed information.
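One possible form of the time-based correlation function is a sketch that pairs each detected physiological pattern with the stimuli detected shortly before it. The timestamp representation and the window size are illustrative assumptions, not values defined by the embodiment:

```python
def correlate_by_time(pattern_events, stimulus_events, window=5.0):
    """Pair each physiological pattern event with stimuli detected within
    `window` seconds before it.

    Events are (timestamp_seconds, name) tuples; the window size is an
    illustrative assumption.
    """
    pairs = []
    for p_time, pattern in pattern_events:
        nearby = [s for s_time, s in stimulus_events
                  if 0 <= p_time - s_time <= window]
        pairs.append((pattern, nearby))
    return pairs

patterns = [(12.0, "elevated_pulse")]
stimuli = [(9.5, "loud_noise"), (2.0, "bright_light")]
print(correlate_by_time(patterns, stimuli))
```

In this sketch the loud noise at 9.5 s falls inside the window preceding the elevated-pulse pattern at 12.0 s and is correlated with it, while the earlier bright light is not.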
Now, as shown at 606, the correlated information is analyzed to identify a presence or an absence of a sensory overload, overstimulation, ED, and/or a physical state or pattern that precedes an atypical sensory-based behaviour. This allows generating a notification to the target individual and/or to a guardian or a third party.
Optionally, as shown at 607, the correlation is stored and used to define a new pre-atypical behaviour physiological pattern indicative of a stress scenario and the defensiveness condition stimuli which trigger the pre-atypical behaviour physiological pattern. This allows responding to the pre-atypical behaviour physiological pattern with instructions indicative of the defensiveness condition stimuli triggering it.
Reference is also made to FIG. 7, which is a flowchart of a process of identifying a pre-atypical behaviour physiological pattern using statistical classifier(s) for classifying behaviour physiological patterns which are recorded using a wearable device such as 220, according to some embodiments of the present invention. First, as shown at 701 and 702, outputs of sensors of a wearable device, such as 220, are aggregated during a learning period, for instance of one hour, one day, one week, one month, or any intermediate or longer period. Then, as shown at 703, the recorded data is classified using a statistical classifier, such as statistical machine learning methods or neural networks. The classifier is optionally generated based on a training set that comprises hundreds, thousands, or tens of thousands of records, each including recorded sensory data and each associated with one or more pre-atypical behaviour physiological patterns identified with defensiveness condition stimuli. This allows, as shown at 704, classifying new pre-atypical behaviour physiological patterns of a target individual based on data of other target individuals and, as shown at 704, storing the personal profile for future usage.
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
It is expected that during the life of a patent maturing from this application many relevant systems and methods will be developed, and the scope of the terms module, camera, sensor, and processor is intended to include all such new technologies a priori.
As used herein, the term "about" refers to ±10%. The terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean "including but not limited to". These terms encompass the terms "consisting of" and "consisting essentially of".
The phrase "consisting essentially of" means that the composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.
As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". Any particular embodiment of the invention may include a plurality of "optional" features unless such features conflict.
Throughout this application, various embodiments of this invention may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Whenever a numerical range is indicated herein, it is meant to include any cited numeral (fractional or integral) within the indicated range. The phrases "ranging/ranges between" a first indicated number and a second indicated number and "ranging/ranges from" a first indicated number "to" a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.

Claims

WHAT IS CLAIMED IS:
1. A wearable device which detects a pre-atypical behaviour physiological pattern, comprising:
a housing;
a memory adapted to store a pre-atypical behaviour physiological pattern of a wearer triggered by at least one sensory defensiveness condition stimulus;
a sensing circuit integrated into said housing and adapted to capture at least one physiological parameter of said wearer;
at least one processor adapted to:
perform an analysis of said at least one physiological parameter to detect a presence or an absence of said pre-atypical behaviour physiological pattern, generate a notification indicative of said presence or said absence according to an outcome of said analysis, and
instruct a presentation of said notification.
2. The wearable device of claim 1, wherein said housing is sized and shaped as a wrist worn device.
3. The wearable device of claim 1, wherein said sensing circuit comprises a member of a group consisting of a human pulse detection module, a blood oxygen concentration detection module, a body surface humidity detection module, and a body temperature detection module.
4. The wearable device of claim 1, further comprising at least one motion sensor; wherein said at least one physiological parameter comprises a limb motion pattern.
5. The wearable device of claim 1, further comprising a display for displaying said notification.
6. The wearable device of claim 1, further comprising a wireless unit for transmitting said notification to a mobile device.
7. A method of detecting a pre-atypical behaviour physiological pattern using a wearable device, comprising:
storing a pre-atypical behaviour physiological pattern of a target individual triggered by at least one sensory defensiveness condition stimulus;
capturing, using at least one sensor of a wearable device, at least one physiological parameter of said target individual;
performing an analysis of said at least one physiological parameter to detect a presence or an absence of said pre-atypical behaviour physiological pattern;
generating a notification indicative of said presence or said absence according to an outcome of said analysis; and
instructing a presentation of said notification to at least one of said target individual and a third party.
8. The method of claim 7, wherein said pre-atypical behaviour physiological pattern is identified by monitoring a response of said target individual to said at least one sensory defensiveness condition stimulus.
9. The method of claim 7, wherein said pre-atypical behaviour physiological pattern is identified using a statistical classifier.
10. The method of claim 7, wherein said pre-atypical behaviour physiological pattern is identified according to a correlation with scene data comprising at least one of an audio stream, video stream and kinesthetic stream of a scene in proximity to said target individual.
11. The method of claim 7, wherein said at least one physiological parameter is selected from a group consisting of a skin surface temperature, a skin pH, a body temperature, a blood oxygen saturation, an electrocardiograph (ECG) pattern, a pulse rate, a respiration rate, and blood pressure.
12. The method of claim 7, wherein said pre-atypical behaviour physiological pattern is indicative of a sensory overload, an overstimulation, an emotional dysregulation (ED), or a physical state or pattern which precedes an atypical sensory-based behaviour.
13. The method of claim 7, wherein said performing comprises:
capturing scene data comprising at least one of an audio stream, video stream and kinesthetic stream of a scene in proximity to said wearable device;
correlating between said pre-atypical behaviour physiological pattern and said scene data; and
determining whether to output said notification according to said correlating.
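Claims 7-13 combine a stored physiological pattern with a correlation against scene data before outputting a notification. The flow they recite can be sketched as follows; the representation of the stored pattern as per-parameter threshold ranges, the reduction of scene data to an ambient noise level in dB, and every numeric value are illustrative assumptions, not details taken from the patent:

```python
# Minimal sketch of the detection flow of claims 7-13. The stored
# "pre-atypical behaviour physiological pattern" is modelled here as
# per-parameter threshold ranges, and scene data is reduced to an
# ambient noise level in dB. All values are illustrative assumptions.

PATTERN = {                        # storing step (claim 7)
    "pulse_rate": (100, None),     # beats/min: trigger at or above 100
    "skin_temp": (None, 35.0),     # deg C: trigger at or below 35.0
    "respiration": (22, None),     # breaths/min: trigger at or above 22
}

def pattern_present(readings):
    """Performing step: detect presence/absence of the stored pattern."""
    for param, (low, high) in PATTERN.items():
        value = readings.get(param)
        if value is None:
            return False               # missing parameter: no detection
        if low is not None and value < low:
            return False
        if high is not None and value > high:
            return False
    return True

def notify(readings, scene_noise_db, noise_threshold_db=80.0):
    """Claim 13: correlate the pattern with scene data before notifying."""
    if pattern_present(readings) and scene_noise_db >= noise_threshold_db:
        return "alert: pre-atypical pattern detected in a loud scene"
    return "no alert"
```

For example, readings of an elevated pulse and respiration rate captured near an 85 dB scene would yield the alert string, while the same readings in a quiet scene would not.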
14. A method of presenting a sensory defensiveness condition stimulus indication on a mobile device, comprising:
setting a sensory profile defining at least one sensory defensiveness condition stimulus which triggers at least one sensory defensiveness condition of a target individual;
capturing, using at least one sensor of a mobile device, scene data comprising at least one of an audio stream, video stream and kinesthetic stream of a scene in proximity to said mobile device;
performing an analysis of said scene data to detect a presence or an absence of said at least one sensory defensiveness condition stimulus in said scene;
generating a presentation indicative of said presence or said absence according to an outcome of said analysis; and
presenting said presentation on said mobile device.
15. The method of claim 14, wherein said analysis comprises an image processing analysis applying a determination function on said video stream to identify an object triggering said at least one sensory defensiveness condition.
16. The method of claim 14, wherein said analysis comprises an image processing analysis applying a determination function on said video stream to identify a color triggering said at least one sensory defensiveness condition.
17. The method of claim 14, wherein said analysis comprises an image processing analysis applying a determination function on said video stream to identify a source of flickering light triggering said at least one sensory defensiveness condition.
18. The method of claim 14, wherein said analysis comprises an image processing analysis applying a determination function on said video stream to identify a presence of a crowd in said scene which triggers said at least one sensory defensiveness condition.
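Claims 15-18 each apply a "determination function" to the video stream. As one hedged illustration, the flickering-light case of claim 17 could be approximated from a per-frame brightness series by counting large, rapid reversals; the thresholds, and the reduction of video frames to a single brightness value per frame, are assumptions made for this sketch:

```python
# Sketch of a "determination function" in the spirit of claim 17:
# flag a flickering light source when mean frame brightness oscillates
# rapidly. A real implementation would derive per-frame brightness from
# the video stream; here the input is already a plain list of numbers.

def detect_flicker(brightness, fps=30, min_flicker_hz=3.0, min_amplitude=10.0):
    """Return True when brightness reverses direction faster than
    min_flicker_hz with swings of at least min_amplitude (arbitrary units)."""
    if len(brightness) < 3:
        return False
    reversals = 0
    for prev, cur, nxt in zip(brightness, brightness[1:], brightness[2:]):
        rising, falling = cur - prev, nxt - cur
        # a reversal = local peak or trough with a large enough swing
        if rising * falling < 0 and min(abs(rising), abs(falling)) >= min_amplitude:
            reversals += 1
    duration_s = len(brightness) / fps
    # two reversals per flicker cycle
    return (reversals / 2) / duration_s >= min_flicker_hz
```

A strobing source alternating between dark and bright frames trips the detector, while steady illumination does not; analogous per-frame functions could score colors (claim 16) or crowd density (claim 18).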
19. The method of claim 14, wherein said capturing, said performing, said generating, and said presenting are performed in response to a user triggering action.
20. The method of claim 14, wherein said capturing, said performing, said generating, and said presenting are iteratively performed; wherein said mobile device is a wearable device.
21. The method of claim 14, wherein said analysis comprises a sound processing analysis applying a determination function on said audio stream to identify a sound having a characteristic triggering said at least one sensory defensiveness condition.
22. The method of claim 21, wherein said characteristic is selected from a group consisting of a pattern, a frequency, intensity, and a pitch.
23. The method of claim 14, wherein said analysis comprises a motion processing analysis applying a determination function on said kinesthetic stream to identify a motion of said mobile device having a motion characteristic triggering said at least one sensory defensiveness condition.
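For the sound-processing analysis of claims 21-22, a determination function can compare an estimated dominant frequency (here via zero-crossing counting) and an RMS intensity against entries of the sensory profile. The profile keys and threshold values below are hypothetical, chosen only to make the sketch concrete:

```python
import math

# Sketch of a sound "determination function" (claims 21-22): estimate
# the dominant frequency of an audio snippet by counting zero crossings
# and its intensity by RMS, then compare both against a sensory-profile
# entry. Profile fields and thresholds are assumptions for this sketch.

def sound_triggers(samples, sample_rate, profile):
    """profile: {'min_hz': ..., 'max_hz': ..., 'min_rms': ...}"""
    if not samples:
        return False
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    duration_s = len(samples) / sample_rate
    est_hz = crossings / (2 * duration_s)  # two crossings per cycle
    return (profile["min_hz"] <= est_hz <= profile["max_hz"]
            and rms >= profile["min_rms"])
```

A loud 440 Hz tone matches a profile band of 400-500 Hz, while a quiet low-frequency hum does not; pattern and pitch characteristics (claim 22) would need richer spectral analysis than this zero-crossing estimate.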
24. The method of claim 14, wherein said generating comprises generating a simulation of an effect of said at least one sensory defensiveness condition stimulus on said target individual by processing said scene data.
25. The method of claim 24, wherein an intensity of said simulated effect of said at least one sensory defensiveness condition stimulus is correlated with a reference intensity defined in said sensory profile.
26. The method of claim 24, wherein said generating a simulation comprises processing said video stream to identify an object in said video stream and to edit said video stream so that at least one visual characteristic of said object is adapted therein to emulate the effect of said at least one sensory defensiveness condition stimulus on said target individual.
27. The method of claim 14, wherein said sensory profile defines an intensity level for said at least one sensory defensiveness condition stimulus; wherein said analysis is performed while taking into account said intensity level.
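Claims 24-27 describe generating a simulation whose strength tracks an intensity level from the sensory profile. A toy sketch of the video-editing variant (claim 26): amplify one visual characteristic, here the red channel of each pixel, in proportion to the profile's intensity level. The linear gain rule is an assumption; the patent does not prescribe a particular transform:

```python
# Toy sketch of the simulation step of claims 24-27: edit image pixels
# so that a triggering visual characteristic (here, the red channel) is
# exaggerated in proportion to an intensity level from the sensory
# profile. The linear scaling rule is an illustrative assumption.

def simulate_effect(pixels, intensity_level):
    """pixels: list of (r, g, b) tuples in 0-255; intensity_level: 0.0-1.0.
    Returns a new pixel list with the red channel amplified and clamped."""
    gain = 1.0 + intensity_level          # e.g. level 0.5 -> 1.5x red
    return [
        (min(255, round(r * gain)), g, b)
        for r, g, b in pixels
    ]
```

At intensity level 0.5 a pixel of (100, 50, 50) becomes (150, 50, 50), emulating for an observer how strongly the target individual may experience that stimulus.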
28. A computer readable medium comprising computer executable instructions adapted to perform the method of claim 14.
29. The method of claim 14, wherein said setting comprises receiving from a user said at least one sensory defensiveness condition stimulus on said mobile device.
30. The method of claim 14, wherein said setting comprises extracting said at least one sensory defensiveness condition stimulus from a sensory profile form filled in by a physician.
31. A mobile device which presents sensory defensiveness condition indication, comprising:
a processor;
a memory which stores a sensory profile defining at least one sensory defensiveness condition stimulus which triggers at least one sensory defensiveness condition of a target individual;
at least one sensor which captures scene data comprising at least one of an audio stream, video stream and kinesthetic stream of a scene in proximity to said mobile device;
a detection module executed by said processor to perform an analysis of said scene data to detect a presence or an absence of said at least one sensory defensiveness condition stimulus in said scene;
a presentation module executed by said processor to generate a presentation indicative of said presence or said absence according to an outcome of said analysis; and
a presentation unit which presents said presentation on said mobile device.
PCT/IL2015/050188 2014-02-19 2015-02-18 Methods and systems for personalized sensory sensitivity simulation and alerting WO2015125142A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461941534P 2014-02-19 2014-02-19
US61/941,534 2014-02-19

Publications (1)

Publication Number Publication Date
WO2015125142A1 true WO2015125142A1 (en) 2015-08-27

Family

ID=53877710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2015/050188 WO2015125142A1 (en) 2014-02-19 2015-02-18 Methods and systems for personalized sensory sensitivity simulation and alerting

Country Status (1)

Country Link
WO (1) WO2015125142A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110392299A (en) * 2019-06-17 2019-10-29 北京达佳互联信息技术有限公司 Volume processing method and apparatus, electronic device, and storage medium
CN112345066A (en) * 2020-10-30 2021-02-09 曹敏明 Public noise disturbance warning system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5897506A (en) * 1997-09-19 1999-04-27 Cohn; Lipe Pulse rate monitor for allergy detection and control
US20110245633A1 (en) * 2010-03-04 2011-10-06 Neumitra LLC Devices and methods for treating psychological disorders
US20120203081A1 (en) * 2006-12-19 2012-08-09 Leboeuf Steven Francis Physiological and environmental monitoring apparatus and systems
US8494507B1 (en) * 2009-02-16 2013-07-23 Handhold Adaptive, LLC Adaptive, portable, multi-sensory aid for the disabled


Similar Documents

Publication Publication Date Title
KR102039848B1 (en) Personal emotion-based cognitive assistance systems, methods of providing personal emotion-based cognitive assistance, and non-transitory computer readable media for improving memory and decision making
Obrist et al. Multisensory experiences in HCI
US20180032126A1 (en) Method and system for measuring emotional state
JP6741688B2 (en) Method and system for extracting a user's operating characteristics using a Hall effect sensor to provide feedback to the user
JP5878678B1 (en) Sensory stimulation to increase the accuracy of sleep staging
US10960173B2 (en) Recommendation based on dominant emotion using user-specific baseline emotion and emotion analysis
JP6898869B2 (en) Systems and methods for detecting bad breath
US10431116B2 (en) Orator effectiveness through real-time feedback system with automatic detection of human behavioral and emotional states of orator and audience
US9891884B1 (en) Augmented reality enabled response modification
US11127501B2 (en) Systems and methods for health monitoring
JP5574407B2 (en) Facial motion estimation apparatus and facial motion estimation method
KR101988334B1 (en) a mobile handset and a method of analysis efficiency for multimedia content displayed on the mobile handset
WO2023005402A1 (en) Respiratory rate detection method and apparatus based on thermal imaging, and electronic device
JP2015118185A (en) Information processing apparatus, information processing method, and program
Gay et al. CaptureMyEmotion: helping autistic children understand their emotions using facial expression recognition and mobile technologies
WO2018222589A1 (en) System and method for treating disorders with a virtual reality system
CN109074487A (en) It is read scene cut using neurology into semantic component
WO2015125142A1 (en) Methods and systems for personalized sensory sensitivity simulation and alerting
CN109862822B (en) Identifying sensory inputs that affect an individual's working memory load
US20160271498A1 (en) System and method for modifying human behavior through use of gaming applications
WO2022065446A1 (en) Feeling determination device, feeling determination method, and feeling determination program
TW201808224A (en) Display devices and methods for controlling a display device
Hong et al. The quantified self
Kim et al. Mediating individual affective experience through the emotional photo frame
WO2014106216A1 (en) Collection of affect data from multiple mobile devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15752338

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15752338

Country of ref document: EP

Kind code of ref document: A1