US20220031193A1 - Apparatus and methods for detecting stroke in a patient - Google Patents


Info

Publication number
US20220031193A1
US20220031193A1 (application US17/414,018)
Authority
US
United States
Prior art keywords
movement
user
data
patient
stroke detection
Prior art date
Legal status
Pending
Application number
US17/414,018
Other languages
English (en)
Inventor
Johan WASSELIUS
Petter Ericson
Current Assignee
Uman Sense AB
Original Assignee
Uman Sense AB
Priority date
Filing date
Publication date
Priority claimed from GB application GB1820892.6A
Application filed by Uman Sense AB filed Critical Uman Sense AB
Assigned to UMAN SENSE AB reassignment UMAN SENSE AB ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ERICSON, PETTER, WASSELIUS, Johan
Publication of US20220031193A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1124: Determining motor skills
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/162: Testing reaction times
    • A61B 5/40: Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4058: Evaluating the central nervous system
    • A61B 5/4064: Evaluating the brain
    • A61B 5/4076: Diagnosing or monitoring particular conditions of the nervous system
    • A61B 5/4094: Diagnosing or monitoring seizure diseases, e.g. epilepsy
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801: Specially adapted to be attached to or worn on the body surface
    • A61B 5/6802: Sensor mounted on worn items
    • A61B 5/681: Wristwatch-type devices
    • A61B 5/6813: Specially adapted to be attached to a specific body part
    • A61B 5/6824: Arm or wrist
    • A61B 5/6828: Leg
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271: Specific aspects of physiological measurement analysis
    • A61B 5/7275: Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A61B 5/7282: Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/746: Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • A61B 5/7465: Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
    • A61B 5/747: Arrangements for interactive communication in case of emergency, i.e. alerting emergency services
    • A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60: ICT for the operation of medical equipment or devices
    • G16H 40/67: ICT for the remote operation of medical equipment or devices
    • A61B 2562/00: Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02: Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches

Definitions

  • the present invention relates to apparatus and methods for detecting stroke in a patient.
  • a stroke is a medical condition in which poor blood flow to the brain results in cell death.
  • An ischemic stroke is typically caused by a lack of blood flow to parts of the brain resulting from a blockage in an artery that supplies blood to the brain.
  • the blood normally delivers oxygen and nutrients to the brain. Once the oxygen and nutrients are cut off by the blockage, the brain cells cannot make enough energy and will eventually stop working. If the blockage is not cleared, the brain cells will eventually die.
  • a haemorrhagic stroke is caused by bleeding in the brain. The bleeding is typically caused by a damaged blood vessel leaking blood.
  • a haemorrhagic stroke may also be caused by a burst brain aneurysm. In both cases, the blood spreads into the surrounding brain tissue, causing increased pressure, limiting the operation of the brain cells and eventually damaging the brain tissue.
  • the resultant effect is a change in the function of the brain, as brain cells cease to function correctly. This change can be observed through physical symptoms such as an inability to move or feel on one side of the body, problems communicating, and loss of vision. These physical symptoms often appear more or less immediately after the stroke has begun.
  • Historically, a stroke was very difficult to treat. Although the patient's symptoms might be recognised and diagnosed as a stroke, limited treatment was available. Where the stroke was identified as an ischemic stroke, a clot-dissolving drug such as a tissue plasminogen activator was given to the patient intravenously, in the hope that the drug would reach the clot and dissolve it sufficiently to allow blood flow to resume through the affected artery. Such a treatment would need to be successfully applied within just a few hours of the start of the stroke to ensure that the damage to the brain tissue was limited. Some studies suggest that the time window for getting the best results from a clot-dissolving drug is three hours from the first signs of the stroke.
  • ischemic strokes have been successfully treated via an endovascular procedure called ‘mechanical thrombectomy’, in which the blood clot is removed by sending a clot retrieval device to the site of the blocked blood vessel in the brain. The device secures the clot and pulls it back out of the blood vessel as the device is removed.
  • haemorrhagic strokes may also be treated via an endovascular procedure by delivering a metal clip to the damaged blood vessel or ruptured aneurysm. The clip is secured to restrict the blood flow and prevent further blood from leaking into the surrounding brain tissue.
  • significant damage mitigation can be achieved if the procedure is performed within a few hours of the first signs of the stroke.
  • US patent application 2018153477 discloses a device for monitoring patients for a stroke via several sensors for determining ‘physiological signals’.
  • the physiological signals may comprise a heart rate signal, an atrial rate signal, a heart rate variability signal, a blood pressure signal, a blood pressure variability signal, a heart sound signal, etc.
  • U.S. Pat. No. 7,981,058 discloses a device for monitoring patients using low cost biaxial motion sensors.
  • the first sensor captures objective acceleration data
  • the second biaxial sensor captures subjective acceleration data relative to at least the first accelerometer. Acceleration data is then used to determine nonlinear parameters and to generate at least two levels of motor function information.
  • US2017281054 is a US application disclosing a device for monitoring patients for a stroke via a plurality of motion sensors located at one or more anatomical locations on a patient's body and configured to detect a plurality of motion parameters corresponding to a motion of a portion of the patient's body. The motion of the portion of the patient's body is then based on a plurality of predetermined motion sentence features.
  • US2015157252 is a US application disclosing a device for monitoring patients via several sensors. A technique is described for switching between sensors to determine which sensor is providing the best indication of a physiological signal status of the patient.
  • US20160213318 is a US application disclosing a system and method for detecting a stroke in a sleeping individual.
  • the system comprises a sensor which is worn on the hand for detecting the absence of electrical or muscular activity in the hand.
  • WO2018110925 is an international application which discloses an apparatus with sensors on left and right hands to measure the movement of the left side and right side for detecting a stroke during deep sleep.
  • problems associated with the above systems include producing a reliable signal indicative of a patient stroke without triggering too many false positives. What is needed is a system capable of generating a stroke detection signal with minimal false positives and, where false positives do occur, capable of gracefully handling them without too much inconvenience to the user. Furthermore, a stroke condition may occur when a patient is awake instead of when they are sleeping.
  • Examples of the present invention aim to address the aforementioned problems.
  • a stroke detection apparatus comprising: a data processing device comprising a processor; at least one wearable sensor configured to generate movement data of at least a portion of the user's body;
  • the data processing device configured to process first movement data for a first movement and second movement data for a second movement received from the at least one wearable sensor; wherein the data processing device is configured to determine asymmetry of the user's movement based on the first and second movement data and generate a stroke detection signal in dependence on the determined asymmetry.
  • the data processing device is configured to determine the asymmetry of the user's movement, based on the first and second movement data, while the user performs at least one predetermined body gesture.
  • the data processing device is configured to prompt the user to perform the at least one predetermined body gesture.
  • the data processing device generates a stroke detection signal when the data processing device determines that the asymmetry of the user's movement exceeds a predetermined threshold.
  • the data processing device is configured to determine the predetermined threshold based on the user's historical movement data for the user's body.
  • the predetermined threshold is based on one or more pre-sets relating to user characteristics.
  • an automated emergency services request is generated in dependence on the stroke escalation signal.
  • the at least one wearable sensor is wearable on the arms and/or legs.
  • a first wearable sensor is worn on one of the user's wrists and a second wearable sensor is worn on the other of the user's wrists.
  • the at least one wearable sensor comprises a first sensor configured to measure movement on a first side of a plane of symmetry of the user's body and a second wearable sensor configured to measure movement on a second side of the plane of symmetry of the user's body.
  • the plane of symmetry of the user's body is one or more of a sagittal plane, a frontal plane and/or a transverse plane.
  • the at least one wearable sensor is configured to measure the first movement data and the second movement data at the same time.
  • the at least one wearable sensor is configured to measure the first movement data and the second movement data at different times.
  • the at least one wearable sensor is configured to transmit the generated movement data to the data processing device.
  • the data processing device is configured to prompt the user to perform additional predetermined body gestures when the data processing device determines asymmetry of the user's movement based on the first and second movement data.
  • a method of generating a stroke detection signal comprising: generating movement data of at least a portion of the user's body with at least one wearable sensor;
  • processing first movement data for a first movement and second movement data for a second movement received from the at least one wearable sensor; determining asymmetry of the user's movement based on the first and second movement data; and generating a stroke detection signal in dependence on the determined asymmetry.
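  • As a concrete illustration of the processing, determining and generating steps above, the asymmetry determination could be sketched as follows. This is a minimal sketch only, not the claimed implementation: the movement-energy feature (mean norm of the acceleration vectors per side) and the example threshold of 0.6 are assumptions made for illustration.

```python
import numpy as np

def movement_energy(samples: np.ndarray) -> float:
    """Mean norm of the acceleration vectors for one side of the body.

    `samples` is an (N, 3) array of accelerometer readings from one
    wearable sensor (the feature choice is an assumption).
    """
    return float(np.mean(np.linalg.norm(samples, axis=1)))

def asymmetry(first: np.ndarray, second: np.ndarray) -> float:
    """Relative difference in movement energy between the two sides."""
    e1, e2 = movement_energy(first), movement_energy(second)
    # Normalise so the result lies in [0, 1]; guard against division by zero.
    return abs(e1 - e2) / max(e1 + e2, 1e-9)

def stroke_detection_signal(first, second, threshold: float = 0.6) -> bool:
    """Generate the signal when the determined asymmetry exceeds a
    predetermined threshold (the value 0.6 is a placeholder)."""
    return asymmetry(np.asarray(first), np.asarray(second)) > threshold
```

In a fuller system the threshold would be derived from the user's historical movement data or pre-sets, as the examples above describe, rather than fixed.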
  • FIG. 1 is a schematic diagram showing a patient wearing a multi-sensor stroke detection apparatus
  • FIG. 2 shows a perspective view of stroke detection apparatus according to an example of the present application
  • FIG. 3 is a schematic diagram of a stroke detection apparatus according to an example of the present application.
  • FIG. 4 a and FIG. 4 b are process flow diagrams showing execution flow of the wearable sensor and control device respectively;
  • FIG. 5 a and FIG. 5 b are sequence diagrams showing data-transmission order between a first and second wearable sensor device and a control device;
  • FIG. 6 is a process flow diagram showing execution flow of the control device 300 ;
  • FIG. 7 is an alternative process flow diagram showing execution flow of the control device 300 ;
  • FIG. 8 shows a schematic view of the patient's body
  • FIGS. 9 a to 9 h show a series of gestures to be performed by the patient according to an example of the present application
  • FIGS. 10 a to 10 e show a series of gestures to be performed by the patient according to an example of the present application
  • FIGS. 11 a to 11 e show a series of gestures to be performed by the patient according to an example of the present application
  • FIGS. 12 a to 12 i show a series of gestures to be performed by the patient according to an example of the present application
  • FIGS. 13 a to 13 c show a series of gestures to be performed by the patient according to an example of the present application
  • FIGS. 14 a to 14 f show a series of gestures to be performed by the patient according to an example of the present application.
  • FIGS. 15 a to 15 b show a series of gestures to be performed by the patient according to an example of the present application.
  • FIG. 1 is a schematic diagram showing a user 100 with a stroke detection apparatus 102 comprising a plurality of wearable sensors 200 a, 200 b.
  • the user 100 is a patient 100 .
  • the patient 100 may be at particular risk of transient ischemic attack (TIA) or other types of stroke such as ischemic stroke, haemorrhagic stroke, aneurysms, arteriovenous malformations (AVM), cryptogenic stroke, and/or brain stem stroke.
  • the term “stroke” will be used.
  • the user is not a patient, but uses the stroke detection apparatus 102 as a precaution.
  • the terms “user” and “patient” may be used interchangeably; however, for the purposes of clarity, the term “patient” will be used hereinafter.
  • wearable sensors 200 a, 200 b are respectively attached to the patient's 100 right wrist 104 a and left wrist 104 b.
  • the reference number 200 generally refers to the wearable sensor, but for the purposes of clarity the reference numbers 200 a, 200 b refer to the wearable sensors 200 a, 200 b on the patient's 100 right wrist 104 a and left wrist 104 b.
  • wearable sensors may be worn on the right and left ankles 106 a, 106 b, either as an alternative to the wrists or in combination with the wrists.
  • Other positions in which the wearable sensors may be worn include footwear and headwear, as well as attached to clothing such as trousers and upper body garments.
  • one or more sensors 200 a, 200 b can be mounted on any suitable part of the patient's body 110 .
  • FIG. 2 shows a perspective view of an example of the wearable sensor 200 shown in FIG. 1 .
  • the wearable sensor 200 comprises a strap 202 configured to secure the wearable sensor 200 to the patient 100 , and a sensor body 204 housing a processing board 206 .
  • Processing board 206 may comprise power source 208 , data processing device 210 , and sensor package 212 .
  • Sensor package 212 may include any suitable sensor component or plurality of sensor components configured to measure an inclination, a position, an orientation, and/or an acceleration of the part of the patient's 100 body 110 to which the wearable sensor 200 is attached.
  • Sensor package 212 may comprise a piezoelectric, piezoresistive and/or capacitive component to convert the mechanical motion into an electrical signal.
  • any suitable sensor configured to detect motion of one or more portions of the patient's body is used.
  • a piezoceramic (e.g. lead zirconate titanate) or single crystal (e.g. quartz, tourmaline) sensor may be used.
  • capacitive accelerometers are employed due to their superior performance in the low frequency range.
  • the data processing device 210 may be implemented by special-purpose software (or firmware) run on one or more general-purpose or special-purpose computing devices, such as hardware processor(s).
  • Each “element” or “means” of such a computing device refers to a conceptual equivalent of a method step; there is not always a one-to-one correspondence between elements/means and particular pieces of hardware or software routines.
  • One piece of hardware sometimes comprises different means/elements.
  • a processing unit serves as one element/means when executing one instruction, but serves as another element/means when executing another instruction.
  • one element/means may be implemented by one instruction in some cases, but by a plurality of instructions in some other cases.
  • Such a software-controlled computing device may include one or more processing units, e.g. a CPU (“Central Processing Unit”), a DSP (“Digital Signal Processor”), an ASIC (“Application-Specific Integrated Circuit”), discrete analog and/or digital components, or some other programmable logical device, such as an FPGA (“Field Programmable Gate Array”).
  • the data processing device 210 may further include a system memory and a system bus that couples various system components including the system memory to the processing unit.
  • the system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the system memory may include computer storage media in the form of volatile and/or non-volatile memory such as read only memory (ROM), random access memory (RAM) and flash memory.
  • the special-purpose software may be stored in the system memory, or on other removable/non-removable volatile/non-volatile computer storage media which is included in or accessible to the computing device, such as magnetic media, optical media, flash memory cards, digital tape, solid state RAM, solid state ROM, etc.
  • the special-purpose software may be provided to the data processing device 210 on any suitable computer-readable medium, including a record medium and a read-only memory.
  • the data processing device 210 includes one or more communication interfaces, such as a serial interface, a USB interface, a wireless networking interface, etc, as well as one or more data acquisition devices, such as an analogue to digital (A/D) converter.
  • the data processing device 210 may include a transmitter component configured to send sensor data received from the sensor package 212 and processed by the data processing device 210 and/or A/D converter (not shown) over the one or more communication interfaces.
  • a communication interface is provided via a Bluetooth® or Wi-Fi transceiver and the processed sensor data is sent to a control device 300 (described below with reference to FIG. 3 ).
  • the processed sensor data may alternatively be sent to one or more remote devices via a GSM, LTE, or any other similar licensed or unlicensed mobile communications interface.
  • Power source 208 may comprise a battery, kinetic energy source, or other power source suitable for a wearable device.
  • the power source 208 is arranged to provide an energy source for powering the data processing device 210 and sensor package 212 of the processing board 206 .
  • Wearable sensor 200 may further comprise a fastening component 214 configured to be secured with a counterpart component 216 , to allow the wearable sensor 200 to be secured to a limb of the patient 100 .
  • the fastening component 214 comprises a sensor (not shown) configured to determine whether the strap 202 of wearable sensor 200 is in an ‘open’ configuration or a ‘secured’ configuration.
  • An example of an ‘open’ configuration of the strap 202 is shown in FIG. 2 where the wearable sensor 200 is not secured to anything.
  • An example of the ‘closed’ configuration is shown in FIG. 1 where the wearable sensor 200 is secured to the patient 100 .
  • an example of the ‘closed’ configuration is shown in FIG. 3 , where the fastening component 214 has been fastened to counterpart component 216 to arrange the strap of wearable sensor 200 in a secured loop.
  • the sensor of fastening component 214 is electrically connected to processing board 206 such that data processing device 210 may determine the configuration of the strap 202 of wearable sensor 200 . Accordingly, the data processing device 210 can determine if the wearable sensor 200 is being worn by the patient 100 and the data processing device 210 can generate a “wearing” or “not wearing” status information for the wearable sensor 200 . The data processing device 210 can use the “wearing” or “not wearing” status information for the wearable sensor 200 when determining information relating to the patient 100 . For example, if the data processing device 210 determines that the patient 100 is not wearing the wearable sensor 200 from “not wearing” status information, then the data processing device 210 can determine that any alerts associated with the patient 100 may be a false alarm.
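  • The “wearing”/“not wearing” logic described above could be sketched as follows. The function and field names are hypothetical; the only rule taken from the text is that an alert raised while the strap sensor reports an ‘open’ configuration may be treated as a likely false alarm.

```python
from dataclasses import dataclass

@dataclass
class StrapStatus:
    # Reported by the sensor in the fastening component 214:
    # True when fastened to the counterpart component 216.
    closed: bool

def wearing_status(strap: StrapStatus) -> str:
    """Map the strap configuration to 'wearing' / 'not wearing' status."""
    return "wearing" if strap.closed else "not wearing"

def alert_is_false_alarm(strap: StrapStatus, alert_raised: bool) -> bool:
    """Treat an alert raised while the sensor is not being worn as a
    likely false alarm (hypothetical escalation rule)."""
    return alert_raised and wearing_status(strap) == "not wearing"
```

A control device could consult this status before escalating, avoiding unnecessary emergency requests when the sensor is simply not on the patient's wrist.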
  • FIG. 3 shows an example of a wireless network 302 .
  • two wearable sensors shown as 200 a and 200 b are worn by the patient 100 (not shown) and the respective strap 202 of each wearable sensor is in the ‘closed’ configuration.
  • Each wearable sensor 200 a, 200 b collects sensor data from the patient 100 , processes said data, and transmits the data to the control device 300 via a wireless networking interface 304 .
  • in another example, the wearable sensors 200 a, 200 b collect sensor data from the patient 100 and transmit the data, without processing it, to the control device 300 via the wireless networking interface 304 . This is discussed in further detail in reference to FIG. 5 b .
  • one of the wearable sensors 200 a is in wireless communication with the other wearable sensor 200 b.
  • the wearable sensors 200 a, 200 b can communicate over Bluetooth® or low energy Bluetooth®.
  • One of the wearable sensors 200 a is in wireless communication with the control device 300 via the wireless networking interface 304 .
  • the other wearable sensor 200 b is not in wireless communication with the control device 300 .
  • one of the wearable sensors 200 a processes the collected sensor data from some or all of the wearable sensors 200 a, 200 b.
  • one of the wearable sensors 200 a is a master wearable sensor 200 a and the other of the wearable sensors 200 b is a slave wearable sensor 200 b. This is discussed in further detail in reference to FIG. 5 a.
  • Whilst the examples shown in the Figures only show two wearable sensors 200 a, 200 b, in an alternative example additional wearable sensors can be worn by the patient 100 .
  • the patient 100 can wear a wearable sensor 200 a, 200 b on each limb. This may be desirable for patients 100 at particular risk, where increased sensor data collection is required.
  • the plurality of wearable sensors 200 a, 200 b may establish a personal area network (PAN) or a body area network (BAN) for the wearable sensors 200 a, 200 b to communicate with each other.
  • the plurality of wearable sensors 200 a, 200 b can establish a mesh network between each other. This is described in further detail with reference to FIG. 5 a.
  • the wearable sensors 200 a, 200 b can be connected to the control device 300 via a wired connection.
  • a wired connection between the wearable sensors 200 a, 200 b and the control device 300 may interfere with the patient's 100 movement of their body.
  • a communication interface between the wearable sensors 200 a, 200 b and the control device 300 is provided via a Bluetooth® transceiver, Wi-Fi transceiver, GSM transceiver, LTE transceiver, or any other similar licensed or unlicensed mobile communications interface.
  • the control device 300 may be any mobile or remote processing device, or any other suitable control device 300 .
  • the control device 300 is a mobile phone device such as an Apple™ iPhone™, iPad™, Apple Watch™, Android™ device, Wear OS™ device, laptop device, or similar.
  • the control device 300 receives the data from the wearable sensors 200 a, 200 b, processes the received data, determines a patient condition, and executes an escalation process where appropriate. This process is described in greater detail below and with reference to FIG. 4 b.
  • the control device 300 comprises the wearable sensor 200 a.
  • the control device 300 can be a smartphone which comprises one or more accelerometers (e.g. a 6-axis accelerometer).
  • the smartphone is then mounted in a strap and worn on the user's arm.
  • the smartphone mounted on one arm will function both as a wearable sensor 200 and as the control device 300 according to the present application.
  • another wearable sensor 200 b e.g. a smartwatch can be worn on the other arm.
  • FIG. 4 a shows an example of the process flow for the data processing device 210 .
  • the process flow shown in FIG. 4 a may be executed in a continuous loop or periodically. Where the process is executed periodically, an energy saving mode may be employed between executions to minimise battery usage.
  • step 400 sensor output is received by data processing device 210 from sensor package 212 .
  • the output from the piezoelectric component may require an optional pre-processing step 410 to ensure a desired analogue signal is produced.
  • a pre-processing step may be to reduce high frequency noise generated by the piezoelectric component from the analogue signal.
  • Pre-processing step 410 may occur at the sensor package 212 or on the data processing device 210 .
  • a conversion of the signal from an analogue signal to a digital signal may be performed in step 420 .
  • This analogue to digital conversion may occur at the sensor package 212 or on the data processing device 210 .
  • the digital signal is then processed in step 430 to reduce noise and to time stamp the sensor readings.
  • Step 430 may optionally comprise converting the acceleration vector generated by the accelerometer into the norm of the acceleration vector, i.e. the acceleration vector is converted to a strictly positive magnitude with no directional component.
  • This provides several advantages, including reduced storage space for the vector data and invariance to accelerometer orientation.
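The norm conversion described above can be sketched in a few lines of Python (the function name is illustrative, not taken from the patent):

```python
import math

def acceleration_norm(sample):
    """Collapse a 3-axis accelerometer reading (ax, ay, az) into its
    Euclidean norm: a single non-negative magnitude that needs only one
    stored value and is invariant to the sensor's orientation."""
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

# The same physical acceleration gives the same norm however the
# wrist-mounted sensor happens to be rotated.
norm = acceleration_norm((3.0, 4.0, 0.0))  # 5.0
```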
  • Other filtering steps are envisaged, in combination with the above or independently, to ensure that the filtered acceleration vector signal is invariant to gravity or to the orientation of the sensor package 212 .
  • any of the preceding filtering steps are performed locally to the wearable sensors 200 a, 200 b, e.g. by data processing device 210 .
  • Step 430 may further optionally comprise, in combination with the above or independently, applying a high pass filter to remove the acceleration vector resulting from the gravitational force on the accelerometer. This may be achieved by removing slow or unchanging acceleration vectors from a differential of the acceleration vector. This advantageously allows the removal of the noise resulting from gravitational forces.
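One way such a gravity-removing high-pass filter could be realised is sketched below (a minimal one-pole filter; the smoothing factor `alpha` and the function name are assumptions for illustration):

```python
def remove_gravity(samples, alpha=0.9):
    """One-pole high-pass filter: track the slow, near-constant gravity
    component with a running low-pass estimate and subtract it from each
    sample, leaving only the changing (movement-related) acceleration.
    `alpha` (assumed value) sets how slowly the gravity estimate adapts."""
    gravity = samples[0]
    motion = []
    for s in samples:
        gravity = alpha * gravity + (1.0 - alpha) * s  # low-pass: gravity estimate
        motion.append(s - gravity)                     # high-pass: motion only
    return motion

# A constant 9.81 m/s^2 input (pure gravity, sensor at rest) filters to ~0.
still = remove_gravity([9.81] * 20)
```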
  • Step 430 may further optionally comprise filtering signals resulting from movements of the human body that are not a direct consequence of the signals from the brain.
  • the wearable sensors 200 a, 200 b do not detect indirect outcomes of the electrical signals reaching muscles. Examples of movements that may be filtered include:
  • the data processing device 210 filters signals in signal processing step 430 to exclude one or more movements not related to muscle movement due to the central nervous system.
  • the data processing device 210 filters the signal to exclude one or more of: passive limb mechanics; other biological signals, such as heart function, tremors, or other involuntary muscular movements; and environmental noise (e.g. a bus or vehicle engine). In this way, the data processing device 210 filters signals so that the movement of the patient 100 can be analysed when the patient 100 is awake and out of bed.
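One possible way to screen out such involuntary signals is to gate on dominant frequency; the sketch below (the 4-12 Hz band, the zero-crossing estimator, and the function names are assumptions for illustration, not taken from the patent) rejects oscillations in a typical tremor band:

```python
import math

def dominant_frequency(samples, sample_rate):
    """Crude frequency estimate from zero crossings of the mean-removed
    signal: each full oscillation produces two zero crossings."""
    mean = sum(samples) / len(samples)
    centred = [s - mean for s in samples]
    crossings = sum(1 for a, b in zip(centred, centred[1:]) if a * b < 0)
    return crossings * sample_rate / (2.0 * len(samples))

def looks_like_tremor(samples, sample_rate, band=(4.0, 12.0)):
    """Gate out signals whose dominant frequency falls inside an assumed
    involuntary-tremor band, keeping only slower, deliberate movement
    for the asymmetry analysis."""
    low, high = band
    return low <= dominant_frequency(samples, sample_rate) <= high

# A 6 Hz oscillation sampled at 100 Hz reads as tremor; a slow 1 Hz
# arm sweep does not.
tremor = [math.sin(2 * math.pi * 6 * t / 100) for t in range(100)]
sweep = [math.sin(2 * math.pi * 1 * t / 100) for t in range(100)]
```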
  • the sensor data is then optionally formatted according to a defined data structure in step 440 for transmission to control device 300 .
  • wearable sensor 200 a, 200 b transmits the formatted sensor data to control device 300 using wireless networking interface 304 .
  • the steps as shown in FIG. 4 a are carried out by the data processing device 210 . In some other examples, the steps as shown in FIG. 4 a are partly or completely carried out by a remote processing device, for example the control device 300 .
  • FIG. 4 b shows an example of the process flow for the control device 300 .
  • the control device 300 is a mobile phone 300 , but can be any suitable mobile device.
  • the process flow shown in FIG. 4 b may be executed in a continuous loop or periodically. Where the process is executed periodically, an energy saving mode may be employed between executions to minimise battery usage.
  • the control device 300 receives the formatted sensor data from wearable sensor 200 a, 200 b.
  • the received sensor data is then optionally consolidated, in step 470 , with existing data previously received from wearable sensor 200 a, 200 b as well as any other wearable sensors transmitting sensor data to control device 300 .
  • the data is stored in a local database stored on control device 300 .
  • the system implements ‘chunking’ of the formatted sensor data, which comprises breaking the data into chunks, each chunk comprising a header which indicates parameters such as the time stamp for the recorded signal data and the chunk size. This allows each data chunk to resynchronise the control device 300 to the clocks of the wearable sensors.
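Such a chunk could be laid out as a fixed header followed by the payload; the sketch below assumes a 12-byte header (8-byte timestamp plus 4-byte size), which is an illustrative choice rather than the patent's format:

```python
import struct

def make_chunk(payload: bytes, timestamp: float) -> bytes:
    """Prefix a block of recorded sensor readings with a fixed 12-byte
    header carrying the recording timestamp and the payload size, so the
    receiver can resynchronise to the sensor's clock on every chunk."""
    return struct.pack("!dI", timestamp, len(payload)) + payload

def read_chunk(chunk: bytes):
    """Parse the header back out and return (timestamp, payload)."""
    timestamp, size = struct.unpack("!dI", chunk[:12])
    return timestamp, chunk[12:12 + size]

chunk = make_chunk(b"\x01\x02\x03", 1576800000.0)
ts, payload = read_chunk(chunk)
```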
  • step 475 data analysis is performed on the sensor data by control device 300 .
  • a determination of a patient condition, such as an on-going stroke condition e.g. a transient ischemic attack, is then made in dependence on the data analysis 475 . This comprises the determination that a first movement of the patient's body and a second movement of the patient's body are asymmetric as shown in step 480 .
  • a patient condition may be determined to be present based on the sensor data exceeding a predetermined asymmetry threshold.
  • steps of data analysis 475 , determination of patient event 480 and escalation process 490 are described in more detail below and in reference to FIGS. 6, 7 and FIGS. 9 a to 9 h , 10 a to 10 e , 11 a to 11 e , 12 a to 12 i , 13 a to 13 c , 14 a to 14 f and 15 a to 15 b below.
  • control device 300 makes a transmission to the remote network as shown in step 495 .
  • the control device 300 contacts network point 308 via network interface 306 in step 495 to request emergency service for handling of the patient condition.
  • mobile device may communicate at least one of a patient condition, a GPS location of the mobile device, a patient ID, a patient medical history, a recent sensor data report, etc.
  • step 475 and subsequent steps may be executed as a directly subsequent step to step 460 .
  • step 475 and subsequent steps may be executed in an independent loop that is triggered independently by e.g. a periodic timing interrupt.
  • the steps as shown in FIG. 4 b are carried out by the control device 300 . In some other examples, the steps as shown in FIG. 4 b are partly or completely carried out by the data processing device 210 . In this way, the data processing device 210 is capable of carrying out all the steps in FIGS. 4 a and 4 b and can request emergency service for handling of the patient condition as shown in step 495 .
  • FIG. 5 a shows a flow for collecting the data from more than one wearable sensor 200 a, 200 b at control device 300 .
  • steps 510 and 520 sensor data from a first wearable sensor 200 a positioned on a left side of the body and second wearable sensor 200 b positioned on a right-hand side of the body is collected.
  • step 530 sensor data from the first wearable sensor 200 a is transmitted to the second wearable sensor 200 b via wireless networking interface 304 .
  • the sensor data from the first wearable sensor 200 a is combined with the sensor data from the second wearable sensor 200 b (collected previously in step 520 ) and transmitted to control device 300 in step 540 .
  • FIG. 5 b shows an alternative flow for collecting the data from more than one wearable sensor at control device 300 .
  • steps 510 and 520 sensor data from a first wearable sensor 200 a positioned on a left side of the body and a second wearable sensor 200 b positioned on a right-hand side of the body is collected.
  • control device 300 instructs the first wearable sensor 200 a via wireless network interface 304 to send sensor data collected by first wearable sensor 200 a to control device 300 .
  • the first wearable sensor 200 a sends the collected data to control device 300 via wireless network interface 304 .
  • control device 300 instructs the second wearable sensor 200 b via wireless network interface 304 to send sensor data collected by the second wearable sensor 200 b to control device 300 .
  • the second wearable sensor 200 b sends the collected data to control device 300 via wireless network interface 304 .
  • FIG. 6 shows an example of the steps of data analysis 475 , determination of patient event 480 and escalation process 490 as shown in FIG. 4 b.
  • One of the symptoms of a patient having a stroke is sudden numbness or weakness of face, arm, or leg, especially on one side of the body. This means that during a stroke event a patient is susceptible to asymmetric body movements.
  • FIG. 8 shows a perspective view of a patient's body 110 with a first axis 800 .
  • the first axis is also known as the vertical or longitudinal axis of the body 110 .
  • the body also has a second axis 802 and a third axis 804 .
  • the second axis 802 is also known as a transverse axis of the body 110 .
  • the third axis 804 is also known as a sagittal axis of the body 110 .
  • FIG. 8 also shows the body 110 having a first plane 810 , also known as a sagittal plane 810 in which the first axis 800 and the third axis 804 lie.
  • the body 110 comprises a second plane 808 , also known as a transverse plane 808 , in which the second axis 802 , and the third axis 804 lie.
  • the body 110 also comprises a third plane 806 , also known as the frontal plane, in which the first axis 800 and the second axis 802 lie.
  • symmetry of body movements relates to similarity of a first movement and a second movement either side of one or more planes 806 , 808 , 810 of the body 110 .
  • asymmetry of body movements relates to the dissimilarity of a first movement and a second movement either side of one or more planes 806 , 808 , 810 of the body 110 .
  • the planes 806 , 808 , 810 of the body 110 are also planes of symmetry for movement of the body 110 .
  • For the purposes of clarity, reference will only be made to the first plane of symmetry 810 .
  • using the first plane of symmetry 810 for stroke-detection purposes reflects the fundamental plane of symmetry expressed in the hemispheres of the user's brain.
  • reference to symmetry of movement of the body 110 can be in respect of any of the first, second and/or third planes of symmetry 806 , 808 , 810 .
  • the first plane of symmetry 810 divides the body 110 into a left-hand side 812 and a right-hand side 814 .
  • One symptom of a stroke may be sudden numbness or weakness of face, arm, or leg, especially on one side of the body about the sagittal plane 810 of symmetry.
  • the patient 100 is prompted to perform one or more predetermined gestures as shown in step 600 of FIG. 6 .
  • the control device 300 may prompt the patient 100 to carry out the predetermined gestures once a system condition has been met.
  • the system condition is that a threshold for the probability of a patient condition being present has been exceeded. The determination of the probability of a patient condition being present is discussed in GB1820892.6 which is incorporated herein in its entirety by reference.
  • the system condition is that the control device 300 may prompt the patient 100 based on a timer, an external request or a request from the patient 100 .
  • a remote medical practitioner may request the control device 300 to prompt the patient 100 to carry out the predetermined gestures.
  • the patient 100 may not be feeling well and may wish to check whether they are suffering from a stroke.
  • the control device 300 then instructs the patient 100 to carry out one or more gestures.
  • the patient 100 carries out a first gesture as shown in step 602 .
  • the patient 100 then carries out a second gesture as shown in step 604 .
  • the first and second gestures are carried out at the same time.
  • the first gesture can be made by a first part of the patient's body 110 and the second gesture can be made by a second part of the patient's body 110 .
  • the patient's right wrist 104 a can make the first gesture
  • the patient's left wrist 104 b can make the second gesture.
  • FIGS. 14 a to 14 e show a series of leg gestures to be performed by the patient 100 according to an example of the present application.
  • a first leg 1400 is moved and then the same movement is repeated with the other leg 1402 .
  • the movement of the leg 1400 can be made at the hip joint backwards and forwards as shown in FIGS. 14 a , 14 b , 14 c . Additionally or alternatively the movement of the leg can be made at the knee joint as shown in FIG. 14 e or at the ankle joint as shown in FIG. 14 f.
  • control device 300 displays the first and second gesture for the patient 100 to perform in steps 602 , 604 on a screen.
  • the control device 300 displays an animation of the predetermined gestures so that the patient 100 may follow and repeat the same movements.
  • the control device 300 prompts the patient 100 to carry out symmetrical movements with both arms or legs and leaves the patient 100 to decide which symmetrical movements they should perform.
  • control device 300 does not prompt the patient 100 to perform the first and second gestures.
  • the patient 100 performs the first and second gestures without a prompt from the control device 300 .
  • the patient 100 may have been instructed to perform the first and second gestures periodically by a medical practitioner, and the control device 300 then analyses the movement data in step 475 as previously discussed.
  • the first wearable sensor 200 a is mounted on the patient's right wrist 104 a and the second wearable sensor 200 b is mounted on the patient's left wrist 104 b.
  • This means that the first gesture and the second gesture are made on opposite sides of a plane of symmetry (e.g. the sagittal plane) 810 of the patient's body 110 .
  • the first and second wearable sensors 200 a, 200 b measure the movement made by a first part of the patient's body 110 and the movement made by a second part of the patient's body 110 whilst the gestures are being performed.
  • the first and second wearable sensors 200 a, 200 b measure one or more of acceleration, velocity, timing, distance, range of movement of the right wrist 104 a and the left wrist 104 b with respect to the plane 810 of symmetry.
  • the first and second wearable sensors 200 a, 200 b can also measure whether the first and second movements are jerky or smooth in nature.
  • control device 300 analyses the movement data from the first and second wearable sensors 200 a, 200 b in step 475 and determines whether there is a significant difference between the movement of the first gesture and the second gesture.
  • control device 300 may determine that the velocity, speed or acceleration of the first gesture differs from that of the second gesture. The control device 300 may likewise determine a difference between the first gesture and the second gesture in the range of movement, the direction of movement, the range of rotation, or the timing.
  • the control device 300 determines whether the first and second gestures are symmetric as shown in step 606 .
  • the first and second gestures require the patient 100 moving their right wrist 104 a and left wrist 104 b together to perform a simple hand clap.
  • Various other gestures can be performed additionally or alternatively.
  • FIGS. 9 a to 9 h , 10 a to 10 e , 11 a to 11 e , 12 a to 12 i , 13 a to 13 c , 14 a to 14 f and 15 a to 15 b show other possible gestures for the patient 100 to perform during steps 602 , 604 and/or 612 which are discussed in further detail below.
  • the control device 300 determines a differential for movement data of the first and second gestures. In some examples, the control device 300 determines that the first and second gestures are asymmetric when the differential is above a predetermined threshold. In some examples, the control device 300 determines that there is asymmetrical movement when the differential of the movement data between the first and second gesture is more than a difference of 5%, 10%, 20%, 30%, 40% or 50%. For example, the control device 300 determines that the patient's right wrist 104 a moves 10% quicker and through a 10% greater range than the patient's left wrist 104 b. In some other examples, the control device 300 determines an activity measurement and/or an energy expenditure associated with the movement data of the first and second gestures.
  • the control device 300 determines that the first and second gestures are asymmetric when the differential of the activity measurement and/or an energy expenditure is above a predetermined threshold. In some examples, the control device 300 determines that there is asymmetrical movement when the differential of the activity measurement and/or an energy expenditure associated with movement data between the first and second gesture is more than a difference of 5%, 10%, 20%, 30%, 40% or 50%.
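A minimal sketch of such a differential check follows (the 20% threshold default and the function names are illustrative; the patent contemplates thresholds from 5% to 50%):

```python
def asymmetry_differential(first, second):
    """Relative difference between two movement measurements, e.g. the
    range of motion, speed, or energy expenditure of each arm, expressed
    as a fraction of the larger magnitude."""
    largest = max(abs(first), abs(second))
    if largest == 0:
        return 0.0
    return abs(first - second) / largest

def is_asymmetric(first, second, threshold=0.20):
    """Flag possible asymmetric movement when the differential between
    the first and second gestures exceeds the threshold (20% assumed)."""
    return asymmetry_differential(first, second) > threshold

# Right wrist moved through 100 degrees, left wrist only 70: the 30%
# differential exceeds the 20% threshold.
flag = is_asymmetric(100.0, 70.0)
```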
  • the threshold of the differential for determining asymmetric movement of the patient's body 110 can be dynamic. In some examples, the threshold of the differential can be modified based on the patient's characteristics (e.g. health, age, sex, or any other suitable parameter that may affect the movement of the patient's body 110 ).
  • the control device 300 is configured to determine the predetermined threshold based on the patient's historical movement data for the patient's body 110 . This means that the control device 300 can determine whether the first and second gestures are significantly asymmetric with respect to patient historical movement data. For example, the patient 100 may have limited movement in a part of their body due to an old injury which may result in the appearance of asymmetric movement of the patient 100 . In this way, the control device 300 can ignore asymmetric behaviour due to non-stroke conditions.
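One way a personal threshold could be derived from historical movement data is sketched below (the three-standard-deviation rule and function name are assumptions for illustration):

```python
from statistics import mean, stdev

def personal_threshold(historical_differentials, k=3.0):
    """Derive the asymmetry threshold from the patient's own history of
    gesture differentials: flag only values more than `k` standard
    deviations above the personal baseline, so a long-standing limited
    range of movement (e.g. an old injury) is not mistaken for a stroke.
    `k` is an assumed tuning parameter."""
    return mean(historical_differentials) + k * stdev(historical_differentials)

baseline = [0.10, 0.12, 0.11, 0.09, 0.10]  # patient's usual differentials
threshold = personal_threshold(baseline)   # well below a sudden 0.30 reading
```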
  • control device 300 is configured to determine the predetermined threshold based on one or more pre-sets relating to a characteristic of the patient 100 .
  • a pre-set may describe movement relating to a young patient or an old patient.
  • the pre-sets relate to other characteristics of the patient 100 such as inactive, active etc.
  • control device 300 can perform the escalation process step 490 .
  • the control device 300 can escalate and contact the medical services as shown in step 608 .
  • control device 300 determines that the first and second gestures are symmetric, then the control device 300 can cancel the alert as shown in step 610 . After the alert has been cancelled, the control device 300 continues to receive and analyse the data received from the wearable sensors 200 a, 200 b as discussed in reference to FIGS. 4 a , 4 b.
  • the steps 602 , 604 of the patient 100 performing the first and second gestures are optionally carried out at the same time. For example, if the patient 100 is prompted to clap their hands together, then the first and second gestures of the right wrist 104 a and the left wrist 104 b moving towards each other will be measured by the wearable sensors 200 a, 200 b at the same time.
  • the steps 602 , 604 can be carried out at different times.
  • the first gesture and the second gesture may be performed by the patient 100 at different times.
  • the patient 100 can perform a first gesture with the wearable sensor 200 a mounted on the right wrist 104 a and then subsequently perform a second gesture with the wearable sensor 200 b mounted on the left wrist 104 b.
  • the determination of the patient event in step 480 can be carried out with a single wearable sensor 200 a whereby the patient swaps the wearable sensor 200 a from the right wrist 104 a to the left wrist 104 b after making the first gesture but before making the second gesture.
  • the control device 300 may prompt the patient 100 to carry out additional movements as shown in step 612 .
  • the control device 300 may prompt the patient 100 to carry out similar or more complex predetermined gestures.
  • the additional gestures may be used by the control device 300 to determine the presence of asymmetrical body movement for a borderline case.
  • the patient 100 or a remote medical practitioner may request the additional movements to be performed by the patient 100 as a double check.
  • the control device 300 determines in step 614 whether the additional gestures performed in step 612 are asymmetric.
  • the determining step of 614 is similar to step 606 . If the control device 300 determines that the additional gestures are asymmetric, then the control device 300 escalates to the medical services in step 608 as before. Similarly, if the control device 300 determines that the additional gestures are actually symmetrical, then the control device cancels the alert in step 610 as previously discussed.
  • step 612 or step 606 may additionally comprise further tests for the patient 100 to determine whether the medical services should be alerted.
  • the further tests may be required to be performed before the patient performs the gestures in steps 606 or 612 .
  • the further tests may be required to be performed after the patient performs the gestures in steps 606 or 612 .
  • control device 300 may display a countdown alert 700 presented to the patient 100 on the display of control device 300 .
  • the countdown alert 700 may comprise a simple countdown alert showing a countdown in seconds before which control device 300 will move to step 608 to alert the medical services.
  • the patient 100 has the option of cancelling the countdown at any time. If the patient 100 fails to respond to the countdown alert, control device 300 will move straight to step 608 to alert medical service. Where the patient 100 cancels the countdown by performing the required cancellation task (e.g. pressing a CANCEL button), the patient 100 is presented with the option to reset the escalation process. Alternatively, the patient 100 may be presented with the option of moving straight to step 608 to alert medical service if the patient 100 feels that something is still wrong.
  • the patient 100 cannot cancel the countdown alert until all of the user tests are successfully completed.
  • a two-part countdown is used.
  • Countdown 1 is a short-term countdown (e.g. less than 60 seconds) and may be cancelled by the patient 100 .
  • Countdown 2 is a longer-term countdown (e.g. longer than 60 seconds) that can only be cancelled by successfully completing all of the patient 100 tests.
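The two-part countdown above can be sketched as follows (the event names, return values, and default durations are assumptions for illustration, not taken from the patent):

```python
def escalation_outcome(events, short_s=30, long_s=120):
    """Evaluate the two-stage countdown over a list of (second, event)
    pairs. Countdown 1 (short) must be cancelled by the patient pressing
    CANCEL within `short_s` seconds; countdown 2 (long) is only cancelled
    if all patient tests pass within `long_s` seconds. Otherwise the
    medical services are alerted."""
    cancel_at = min((t for t, e in events if e == "cancel_pressed"), default=None)
    tests_at = min((t for t, e in events if e == "tests_passed"), default=None)
    if cancel_at is None or cancel_at > short_s:
        return "alert"      # patient did not respond to countdown 1 in time
    if tests_at is None or tests_at > long_s:
        return "alert"      # patient tests not completed before countdown 2 expired
    return "cancelled"

# Patient cancels at 10 s and completes the tests at 90 s: no alert.
outcome = escalation_outcome([(10, "cancel_pressed"), (90, "tests_passed")])
```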
  • the patient 100 is additionally required to perform simple cognitive tests such as performing simple mental arithmetic as shown in 710 . If the control device 300 determines that the patient 100 has failed the further tests 700 , 710 , then the control device 300 can escalate and alert the medical services in step 608 .
  • FIGS. 9 a to 9 h , 10 a to 10 e , 11 a to 11 e , 12 a to 12 i , 13 a to 13 c , 14 a to 14 f and 15 a to 15 b show a series of gestures to be performed by the patient 100 according to an example of the present application.
  • FIGS. 9 a to 9 d show the patient 100 moving their right arm 900 from a downwardly pointing direction to an over the head upwardly pointing direction.
  • FIGS. 9 e to 9 h show the patient moving their left arm 902 from a downwardly pointing direction to an over the head upwardly pointing direction.
  • the gestures in FIGS. 9 a to 9 d mirror the gestures in FIGS. 9 e to 9 h .
  • the gestures are mirrored about the sagittal plane 810 of symmetry. In this way the stroke detection apparatus 102 can determine whether there is asymmetry between the movements as previously described in reference to FIGS. 4 a , 4 b above.
  • the gestures as shown in FIGS. 9 a to 9 h can be carried out at the same time or at different times.
  • the patient 100 is moving the arms 900 , 902 in a plane parallel with the frontal plane 806 .
  • FIGS. 10 a to 10 e show a series of arm gestures whereby the patient 100 moves their arms 900 , 902 from a first position resting on their hips in FIG. 10 a to a position where their arms 900 , 902 are pointing upwardly over their head in FIG. 10 e .
  • the gestures in FIGS. 10 a to 10 e are similar to the gestures as shown in FIGS. 9 a to 9 h except that both arms 900 , 902 are moved at the same time.
  • the gestures as shown in 10 a to 10 e can be used to determine asymmetry of the patient movement about the sagittal plane 810 of symmetry.
  • the patient 100 is moving the arms 900 , 902 in a plane parallel with the frontal plane 806 .
  • FIGS. 11 a to 11 e again show a series of arm gestures similar to FIGS. 10 a to 10 e .
  • the patient's right arm 900 is resting on the hips and the patient's left arm 902 is raised over their head.
  • the patient then proceeds to raise and lower the right and left arms 900 , 902 respectively until in the position shown in FIG. 11 e .
  • the patient 100 can then repeat the process to return to the original position in FIG. 11 a .
  • the gestures in FIGS. 11 a to 11 e may be useful for a patient 100 to carry out because they require an element of cognitive ability to coordinate the arms 900 , 902 .
  • An inability of the patient to coordinate their arms 900 , 902 may be another indicator of a stroke condition.
  • FIGS. 12 a to 12 i again show a series of arm gestures similar to FIGS. 10 a to 10 e .
  • the patient 100 swings both their arms 900 , 902 in an anticlockwise direction until the left arm 902 is above the shoulder in FIG. 12 c .
  • the patient 100 then swings both their arms 900 , 902 down and up in a clockwise direction until the right arm 900 is above the shoulder as shown in FIG. 12 i .
  • the gestures in FIGS. 12 a to 12 i may be useful for a patient 100 to carry out because they require an element of cognitive ability to coordinate the arms 900 , 902 .
  • FIGS. 13 a to 13 c show a series of arm gestures for the patient 100 .
  • the patient 100 moves the arms 900 , 902 in a plane parallel with the sagittal plane 810 .
  • the patient 100 moves their arms through the transverse plane 808 and rotates them from a first position above their head shown in FIG. 13 a down to a second position by their sides as shown in FIG. 13 c.
  • FIGS. 14 a to 14 f are a series of leg gestures and have been previously discussed above.
  • FIGS. 15 a and 15 b show another series of arm gestures.
  • the patient 100 moves their arms 900 , 902 from a first position where the patient 100 has their arms 900 , 902 pointing out from their sides at shoulder height as shown in FIG. 15 a to a second position whereby the patient 100 has their arms 900 , 902 together pointing out in front of them.
  • the patient 100 moves their arms 900 , 902 in a plane parallel with the transverse plane 808 .
  • the patient 100 can be prompted by the control device 300 to make other movements.
  • the patient 100 is required to move their hands in circles in front of them in a plane parallel with the frontal plane 806 with each hand moving in the same or opposite directions.
  • the patient 100 is required to move their hands in a linear motion up and down in front of them in a plane parallel with the sagittal plane 810 with each hand moving in the same or opposite directions.
  • the patient 100 can be prompted to make diadochokinetic movements. That is, rapid small movements such as moving the finger to the nose, alternating hand clapping or toe tapping movements.
  • the gestures are complex enough that the control device 300 can separate them from involuntary movements such as tremors, twitches and jerks that may be seen in epileptic seizures.
  • the gestures can be performed in any place and at any time of the day. This means that the patient 100 can perform the gestures when prompted by the control device 300 even when travelling or away from home.
  • control device 300 can determine the movement of the gestures with only the wearable sensors 200 a, 200 b. This means that no further equipment is necessary to determine the movement of the patient 100 .
  • the gestures can be used to demonstrate the symmetrical movement of one or more particular muscle groups, or multiple muscle groups, in a clear and simple way. This means that the gestures can be easy for the patient 100 to carry out whilst being sufficiently complex to determine asymmetric movement indicative of a stroke condition.

US17/414,018 2018-12-20 2019-12-19 Apparatus and methods for detecting stroke in a patient Pending US20220031193A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
GBGB1820892.6A GB201820892D0 (en) 2018-12-20 2018-12-20 Stroke Detection Sensor
GB1820892.6 2018-12-20
SE1930370 2019-11-12
SE1930370-0 2019-11-12
PCT/SE2019/051319 WO2020130923A1 (en) 2018-12-20 2019-12-19 Apparatus and methods for detecting stroke in a patient

Publications (1)

Publication Number Publication Date
US20220031193A1 true US20220031193A1 (en) 2022-02-03

Family

ID=71101529

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/415,183 Pending US20220061738A1 (en) 2018-12-20 2019-12-19 Stroke detection sensor
US17/414,018 Pending US20220031193A1 (en) 2018-12-20 2019-12-19 Apparatus and methods for detecting stroke in a patient

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/415,183 Pending US20220061738A1 (en) 2018-12-20 2019-12-19 Stroke detection sensor

Country Status (6)

Country Link
US (2) US20220061738A1 (zh)
EP (2) EP3897384A4 (zh)
JP (2) JP7461952B2 (zh)
KR (2) KR20210104692A (zh)
CN (2) CN113226176A (zh)
WO (2) WO2020130924A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4044906A4 (en) 2019-10-15 2023-05-24 Imperative Care, Inc. MULTIVARIABLE ATTACK DETECTION SYSTEMS AND METHODS
US11906540B1 (en) * 2020-10-30 2024-02-20 Bby Solutions, Inc. Automatic detection of falls using hybrid data processing approaches

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0512192D0 (en) * 2005-06-15 2005-07-20 Greater Glasgow Nhs Board Seizure detection apparatus
CN101243471B (zh) 2005-08-19 2013-03-06 Koninklijke Philips Electronics N.V. System and method for analyzing the movement of a user
US8109891B2 (en) * 2005-09-19 2012-02-07 Biolert Ltd Device and method for detecting an epileptic event
CN101938940A (zh) 2006-03-08 2011-01-05 Koninklijke Philips Electronics N.V. Method and system for monitoring the functional use of limbs
US7558622B2 (en) * 2006-05-24 2009-07-07 Bao Tran Mesh network stroke monitoring appliance
US8075499B2 (en) * 2007-05-18 2011-12-13 Vaidhi Nathan Abnormal motion detector and monitor
US9161711B2 (en) * 2009-08-19 2015-10-20 Movea System and method for detecting an epileptic seizure in a prone epileptic person
US9717439B2 (en) * 2010-03-31 2017-08-01 Medtronic, Inc. Patient data display
GB2494356B (en) * 2010-07-09 2017-05-31 Univ California System comprised of sensors, communications, processing and inference on servers and other devices
WO2012118998A2 (en) * 2011-03-02 2012-09-07 The Regents Of The University Of California Apparatus, system, and method for detecting activities and anomalies in time series data
US10631760B2 (en) * 2011-09-02 2020-04-28 Jeffrey Albert Dracup Method for prediction, detection, monitoring, analysis and alerting of seizures and other potentially injurious or life-threatening states
WO2013056099A1 (en) * 2011-10-14 2013-04-18 Flint Hills Scientific, Llc Apparatus and systems for event detection using probabilistic measures
US8779918B2 (en) * 2011-12-16 2014-07-15 Richard Housley Convulsive seizure detection and notification system
US20130171596A1 (en) 2012-01-04 2013-07-04 Barry J. French Augmented reality neurological evaluation method
US20150164377A1 (en) * 2013-03-13 2015-06-18 Vaidhi Nathan System and method of body motion analytics recognition and alerting
US9788779B2 (en) * 2013-03-14 2017-10-17 Flint Hills Scientific, L.L.C. Seizure detection based on work level excursion
DK178081B9 (en) * 2013-06-21 2015-05-11 Ictalcare As Method of indicating the probability of psychogenic non-epileptic seizures
US20150018723A1 (en) * 2013-07-09 2015-01-15 Industry-Academic Cooperation Foundation, Kyungpook National University Apparatus for early detection of paralysis based on motion sensing
US20170188895A1 (en) * 2014-03-12 2017-07-06 Smart Monitor Corp System and method of body motion analytics recognition and alerting
WO2016172557A1 (en) * 2015-04-22 2016-10-27 Sahin Nedim T Systems, environment and methods for identification and analysis of recurring transitory physiological states and events using a wearable data collection device
KR102449869B1 (ko) * 2015-05-28 2022-10-04 Samsung Electronics Co., Ltd. EEG sensor unit and EEG measurement apparatus using the same
US11638550B2 (en) * 2015-07-07 2023-05-02 Stryker Corporation Systems and methods for stroke detection
US20180249967A1 (en) 2015-09-25 2018-09-06 Intel Corporation Devices, systems, and associated methods for evaluating a potential stroke condition in a subject
KR102045366B1 (ko) * 2015-10-28 2019-12-05 Kyungpook National University Industry-Academic Cooperation Foundation Apparatus for determining stroke during sleep
US10878220B2 (en) * 2015-12-31 2020-12-29 Cerner Innovation, Inc. Methods and systems for assigning locations to devices
EP3402405B1 (en) * 2016-01-12 2023-04-12 Yale University System for diagnosis and notification regarding the onset of a stroke
US20190282127A1 (en) * 2016-05-23 2019-09-19 Koninklijke Philips N.V. System and method for early detection of transient ischemic attack
JP6888095B2 (ja) 2016-09-14 2021-06-16 F. Hoffmann-La Roche AG Digital biomarkers for cognitive and movement diseases or disorders
WO2018102579A1 (en) * 2016-12-02 2018-06-07 Cardiac Pacemakers, Inc. Multi-sensor stroke detection
KR101970481B1 (ko) * 2017-03-31 2019-04-22 Korea Research Institute of Standards and Science Stroke monitoring system

Also Published As

Publication number Publication date
CN113226176A (zh) 2021-08-06
JP2022516033A (ja) 2022-02-24
JP2022516035A (ja) 2022-02-24
KR20210104692A (ko) 2021-08-25
EP3897384A4 (en) 2022-08-31
EP3897383A4 (en) 2022-10-26
KR20210104691A (ko) 2021-08-25
WO2020130923A1 (en) 2020-06-25
EP3897384A1 (en) 2021-10-27
EP3897383A1 (en) 2021-10-27
CN113226175A (zh) 2021-08-06
WO2020130924A1 (en) 2020-06-25
US20220061738A1 (en) 2022-03-03
JP7461952B2 (ja) 2024-04-04

Similar Documents

Publication Publication Date Title
US20220401035A1 (en) System and method to detect changes in health parameters and activate lifesaving measures
US20210052177A1 (en) Wearable cardiac electrophysiology measurement devices, software, systems and methods
US10191537B2 (en) Smart wearable devices and methods for customized haptic feedback
US10485471B2 (en) System and method for identifying ictal states in a patient
US20180103859A1 (en) Systems, Devices, and/or Methods for Managing Patient Monitoring
EP2875778B1 (en) Wearable mobile device and method of measuring biological signal with the same
WO2016078258A1 (zh) 一种生理体征监测方法、装置及计算机存储介质
US20060252999A1 (en) Method and system for wearable vital signs and physiology, activity, and environmental monitoring
CN105411554A (zh) 一种无线式无创伤人体生理参数采集、检测及智能诊断系统
KR20190016886A (ko) 저전력 모션 센서를 이용하여 실시간 심장박동 이벤트 감지를 위한 시스템 및 방법
WO2019217368A1 (en) System for monitoring and providing alerts of a fall risk by predicting risk of experiencing symptoms related to abnormal blood pressure(s) and/or heart rate
CN107405088A (zh) 用于为血压测量设备提供控制信号的装置和方法
US20220031193A1 (en) Apparatus and methods for detecting stroke in a patient
CN209790817U (zh) 心肺运动数据采集和康复训练设备
KR20170133003A (ko) 환자 낙상 인식을 위한 웨어러블 장치 및 서버
EP3229666A1 (en) Device and method for determining a state of consciousness
Kumar et al. Motor recovery monitoring in post acute stroke patients using wireless accelerometer and cross-correlation
KR101754576B1 (ko) 탈착 디바이스를 이용한 생체 신호 분석 시스템 및 방법
Salem et al. Nocturnal epileptic seizures detection using inertial and muscular sensors
Zhao et al. The emerging wearable solutions in mHealth
Gheryani et al. Epileptic Seizures Detection based on Inertial and Physiological Data from Wireless Body Sensors
Vishwakarma et al. IOT-BEAT: an intelligent nurse for the cardiac patient with music therapy
Avella-Rodríguez et al. Multimodal Wearable Technology Approaches to Human Falls
Panicker et al. Wireless sensor based systems for biomedical monitoring: A review
Baga et al. PERFORM: A platform for monitoring and management of chronic neurodegenerative diseases: The Parkinson and Amyotrophic Lateral Sclerosis case

Legal Events

Date Code Title Description
AS Assignment

Owner name: UMAN SENSE AB, SWEDEN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WASSELIUS, JOHAN;ERICSON, PETTER;SIGNING DATES FROM 20210603 TO 20210607;REEL/FRAME:056559/0869

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER