WO2023187660A1 - Meditation systems and methods - Google Patents

Meditation systems and methods

Info

Publication number
WO2023187660A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
attributes
auditory
data
platform
Prior art date
Application number
PCT/IB2023/053097
Other languages
French (fr)
Inventor
Adam Ty HAMILTON
Anthony James WILLIAMS
Original Assignee
Escapist Technologies Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Escapist Technologies Pty Ltd filed Critical Escapist Technologies Pty Ltd
Publication of WO2023187660A1 publication Critical patent/WO2023187660A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00Stereophonic arrangements
    • H04R5/033Headphones for stereophonic communication
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M21/02Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/25Bioelectric electrodes therefor
    • A61B5/251Means for maintaining electrode contact with the body
    • A61B5/256Wearable electrodes, e.g. having straps or bands
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/25Bioelectric electrodes therefor
    • A61B5/279Bioelectric electrodes therefor specially adapted for particular uses
    • A61B5/28Bioelectric electrodes therefor specially adapted for particular uses for electrocardiography [ECG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/25Bioelectric electrodes therefor
    • A61B5/279Bioelectric electrodes therefor specially adapted for particular uses
    • A61B5/291Bioelectric electrodes therefor specially adapted for particular uses for electroencephalography [EEG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/372Analysis of electroencephalograms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4836Diagnosis combined with treatment in closed-loop systems or methods
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/486Bio-feedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6814Head
    • A61B5/6815Ear
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/683Means for maintaining contact with the body
    • A61B5/6831Straps, bands or harnesses
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7405Details of notification to user or communication with user or patient ; user input means using sound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7405Details of notification to user or communication with user or patient ; user input means using sound
    • A61B5/7415Sound rendering of measured values, e.g. by pitch or volume variation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/10Earpieces; Attachments therefor ; Earphones; Monophonic headphones
    • H04R1/1083Reduction of ambient noise
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0271Thermal or temperature sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
    • A61M2021/0027Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the hearing sense
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
    • A61M2021/0061Simulated heartbeat pulsed or modulated
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
    • A61M2021/0088Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus modulated by a simulated respiratory frequency
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2205/00General characteristics of the apparatus
    • A61M2205/33Controlling, regulating or measuring
    • A61M2205/3375Acoustical, e.g. ultrasonic, measuring means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2205/00General characteristics of the apparatus
    • A61M2205/35Communication
    • A61M2205/3576Communication with non implanted data transmission devices, e.g. using external transmitter or receiver
    • A61M2205/3592Communication with non implanted data transmission devices, e.g. using external transmitter or receiver using telemetric means, e.g. radio or optical transmission
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2205/00General characteristics of the apparatus
    • A61M2205/50General characteristics of the apparatus with microprocessors or computers
    • A61M2205/502User interfaces, e.g. screens or keyboards
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2205/00General characteristics of the apparatus
    • A61M2205/50General characteristics of the apparatus with microprocessors or computers
    • A61M2205/502User interfaces, e.g. screens or keyboards
    • A61M2205/505Touch-screens; Virtual keyboard or keypads; Virtual buttons; Soft keys; Mouse touches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/04Heartbeat characteristics, e.g. ECG, blood pressure modulation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/08Other bio-electrical signals
    • A61M2230/10Electroencephalographic signals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/20Blood composition characteristics
    • A61M2230/205Blood composition characteristics partial oxygen pressure (P-O2)
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/30Blood pressure
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/40Respiratory characteristics
    • A61M2230/42Rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/50Temperature
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00Measuring parameters of the user
    • A61M2230/63Motion, e.g. physical activity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2460/00Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
    • H04R2460/01Hearing devices using active noise cancellation

Definitions

  • the present invention relates to the field of audio signal generation and bioelectrical signal detection. More particularly, the present invention relates to systems and methods for collecting bioelectrical signals such as movement, breathing rate, heart rate, and brainwave activity, and utilizing the data to generate audio signals in noise-cancelling headphones.
  • bioelectrical signals such as movement, breathing rate, heart rate, and brainwave activity
  • Noise-cancelling headphones have become increasingly popular due to their ability to reduce unwanted ambient sounds, making them ideal for use in noisy environments such as airplanes, trains, and buses.
  • current noise cancelling headphones only cancel out external noise and do not provide any additional functionality.
  • bioelectrical signals that can be detected from a user, such as movement, breathing rate, heart rate, and brainwave activity. These signals can be used to determine a user’s physiological state, and have several applications in healthcare, fitness, and entertainment. However, there are currently no noise-cancelling headphones that utilize these signals for audio generation.
  • An object of the present invention is to provide systems and methods for collecting bioelectrical signals such as movement, breathing rate, heart rate, and brainwave activity, and utilizing the data to generate audio signals in noise-cancelling headphones.
  • Regular meditation can positively contribute to our overall wellness, but it can be difficult for many to make it part of a routine that can be kept.
  • the present invention is a science-based meditation technology that helps make every meditation enjoyable and rewarding, so that the user can be calm, centered and ready to take on whatever the day throws at them.
  • the present invention is a premium, noise-cancelling headphone system that is exceptional for listening to music and podcasts, that also includes built-in biometric sensors and is configured to be used with an integrated app that uniquely enables the measurement of pulse, heart-rate variability, breathing and neuro rhythm (EEG brain waves) to accurately determine the user’s state of mind and dynamically alter (if needed) what the user hears.
  • the present invention coaches the user to lower stress and quiet the mind, at will.
  • the present invention includes systems and methods for collecting bioelectrical signals from a user and utilizing at least a portion of the data to generate audio signals in noise cancelling headphones.
  • the system can comprise one or more sensors that can be connected to the user in a wired, wireless, direct contact or indirect contact fashion, including one or more EEG sensors, ECG sensors, accelerometers, and microphones.
  • the data collected from one or more of the sensors is used to generate one or more bioelectrical signal datasets.
  • the bioelectrical signal datasets are then utilized to produce sound data sets and audio signals via the noise cancelling headphones.
  • the audio signals can be conditioned based on the bioelectrical signals, such as changing the tempo or rhythm of the audio signal based on the user’s heart rate, or adjusting the volume based on the user’s breathing rate.
  • the system also includes multipurpose headphones for all uses, meaning that the headphones can be used both for noise cancellation and for generating audio signals based on the bioelectrical signals collected from the user.
  • the present invention is a system comprising a sensing platform configured to monitor one or more user attributes of a user and an auditory platform configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes.
  • the system can further comprise a user interface configured to receive user data representative of one or more of the user attributes and send auditory data representative of one or more of the auditory attributes.
  • the system can further comprise a processing platform configured to process the user data and/or the auditory data.
  • the present invention is a system comprising a sensing platform comprising a first sensor configured to monitor one or more user attributes of a user and an auditory platform comprising headphones configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes, wherein the first sensor is selected from the group consisting of an EEG sensor, an ECG sensor, an accelerometer, a microphone, a thermometer, and an oximeter, wherein the user attributes are selected from the group consisting of user movements and user physiology, and wherein the auditory attributes are selected from the group consisting of volume, tone, bass, rhythm and tempo.
  • the user attribute can be stress, and the system can be configured to lower a level of the stress of the user through real-time adaptation of the auditory attributes presented to the user such that, over a time period, the user’s stress level lowers during use of the system.
  • At least one of the first and at least second sensors can be located distal to the headphones. At least one of the first and at least second sensors can be located in one or more of a bracelet, necklace, and/or a head strap.
  • the system can further comprise an analyzing platform configured to analyze the user data for one or more physiological state indications of the user.
  • the headphones can be noise cancelling headphones.
  • the sensing platform can comprise the first sensor and at least a second sensor and the user attributes can be selected from the group consisting of breathing rate, heart rate, brainwave activities, temperature, and oxidation levels.
  • the system can further comprise a communication platform configured to transmit at least a portion of the user data remote from the user.
  • At least one of the attributes can be stress, and a level of the user’s stress can be lowered in response to the adapted one or more of the auditory attributes.
  • Monitoring can comprise monitoring with a sensing platform comprising at least a first sensor.
  • the method can further comprise receiving user data representative of one or more of the user attributes and sending auditory data representative of one or more of the auditory attributes.
  • FIG. 5 illustrates an exemplary embodiment of the present invention, according to a preferred embodiment.
  • FIGS. 6A-6C illustrate exemplary components of the present invention, according to exemplary embodiments, and use indications.
  • FIGS. 7A-7D illustrate exemplary components of the present invention, according to preferred embodiments, and use indications.
  • FIG. 8 shows the headphone system of FIG. 5 in a zoomed view.
  • FIGS. 9-11 are zoomed illustrations of FIGS. 7B-7D, respectively.
  • FIGS. 13-24 are various exemplary headphone systems of the present invention, including varying views.
  • FIG. 25 illustrates an exemplary embodiment of the present invention in use.
  • Ranges may be expressed herein as from “about” or “approximately” or “substantially” one particular value and/or to “about” or “approximately” or “substantially” another particular value. When such a range is expressed, other exemplary embodiments include from the one particular value and/or to the other particular value.
  • substantially free of something can include both being “at least substantially free” of something, or “at least substantially pure”, and being “completely free” of something, or “completely pure”.
  • the present invention is a method and system for collecting and utilizing bioelectrical signals for audio generation in noise cancelling headphones.
  • the system includes multiple sensors that can be connected to the user, such as EEG sensors, ECG sensors, an accelerometer, and a microphone. These sensors collect data on the user’s physiology and movement, including breathing rate, heart rate, brainwave activities, temperature, and oxidation levels, among others.
  • the collected data is used to generate one or more bioelectrical signal datasets, which are then utilized to produce sound data sets and audio signals via the noise cancelling headphones.
  • the audio signals can be conditioned based on the bioelectrical signals, such as changing the tempo or rhythm of the audio signal based on the user’s heart rate, or adjusting the volume based on the user’s breathing rate.
  • the system also includes multipurpose headphones for all uses, including those beyond dynamic control for meditation.
  • the headphones can be used both for noise cancellation and for generating audio signals based on the bioelectrical signals collected from the user.
  • the bioelectrical signal datasets can be analyzed in real-time or stored for future analysis.
  • the analysis can be used to determine the user’s physiological state, such as their level of stress or relaxation, and can be used to adjust the audio signals accordingly. For example, if the user’s stress levels are high, the audio signals can be adjusted to promote relaxation, such as by playing calming music or sounds.
  • the system includes a user interface that allows the user to control the audio signals generated by the headphones.
  • the user interface can be in the form of a mobile application or a physical control panel on the headphones.
  • the system includes a machine learning algorithm that can learn the user’s preferences over time and adjust the audio signals accordingly. For example, if the user consistently responds positively to certain types of music or sounds, the algorithm can learn this and adjust the audio signals to match the user’s preferences.
  • the system includes a communication module that allows the user’s bioelectrical signal datasets to be transmitted to a remote device, such as a healthcare provider or fitness coach. This allows the user’s physiological state to be monitored remotely, and can be used for a variety of applications, such as remote health monitoring or fitness coaching.
  • the present invention has multiple applications across healthcare, fitness, and entertainment.
  • the system can be used for remote health monitoring, as well as for biofeedback training for stress management and relaxation.
  • biofeedback training for stress management and relaxation.
  • the system can be used for real-time monitoring of the user’s physical activity and physiological state, as well as for personalized coaching based on the user’s bioelectrical signals.
  • in entertainment, the system can be used to enhance the user’s listening experience by adjusting the audio signals based on the user’s physiological state.
  • a computing device may be referred to as a mobile device, mobile computing device, a mobile station (MS), terminal, cellular phone, cellular handset, personal digital assistant (PDA), smartphone, wireless phone, organizer, handheld computer, desktop computer, laptop computer, tablet computer, set-top box, television, appliance, game device, medical device, display device, or some other like terminology.
  • a computing device may be a processor, controller, or a central processing unit (CPU).
  • a computing device may be a set of hardware components.
  • the user interface, headphones and/or other components of the present invention that include presence-sensitive input can be a device that accepts input by the proximity of a finger, a stylus, or an object near the device.
  • a presence-sensitive input device may also be a radio receiver (for example, a Wi-Fi receiver) and processor which is able to infer proximity changes via measurements of signal strength, signal frequency shifts, signal to noise ratio, data error rates, and other changes in signal characteristics.
  • a presence-sensitive input device may also detect changes in an electric, magnetic, or gravity field.
  • a presence-sensitive display may have two main attributes. First, it may enable a user to interact directly with what is displayed, rather than indirectly via a pointer controlled by a mouse or touchpad. Secondly, it may allow a user to interact without requiring any intermediate device that would need to be held in the hand.
  • Such displays may be attached to computers, or to networks as terminals. Such displays may also play a prominent role in the design of digital appliances such as a personal digital assistant (PDA), satellite navigation devices, mobile phones, and video games. Further, such displays may include a capture device and a display.
  • a computer-readable medium may include, for example: a magnetic storage device such as a hard disk, a floppy disk or a magnetic strip; an optical storage device such as a compact disk (CD) or digital versatile disk (DVD); a smart card; and a flash memory device such as a card, stick or key drive, or embedded component.
  • a carrier wave may be employed to carry computer-readable electronic data including those used in transmitting and receiving electronic data such as electronic mail (e-mail) or in accessing a computer network such as the Internet or a local area network (LAN).
  • a person of ordinary skill in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
  • FIG. 1 depicts a block diagram of illustrative computing device architecture 100, according to an example implementation. Certain aspects of FIG. 1 may be embodied in a computing device (for example, a mobile computing device). As desired, embodiments of the disclosed technology may include a computing device with more or less of the components illustrated in FIG. 1. It will be understood that the computing device architecture 100 is provided for example purposes only and does not limit the scope of the various embodiments of the present disclosed systems, methods, and computer-readable mediums.
  • the computing device architecture 100 of FIG. 1 includes a CPU 102, where computer instructions are processed; a display interface 106 that acts as a communication interface and provides functions for rendering video, graphics, images, and text on the display.
  • the display interface 106 may be directly connected to a local display, such as a touch-screen display associated with a mobile computing device.
  • the display interface 106 may be configured for providing data, images, and other information for an external/remote display that is not necessarily physically connected to the mobile computing device.
  • a desktop monitor may be utilized for mirroring graphics and other information that is presented on a mobile computing device.
  • the display interface 106 may wirelessly communicate, for example, via a Wi-Fi channel or other available network connection interface 112 to the external/remote display.
  • the network connection interface 112 may be configured as a communication interface and may provide functions for rendering video, graphics, images, text, other information, or any combination thereof on the display.
  • a communication interface may include a serial port, a parallel port, a general-purpose input and output (GPIO) port, a game port, a universal serial bus (USB), a micro-USB port, a high-definition multimedia interface (HDMI) port, a video port, an audio port, a Bluetooth port, a near-field communication (NFC) port, another like communication interface, or any combination thereof.
  • the computing device architecture 100 may include a keyboard interface 104 that provides a communication interface to a keyboard.
  • the computing device architecture 100 may include a presence-sensitive display interface 107 for connecting to a presence-sensitive display.
  • the presence-sensitive display interface 107 may provide a communication interface to various devices such as a pointing device, a touch screen, a depth camera, etc. which may or may not be associated with a display.
  • Example embodiments of the computing device architecture 100 may include an antenna interface 110 that provides a communication interface to an antenna; a network connection interface 112 that provides a communication interface to a network.
  • a camera interface 114 is provided that acts as a communication interface and provides functions for capturing digital images from a camera.
  • a sound interface 116 is provided as a communication interface for converting sound into electrical signals using a microphone and for converting electrical signals into sound using a speaker.
  • a random-access memory (RAM) 118 is provided, where computer instructions and data may be stored in a volatile memory device for processing by the CPU 102.
  • the computing device architecture 100 includes a read-only memory (ROM) 120 where invariant low-level system code or data for basic system functions such as basic input and output (I/O), startup, or reception of keystrokes from a keyboard are stored in a non-volatile memory device.
  • ROM read-only memory
  • the computing device architecture 100 includes a storage medium 122 or other suitable type of memory (e.g., RAM, ROM, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic disks, optical disks, floppy disks, hard disks, removable cartridges, flash drives), where files including an operating system 124, application programs 126 (including, for example, a web browser application, a widget or gadget engine, and/or other applications, as necessary), and data files 128 are stored.
  • the computing device architecture 100 includes a power source 130 that provides an appropriate alternating current (AC) or direct current (DC) to power components.
  • AC alternating current
  • DC direct current
  • the storage medium 122 itself may include a number of physical drive units, such as a redundant array of independent disks (RAID), a floppy disk drive, a flash memory, a USB flash drive, an external hard disk drive, thumb drive, pen drive, key drive, a High-Density Digital Versatile Disc (HD-DVD) optical disc drive, an internal hard disk drive, a Blu-Ray optical disc drive, or a Holographic Digital Data Storage (HDDS) optical disc drive, an external mini-dual inline memory module (DIMM) synchronous dynamic random access memory (SDRAM), or an external micro-DIMM SDRAM.
  • RAID redundant array of independent disks
  • HD-DVD High-Density Digital Versatile Disc
  • HDDS Holographic Digital Data Storage
  • DIMM mini-dual inline memory module
  • SDRAM synchronous dynamic random access memory
  • micro-DIMM SDRAM an external micro-DIMM SDRAM
  • Such computer readable storage media allow a computing device to access computer-executable process steps, application programs and the like, stored on removable and non-removable memory media, to off-load data from the device or to upload data onto the device.
  • a computer program product such as one utilizing a communication system may be tangibly embodied in storage medium 122, which may comprise a machine-readable storage medium.
  • the term computing device may be a CPU, or conceptualized as a CPU (for example, the CPU 102 of FIG. 1).
  • the computing device may be coupled, connected, and/or in communication with one or more peripheral devices, such as display.
  • the term computing device, as used herein may refer to a mobile computing device, such as a smartphone or tablet computer.
  • the computing device may output content to its local display and/or speaker(s).
  • the computing device may output content to an external display device (e.g., over Wi-Fi) such as a TV or an external computing system.
  • the computing device may include any number of hardware and/or software applications that are executed to facilitate any of the operations.
  • one or more I/O interfaces may facilitate communication between the computing device and one or more input/output devices.
  • a universal serial bus port, a serial port, a disk drive, a CD-ROM drive, and/or one or more user interface devices such as a display, keyboard, keypad, mouse, control panel, touch screen display, microphone, etc.
  • the one or more I/O interfaces may be utilized to receive or collect data and/or user instructions from a wide variety of input devices. Received data may be processed by one or more computer processors as desired in various embodiments of the disclosed technology and/or stored in one or more memory devices.
  • One or more network interfaces may facilitate connection of the computing device inputs and outputs to one or more suitable networks and/or connections; for example, the connections that facilitate communication with any number of sensors associated with the system.
  • the one or more network interfaces may further facilitate connection to one or more suitable networks; for example, a local area network, a wide area network, the Internet, a cellular network, a radio frequency network, a Bluetooth enabled network, a Wi-Fi enabled network, a satellite-based network any wired network, any wireless network, etc., for communication with external devices and/or systems.
  • the action icons lead to different menus or pages within the mobile application for a presentation of the meditation content and a reception of the user input, as disclosed herein.
  • the pages can present a plurality of meditation contents, where the icons correspond to the pages in a one-to-one manner, where the pages correspond to the meditation contents in a one-to-one manner.
  • the wearable devices embody multiple forms, such as a headphone system 220, a headband or skin patch 230, and/or a wristband or bracelet 240.
  • Sensing technology can be integrated in any of the devices 220, 230, 240.
  • the sensing technology is preferably non-invasive and senses parameters of interest. For example, pulse, heart rate variability, respiration, oximetry and others can be sensed by the present invention. Further, the present invention can use sensed parameter data to calculate other important parameters that cannot be directly measured but are nonetheless useful for accurately determining a user’s state of mind and dynamically altering (if needed) what the user hears. This creates a feedback loop that enables the present invention to continually monitor user parameters and adjust delivery of audio in order to coach the user to lower stress and quiet the mind.
  • the present invention can incorporate sensing for any physiological signal, from cardiac monitoring and SpO2 to body temperature, pulse rate, respiration rate and blood pressure.
  • the headphone system 220 can host one or more sensors.
  • the headphone system 220 can include a headband, or a cord, or an earpiece, such as an ear cup, an ear bud, an ear pad, an earphone, an in-ear piece, or a pivoting earphone, any of which can host one or more sensors, whether externally or internally, such as via fastening, adhering, mating, or others.
  • headphone system 220 includes noise-cancelling hardware or software.
  • the headband or the skin patch 230 can host, whether on a flexible or rigid portion thereof, one or more sensors, whether externally or internally, such as via fastening, adhering, mating, or others.
  • one or more of the devices 220, 230, 240 can communicate with the mobile device 210. In some exemplary embodiments, one or more of the devices 220, 230, 240 can communicate with one or more of the other devices 220, 230, 240.
  • the communication can be in a wired or wireless manner, such as via a radio communication technique, an optical communication technique, an infrared communication technique, a sound communication technique, or others.
  • FIG. 3 shows a diagram of an embodiment of the user 200 meditating and employing the mobile device 210 of FIG. 2, where headband 230 is an ECG and/or an EEG sensing band, and wristband or bracelet 240 comprises, for example, a heart sensor, a pH sensor, or other biometric sensing technology, with each device configured for wired or wireless communication if a particular set-up requires same.
  • headband 230 is an ECG and/or an EEG sensing band
  • wristband or bracelet 240 comprises, for example, a heart sensor, a pH sensor, or other biometric sensing technology, with each device configured for wired or wireless communication if a particular set-up requires same.
  • FIGS. 4A, 4B and 4C illustrate exemplary embodiments of the present invention.
  • FIGS. 4A and 4C show details of an exemplary headphone system 220, while FIG. 4B illustrates the wearing of the headphone system 220 and removable headband 230.
  • FIG. 5 illustrates an exemplary embodiment of the present invention.
  • the present invention can include an intuitive user interface that allows the user to interact rapidly, naturally and frustration-free. Power and mode buttons are shown on the right. The top and bottom buttons of each set of three extend further (sit higher) than the middle ones. Details on the buttons can be used to provide the user with additional tactile feedback. The size, shape, orientation, number and other aspects of the controls/buttons are variable.
  • FIGS. 6A, 6B and 6C illustrate another exemplary embodiment of the present invention.
  • the present invention as shown in FIGS. 6A-6C has, for example, a silverish color and metallic finish.
  • the panels on the earcup housings have hinges inside for as perfect a fit to the ears as possible.
  • Under the headband of the headphone system 220, an elastic secondary headband comprising, for example, three EEG sensors can be placed. This arrangement offers a customizable fit for various head sizes.
  • the elastic secondary headband can be flexible in order to rotate to measure signals at, for example, C3-Cz-C4, P3-Pz-P4, and/or F3-Fz-F4 electrodes (an illustrative configuration sketch follows this list).
  • the ear cushions can be elliptically shaped so as to wrap from the inside out, enabling extra thickness for comfort.
  • Essential oil marks can also be placed on the cushions around the temple locations.
  • An LED indicator can provide visual feedback of the headphone status.
  • FIGS. 7A-7D illustrate exemplary components of the present invention, according to preferred embodiments, and use indications.
  • FIG. 8 shows the headphone system of FIG. 5 in a zoomed view.
  • FIGS. 9-11 are zoomed illustrations of FIGS. 7B-7D, respectively.
  • FIG. 12 is another exemplary embodiment of the present invention.
  • FIGS. 13-24 are various exemplary headphone systems of the present invention, including varying views.
  • FIG. 25 illustrates an exemplary embodiment of the present invention in use.
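By way of illustration only, and not as part of the disclosure, the sketch below shows one possible way to map the rotatable secondary headband's three EEG sensors to the electrode rows mentioned above; the montage names and data structure are assumptions made for exposition.

```python
# Illustrative only: mapping the rotatable secondary headband's three EEG
# sensors to the electrode rows named above (10-20 system positions).
EEG_MONTAGES = {
    "central":  ("C3", "Cz", "C4"),
    "parietal": ("P3", "Pz", "P4"),
    "frontal":  ("F3", "Fz", "F4"),
}


def electrodes_for(montage: str) -> tuple:
    """Return the three electrode positions for the selected headband rotation."""
    return EEG_MONTAGES[montage]


print(electrodes_for("central"))  # ('C3', 'Cz', 'C4')
```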

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Cardiology (AREA)
  • Physiology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Acoustics & Sound (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Anesthesiology (AREA)
  • Signal Processing (AREA)
  • Epidemiology (AREA)
  • Artificial Intelligence (AREA)
  • Educational Technology (AREA)
  • Hematology (AREA)
  • Pulmonology (AREA)
  • Primary Health Care (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Pain & Pain Management (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Otolaryngology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods of changing a level of at least one user attribute through real-time adaptation of auditory attributes presented to the user such that, over a time period, the user's level of the at least one user attribute changes during use of the system/method. A system can include a sensing platform configured to monitor one or more user attributes of a user, an auditory platform configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes, a user interface configured to receive user data representative of one or more of the user attributes and send auditory data representative of one or more of the auditory attributes, and a processing platform configured to process the user data and/or the auditory data.

Description

MEDITATION SYSTEMS AND METHODS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] Not Applicable
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
[0002] Not Applicable
THE NAMES OF THE PARTIES TO A JOINT RESEARCH AGREEMENT
[0003] Not Applicable
SEQUENCE LISTING
[0004] Not Applicable
STATEMENT REGARDING PRIOR DISCLOSURES BY THE INVENTOR OR A JOINT INVENTOR
[0005] Not Applicable
BACKGROUND OF THE DISCLOSURE
1. Field of the Invention
[0006] The present invention relates to the field of audio signal generation and bioelectrical signal detection. More particularly, the present invention relates to systems and methods for collecting bioelectrical signals such as movement, breathing rate, heart rate, and brainwave activity, and utilizing the data to generate audio signals in noise-cancelling headphones.
2. Description of Related Art
[0007] Noise-cancelling headphones have become increasingly popular due to their ability to reduce unwanted ambient sounds, making them ideal for use in noisy environments such as airplanes, trains, and buses. However, current noise cancelling headphones only cancel out external noise and do not provide any additional functionality.
[0008] There are also several bioelectrical signals that can be detected from a user, such as movement, breathing rate, heart rate, and brainwave activity. These signals can be used to determine a user’s physiological state, and have several applications in healthcare, fitness, and entertainment. However, there are currently no noise-cancelling headphones that utilize these signals for audio generation.
[0009] An object of the present invention is to provide systems and methods for collecting bioelectrical signals such as movement, breathing rate, heart rate, and brainwave activity, and utilizing the data to generate audio signals in noise-cancelling headphones.
BRIEF SUMMARY OF THE INVENTION
[0010] Regular meditation can positively contribute to our overall wellness, but it can be difficult for many to make it part of a routine that can be kept. The present invention is a science-based meditation technology that helps make every meditation enjoyable and rewarding, so that the user can be calm, centered and ready to take on whatever the day throws at them.
[0011] In an exemplary embodiment, the present invention is a premium, noise-cancelling headphone system that is exceptional for listening to music and podcasts, that also includes built-in biometric sensors and is configured to be used with an integrated app that uniquely enables the measurement of pulse, heart-rate variability, breathing and neuro rhythm (EEG brain waves) to accurately determine the user’s state of mind and dynamically alter (if needed) what the user hears. Using responsive 3D audio, the present invention coaches the user to lower stress and quiet the mind, at will.
[0012] In an exemplary embodiment, the present invention includes systems and methods for collecting bioelectrical signals from a user and utilizing at least a portion of the data to generate audio signals in noise cancelling headphones. The system can comprise one or more sensors that can be connected to the user in a wired, wireless, direct contact or indirect contact fashion, including one or more EEG sensors, ECG sensors, accelerometers, and microphones. The data collected from one or more of the sensors is used to generate one or more bioelectrical signal datasets.
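By way of a non-limiting illustration, one possible in-memory representation of such a bioelectrical signal dataset is sketched below in Python; the class name, fields, and example values are hypothetical and are not prescribed by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class BioSignalDataset:
    """One dataset of raw readings collected from a single sensor (illustrative)."""
    sensor_type: str        # e.g. "EEG", "ECG", "accelerometer", "microphone"
    sample_rate_hz: float   # sampling rate reported by the sensor
    samples: List[float] = field(default_factory=list)  # raw signal values

    def add_sample(self, value: float) -> None:
        """Append one raw reading taken from the user."""
        self.samples.append(value)


# Example: a heart-rate stream derived from an ECG sensor, sampled once per second.
heart_rate = BioSignalDataset(sensor_type="ECG", sample_rate_hz=1.0)
for bpm in (72.0, 71.0, 69.5):
    heart_rate.add_sample(bpm)
print(len(heart_rate.samples))  # 3
```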
[0013] The bioelectrical signal datasets are then utilized to produce sound data sets and audio signals via the noise cancelling headphones. The audio signals can be conditioned based on the bioelectrical signals, such as changing the tempo or rhythm of the audio signal based on the user’s heart rate, or adjusting the volume based on the user’s breathing rate.
[0014] The system also includes multipurpose headphones for all uses, meaning that the headphones can be used both for noise cancellation and for generating audio signals based on the bioelectrical signals collected from the user.
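The following Python sketch illustrates, under simplified and assumed mappings, the kind of conditioning described in paragraph [0013]: tempo tracks heart rate and volume tracks breathing rate. The function name, limits, and scaling factors are illustrative assumptions only, not values specified by this disclosure.

```python
def condition_audio(heart_rate_bpm: float, breathing_rate_bpm: float) -> dict:
    """Map current physiology to auditory attributes (illustrative values only)."""
    # Play slightly slower than the measured heart rate to encourage it to slow.
    tempo_bpm = max(40.0, min(120.0, heart_rate_bpm - 5.0))
    # Scale volume with breathing rate: slower breathing -> quieter audio.
    volume = max(0.2, min(1.0, breathing_rate_bpm / 20.0))
    return {"tempo_bpm": tempo_bpm, "volume": volume}


print(condition_audio(heart_rate_bpm=78.0, breathing_rate_bpm=14.0))
# {'tempo_bpm': 73.0, 'volume': 0.7}
```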
[0015] In an exemplary embodiment, the present invention is a system comprising a sensing platform configured to monitor one or more user attributes of a user and an auditory platform configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes.
[0016] The system can further comprise a user interface configured to receive user data representative of one or more of the user attributes and send auditory data representative of one or more of the auditory attributes.
[0017] The system can further comprise a processing platform configured to process the user data and/or the auditory data.
[0018] The system can further comprise an analyzing platform configured to analyze the user data for one or more physiological state indications of the user.
[0019] The system can further comprise a machine learning platform configured to learn one or more preferences of the user. The machine learning platform can further be configured to adjust the auditory data.
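As one hedged illustration of such a machine learning platform, the sketch below keeps a running preference score per sound profile and selects the best-scoring profile; the class, method names, and simple averaging scheme are assumptions made for exposition and do not limit the learning techniques that may be used.

```python
from collections import defaultdict


class PreferenceLearner:
    """Keeps a running preference score per sound profile (illustrative only)."""

    def __init__(self) -> None:
        self.scores = defaultdict(float)  # running average response per profile
        self.counts = defaultdict(int)    # number of observations per profile

    def record_response(self, profile: str, response: float) -> None:
        """Update the running average with a new user response (e.g. a stress drop)."""
        self.counts[profile] += 1
        n = self.counts[profile]
        self.scores[profile] += (response - self.scores[profile]) / n

    def best_profile(self) -> str:
        """Return the profile the user has responded to best so far."""
        return max(self.scores, key=self.scores.get)


learner = PreferenceLearner()
learner.record_response("rain", 0.8)
learner.record_response("ocean", 0.3)
learner.record_response("rain", 0.6)
print(learner.best_profile())  # "rain"
```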
[0020] In an exemplary embodiment, the present invention is a system comprising a sensing platform comprising a first sensor configured to monitor one or more user attributes of a user and an auditory platform comprising headphones configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes, wherein the first sensor is selected from the group consisting of an EEG sensor, an ECG sensor, an accelerometer, a microphone, a thermometer, and an oximeter, wherein the user attributes are selected from the group consisting of user movements and user physiology, and wherein the auditory attributes are selected from the group consisting of volume, tone, bass, rhythm and tempo.
[0021] The user attribute can be stress, and the system can be configured to lower a level of the stress of the user through real-time adaptation of the auditory attributes presented to the user such that, over a time period, the user’s stress level lowers during use of the system.
[0022] The first sensor can be located in the headphones.

[0023] The sensing platform can comprise at least a second sensor different than the first sensor, wherein at least one of the first and at least second sensors are located in the headphones.
[0024] At least one of the first and at least second sensors can be located distal the headphones. At least one of the first and at least second sensors can be located in one or more of a bracelet, necklace, and/or a head strap.
[0025] In an exemplary embodiment, the present invention is a system comprising a sensing platform configured to monitor one or more user attributes of a user, an auditory platform configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes, a user interface configured to receive user data representative of one or more of the user attributes and send auditory data representative of one or more of the auditory attributes, and a processing platform configured to process the user data and/or the auditory data, wherein the sensing platform comprises at least a first sensor configured to monitor one or more of the user attributes, wherein the auditory platform comprises headphones, and wherein the system is configured to change a level of at least one of the user attributes through real-time adaptation of the auditory attributes presented to the user such that, over a time period, the user’s level of the at least one user attribute changes during use of the system.
[0026] The system can further comprise an analyzing platform configured to analyze the user data for one or more physiological state indications of the user.
[0027] The system can further comprise a machine learning platform configured to learn one or more preferences of the user. The machine learning platform can further be configured to adjust the auditory data.
[0028] The headphones can be noise cancelling headphones.
[0029] The sensing platform can comprise the first sensor and at least a second sensor and the user attributes can be selected from the group consisting of breathing rate, heart rate, brainwave activities, temperature, and oxidation levels.
[0030] The system can further comprise a communication platform configured to transmit at least a portion of the user data remote from the user.
[0031] In an exemplary embodiment, the present invention is a method of changing a level of at least one user attribute comprising monitoring one or more user attributes of a user, presenting one or more auditory attributes to the user based at least in part on one or more of the user attributes, and adapting one or more of the auditory attributes presented to the user such that, over a time period, the user’s level of the at least one user attribute changes during the method.
[0032] At least one of the attributes can be stress, and a level of the user’s stress can be lowered in response to the adapted one or more of the auditory attributes.
[0033] Monitoring can comprise monitoring with a sensing platform comprising at least a first sensor.
[0034] Presenting can comprise presenting with an auditory platform configured to present the one or more auditory attributes to the user based at least in part on the one or more of the user attributes.
[0035] The method can further comprise receiving user data representative of one or more of the user attributes and sending auditory data representative of one or more of the auditory attributes.
[0036] Other aspects and features of exemplary embodiments of this disclosure will become apparent to those of ordinary skill in the art, upon reviewing the following description of specific, exemplary embodiments of this disclosure in concert with the various figures. While features of this disclosure may be discussed relative to certain exemplary embodiments and figures, all exemplary embodiments of this disclosure can include one or more of the features discussed in this application. While one or more exemplary embodiments may be discussed as having certain advantageous features, one or more of such features may also be used with the other various exemplary embodiments discussed in this application. In similar fashion, while exemplary embodiments may be discussed below as system or method exemplary embodiments, it is to be understood that such exemplary embodiments can be implemented in various devices, systems, and methods. As such, discussion of one feature with one exemplary embodiment does not limit other exemplary embodiments from possessing and including that same feature.
BRIEF DESCRIPTION OF THE DRAWINGS
[0037] The accompanying Figures, which are incorporated in and constitute a part of this specification, illustrate several aspects described below.
[0038] FIG. 1 depicts a block diagram of illustrative computing device architecture 100, according to an example implementation.

[0039] FIG. 2 shows a diagram of an exemplary embodiment of a user meditating and employing a mobile device and a wearable device according to a preferred embodiment of the present invention.
[0040] FIG. 3 shows a diagram of an embodiment of a user meditating and employing a mobile device and a wearable device, where the wearable device includes an EEG sensor band and a heart sensor according to a preferred embodiment of the present invention.
[0041] FIGS. 4A, 4B and 4C are exemplary embodiments of elements of the present invention, including a headphone system and a band sensor for ease of forehead mounting.
[0042] FIG. 5 illustrates an exemplary embodiment of the present invention, according to a preferred embodiment.
[0043] FIGS. 6A-6C illustrate exemplary components of the present invention, according to exemplary embodiments, and use indications.
[0044] FIGS. 7A-7D illustrate exemplary components of the present invention, according to preferred embodiments, and use indications.
[0045] FIG. 8 shows the headphone system of FIG. 5 in zoom.
[0046] FIGS. 9-11 are zoomed illustrations of FIGS. 7B-7D, respectively.
[0047] FIG. 12 is another exemplary embodiment of the present invention.
[0048] FIGS. 13-24 are various exemplary headphone systems of the present invention, including varying views.
[0049] FIG. 25 illustrates an exemplary embodiment of the present invention in use.
DETAILED DESCRIPTION OF THE INVENTION
[0050] To facilitate an understanding of the principles and features of the various exemplary embodiments of the invention, various illustrative exemplary embodiments are explained below. Although exemplary embodiments of the invention are explained in detail, it is to be understood that other exemplary embodiments are contemplated. Accordingly, it is not intended that the invention is limited in its scope to the details of construction and arrangement of components set forth in the following description or illustrated in the drawings. The invention is capable of other exemplary embodiments and of being practiced or carried out in various ways. Also, in describing the exemplary embodiments, specific terminology will be resorted to for the sake of clarity.
[0051] It must also be noted that, as used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural references unless the context clearly dictates otherwise. For example, reference to a component is intended also to include a composition of a plurality of components. References to a composition containing “a” constituent are intended to include other constituents in addition to the one named.
[0052] Also, in describing the exemplary embodiments, terminology will be resorted to for the sake of clarity. It is intended that each term contemplates its broadest meaning as understood by those skilled in the art and includes all technical equivalents which operate in a similar manner to accomplish a similar purpose.
[0053] Ranges may be expressed herein as from “about” or “approximately” or “substantially” one particular value and/or to “about” or “approximately” or “substantially” another particular value. When such a range is expressed, other exemplary embodiments include from the one particular value and/or to the other particular value.
[0054] Similarly, as used herein, “substantially free” of something, or “substantially pure”, and like characterizations, can include both being “at least substantially free” of something, or “at least substantially pure”, and being “completely free” of something, or “completely pure”.
[0055] By “comprising” or “containing” or “including” is meant that at least the named compound, element, particle, or method step is present in the composition or article or method, but does not exclude the presence of other compounds, materials, particles, or method steps, even if the other such compounds, materials, particles, or method steps have the same function as what is named.
[0056] It is also to be understood that the mention of one or more method steps does not preclude the presence of additional method steps or intervening method steps between those steps expressly identified. Similarly, it is also to be understood that the mention of one or more components in a composition does not preclude the presence of additional components than those expressly identified.
[0057] The materials described as making up the various elements of the invention are intended to be illustrative and not restrictive. Many suitable materials that would perform the same or a similar function as the materials described herein are intended to be embraced within the scope of the invention. Such other materials not described herein can include, but are not limited to, for example, materials that are developed after the time of the development of the invention.
[0058] The present invention is a method and system for collecting and utilizing bioelectrical signals for audio generation in noise cancelling headphones. The system includes multiple sensors that can be connected to the user, such as EEG sensors, ECG sensors, an accelerometer, and a microphone. These sensors collect data on the user’s physiology and movement, including breathing rate, heart rate, brainwave activities, temperature, and oxidation levels, among others.
[0059] The collected data is used to generate one or more bioelectrical signal datasets, which are then utilized to produce sound data sets and audio signals via the noise cancelling headphones. The audio signals can be conditioned based on the bioelectrical signals, such as changing the tempo or rhythm of the audio signal based on the user’s heart rate, or adjusting the volume based on the user’s breathing rate.
[0060] The system also includes multipurpose headphones for all uses, including those beyond dynamic control for meditation. The headphones can be used both for noise cancellation and for generating audio signals based on the bioelectrical signals collected from the user.
[0061] The bioelectrical signal datasets can be analyzed in real-time or stored for future analysis. The analysis can be used to determine the user’s physiological state, such as their level of stress or relaxation, and can be used to adjust the audio signals accordingly. For example, if the user’s stress levels are high, the audio signals can be adjusted to promote relaxation, such as by playing calming music or sounds.
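As a hedged illustration of the analysis described in paragraph [0061], the following sketch derives a toy stress score from heart-rate variability (RMSSD over RR intervals) and selects calmer content when the score is high. The scaling constant, the threshold, the use of RMSSD as a stress proxy here and the content labels are assumptions, not the claimed analysis.

```python
# Toy stress estimate from RR intervals; illustrative only.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_score(rr_intervals_ms):
    """Lower RMSSD is commonly read as higher arousal; maps to 0 (calm) .. 1 (stressed)."""
    return 1.0 / (1.0 + rmssd(rr_intervals_ms) / 50.0)

def choose_program(score):
    return "calming_soundscape" if score > 0.5 else "neutral_music"

rr = [812, 790, 845, 801, 798, 830]        # example RR intervals in ms
score = stress_score(rr)
print(round(score, 2), choose_program(score))   # ~0.58 -> "calming_soundscape"
```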
[0062] In an exemplary embodiment of the invention, the system includes a user interface that allows the user to control the audio signals generated by the headphones. The user interface can be in the form of a mobile application or a physical control panel on the headphones.
[0063] In another embodiment of the invention, the system includes a machine learning algorithm that can learn the user’s preferences over time and adjust the audio signals accordingly. For example, if the user consistently responds positively to certain types of music or sounds, the algorithm can learn this and adjust the audio signals to match the user’s preferences.

[0064] In yet another embodiment of the invention, the system includes a communication module that allows the user’s bioelectrical signal datasets to be transmitted to a remote device, such as a healthcare provider or fitness coach. This allows the user’s physiological state to be monitored remotely, and can be used for a variety of applications, such as remote health monitoring or fitness coaching.
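One minimal way to realize the preference learning described in paragraph [0063] is an incremental average of the user’s responses per content type, as sketched below; the feedback scale and content labels are illustrative assumptions, not the claimed algorithm.

```python
# Minimal preference-learning sketch: running mean of responses per content type.
from collections import defaultdict

class PreferenceLearner:
    def __init__(self):
        self.scores = defaultdict(float)   # content type -> running mean response
        self.counts = defaultdict(int)

    def record(self, content_type: str, response: float) -> None:
        """response: -1.0 (disliked / stress rose) .. +1.0 (liked / stress fell)."""
        self.counts[content_type] += 1
        n = self.counts[content_type]
        self.scores[content_type] += (response - self.scores[content_type]) / n

    def best(self) -> str:
        """Content type with the highest mean response so far."""
        return max(self.scores, key=self.scores.get)

learner = PreferenceLearner()
learner.record("rainfall", +0.8)
learner.record("binaural_tones", -0.2)
learner.record("rainfall", +0.6)
print(learner.best())   # -> "rainfall"
```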
[0065] The present invention has multiple applications across healthcare, fitness, and entertainment. In healthcare, the system can be used for remote health monitoring, as well as for biofeedback training for stress management and relaxation. In fitness, the system can be used for real-time monitoring of the user’s physical activity and physiological state, as well as for personalized coaching based on the user’s bioelectrical signals. In entertainment, the system can be used to enhance the user’s listening experience by adjusting the audio signals based on the user’s physiological state.
[0066] In some instances, a computing device may be referred to as a mobile device, mobile computing device, a mobile station (MS), terminal, cellular phone, cellular handset, personal digital assistant (PDA), smartphone, wireless phone, organizer, handheld computer, desktop computer, laptop computer, tablet computer, set-top box, television, appliance, game device, medical device, display device, or some other like terminology. In other instances, a computing device may be a processor, controller, or a central processing unit (CPU). In yet other instances, a computing device may be a set of hardware components.
[0067] The user interface, headphones and/or other components of the present invention that include presence-sensitive input can accept input by the proximity of a finger, a stylus, or an object near the device. A presence-sensitive input device may also be a radio receiver (for example, a Wi-Fi receiver) and processor able to infer proximity changes via measurements of signal strength, signal frequency shifts, signal-to-noise ratio, data error rates, and other changes in signal characteristics. A presence-sensitive input device may also detect changes in an electric, magnetic, or gravity field.
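For illustration, proximity inference from signal strength, as mentioned in paragraph [0067], could be approximated by smoothing received signal strength (RSSI) and flagging a sustained rise. The smoothing constant and threshold below are assumptions, and real RSSI behaviour is highly device and environment dependent.

```python
# Illustrative proximity-change detector based on smoothed RSSI (dBm).
def detect_approach(rssi_samples_dbm, alpha=0.3, threshold_db=6.0):
    """Return True if the smoothed RSSI rises by more than threshold_db,
    read here as the user (or an object) moving closer to the receiver."""
    smoothed = rssi_samples_dbm[0]
    baseline = smoothed
    for rssi in rssi_samples_dbm[1:]:
        smoothed = alpha * rssi + (1 - alpha) * smoothed   # exponential smoothing
        if smoothed - baseline > threshold_db:
            return True
    return False

print(detect_approach([-70, -69, -68, -64, -60, -57]))  # True: signal strengthening
```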
[0068] A presence-sensitive input device may be combined with a display to provide a presence-sensitive display. For example, a user may provide an input to a computing device by touching the surface of a presence-sensitive display using a finger. In another example implementation, a user may provide input to a computing device by gesturing without physically touching any object. For example, a gesture may be received via a video camera or depth camera.
[0069] In some instances, a presence-sensitive display may have two main attributes. First, it may enable a user to interact directly with what is displayed, rather than indirectly via a pointer controlled by a mouse or touchpad. Secondly, it may allow a user to interact without requiring any intermediate device that would need to be held in the hand. Such displays may be attached to computers, or to networks as terminals. Such displays may also play a prominent role in the design of digital appliances such as a personal digital assistant (PDA), satellite navigation devices, mobile phones, and video games. Further, such displays may include a capture device and a display.
[0070] Various aspects described herein may be implemented using standard programming or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computing device to implement the disclosed subject matter. A computer-readable medium may include, for example: a magnetic storage device such as a hard disk, a floppy disk or a magnetic strip; an optical storage device such as a compact disk (CD) or digital versatile disk (DVD); a smart card; and a flash memory device such as a card, stick or key drive, or embedded component. Additionally, it should be appreciated that a carrier wave may be employed to carry computer-readable electronic data including those used in transmitting and receiving electronic data such as electronic mail (e-mail) or in accessing a computer network such as the Internet or a local area network (LAN). Of course, a person of ordinary skill in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
[0071] FIG. 1 depicts a block diagram of illustrative computing device architecture 100, according to an example implementation. Certain aspects of FIG. 1 may be embodied in a computing device (for example, a mobile computing device). As desired, embodiments of the disclosed technology may include a computing device with more or less of the components illustrated in FIG. 1. It will be understood that the computing device architecture 100 is provided for example purposes only and does not limit the scope of the various embodiments of the present disclosed systems, methods, and computer-readable mediums.
[0072] The computing device architecture 100 of FIG. 1 includes a CPU 102, where computer instructions are processed, and a display interface 106 that acts as a communication interface and provides functions for rendering video, graphics, images, and text on the display. According to some embodiments of the disclosed technology, the display interface 106 may be directly connected to a local display, such as a touch-screen display associated with a mobile computing device. In another example embodiment, the display interface 106 may be configured for providing data, images, and other information for an external/remote display that is not necessarily physically connected to the mobile computing device. For example, a desktop monitor may be utilized for mirroring graphics and other information that is presented on a mobile computing device. According to some embodiments, the display interface 106 may wirelessly communicate, for example, via a Wi-Fi channel or other available network connection interface 112 to the external/remote display.
[0073] In an example embodiment, the network connection interface 112 may be configured as a communication interface and may provide functions for rendering video, graphics, images, text, other information, or any combination thereof on the display. In one example, a communication interface may include a serial port, a parallel port, a general-purpose input and output (GPIO) port, a game port, a universal serial bus (USB), a micro-USB port, a high-definition multimedia (HDMI) port, a video port, an audio port, a Bluetooth port, a near-field communication (NFC) port, another like communication interface, or any combination thereof.
[0074] The computing device architecture 100 may include a keyboard interface 104 that provides a communication interface to a keyboard. In one example embodiment, the computing device architecture 100 may include a presence-sensitive display interface 107 for connecting to a presence-sensitive display. According to some embodiments of the disclosed technology, the presence-sensitive display interface 107 may provide a communication interface to various devices such as a pointing device, a touch screen, a depth camera, etc., which may or may not be associated with a display.
[0075] The computing device architecture 100 may be configured to use an input device via one or more of the input/output interfaces (for example, the keyboard interface 104, the display interface 106, the presence-sensitive display interface 107, the network connection interface 112, the camera interface 114, the sound interface 116, etc.) to allow a user to capture information into the computing device architecture 100. The input device may include a mouse, a trackball, a directional pad, a track pad, a touch-verified track pad, a presence-sensitive track pad, a presence-sensitive display, a scroll wheel, a digital camera, a digital video camera, a web camera, a microphone, a sensor, a smartcard, and the like. Additionally, the input device may be integrated with the computing device architecture 100 or may be a separate device. For example, the input device may be an accelerometer, a magnetometer, a digital camera, a microphone, or an optical sensor.
[0076] Example embodiments of the computing device architecture 100 may include an antenna interface 110 that provides a communication interface to an antenna; a network connection interface 112 that provides a communication interface to a network. According to certain embodiments, a camera interface 114 is provided that acts as a communication interface and provides functions for capturing digital images from a camera. According to certain embodiments, a sound interface 116 is provided as a communication interface for converting sound into electrical signals using a microphone and for converting electrical signals into sound using a speaker. According to example embodiments, a random-access memory (RAM) 118 is provided, where computer instructions and data may be stored in a volatile memory device for processing by the CPU 102.
[0077] According to an example embodiment, the computing device architecture 100 includes a read-only memory (ROM) 120 where invariant low-level system code or data for basic system functions such as basic input and output (I/O), startup, or reception of keystrokes from a keyboard are stored in a non-volatile memory device. According to an example embodiment, the computing device architecture 100 includes a storage medium 122 or other suitable type of memory (e.g., RAM, ROM, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic disks, optical disks, floppy disks, hard disks, removable cartridges, flash drives), where files including an operating system 124, application programs 126 (including, for example, a web browser application, a widget or gadget engine, and/or other applications, as necessary) and data files 128 are stored. According to an example embodiment, the computing device architecture 100 includes a power source 130 that provides an appropriate alternating current (AC) or direct current (DC) to power components. According to an example embodiment, the computing device architecture 100 includes a telephony subsystem 132 that allows the device 100 to transmit and receive sound over a telephone network. The constituent devices and the CPU 102 communicate with each other over a bus 134.

[0078] According to an example embodiment, the CPU 102 has appropriate structure to be a computer processor. In one arrangement, the CPU 102 may include more than one processing unit. The RAM 118 interfaces with the computer bus 134 to provide quick RAM storage to the CPU 102 during the execution of software programs such as the operating system, application programs, and device drivers. More specifically, the CPU 102 loads computer-executable process steps from the storage medium 122 or other media into a field of the RAM 118 in order to execute software programs. Data may be stored in the RAM 118, where the data may be accessed by the computer CPU 102 during execution. In one example configuration, the device architecture 100 includes at least 125 MB of RAM and 256 MB of flash memory.
[0079] The storage medium 122 itself may include a number of physical drive units, such as a redundant array of independent disks (RAID), a floppy disk drive, a flash memory, a USB flash drive, an external hard disk drive, thumb drive, pen drive, key drive, a High-Density Digital Versatile Disc (HD-DVD) optical disc drive, an internal hard disk drive, a Blu-Ray optical disc drive, or a Holographic Digital Data Storage (HDDS) optical disc drive, an external mini-dual inline memory module (DIMM) synchronous dynamic random access memory (SDRAM), or an external micro-DIMM SDRAM. Such computer readable storage media allow a computing device to access computer-executable process steps, application programs and the like, stored on removable and non-removable memory media, to off-load data from the device or to upload data onto the device. A computer program product, such as one utilizing a communication system may be tangibly embodied in storage medium 122, which may comprise a machine-readable storage medium.
[0080] According to one example embodiment, the term computing device, as used herein, may be a CPU, or conceptualized as a CPU (for example, the CPU 102 of FIG. 1). In this example embodiment, the computing device may be coupled, connected, and/or in communication with one or more peripheral devices, such as a display. In another example embodiment, the term computing device, as used herein, may refer to a mobile computing device, such as a smartphone or tablet computer. In this example embodiment, the computing device may output content to its local display and/or speaker(s). In another example embodiment, the computing device may output content to an external display device (e.g., over Wi-Fi) such as a TV or an external computing system.

[0081] In some embodiments of the disclosed technology, the computing device may include any number of hardware and/or software applications that are executed to facilitate any of the operations. In some embodiments, one or more I/O interfaces may facilitate communication between the computing device and one or more input/output devices. For example, a universal serial bus port, a serial port, a disk drive, a CD-ROM drive, and/or one or more user interface devices, such as a display, keyboard, keypad, mouse, control panel, touch screen display, microphone, etc., may facilitate user interaction with the computing device. The one or more I/O interfaces may be utilized to receive or collect data and/or user instructions from a wide variety of input devices. Received data may be processed by one or more computer processors as desired in various embodiments of the disclosed technology and/or stored in one or more memory devices.
[0082] One or more network interfaces may facilitate connection of the computing device inputs and outputs to one or more suitable networks and/or connections; for example, the connections that facilitate communication with any number of sensors associated with the system. The one or more network interfaces may further facilitate connection to one or more suitable networks; for example, a local area network, a wide area network, the Internet, a cellular network, a radio frequency network, a Bluetooth enabled network, a Wi-Fi enabled network, a satellite-based network, any wired network, any wireless network, etc., for communication with external devices and/or systems.
[0083] FIG. 2 illustrates a user 200 meditating and employing a presence-sensitive input device comprising a user interface 210, which in an exemplary embodiment can be a mobile device 210, and wearable devices 220, 230, 240, each of which may employ presence-sensitive inputs. The user 200 can use one or more of the devices 210, 220, 230, 240 to meditate. As shown, the mobile device 210 can be embodied as a smartphone running a mobile app depicting a graphical user interface with a plurality of action icons that are concurrently displayed in order to receive a user input, such as a touch input, a sound input, an image input, a shake input, or others. When activated, the action icons lead to different menus or pages within the mobile app for a presentation of the meditation content and a reception of the user input, as disclosed herein. For example, the pages can present a plurality of meditation contents, where the icons correspond to the pages in a one-to-one manner, and where the pages correspond to the meditation contents in a one-to-one manner.

[0084] The wearable devices embody multiple forms, such as a headphone system 220, a headband or skin patch 230, and/or a wristband or bracelet 240.
[0085] Sensing technology can be integrated in any of the devices 220, 230, 240. The sensing technology is preferably non-invasive and senses parameters of interest. For example, pulse, heart rate variability, respiration, oximetry and others can be sensed by the present invention. Further, the present invention can use sensed parameter data to calculate other important parameters that cannot be directly measured but nonetheless are useful for the present invention to accurately determine a user’s state of mind and dynamically alter (if needed) what the user hears, in a feedback loop enabling the present invention to continually monitor user parameters and adjust delivery of audio in order to coach the user to lower stress and quiet the mind.
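A minimal sketch of such a feedback loop follows, assuming hypothetical read_sensors() and audio_out() stand-ins for the sensing and auditory platforms; the one-second cadence and the proportional nudge toward a target heart rate are illustrative choices, not the claimed control strategy.

```python
# Illustrative coaching feedback loop; interfaces and gains are assumptions.
import time

def coaching_loop(read_sensors, audio_out, target_hr=60, minutes=10):
    tempo = 70.0                                          # starting tempo in BPM
    for _ in range(int(minutes * 60)):
        sample = read_sensors()                           # e.g. {"heart_rate": 74, ...}
        error = sample["heart_rate"] - target_hr
        # Nudge the tempo downward while the heart rate exceeds the target,
        # clamped to a comfortable range.
        tempo = max(40.0, min(110.0, tempo - 0.05 * error))
        audio_out(tempo_bpm=tempo)
        time.sleep(1.0)                                   # re-evaluate once per second
```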
[0086] The present invention can incorporate sensing for any physiological signal, from cardiac monitoring and SpO2 to body temperature, pulse rate, respiration rate and blood pressure.
[0087] The headphone system 220, whether wired or wireless, whether circum-aural, supra-aural, open, semi-open, semi-closed, closed back, ear-fitting, or a headset, can host one or more sensors. The headphone system 220 can include a headband, or a cord, or an earpiece, such as an ear cup, an ear bud, an ear pad, an earphone, an in-ear piece, or a pivoting earphone, any of which can host one or more sensors, whether externally or internally, such as via fastening, adhering, mating, or others. In some embodiments, headphone system 220 includes noise-cancelling hardware or software.
[0088] For example, the headband or the skin patch 230 can host, whether on a flexible or rigid portion thereof, one or more sensors, whether externally or internally, such as via fastening, adhering, mating, or others.
[0089] For example, the wristband or the bracelet 240 can similarly host, whether on a flexible or rigid portion thereof, one or more sensors, whether externally or internally, such as via fastening, adhering, mating, or others.
[0090] In some exemplary embodiments, one or more of the devices 220, 230, 240 can communicate with the mobile device 210. In some exemplary embodiments, one or more of the devices 220, 230, 240 can communicate with one or more others of the devices 220, 230, 240. The communication can be in a wired or wireless manner, such as via a radio communication technique, an optical communication technique, an infrared communication technique, a sound communication technique, or others.
[0091] FIG. 3 shows a diagram of an embodiment of the user 200 meditating and employing the mobile device 210 of FIG. 2, where the headband 230 is an ECG and/or an EEG sensing band, and the wristband or bracelet 240 comprises, for example, a heart sensor, a pH sensor, or other biometric sensing technology, with each device configured for wired or wireless communication if a particular set-up requires it.
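As one hedged example of the wireless case, a wrist-worn heart sensor could stream readings over the standard Bluetooth LE Heart Rate Measurement characteristic, read here with the third-party bleak library; the device address is a placeholder, and whether the wristband 240 exposes this GATT service is an assumption.

```python
# Illustrative BLE heart-rate reader using the standard GATT Heart Rate service.
import asyncio
from bleak import BleakClient

HR_MEASUREMENT = "00002a37-0000-1000-8000-00805f9b34fb"  # standard GATT UUID

def parse_heart_rate(data: bytearray) -> int:
    # Bit 0 of the flags byte selects 8-bit vs 16-bit heart-rate format.
    return int.from_bytes(data[1:3], "little") if data[0] & 0x01 else data[1]

async def stream_heart_rate(address: str, seconds: float = 30.0) -> None:
    def on_notify(_sender, data: bytearray) -> None:
        print("heart rate:", parse_heart_rate(data), "bpm")

    async with BleakClient(address) as client:
        await client.start_notify(HR_MEASUREMENT, on_notify)
        await asyncio.sleep(seconds)
        await client.stop_notify(HR_MEASUREMENT)

# asyncio.run(stream_heart_rate("AA:BB:CC:DD:EE:FF"))  # placeholder address
```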
[0092] FIGS. 4A, 4B and 4C illustrate exemplary embodiments of the present invention. FIGS. 4A and 4C show details of an exemplary headphone system 220, while FIG. 4B illustrates the wearing of the headphone system 220 and removable headband 230.
[0093] FIG. 5 illustrates an exemplary embodiment of the present invention. The present invention can include an intuitive user interface that allows the user to interact rapidly, naturally and frustration-free. Power and mode buttons are shown on the right. The top and bottom buttons of each group of three extend further (are higher) than the middle ones. Detailing on the buttons can be used to provide the user with additional tactile feedback. The size, shape, orientation, number and other aspects of the controls/buttons are variable.
[0094] FIGS. 6A, 6B and 6C illustrate another exemplary embodiment of the present invention. The present invention as shown in FIGS. 6A-6C has, for example, a silverish color and metallic finish. The panels on the earcup housings have hinges inside for as perfect a fit to the ears as possible. Under the headband of the headphone system 220, an elastic secondary band comprising, for example, three EEG sensors can be placed. This arrangement offers a customizable fit for various head sizes. The elastic secondary headband can be flexible in order to rotate to measure signals at, for example, the C3-Cz-C4, P3-Pz-P4, and/or F3-Fz-F4 electrode positions.
[0095] The ear cushions can be elliptically shaped so as to wrap from the inside out, enabling extra thickness for comfort. Essential oil marks can also be placed on the cushions around the temple locations. An LED indicator can provide visual feedback of the headphone status.
[0096] FIGS. 7A-7D illustrate exemplary components of the present invention, according to preferred embodiments, and use indications. FIG. 8 shows the headphone system of FIG. 5 in zoom. FIGS. 9-11 are zoomed illustrations of FIGS. 7B-7D, respectively. FIG. 12 is another exemplary embodiment of the present invention. FIGS. 13-24 are various exemplary headphone systems of the present invention, including varying views. FIG. 25 illustrates an exemplary embodiment of the present invention in use.
[0097] Numerous characteristics and advantages have been set forth in the foregoing description, together with details of structure and function. While the invention has been disclosed in several forms, it will be apparent to those skilled in the art that many modifications, additions, and deletions, especially in matters of shape, size, and arrangement of parts, can be made therein without departing from the spirit and scope of the invention and its equivalents as set forth in the following claims. Therefore, other modifications or exemplary embodiments as may be suggested by the teachings herein are particularly reserved as they fall within the breadth and scope of the claims here appended.


CLAIMS

What is claimed is:
1. A system comprising: a sensing platform configured to monitor one or more user attributes of a user; and an auditory platform configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes.
2. The system of Claim 1 further comprising a user interface configured to: receive user data representative of one or more of the user attributes; and send auditory data representative of one or more of the auditory attributes.
3. The system of Claim 2 further comprising a processing platform configured to process the user data and/or the auditory data.
4. The system of Claim 3 further comprising an analyzing platform configured to analyze the user data for one or more physiological state indications of the user.
5. The system of Claim 4 further comprising a machine learning platform configured to learn one or more preferences of the user.
6. The system of Claim 5, wherein the machine learning platform is further configured to adjust the auditory data.
7. A system comprising: a sensing platform comprising a first sensor configured to monitor one or more user attributes of a user; and an auditory platform comprising headphones configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes; wherein the first sensor is selected from the group consisting of an EEG sensor, an ECG sensor, an accelerometer, a microphone, a thermometer, and an oximeter; wherein the user attributes are selected from the group consisting of user movements and user physiology; and wherein the auditory attributes are selected from the group consisting of volume, tone, bass, rhythm and tempo.
8. The system of Claim 7, wherein the user attribute is stress; and wherein the system is configured to lower a level of the stress of the user through real-time adaptation of the auditory attributes presented to the user such that, over a time period, the user’s stress level lowers during use of the system.
9. The system of Claim 7, wherein the first sensor is located in the headphones.
10. The system of Claim 7, wherein the sensing platform comprises at least a second sensor different than the first sensor; wherein at least one of the first and at least second sensors are located in the headphones.
11. The system of Claim 10, wherein at least one of the first and at least second sensors are located distal the headphones.
12. The system of Claim 10, wherein at least one of the first and at least second sensors are located distal the headphones, in one or more of a bracelet, necklace, and/or a head strap.
13. A system comprising: a sensing platform configured to monitor one or more user attributes of a user; an auditory platform configured to present one or more auditory attributes to the user based at least in part on one or more of the user attributes; a user interface configured to: receive user data representative of one or more of the user attributes; and send auditory data representative of one or more of the auditory attributes; and a processing platform configured to process the user data and/or the auditory data; wherein the sensing platform comprises at least a first sensor configured to monitor one or more of the user attributes; wherein the auditory platform comprises headphones; and wherein the system is configured to change a level of at least one of the user attributes through real-time adaptation of the auditory attributes presented to the user such that, over a time period, the user’s level of the at least one user attribute changes during use of the system.
14. The system of Claim 13 further comprising an analyzing platform configured to analyze the user data for one or more physiological state indications of the user.
15. The system of Claim 14 further comprising a machine learning platform configured to learn one or more preferences of the user.
16. The system of Claim 15, wherein the machine learning platform is further configured to adjust the auditory data.
17. The system of Claim 13, wherein the headphones are noise cancelling headphones.
18. The system of Claim 13, wherein the sensing platform comprises the first sensor and at least a second sensor; and wherein the user attributes are selected from the group consisting of breathing rate, heart rate, brainwave activities, temperature, and oxidation levels.
19. The system of Claim 13 further comprising a communication platform configured to transmit at least a portion of the user data remote from the user.
20. A method of changing a level of at least one user attribute comprising: monitoring one or more user attributes of a user; presenting one or more auditory attributes to the user based at least in part on one or more of the user attributes; and adapting one or more of the auditory attributes presented to the user such that, over a time period, the user’s level of the at least one user attribute changes during the method.
21. The method of Claim 20, wherein at least one user attribute is stress; and wherein a level of the user’s stress lowers in response to the adapted one or more of the auditory attributes.
22. The method of Claim 20, wherein monitoring comprises monitoring with a sensing platform comprising at least a first sensor; and wherein presenting comprises presenting with an auditory platform configured to present the one or more auditory attributes to the user based at least in part on the one or more of the user attributes.
23. The method of Claim 22 further comprising: receiving user data representative of one or more of the user attributes; and sending auditory data representative of one or more of the auditory attributes.
PCT/IB2023/053097 2022-03-28 2023-03-28 Meditation systems and methods WO2023187660A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263269997P 2022-03-28 2022-03-28
US63/269,997 2022-03-28

Publications (1)

Publication Number Publication Date
WO2023187660A1 true WO2023187660A1 (en) 2023-10-05

Family

ID=88199877

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/053097 WO2023187660A1 (en) 2022-03-28 2023-03-28 Meditation systems and methods

Country Status (1)

Country Link
WO (1) WO2023187660A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170339484A1 (en) * 2014-11-02 2017-11-23 Ngoggle Inc. Smart audio headphone system
US20190222918A1 (en) * 2015-08-05 2019-07-18 Emotiv Inc. Method and system for collecting and processing bioelectrical and audio signals
US20200029881A1 (en) * 2016-09-29 2020-01-30 Mindset Innovation Inc. Biosignal headphones



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23778625

Country of ref document: EP

Kind code of ref document: A1