WO2017137890A1 - Systems and methods for providing information to a user based on acquired sensor data and database information - Google Patents

Systems and methods for providing information to a user based on acquired sensor data and database information

Info

Publication number
WO2017137890A1
Authority
WO
WIPO (PCT)
Prior art keywords
human
intelligible
sensor
signal output
processor
Prior art date
Application number
PCT/IB2017/050655
Other languages
French (fr)
Inventor
Julien Penders
Eric Dy
Marco Altini
Original Assignee
Bloom Technologies NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bloom Technologies NV filed Critical Bloom Technologies NV
Priority to EP17705176.0A priority Critical patent/EP3414688A1/en
Publication of WO2017137890A1 publication Critical patent/WO2017137890A1/en

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Definitions

  • This invention relates generally to the field of signal data analyses, and more specifically to new and useful systems and methods for providing information to a user based on acquired sensor data and database information.
  • the present disclosure is directed to systems and methods for increasing the intelligibility of data, for example by transforming structured data into unstructured data and vice versa.
  • One aspect of the present disclosure is directed to a system for providing information to a user based on acquired sensor data and database information.
  • the system includes: a first sensor; a processor communicatively coupled to the first sensor; and a computer- readable medium having non-transitory, processor-executable instructions stored thereon.
  • execution of the instructions causes the processor to perform a method including: acquiring a first signal output from the first sensor, identifying a first reproducible feature in the first signal output, extracting the first reproducible feature from the first signal output, and translating the first reproducible feature into a first human-intelligible element.
  • the first human-intelligible element describes an unknown condition detected by the first sensor.
  • the method executed by the processor further includes: querying, using the first human-intelligible element, a data source to identify a second human-intelligible element.
  • the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
  • the data source is tailored to the user using one or more of a tone and an angle.
  • the system further includes a second sensor.
  • the second sensor is communicatively coupled to the processor.
  • the method executed by the processor further includes: acquiring a second signal output from the second sensor; identifying a second reproducible feature in the second signal output; extracting the second reproducible feature from the second signal output; translating the second reproducible feature into a second human-intelligible element; integrating the first human-intelligible element from the first sensor and the second human-intelligible element from the second sensor; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element.
  • the second reproducible feature is different than the first reproducible feature.
  • the second human-intelligible element describes an unknown condition detected by the second sensor.
  • the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements.
  • the method executed by the processor further includes: receiving a text input from a user; extracting a second human-intelligible element from the text input; integrating the first and second human-intelligible elements; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element.
  • the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements.
  • the system includes: a first sensor; a first processor communicatively coupled to the first sensor; a first computer-readable medium having a first set of non-transitory, processor-executable instructions stored thereon; a computing device communicatively coupled to the first processor, a second processor, and a data source; and a second computer-readable medium having a second set of non-transitory, processor-executable instructions stored thereon.
  • execution of the first set of instructions causes the first processor to perform a method including: acquiring a first signal output from a first sensor, identifying a reproducible feature in the first signal output, and extracting the reproducible feature from the first signal output.
  • execution of the second set of instructions causes the second processor to perform a method including: receiving the extracted reproducible feature from the first processor; translating the extracted reproducible feature into a first human-intelligible element; and querying, using the first human-intelligible element, the data source to identify a second human-intelligible element.
  • the first human-intelligible element describes an unknown condition detected by the first sensor.
  • the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
  • the system further includes a Data2Language engine.
  • translating is performed by the Data2Language engine.
  • the system further includes one or more of a word database, an image database, and a relation database.
  • the first human-intelligible element is derived from one or more of the word database, the image database, and the relation database.
  • the method executed by the second processor further includes acquiring an input from an expert to enrich a word database, an image database, and a relation database.
  • the method performed by the first processor further includes extracting a pattern from the reproducible feature.
  • translating further includes mapping the reproducible feature to the first human-intelligible element.
  • the first and second human-intelligible elements include one or more of words, phrases, sentences, and images.
  • translating further includes: creating an association between two or more words; and combining the two or more words into one or more sentences.
  • the method performed by the second processor further includes automatically inputting the first human-intelligible element into a search engine.
  • the system further includes a second sensor.
  • acquiring a first signal output from the first sensor further includes acquiring a second signal output from the second sensor.
  • the method performed by the second processor further includes integrating the first human-intelligible element derived from the first and second data outputs from the first and second sensors, respectively, to identify the unknown condition.
  • the first sensor is different from the second sensor.
  • the reproducible feature includes one or more of a contraction duration, a contraction frequency, a contraction intensity, a maternal heart rate, a fetal heart rate, and a number of fetal movements.
  • the system further includes a display communicatively coupled to the processor.
  • the display is configured to display the identified condition to the user.
  • the identified condition includes one or more of: a stage of pregnancy, a post-partum condition, a preconception condition, a maternal health or wellbeing status, a maternal stress level, a maternal sleep index, a fetal movement level, a fetal health or wellbeing status, a fetal stress level, and a stage of labor.
  • the data source includes scientific papers, reports, clinical papers, clinical reports, websites, blogs, social media platforms, medical records, medical textbooks, and clinical notes.
  • Another aspect of the present disclosure is directed to a computer-implemented method for providing information to a user based on acquired sensor data and database information.
  • the method includes: acquiring a first signal output from a first sensor; identifying a reproducible feature in the first signal output; extracting the reproducible feature from the first signal output; translating the reproducible feature into a first human-intelligible element; and querying, using the first human-intelligible element, a data source to identify a second human-intelligible element.
  • the first human-intelligible element describes an unknown condition detected by the first sensor.
  • the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
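To make the claimed flow concrete, here is a minimal Python sketch that walks one heart-rate signal through the five recited steps (acquire, identify, extract, translate, query). The threshold, the database contents, and all function names are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of the claimed pipeline: acquire -> identify -> extract ->
# translate -> query. All names, thresholds, and data are assumptions.
from statistics import mean

WORD_DATABASE = {"elevated_heart_rate": "elevated maternal heart rate"}
THRESHOLD_BPM = 100  # hypothetical cut-off for the example feature

def acquire_signal(sensor):
    """Acquire a first signal output from the first sensor."""
    return sensor.read()  # e.g., a list of heart-rate samples in bpm

def identify_feature(signal):
    """Identify a reproducible feature in the signal output."""
    return "elevated_heart_rate" if mean(signal) > THRESHOLD_BPM else None

def extract_feature(signal):
    """Extract the reproducible feature (here, the samples exhibiting it)."""
    return [s for s in signal if s > THRESHOLD_BPM]

def translate_to_element(feature):
    """Translate the feature into a first human-intelligible element."""
    return WORD_DATABASE.get(feature)

def query_data_source(element, data_source):
    """Query a data source with the first element to find a second element
    that identifies the unknown condition."""
    return [entry for entry in data_source if element in entry]
```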
  • the first human-intelligible element is derived from one or more of a word database, an image database, and a relation database.
  • the method further includes acquiring an input from an expert to enrich one or more of a word database, an image database, and a relation database.
  • the method further includes extracting a pattern from the reproducible feature.
  • translating further includes mapping the reproducible feature to the first human-intelligible element.
  • the first human-intelligible element includes one or more of words, phrases, images, and sentences.
  • translating further includes: creating an association between two or more words; and combining the two or more words into one or more sentences.
  • the method further includes automatically inputting the first human-intelligible element into a search engine to identify the second human-intelligible element.
  • the method further includes acquiring a second signal output from a second sensor.
  • the method further includes integrating the first human-intelligible element derived from the first and second data outputs from the first and second sensors, respectively, to identify the unknown condition.
  • the first sensor is different from the second sensor.
  • the reproducible feature includes one or more of a contraction duration, a contraction frequency, a contraction intensity, a time interval between contractions, a maternal heart rate, a fetal heart rate, a heart rate variability, a step, a number of calories burned over a certain period of time, blood-pressure level, blood oxygenation level, blood-glucose level, stress level, a number of hours of sleep, body position during sleep, a sleep quality index, a fitness index, a wellbeing index, and a number of fetal movements.
  • the identified condition includes one or more of: a stage of pregnancy, a post-partum stage, a preconception stage, a maternal health or wellbeing status, a maternal stress level, a fetal health or wellbeing status, a fetal stress level, a maternal sleep index, a fetal movement level, and a stage of labor.
  • the method further includes displaying the identified condition to the user on a display.
  • acquiring, identifying, and extracting are performed by a first processor; and translating and querying are performed by a second processor.
  • the method is performed by one processor.
  • the method further includes: providing feedback to the user based on the first and second human-intelligible elements.
  • the feedback is tailored to the user using one or more of a tone and an angle.
  • Another aspect of the present disclosure is directed to a method of linking observations to data acquired by sensors.
  • the method includes: receiving at least one signal output from a sensor associated with at least one user; receiving a plurality of human-intelligible elements from the at least one user; identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user; and creating a link between the similar human-intelligible elements and the at least one signal output.
  • the plurality of human-intelligible elements includes one or more of: an observed symptom, a diagnosis, an observation about wellbeing, a symptom, a condition, and a feeling.
  • the method further includes developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output.
  • FIG. 1A illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information.
  • FIG. 1B illustrates one embodiment of a sensing device of a system for providing information to a user based on acquired sensor data and database information.
  • FIG. 1C illustrates one embodiment of a computing device of a system for providing information to a user based on acquired sensor data and database information.
  • FIG. 2 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information.
  • FIG. 3 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information.
  • FIG. 4 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information that incorporates input from an expert.
  • FIG. 5 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and one or more data sources.
  • FIG. 6 illustrates one embodiment of a system for providing information to a user based on one or more integrated data types and one or more data sources.
  • FIG. 7 illustrates a flowchart of one embodiment of a method of providing information to a user based on acquired sensor data and database information.
  • FIG. 8 illustrates a flowchart of one embodiment of a method of adding expert input into the system.
  • FIG. 9 illustrates a flowchart of one embodiment of a method of linking observations to data acquired by sensors.
  • FIG. 10 illustrates a flow chart of one embodiment of a method of providing information to a user based on user input and/or acquired sensor data and database information.
  • FIG. 11A illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 11B illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 11C illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 11D illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 11E illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 12A illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 12B illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • FIG. 12C illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
  • the systems and methods described here are not directed to an abstract idea.
  • the methods and systems described herein are directed to a specialized process to link structured (i.e., organized) and unstructured (i.e., non-controlled, human-intelligible) data, by converting or translating structured data into unstructured data.
  • the methods and systems described here are not directed to: a fundamental economic principle, a human activity, and/or a mathematical relationship/formula.
  • the systems and methods described herein amount to significantly more than an alleged abstract idea.
  • the systems and methods described herein improve analysis and use of structured and unstructured data and may improve the functioning of the computing device that executes software and/or implements the methods.
  • the methods described herein may: speed up computations; reduce memory consumption when performing the computations; and/or improve the reliability of the computations.
  • the methods executed by the computing device constitute a technical effect in which structured data is transformed into unstructured data or unstructured data is transformed into structured data.
  • Methods and systems are described herein to link structured (i.e., organized) and unstructured (i.e., non-controlled, human-intelligible) data, by converting or translating structured data into unstructured data. More specifically, the proposed method and system address the three problems mentioned above by: (1) Automatically and dynamically converting sensor data into human-intelligible information.
  • Sensor data features, patterns, and/or trends are converted or translated into a vocabulary, thus establishing the missing link between structured and unstructured data; (2) Querying, using this vocabulary, unstructured data with sensor data, comparing sensor data to unstructured data, and processing sensor data using tools typically reserved for unstructured data processing (e.g., Natural Language Processing (NLP)) to link sensor data to medical knowledge contained in unstructured information; and (3) Integrating, using this vocabulary, sensor data with other unstructured data inputs (e.g., patients' notes, doctors' notes, medical records, publications, manual input, text, sound, etc.) therefore allowing structured data and unstructured data to be processed and analyzed together using methods such as NLP.
  • a user includes: any person who desires to monitor and/or track personal parameters (e.g., health, diet, condition, wellbeing, etc.), population level parameters (e.g., disease characteristics and/or progression, health, fitness, etc.), and/or environmental parameters (e.g., environmental conditions at a point in time, environmental changes over time, etc.).
  • a user may include a pregnant woman, a woman or a man trying to conceive, a woman in her postpartum period, a person suffering from a condition or disease, a data scientist, a physician, a healthcare professional, an athlete, a personal trainer, a geneticist, an environmentalist, a call center, a care center, a care team, and/or any other individual desiring to track and/or monitor a physical, environmental, and/or otherwise observable phenomena.
  • Information provided to a user by the system includes, but is not limited to, health information.
  • data is acquired by the system using one or more sensors.
  • a sensor provides a means for monitoring, including but not limited to, sensors producing waveforms representing biological, physiological, neurological, psychological, physical, chemical, electrical, environmental, and mechanical signals, such as pressure, sound, temperature, heart rate, contractions, and the like, probes, surveillance equipment, measuring equipment, and any other means for monitoring parameters representative of or characteristic for an application domain.
  • a sensor may be a special purpose or general purpose sensor, adapted for measuring just one or a number of physical parameters, such as temperature, noise, pressure, movement (e.g., inertial sensor, accelerometer, gyroscope, pedometer, magnetometer), heart rate (e.g., electrocardiogram, Doppler ultrasound, acoustics sensor, optical sensor, thermal and infrared sensor, radar-based and radio-frequency sensor), stress, skin impedance (e.g., galvanic skin conductance, electrodermal activity), tissue impedance (e.g., impedance spectroscopy, bio- impedance), contact impedance, muscle contraction (e.g., electromyogram, electrohysterogram), and conductivity.
  • the sensors may be individual sensors and/or sensors connected in a sensor network.
  • the sensors may be attached on the body (wearable sensors), in proximity of the body (portable sensors), or distributed in the environment (radiofrequency or radar sensors).
  • Sensor systems and/or sensor networks may include not only sensors that monitor the system itself, such as the human body in the case of a body sensor network, but also sensors that sense the context and environment in which the system or user is evolving.
  • a reproducible feature includes, but is not limited to, a physiological feature: a contraction duration; a contraction frequency; a contraction intensity; a time interval between two contractions; a heart rate; heart rate variability; a number of fetal movements; a step; a number of calories burned over a certain period of time; blood-pressure level; blood oxygenation level; blood-glucose level; stress level; a number of hours of sleep; body position during sleep; a sleep quality index; a fitness index; a wellbeing index; and/or any other physiological response or output.
  • a reproducible feature includes, but is not limited to, an environmental feature: a level of CO or CO2; a presence of smoke; temperature; humidity level; illumination level; ultra-violet light level; a presence or a level of a water contaminant; a presence or a level of small particles in air; a barometric pressure level; and/or any other environmental parameter.
  • one or more patterns are extracted from the reproducible feature. Patterns may be identified in the time domain, frequency domain, time- frequency domain, in signal amplitude, in signal shape, and/or signal phase. For example, a pattern is a set of different features at a certain point in time, or the same feature at different time points, or a combination of both. Examples of patterns include, but are not limited to, a contraction with an increased heart rate; a number of contractions with an increased contraction frequency over a defined time period; decreased fetal movements with decreased fetal heart rate; and increased CO with increased heart rate.
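A minimal sketch of pattern extraction under these definitions: the code below treats a pattern as a co-occurrence of two different features within one time window (e.g., a contraction together with an increased heart rate). The window size and feature names are assumptions.

```python
# Illustrative pattern extraction over already-extracted feature events.
def extract_patterns(feature_events, window_s=60):
    """feature_events: list of (timestamp_s, feature_name) tuples."""
    patterns = set()
    for t1, f1 in feature_events:
        for t2, f2 in feature_events:
            # two different features occurring within the same time window
            if f1 != f2 and abs(t1 - t2) <= window_s:
                patterns.add(frozenset((f1, f2)))
    return patterns

events = [(0, "contraction"), (20, "increased_heart_rate"), (400, "contraction")]
print(extract_patterns(events))
# {frozenset({'contraction', 'increased_heart_rate'})}
```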
  • sensor data is linked to information (e.g., human-intelligible element) stored in one or more databases and/or one or more data sources.
  • Databases include, but are not limited to, word databases, sentence (relation) databases, image databases, phrase databases, sound databases, character databases, and/or any other database type.
  • Data sources include, but are not limited to, medical data, medical records, information contained in books, doctor notes or notebooks, publications, clinical case studies, scientific literature, websites, blogs, social media platforms, laboratory notebooks, observations from the general population, observations from experts, and/or any other information stored and/or available in a non-controlled format (i.e., unstructured data).
  • a tone (e.g., positive, encouraging, negative, suggestive, etc.) and/or point-of-view, perspective, or angle (e.g., doctor's, user's, general public's, etc.) of the information provided to the user may be adapted by changing the content of the databases and/or data sources.
  • the user may create a profile in the system describing a condition or situation of the user, demographic (e.g., socioeconomic status, ethnicity, age, weight, health history, sex, family health history, etc.) information about the user, important contacts for the user (e.g., doctor, family, social worker, support group, etc.), goals of the user, and/or any additional information.
  • the system may perform analytics on the profile information to determine an appropriate tone and/or angle for the user.
  • the system may mine information from one or more Internet accounts (e.g., blogs, email, social networking applications, etc.) of the user to determine an appropriate tone and/or angle for the user.
  • the user selects a tone and/or angle of the information that is provided to the user. For example, the user may choose to get the perspective of a midwife or a doctor on a question related to her pregnancy, or she may want to compare the two perspectives on the question.
  • one or more features and/or patterns are translated into one or more human-intelligible elements using, for example, a database as a source of human-intelligible elements.
  • a human-intelligible element includes, but is not limited to, a word, phrase, sentence, image, graphic, sound, character, and/or gesture.
  • a human-intelligible element includes at least one, more than one, a plurality, or a set of human-intelligible elements.
  • additional human-intelligible elements are identified by querying one or more data sources of information.
  • the identified human-intelligible elements may further describe or identify an unknown condition.
  • an identified condition includes, but is not limited to, a description of a condition; a term or name for a condition; information about a condition; and/or related symptoms or observations associated with a condition.
  • a system 10 for providing information to a user based on acquired sensor data and database information includes a sensing device 12 comprising a first processor and a sensor; a computing device 14 communicatively coupled to the sensing device 12 and comprising a second processor; and, optionally, a server 16.
  • Various components of the system 10 function to acquire sensor data and translate the sensor data into a format understandable and/or useable by a user.
  • the computing device 14, sensing device 12, and/or server 16 may communicate wirelessly using Bluetooth, Wi-Fi, CDMA, LTE, other cellular protocol, other radiofrequency, or another wireless protocol.
  • the system 10 may include a server 16.
  • the server 16 may be a local server on the computing device 14 or a remote server. In some embodiments, the server 16 is a virtual server. In some embodiments, the server 16 may share data between the computing device 14 and the sensing device 12. In some embodiments, the server may include one or more databases and/or data sources used by the processor of the computing device 14 and/or the sensing device 12.
  • the system 10 further includes a sensing device 12.
  • the sensing device 12 measures a biological, physiological, neurological, psychological, physical, chemical, electrical, environmental, and/or mechanical signal; identifies reproducible features in the signal, extracts the reproducible features from the signal; and sends, transmits, or exports the extracted reproducible features and/or patterns to the computing device 14.
  • the computing device 14 may receive and/or import the data from the sensing device 12 to translate, analyze (e.g., query a data source), and/or display the data to a user.
  • sending or transmitting information occurs via a wired connection (e.g., IEEE 1394, Thunderbolt, Lightning, DVI, HDMI, Serial, Universal Serial Bus, Parallel, Ethernet, Coaxial, VGA, PS/2) or wirelessly (e.g., via Bluetooth, low energy Bluetooth, near-field communication, Infrared, WLAN, or other RF technology).
  • the sensing device 12 may include a Bloom Life Belli, a FitBit, a Pebble smartwatch, a heart rate monitor (e.g., ECG, chest strap, etc.), a muscle contraction monitor (e.g., electromyogram), a pulse oximeter, an Apple Watch, a blood pressure cuff, caliper, pedometer, movement monitor (e.g., accelerometer, Doppler ultrasound, etc.), Airbot (i.e., air quality sensing), Waterbot (i.e., water quality sensing), Sensordrone (i.e., environment sensing), Lapka environmental monitor, Sensaris, or any other device used for sensing and/or measuring physiological and/or environmental parameters.
  • the system includes a computing device 14.
  • the computing device functions to receive extracted reproducible features and/or patterns from the sensing device and translate the extracted reproducible features and/or patterns into human-intelligible elements.
  • the computing device 14 is a stationary computing device.
  • the stationary computing device includes a desktop computer or a workstation.
  • a computing device 14 is a mobile or portable computing device.
  • a portable computing device includes, but is not limited to, a laptop, netbook, tablet, mobile phone, personal digital assistant, or wearable device (e.g., Google Glass, Apple Watch, etc.).
  • the computing device 14 is a computational device, wrapped in a chassis that includes a display (visual with or without touch responsive capabilities), a central processing unit (e.g., processor or microprocessor), internal storage (e.g., flash drive), n number of components (e.g., specialized chips and/or sensors), and/or n number of radios (e.g., WLAN, LTE, WiFi, Bluetooth, GPS, etc.).
  • the processor 22, 32 is coupled, via one or more buses, to the memory 26, 36 in order to read information from and write information to the memory 26, 36.
  • the memory 26, 36 may be any suitable computer-readable medium that stores computer-readable instructions for execution by computer-executable components.
  • the computer-readable instructions include software stored in a non-transitory format, some such software having been downloaded as an application 24, 34 onto the memory 26, 36 of the sensing device 12 and/or computing device 14.
  • the processor 22, 32 in conjunction with the software stored in the memory 26, 36, executes an operating system and one or more applications 24, 34. Some methods described elsewhere herein may be programmed as software instructions contained within the one or more applications 24, 34 stored in the memory 26, 36 and executable by the processor 22, 32.
  • the first processor 22, referred to herein as an acquisition/extraction processor 22, associated with the sensing device 12 is configured to execute one or more sets of instructions to effect the functioning of the sensing device 12.
  • a set of instructions effects acquiring a signal output from a sensor 28, identifying a reproducible feature in the signal output, and extracting the reproducible feature from the signal output.
  • the acquisition/extraction processor 22 may further extract a pattern from the reproducible feature.
  • the second processor 32 referred to herein as a translation processor 32, associated with the computing device 14 is configured to execute one or more sets of instructions to effect the functioning of the computing device 14.
  • a set of instructions effects receiving the extracted reproducible feature from the acquisition/extraction processor 22, translating the extracted reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the sensor 28, and querying, using the first human-intelligible element, the data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
  • the information and/or identified unknown condition is displayed on a display 38 of the computing device 14 to a user; transmitted, for example, to a call or care center; and/or transmitted to and updates, for example, the medical records of a user.
  • the acquisition/extraction and translation processors 22, 32 are both associated with the sensing device 12. Alternatively, acquisition, extraction, and translation are all performed by one processor in the sensing device 12. In some such embodiments, the system does not include a computing device or the computing device predominantly functions to display information to a user and/or receive user input.
  • the acquisition/extraction and translation processors 22, 32 are both associated with the computing device 14. Alternatively, acquisition, extraction, and translation are all performed by one processor in the computing device 14. In some such embodiments, the sensing device predominantly functions to acquire sensor data about a user's physiology or the environment in which the user is evolving.
  • acquisition, extraction and translation are distributed between the sensing device 12, the computing device 14, and the server 16 to optimize the overall performance of the system 10.
  • the overall performance may be optimized for lowest power consumption, fastest analysis time, lowest latency, highest accuracy, or any optimal combinations of performance as dictated by the needs and requirements of a specific application.
  • the translation processor 32 executes a set of instructions comprising: receiving at least one signal output from a sensor associated with at least one user; receiving a plurality of human-intelligible elements from the at least one user (e.g., via a user interface); identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user; creating a link between the similar human-intelligible elements and the at least one signal output; and developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output.
  • a sensor 28 of the sensing device 12 is integrally coupled to or positioned within the sensing device 12.
  • a sensor 28 of the sensing device 12 is communicatively coupled to the sensing device 12 but otherwise detached from or remote from the sensing device 12.
  • a sensor 28 of the sensing device 12 includes, but is not limited to, an inertial sensor; accelerometer; gyroscope; pedometer; magnetometer; electrocardiogram; Doppler ultrasound; acoustic sensor; optical sensor; thermal and infrared sensor; radar-based sensor; radio-frequency sensor; galvanic skin conductance sensor; electrodermal activity sensor; impedance spectroscopy; bio-impedance; contact impedance sensor; electromyogram; electrohysterogram; conductivity sensor; air quality sensor; water quality sensor; pressure sensor; thermometer; and/or any other type of sensor.
  • the sensing device 12 includes: one, more than one, a plurality of, or multiple sensors.
  • the sensors may be arranged for wireless communication with a network node of a sensor network and/or may communicate directly with each other and/or the processor 22 of the sensing device 12.
  • a power supply such as a battery 40, 42 is included within the sensing device 12 and/or computing device 14 and is electrically coupled to provide power to the processor 22, 32 and other electronic components.
  • the battery 40, 42 may be rechargeable or disposable.
  • the computing device 14 includes a display 38 that is configured to display an identified condition to a user and/or receive one or more inputs from a user.
  • the display 38 includes a Thin Film Transistor (TFT) display, for example.
  • the display 38 may include controls, which enable a user to interact with the display 38.
  • the display 38 may include buttons, sliders, toggle buttons, toggle switches, switches, dropdown menus, combo boxes, text input fields, check boxes, radio buttons, picker controls, segmented controls, steppers, and/or any other type of control.
  • the user may use different tactile or haptic lengths or pressures to navigate on the display 38. For example, a user may use a short press, long press, light press, or forceful press to navigate on the display 38.
  • the system further comprises a second sensor and/or second sensing device or multiple sensors and/or sensing devices.
  • the first sensing device measures or monitors a different process or a different aspect of the same process than the second sensing device.
  • the human-intelligible elements derived from the first sensing device and the second sensing device are integrated, translated, and/or queried together.
  • the first sensing device may be coupled to a body of a user and the second sensing device may include an environmental sensor.
  • the body sensor data may be converted into human-intelligible elements that describe the physiology, health, and wellbeing of the individual.
  • the environmental sensor data may be converted into human-intelligible elements that describe the environment in which the user is evolving.
  • although body and environmental data are of different types, they may both be translated to human-intelligible elements and processed, interpreted, and/or queried together.
  • a computer-implemented method for providing information to a user based on acquired sensor data and database information includes acquiring a signal output from a sensor (e.g., Sensor 1, Sensor 2, ... Sensor N) S100, identifying a reproducible feature in the signal output S110, extracting the reproducible feature from the signal output (i.e., Feature Extraction) S120, translating the reproducible feature into a first human-intelligible element (i.e., Translating to human-intelligible element), wherein the first human-intelligible element describes an unknown condition detected by the sensor S130, and querying (e.g., using a Search Engine or NLP Engine), using the first human-intelligible element, a data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element S140.
  • the method functions to acquire sensor data and translate the sensor data into human-intelligible elements or information.
  • a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S100, which recites acquiring a signal output from a sensor.
  • Block S100 functions to acquire, receive, or otherwise collect sensor data from one or more sensors.
  • n number of sensors 28a, 28b, 28n may be communicatively coupled to the sensing device 12 and configured to transmit sensor data to the acquisition/extraction processor 22 of the sensing device 12.
  • the n number of sensors 28a, 28b, 28n and acquisition/extraction processor 22 may form part of a sensor node, so that the sensor node is configured to gather or acquire sensory information from the n number of sensors 28a, 28b, 28n and process the sensory information.
  • a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S110, which recites identifying a reproducible feature in the signal output.
  • Block S110 functions to determine points of interest or data of interest in the signal output from the one or more sensors.
  • features that can be identified and extracted include: statistical features (e.g. average, median, variance, kurtosis), time domain features (e.g. amplitude, derivative, slope), frequency domain features (e.g. peak frequency in power density spectrum, width of the main peak, frequency and width of the secondary peaks and harmonics), or time-frequency domain features (e.g. wavelets).
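The feature families listed above can be made concrete with a short sketch; the code below computes one representative feature from each family using NumPy and SciPy. The sampling rate fs is an assumption for illustration.

```python
# Sketch computing one representative feature from each family named above.
import numpy as np
from scipy.stats import kurtosis

def compute_features(signal, fs=100.0):
    signal = np.asarray(signal, dtype=float)
    features = {
        # statistical features
        "mean": float(np.mean(signal)),
        "median": float(np.median(signal)),
        "variance": float(np.var(signal)),
        "kurtosis": float(kurtosis(signal)),
        # time-domain features
        "amplitude": float(np.max(signal) - np.min(signal)),
        "max_slope": float(np.max(np.abs(np.diff(signal))) * fs),
    }
    # frequency-domain feature: peak frequency of the power density spectrum
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    features["peak_frequency"] = float(freqs[np.argmax(power[1:]) + 1])  # skip DC
    return features
```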
  • features are identified using a feature database.
  • Features not present in the feature database are added to the feature database to create new knowledge in the system.
  • Features not present in the feature database are identified based on the ability of the acquisition/extraction processor 22 to distinguish the feature from the rest of the signal, recognize the reproducibility of the feature along the signal, and/or recognize the non-isolated nature of the feature.
  • Such new feature identification may be performed using supervised or unsupervised machine learning techniques.
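As one possible reading of this unsupervised discovery step, the sketch below clusters fixed-length signal windows with k-means; windows that recur in the same cluster mark candidate reproducible features not yet in the feature database. The window length and cluster count are assumptions.

```python
# Sketch of unsupervised identification of candidate features not yet in the
# feature database. Window length and k are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def discover_feature_candidates(signal, window=50, k=3):
    signal = np.asarray(signal, dtype=float)
    # slice the signal into non-overlapping fixed-length windows
    windows = np.array([signal[i:i + window]
                        for i in range(0, signal.size - window + 1, window)])
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(windows)
    return labels  # windows sharing a label exhibit the same candidate feature
```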
  • a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S120, which recites extracting the reproducible feature from the signal output.
  • Block S120 functions to isolate the feature from the signal output for translation of the feature, for example by the translation processor 32 of the computing device 14.
  • For example, as shown in FIG. 2, the acquisition/extraction processor 22 extracts the reproducible feature (i.e., feature extraction 44) from the signal output from one or more sensors 28a, 28b, 28n.
  • the method performed by the acquisition/extraction processor 22 further includes extracting a pattern (i.e., pattern extraction 46) from the reproducible feature.
  • General signal processing techniques can be used to identify and extract features from digitized sensor signal data, such as various transform techniques (Fourier, wavelets); integration, derivation, and differentiation techniques; template matching; comparing physical features of the sensor signal data, such as amplitude, frequency, and phase, to a set threshold or thresholds; and/or fitting the data to mathematical functions.
  • feature identification and extraction is performed using unsupervised or supervised machine learning techniques, which automatically identify and extract features.
  • a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S130, which recites translating the reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the sensor.
  • Block S130 functions to associate each reproducible feature with a human-intelligible element, for example, a word, phrase, or sentence.
  • the translation processor 32 of the computing device 14 receives the extracted feature or pattern and translates the extracted feature or pattern into a human-intelligible element (i.e., translating to human-intelligible element 48).
  • block S130 is performed by a Data2Language Engine.
  • structured data 56 is translated or mapped to a human-intelligible element (i.e., unstructured data 70) (e.g., language 72) by a Data2Words engine 62.
  • the words used by the Data2Words engine 62 may be stored in a domain- (e.g., athletics, pregnancy, environmental, etc.) and/or user-specific word database 64.
  • the word database 64 may be updated during the use and functioning of the system.
  • an association between two or more words may be created and the two or more words may be combined via a Words2Sentences engine 66.
  • the word associations used by the Words2Sentences engine 66 may be stored in a domain- and user-specific sentence or relation database 68.
  • the words are combined based on relations and/or associations between multiple words. The combining of two or more words can be based on the chronological order through which the data and words are created, the causal relationships between words, or any other type of semantic relationship.
  • a feature is mapped to a human-intelligible element using a look-up table in which each feature is mapped to one word.
  • a feature is mapped to multiple words.
  • the mapping of each feature to each word is given a probability. For example, the system may indicate that the feature is 80% associated to a first word and 20% associated to a second word.
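The following is a minimal sketch of such a probabilistic Data2Words-style lookup, followed by a Words2Sentences-style combination by chronological order. The feature names, word probabilities, and connective phrasing are illustrative assumptions.

```python
# Sketch: one feature maps to several candidate words, each with a
# probability (the 80%/20% example above); words are then combined into a
# sentence by chronological order. All database contents are assumptions.
FEATURE_TO_WORDS = {
    "uterine_emg_burst": [("contraction", 0.8), ("fetal movement", 0.2)],
    "hr_rise": [("increased heart rate", 1.0)],
}

def translate_feature(feature):
    """Return candidate words for a feature, sorted by descending probability."""
    return sorted(FEATURE_TO_WORDS.get(feature, []), key=lambda w: -w[1])

def words_to_sentence(words_in_order):
    """Combine words into one sentence based on chronological order."""
    return " followed by ".join(words_in_order).capitalize() + "."

timeline = [translate_feature(f)[0][0] for f in ("uterine_emg_burst", "hr_rise")]
print(words_to_sentence(timeline))
# Contraction followed by increased heart rate.
```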
  • the system evaluates the association between the feature and the human-intelligible element. Evaluation may include elimination of redundant human-intelligible elements, reduction of the number of human-intelligible elements, deduction of further human-intelligible elements from the human-intelligible elements already associated with the features, the exclusion of contradictory human-intelligible elements, extension of the number of human-intelligible elements, etc.
  • the evaluation technique(s) are based, for example, on the type(s) of human-intelligible elements associated with the features. In the case of linguistic information, in which the human-intelligible elements comprise descriptive words and sentences in a particular human language such as English, linguistic information evaluation techniques may be applied.
  • linguistic information evaluation includes identifying synonyms among the human-intelligible elements associated with a specific set of features. In some embodiments, the synonyms may be filtered or combined to reduce the number of human-intelligible elements. Linguistic information evaluation may also include automatically summarizing a set of human-intelligible elements to obtain a reduced, more concise set of human-intelligible elements. In another example, linguistic information evaluation may include semantic analysis to analyze a set of human-intelligible elements and deduce a shorter, more concise, and more relevant set of human-intelligible elements. In the case of video, pictorial, or sound type human-intelligible elements, suitable video, picture, and sound evaluation techniques are used for performing information evaluation on the plurality of human-intelligible elements.
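For the synonym-filtering step just described, a minimal sketch might collapse elements that name the same concept via a canonical map; the synonym table here is a small illustrative assumption, not a real linguistic resource.

```python
# Sketch of synonym filtering across human-intelligible elements: map each
# element to a canonical form and drop duplicates.
SYNONYMS = {"nausea": "morning sickness", "cramping": "contraction"}

def filter_synonyms(elements):
    canonical = [SYNONYMS.get(e, e) for e in elements]
    return list(dict.fromkeys(canonical))  # de-duplicate, preserve order

print(filter_synonyms(["nausea", "morning sickness", "dizzy"]))
# ['morning sickness', 'dizzy']
```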
  • a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S140, which recites querying, using the first human-intelligible element, a data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies or describes (e.g., so that a user, expert, doctor, etc. can identify the unknown condition and/or take action on the unknown condition) the unknown condition described by the first human- intelligible element.
  • Block S140 functions to further link human-intelligible elements with additional human-intelligible elements that are particularly relevant for the system, target user, and/or target domain to further describe the unknown condition.
  • an identified condition includes, but is not limited to, a description of a condition; a term or name for a condition; information about a condition; and/or related symptoms or observations associated with a condition.
  • the identified condition may include: one or more of a user health status or wellbeing, a diet, a fitness level, disease characteristics, disease progression, a stage of pregnancy, a maternal health or wellbeing status, a maternal stress level, a fetal health or wellbeing status, a fetal stress level, a stage of labor, a post-partum condition or period, a preconception period or condition, environmental conditions at a point in time, environmental changes over time, etc.
  • the first human-intelligible element is processed using a search engine 78 or an NLP engine 80, which queries one or more data sources 82 (e.g., medical records, publications, scientific journals, books, websites, blogs, social media platforms, etc.) for human-intelligible elements related to the first human-intelligible element.
  • the probability of associating a feature with more than one human-intelligible element may be used as an input to the search engine 78 or NLP engine 80 to query a data source 82 to identify the second human-intelligible element.
  • the system may only search human-intelligible elements with the highest probability; and/or weigh the output (e.g., the second human-intelligible element) of the search engine with the probability associated with its input (e.g., the first human-intelligible element).
  • the system may compare the probabilities of human-intelligible elements generated from the multiple sensors, and only maintain the human-intelligible elements for which the probabilities are the highest for at least two of the sensors to query the search engine.
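One way to realize this multi-sensor filter is sketched below: each sensor contributes a candidate-to-probability map, and only elements that are the top candidate for at least two sensors are passed to the search engine. The per-sensor candidate lists are illustrative assumptions.

```python
# Sketch of the multi-sensor filter: keep only human-intelligible elements
# that are the top-probability candidate for at least two sensors.
from collections import Counter

def elements_to_query(per_sensor_candidates):
    """per_sensor_candidates: one {element: probability} dict per sensor."""
    top = [max(cands, key=cands.get) for cands in per_sensor_candidates if cands]
    return [elem for elem, n in Counter(top).items() if n >= 2]

sensors = [
    {"contraction": 0.8, "fetal movement": 0.2},   # e.g., an EMG-based sensor
    {"contraction": 0.6, "maternal motion": 0.4},  # e.g., an accelerometer
]
print(elements_to_query(sensors))  # ['contraction']
```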
  • the search engine 78 or NLP engine 80 is directed to user and/or domain-specific content.
  • block S140 occurs automatically, upon a command from a user, upon execution of a particular step by the translation processor 32, and/or after any other prerequisite.
  • the second human-intelligible element is assigned a probability of identifying the unknown condition described by the first human-intelligible element.
  • the system may indicate that the unknown condition has a 70% chance of being associated with one specific pregnancy stage, 20% with a second pregnancy stage, and 10% with a third pregnancy stage.
  • the second human-intelligible element identified using the search engine 78 or NLP engine 80 may be integrated and/or processed together with the first human-intelligible element to: provide user feedback 84 (e.g., describe your condition; how are you feeling today?; is someone smoking around you?; etc.) or recommendations for the user (e.g., go see a doctor, have a cup of coffee, you are in labor, etc.); update medical records of the user 86; provide support to the user 88 (e.g., a support group, call center, doctor, etc.); update or notify the same or another user (e.g., doctor, call center, support group, care center, hospital, etc.); and/or request information and/or feedback.
  • a method of providing information to a user based on user input and/or acquired sensor data and database information includes receiving a text input from a user S400a; extracting a first human-intelligible element from the text input S400b; integrating the first human-intelligible element (e.g., from a first data source 500) with a second human-intelligible element (e.g., from a second data source 510) S420; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element, wherein the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements S430.
  • the method includes receiving inputs from two or more data sources 500, 510, 520, as shown in FIG. 6.
  • the method may include receiving one or more signal outputs from one or more sensors 28a, 28b, 28n in block S410a; and extracting one or more reproducible features 44a, 44b from the one or more signal outputs and translating the reproducible feature to a first human-intelligible element 48a, 48b in block S410b, as described elsewhere herein.
  • the method functions to integrate, analyze, and/or interpret text data and/or sensor data of different data types.
  • the system only receives a text input or a signal output. In some embodiments, the system receives both a text input and a signal output. In variations of such embodiments, the system may receive multiple text inputs and/or signal outputs.
  • the system only acquires and analyzes text input. In some embodiments, the system only acquires signal output from one or more sensors. In some embodiments, the system acquires a combination of signal outputs from sensors and text inputs from users. In some embodiments, a first sensor and a second sensor measure similar phenomena or events of the same user or environment or two different users or environments. In some embodiments, the first and second sensors measure two different phenomena or events of the same user or environment or two different users or environments. In some embodiments, a first text input and a second text input describe similar phenomena or events observed by the same user or two different users. In some embodiments, a first text input and a second text input describe two different phenomena or events observed by the same user or two different users.
  • a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S400a, which recites receiving a text input 520 from a user.
  • Block S400a functions to acquire data about the user and/or the environment in which the user is evolving.
  • a user may input text 520 into a user interface of the computing device or the sensing device, for example using a keypad, keyboard, touch interface, speech-to-text recognition, or any other user input device or input method.
  • the system collects or acquires text 520 from emails, SMS, and/or other sources of text information to which the user has granted the system access.
  • a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S400b, which recites extracting a first human-intelligible element from the text input 520.
  • Block S400b functions to extract meaningful human-intelligible elements from the text input 520.
  • extracting meaningful human-intelligible elements from the text input 520 may include excluding prepositions, articles, pronouns, conjunctions, direct objects, and/or indirect objects.
  • extracting meaningful human-intelligible elements from the text input 520 may include extracting verbs, adjectives, nouns, predicates, and/or other substantive words or phrases from the text input 520.
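A minimal sketch of this extraction step, assuming a simple stop-word filter in place of full part-of-speech tagging; the stop-word list is an illustrative assumption.

```python
# Sketch of block S400b: extract meaningful human-intelligible elements from
# a text input by excluding function words (articles, pronouns, conjunctions,
# prepositions).
STOPWORDS = {"i", "i'm", "am", "a", "an", "the", "and", "or", "of", "in",
             "on", "to", "have", "has", "my", "me", "is", "feeling"}

def extract_elements(text):
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    return [t for t in tokens if t and t not in STOPWORDS]

print(extract_elements("I'm feeling dizzy and I have morning sickness."))
# ['dizzy', 'morning', 'sickness']
```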
  • a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S420, which recites integrating the first human-intelligible element with a second human-intelligible element.
  • Block S420 functions to combine the first and second human-intelligible elements, so that they can be analyzed, queried, and/or interpreted together.
  • the first and second human-intelligible elements may be integrated to describe a health status of a user, measured by the first sensor or received by the system through a first text input, in an environment in which the user is evolving, measured by a second sensor or received by the system through a second text input.
  • the system may integrate the first and second human-intelligible elements by determining a probability that the first human-intelligible element is associated with the second human-intelligible element; and/or weighing the probability that the first human-intelligible element is associated with the second human-intelligible element.
  • the system may compare the probabilities of human-intelligible elements generated from the multiple sensors and text input, and only integrate the human-intelligible elements for which the probabilities are the highest for at least two inputs (e.g., one or more sensors and/or text).
  • a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S430, which recites querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element, wherein the third human-intelligible element identifies the unknown condition (e.g., event, phenomena, etc.) described by the first and second human-intelligible elements.
  • the extracted and integrated human intelligible elements may be queried together using a search engine 78 or NLP engine 80 to identify and/or describe an unknown condition of the user.
  • the method further includes acquiring expert input 74 to enrich a database associated with the system, for example a word database, an image database, and/or a relation database.
  • the expert input 74 is automatically stored in and enriches the word, image, and/or relation databases, thus automatically and dynamically creating new knowledge in the system, as shown by the bidirectional arrows in FIG. 4 between the Data2Language Engines and the processed expert input 76.
  • the method may include receiving expert input 74 that describes a situation or condition (e.g., disease, illness, pregnancy, tiredness, etc.) as encountered by a user of the system S200; processing the expert input 76 to extract human-intelligible elements that describe the situation or condition S210; adding the extracted human-intelligible elements to a database (e.g., word, image, and/or relation) S220; and associating the reproducible features and/or patterns with the expert description of the situation or condition S230.
  • a method of incorporating expert input into a system includes block S210, which recites processing the expert input 74 to extract human-intelligible elements that describe the situation or condition.
• the expert input 74 is processed using text and/or speech processing techniques. For example, if a user inputs into the system: "I'm feeling dizzy" or "I have morning sickness," the user input is processed using text processing techniques to extract meaningful human-intelligible elements (e.g., "dizzy" and "morning sickness"). The extracted human-intelligible elements are added to one or more databases.
  • a method of linking observations to data acquired by sensors includes receiving at least one signal output from a sensor associated with at least one user S300; receiving a plurality of human-intelligible elements from the at least one user S310; identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user S320; creating a link between the similar human-intelligible elements and the at least one signal output S330; and developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output S340.
• the method functions to enable the system to learn from user input and/or to crowdsource relationships between measured phenomena (e.g., using sensors) and observed or experienced phenomena (e.g., described by users).
  • the user may be a pregnant female experiencing labor contractions.
  • the pregnant female may associate several human-intelligible elements or descriptors (e.g., severe cramps; three minutes apart; last one minute each; nausea; etc.) with the sensor output that indicates she is having a contraction.
  • the system learns to associate a contraction signal or feature from a sensor with one or more of the human-intelligible elements provided by the user.
  • the user provided human-intelligible elements are stored in one or more databases associated with the system, for example the word database, image database, and/or relation database, thereby creating new knowledge in the system.
  • a method of linking observations to data acquired by sensors includes block S310, which recites receiving a plurality of human-intelligible elements from the at least one user.
  • Block S310 functions to acquire information, observations, and/or data from users of the system.
  • the user may input, using the computing device of the system, one or more human-intelligible elements describing the physiological or environmental situation of the user, one or more symptoms or feelings of the user, a condition of the user, or any other information the user is willing to provide.
  • the system may mine data from Internet accounts and/or additional sensing devices possessed by the user, for example after the user links his Internet accounts and/or additional sensing devices to the system or grants the system access to the Internet accounts and/or additional sensing devices.
  • a method of linking observations to data acquired by sensors includes block S320, which recites identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user.
• identifying similarities comprises comparing the plurality of human-intelligible elements received from the user to the plurality of human-intelligible elements derived from the at least one signal output.
  • a method of linking observations to data acquired by sensors includes block S330, which recites creating a link between the similar human-intelligible elements and the at least one signal output. For example, if a user reports a certain condition (e.g. symptoms or physiological conditions), a relation may be created between the signal output and the human-intelligible elements provided by the user, therefore creating a learning mechanism in the system by which new human-intelligible elements may be associated to a certain signal output.
• Such link or relation between the human-intelligible element and the at least one signal output may, in some embodiments, be based on one or more of: a time, location, human-intelligible elements used to report the condition, other user input, and a rule set that associates one or more features (identified/extracted from the signal output) and one or more conditions or human-intelligible elements.
• a method of linking observations to data acquired by sensors includes block S340, which recites developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output.
  • Block S340 functions to create new knowledge in the system and/or databases so that the new knowledge can be used to identify and/or describe additional unknown conditions encountered by users and thus the system.
  • the system may develop a probability of 90% between a strong contraction (e.g., based on sensor data and user perception of contraction strength) and labor induction (e.g., based on sensor data and/or user feedback) or a probability of 10% between a weak contraction and labor induction.
  • sensor data is used to automatically provide feedback (in an unstructured format or human-intelligible format) to a pregnant woman (FIG. 11 A) about her health and lifestyle. For example, patterns in her physiological data acquired by one or more sensors allow the system to conclude that she didn't sleep well last night (FIG. 11B).
  • the system combines the interpretation of her physiological data with text, as described above in FIG. 6, such as a specific question (e.g., human-intelligible elements) she inputs into the system: "Can I still drink my morning cup of coffee?" (FIG. 11C).
• the system uses the interpretation of the physiological data and the text input to query an NLP engine to provide relevant and actionable feedback for the pregnant woman (FIGS. 11D and 11E). For example, the system indicates to the user that she can enjoy her cup of coffee (FIG. 11D) and provides additional information that the user may find relevant to her current situation, for example pregnancy (FIG. 11E).
  • sensor data is used to enhance information that is provided to a pregnant woman in response to a specific question.
  • a pregnant woman who experiences morning sickness may wonder: "What can I eat to help with morning sickness?” (FIG. 12A).
• the system may, in a first step, use traditional search engines and/or NLP to search data sources and provide an answer to the pregnant woman's question (FIG. 12B). Additionally, in some embodiments, the system may integrate sensor data collected using a wearable sensor (e.g., the Bloom Belli sensor) to provide more information about her question and enhance the system response with personal sensor-based insights (FIG. 12C).
  • the system knows her sleep history based on the sensor data, and it integrates the sleep history (i.e., structured data) with her question or text input (i.e., unstructured data), to query data sources and provide additional information (e.g., why she is experiencing morning sickness; what other activities she can do to avoid it in the future).
• the systems and methods of the embodiments and variations thereof can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of the processor.
• the computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (e.g., CD or DVD), hard drives, floppy drives, or any other suitable device.
  • the computer-executable component is preferably a general or application-specific processor, but any suitable dedicated hardware or hardware/firmware combination can alternatively or additionally execute the instructions.
• a human-intelligible element may include, and is contemplated to include, a plurality of human-intelligible elements or a set of human-intelligible elements.
  • the claims and disclosure may include terms such as “a plurality,” “one or more,” or “at least one;” however, the absence of such terms is not intended to mean, and should not be interpreted to mean, that a plurality is not conceived.
  • the term “comprising” or “comprises” is intended to mean that the devices, systems, and methods include the recited elements, and may additionally include any other elements.
• "Consisting essentially of" shall mean that the devices, systems, and methods include the recited elements and exclude other elements of essential significance to the combination for the stated purpose. Thus, a system or method consisting essentially of the elements as defined herein would not exclude other materials, features, or steps that do not materially affect the basic and novel characteristic(s) of the claimed invention.
• "Consisting of" shall mean that the devices, systems, and methods include the recited elements and exclude anything more than a trivial or inconsequential element or step. Embodiments defined by each of these transitional terms are within the scope of this disclosure.
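For illustration only (not part of the claimed subject matter), the following minimal Python sketch shows one way the element-extraction step recited in blocks S400b and S210 above could be approximated. The stopword set and phrase lexicon are hypothetical stand-ins for the word and relation databases; a production system would use full part-of-speech tagging as described in the bullets above.

```python
import re

# Illustrative stand-ins for the word and relation databases described above.
# The stopword set drops prepositions, articles, pronouns, conjunctions, and
# filler words so that substantive words and phrases survive as elements.
STOPWORDS = {
    "i", "m", "im", "a", "an", "the", "and", "or", "but", "to", "of",
    "in", "on", "with", "for", "my", "me", "am", "is", "are", "have",
    "has", "can", "still", "do", "what", "feeling",
}

# Multi-word elements are matched first so that "morning sickness" survives
# as one element rather than as two unrelated words.
PHRASE_LEXICON = ["morning sickness", "heart rate", "fetal movement"]

def extract_elements(text: str) -> list[str]:
    """Return candidate human-intelligible elements from a text input."""
    text = re.sub(r"[^a-z\s]", " ", text.lower())
    elements = []
    for phrase in PHRASE_LEXICON:
        if phrase in text:
            elements.append(phrase)
            text = text.replace(phrase, " ")
    elements += [word for word in text.split() if word not in STOPWORDS]
    return elements

# The worked examples from the disclosure reduce as described above:
print(extract_elements("I'm feeling dizzy"))        # ['dizzy']
print(extract_elements("I have morning sickness"))  # ['morning sickness']
```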

Abstract

Described herein are systems and methods for providing information to a user based on acquired sensor data and linking observations to acquired sensor data. A computer-implemented method for providing information to a user based on acquired sensor data and database information includes acquiring a first signal output from a first sensor; identifying a reproducible feature in the first signal output; extracting the reproducible feature from the first signal output; translating the reproducible feature into a first human-intelligible element; and querying, using the first human-intelligible element, a data source to identify a second human-intelligible element. Together, the first and second human-intelligible elements describe or identify the unknown condition. The method may include identifying similarities in a plurality of human-intelligible elements received from a user and a signal output received from a sensor associated with the user; and creating a link between the similar human-intelligible elements and the signal output.

Description

SYSTEMS AND METHODS FOR PROVIDING INFORMATION TO A USER BASED ON ACQUIRED SENSOR DATA AND DATABASE INFORMATION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent Application Ser. No.
62/293,503, entitled "Systems and Methods for Providing Information to a User Based on Acquired Sensor Data and Database Information," filed February 10, 2016, the disclosure of which is incorporated by reference in its entirety.
TECHNICAL FIELD
[0002] This invention relates generally to the field of signal data analyses, and more specifically to new and useful systems and methods for providing information to a user based on acquired sensor data and database information.
BACKGROUND
[0003] Healthcare is in the midst of the digital health revolution. An exponentially growing amount of health and wellness data are generated by wearable and Internet of Things devices. These data are acquired in a digital and organized format (i.e., structured data). These digital data are in addition to the already vast amount of available medical data that used to be contained in books, doctor notebooks, medical records, scientific journals, etc. in a non-controlled, human-intelligible format (i.e., unstructured data). No human being is able to process such a vast amount of data. One of the key challenges of the digital health revolution is to convert these data into meaningful information that is understandable and actionable by an individual and her care team.
[0004] To address this challenge, research groups and companies have developed a variety of different methods to convert this growing amount of digital data (BIG data) into smaller actionable data (SMALL data), based on approaches borrowed from statistical data analytics, pattern recognition, machine learning, or deep learning fields. These methods have been shown to be very successful in analyzing large datasets and identifying patterns or trends that may be indicative of health conditions. Some available methods even enable prediction of the occurrence of medical conditions based on subtle changes in the data. These methods, however, remain confined to the space of structured data: that is, they convert big data into small data, but the output remains numerical and therefore difficult to interpret by humans. Thus, current data analysis approaches fail to link structured data (sensor data) to unstructured data (language).
[0005] Thus, there are at least three significant problems in the field of signal data analyses: (1) Intelligibility of the data: current big data analytics methods yield structured data, which is poorly understandable to humans. We communicate with words, gestures and pictures, not with digital data; (2) Most medical knowledge today is contained in unstructured data such as publications, medical records, notes from doctors, textbooks, websites, blogs, social media platforms, etc. These data are not directly relatable to sensor data since they comprise different formats (i.e., structured versus unstructured); (3) Different data types, especially structured and unstructured data types, are organized independently and separately today which makes their joint processing and interpretation very complex.
[0006] Thus, there is a need for new and useful systems and methods for providing information to a user based on acquired sensor data (i.e., structured data) and database information (i.e., structured and unstructured data). This invention provides such new and useful systems and methods.
SUMMARY
[0007] The present disclosure is directed to systems and methods for increasing the intelligibility of data, for example by transforming structured data into unstructured data and vice versa. One aspect of the present disclosure is directed to a system for providing information to a user based on acquired sensor data and database information. In some embodiments, the system includes: a first sensor; a processor communicatively coupled to the first sensor; and a computer-readable medium having non-transitory, processor-executable instructions stored thereon. In some embodiments, execution of the instructions causes the processor to perform a method including: acquiring a first signal output from the first sensor, identifying a first reproducible feature in the first signal output, extracting the first reproducible feature from the first signal output, and translating the first reproducible feature into a first human-intelligible element. In some such embodiments, the first human-intelligible element describes an unknown condition detected by the first sensor.
[0008] In some embodiments, the method executed by the processor further includes: querying, using the first human-intelligible element, a data source to identify a second human-intelligible element. In some such embodiments, the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
[0009] In some embodiments, the data source is tailored to the user using one or more of a tone and an angle.
[0010] In some embodiments, the system further includes a second sensor. In some such embodiments, the second sensor is communicatively coupled to the processor.
[0011] In some embodiments, the method executed by the processor further includes: acquiring a second signal output from the second sensor; identifying a second reproducible feature in the second signal output; extracting the second reproducible feature from the second signal output; translating the second reproducible feature into a second human-intelligible element; integrating the first human-intelligible element from the first sensor and the second human-intelligible element from the second sensor; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element.
[0012] In some embodiments, the second reproducible feature is different from the first reproducible feature. In some embodiments, the second human-intelligible element describes an unknown condition detected by the second sensor. In some such embodiments, the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements.
[0013] In some embodiments, the method executed by the processor further includes: receiving a text input from a user; extracting a second human-intelligible element from the text input; integrating the first and second human-intelligible elements; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element. In some such embodiments, the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements.
[0014] Another aspect of the present disclosure is directed to a system for providing information to a user based on acquired sensor data and database information. In some embodiments, the system includes: a first sensor; a first processor communicatively coupled to the first sensor; a first computer-readable medium having a first set of non-transitory, processor-executable instructions stored thereon; a computing device communicatively coupled to the first processor, a second processor, and a data source; and a second computer-readable medium having a second set of non-transitory, processor-executable instructions stored thereon. In some such embodiments, execution of the first set of instructions causes the first processor to perform a method including: acquiring a first signal output from a first sensor, identifying a reproducible feature in the first signal output, and extracting the reproducible feature from the first signal output. Further, in some such embodiments, execution of the second set of instructions causes the second processor to perform a method including: receiving the extracted reproducible feature from the first processor; translating the extracted reproducible feature into a first human-intelligible element, and querying, using the first human-intelligible element, the data source to identify a second human-intelligible element.
[0015] In some embodiments, the first human-intelligible element describes an unknown condition detected by the first sensor. In some such embodiments, the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
[0016] In some embodiments, the system further includes a Data2Language engine.
[0017] In some embodiments, translating is performed by the Data2Language engine.
[0018] In some embodiments, the system further includes one or more of a word database, an image database, and a relation database.
[0019] In some embodiments, the first human-intelligible element is derived from one or more of the word database, the image database, and the relation database.
[0020] In some embodiments, the method executed by the second processor further includes acquiring an input from an expert to enrich a word database, an image database, and a relation database.
[0021] In some embodiments, the method performed by the first processor further includes extracting a pattern from the reproducible feature.
[0022] In some embodiments, translating further includes mapping the reproducible feature to the first human-intelligible element.
[0023] In some embodiments, the first and second human-intelligible elements include one or more of words, phrases, sentences, and images.
[0024] In some embodiments, translating further includes: creating an association between two or more words; and combining the two or more words into one or more sentences.
[0025] In some embodiments, the method performed by the second processor further includes automatically inputting the first human-intelligible element into a search engine.
[0026] In some embodiments, the system further includes a second sensor.
[0027] In some embodiments, acquiring a first signal output from the first sensor further includes acquiring a second signal output from the second sensor.
[0028] In some embodiments, the method performed by the second processor further includes integrating the first human-intelligible element derived from the first and second data outputs from the first and second sensors, respectively, to identify the unknown condition.
[0029] In some embodiments, the first sensor is different from the second sensor.
[0030] In some embodiments, the reproducible feature includes one or more of a contraction duration, a contraction frequency, a contraction intensity, a maternal heart rate, a fetal heart rate, and a number of fetal movements.
[0031] In some embodiments, the system further includes a display communicatively coupled to the processor. In some such embodiments, the display is configured to display the identified condition to the user.
[0032] In some embodiments, the identified condition includes one or more of: a stage of pregnancy, a post-partum condition, a preconception condition, a maternal health or wellbeing status, a maternal stress level, a maternal sleep index, a fetal movement level, a fetal health or wellbeing status, a fetal stress level, and a stage of labor.
[0033] In some embodiments, the data source includes scientific papers, reports, clinical papers, clinical reports, websites, blogs, social media platforms, medical records, medical textbooks, and clinical notes.
[0034] Another aspect of the present disclosure is directed to a computer-implemented method for providing information to a user based on acquired sensor data and database information. In some embodiments, the method includes: acquiring a first signal output from a first sensor; identifying a reproducible feature in the first signal output; extracting the reproducible feature from the first signal output; translating the reproducible feature into a first human-intelligible element; and querying, using the first human-intelligible element, a data source to identify a second human-intelligible element.
[0035] In some embodiments, the first human-intelligible element describes an unknown condition detected by the first sensor. In some such embodiments, the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
[0036] In some embodiments, the first human-intelligible element is derived from one or more of a word database, an image database, and a relation database.
[0037] In some embodiments, the method further includes acquiring an input from an expert to enrich one or more of a word database, an image database, and a relation database.
[0038] In some embodiments, the method further includes extracting a pattern from the reproducible feature.
[0039] In some embodiments, translating further includes mapping the reproducible feature to the first human-intelligible element.
[0040] In some embodiments, the first human-intelligible element includes one or more of words, phrases, images, and sentences.
[0041] In some embodiments, translating further includes: creating an association between two or more words; and combining the two or more words into one or more sentences.
[0042] In some embodiments, the method further includes automatically inputting the first human-intelligible element into a search engine to identify the second human-intelligible element.
[0043] In some embodiments, the method further includes acquiring a second signal output from a second sensor.
[0044] In some embodiments, the method further includes integrating the first human-intelligible element derived from the first and second data outputs from the first and second sensors, respectively, to identify the unknown condition.
[0045] In some embodiments, the first sensor is different from the second sensor.
[0046] In some embodiments, the reproducible feature includes one or more of a contraction duration, a contraction frequency, a contraction intensity, a time interval between contractions, a maternal heart rate, a fetal heart rate, a heart rate variability, a step, a number of calories burned over a certain period of time, blood-pressure level, blood oxygenation level, blood-glucose level, stress level, a number of hours of sleep, body position during sleep, a sleep quality index, a fitness index, a wellbeing index, and a number of fetal movements.
[0047] In some embodiments, the identified condition includes one or more of: a stage of pregnancy, a post-partum stage, a preconception stage, a maternal health or wellbeing status, a maternal stress level, a fetal health or wellbeing status, a fetal stress level, a maternal sleep index, a fetal movement level, and a stage of labor.
[0048] In some embodiments, the method further includes displaying the identified condition to the user on a display.
[0049] In some embodiments, acquiring, identifying, and extracting are performed by a first processor; and translating and querying are performed by a second processor.
[0050] In some embodiments, the method is performed by one processor.
[0051] In some embodiments, the method further includes: providing feedback to the user based on the first and second human-intelligible elements.
[0052] In some embodiments, the feedback is tailored to the user using one or more of a tone and an angle.
[0053] Another aspect of the present disclosure is directed to a method of linking observations to data acquired by sensors. In some embodiments, the method includes: receiving at least one signal output from a sensor associated with at least one user; receiving a plurality of human-intelligible elements from the at least one user; identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user; and creating a link between the similar human-intelligible elements and the at least one signal output.
[0054] In some embodiments, the plurality of human-intelligible elements includes one or more of: an observed symptom, a diagnosis, an observation about wellbeing, a symptom, a condition, and a feeling.
[0055] In some embodiments, the method further includes developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output.
BRIEF DESCRIPTION OF THE DRAWINGS
[0056] FIG. 1A illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information.
[0057] FIG. 1B illustrates one embodiment of a sensing device of a system for providing information to a user based on acquired sensor data and database information.
[0058] FIG. 1C illustrates one embodiment of a computing device of a system for providing information to a user based on acquired sensor data and database information.
[0059] FIG. 2 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information.
[0060] FIG. 3 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information.
[0061] FIG. 4 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and database information that incorporates input from an expert.
[0062] FIG. 5 illustrates one embodiment of a system for providing information to a user based on acquired sensor data and one or more data sources.
[0063] FIG. 6 illustrates one embodiment of a system for providing information to a user based on one or more integrated data types and one or more data sources.
[0064] FIG. 7 illustrates a flowchart of one embodiment of a method of providing information to a user based on acquired sensor data and database information.
[0065] FIG. 8 illustrates a flowchart of one embodiment of a method of adding expert input into the system.
[0066] FIG. 9 illustrates a flowchart of one embodiment of a method of linking observations to data acquired by sensors.
[0067] FIG. 10 illustrates a flow chart of one embodiment of a method of providing information to a user based on user input and/or acquired sensor data and database information.
[0068] FIG. 11A illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0069] FIG. 11B illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0070] FIG. 11C illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0071] FIG. 11D illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0072] FIG. 11E illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0073] FIG. 12A illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0074] FIG. 12B illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
[0075] FIG. 12C illustrates an example of a computing device configured to provide information to a user based on acquired data and database information.
DETAILED DESCRIPTION
[0076] The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention. Disclosed herein are systems and methods for providing information to a user based on acquired sensor data and database information.
[0077] Note that the systems and methods described here are not directed to an abstract idea. In particular, the methods and systems described herein are directed to a specialized process to link structured (i.e., organized) and unstructured (i.e., non-controlled, human- intelligible) data, by converting or translating structured data into unstructured data. The methods and systems described here are not directed to: a fundamental economic principle, a human activity, and/or a mathematical relationship/formula. Moreover, the systems and methods described herein amount to significantly more than an alleged abstract idea. In particular, the systems and methods described herein improve analysis and use of structured and unstructured data and may improve the functioning of the computing device that executes software and/or implements the methods. For example, the methods described herein may: speed up computations; reduce memory consumption when performing the computations; improve reliability of the computations;
improve the user-friendliness of a user interface that displays human-intelligible data related to structured data; and/or improve other performance metrics related to the function of the computing device. Furthermore, the methods executed by the computing device constitute a technical effect in which structured data is transformed into unstructured data or unstructured data is transformed into structured data.
[0078] Methods and systems are described herein to link structured (i.e., organized) and unstructured (i.e., non-controlled, human-intelligible) data, by converting or translating structured data into unstructured data. More specifically, the proposed method and system address the three problems mentioned above by: (1) Automatically and dynamically converting sensor data into human-intelligible information. Sensor data features, patterns, and/or trends are converted or translated into a vocabulary, thus establishing the missing link between structured and unstructured data; (2) Querying, using this vocabulary, unstructured data with sensor data, comparing sensor data to unstructured data, and processing sensor data using tools typically reserved for unstructured data processing (e.g., Natural Language Processing (NLP)) to link sensor data to medical knowledge contained in unstructured information; and (3) Integrating, using this vocabulary, sensor data with other unstructured data inputs (e.g., patients' notes, doctors' notes, medical records, publications, manual input, text, sound, etc.), therefore allowing structured data and unstructured data to be processed and analyzed together using methods such as NLP.
[0079] In general, the systems and methods described herein are intended for use by a user. As used herein, a user includes: any person who desires to monitor and/or track personal parameters (e.g., health, diet, condition, wellbeing, etc.), population level parameters (e.g., disease characteristics and/or progression, health, fitness, etc.), and/or environmental parameters (e.g., environmental conditions at a point in time, environmental changes over time, etc.). A user may include a pregnant woman, a woman or a man trying to conceive, a woman in her postpartum period, a person suffering from a condition or disease, a data scientist, a physician, a healthcare professional, an athlete, a personal trainer, a geneticist, an environmentalist, a call center, a care center, a care team, and/or any other individual desiring to track and/or monitor a physical, environmental, and/or otherwise observable phenomena.
[0080] Information provided to a user by the system includes, but is not limited to, health
(e.g., eating habits, exercise habits, etc.), fitness (e.g., fitness level, performance, training efficiency, recovery, etc.), environmental (e.g., weather, temperature, air quality, etc.), location, diet (e.g., lactose-intolerance, gluten, aspartame, etc.), condition-specific (e.g., pregnancy, diabetes, heart disease, obesity, etc.), well-being, and/or any other information requested by the user or measured by a sensor of the system.
[0081] In some embodiments, data is acquired by the system using one or more sensors.
A sensor provides a means for monitoring, including but not limited to, sensors producing waveforms representing biological, physiological, neurological, psychological, physical, chemical, electrical, environmental, and mechanical signals, such as pressure, sound, temperature, heart rate, contractions, and the like, probes, surveillance equipment, measuring equipment, and any other means for monitoring parameters representative of or characteristic for an application domain. A sensor may be a special purpose or general purpose sensor, adapted for measuring just one or a number of physical parameters, such as temperature, noise, pressure, movement (e.g., inertial sensor, accelerometer, gyroscope, pedometer, magnetometer), heart rate (e.g., electrocardiogram, Doppler ultrasound, acoustics sensor, optical sensor, thermal and infrared sensor, radar-based and radio-frequency sensor), stress, skin impedance (e.g., galvanic skin conductance, electrodermal activity), tissue impedance (e.g., impedance spectroscopy, bio-impedance), contact impedance, muscle contraction (e.g., electromyogram, electrohysterogram), and conductivity. The sensors may be individual sensors and/or sensors connected in a sensor network. The sensors may be attached on the body (wearable sensors), in proximity of the body (portable sensors), or distributed in the environment (radiofrequency or radar sensors). Sensor systems and/or sensor networks may include not only sensors that monitor the system itself, such as the human body in the case of a body sensor network, but also sensors that sense the context and environment in which the system or user is evolving.
[0082] In some embodiments, one or more reproducible features are extracted from the signal output or sensor data from the sensor. In some embodiments, a reproducible feature includes, but is not limited to, a physiological feature: a contraction duration; a contraction frequency; a contraction intensity; a time interval between two contractions; a heart rate; heart rate variability; a number of fetal movements; a step; a number of calories burned over a certain period of time; blood-pressure level; blood oxygenation level; blood-glucose level; stress level; a number of hours of sleep; body position during sleep; a sleep quality index; a fitness index; a wellbeing index; and/or any other physiological response or output. In some embodiments, a reproducible feature includes, but is not limited to, an environmental feature: a level of CO or CO2; a presence of smoke; temperature; humidity level; illumination level; ultra-violet light level; a presence or a level of a water contaminant; a presence or a level of small particles in air; a barometric pressure level; and/or any other environmental parameter.
[0083] In some embodiments, one or more patterns are extracted from the reproducible feature. Patterns may be identified in the time domain, frequency domain, time-frequency domain, in signal amplitude, in signal shape, and/or signal phase. For example, a pattern is a set of different features at a certain point in time, or the same feature at different time points, or a combination of both. Examples of patterns include, but are not limited to, a contraction with an increased heart rate; a number of contractions with an increased contraction frequency over a defined time period; decreased fetal movements with decreased fetal heart rate; and increased CO with increased heart rate.
[0084] In some embodiments, sensor data is linked to information (e.g., human-intelligible element) stored in one or more databases and/or one or more data sources. Databases include, but are not limited to, word databases, sentence (relation) databases, image databases, phrase databases, sound databases, character databases, and/or any other database type. Data sources include, but are not limited to, medical data, medical records, information contained in books, doctor notes or notebooks, publications, clinical case studies, scientific literature, websites, blogs, social media platforms, laboratory notebooks, observations from the general population, observations from experts, and/or any other information stored and/or available in a non-controlled format (i.e., unstructured data). In some embodiments, a tone (e.g., positive, encouraging, negative, suggestive, etc.) and/or point-of-view, perspective, or angle (e.g., doctor's, user's, general public's, etc.) of the information provided to the user may be adapted by changing the content of the databases and/or data sources. For example, the user may create a profile in the system describing a condition or situation of the user, demographic (e.g., socioeconomic status, ethnicity, age, weight, health history, sex, family health history, etc.) information about the user, important contacts for the user (e.g., doctor, family, social worker, support group, etc.), goals of the user, and/or any additional information. The system may perform analytics on the profile information to determine an appropriate tone and/or angle for the user. Alternatively or additionally, in some embodiments, the system may mine information from one or more Internet accounts (e.g., blogs, email, social networking applications, etc.) of the user to determine an appropriate tone and/or angle for the user. In some embodiments, the user selects a tone and/or angle of the information that is provided to the user. For example, the user may choose to get the perspective of a midwife or a doctor on a question related to her pregnancy, or she may want to compare the two perspectives on the question.
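By way of illustration only, the following Python sketch treats a pattern, per paragraph [0083] above, as a set of feature conditions holding at the same point in time. The feature names and thresholds below are assumptions for illustration and are not drawn from the disclosure.

```python
# Each pattern is a named set of per-feature tests that must all hold at the
# same point in time. The thresholds are illustrative placeholders.
PATTERNS = {
    "contraction with increased heart rate": {
        "contraction_intensity": lambda v: v > 0.6,   # normalized units
        "maternal_heart_rate":   lambda v: v > 100,   # beats per minute
    },
    "decreased fetal movement with decreased fetal heart rate": {
        "fetal_movements_per_hour": lambda v: v < 3,
        "fetal_heart_rate":         lambda v: v < 110,
    },
}

def match_patterns(features: dict) -> list[str]:
    """Return the names of all patterns whose conditions the features satisfy."""
    matches = []
    for name, conditions in PATTERNS.items():
        if all(key in features and test(features[key])
               for key, test in conditions.items()):
            matches.append(name)
    return matches

# One time point with two co-occurring features matches the first pattern.
print(match_patterns({"contraction_intensity": 0.8, "maternal_heart_rate": 112}))
```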
[0085] In some embodiments, one or more features and/or patterns are translated into one or more human-intelligible elements using, for example, a database as a source of human-intelligible elements. A human-intelligible element includes, but is not limited to, a word, phrase, sentence, image, graphic, sound, character, and/or gesture. In some embodiments, a human-intelligible element includes at least one, more than one, a plurality, or a set of human-intelligible elements.
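For illustration, a minimal sketch of the translation step of paragraph [0085], assuming a hypothetical word database keyed by (feature, qualitative level) pairs; the entries are stand-ins, not the disclosed database contents:

```python
# Hypothetical word database mapping (feature, level) pairs to phrases.
WORD_DATABASE = {
    ("contraction_frequency", "high"): "frequent contractions",
    ("contraction_frequency", "low"):  "infrequent contractions",
    ("sleep_quality_index",   "low"):  "poor sleep",
    ("maternal_heart_rate",   "high"): "elevated heart rate",
}

def translate(feature: str, level: str) -> str:
    """Map one (feature, qualitative level) pair to a word or phrase."""
    return WORD_DATABASE.get((feature, level), f"unrecognized {feature}")

# Combining elements into a sentence, in the spirit of the relation database.
elements = [translate("sleep_quality_index", "low"),
            translate("maternal_heart_rate", "high")]
print("Last night you had " + " and ".join(elements) + ".")
```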
[0086] In some embodiments, additional human-intelligible elements are identified by querying one or more data sources of information. The identified human-intelligible elements may further describe or identify an unknown condition. In some embodiments, an identified condition includes, but is not limited to, a description of a condition; a term or name for a condition; information about a condition; and/or related symptoms or observations associated with a condition.
SYSTEM
[0087] In some embodiments, as shown in FIG. 1A, a system 10 for providing information to a user based on acquired sensor data and database information includes a sensing device 12 comprising a first processor and a sensor; a computing device 14 communicatively coupled to the sensing device 12 and comprising a second processor; and, optionally, a server 16. Various components of the system 10 function to acquire sensor data and translate the sensor data into a format understandable and/or useable by a user.
[0088] In some embodiments, there is one-way or two-way communication between the computing device 14 and the server 16, the computing device 14 and the sensing device 12, and/or the server 16 and the sensing device 12. The computing device 14, sensing device 12, and/or server 16 may communicate wirelessly using Bluetooth, Wi-Fi, CDMA, LTE, other cellular protocol, other radiofrequency, or another wireless protocol.
[0089] In some embodiments, as shown in FIG. 1A, the system 10 may include a server
16. The server 16 may be a local server on the computing device 14 or a remote server. In some embodiments, the server 16 is a virtual server. In some embodiments, the server 16 may share data between the computing device 14 and the sensing device 12. In some embodiments, the server may include one or more databases and/or data sources used by the processor of the computing device 14 and/or the sensing device 12.
[0090] In some embodiments, as shown in FIG. 1A and FIG. IB, the system 10 further includes a sensing device 12. The sensing device 12 measures a biological, physiological, neurological, psychological, physical, chemical, electrical, environmental, and/or mechanical signal; identifies reproducible features in the signal, extracts the reproducible features from the signal; and sends, transmits, or exports the extracted reproducible features and/or patterns to the computing device 14. The computing device 14 may receive and/or import the data from the sensing device 12 to translate, analyze (e.g., query a data source), and/or display the data to a user. In some embodiments, sending or transmitting information (i.e., communication between system components) occurs via a wired connection (e.g., IEEE 1394, Thunderbolt, Lightning, DVI, HDMI, Serial, Universal Serial Bus, Parallel, Ethernet, Coaxial, VGA, PS/2) or wirelessly (e.g., via Bluetooth, low energy Bluetooth, near-field communication, Infrared, WLAN, or other RF technology). The sensing device 12 may include a Bloom Life Belli, a FitBit, a Pebble smartwatch, a heart rate monitor (e.g., ECG, chest strap, etc.), a muscle contraction monitor (e.g., electromyogram), a pulse oximeter, an Apple Watch, a blood pressure cuff, caliper, pedometer, movement monitor (e.g., accelerometer, Doppler ultrasound, etc.), Airbot (i.e., air quality sensing), Waterbot (i.e., water quality sensing), Sensordrone (i.e., environment sensing), Lapka environmental monitor, Sensaris, or any other device used for sensing and/or measuring physiological and/or environmental parameters.
[0091] In some embodiments, as shown in FIG. 1A and FIG. 1C, the system includes a computing device 14. The computing device functions to receive extracted reproducible features and/or patterns from the sensing device and translate the extracted reproducible features and/or patterns into human intelligible elements. In some embodiments, the computing device 14 is a stationary computing device. In some such embodiments, the stationary computing device includes a desktop computer or a workstation. In some embodiments, as shown in FIG. 1A and FIG. 1C, a computing device 14 is a mobile or portable computing device. In some such embodiments, a portable computing device includes, but is not limited to, a laptop, netbook, tablet, mobile phone, personal digital assistant, or wearable device (e.g., Google Glass, Apple Watch, etc.). In some embodiments, the computing device 14 is a computational device, wrapped in a chassis that includes a display (visual with or without touch responsive capabilities), a central processing unit (e.g., processor or microprocessor), internal storage (e.g., flash drive), n number of components (e.g., specialized chips and/or sensors), and/or n number of radios (e.g., WLAN, LTE, WiFi, Bluetooth, GPS, etc.).
[0092] As shown in FIG. 1B and FIG. 1C, the sensing device 12 and computing device
14 of some embodiments both include a processor 22, 32. In some embodiments, the processor 22, 32 is coupled, via one or more buses, to the memory 26, 36 in order to read information from and write information to the memory 26, 36. The memory 26, 36 may be any suitable computer-readable medium that stores computer-readable instructions for execution by computer-executable components. In some embodiments, the computer-readable instructions include software stored in a non-transitory format, some such software having been downloaded as an application 24, 34 onto the memory 26, 36 of the sensing device 12 and/or computing device 14. The processor 22, 32, in conjunction with the software stored in the memory 26, 36, executes an operating system and one or more applications 24, 34. Some methods described elsewhere herein may be programmed as software instructions contained within the one or more applications 24, 34 stored in the memory 26, 36 and executable by the processor 22, 32.
[0093] As shown in FIG. 1B, the first processor 22, referred to herein as an
acquisition/extraction processor 22, associated with the sensing device 12 is configured to execute one or more sets of instructions to effect the functioning of the sensing device 12. In some embodiments, as shown in FIG. 2, a set of instructions effects acquiring a signal output from a sensor 28, identifying a reproducible feature in the signal output, and extracting the reproducible feature from the signal output. Optionally, the acquisition/extraction processor 22 may further extract a pattern from the reproducible feature.
[0094] As shown in FIG. 1C, the second processor 32, referred to herein as a translation processor 32, associated with the computing device 14 is configured to execute one or more sets of instructions to effect the functioning of the computing device 14. In some embodiments, as shown in FIG. 2, a set of instructions effects receiving the extracted reproducible feature from the acquisition/extraction processor 22, translating the extracted reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the sensor 28, and querying, using the first human-intelligible element, the data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element. In some embodiments, as shown in FIG. 2, the information and/or identified unknown condition is displayed on a display 38 of the computing device 14 to a user; transmitted, for example, to a call or care center; and/or transmitted to and updates, for example, the medical records of a user.
[0095] In some embodiments, the acquisition/extraction and translation processors 22, 32 are both associated with the sensing device 12. Alternatively, acquisition, extraction, and translation are all performed by one processor in the sensing device 12. In some such embodiments, the system does not include a computing device or the computing device predominantly functions to display information to a user and/or receive user input.
[0096] In some embodiments, the acquisition/extraction and translation processors 22, 32 are both associated with the computing device 14. Alternatively, acquisition, extraction, and translation are all performed by one processor in the computing device 14. In some such embodiments, the sensing device predominantly functions to acquire sensor data about a user's physiology or the environment in which the user is evolving.
[0097] In some embodiments, acquisition, extraction and translation are distributed between the sensing device 12, the computing device 14, and the server 16 to optimize the overall performance of the system 10. In some such embodiments, the overall performance may be optimized for lowest power consumption, fastest analysis time, lowest latency, highest accuracy, or any optimal combinations of performance as dictated by the needs and requirements of a specific application.
[0098] In some embodiments, as shown in FIG. 9, the acquisition/extraction processor
22 and/or the translation processor 32 executes a set of instructions comprising: receiving at least one signal output from a sensor associated with at least one user; receiving a plurality of human-intelligible elements from the at least one user (e.g., via a user interface); identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user; creating a link between the similar human-intelligible elements and the at least one signal output; and developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output.
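A minimal sketch of blocks S320-S340 (illustrative assumptions only): co-occurrence counts between sensor-derived signal features and user-reported elements yield, over time, the empirical probability described above. The event names are hypothetical.

```python
from collections import Counter, defaultdict

cooccurrence = defaultdict(Counter)  # signal feature -> element counts
feature_totals = Counter()

def record(signal_feature: str, reported_elements: list[str]) -> None:
    """Block S330: link the reported elements to the observed signal output."""
    feature_totals[signal_feature] += 1
    for element in reported_elements:
        cooccurrence[signal_feature][element] += 1

def probability(signal_feature: str, element: str) -> float:
    """Block S340: P(element | signal feature), developed over time."""
    total = feature_totals[signal_feature]
    return cooccurrence[signal_feature][element] / total if total else 0.0

# Two observations of the same sensor feature with overlapping user reports.
record("contraction_detected", ["severe cramps", "nausea"])
record("contraction_detected", ["severe cramps"])
print(probability("contraction_detected", "severe cramps"))  # 1.0
print(probability("contraction_detected", "nausea"))         # 0.5
```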
[0099] In some embodiments, as shown in FIG. 1B, a sensor 28 of the sensing device 12 is integrally coupled to or positioned within the sensing device 12. Alternatively, in some embodiments, a sensor 28 of the sensing device 12 is communicatively coupled to the sensing device 12 but otherwise detached from or remote from the sensing device 12. A sensor 28 of the sensing device 12 includes, but is not limited to, an inertial sensor; accelerometer; gyroscope; pedometer; magnetometer; electrocardiogram; Doppler ultrasound; acoustic sensor; optical sensor; thermal and infrared sensor; radar-based sensor; radio-frequency sensor; galvanic skin conductance sensor; electrodermal activity sensor; impedance spectroscopy; bio-impedance; contact impedance sensor; electromyogram; electrohysterogram; conductivity sensor; air quality sensor; water quality sensor; pressure sensor; thermometer; and/or any other type of sensor. In some embodiments, the sensing device 12 includes: one, more than one, a plurality of, or multiple sensors. The sensors may be arranged for wireless communication with a network node of a sensor network and/or may communicate directly with each other and/or the processor 22 of the sensing device 12.
[00100] In some embodiments, a power supply, such as a battery 40, 42 is included within the sensing device 12 and/or computing device 14 and is electrically coupled to provide power to the processor 22, 32 and other electronic components. The battery 40, 42 may be rechargeable or disposable.
[00101] In some embodiments, as shown in FIG. 1C, the computing device 14 includes a display 38 that is configured to display an identified condition to a user and/or receive one or more inputs from a user. In some embodiments, the display 38 includes a Thin Film
Transistor liquid crystal display (LCD), in-place switching LCD, resistive touchscreen LCD, capacitive touchscreen LCD, light emitting diode (LED), organic light emitting diode (OLED), Active-Matrix organic LED (AMOLED), Super AMOLED, Retina display, Haptic/Tactile touchscreen, and/or Gorilla Glass. The display 38 may include controls, which enable a user to interact with the display 38. The display 38 may include buttons, sliders, toggle buttons, toggle switches, switches, dropdown menus, combo boxes, text input fields, check boxes, radio buttons, picker controls, segmented controls, steppers, and/or any other type of control. In some embodiments, the user may use different tactile or haptic lengths or pressures to navigate on the display 38. For example, a user may use a short press, long press, light press, or forceful press to navigate on the display 38.
[00102] In some embodiments, the system further comprises a second sensor and/or second sensing device or multiple sensors and/or sensing devices. In some such embodiments, the first sensing device measures or monitors a different process or a different aspect of the same process than the second sensing device. In some such embodiments, the human-intelligible elements derived from the first sensing device and the second sensing device are integrated, translated, and/or queried together. For example, the first sensing device may be coupled to a body of a user and the second sensing device may include an environmental sensor. The body sensor data may be converted into human-intelligible elements that describe the physiology, health, and wellbeing of the individual. The environmental sensor data may be converted into human-intelligible elements that describe the environment in which the user is evolving.
Although the body and environmental data are of different types, they may both be translated to human-intelligible elements and processed, interpreted, and/or queried together.
METHODS
[00103] As shown in FIGS. 2-5 and FIG. 7, a computer-implemented method for providing information to a user based on acquired sensor data and database information includes acquiring a signal output from a sensor (e.g., Sensor 1, Sensor 2,... Sensor N) S100, identifying a reproducible feature in the signal output S110, extracting the reproducible feature from the signal output (i.e., Feature Extraction) S120, translating the reproducible feature into a first human-intelligible element (i.e., Translating to human-intelligible element), wherein the first human-intelligible element describes an unknown condition detected by the sensor S130, and querying (e.g., using a Search Engine or NLP Engine), using the first human-intelligible element, a data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element S140. The method functions to acquire sensor data and translate the sensor data into human-intelligible elements or information.
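For orientation only, the pipeline of blocks S100-S140 can be sketched end to end as follows. Every function body is an illustrative placeholder under assumed names and thresholds, not the disclosed implementation.

```python
import numpy as np

def acquire(sensor) -> np.ndarray:                      # block S100
    return np.asarray(sensor())

def identify_and_extract(signal: np.ndarray) -> dict:   # blocks S110-S120
    return {"mean": float(signal.mean()),
            "peak": float(signal.max())}

def translate(features: dict) -> str:                   # block S130
    return ("elevated activity" if features["peak"] > 1.0
            else "normal activity")

def query(element: str) -> str:                         # block S140
    # Stand-in for the search/NLP engine query of a data source.
    knowledge = {"elevated activity": "possible contraction episode"}
    return knowledge.get(element, "no matching condition found")

sensor = lambda: [0.2, 0.4, 1.3, 0.9]                   # simulated signal output
first_element = translate(identify_and_extract(acquire(sensor)))
second_element = query(first_element)
print(first_element, "->", second_element)
```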
[00104] As shown in FIG. 7, a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S100, which recites acquiring a signal output from a sensor. Block S100 functions to acquire, receive, or otherwise collect sensor data from one or more sensors. For example, as shown in FIG. 2, n sensors 28a, 28b, 28n may be communicatively coupled to the sensing device 12 and configured to transmit sensor data to the acquisition/extraction processor 22 of the sensing device 12. The n sensors 28a, 28b, 28n and the acquisition/extraction processor 22 may form part of a sensor node, so that the sensor node is configured to gather or acquire sensory information from the n sensors 28a, 28b, 28n and process the sensory information.
[00105] As shown in FIG. 7, a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S110, which recites identifying a reproducible feature in the signal output. Block S110 functions to determine points of interest or data of interest in the signal output from the one or more sensors. Examples of features that can be identified and extracted include: statistical features (e.g., average, median, variance, kurtosis), time domain features (e.g., amplitude, derivative, slope), frequency domain features (e.g., peak frequency in the power density spectrum, width of the main peak, frequency and width of the secondary peaks and harmonics), or time-frequency domain features (e.g., wavelets). In one embodiment, features are identified using a feature database. Features not present in the feature database are added to the feature database to create new knowledge in the system. Features not present in the feature database are identified based on the ability of the acquisition/extraction processor 22 to distinguish the feature from the rest of the signal, recognize the reproducibility of the feature along the signal, and/or recognize the non-isolated nature of the feature. Such new feature identification may be performed using supervised or unsupervised machine learning techniques.
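As a concrete, purely illustrative instance of the feature types listed above, the following sketch computes a few statistical, time domain, and frequency domain features from a sampled signal; the particular selection and names are assumptions made for the example, not the system's feature set.

```python
import numpy as np

def example_features(signal, fs):
    """Compute a few of the feature types named above (a simplified sketch)."""
    x = np.asarray(signal, dtype=float)
    stats = {
        "average": np.mean(x),
        "median": np.median(x),
        "variance": np.var(x),
        # Excess kurtosis computed directly, to avoid extra dependencies.
        "kurtosis": np.mean((x - x.mean()) ** 4) / np.var(x) ** 2 - 3,
    }
    # Time domain: peak-to-peak amplitude and steepest slope between samples.
    time_domain = {
        "amplitude": x.max() - x.min(),
        "max_slope": np.max(np.abs(np.diff(x)) * fs),
    }
    # Frequency domain: peak of the power density spectrum, ignoring DC.
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    freq_domain = {"peak_frequency": freqs[1:][np.argmax(spectrum[1:])]}
    return {**stats, **time_domain, **freq_domain}
```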
[00106] As shown in FIG. 7, a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S120, which recites extracting the reproducible feature from the signal output. Block S120 functions to isolate the feature from the signal output for translation of the feature, for example by the translation processor 32 of the computing device 14. For example, as shown in FIG. 2, the
acquisition/extraction processor 22 extracts the reproducible feature (i.e., feature extraction 44) from the signal output from one or more sensors 28a, 28b, 28n. In some embodiments, the method performed by the acquisition/extraction processor 22 further includes extracting a pattern (i.e., pattern extraction 46) from the reproducible feature.
[00107] General signal processing techniques can be used to identify and extract features from digitized sensor signal data: for example, by various transform techniques (Fourier, wavelets); by integration, derivation, and differentiation techniques; by template matching; by comparing physical features of the sensor signal data, such as amplitude, frequency, and phase, to a set threshold or thresholds; and/or by fitting the data to mathematical functions. In some embodiments, feature identification and extraction is performed using unsupervised or supervised machine learning techniques, which automatically identify and extract features.
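The template-matching technique mentioned above can be pictured with the following simplified sketch, which slides a stored template across the signal and flags windows whose normalized cross-correlation meets a threshold; the toy signal, template, and threshold are assumed for the example.

```python
import numpy as np

def match_template(signal, template, threshold=0.9):
    """Return start indices where the normalized cross-correlation between
    `template` and a window of `signal` meets the threshold (simplified sketch)."""
    x = np.asarray(signal, dtype=float)
    t = np.asarray(template, dtype=float)
    t = (t - t.mean()) / (t.std() + 1e-12)
    hits = []
    for start in range(len(x) - len(t) + 1):
        w = x[start:start + len(t)]
        w = (w - w.mean()) / (w.std() + 1e-12)
        score = float(np.dot(w, t)) / len(t)  # correlation coefficient in [-1, 1]
        if score >= threshold:
            hits.append(start)
    return hits

# Toy usage: find where a stored contraction-like bump recurs in a signal.
print(match_template([0, 1, 2, 1, 0, 0, 1, 2, 1, 0], [0, 1, 2, 1, 0]))  # -> [0, 5]
```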
[00108] As shown in FIG. 7, a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S130, which recites translating the reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the sensor. Block S130 functions to associate each reproducible feature with a human-intelligible element, for example a word, phrase, or sentence. For example, as shown in FIG. 2, the extracted feature (or pattern) is transmitted to the translation processor 32 of the computing device 14. The translation processor 32 of the computing device 14 receives the extracted feature or pattern and translates the extracted feature or pattern into a human-intelligible element (i.e., translating to human-intelligible element 48). In some embodiments, block S130 is performed by a Data2Language Engine. In one embodiment, as shown in FIG. 3, the reproducible feature 58 or pattern 60 (structured data 56) is translated or mapped to a human-intelligible element (i.e., unstructured data 70) (e.g., language 72) by a Data2Words engine 62. The words used by the Data2Words engine 62 may be stored in a domain- (e.g., athletics, pregnancy, environmental, etc.) and/or user-specific word database 64. The word database 64 may be updated during the use and functioning of the system. In some embodiments, an association between two or more words may be created and the two or more words may be combined via a Words2Sentences engine 66. The word associations used by the Words2Sentences engine 66 may be stored in a domain- and user-specific sentence or relation database 68. In some such embodiments, the words are combined based on relations and/or associations between multiple words. The combining of two or more words can be based on the chronological order through which the data and words are created, the causal relationships between words, or any other type of semantic relationship.
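A minimal sketch of how a Data2Words mapping and a Words2Sentences combining step could fit together is shown below, assuming an in-memory word database, an in-memory relation database, and a chronological combining rule; all table contents are placeholders, not the disclosed engines or databases.

```python
# Hypothetical stand-ins for the word database 64 and relation database 68.
word_db = {"uterine_burst": "contraction", "hr_drop": "heart rate dip"}
relation_db = {("contraction", "heart rate dip"): "followed by"}

def data2words(feature_name):
    # Data2Words: map a structured feature to a word from the word database.
    return word_db.get(feature_name, feature_name)

def words2sentence(words):
    # Words2Sentences: combine words chronologically, using stored relations.
    sentence = words[0]
    for prev, nxt in zip(words, words[1:]):
        relation = relation_db.get((prev, nxt), "and then")
        sentence += f" {relation} {nxt}"
    return sentence.capitalize() + "."

print(words2sentence([data2words("uterine_burst"), data2words("hr_drop")]))
# -> "Contraction followed by heart rate dip."
```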
[00109] In some embodiments, a feature is mapped to a human-intelligible element using a look-up table in which each feature is mapped to one word. In some embodiments, a feature is mapped to multiple words. In some variations of these embodiments, the mapping of each feature to each word is given a probability. For example, the system may indicate that the feature is 80% associated with a first word and 20% associated with a second word.
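The probabilistic look-up described in this paragraph can be represented as simply as the following sketch, in which each feature keys a list of (word, probability) pairs; the table contents are invented for illustration.

```python
# Hypothetical probabilistic look-up table: feature -> [(word, probability)].
lookup = {
    "strong_burst": [("contraction", 0.8), ("fetal movement", 0.2)],
}

def translate_with_probability(feature):
    """Return candidate words sorted by probability (highest first)."""
    return sorted(lookup.get(feature, []), key=lambda wp: wp[1], reverse=True)

print(translate_with_probability("strong_burst"))
# -> [('contraction', 0.8), ('fetal movement', 0.2)]
```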
[00110] In some embodiments, the system evaluates the association between the feature and the human-intelligible element. Evaluation may include elimination of redundant human-intelligible elements, reduction of the number of human-intelligible elements, deduction of further human-intelligible elements from the human-intelligible elements already associated with the features, exclusion of contradictory human-intelligible elements, extension of the number of human-intelligible elements, etc. The evaluation technique(s) are based, for example, on the type(s) of human-intelligible elements associated with the features. In the case of linguistic information, in which the human-intelligible elements comprise descriptive words and sentences in a particular human language, such as English, linguistic information evaluation techniques may be applied. In some embodiments, linguistic information evaluation includes identifying synonyms among the human-intelligible elements associated with a specific set of features. In some embodiments, the synonyms may be filtered or combined to reduce the number of human-intelligible elements. Linguistic information evaluation may also include automatically summarizing a set of human-intelligible elements to obtain a reduced, more concise set of human-intelligible elements. In another example, linguistic information evaluation may include semantic analysis to analyze a set of human-intelligible elements and deduce a shorter, more concise, and more relevant set of human-intelligible elements. In the case of video, pictorial, or sound-type human-intelligible elements, suitable video, picture, and sound evaluation techniques are used for performing information evaluation on the plurality of human-intelligible elements.
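As one simplified illustration of the synonym-filtering step, the sketch below collapses elements that share a canonical form in an assumed synonym map; the map contents are invented for the example.

```python
# Hypothetical synonym map used to collapse redundant elements.
synonyms = {"tightening": "contraction", "cramp": "contraction"}

def filter_synonyms(elements):
    """Reduce a list of elements by mapping synonyms to one canonical word."""
    seen, reduced = set(), []
    for element in elements:
        canonical = synonyms.get(element, element)
        if canonical not in seen:
            seen.add(canonical)
            reduced.append(canonical)
    return reduced

print(filter_synonyms(["contraction", "tightening", "nausea", "cramp"]))
# -> ['contraction', 'nausea']
```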
[00111] As shown in FIG. 7, a computer-implemented method for providing information to a user based on acquired sensor data and database information includes block S140, which recites querying, using the first human-intelligible element, a data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies or describes (e.g., so that a user, expert, doctor, etc. can identify the unknown condition and/or take action on the unknown condition) the unknown condition described by the first human-intelligible element. Block S140 functions to further link human-intelligible elements with additional human-intelligible elements that are particularly relevant for the system, target user, and/or target domain to further describe the unknown condition. In some embodiments, an identified condition includes, but is not limited to: a description of a condition; a term or name for a condition; information about a condition; and/or related symptoms or observations associated with a condition. For example, the identified condition may include one or more of: a user health status or wellbeing, a diet, a fitness level, disease characteristics, disease progression, a stage of pregnancy, a maternal health or wellbeing status, a maternal stress level, a fetal health or wellbeing status, a fetal stress level, a stage of labor, a post-partum condition or period, a preconception period or condition, environmental conditions at a point in time, environmental changes over time, etc. In some embodiments, as shown in FIG. 5, the first human-intelligible element is processed using a search engine 78 or an NLP engine 80, which queries one or more data sources 82 (e.g., medical records, publications, scientific journals, books, websites, blogs, social media platforms, etc.) for human-intelligible elements related to the first human-intelligible element. In some embodiments, the probability of associating a feature with more than one human-intelligible element, as described elsewhere herein, may be used as an input to the search engine 78 or NLP engine 80 to query a data source 82 to identify the second human-intelligible element. For example, the system may search only the human-intelligible elements with the highest probability, and/or weight the output (e.g., the second human-intelligible element) of the search engine by the probability associated with its input (e.g., the first human-intelligible element). In some embodiments of the system in which multiple sensors are employed, the system may compare the probabilities of human-intelligible elements generated from the multiple sensors, and only maintain, for querying the search engine, the human-intelligible elements for which the probabilities are the highest for at least two of the sensors. In some embodiments, the search engine 78 or NLP engine 80 is directed to user and/or domain-specific content. Further, in some embodiments, block S140 occurs automatically, upon a command from a user, upon execution of a particular step by the translation processor 32, and/or after any other prerequisite.
[00112] In some embodiments, the second human-intelligible element is given a probability of identifying the unknown condition described by the first human-intelligible element. For example, the system may indicate that the unknown condition has a 70% chance of being associated with one specific pregnancy stage, a 20% chance of being associated with a second pregnancy stage, and a 10% chance of being associated with a third pregnancy stage.
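As a toy illustration of the probability-weighted querying described in the two preceding paragraphs, the sketch below multiplies each candidate input element's probability by an assumed relevance score from the queried data source and ranks the results; both tables are invented for the example.

```python
# Hypothetical input elements with probabilities (from translation, S130).
candidates = [("contraction", 0.8), ("fetal movement", 0.2)]

# Hypothetical data source: element -> [(condition, relevance score)].
data_source = {
    "contraction": [("early labor", 0.9), ("Braxton Hicks", 0.6)],
    "fetal movement": [("active fetus", 0.7)],
}

def weighted_query(candidates, data_source):
    """Weight each retrieved condition by the probability of its input element."""
    scored = []
    for element, p in candidates:
        for condition, relevance in data_source.get(element, []):
            scored.append((condition, round(p * relevance, 2)))
    return sorted(scored, key=lambda cs: cs[1], reverse=True)

print(weighted_query(candidates, data_source))
# -> [('early labor', 0.72), ('Braxton Hicks', 0.48), ('active fetus', 0.14)]
```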
[00113] In some embodiments, as shown in FIG. 5, the second human-intelligible element identified using the search engine 78 or NLP engine 80 may be integrated and/or processed together with the first human-intelligible element to: provide user feedback 84 (e.g., describe your condition; how are you feeling today?; is someone smoking around you?; etc.) or recommendations for the user (e.g., go see a doctor, have a cup of coffee, you are in labor, etc.); update medical records of the user 86; provide support to the user 88 (e.g., a support group, call center, doctor, etc.); update or notify the same or another user (e.g., doctor, call center, support group, care center, hospital, etc.); and/or request information and/or feedback.
[00114] As shown in FIG. 6 and FIG. 10, a method of providing information to a user based on user input and/or acquired sensor data and database information includes: receiving a text input from a user S400a; extracting a first human-intelligible element from the text input S400b; integrating the first human-intelligible element (e.g., from a first data source 500) with a second human-intelligible element (e.g., from a second data source 510) S420; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element, wherein the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements S430. Alternatively or additionally, in some embodiments, the method includes receiving inputs from two or more data sources 500, 510, 520, as shown in FIG. 6. For example, as shown in FIGS. 6 and 10, the method may include receiving one or more signal outputs from one or more sensors 28a, 28b, 28n in block S410a; and extracting one or more reproducible features 44a, 44b from the one or more signal outputs and translating the one or more reproducible features into first human-intelligible elements 48a, 48b in block S410b, as described elsewhere herein. The method functions to integrate, analyze, and/or interpret text data and/or sensor data of different data types. In some embodiments, the system receives only a text input or only a signal output. In some embodiments, the system receives both a text input and a signal output. In variations of such embodiments, the system may receive multiple text inputs and/or signal outputs.
[00115] In some embodiments, the system only acquires and analyzes text input. In some embodiments, the system only acquires signal output from one or more sensors. In some embodiments, the system acquires a combination of signal outputs from sensors and text inputs from users. In some embodiments, a first sensor and a second sensor measure similar phenomena or events of the same user or environment or two different users or environments. In some embodiments, the first and second sensors measure two different phenomena or events of the same user or environment or two different users or environments. In some embodiments, a first text input and a second text input describe similar phenomena or events observed by the same user or two different users. In some embodiments, a first text input and a second text input describe two different phenomena or events observed by the same user or two different users.
[00116] As shown in FIGS. 6 and 10, a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S400a, which recites receiving a text input 520 from a user. Block S400a functions to acquire data about the user and/or the environment in which the user is situated. In some embodiments, a user may input text 520 into a user interface of the computing device or the sensing device, for example using a keypad, keyboard, touch interface, speech-to-text recognition, or any other user input device or input method. In some embodiments, the system collects or acquires text 520 from emails, SMS, and/or other sources of text information to which the user has granted the system access.
[00117] As shown in FIGS. 6 and 10, a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S400b, which recites extracting a first human-intelligible element from the text input 520. Block S400b functions to extract meaningful human-intelligible elements from the text input 520. For example, extracting meaningful human-intelligible elements from the text input 520 may include excluding prepositions, articles, pronouns, conjunctions, direct objects, and/or indirect objects. Further, for example, extracting meaningful human-intelligible elements from the text input 520 may include extracting verbs, adjectives, nouns, predicates, and/or other substantive words or phrases from the text input 520.
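A simplified way to realize this filtering step is sketched below, with a small stopword list standing in for the excluded word classes; the word list is an assumption, and a production system would more likely rely on part-of-speech tagging.

```python
# Hypothetical stopword list standing in for the prepositions, articles,
# pronouns, and conjunctions that block S400b excludes.
STOPWORDS = {"i", "a", "an", "the", "and", "with", "have", "of", "in", "to", "my"}

def extract_elements(text):
    """Keep the substantive words of a text input (simplified sketch)."""
    words = [w.strip(".,?!").lower() for w in text.split()]
    return [w for w in words if w and w not in STOPWORDS]

print(extract_elements("I have cramps and nausea in the morning"))
# -> ['cramps', 'nausea', 'morning']
```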
[00118] As shown in FIG. 10, a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S420, which recites integrating the first human-intelligible element with a second human-intelligible element. Block S420 functions to combine the first and second human-intelligible elements so that they can be analyzed, queried, and/or interpreted together. For example, the first and second human-intelligible elements may be integrated to describe a health status of a user, measured by the first sensor or received by the system through a first text input, in an environment in which the user is situated, measured by a second sensor or received by the system through a second text input. In some embodiments, the system may integrate the first and second human-intelligible elements by determining a probability that the first human-intelligible element is associated with the second human-intelligible element, and/or weighing the probability that the first human-intelligible element is associated with the second human-intelligible element. In some embodiments of the system in which multiple sensors are employed, the system may compare the probabilities of human-intelligible elements generated from the multiple sensors and text input, and only integrate the human-intelligible elements for which the probabilities are the highest for at least two inputs (e.g., one or more sensors and/or text).
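The "highest for at least two inputs" integration rule described above might look like the following sketch, in which each input nominates its top-probability element and only elements nominated by at least two inputs are integrated; the input data are invented for the example.

```python
from collections import Counter

# Hypothetical per-input candidate elements with probabilities.
inputs = {
    "sensor_1": [("contraction", 0.8), ("fetal movement", 0.2)],
    "sensor_2": [("contraction", 0.7), ("noise", 0.3)],
    "text": [("nausea", 0.9), ("contraction", 0.1)],
}

def integrate(inputs, min_agreement=2):
    """Keep elements that rank highest for at least `min_agreement` inputs."""
    top_elements = [max(cands, key=lambda cp: cp[1])[0] for cands in inputs.values()]
    counts = Counter(top_elements)
    return [e for e, n in counts.items() if n >= min_agreement]

print(integrate(inputs))  # -> ['contraction']
```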
[00119] As shown in FIG. 10, a method of providing information to a user based on user input and/or acquired sensor data and database information includes block S430, which recites querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element, wherein the third human-intelligible element identifies the unknown condition (e.g., event, phenomenon, etc.) described by the first and second human-intelligible elements. For example, the extracted and integrated human-intelligible elements may be queried together using a search engine 78 or NLP engine 80 to identify and/or describe an unknown condition of the user.

[00120] In some embodiments, as shown in FIG. 4 and FIG. 8, the method further includes acquiring expert input 74 to enrich a database associated with the system, for example a word database, an image database, and/or a relation database. In some such embodiments, the expert input 74 is automatically stored in and enriches the word, image, and/or relation databases, thus automatically and dynamically creating new knowledge in the system, as shown by the bidirectional arrows in FIG. 4 between the Data2Language Engines and the processed expert input 76. For example, the method may include: receiving expert input 74 that describes a situation or condition (e.g., disease, illness, pregnancy, tiredness, etc.) as encountered by a user of the system S200; processing the expert input 76 to extract human-intelligible elements that describe the situation or condition S210; adding the extracted human-intelligible elements to a database (e.g., word, image, and/or relation) S220; and associating the reproducible features and/or patterns with the expert description of the situation or condition S230.
[00121] As shown in FIG. 8, a method of incorporating expert input into a system includes block S210, which recites processing the expert input 74 to extract human-intelligible elements that describe the situation or condition. In some embodiments, the expert input 74 is processed using text and/or speech processing techniques. For example, if a user inputs into the system: "I'm feeling dizzy" or "I have morning sickness," the user input is processed using text processing techniques to extract meaningful human-intelligible elements (e.g., "dizzy" and "morning sickness"). The extracted human-intelligible elements are added to one or more databases.
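Continuing this example, blocks S210-S230 could be sketched as below, where descriptive elements extracted from the input text are filed in a word database keyed back to the reproducible feature they describe; the database layout and the feature name are assumptions made for illustration.

```python
# Hypothetical word database (element -> associated features) and stopword
# list; both are placeholders for the databases described in FIG. 8.
STOPWORDS = {"i", "have", "the", "a", "and"}
word_db = {}

def process_expert_input(text, feature):
    """S210-S230 sketch: extract descriptive elements from expert text and
    link them to the reproducible feature they describe."""
    for word in text.lower().split():
        if word not in STOPWORDS:
            word_db.setdefault(word, set()).add(feature)  # S220 + S230

process_expert_input("I have morning sickness", feature="hr_pattern_7")
print(word_db)  # -> {'morning': {'hr_pattern_7'}, 'sickness': {'hr_pattern_7'}}
```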
[00122] As shown in FIG. 9, a method of linking observations to data acquired by sensors includes: receiving at least one signal output from a sensor associated with at least one user S300; receiving a plurality of human-intelligible elements from the at least one user S310; identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user S320; creating a link between the similar human-intelligible elements and the at least one signal output S330; and developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output S340. The method of FIG. 9 functions to enable the system to learn from user input and/or to crowdsource relationships between measured phenomena (e.g., using sensors) and observed or experienced phenomena (e.g., described by users). For example, the user may be a pregnant female experiencing labor contractions. The pregnant female may associate several human-intelligible elements or descriptors (e.g., severe cramps; three minutes apart; lasting one minute each; nausea; etc.) with the sensor output that indicates she is having a contraction. The system learns to associate a contraction signal or feature from a sensor with one or more of the human-intelligible elements provided by the user. The user-provided human-intelligible elements are stored in one or more databases associated with the system, for example the word database, image database, and/or relation database, thereby creating new knowledge in the system.
[00123] As shown in FIG. 9, a method of linking observations to data acquired by sensors includes block S310, which recites receiving a plurality of human-intelligible elements from the at least one user. Block S310 functions to acquire information, observations, and/or data from users of the system. For example, the user may input, using the computing device of the system, one or more human-intelligible elements describing the physiological or environmental situation of the user, one or more symptoms or feelings of the user, a condition of the user, or any other information the user is willing to provide. In some embodiments, the system may mine data from Internet accounts and/or additional sensing devices possessed by the user, for example after the user links his Internet accounts and/or additional sensing devices to the system or grants the system access to the Internet accounts and/or additional sensing devices.
[00124] As shown in FIG. 9, a method of linking observations to data acquired by sensors includes block S320, which recites identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user. In some embodiments, identifying similarities comprises comparing the plurality of human-intelligible elements received from the user to the plurality of human-intelligible elements derived from the at least one signal output.
[00125] As shown in FIG. 9, a method of linking observations to data acquired by sensors includes block S330, which recites creating a link between the similar human-intelligible elements and the at least one signal output. For example, if a user reports a certain condition (e.g., symptoms or physiological conditions), a relation may be created between the signal output and the human-intelligible elements provided by the user, therefore creating a learning mechanism in the system by which new human-intelligible elements may be associated with a certain signal output. Such a link or relation between the human-intelligible element and the at least one signal output may, in some embodiments, be based on one or more of: a time, a location, the human-intelligible elements used to report the condition, other user input, and a rule set that associates one or more features (identified/extracted from the signal output) with one or more conditions or human-intelligible elements.
[00126] As shown in FIG. 9, a method of linking observations to data acquired by sensors includes block S340, which recites developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output. Block S340 functions to create new knowledge in the system and/or databases so that the new knowledge can be used to identify and/or describe additional unknown conditions encountered by users and thus the system. For example, the system may develop a probability of 90% between a strong contraction (e.g., based on sensor data and user perception of contraction strength) and labor induction (e.g., based on sensor data and/or user feedback), or a probability of 10% between a weak contraction and labor induction.
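One plausible realization of block S340 is a running co-occurrence estimate, as in the sketch below: each time a user-reported element accompanies a signal feature, the empirical probability of that pairing is updated. The counting scheme and all names are assumptions made for illustration.

```python
from collections import defaultdict

# Hypothetical co-occurrence counts: (element, feature) pairs vs. feature totals.
pair_counts = defaultdict(int)
feature_counts = defaultdict(int)

def observe(element, feature):
    """Record one co-occurrence of a user-reported element with a signal feature."""
    pair_counts[(element, feature)] += 1
    feature_counts[feature] += 1

def probability(element, feature):
    """Empirical probability that `element` accompanies `feature` (S340)."""
    total = feature_counts[feature]
    return pair_counts[(element, feature)] / total if total else 0.0

for report in ["severe cramps"] * 9 + ["nausea"]:
    observe(report, "strong_contraction")
print(probability("severe cramps", "strong_contraction"))  # -> 0.9
```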
EXAMPLES
[00127] As shown in FIGS. 11A-11E, sensor data is used to automatically provide feedback (in an unstructured format or human-intelligible format) to a pregnant woman (FIG. 11A) about her health and lifestyle. For example, patterns in her physiological data acquired by one or more sensors allow the system to conclude that she didn't sleep well last night (FIG. 11B). The system combines the interpretation of her physiological data with text, as described above in FIG. 6, such as a specific question (e.g., human-intelligible elements) she inputs into the system: "Can I still drink my morning cup of coffee?" (FIG. 11C). The system uses the interpretation of the physiological data and the text input to query an NLP engine to provide relevant and actionable feedback for the pregnant woman (FIGS. 11D and 11E). For example, the system indicates to the user that she can enjoy her cup of coffee (FIG. 11D) and provides additional information that the user may find relevant to her current situation, for example pregnancy (FIG. 11E).
[00128] As shown in FIGS. 12A-12C, sensor data is used to enhance information that is provided to a pregnant woman in response to a specific question. For example, a pregnant woman who experiences morning sickness may wonder: "What can I eat to help with morning sickness?" (FIG. 12A). The system may, in a first step, use traditional search engines and/or NLP to search data sources and provide an answer to the pregnant woman's question (FIG. 12B). Additionally, in some embodiments, the system may integrate sensor data collected using a wearable sensor (e.g., the Bloom Belli sensor) to provide more information about her question and enhance the system response with personal sensor-based insights (FIG. 12C). For example, the system knows her sleep history based on the sensor data, and it integrates the sleep history (i.e., structured data) with her question or text input (i.e., unstructured data) to query data sources and provide additional information (e.g., why she is experiencing morning sickness; what other activities she can do to avoid it in the future).
[00129] The systems and methods of the embodiments and variations thereof can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions are preferably executed by computer-executable components, preferably integrated with the system and one or more portions of the processor. The instructions can be stored on any suitable computer-readable media, such as RAMs, ROMs, flash memory, EEPROMs, optical devices (e.g., CD or DVD), hard drives, floppy drives, or any other suitable device. The computer-executable component is preferably a general or application-specific processor, but any suitable dedicated hardware or hardware/firmware combination can alternatively or additionally execute the instructions.
[00130] As used in the description and claims, the singular forms "a," "an," and "the" include both singular and plural references unless the context clearly dictates otherwise. For example, the term "a human-intelligible element" may include, and is contemplated to include, a plurality of human-intelligible elements or a set of human-intelligible elements. At times, the claims and disclosure may include terms such as "a plurality," "one or more," or "at least one;" however, the absence of such terms is not intended to mean, and should not be interpreted to mean, that a plurality is not conceived.
[00131] The term "about" or "approximately," when used before a numerical designation or range (e.g., to define a length or pressure), indicates approximations which may vary by (+) or (-) 5%, 1%, or 0.1%. All numerical ranges provided herein are inclusive of the stated start and end numbers. The term "substantially" indicates mostly (i.e., greater than 50%) or essentially all of a device, substance, or composition.
[00132] As used herein, the term "comprising" or "comprises" is intended to mean that the devices, systems, and methods include the recited elements, and may additionally include any other elements. "Consisting essentially of" shall mean that the devices, systems, and methods include the recited elements and exclude other elements of essential significance to the combination for the stated purpose. Thus, a system or method consisting essentially of the elements as defined herein would not exclude other materials, features, or steps that do not materially affect the basic and novel characteristic(s) of the claimed invention. "Consisting of" shall mean that the devices, systems, and methods include the recited elements and exclude anything more than a trivial or inconsequential element or step. Embodiments defined by each of these transitional terms are within the scope of this disclosure.
[00133] The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

CLAIMS

WHAT IS CLAIMED IS:
1. A system for providing information to a user based on acquired sensor data and database information, the system comprising: a first sensor; a processor communicatively coupled to the first sensor; and a computer-readable medium having non-transitory, processor-executable instructions stored thereon, wherein execution of the instructions causes the processor to perform a method comprising: acquiring a first signal output from the first sensor, identifying a first reproducible feature in the first signal output, extracting the first reproducible feature from the first signal output, and translating the first reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the first sensor.
2. The system of claim 1, wherein the method executed by the processor further comprises: querying, using the first human-intelligible element, a data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
3. The system of claim 2, wherein the data source is tailored to the user using one or more of a tone and an angle.
4. The system of claim 1, further comprising a second sensor, wherein the second sensor is communicatively coupled to the processor.
5. The system of claim 4, wherein the method executed by the processor further comprises: acquiring a second signal output from the second sensor; identifying a second reproducible feature in the second signal output, wherein the second reproducible feature is different than the first reproducible feature; extracting the second reproducible feature from the second signal output; translating the second reproducible feature into a second human-intelligible element, wherein the second human-intelligible element describes an unknown condition detected by the second sensor; integrating the first human-intelligible element from the first sensor and the second human-intelligible element from the second sensor; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element, wherein the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements.
6. The system of claim 1, wherein the method executed by the processor further comprises: receiving a text input from a user; extracting a second human-intelligible element from the text input; integrating the first and second human-intelligible elements; and querying, using the integrated first and second human-intelligible elements, a data source to identify a third human-intelligible element, wherein the third human-intelligible element identifies the unknown condition described by the first and second human-intelligible elements.
7. A system for providing information to a user based on acquired sensor data and database information, the system comprising: a first sensor; a first processor communicatively coupled to the first sensor; a first computer-readable medium having non-transitory, processor-executable instructions stored thereon, wherein execution of the instructions causes the first processor to perform a method comprising: acquiring a first signal output from a first sensor, identifying a reproducible feature in the first signal output, and extracting the reproducible feature from the first signal output; a computing device communicatively coupled to the first processor, a second processor, and a data source; and a second computer-readable medium having non-transitory, processor-executable instructions stored thereon, wherein execution of the instructions causes the second processor to perform a method comprising: receiving the extracted reproducible feature from the first processor; translating the extracted reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the first sensor, and querying, using the first human-intelligible element, the data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
8. The system of claim 7, further comprising a Data2Language engine.
9. The system of claim 8, wherein translating is performed by the Data2Language engine.
10. The system of claim 7, further comprising one or more of a word database, an image database, and a relation database.
11. The system of claim 10, wherein the first human-intelligible element is derived from one or more of the word database, the image database, and the relation database.
12. The system of claim 7, wherein the method executed by the second processor further comprises acquiring an input from an expert to enrich a word database, an image database, and a relation database.
13. The system of claim 7, wherein the method performed by the first processor further includes extracting a pattern from the reproducible feature.
14. The system of claim 7, wherein translating further comprises mapping the reproducible feature to the first human-intelligible element.
15. The system of claim 7, wherein the first and second human-intelligible elements comprise one or more of words, phrases, sentences, and images.
16. The system of claim 15, wherein translating further comprises: creating an association between two or more words; and combining the two or more words into one or more sentences.
17. The system of claim 7, wherein the method performed by the second processor further includes automatically inputting the first human-intelligible element into a search engine.
18. The system of claim 7, further comprising a second sensor.
19. The system of claim 18, wherein acquiring a first signal output from the first sensor further comprises acquiring a second signal output from the second sensor.
20. The system of claim 19, wherein the method performed by the second processor further includes integrating the first human-intelligible element derived from the first and second signal outputs from the first and second sensors, respectively, to identify the unknown condition.
21. The system of claim 18, wherein the first sensor is different from the second sensor.
22. The system of claim 7, wherein the reproducible feature includes one or more of a contraction duration, a contraction frequency, a contraction intensity, a maternal heart rate, a fetal heart rate, and a number of fetal movements.
23. The system of claim 7, further comprising a display communicatively coupled to the processor, wherein the display is configured to display the identified condition to the user.
24. The system of claim 7, wherein the identified condition comprises one or more of: a stage of pregnancy, a post-partum condition, a preconception condition, a maternal health or wellbeing status, a maternal stress level, a maternal sleep index, a fetal movement level, a fetal health or wellbeing status, a fetal stress level, and a stage of labor.
25. The system of claim 7, wherein the data source includes scientific papers, reports, clinical papers, clinical reports, websites, blogs, social media platforms, medical records, medical textbooks, and clinical notes.
26. A computer-implemented method for providing information to a user based on acquired sensor data and database information, the method comprising: acquiring a first signal output from a first sensor; identifying a reproducible feature in the first signal output; extracting the reproducible feature from the first signal output; translating the reproducible feature into a first human-intelligible element, wherein the first human-intelligible element describes an unknown condition detected by the first sensor; and querying, using the first human-intelligible element, a data source to identify a second human-intelligible element, wherein the second human-intelligible element identifies the unknown condition described by the first human-intelligible element.
27. The method of claim 26, wherein the first human-intelligible element is derived from one or more of a word database, an image database, and a relation database.
28. The method of claim 26, further comprising acquiring an input from an expert to enrich one or more of a word database, an image database, and a relation database.
29. The method of claim 26, further comprising extracting a pattern from the reproducible feature.
30. The method of claim 26, wherein translating further comprises mapping the reproducible feature to the first human-intelligible element.
31. The method of claim 26, wherein the first human-intelligible element comprises one or more of words, phrases, images, and sentences.
32. The method of claim 31, wherein translating further comprises: creating an association between two or more words; and combining the two or more words into one or more sentences.
33. The method of claim 26, further comprising automatically inputting the first human- intelligible element into a search engine to identify the second human-intelligible element.
34. The method of claim 26, further comprising acquiring a second signal output from a second sensor.
35. The method of claim 34, further comprising integrating the first human-intelligible element derived from the first and second signal outputs from the first and second sensors, respectively, to identify the unknown condition.
36. The method of claim 34, wherein the first sensor is different from the second sensor.
37. The method of claim 26, wherein the reproducible feature includes one or more of a contraction duration, a contraction frequency, a contraction intensity, a time interval between contractions, a maternal heart rate, a fetal heart rate, a heart rate variability, a step count, a number of calories burned over a certain period of time, a blood-pressure level, a blood oxygenation level, a blood-glucose level, a stress level, a number of hours of sleep, a body position during sleep, a sleep quality index, a fitness index, a wellbeing index, and a number of fetal movements.
38. The method of claim 26, wherein the identified condition comprises one or more of: a stage of pregnancy, a post-partum stage, a preconception stage, a maternal health or wellbeing status, a maternal stress level, a fetal health or wellbeing status, a fetal stress level, a maternal sleep index, a fetal movement level, and a stage of labor.
39. The method of claim 26, further comprising displaying the identified condition to the user on a display.
40. The method of claim 26, wherein acquiring, identifying, and extracting are performed by a first processor; and translating and querying are performed by a second processor.
41. The method of claim 26, wherein the method is performed by one processor.
42. The method of claim 26, further comprising: providing feedback to the user based on the first and second human-intelligible elements.
43. The method of claim 42, wherein the feedback is tailored to the user using one or more of a tone and an angle.
44. A method of linking observations to data acquired by sensors, the method comprising: receiving at least one signal output from a sensor associated with at least one user; receiving a plurality of human-intelligible elements from the at least one user; identifying similarities in the plurality of human-intelligible elements and the at least one signal output received from the at least one user; and creating a link between the similar human-intelligible elements and the at least one signal output.
45. The method of claim 44, wherein the plurality of human-intelligible elements includes one or more of: an observed symptom, a diagnosis, an observation about wellbeing, a symptom, a condition, and a feeling.
46. The method of claim 44, further comprising developing, over time, a probability that a particular human-intelligible element will be associated with a particular signal output.
PCT/IB2017/050655 2016-02-10 2017-02-07 Systems and methods for providing information to a user based on acquired sensor data and database information WO2017137890A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP17705176.0A EP3414688A1 (en) 2016-02-10 2017-02-07 Systems and methods for providing information to a user based on acquired sensor data and database information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662293503P 2016-02-10 2016-02-10
US62/293,503 2016-02-10

Publications (1)

Publication Number Publication Date
WO2017137890A1 (en) 2017-08-17

Family

ID=58044113

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/050655 WO2017137890A1 (en) 2016-02-10 2017-02-07 Systems and methods for providing information to a user based on acquired sensor data and database information

Country Status (2)

Country Link
EP (1) EP3414688A1 (en)
WO (1) WO2017137890A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140088996A1 (en) * 2012-09-21 2014-03-27 Md Revolution, Inc. Systems and methods for developing and implementing personalized health and wellness programs
WO2015020886A1 (en) * 2013-08-08 2015-02-12 Gaster Richard S Wireless pregnancy monitor

Also Published As

Publication number Publication date
EP3414688A1 (en) 2018-12-19

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17705176; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 2017705176; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2017705176; Country of ref document: EP; Effective date: 20180910)