WO2014066854A1 - Management, control and communication with sensors - Google Patents

Management, control and communication with sensors

Info

Publication number
WO2014066854A1
Authority
WO
WIPO (PCT)
Prior art keywords
computing device
wearable device
data
devices
health sensor
Prior art date
Application number
PCT/US2013/066966
Other languages
French (fr)
Inventor
Ram David Adva Fish
Henry Messenger
Original Assignee
Imetrikus, Inc.
Priority date
Filing date
Publication date
Application filed by Imetrikus, Inc.
Priority to EP13848885.3A (EP2911577A4)
Publication of WO2014066854A1
Priority to HK16102374.4A (HK1214116A1)

Classifications

    • A: HUMAN NECESSITIES
      • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
            • A61B 5/0002: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
              • A61B 5/0015: Remote monitoring of patients using telemetry, characterised by features of the telemetry system
                • A61B 5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
            • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
              • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
                • A61B 5/02055: Simultaneously evaluating both cardiovascular condition and temperature
            • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
              • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
                • A61B 5/1112: Global tracking of patients, e.g. by using GPS
            • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
              • A61B 5/6801: Arrangements specially adapted to be attached to or worn on the body surface
            • A61B 5/74: Details of notification to user or communication with user or patient; user input means
              • A61B 5/7465: Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
                • A61B 5/747: Arrangements for interactive communication in case of emergency, i.e. alerting emergency services
              • A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick
                • A61B 5/749: Voice-controlled interfaces
          • A61B 2560/00: Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
            • A61B 2560/02: Operational features
              • A61B 2560/0266: Operational features for monitoring or limiting apparatus function
            • A61B 2560/04: Constructional details of apparatus
              • A61B 2560/0487: Special user inputs or interfaces
                • A61B 2560/0493: Special user inputs or interfaces controlled by voice
    • G: PHYSICS
      • G08: SIGNALLING
        • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
          • G08C 17/00: Arrangements for transmitting signals characterised by the use of a wireless electrical link
            • G08C 17/02: Arrangements for transmitting signals using a radio link
          • G08C 2201/00: Transmission systems of control signals via wireless link
            • G08C 2201/30: User interface
              • G08C 2201/31: Voice input
      • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H 40/60: ICT specially adapted for the management or operation of medical equipment or devices
              • G16H 40/67: ICT specially adapted for the remote operation of medical equipment or devices
    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04Q: SELECTING
          • H04Q 9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • Embodiments of the present invention relate generally to health care-based monitoring systems, and more specifically, to mechanisms for managing, controlling, and communicating data between devices.
  • An emergency services person located at a public safety access point (PSAP) may need to manually place a second call to the local fire station, police, or Emergency Medical Services (EMS) squad, thereby wasting precious time that could be used to save the person's life. Further, if the person is unconscious, they would not be able to relate the nature of their injuries or their physical location.
  • a wearable device may be worn by the user and the wearable device may monitor the activities and/or health of the user using a variety of sensors and/or components (e.g., GPS units, a blood pressure unit, an accelerometer, etc.).
  • the wearable device may also provide a simple interface (e.g., a single button) to allow a user to initiate a voice call (e.g., to request help).
  • the wearable device may be configured to call a single destination (e.g., a PSAP) in response to a user request (e.g., in response to the user pushing the button) and may not be able to initiate voice calls to other destinations in response to the user request.
  • the wearable device may use various other devices (e.g., health sensors). For example, the wearable device may use a blood pressure sensor, a thermometer, a weight sensor (e.g., a scale), etc., to monitor the condition, health and/or state of the user.
  • FIG. 1 is a block diagram illustrating one embodiment of a system for detecting a predefined user state, according to one embodiment of the present disclosure.
  • FIG. 2 is a block diagram illustrating one embodiment of a wearable device, according to one embodiment of the present disclosure.
  • FIG. 3 is a block diagram illustrating an example system architecture, according to another embodiment of the present disclosure.
  • FIG. 4 is a flow diagram illustrating a method of pairing a first device with a computing device, according to one embodiment of the present disclosure.
  • FIG. 5 is a flow diagram illustrating a method of pairing a first device with a computing device, according to another embodiment of the present disclosure.
  • FIG. 6 is a block diagram of an exemplary computer system that may perform one or more of the operations described herein, according to one embodiment of the present disclosure.
  • a wearable device may use various other devices (e.g., health sensors).
  • the wearable device may use a blood pressure sensor, a thermometer, a weight sensor (e.g., a scale), etc., to monitor the condition, health and/or state of the user. Users may not be familiar with the procedures, operations, commands, etc., that may be needed to pair the wearable device with each of the other devices.
  • each type of health sensor may have a different procedure and/or password for pairing the health sensor with the wearable device.
  • Embodiments of the invention provide mechanisms for pairing devices such that the devices may communicate data.
  • a wearable device may be paired with other devices (e.g., health sensors).
  • a user may use the wearable device to establish a voice call with a remote operator in a call center.
  • the remote operator may instruct the user to prepare a first device for pairing with the wearable device.
  • the remote operator may use a server to pair the wearable device with the first device.
  • the server may instruct the wearable device to activate/power up a communication interface and to scan for devices.
  • the wearable device may provide a list of devices that are visible to the wearable device.
  • the server may receive input identifying the first device from the list of devices and may provide instructions to the wearable device for pairing the first device with the wearable device. Some embodiments may allow the server to pair the wearable device with the first device automatically (e.g., without input and/or intervention from the user). This may allow users to pair a wearable device with other devices (e.g., health sensors) more quickly and efficiently.
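  • The exchange described above can be pictured as a small set of messages passed between the server (pairing module) and the wearable device. The sketch below is illustrative only; the message names and fields are assumptions and are not defined by this disclosure.

        # Illustrative message shapes for the server-driven pairing flow.
        # All names and fields are hypothetical; no wire format is defined here.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class ActivateAndScan:
            """Server -> wearable: power up the short-range interface and scan for devices."""
            interface: str = "bluetooth"

        @dataclass
        class VisibleDeviceList:
            """Wearable -> server: identifiers (e.g., MAC addresses, names) of visible devices."""
            identifiers: List[str] = field(default_factory=list)

        @dataclass
        class PairWithDevice:
            """Server -> wearable: pair with the selected device using the supplied credentials."""
            device_id: str
            access_code: str = ""  # e.g., a PIN/passkey, if the selected device requires one

        @dataclass
        class PairingConfirmation:
            """Wearable -> server: report whether pairing with the selected device succeeded."""
            device_id: str
            success: bool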
  • the wearable device may operate as a tunnel and may forward data from these other devices to the server and vice versa. This may allow the wearable device to pair with devices even when the wearable device is not able to understand and/or process the data and/or information from these devices.
  • the wearable device may forward the data to the server and the server may process the data, instead of having the wearable device process the data.
  • FIG. 1 is a block diagram illustrating one embodiment of a system 10 for detecting a predefined user state.
  • the system 10 includes wearable devices 12a-12n communicatively connected to a distributed cloud computing system 14.
  • a wearable device 12 may be a small- size computing device that can be worn as a watch, a pendant, a ring, a pager, or the like, and can be held in any orientation.
  • each of the wearable devices 12a-12n is operable to communicate with a corresponding one of users 16a-16n (e.g., via a microphone, speaker, and voice recognition software), external health sensors 18a-18n (e.g., an EKG, blood pressure device, weight scale, glucometer) via, for example, a short-range over the air (OTA) transmission method (e.g., Bluetooth, Wi-Fi, etc.), a call center 30, a first-to-answer system 32, and care giver and/or family member 34, and the distributed cloud computing system 14 via, for example, a long range OTA transmission method (e.g., over a 3rd Generation (3G) or 4th Generation (4G) cellular transmission network 20, such as a Long Term Evolution (LTE) network, a Code Division Multiple Access (CDMA) network, etc.).
  • Each wearable device 12 is configured to detect a predefined state of a user.
  • the predefined state may include a user physical state (e.g., a user fall inside or outside a building, a user fall from a bicycle, a car incident involving a user, a user taking a shower, etc.) or an emotional state (e.g., a user screaming, a user crying, etc.).
  • the wearable device 12 may include multiple sensors for detecting a predefined user state.
  • the wearable user device 12 may include an accelerometer for measuring an acceleration of the user, a magnetometer for measuring a magnetic field associated with the user's change of orientation, a gyroscope for providing a more precise determination of orientation of the user, and a microphone for receiving audio. Based on data received from the above sensors, the wearable device 12 may identify a suspected user state, and then categorize the suspected user state as an activity of daily life, a confirmed predefined user state, or an inconclusive event. The wearable user device 12 may then communicate with the distributed cloud computing system 14 to obtain a re-confirmation or change of classification from the distributed cloud computing system 14.
  • the wearable user device 12 transmits data provided by the sensors to the distributed cloud computing system 14, which then determines a user state based on this data.
  • the wearable device 12 includes a low-power processor (e.g., a low-power processing device) to process data received from sensors and/or detect anomalous sensor inputs.
  • the low-power processor may cause a second processing device to further analyze the sensor inputs (e.g., may wake up a main CPU). If the second processing device determines that there is possibly an anomalous event in progress, the second processing device may send a dataset to the distributed cloud computing system 14.
  • if the distributed cloud computing system 14 concludes there is an anomalous event, the distributed cloud computing system 14 may instruct the wearable device 12 to initiate a voice call.
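  • The tiered screening described above (the low-power processor screens raw sensor inputs, a second processing device re-checks, and only then is a dataset sent to the cloud for the final determination) might be structured as in the sketch below; the threshold values and function names are assumptions used only for illustration.

        # Illustrative tiered screening of accelerometer samples.
        # Thresholds and names are assumptions, not values from the disclosure.
        ACCEL_WAKE_THRESHOLD_G = 2.5    # low-power processor: wake the main CPU above this
        ACCEL_EVENT_THRESHOLD_G = 3.0   # main CPU: escalate to the cloud above this

        def low_power_screen(sample_g: float) -> bool:
            """Runs on the low-power processor: decide whether to wake the second processor."""
            return abs(sample_g) >= ACCEL_WAKE_THRESHOLD_G

        def main_cpu_screen(recent_samples_g: list) -> bool:
            """Runs on the second processor: decide whether an anomalous event may be in progress."""
            return max((abs(s) for s in recent_samples_g), default=0.0) >= ACCEL_EVENT_THRESHOLD_G

        def handle_sample(sample_g, recent_samples_g, send_dataset_to_cloud):
            if low_power_screen(sample_g):                   # possible anomalous sensor input
                if main_cpu_screen(recent_samples_g):        # possible anomalous event in progress
                    send_dataset_to_cloud(recent_samples_g)  # cloud makes the final determination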
  • the wearable user device 12 may also obtain audio data from one or more microphones on the wearable device 12.
  • the wearable user device 12 may record the user's voice and/or sounds which are captured by the one or more microphones, and may provide the recorded sounds and/or voice to the distributed cloud computing system 14 for processing (e.g., for voice or speech recognition).
  • the wearable devices 12a-12n may continually or periodically gather/obtain data from the sensors and/or the one or more microphones (e.g., gather/obtain datasets and audio data) and the wearable devices 12a-12n may transmit these datasets to the distributed cloud computing system 14.
  • the datasets may be transmitted to the distributed cloud computing system 14 at periodic intervals, or when a particular event occurs (e.g., user pushes a button on the wearable device 12a-12n or a fall is detected).
  • the datasets may include data indicative of measurements or information obtained by the sensors which may be within or coupled to the wearable device 12a-12n.
  • the datasets may include temperature readings (e.g., 98.5 degrees Fahrenheit) and other measurements obtained from the sensors.
  • the wearable device 12a-12n may transmit a dataset per sensor (e.g., one dataset for the accelerometer, one dataset for an aGPS receiver, etc.). In another embodiment, the wearable device 12a-12n may combine data received from multiple sensors into a dataset.
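  • Either packaging described above (one dataset per sensor, or readings from several sensors combined into a single dataset) can be represented with a simple record, as in the sketch below; the field names are hypothetical.

        # Illustrative dataset records; all field names are hypothetical.
        import time

        def per_sensor_dataset(sensor_name, readings):
            """One dataset per sensor (e.g., one for the accelerometer, one for the aGPS receiver)."""
            return {"sensor": sensor_name, "readings": readings, "timestamp": time.time()}

        def combined_dataset(readings_by_sensor):
            """Readings from multiple sensors combined into a single dataset."""
            return {"sensors": readings_by_sensor, "timestamp": time.time()}

        # Example usage:
        #   per_sensor_dataset("accelerometer", [0.1, 0.2, 2.7])
        #   combined_dataset({"accelerometer": [0.1, 0.2], "temperature_f": [98.5]})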
  • Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services.
  • the term "cloud” refers to one or more computational services (e.g., servers) connected by a computer network.
  • the distributed cloud computing system 14 may include one or more computers configured as a telephony server 22 communicatively connected to the wearable devices 12a- 12n, the Internet 24, and one or more cellular communication networks 20, including, for example, the public circuit-switched telephone network (PSTN) 26.
  • the distributed cloud computing system 14 may further include one or more computers configured as a Web server 28 communicatively connected to the Internet 24 for permitting each of the users 16a-16n to communicate with a call center 30, first-to-answer systems 32, and care givers and/or family 34.
  • the web server 28 may also provide an interface for users to interact with the distributed cloud computing system 14 (e.g., to access their accounts, profiles, or subscriptions, to access stored datasets and/or audio data, etc.)
  • the distributed cloud computing system 14 may further include one or more computers configured as a real-time data monitoring and computation server 36 communicatively connected to the wearable devices 12a-12n for receiving measurement data (e.g., datasets), for processing measurement data to draw conclusions concerning a potential predefined user state, for transmitting user state confirmation results and other commands back to the wearable devices 12a-12n, for storing and retrieving present and past historical predefined user state data from a database 37 which may be employed in the user state confirmation process, and in retraining further optimized and individualized classifiers that can in turn be transmitted to the wearable device 12a-12n.
  • the web server 28 may store and retrieve present and past historical predefined user state data, instead of the real-time data monitoring and computation server 36 or the database 37.
  • the wearable devices 12a-12n may include a button, which a user 16 may use to initiate voice calls. For example, a user 16a may push the button on the device 12a to initiate a voice call in order to obtain assistance or help (e.g., because the user has slipped or fallen, or because the user requires medical assistance).
  • the wearable devices 12a-12n may periodically transmit datasets to the distributed cloud computing system 14.
  • the wearable devices 12a-12n may also transmit datasets to the distributed cloud computing system 14 when the user presses or pushes the button on the wearable devices 12a-12n.
  • the wearable devices 12a-12n may be single-button devices (e.g., devices which only have one button) which provide a simplified interface to users.
  • the distributed cloud computing system 14 may receive a request from the wearable device 12a-12n to initiate the voice call.
  • the distributed cloud computing system 14 may also receive datasets from the wearable device 12a-12n associated with an event experienced by the user. After receiving the request to initiate the voice call, the distributed cloud computing system 14 may analyze the datasets to determine whether the event experienced by the user is an activity of daily life (ADL), a confirmed fall, or an inconclusive event. In another embodiment, the distributed cloud computing system 14 may identify a destination for routing the voice call, based on the analysis of the datasets.
  • the distributed cloud computing system 14 may identify a first-to-answer system 32 (e.g., a 911 or emergency response call center) as the destination for the voice call.
  • the distributed cloud computing system 14 may identify a family member 34 as the destination for the voice call. After identifying a destination for the voice call, the distributed cloud computing system 14 routes the voice call to the identified destination.
  • the distributed cloud computing system 14 may also analyze audio data received from a wearable device 12 to determine what event has happened to a user.
  • the wearable device 12 may provide audio data (e.g., a recording of the user's voice or other sounds) to the distributed cloud computing system 14.
  • the distributed cloud computing system 14 may analyze the audio data and may determine that a user is asking for help (e.g., based on the user's words in the recording).
  • the distributed cloud computing system 14 may identify a destination for the voice call, based on the audio data and/or the datasets received from the wearable device 12 and may route the voice call to the identified destination.
  • the audio data may be used in conjunction with the datasets to identify a destination for routing the voice call.
  • the distributed cloud computing system 14 may monitor the status of the voice call, after it routes the voice call to the identified destination. For example, the distributed cloud computing system 14 may route the voice call to a first destination and may route the call to a second destination if the call is not answered by the first destination.
  • the distributed cloud computing system 14 may also use subscription data (e.g., information associated with a user's account or subscription to a service) to identify destinations for routing the voice call.
  • the subscription data may include a list and/or an order for destinations where the voice call should be routed.
  • the distributed cloud computing system 14 may also use a time of day and/or a geographic location to identify destinations for routing a voice call. For example, based on certain times of day and/or based on the location of the wearable device, the distributed cloud computing system 14 may route the voice call to different destinations.
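  • One way the destination selection and fallback described above could be organized is sketched below; the subscription fields, time window, and helper names are assumptions rather than details taken from this disclosure.

        # Illustrative destination selection for a voice call.
        # Data shapes and helper names are assumptions.
        from datetime import datetime

        def ordered_destinations(subscription, event, now):
            """Build an ordered list of destinations for routing the voice call."""
            if event == "confirmed_fall":
                # Emergencies are routed to a first-to-answer system (e.g., 911) first.
                return ["first_to_answer"] + subscription.get("contacts", [])
            if now.hour >= 22 or now.hour < 6:
                # At night, try the call center before family members.
                return ["call_center"] + subscription.get("contacts", [])
            return subscription.get("contacts", []) + ["call_center"]

        def route_call(place_call, destinations):
            """Try each destination in order; fall through if a call is not answered."""
            for destination in destinations:
                if place_call(destination):  # place_call returns True if the call was answered
                    return destination
            return None

        # Example usage:
        #   route_call(place_call, ordered_destinations({"contacts": ["family"]},
        #                                               "inconclusive", datetime.now()))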
  • the distributed cloud computing system 14 also includes a pairing module 27.
  • the pairing module 27 may allow the distributed cloud computing system 14 to pair health sensors (e.g., health sensor 18a) with wearable devices (e.g., wearable device 12a). Pairing a health sensor to a wearable device may refer to establishing one or more communication channels between the health sensor and the wearable device (e.g., a physical communication channel or a logical communication channel). Pairing a health sensor to a wearable device may also refer to storing an identifier for the wearable device on the health sensor and/or storing an identifier for the health sensor on the wearable device.
  • the wearable device 12a may store an identifier for the health sensor 18a to indicate that the wearable device 12a is allowed to create a communication channel with the health sensor 18a and to communicate data with the health sensor 18a. Pairing a health sensor to a wearable device allows the health sensor and the wearable device to communicate data between the health sensor and the wearable device (e.g., sensor data, messages, information, etc.) using the one or more communication channels.
  • One example of pairing may be pairing two Bluetooth-capable devices (e.g., Bluetooth pairing). Although some embodiments described herein may refer to Bluetooth and/or Bluetooth pairing, other embodiments may use different communication channels, different communication protocols, and/or different communication interfaces to pair a health sensor to a wearable device.
  • the wearable device may be paired with the health sensor using protocols and/or channels such as ZigBee, Z-Wave, RuBee, 802.15, 802.11, transmission control protocol/internet protocol (TCP/IP), user datagram protocol (UDP), and/or other communication protocols.
  • a user may acquire additional devices, such as medical sensors, to monitor the state, condition, and/or health of the user.
  • the user may request and/or order an additional health sensor (e.g., a weight scale).
  • a caregiver or a doctor of the user may request and/or order the additional health sensor.
  • Examples of health sensors include, but are not limited to, an EKG, a blood pressure sensor, a weight scale, an inertial measurement unit, a pressure sensor for measuring air pressure or altitude, a heart rate sensor, a blood perfusion sensor, a temperature sensor (e.g., a thermometer), a glucose level sensor (e.g., a glucometer), etc.
  • the health sensor may be paired with a wearable device in order to monitor the health, condition, and/or state of the user.
  • the user may be unfamiliar with the process, procedures, and/or methods for pairing the health sensor with the wearable device (e.g., for establishing or setting up one or more communication channels between the health sensor and the wearable device).
  • the user may place a voice call to a caregiver, the call center 30 (e.g., a remote person or operator in the call center), etc., using the wearable device.
  • the wearable device may include a cellular communication interface (e.g., a communication interface used to communicate with a cellular network) and the user may place a voice call to the remote operator of the call center 30 using the cellular communication interface.
  • a voice call may also refer to a voice-over-IP (VOIP) call.
  • the user may indicate during the voice call that the user wishes to pair the health sensor (e.g., 18a) to the wearable device (e.g., 12a) of the user (e.g., 16a).
  • the remote operator may provide instructions to the user (e.g., verbal instructions) to instruct the user to prepare the health sensor for pairing. For example, the remote operator may instruct the user to power or turn on the health sensor, or may instruct the user to activate a particular switch or button on the health sensor. After the user indicates that the health sensor is prepared for pairing, the remote operator may use the pairing module 27 to pair the health sensor with the wearable device. In one embodiment, the remote operator may provide user input to the pairing module 27 to indicate that a wearable device should be paired with a device (e.g., a health sensor). For example, the pairing module 27 may receive input from the remote operator indicating that the user of a wearable device wishes to pair the wearable device with an additional device (e.g., with a health sensor).
  • the remote operator may provide the user input after speaking with the user via a voice call (as discussed above).
  • In another embodiment, the user may place the voice call to an automated system (e.g., an automated answering system). The user may provide input to the automated system (e.g., via button presses) to indicate that the health sensor is ready for pairing and the distributed cloud computing system 14 may initiate the pairing process as described above.
  • pairing module 27 may transmit one or more instructions to the wearable device (e.g., 12a) to activate the communication interface that may be used to pair with the health sensor (e.g., 18a).
  • the pairing module 27 may transmit instructions to the wearable device 12a instructing the wearable device 12a to turn on and/or activate a Bluetooth communication interface.
  • the pairing module may use a second communication interface of the wearable device to transmit the one or more instructions to the wearable device.
  • the pairing module 27 may use a cellular communication interface (e.g., a 3G communication interface, a 4G communication interface, etc.) to transmit the instructions to the wearable device.
  • the wearable device may turn on the first communication interface (e.g., turn on Bluetooth) based on the instructions.
  • the pairing module 27 may send one or more additional instructions instructing the wearable device to scan for devices (e.g., sensors, other computing devices, etc.) that are visible to the wearable device.
  • the pairing module 27 may instruct the wearable device 12a to perform a scan for Bluetooth devices that are visible to the wearable device 12a.
  • one or more devices (e.g., health sensors or other devices) may be visible to the wearable device. A device (e.g., health sensor 18a) may be visible to the wearable device (and vice versa) if the wearable device is able to transmit data to and/or receive data from the device.
  • One or more devices may also be visible to the wearable device if the one or more devices are within a range of a communication interface of the wearable device (e.g., within 10 feet, 100 feet, 1 mile, etc.).
  • the pairing module 27 may receive a list of one or more devices that are visible to the wearable device from the wearable device. For example, the wearable device may transmit information such as the model numbers, types, and/or identifiers of the one or more devices that are visible to the wearable device, to the pairing module 27.
  • the pairing module 27 may analyze and/or process the list of one or more devices to identify a device that the wearable device is to be paired with. For example, the pairing module 27 may determine that multiple types of devices (e.g., a headset, a speaker system, a mouse, a keyboard, a weight scale, a blood pressure sensor, etc.) are visible to the wearable device. The pairing module 27 may identify one or more devices (e.g., one or more health sensors) that the wearable device should pair with. For example, the pairing module 27 may identify one or more health sensors that are supported by the distributed cloud computing system 14. The pairing module 27 may present the list of one or more health sensors that the wearable device should pair with to the remote operator.
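  • The filtering step described above (keeping only the visible devices that the system recognizes as supported health sensors) might look like the sketch below; the supported-model table is hypothetical.

        # Illustrative filter of scan results down to supported health sensors.
        # The supported-model table is hypothetical.
        SUPPORTED_HEALTH_SENSORS = {
            "ACME-BP-100": "blood pressure sensor",
            "ACME-SCALE-7": "weight scale",
            "ACME-GLUCO-2": "glucometer",
        }

        def supported_sensors(visible_devices):
            """Keep only visible devices whose model number is a supported health sensor."""
            return [d for d in visible_devices if d.get("model") in SUPPORTED_HEALTH_SENSORS]

        # A visible headset or mouse would be dropped, while a supported weight scale
        # would be kept and presented to the remote operator.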
  • the pairing module 27 may receive user input from the remote operator identifying a first device from the list of one or more devices.
  • the pairing module 27 may receive a list of devices that are visible to the wearable device.
  • the list may include identifiers for the devices on the list of devices.
  • the identifiers may include information such as model numbers, types of the devices (e.g., computer, laptop, health sensor, speaker, headset, etc.), and/or other data that may be used to identify the devices that are visible to the wearable device.
  • the remote operator may select one of the devices by providing the identifier for the selected device to the pairing module 27. For example, selecting a device from the list of devices may provide the identifier for the selected device to the pairing module 27.
  • the remote operator may provide the identifier to the pairing module 27 (e.g., may input the identifier using a GUI).
  • the pairing module 27 may also select one of the devices without input from the remote operator, based on the identifiers for the devices. For example, the pairing module 27 may automatically identify devices that are health sensors.
  • the pairing module 27 may access the database 37 to obtain instructions for pairing the first device with the wearable device.
  • the pairing module 27 may access the database 37 to obtain instructions for pairing a weight sensor (e.g., a scale) with the wearable device 12a.
  • the pairing module 27 may use an identifier for the selected device to obtain the instructions for pairing.
  • the pairing module 27 may use the model number, a name, serial number, etc., of the selected device to obtain the instructions for pairing from the database 37.
  • the instructions for pairing may include commands, operations, access codes, passwords, etc., that may be used to pair the first device with the wearable device.
  • the instructions may also include the identifier of the first device. This may allow the wearable device to identify which device the wearable device should pair with.
  • the instructions may cause the wearable device to automatically pair with the first device. For example, the instructions may cause the wearable device 12a to automatically pair with the health sensor 18a without user intervention or user input from the user 16a.
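  • The database lookup described above (using an identifier such as a model number to retrieve the commands, access codes, and passwords needed for pairing) could take the form sketched below; the table contents are hypothetical.

        # Illustrative lookup of pairing instructions keyed by device model.
        # Table contents are hypothetical.
        PAIRING_INSTRUCTIONS = {
            "ACME-SCALE-7": {"access_code": "0000", "commands": ["enable_pairing_mode"]},
            "ACME-BP-100": {"access_code": "1234", "commands": []},
        }

        def instructions_for(device_id, model):
            """Return the instructions the wearable device needs in order to pair, or None."""
            entry = PAIRING_INSTRUCTIONS.get(model)
            if entry is None:
                return None
            # Include the identifier so the wearable device knows which device to pair with.
            return {"device_id": device_id, **entry}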
  • the pairing module 27 may receive a confirmation (e.g., data, a message, etc.) from the wearable device to indicate that the wearable device is paired with the first device.
  • the wearable device may forward data received from the additional device to the distributed cloud computing system 14 and vice versa.
  • the wearable device may operate as a tunnel for data that is transmitted between the distributed cloud computing system 14 and the additional device (e.g., a health sensor), as discussed below in conjunction with FIG. 3.
  • FIG. 2 is a block diagram illustrating one embodiment of a wearable device 12a (e.g., wearable device 12a shown in FIG. 1).
  • the wearable device 12a may include a low-power processor 38 communicatively connected to an accelerometer 40 (e.g., a two- or more-axis accelerometer) for detecting acceleration events (e.g., high, low, positive, negative, oscillating, etc.), a magnetometer 42 (preferably a 3-axis magnetometer) for assessing an orientation of the wearable device 12a, and a gyroscope 44 for providing a more precise determination of orientation of the wearable device 12a.
  • the low-power processor 38 is configured to receive continuous or near-continuous real-time measurement data from the accelerometer 40, the magnetometer 42, and the gyroscope 44 for rendering tentative decisions concerning predefined user states.
  • the wearable device 12 is able to render these decisions in a relatively computationally inexpensive, low-power manner and to minimize false positive and false negative errors.
  • a processing module 46, such as the 3G IEM 6270 manufactured by Qualcomm®, includes a high-computationally-powered microprocessor element and internal memory that are adapted to receive the suspected fall events from the low-power processor 38 and to further correlate orientation data received from the optional gyroscope 44 with digitized audio data received from one or more microphones 48 (e.g., micro-electro-mechanical systems-based (MEMS) microphone(s)).
  • the processing module 46 may also be referred to as a processing device.
  • the audio data may include the type, number, and frequency of sounds originating from the user's voice, the user's body, and the environment.
  • the processing module 46 is also configured to receive data/commands from and/or transmit data/commands to the distributed cloud computing system 14 via a 3G, 4G, and/or other wireless protocol transceiver 50 over the cellular transmission network 20.
  • the wireless protocol transceiver 50 may also be referred to as a communication interface.
  • a communication interface may be software, hardware, or a combination of both that may be used to communicate data (e.g., transmit and/or receive data) with another device (e.g., a server, another computing device, a health sensor, etc.).
  • the processing module 46 is further configured to communicate with and receive position data from an aGPS receiver 52, and to receive measurements from the external health sensors (e.g., sensors 18a-18n shown in FIG. 1) via a short-range Bluetooth transceiver 54 (or other equivalent short-range transceiver, such as a WiFi transceiver) or via a direct connection to one or more health sensors (e.g., the health sensors may be directly attached/coupled to the wearable device 12a).
  • the aGPS receiver 52 and the Bluetooth transceiver 54 may also be referred to as communication interfaces.
  • the processing module 46 is further configured to permit direct voice communication between the user 16a and the PSAP (e.g. 9-1-1, an emergency response center, etc., not shown in the figures), a call center 30, first-to-answer systems 32 (e.g. a fire station, a police station, a physician's office, a hospital, etc.), or care givers and/or family 34 via a built-in speaker 58 and an amplifier 60.
  • the processing module 46 is further configured to permit the user 16a to conduct a conference connection with one or more of a PSAP, the call center 30, the first-to-answer systems 32, and/or care givers and/or family 34.
  • the processing module 46 may receive/operate one or more input and output indicators 62 (e.g., one or more mechanical and touch switches (not shown), a vibrator, LEDs, etc.).
  • the wearable device 12a also includes an on-board battery power module 64.
  • the wearable device 12a may also include a button 63.
  • the button 63 may allow a user to provide user input to the wearable device 12a.
  • the user may press or push the button to initiate a voice call to one or more of a call center 30, first-to-answer systems 32 (e.g. a fire station, a police station, a physician's office, a hospital, etc.), or care givers and/or family 34.
  • a user may use the button 63 to answer questions during a voice call (e.g., push the button 63 once for "yes” and push the button 63 twice for "no").
  • the user may indicate that the wearable device should start collecting data (e.g., datasets such as health data, audio data, location data, etc.) and/or send data to the distributed cloud computing system 14, using the button 63.
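  • The single-button interaction described above (a press to initiate a voice call, press counts as answers during a call) can be summarized as in the sketch below; the mapping beyond the yes/no example given in the text is an assumption.

        # Illustrative single-button input handling.
        def interpret_presses(press_count, in_voice_call):
            if not in_voice_call:
                return "initiate_voice_call"  # a press outside a call requests a call (e.g., for help)
            if press_count == 1:
                return "answer_yes"           # one press for "yes"
            if press_count == 2:
                return "answer_no"            # two presses for "no"
            return "ignore"                   # assumption: other patterns are ignored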
  • the wearable device 12a may also include empty expansion slots and/or connectors (not shown) to collect readings from other sensors (e.g., an inertial measurement unit, a pressure sensor for measuring air pressure or altitude, a heart rate sensor, a blood perfusion sensor, a temperature sensor, etc.). These other sensors may be coupled to the device via the expansion slots and/or connectors to provide additional datasets or information to the distributed cloud computing system 14.
  • the wearable device 12a may collect, gather, and/or obtain information using a variety of components.
  • the wearable device 12a may obtain orientation and/or movement data (e.g., information about how a user who is wearing the wearable device 12a has moved) using the accelerometer 40, the magnetometer 42, and/or the gyroscope 44.
  • the wearable device 12a may determine the location (e.g., location data, such as GPS coordinates) of the wearable device 12a (and the user who is wearing or holding the wearable device 12a) using the aGPS receiver 52.
  • the wearable device may collect health data (e.g., heart rate, blood pressure, sugar levels, temperature, etc.) using sensors (not shown in the figures) which may be attached to the wearable device 12a and/or may communicate with the wearable device 12a using the Bluetooth transceiver 54.
  • the wearable device 12a may obtain audio data (e.g., voice and/or sounds) using the microphone 48 or a plurality of microphones (not shown in the figures).
  • the wearable device 12a may obtain and/or generate datasets (e.g., orientation/movement data, health data, location data, audio data) using these components and may transmit these datasets to the distributed cloud computing system 14.
  • the wearable device 12a may periodically transmit datasets to the distributed cloud computing system 14. For example, the wearable device 12a may transmit the datasets once every 5 seconds, or once every 30 seconds.
  • the wearable device 12a may transmit the datasets when certain criteria are met (e.g., when an accelerometer detects an acceleration above a certain threshold indicating a possible fall, or when the aGPS receiver determines that the wearable device has left a certain location).
  • the wearable device 12a may transmit datasets when a user input is received. For example, the wearable device 12a may send the datasets when the user presses or pushes the button 63, in order to initiate a voice call.
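  • The transmission triggers described above (periodic intervals, sensor criteria such as a possible fall or leaving a location, and button presses) could be combined as in the sketch below; the interval and threshold values are assumptions.

        # Illustrative decision of when the wearable device transmits datasets.
        # Interval and threshold values are assumptions.
        TRANSMIT_INTERVAL_S = 30.0        # e.g., once every 30 seconds
        FALL_ACCEL_THRESHOLD_G = 3.0      # acceleration suggesting a possible fall

        def should_transmit(seconds_since_last, peak_accel_g, left_geofence, button_pressed):
            return (seconds_since_last >= TRANSMIT_INTERVAL_S   # periodic transmission
                    or peak_accel_g >= FALL_ACCEL_THRESHOLD_G   # possible fall detected
                    or left_geofence                            # device left a certain location
                    or button_pressed)                          # user initiated a voice call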
  • the wearable device 12a may process the datasets, prior to providing the datasets to the distributed cloud computing system 14. For example, the wearable device 12a may process motion and/or orientation data to make an initial determination as to whether a user event (e.g., a fall or some other accident) has occurred.
  • the distributed cloud computing system 14 may further process the datasets, in addition to the processing performed by the wearable device 12a.
  • the wearable device 12a may provide the datasets to the distributed cloud computing system 14 without first processing the datasets, and may allow the distributed cloud computing system 14 to process the datasets.
  • the distributed cloud computing system 14 may have more processing power (e.g., more CPUs) and may be better able to process and/or analyze the datasets than the wearable device 12a.
  • the wearable device 12a may establish a voice call with a call center (as discussed above in conjunction with FIG. 1).
  • the voice call may be initiated by the user of the wearable device or may be initiated by a remote operator in the call center.
  • the wearable device 12a may receive a request for a list of one or more devices that are visible to the wearable device 12a.
  • the wearable device 12a may receive a request for the list via the wireless protocol transceiver 50 (via a first communication interface).
  • the request for the list may include one or more instructions that may cause the wearable device 12a to activate the Bluetooth transceiver 54 (e.g., to activate a second communication interface).
  • the wearable device 12a may use the Bluetooth transceiver 54 (e.g., the second communication interface) to pair with additional devices (e.g., with health sensors and/or other computing devices).
  • the wearable device 12a may use the Bluetooth transceiver 54 to scan for devices (e.g., Bluetooth capable devices) that are visible to the Bluetooth transceiver 54 (e.g., that are within range of the Bluetooth transceiver 54).
  • the wearable device 12a may receive instructions from a server (in a distributed cloud computing system) to scan for devices.
  • the wearable device 12a may perform the scan and may store and/or record data about one or more devices that are visible to the wearable device 12a.
  • the wearable device 12a may store and/or record identifiers (e.g., alphanumeric values, numerical values, serial number, a MAC address, a name, etc.) of the one or more devices.
  • the wearable device 12a may transmit the list of the one or more devices that are visible, to the server.
  • the wearable device 12a may transmit a list of identifiers to the server using the wireless protocol transceiver 50.
  • the wearable device 12a may receive one or more instructions to pair with a first device from the list of one or more devices.
  • the wearable device 12a may receive an identifier that may be used to identify the first device from the one or more devices that are visible to the wearable device 12a.
  • the wearable device 12a may also receive commands, operations, access codes, passwords, etc., that the wearable device 12a may use to pair with the first device (e.g., pair with a health sensor).
  • the wearable device 12a may pair with the first device using these instructions (e.g., using the identifiers, commands, operations, access codes, and/or passwords, etc.).
  • the wearable device 12 may use the Bluetooth transceiver 54 to pair with the first device (e.g., to pair with the health sensor). After the first device is paired with the wearable device 12a (e.g., after a communication channel is established with the first device), the wearable device 12a may transmit a confirmation (e.g., a message, information and/or other data) to indicate that the wearable device 12a has successfully paired with the first device (e.g., to indicate that a Bluetooth pairing with a health sensor was successful).
  • the wearable device may forward data received from the additional device to the distributed cloud computing system 14 and vice versa.
  • the wearable device 12a may operate as a tunnel for data that is transmitted between the distributed cloud computing system 14 and the paired device (e.g., the health sensor), as discussed below in conjunction with FIG. 3.
  • FIG. 3 is block diagram illustrating an example system architecture 300, according to another embodiment of the present disclosure.
  • the system architecture 300 includes a distributed cloud computing system 14, a wearable device 12a, and a health sensor 18a.
  • the wearable device 12a may be paired with the health sensor 18a.
  • the wearable device 12a may be a Bluetooth capable device that is paired with the health sensor 18a which is also a Bluetooth capable device.
  • a tunnel 305 (e.g., a logical communication channel) may be created or formed between the distributed cloud computing system 14 and the health sensor 18a when the health sensor 18a is paired with the wearable device 12a.
  • the tunnel 305 may allow the health sensor 18a to communicate data to the distributed cloud computing system 14 and vice versa.
  • the tunnel 305 may communicate data between the distributed cloud computing system 14 and the health sensor 18a via the wearable device 12a.
  • the wearable device 12a may operate as a proxy and may forward data from the distributed cloud computing system 14 to the health sensor 18a and vice versa. This may allow the wearable device 12a to operate with many different types of devices without having to process and/or understand the data that is provided by the different types of devices.
  • the health sensor 18a may be a new type of health sensor.
  • the user may want to pair the health sensor 18a with the wearable device 12a.
  • the wearable device 12a may not be able to process and/or analyze the type of data that the health sensor 18a generates.
  • the health sensor 18a may be a glucometer and the wearable device 12a may not be able to process data (e.g., glucose level measurements, sensor readings, etc.) from the health sensor 18a.
  • the data from the health sensor 18a may be forwarded by the wearable device to the distributed cloud computing system 14 via the tunnel 305 (e.g., the wearable device 12a may forward the data to the distributed cloud computing system 14).
  • the distributed cloud computing system 14 may process the data received from the health sensor 18a via the tunnel 305 (e.g., via the wearable device 12a) to determine the health or state of the user (e.g., to determine the glucose levels of the user). This allows the wearable device 12a to operate with the health sensor 18a without the need for the wearable device 12a to understand the data (e.g., commands, sensor readings, operations, functions, protocols) of the health sensor 18a.
  • the tunnel 305 may also be used to provide data (e.g., information, commands, instructions, etc.) directly to the health sensor 18a instead of providing the data to the wearable device 12a.
  • the distributed cloud computing system 14 may send a command to perform a measurement directly to the health sensor 18a via the tunnel 305 (e.g., via the wearable device 12a).
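  • The pass-through behavior described above might be organized as in the sketch below, where the wearable device relays payloads in both directions without interpreting them; the class and method names are hypothetical.

        # Illustrative pass-through forwarding by the wearable device (tunnel/proxy).
        # The wearable device never interprets the sensor's payload; it only relays
        # raw bytes between its two links. Names are hypothetical.
        class WearableTunnel:
            def __init__(self, cloud_link, sensor_link):
                self.cloud_link = cloud_link      # e.g., the cellular interface to the cloud
                self.sensor_link = sensor_link    # e.g., the Bluetooth link to the health sensor

            def forward_from_sensor(self, payload):
                """Relay sensor data (e.g., glucose readings) to the cloud unchanged."""
                self.cloud_link.send(payload)

            def forward_from_cloud(self, payload):
                """Relay data (e.g., a command to take a measurement) to the sensor unchanged."""
                self.sensor_link.send(payload)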
  • FIGS. 4-5 are flow diagrams illustrating methods of pairing devices.
  • the methods are depicted and described as a series of acts.
  • acts in accordance with this disclosure can occur in various orders and/or concurrently and with other acts not presented and described herein.
  • not all illustrated acts may be required to implement the methods in accordance with the disclosed subject matter.
  • the methods could alternatively be represented as a series of interrelated states via a state diagram or events.
  • FIG. 4 is a flow diagram illustrating a method 400 of pairing a first device with a computing device, according to one embodiment of the present disclosure.
  • the method 400 may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processor to perform hardware simulation), or a combination thereof.
  • method 400 may be performed by a pairing module, as illustrated in FIG. 1.
  • the method 400 begins at block 405 where the processing logic receives user input indicating a pairing request for the computing device.
  • the computing device may be a wearable device (as illustrated in FIGS. 1 and 2).
  • the processing logic may receive user input from a remote operator indicating that a computing device should be paired with one or more other devices.
  • the processing logic may request a list of one or more devices that are visible to the computing device.
  • the processing logic may transmit instructions to the computing device that cause the computing device to activate or power on a first communication interface (e.g., a Bluetooth communication interface) and to scan for devices (e.g., scan for Bluetooth devices near the computing device).
  • the processing logic may transmit these instructions to the computing device using a second communication interface (e.g., using a cellular communication interface).
  • the processing logic may receive a list of devices that are visible to the computing device from the computing device.
  • the processing logic may receive one or more identifiers (e.g., MAC address, a name, etc.) for the devices that are visible to the computing device.
  • the processing logic may receive user input identifying a first device from the list of devices that are visible to the computing device. For example, the processing logic may receive user input from the remote operator selecting a first device from the list of devices that are visible to the computing device.
  • the first device may be a health sensor (e.g., a scale, a glucometer, a thermometer, etc.).
  • the processing logic may obtain the instructions for pairing the first device with the computing device. For example, the processing logic may look up access codes, commands, passwords, etc., for pairing the first device with the computing device using an identifier for the first device. The processing logic may transmit these instructions to the computing device using the second communication interface.
  • the processing logic may optionally receive a confirmation (e.g., a message and/or other data) from the computing device indicating that the computing device has successfully paired with the first device.
  • the processing logic may communicate data (e.g., transmit and/or receive) between the server and the first device via a tunnel formed by the computing device (as discussed above in conjunction with FIG. 2). After block 435, the method 400 ends.
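  • A straight-line rendering of the server-side flow of method 400 might look like the sketch below; the transport object and helper functions are hypothetical placeholders rather than an implementation defined by this disclosure.

        # Illustrative server-side pairing flow (method 400).
        # `transport`, `operator_select`, and `lookup_pairing_instructions` are
        # hypothetical placeholders.
        def pair_first_device(transport, operator_select, lookup_pairing_instructions):
            # A pairing request has been received (e.g., user input from the remote operator).
            # Ask the computing device to activate its first interface and scan for devices.
            transport.send({"action": "activate_and_scan", "interface": "bluetooth"})
            # Receive the list of visible devices (identifiers such as a MAC address or name).
            visible = transport.receive().get("devices", [])
            # The remote operator (or automation) selects the first device from the list.
            first_device = operator_select(visible)
            # Look up the pairing instructions for that device and send them to the computing device.
            instructions = lookup_pairing_instructions(first_device)
            transport.send({"action": "pair", **instructions})
            # Optionally receive a confirmation; afterwards data can be exchanged with the
            # first device via the tunnel formed by the computing device.
            confirmation = transport.receive()
            return confirmation.get("success", False)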
  • FIG. 5 is a flow diagram illustrating a method 500 of pairing a first device with a computing device, according to another embodiment of the present disclosure.
  • the method 500 may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processor to perform hardware simulation), or a combination thereof.
  • method 500 may be performed by a processing module, as illustrated in FIG. 2.
  • the method 500 begins at block 505 where the processing logic receives a request for a list of devices that are visible to the computing device.
  • the computing device may be a wearable device (as illustrated in FIGS. 1 and 2).
  • the processing logic may receive this request using a first communication interface (e.g., a cellular communication interface).
  • the request may include instructions that cause and/or instruct the processing logic to activate or power on a second communication interface (e.g., a Bluetooth communication interface).
  • the processing logic may scan for devices (e.g., scan for Bluetooth devices near the computing device).
  • the processing logic may transmit a list of devices that are visible to the computing device to a server (e.g., a server in a distributed cloud computing system). For example, the processing logic may transmit one or more identifiers (e.g., MAC address, a name, etc.) for the devices that are visible to the computing device.
  • the processing logic may receive instructions for pairing the first device with the computing device. For example, the processing logic may receive an identifier, access codes, commands, passwords, etc., for pairing the first device with the computing device. The processing logic may receive these instructions using the first communication interface. At block 525, the processing logic may pair the computing device with the first device based on the instructions for pairing. At block 530, the processing logic may optionally transmit a confirmation (e.g., a message and/or other data) indicating that the computing device has successfully paired with the first device.
  • the processing logic may communicate data (e.g., transmit and/or receive) from the first device (e.g., a health sensor) to the server and vice versa (as discussed above in conjunction with FIG. 2).
  • the computing device may serve as a tunnel between the first device and the server and may forward data between the server and the first device (as discussed above in conjunction with FIG. 3).
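  • A corresponding device-side rendering of method 500 is sketched below; the interface objects and their methods are hypothetical placeholders.

        # Illustrative device-side pairing flow (method 500).
        # `cellular` and `bluetooth` are hypothetical interface objects.
        def handle_pairing_request(cellular, bluetooth):
            # Receive the request over the first (cellular) interface; it instructs the
            # device to power on the second (Bluetooth) interface and scan.
            request = cellular.receive()
            if request.get("action") == "activate_and_scan":
                bluetooth.power_on()
            # Scan for nearby devices and report their identifiers to the server.
            visible = bluetooth.scan()
            cellular.send({"devices": [device["identifier"] for device in visible]})
            # Receive the pairing instructions (identifier, access codes, commands, ...).
            instructions = cellular.receive()
            paired = bluetooth.pair(instructions["device_id"],
                                    access_code=instructions.get("access_code"))
            # Optionally confirm the result; afterwards the device can forward data between
            # the paired sensor and the server (acting as a tunnel).
            cellular.send({"paired": paired})
            return paired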
  • FIG. 6 illustrates a diagrammatic representation of a machine in the example form of a computing device 600 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • the computing device 600 may be a mobile phone, a smart phone, a netbook computer, a rackmount server, a router computer, a server computer, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a desktop computer, etc., within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • the machine may be connected (e.g., networked) to other machines in a LAN, an intranet, an extranet, or the Internet.
  • the machine may operate in the capacity of a server machine in client-server network environment.
  • the machine may be a personal computer (PC), a set-top box (STB), a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • PC personal computer
  • STB set-top box
  • server a server
  • a network router, switch or bridge or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term "machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • the example computing device may be a server (in a distributed cloud computing system) and/or may be
  • the example computing device 600 includes a processing device (e.g., a processor) 602, a main memory 604 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 606 (e.g., flash memory, static random access memory (SRAM)) and a data storage device 618, which communicate with each other via a bus 630.
  • a processing device e.g., a processor
  • main memory 604 e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)
  • DRAM dynamic random access memory
  • SDRAM synchronous DRAM
  • static memory 606 e.g., flash memory, static random access memory (SRAM)
  • SRAM static random access memory
  • Processing device 602 represents one or more general -purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 602 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 602 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 602 is configured to execute instructions 626 for performing the operations and steps discussed herein.
  • CISC complex instruction set computing
  • RISC reduced instruction set computing
  • VLIW very long instruction word
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • DSP digital signal processor
  • network processor or the like.
  • the processing device 602 is configured to execute instructions 626 for performing the
  • the computing device 600 may further include a network interface device 608 which may communicate with a network 620.
  • the computing device 600 also may include a video display unit 610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 612 (e.g., a keyboard), a cursor control device 614 (e.g., a mouse) and a signal generation device 616 (e.g., a speaker).
  • the video display unit 610, the alphanumeric input device 612, and the cursor control device 614 may be combined into a single component or device (e.g., an LCD touch screen).
  • the data storage device 618 may include a computer-readable storage medium 628 on which is stored one or more sets of instructions (e.g., instructions for a pairing module) embodying any one or more of the methodologies or functions described herein.
  • the instructions 626 may also reside, completely or at least partially, within the main memory 604 and/or within the processing device 602 during execution thereof by the computing device 600, the main memory 604 and the processing device 602 also constituting computer-readable media.
  • the instructions may further be transmitted or received over a network 620 via the network interface device 608.
  • computer-readable storage medium 628 is shown in an example embodiment to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • the term “computer- readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media and magnetic media.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Telephonic Communication Services (AREA)
  • Critical Care (AREA)
  • Emergency Management (AREA)
  • Emergency Medicine (AREA)
  • Nursing (AREA)
  • Telephone Function (AREA)

Abstract

A voice call is established using a computing device. A user is instructed to prepare a first device for pairing with the computing device and a server may instruct the computing device to scan for additional devices. The server may identify a first device from a list of devices and may instruct the computing device to pair with the first device.

Description

MANAGEMENT, CONTROL AND COMMUNICATION WITH SENSORS
TECHNICAL FIELD
[0001] Embodiments of the present invention relate generally to health care-based monitoring systems, and more specifically, to mechanisms for managing, controlling, and communicating data between devices.
BACKGROUND
[0002] For certain age groups, such as the elderly, or people that engage in certain dangerous activities, such as firefighters and soldiers, it is desirable to track and understand human activity automatically. For example, a person that has fallen may be injured, unconscious, etc., and needs emergency assistance. In such circumstances, relying on the person to initiate a call to a public safety access point (PSAP) (e.g., 9-1-1 emergency services, an automated emergency call center, etc.) is not practical. Moreover, even if the person is capable of placing the call, the PSAP may be located outside the geographical jurisdiction for providing emergency services. An emergency services person located at a PSAP may need to manually place a second call to the local fire station, police, or Emergency Medical Services (EMS) squad, thereby wasting precious time that could be used to save the person's life. Further, if the person is unconscious, they would not be able to relate the nature of their injuries nor their physical location.
[0003] A wearable device may be worn by the user and the wearable device may monitor the activities and/or health of the user using a variety of sensors and/or components (e.g., GPS units, a blood pressure unit, an accelerometer, etc.). The wearable device may also provide a simple interface (e.g., a single button) to allow a user to initiate a voice call (e.g., to request help).
However, these simplified interfaces (e.g., the single button) may not allow a user to choose a destination for the voice call. The wearable device may be configured to call a single destination (e.g., a PSAP) in response to a user request (e.g., in response to the user pushing the button) and may not be able to initiate voice calls to other destinations in response to the user request. The wearable device may use various other devices (e.g., health sensors). For example, the wearable device may use a blood pressure sensor, a thermometer, a weight sensor (e.g., a scale), etc., to monitor the condition, health and/or state of the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Embodiments of the present invention will be more readily understood from the detailed description of exemplary embodiments presented below considered in conjunction with the attached drawings.
[0005] FIG. 1 is a block diagram illustrating one embodiment of a system for detecting a predefined user state, according to one embodiment of the present disclosure.
[0006] FIG. 2 is a block diagram illustrating one embodiment of a wearable device, according to one embodiment of the present disclosure.
[0007] FIG. 3 is a block diagram illustrating an example system architecture, according to another embodiment of the present disclosure.
[0008] FIG. 4 is a flow diagram illustrating a method of pairing a first device with a computing device, according to one embodiment of the present disclosure.
[0009] FIG. 5 is a flow diagram illustrating a method of pairing a first device with a computing device, according to another embodiment of the present disclosure.
[0010] FIG. 6 is a block diagram of an exemplary computer system that may perform one or more of the operations described herein, according to one embodiment of the present disclosure.
DETAILED DESCRIPTION
[0012] As discussed above, a wearable device may use various other devices (e.g., health sensors). For example, the wearable device may use a blood pressure sensor, a thermometer, a weight sensor (e.g., a scale), etc., to monitor the condition, health and/or state of the user. Users may not be familiar with the procedures, operations, commands, etc., that may be needed to pair the wearable device with each of the other devices. For example, each type of health sensor may have a different procedure and/or passwords for pairing the health sensor with the wearable device.
[0013] Embodiments of the invention provide mechanisms for pairing devices such that the devices may communicate data. A wearable device may be paired with other devices (e.g., health sensors). A user may use the wearable device to establish a voice call with a remote operator in a call center. The remote operator may instruct the user to prepare a first device for pairing with the wearable device. After the user has prepared the device for pairing, the remote operator may use a server to pair the wearable device with the first device. The server may instruct the wearable device to activate/power up a communication interface and to scan for devices. The wearable device may provide a list of devices that are visible to the wearable device. The server may receive input identifying the first device from the list of devices and may provide instructions to the wearable device for pairing the first device with the wearable device. Some embodiments may allow the server to pair the wearable device with the first device automatically (e.g., without input and/or intervention from the user). This may allow users to pair a wearable device with other devices (e.g., health sensors) more quickly and efficiently.
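For illustration only, the following Python sketch outlines one way the server-side portion of this pairing flow could be organized, assuming a hypothetical PairingService class whose transport object sends commands to the wearable device over the cellular link and whose pairing_db maps device models to stored pairing recipes; none of these names correspond to an actual interface of the system.

    # Hypothetical sketch of the server-side pairing flow described above.
    class PairingService:
        def __init__(self, transport, pairing_db):
            self.transport = transport      # sends/receives messages over the cellular link
            self.pairing_db = pairing_db    # maps device models to stored pairing recipes

        def pair(self, wearable_id, operator_selection=None):
            # 1. Instruct the wearable to power on Bluetooth and scan for devices.
            self.transport.send_command(wearable_id, {"cmd": "enable_bt_and_scan"})
            # 2. Receive the list of visible devices reported by the wearable.
            visible = self.transport.receive(wearable_id)["devices"]
            # 3. Use the operator's selection, or fall back to the first supported sensor.
            target = operator_selection or next(
                (d for d in visible if d.get("model") in self.pairing_db), None)
            if target is None:
                return False
            # 4. Send the stored pairing recipe (identifier, PIN, commands) to the wearable.
            recipe = self.pairing_db[target["model"]]
            self.transport.send_command(
                wearable_id, {"cmd": "pair", "peer": target["mac"], **recipe})
            # 5. Wait for the wearable's confirmation that pairing succeeded.
            return bool(self.transport.receive(wearable_id).get("paired"))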
[0014] In one embodiment, the wearable device may operate as a tunnel and may forward data received from a paired device to the server and vice versa. This may allow the wearable device to pair with devices when the wearable device is not able to understand and/or process the data and/or information from these devices. The wearable device may forward the data to the server and the server may process the data, instead of having the wearable device process the data. The other devices (e.g., the other health sensors) may receive data from the server and may process the data (e.g., commands, etc.) from the server, instead of having the wearable device process the data.
[0015] FIG. 1 is a block diagram illustrating one embodiment of a system 10 for detecting a predefined user state. The system 10 includes wearable devices 12a-12n communicatively connected to a distributed cloud computing system 14. A wearable device 12 may be a small-size computing device that can be worn as a watch, a pendant, a ring, a pager, or the like, and can be held in any orientation.
[0016] In one embodiment, each of the wearable devices 12a-12n is operable to communicate with a corresponding one of users 16a-16n (e.g., via a microphone, speaker, and voice recognition software), external health sensors 18a-18n (e.g., an EKG, blood pressure device, weight scale, glucometer) via, for example, a short-range over the air (OTA) transmission method (e.g., Bluetooth, Wi-Fi, etc.), a call center 30, a first-to-answer system 32, and care giver and/or family member 34, and the distributed cloud computing system 14 via, for example, a long range OTA transmission method (e.g., over a 3rd Generation (3G) or 4th Generation (4G) cellular transmission network 20, such as a Long Term Evolution (LTE) network, a Code Division Multiple Access (CDMA) network, etc.).
[0017] Each wearable device 12 is configured to detect a predefined state of a user. The predefined state may include a user physical state (e.g., a user fall inside or outside a building, a user fall from a bicycle, a car incident involving a user, a user taking a shower, etc.) or an emotional state (e.g., a user screaming, a user crying, etc.). As will be discussed in more detail below, the wearable device 12 may include multiple sensors for detecting a predefined user state. For example, the wearable user device 12 may include an accelerometer for measuring an acceleration of the user, a magnetometer for measuring a magnetic field associated with the user's change of orientation, a gyroscope for providing a more precise determination of orientation of the user, and a microphone for receiving audio. Based on data received from the above sensors, the wearable device 12 may identify a suspected user state, and then categorize the suspected user state as an activity of daily life, a confirmed predefined user state, or an inconclusive event. The wearable user device 12 may then communicate with the distributed cloud computing system 14 to obtain a re-confirmation or change of classification from the distributed cloud computing system 14. In another embodiment, the wearable user device 12 transmits data provided by the sensors to the distributed cloud computing system 14, which then determines a user state based on this data.
[0018] In one embodiment, the wearable device 12 includes a low-power processor (e.g., a low-power processing device) to process data received from sensors and/or detect anomalous sensor inputs. The low-power processor may cause a second processing device to further analyze the sensor inputs (e.g., may wake up a main CPU). If the second processing device determines that there is possibly an anomalous event in progress, the second processing device may send a dataset to the distributed cloud computing system 14. In one embodiment, if the distributed cloud computing system 14 concludes there is an anomalous event, the distributed cloud computing system 14 may instruct the wearable device 12 to initiate a voice call.
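For illustration only, the following Python sketch shows one way the two-stage escalation described above (low-power processor, then main processor, then the cloud) might be expressed; the threshold value and the read_accel_g, wake_main_cpu, and send_dataset_to_cloud callables are assumptions made for the example and do not correspond to an actual interface of the device.

    # Hypothetical sketch: escalate suspicious accelerometer readings in two stages.
    FALL_G_THRESHOLD = 2.5  # assumed acceleration magnitude (in g) treated as suspicious

    def low_power_loop(read_accel_g, wake_main_cpu):
        # Stage 1: the low-power processor only checks for anomalous samples.
        sample = read_accel_g()
        if abs(sample) > FALL_G_THRESHOLD:
            wake_main_cpu(sample)

    def main_cpu_analyze(sample, orientation, send_dataset_to_cloud):
        # Stage 2: the main processor adds context and, if the event still looks
        # anomalous, uploads a dataset for the cloud to classify.
        if abs(sample) > FALL_G_THRESHOLD:
            send_dataset_to_cloud({"accel_g": sample, "orientation": orientation})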
[0019] In one embodiment, the wearable user device 12 may also obtain audio data from one or more microphones on the wearable device 12. For example, the wearable user device 12 may record the user's voice and/or sounds which are captured by the one or more microphones, and may provide the recorded sounds and/or voice to the distributed cloud computing system 14 for processing (e.g., for voice or speech recognition).
[0020] In one embodiment, the wearable devices 12a-12n may continually or periodically gather/obtain data from the sensors and/or the one or more microphones (e.g., gather/obtain datasets and audio data) and the wearable devices 12a-12n may transmit these datasets to the distributed cloud computing system 14. The datasets may be transmitted to the distributed cloud computing system 14 at periodic intervals, or when a particular event occurs (e.g., user pushes a button on the wearable device 12a-12n or a fall is detected). In one embodiment, the datasets may include data indicative of measurements or information obtained by the sensors which may be within or coupled to the wearable device 12a-12n. For example, the datasets may include temperature readings (e.g., 98.5 degrees Fahrenheit), measurements obtained from an accelerometer (e.g., a rate of acceleration), a GPS location (e.g., GPS or longitude/latitude coordinates), etc. In one embodiment, the wearable device 12a-12n may transmit a dataset per sensor (e.g., one dataset for the accelerometer, one dataset for an aGPS receiver, etc.). In another embodiment, the wearable device 12a-12n may combine data received from multiple sensors into a dataset.
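For illustration only, a dataset of the kind described above might be serialized as in the following Python sketch; the field names and values are invented for the example and are not part of any embodiment.

    # Hypothetical sketch of per-sensor and combined dataset payloads.
    import json
    import time

    def per_sensor_dataset(device_id, sensor, reading):
        return {"device_id": device_id, "sensor": sensor,
                "reading": reading, "timestamp": time.time()}

    def combined_dataset(device_id, readings):
        return {"device_id": device_id, "readings": readings,
                "timestamp": time.time()}

    if __name__ == "__main__":
        print(json.dumps(per_sensor_dataset("12a", "temperature_f", 98.5)))
        print(json.dumps(combined_dataset("12a", {"accel_g": 0.2,
                                                  "gps": [37.77, -122.42]})))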
[0021] Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. The term "cloud" refers to one or more computational services (e.g., servers) connected by a computer network.
[0022] The distributed cloud computing system 14 may include one or more computers configured as a telephony server 22 communicatively connected to the wearable devices 12a-12n, the Internet 24, and one or more cellular communication networks 20, including, for example, the public circuit-switched telephone network (PSTN) 26. The distributed cloud computing system 14 may further include one or more computers configured as a Web server 28 communicatively connected to the Internet 24 for permitting each of the users 16a-16n to communicate with a call center 30, first-to-answer systems 32, and care givers and/or family 34. The web server 28 may also provide an interface for users to interact with the distributed cloud computing system 14 (e.g., to access their accounts, profiles, or subscriptions, to access stored datasets and/or audio data, etc.). The distributed cloud computing system 14 may further include one or more computers configured as a real-time data monitoring and computation server 36 communicatively connected to the wearable devices 12a-12n for receiving measurement data (e.g., datasets), for processing measurement data to draw conclusions concerning a potential predefined user state, for transmitting user state confirmation results and other commands back to the wearable devices 12a-12n, for storing and retrieving present and past historical predefined user state data from a database 37 which may be employed in the user state confirmation process, and for retraining further optimized and individualized classifiers that can in turn be transmitted to the wearable device 12a-12n. In one embodiment, the web server 28 may store and retrieve present and past historical predefined user state data, instead of the real-time data monitoring and computation server 36 or the database 37.
[0023] In one embodiment, the wearable devices 12a-12n may include a button, which a user 16 may use to initiate voice calls. For example, a user 16a may push the button on the device 12a to initiate a voice call in order to obtain assistance or help (e.g., because the user has slipped or fallen, or because the user requires medical assistance). As discussed above, the wearable devices 12a-12n may periodically transmit datasets to the distributed cloud computing system 14. In one embodiment, the wearable devices 12a-12n may also transmit datasets to the distributed cloud computing system 14 when the user presses or pushes the button on the wearable devices 12a-12n. In one embodiment, the wearable devices 12a-12n may be single-button devices (e.g., devices which only have one button) which provide a simplified interface to users.
[0024] In one embodiment, the distributed cloud computing system 14 may receive a request from the wearable device 12a-12n to initiate the voice call. The distributed cloud computing system 14 may also receive datasets from the wearable device 12a-12n associated with an event experienced by the user. After receiving the request to initiate the voice call, the distributed cloud computing system 14 may analyze the datasets to determine whether the event experienced by the user is an activity of daily life (ADL), a confirmed fall, or an inconclusive event. In another embodiment, the distributed cloud computing system 14 may identify a destination for routing the voice call, based on the analysis of the datasets. For example, if the distributed cloud computing system 14 analyzes the datasets and determines that the event is a confirmed fall, the distributed cloud computing system 14 may identify a first-to-answer system 32 (e.g., a 911 or emergency response call center) as the destination for the voice call. In another example, if the distributed cloud computing system 14 analyzes the datasets and is unable to determine what event occurred (e.g., an inconclusive event), the distributed cloud computing system 14 may identify a family member 34 as the destination for the voice call. After identifying a destination for the voice call, the distributed cloud computing system 14 routes the voice call to the identified destination.
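For illustration only, the destination-selection logic described above might look like the following Python sketch; the classification labels and the contact keys are assumptions made for the example.

    # Hypothetical sketch: choose a call destination from the event classification.
    def route_voice_call(classification, contacts):
        # contacts example: {"first_to_answer": "...", "family": "...", "call_center": "..."}
        if classification == "confirmed_fall":
            return contacts["first_to_answer"]   # e.g., a 911/emergency response call center
        if classification == "inconclusive":
            return contacts["family"]            # e.g., a care giver or family member
        return contacts["call_center"]           # activity of daily life, non-emergency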
[0025] In one embodiment, the distributed cloud computing system 14 may also analyze audio data received from a wearable device 12 to determine what event has happened to a user. For example, the wearable device 12 may provide audio data (e.g., a recording of the user's voice or other sounds) to the distributed cloud computing system 14. The distributed cloud computing system 14 may analyze the sound data and may determine that a user is asking for help (e.g., based on the user's words in the recording). The distributed cloud computing system 14 may identify a destination for the voice call, based on the audio data and/or the datasets received from the wearable device 12 and may route the voice call to the identified destination. The audio data may be used in conjunction with the datasets to identify a destination for routing the voice call.
[0026] In one embodiment, the distributed cloud computing system 14 may monitor the status of the voice call, after it routes the voice call to the identified destination. For example, the distributed cloud computing system 14 may route the voice call to a first destination and may route the call to a second destination if the call is not answered by the first destination. In another embodiment, the distributed cloud computing system 14 may also use subscription data (e.g., information associated with a user's account or subscription to a service) to identify destinations for routing the voice call. For example, the subscription data may include a list and/or an order for destinations where the voice call should be routed. In a further embodiment, the distributed cloud computing system 14 may also use a time of day and/or a geographic location to identify destinations for routing a voice call. For example, based on certain times of day and/or based on the location of the wearable device, the distributed cloud computing system 14 may route the voice call to different destinations.
[0027] The distributed cloud computing system 14 also includes a pairing module 27. The pairing module 27 may allow the distributed cloud computing system 14 to pair health sensors (e.g., health sensor 18a) with wearable devices (e.g., wearable device 12a). Pairing a health sensor to a wearable device may refer to establishing one or more communication channels between the health sensor and the wearable device (e.g., a physical communication channel or a logical communication channel). Pairing a health sensor to a wearable device may also refer to storing an identifier for the wearable device on the health sensor and/or storing an identifier for the health sensor on the wearable device. For example, the wearable device 12a may store an identifier for the health sensor 18a to indicate that the wearable device 12a is allowed to create a communication channel with the health sensor 18a and to communicate data with the health sensor 18a. Pairing a health sensor to a wearable device allows the health sensor and the wearable device to communicate data between the health sensor and the wearable device (e.g., sensor data, messages, information, etc.) using the one or more communication channels. One example of pairing may be pairing two Bluetooth capable devices (e.g., Bluetooth pairing). Although some embodiments described herein may refer to Bluetooth and/or Bluetooth pairing, other embodiments may use different communication channels, different communication protocols, and/or different communication interfaces to pair a health sensor to a wearable device. For example, the wearable device may be paired with the health sensor using protocols and/or channels such as ZigBee, Z-Wave, RuBee, 802.15, 802.11, transmission control protocol/internet protocol (TCP/IP), user datagram protocol (UDP), and/or other communication protocols.
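For illustration only, a pairing of the kind described above could be recorded as in the following Python sketch, in which each side stores the other's identifier along with the protocol used; the field names are assumptions made for the example.

    # Hypothetical sketch of a stored pairing record.
    from dataclasses import dataclass

    @dataclass
    class PairingRecord:
        local_id: str     # identifier of the wearable device (e.g., a MAC address or name)
        peer_id: str      # identifier of the paired health sensor
        protocol: str     # "bluetooth", "zigbee", "z-wave", "tcp/ip", "udp", ...
        link_key: bytes   # secret established during pairing, if the protocol uses one

    def is_paired_with(records, peer_id):
        return any(r.peer_id == peer_id for r in records)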
[0028] As discussed above, a user (e.g., user 16a) may acquire additional devices, such as medical sensors, to monitor the state, condition, and/or health of the user. For example, the user may request and/or order an additional health sensor (e.g., a weight scale). In another example, a caregiver or a doctor of the user may request and/or order the additional health sensor.
Examples of health sensors include, but are not limited to, an EKG, a blood pressure sensor, a weight scale, an inertial measurement unit, a pressure sensor for measuring air pressure or altitude, a heart rate sensor, a blood perfusion sensor, a temperature sensor (e.g., a thermometer), a glucose level sensor (e.g., a glucometer), etc. When the user receives the health sensor (or other device), the health sensor may be paired with a wearable device in order to monitor the health, condition, and/or state of the user. The user may be unfamiliar with the process, procedures, and/or methods for pairing the health sensor with the wearable device (e.g., for establishing or setting up one or more communication channels between the health sensor and the wearable device).
[0029] The user may place a voice call to a caregiver, the call center 30 (e.g., a remote person or operator in the call center), etc., using the wearable device. For example, the wearable device may include a cellular communication interface (e.g., a communication interface used to communicate with a cellular network) and the user may place a voice call to the remote operator of the call center 30 using the cellular communication interface. A voice call may also refer to a voice-over-IP (VOIP) call. The user may indicate during the voice call that the user wishes to pair the health sensor (e.g., 18a) to the wearable device (e.g., 12a) of the user (e.g., 16a). The remote operator may provide instructions to the user (e.g., verbal instructions) to instruct the user to prepare the health sensor for pairing. For example, the remote operator may instruct the user to power or turn on the health sensor, or may instruct the user to activate a particular switch or button on the health sensor. After the user indicates that the health sensor is prepared for pairing, the remote operator may use the pairing module 27 to pair the health sensor with the wearable device. In one embodiment, the remote operator may provide user input to the pairing module 27 to indicate that a wearable device should be paired with a device (e.g., a health sensor). For example, the pairing module 27 may receive input from the remote operator indicating that the user of a wearable device wishes to pair the wearable device with an additional device (e.g., with a health sensor). The remote operator may provide the user input after speaking with the user via a voice call (as discussed above). In one embodiment, an automated system (e.g., an automated answering system) may instruct the user to prepare the health sensor for pairing. The user may provide input to the automated system (e.g., via button presses) to indicate that the health sensor is ready for pairing and the distributed cloud computing system 14 may initiate the pairing process as described above.
[0030] In one embodiment, pairing module 27 may transmit one or more instructions to the wearable device (e.g., 12a) to activate the communication interface that may be used to pair with the health sensor (e.g., 18a). For example, the pairing module 27 may transmit instructions to the wearable device 12a instructing the wearable device 12a to turn on and/or activate a Bluetooth communication interface. The pairing module may use a second communication interface of the wearable device to transmit the one or more instructions to the wearable device. For example, the pairing module 27 may use a cellular communication interface (e.g., a 3G communication interface, a 4G communication interface, etc.) to transmit the instructions to the wearable device. The wearable device may turn on the first communication interface (e.g., turn on Bluetooth) based on the instructions.
[0031] In another embodiment, the pairing module 27 may send one or more additional instructions instructing the wearable device to scan for devices (e.g., sensors, other computing devices, etc.) that are visible to the wearable device. For example, the pairing module 27 may instruct the wearable device 12a to perform a scan for Bluetooth devices that are visible to the wearable device 12a. One or more devices (e.g., health sensors or other devices) may be visible to a wearable device when the wearable device is able to communicate with the one or more devices. For example, a device (e.g., health sensor 18a) may be visible to the wearable device (and vice versa) if the wearable device is able to transmit data to and/or receive data from the device. One or more devices may also be visible to the wearable device if the one or more devices are within a range of a communication interface of the wearable device (e.g., within 10 feet, 100 feet, 1 mile, etc.). The pairing module 27 may receive a list of one or more devices that are visible to the wearable device from the wearable device. For example, the wearable device may transmit information such as the model numbers, types, and/or identifiers of the one or more devices that are visible to the wearable device, to the pairing module 27.
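For illustration only, the scan-and-report step described above could be written as in the following Python sketch, where bt_scan and cellular_send stand in for whatever radio and modem drivers a real wearable device would expose; they are assumptions, not real APIs.

    # Hypothetical sketch: scan for visible devices and report them to the server.
    def report_visible_devices(bt_scan, cellular_send, server_url):
        visible = bt_scan()  # e.g., [{"mac": "AA:BB:CC:DD:EE:FF", "name": "BP Monitor"}, ...]
        payload = {"visible_devices": [
            {"mac": d["mac"], "name": d.get("name", "")} for d in visible]}
        cellular_send(server_url, payload)
        return payload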
[0032] In one embodiment, the pairing module 27 may analyze and/or process the list of one or more devices to identify a device that the wearable device is to be paired with. For example, the pairing module 27 may determine that multiple types of devices (e.g., a headset, a speaker system, a mouse, a keyboard, a weight scale, a blood pressure sensor, etc.) are visible to the wearable device. The pairing module 27 may identify one or more devices (e.g., one or more health sensors) that the wearable device should pair with. For example, the pairing module 27 may identify one or more health sensors that are supported by the distributed cloud computing system 14. The pairing module 27 may present the list of one or more health sensors that the wearable device should pair with, to the remote operator.
[0033] In one embodiment, the pairing module 27 may receive user input from the remote operator identifying a first device from the list of one or more devices. For example, as discussed above, the pairing module 27 may receive a list of devices that are visible to the wearable device. The list may include identifiers for the devices on the list of devices. The identifiers may include information such as model numbers, types of the devices (e.g., computer, laptop, health sensor, speaker, headset, etc.), and/or other data that may be used to identify the devices that are visible to the wearable device. The remote operator may select one of the devices by providing the identifier for the selected device to the pairing module. For example, selecting a device from the list of devices may provide the identifier for the selected device to the pairing module 27. In another example, the remote operator may provide the identifier to the pairing module 27 (e.g., may input the identifier using a GUI). The pairing module 27 may also select one of the devices without input from the remote operator based on the identifiers for the devices. For example, the pairing module 27 may automatically identify devices that are health sensors.
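For illustration only, the selection step described above might be implemented as in the following Python sketch; the SUPPORTED_SENSORS table and the model-matching rule are assumptions made for the example.

    # Hypothetical sketch: narrow the reported list to supported health sensors
    # and apply the operator's selection when one is provided.
    SUPPORTED_SENSORS = {"acme-scale-100", "acme-bp-200", "acme-gluco-300"}

    def candidate_sensors(visible_devices):
        return [d for d in visible_devices
                if d.get("model", "").lower() in SUPPORTED_SENSORS]

    def choose_device(visible_devices, operator_pick=None):
        if operator_pick:  # identifier (e.g., a MAC address) selected via a GUI
            return next((d for d in visible_devices if d["mac"] == operator_pick), None)
        candidates = candidate_sensors(visible_devices)
        return candidates[0] if candidates else None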
[0034] In one embodiment, the pairing module 27 may access the database 37 to obtain instructions for pairing the first device with the wearable device. For example, the pairing module 27 may access the database 37 to obtain instructions for pairing a weight sensor (e.g., a scale) with the wearable device 12a. The pairing module 27 may use an identifier for the selected device to obtain the instructions for pairing. For example, the pairing module 27 may use the model number, a name, a serial number, etc., of the selected device to obtain the instructions for pairing from the database 37. The instructions for pairing may include commands, operations, access codes, passwords, etc., that may be used to pair the first device with the wearable device. In one embodiment, the pairing module 27 may access the instructions based on the identifier for the first device (e.g., based on a model number, a type, a MAC address, a serial number, etc.). In one embodiment, the instructions may also include the identifier of the first device. This may allow the wearable device to identify which device the wearable device should pair with. In another embodiment, the instructions may cause the wearable device to automatically pair with the first device. For example, the instructions may cause the wearable device 12a to automatically pair with the health sensor 18a without user intervention or user input from the user 16a. After the wearable device is paired with the first device (e.g., with a health sensor), the pairing module 27 may receive a confirmation (e.g., data, a message, etc.) from the wearable device indicating that the wearable device is paired with the first device.
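For illustration only, the lookup of pairing instructions might resemble the following Python sketch; the table contents, model names, and field names are invented for the example.

    # Hypothetical sketch: look up pairing instructions by device model and attach
    # the peer identifier so the wearable knows which device to pair with.
    PAIRING_DB = {
        "acme-scale-100": {"pin": "0000", "commands": ["hold sync button for 3 seconds"]},
        "acme-bp-200":    {"pin": "1234", "commands": ["power on", "press the pair button"]},
    }

    def pairing_instructions(model, mac):
        recipe = PAIRING_DB.get(model)
        if recipe is None:
            return None
        return {"peer": mac, **recipe}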
[0035] In one embodiment, after the wearable device is paired with an additional device (e.g., after wearable device 12a is paired with health sensor 18a), the wearable device may forward data received from the additional device to the distributed cloud computing system 14 and vice versa. For example, the wearable device may operate as a tunnel for data that is transmitted between the distributed cloud computing system 14 and the health sensor 18a, as discussed below in conjunction with FIG. 3.
[0036] FIG. 2 is a block diagram illustrating one embodiment of a wearable device 12a (e.g., wearable device 12a shown in FIG. 1). The wearable device 12a may include a low-power processor 38 communicatively connected to an accelerometer 40 (e.g., a two- or more-axis accelerometer) for detecting acceleration events (e.g., high, low, positive, negative, oscillating, etc.), a magnetometer 42 (preferably a 3-axis magnetometer) for assessing an orientation of the wearable device 12a, and a gyroscope 44 for providing a more precise determination of orientation of the wearable device 12a. The low-power processor 38 is configured to receive continuous or near-continuous real-time measurement data from the accelerometer 40, the magnetometer 42, and the gyroscope 44 for rendering tentative decisions concerning predefined user states. By utilizing the above components, the wearable device 12 is able to render these decisions in a relatively computationally inexpensive, low-power manner and minimize false positive and false negative errors. A processing module 46, such as the 3G IEM 6270 manufactured by Qualcomm®, includes a high-computationally-powered microprocessor element and internal memory that are adapted to receive the suspected fall events from the low-power processor 38 and to further correlate orientation data received from the optional gyroscope 44 with digitized audio data received from one or more microphones 48 (e.g., micro-electro-mechanical systems-based (MEMS) microphone(s)). The processing module 46 may also be referred to as a processing device. The audio data may include the type, number, and frequency of sounds originating from the user's voice, the user's body, and the environment.
[0037] The processing module 46 is also configured to receive data/commands from and/or transmit data/commands to the distributed cloud computing system 14 via a 3G, 4G, and/or other wireless protocol transceiver 50 over the cellular transmission network 20. The wireless protocol transceiver 50 may also be referred to as a communication interface. A communication interface may be software, hardware, or a combination of both that may be used to communicate data (e.g., transmit and/or receive data) with another device (e.g., a server, another computing device, a health sensor, etc.). The processing module 46 is further configured to communicate with and receive position data from an aGPS receiver 52, and to receive measurements from the external health sensors (e.g., sensors 18a-18n shown in FIG. 1) via a short-range Bluetooth transceiver 54 (or other equivalent short range transceiver such as a WiFi transceiver) or via a direct connection to one or more health sensors (e.g., the health sensors may be directly attached/coupled to the wearable device 12a). The aGPS receiver 52 and the Bluetooth transceiver 54 may also be referred to as communication interfaces.
[0038] In addition to recording audio data for fall analysis, the processing module 46 is further configured to permit direct voice communication between the user 16a and the PSAP (e.g., 9-1-1, an emergency response center, etc., not shown in the figures), a call center 30, first-to-answer systems 32 (e.g., a fire station, a police station, a physician's office, a hospital, etc.), or care givers and/or family 34 via a built-in speaker 58 and an amplifier 60. Either directly or via the distributed cloud computing system 14, the processing module 46 is further configured to permit the user 16a to conduct a conference connection with one or more of a PSAP, the call center 30, the first-to-answer systems 32, and/or care givers and/or family 34. The processing module 46 may receive/operate one or more input and output indicators 62 (e.g., one or more mechanical and touch switches (not shown), a vibrator, LEDs, etc.). The wearable device 12a also includes an on-board battery power module 64.
[0039] The wearable device 12a may also include a button 63. The button 63 may allow a user to provide user input to the wearable device 12a. For example, the user may press or push the button to initiate a voice call to one or more of a call center 30, first-to-answer systems 32 (e.g. a fire station, a police station, a physician's office, a hospital, etc.), or care givers and/or family 34. In another example, a user may use the button 63 to answer questions during a voice call (e.g., push the button 63 once for "yes" and push the button 63 twice for "no"). In another example, the user may indicate that the wearable device should start collecting data (e.g., datasets such as health data, audio data, location data, etc.) and/or send data to the distributed cloud computing system 14, using the button 63.
[0040] The wearable device 12a may also include empty expansion slots and/or connectors (not shown) to collect readings from other sensors (e.g., an inertial measurement unit, a pressure sensor for measuring air pressure or altitude, a heart rate sensor, a blood perfusion sensor, a temperature sensor, etc.). These other sensors may be coupled to the device via the expansion slots and/or connectors to provide additional datasets or information to the distributed cloud computing system 14.
[0041] In one embodiment, the wearable device 12a may collect, gather, and/or obtain information using a variety of components. For example, the wearable device 12a may obtain orientation and/or movement data (e.g., information about how a user who is wearing the wearable device 12a has moved) using the accelerometer 40, the magnetometer 42, and/or the gyroscope 44. In another example, the wearable device 12a may determine the location (e.g., location data, such as GPS coordinates) of the wearable device 12a (and the user who is wearing or holding the wearable device 12a) using the aGPS receiver 52. In a further example, the wearable device may collect health data (e.g., heart rate, blood pressure, sugar levels, temperature, etc.) using sensors (not shown in the figures) which may be attached to the wearable device 12a and/or may communicate with the wearable device 12a using the Bluetooth transceiver 54. In yet another example, the wearable device 12a may obtain audio data (e.g., voice and/or sounds) using the microphone 48 or a plurality of microphones (not shown in the figures).
[0042] In one embodiment, the wearable device 12a may obtain and/or generate datasets (e.g., orientation/movement data, health data, location data, audio data) using these components and may transmit these datasets to the distributed cloud computing system 14. In another embodiment, the wearable device 12a may periodically transmit datasets to the distributed cloud computing system 14. For example, the wearable device 12a may transmit the datasets once every 5 seconds, or once every 30 seconds. In another embodiment, the wearable device 12a may transmit the datasets when certain criteria are met (e.g., when an accelerometer detects an acceleration above a certain threshold indicating a possible fall, or when the aGPS receiver determines that the wearable device has left a certain location). In a further embodiment, the wearable device 12a may transmit datasets when a user input is received. For example, the wearable device 12a may send the datasets when the user presses or pushes the button 63, in order to initiate a voice call.
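For illustration only, the upload triggers described above could be combined as in the following Python sketch; the interval and threshold values are arbitrary placeholders, not values used by any embodiment.

    # Hypothetical sketch: decide whether the wearable should upload its datasets now.
    UPLOAD_INTERVAL_S = 30      # periodic upload interval (placeholder)
    ACCEL_THRESHOLD_G = 2.5     # acceleration suggesting a possible fall (placeholder)

    def should_upload(now, last_upload, accel_g, button_pressed, left_geofence):
        return (button_pressed
                or left_geofence
                or abs(accel_g) > ACCEL_THRESHOLD_G
                or now - last_upload >= UPLOAD_INTERVAL_S)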
[0043] In one embodiment, the wearable device 12a may process the datasets, prior to providing the datasets to the distributed cloud computing system 14. For example, the wearable device 12a may process motion and/or orientation data to make an initial determination as to whether a user event (e.g., a fall or some other accident) has occurred. The distributed cloud computing system 14 may further process the datasets, in addition to the processing performed by the wearable device 12a. In another embodiment, the wearable device 12a may provide the datasets to the distributed cloud computing system 14 without first processing the datasets, and may allow the distributed cloud computing system 14 to process the datasets. In one
embodiment, the distributed cloud computing system 14 may have more processing power (e.g., more CPUs) and may be better able to process and/or analyze the datasets than the wearable device 12a.
[0044] In one embodiment, the wearable device 12a may establish a voice call with a call center (as discussed above in conjunction with FIG. 1). The voice call may be initiated by the user of the wearable device or may be initiated by a remote operator in the call center. The wearable device 12a may receive a request for a list of one or more devices that are visible to the wearable device 12a. For example, the wearable device 12a may receive a request for the list via the wireless protocol transceiver 50 (via a first communication interface). The request for the list may include one or more instructions that may cause the wearable device 12a to activate the Bluetooth transceiver 54 (e.g., to activate a second communications interface). The wearable device 12a may use the Bluetooth transceiver 54 (e.g., the second communication interface) to pair with additional devices (e.g., with health sensors and/or other computing devices).
[0045] In one embodiment, the wearable device 12a may use the Bluetooth transceiver 54 to scan for devices (e.g., Bluetooth capable devices) that are visible to the Bluetooth transceiver 54 (e.g., that are within range of the Bluetooth transceiver 54). For example the wearable device 12a may receive instructions from a server (in a distributed cloud computing system) to scan for devices. The wearable device 12a may perform the scan and may store and/or record data about one or more devices that are visible to the wearable device 12a. For example the wearable device 12a may store and/or record identifiers (e.g., alphanumeric values, numerical values, serial number, a MAC address, a name, etc.) of the one or more devices. The wearable device 12a may transmit the list of the one or more devices that are visible, to the server. For example, the wearable device 12a may transmit a list of identifiers to the server using the wireless protocol transceiver 50.
[0046] In one embodiment, the wearable device 12a may receive one or more instructions to pair with a first device from the list of one or more devices. For example, the wearable device 12a may receive an identifier that may be used to identify the first device from the one or more devices that are visible to the wearable device 12a. The wearable device 12a may also receive commands, operations, access codes, passwords, etc., that the wearable device 12a may use to pair with the first device (e.g., pair with a health sensor). The wearable device 12a may pair with the first device using these instructions (e.g., using the identifiers, commands, operations, access codes, and/or passwords, etc.). The wearable device 12a may use the Bluetooth transceiver 54 to pair with the first device (e.g., to pair with the health sensor). After the first device is paired with the wearable device 12a (e.g., after a communication channel is established with the first device), the wearable device 12a may transmit a confirmation (e.g., a message, information and/or other data) to indicate that the wearable device 12a has successfully paired with the first device (e.g., to indicate that a Bluetooth pairing with a health sensor was successful).
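For illustration only, the device-side handling of received pairing instructions might look like the following Python sketch; bt_pair and cellular_send are stand-ins for the wearable device's Bluetooth and cellular drivers and are not real APIs.

    # Hypothetical sketch: pair using the received instructions, then confirm to the server.
    def handle_pairing_instructions(instructions, bt_pair, cellular_send, server_url):
        ok = bt_pair(peer=instructions["peer"], pin=instructions.get("pin"))
        cellular_send(server_url, {"event": "pairing_result",
                                   "peer": instructions["peer"],
                                   "success": bool(ok)})
        return ok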
[0047] In one embodiment, after the wearable device is paired with an additional device (e.g., after wearable device 12a is paired with health sensor 18a), the wearable device may forward data received from the additional device to the distributed cloud computing system 14 and vice versa. For example, the wearable device 12a may operate as a tunnel for data that is transmitted between the distributed cloud computing system 14 and the health sensor 18a, as discussed below in conjunction with FIG. 3.
[0048] FIG. 3 is a block diagram illustrating an example system architecture 300, according to another embodiment of the present disclosure. The system architecture 300 includes a distributed cloud computing system 14, a wearable device 12a, and a health sensor 18a. As discussed above (in conjunction with FIGS. 1 and 2), the wearable device 12a may be paired with the health sensor 18a. For example, the wearable device 12a may be a Bluetooth capable device that is paired with the health sensor 18a, which is also a Bluetooth capable device.
[0049] In one embodiment, a tunnel 305 (e.g., a logical communication channel) may be created or formed between the distributed cloud computing system 14 and the health sensor 18a when the health sensor 18a is paired with the wearable device 12a. The tunnel 305 may allow the health sensor 18a to communicate data to the distributed cloud computing system 14 and vice versa. The tunnel 305 may communicate data between the distributed cloud computing system 14 and the health sensor 18a via the wearable device 12a. For example, the wearable device 12a may operate as a proxy and may forward data from the distributed cloud computing system 14 to the health sensor 18a and vice versa. This may allow the wearable device 12a to operate with many different types of devices without having to process and/or understand the data that is provided by the different types of devices. For example, the health sensor 18a may be a new type of health sensor. The user may want to pair the health sensor 18a with the wearable device 12a. The wearable device 12a may not be able to process and/or analyze the type of data that the health sensor 18a generates. For example, the health sensor 18a may be a glucometer and the wearable device 12a may not be able to process data (e.g., glucose level measurements, sensor readings, etc.) from the health sensor 18a. The data from the health sensor 18a may be forwarded by the wearable device to the distributed cloud computing system 14 via the tunnel 305 (e.g., the wearable device 12a may forward the data to the distributed cloud computing system 14). The distributed cloud computing system 14 may process the data received from the health sensor 18a via the tunnel 305 (e.g., via the wearable device 12a) to determine the health or state of the user (e.g., to determine the glucose levels of the user). This allows the wearable device 12a to operate with the health sensor 18a without the need for the wearable device 12a to understand the data (e.g., commands, sensor readings, operations, functions, protocols) of the health sensor 18a. The tunnel 305 may also be used to provide data (e.g., information, commands, instructions, etc.) directly to the health sensor 18a instead of providing the data to the wearable device 12a. For example, instead of instructing the wearable device 12a to instruct the health sensor 18a to perform a measurement, the distributed cloud computing system 14 may directly send the command to perform the measurement to the health sensor 18a via the tunnel 305 (e.g., via the wearable device 12a).
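For illustration only, the forwarding behavior of such a tunnel could be sketched as follows in Python; the link objects (with read and write methods) are assumptions, and a real implementation would need framing, error handling, and power management.

    # Hypothetical sketch: the wearable forwards raw frames in both directions
    # without parsing the health sensor's data format.
    def run_tunnel(sensor_link, server_link, stop):
        while not stop():
            frame = sensor_link.read()      # data from the health sensor
            if frame:
                server_link.write(frame)    # forward to the distributed cloud computing system
            command = server_link.read()    # data/commands from the server
            if command:
                sensor_link.write(command)  # forward to the health sensor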
[0050] FIGS. 4-5 are flow diagrams illustrating methods of pairing devices. For simplicity of explanation, the methods are depicted and described as a series of acts. However, acts in accordance with this disclosure can occur in various orders and/or concurrently and with other acts not presented and described herein. Furthermore, not all illustrated acts may be required to implement the methods in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the methods could alternatively be represented as a series of interrelated states via a state diagram or events.
[0051] FIG. 4 is a flow diagram illustrating a method 400 of pairing a first device with a computing device, according to one embodiment of the present disclosure. The method 400 may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processor to perform hardware simulation), or a combination thereof. In one embodiment, method 400 may be performed by a pairing module, as illustrated in FIG. 1.
[0052] Referring to FIG. 4 the method 400 begins at block 405 where the processing logic receives user input indicating a pairing request for the computing device. The computing device may be a wearable device (as illustrated in FIGS. 1 and 2). For example, the processing logic may receive user input from a remote operator indicating that a computing device should be paired with one or more other devices. At block 410, the processing logic may request a list of one or more devices that are visible to the computing device. For example, the processing logic may transmit instructions to the computing device that cause the computing device to activate or power on a first communication interface (e.g., a Bluetooth communication interface) and to scan for devices (e.g., scan for Bluetooth devices near the computing device). The processing logic may transmit these instructions to the computing device using a second communication interface (e.g., using a cellular communication interface). At block 415, the processing logic may receive a list of devices that are visible to the computing device from the computing device. For example, the processing logic may receive one or more identifiers (e.g., MAC address, a name, etc.) for the devices that are visible to the computing device.
[0053] At block 420, the processing logic may receive user input identifying a first device from the list of devices that are visible to the computing device. For example, the processing logic may receive user input from the remote operator selecting a first device from the list of devices that are visible to the computing device. The first device may be a health sensor (e.g., a scale, a glucometer, a thermometer, etc.) At block 425, the processing logic may obtain the instructions for pairing the first device with the computing device. For example, the processing logic may look up access codes, commands, passwords, etc., for pairing the first device with the computing device using an identifier for the first device. The processing logic may transmit these instructions to the computing device using the second communication interface. At block 430, the processing logic may optionally receive a confirmation (e.g., a message and/or other data) from the computing device indicating that the computing device has successfully paired with the first device. At block 435, the processing logic may communicate data (e.g., transmit and/or receive) between the server and the first device via a tunnel formed by the computing device (as discussed above in conjunction with FIG. 2). After block 435, the method 400 ends.
[0054] FIG. 5 is a flow diagram illustrating a method 500 of pairing a first device with a computing device, according to another embodiment of the present disclosure. The method 500 may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processor to perform hardware simulation), or a combination thereof. In one embodiment, method 500 may be performed by a processing module, as illustrated in FIG. 2.
[0055] Referring to FIG. 5 the method 500 begins at block 505 where the processing logic receives a request for a list of devices that are visible to the computing device. The computing device may be a wearable device (as illustrated in FIGS. 1 and 2). The processing logic may receive this request using a first communication interface (e.g., a cellular communication interface). The request may include instructions that cause and/or instruct the processing logic to activate or power on a second communication interface (e.g., a Bluetooth communication interface). At block 510, the processing logic may scan for devices (e.g., scan for Bluetooth devices near the computing device). At block 515, the processing logic may transmit a list of devices that are visible to the computing device to a server (e.g., a server in a distributed cloud computing system). For example, the processing logic may transmit one or more identifiers (e.g., MAC address, a name, etc.) for the devices that are visible to the computing device.
[0056] At block 520, the processing logic may receive instructions for pairing the first device with the computing device. For example, the processing logic may receive an identifier, access codes, commands, passwords, etc., for pairing the first device with the computing device. The processing logic may receive these instructions at the computing device using the first communication interface. At block 525, the processing logic may pair the computing device with the first device based on the instructions for pairing. At block 530, the processing logic may optionally transmit a confirmation (e.g., a message and/or other data) from the computing device indicating that the computing device has successfully paired with the first device. At block 535, the processing logic may communicate data (e.g., transmit and/or receive) from the first device (e.g., a health sensor) to the server and vice versa (as discussed above in conjunction with FIG. 2). For example, the computing device may serve as a tunnel between the first device and the server and may forward data between the server and the first device (as discussed above in conjunction with FIG. 3). After block 535, the method 500 ends.
[0057] FIG. 6 illustrates a diagrammatic representation of a machine in the example form of a computing device 600 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. The computing device 600 may be a mobile phone, a smart phone, a netbook computer, a rackmount server, a router computer, a server computer, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a desktop computer, etc., within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine may be connected (e.g., networked) to other machines in a LAN, an intranet, an extranet, or the Internet. The machine may operate in the capacity of a server machine in a client-server network environment. The machine may be a personal computer (PC), a set-top box (STB), a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. The example computing device may be a server (in a distributed cloud computing system) and/or may be a wearable device.
[0058] The example computing device 600 includes a processing device (e.g., a processor) 602, a main memory 604 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 606 (e.g., flash memory, static random access memory (SRAM)) and a data storage device 618, which communicate with each other via a bus 630.
[0059] Processing device 602 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 602 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 602 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 602 is configured to execute instructions 626 for performing the operations and steps discussed herein.
[0060] The computing device 600 may further include a network interface device 608 which may communicate with a network 620. The computing device 600 also may include a video display unit 610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 612 (e.g., a keyboard), a cursor control device 614 (e.g., a mouse) and a signal generation device 616 (e.g., a speaker). In one embodiment, the video display unit 610, the alphanumeric input device 612, and the cursor control device 614 may be combined into a single component or device (e.g., an LCD touch screen).
[0061] The data storage device 618 may include a computer-readable storage medium 628 on which is stored one or more sets of instructions (e.g., instructions for a pairing module) embodying any one or more of the methodologies or functions described herein. The instructions 626 may also reside, completely or at least partially, within the main memory 604 and/or within the processing device 602 during execution thereof by the computing device 600, the main memory 604 and the processing device 602 also constituting computer-readable media. The instructions may further be transmitted or received over a network 620 via the network interface device 608.
[0062] While the computer-readable storage medium 628 is shown in an example embodiment to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media and magnetic media.
[0063] In the above description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that embodiments of the disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the description.
[0064] Some portions of the detailed description are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0065] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as "receiving," "transmitting," "requesting," "obtaining," "processing," "scanning," "pairing," "forwarding," or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (e.g., electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0066] Embodiments of the disclosure also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, flash memory, or any type of media suitable for storing electronic instructions.
[0067] The words "example" or "exemplary" are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "example" or "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects or designs.
Rather, use of the words "example" or "exemplary" is intended to present concepts in a concrete fashion. As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or clear from context, "X includes A or B" is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then "X includes A or B" is satisfied under any of the foregoing instances. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form. Moreover, use of the term "an embodiment" or "one embodiment" or "an implementation" or "one implementation" throughout is not intended to mean the same embodiment or implementation unless described as such.
Furthermore, the terms "first," "second," "third," "fourth," etc. as used herein are meant as labels to distinguish among different elements and may not necessarily have an ordinal meaning according to their numerical designation.
[0068] The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present disclosure is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the disclosure as described herein.
[0069] The above description sets forth numerous specific details such as examples of specific systems, components, methods and so forth, in order to provide a good understanding of several embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that at least some embodiments of the present disclosure may be practiced without these specific details. In other instances, well-known components or methods are not described in detail or are presented in simple block diagram format in order to avoid unnecessarily obscuring the present disclosure. Thus, the specific details set forth above are merely examples. Particular
implementations may vary from these example details and still be contemplated to be within the scope of the present disclosure.
[0070] It is to be understood that the above description is intended to be illustrative and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims

What is claimed is:
1. A method comprising:
receiving a first user input indicating a pairing request for a computing device;
requesting a list of one or more additional devices, wherein the one or more additional devices are visible to the computing device;
receiving the list of one or more additional devices from the computing device;
receiving a second user input identifying a first device from the list of one or more additional devices, wherein the first device comprises a health sensor; and
transmitting one or more instructions for pairing the health sensor with the computing device to the computing device.
2. The method of claim 1, wherein requesting the list of one or more additional devices comprises:
transmitting a first instruction to the computing device to activate a first communication interface, wherein the first communication interface is to be used by the computing device to communicate with the health sensor; and
transmitting a second instruction to the computing device to perform a scan for the one or more additional devices.
3. The method of claim 1, wherein the second user input comprises an identifier for the health sensor.
4. The method of claim 3, further comprising:
obtaining the one or more instructions based on the identifier.
5. The method of claim 1, further comprising:
receiving data from the health sensor, wherein the data is forwarded from the health sensor via the computing device; and
processing the data received from the health sensor, wherein the data comprises health data of a user of the computing device.
6. The method of claim 1, further comprising:
receiving a confirmation from the computing device, the confirmation indicating that the computing device has paired with the health sensor.
7. The method of claim 1, wherein the computing device comprises a wearable computing device.
8. The method of claim 1, wherein the first user input is received in response to a voice call between a remote operator and a user of the computing device.
9. The method of claim 8, wherein the computing device comprises a second communication interface and wherein the voice call is established using the second communication interface.
10. An apparatus comprising:
a memory to store data;
a processing device coupled to the memory, the processing device to:
receive a request for a list of one or more devices, wherein the one or more devices are visible to the apparatus;
scan for the one or more devices using a first communication interface;
transmit the list of one or more devices to a server using a second communication interface;
receive one or more instructions to pair with a first device from the list of one or more devices, wherein the first device comprises a health sensor; and
pair with the health sensor based on the one or more instructions.
11. The apparatus of claim 10, wherein the request for the list of one or more devices is received after a voice call is established using the second communication interface.
12. The apparatus of claim 10, wherein the processing device is to receive the request for the list of one or more devices by:
receiving a first instruction to activate the first communication interface, wherein the first communication interface is to be used to communicate with the health sensor.
13. The apparatus of claim 10, wherein the processing device is further to:
receive data from the health sensor; and
forward the data to the server.
14. The apparatus of claim 10, wherein the processing device is further to:
receive first data from the server; and
forward the first data to the health sensor.
15. A non-transitory computer readable storage medium having instructions that, when executed by a processing device, cause the processing device to perform operations comprising:
receiving a first user input indicating a pairing request for a computing device;
requesting a list of one or more additional devices, wherein the one or more additional devices are visible to the computing device;
receiving the list of one or more additional devices from the computing device;
receiving a second user input identifying a first device from the list of one or more additional devices, wherein the first device comprises a health sensor; and
transmitting one or more instructions for pairing the health sensor with the computing device to the computing device.
16. The non-transitory computer readable storage medium of claim 15, wherein requesting the list of one or more additional devices comprises:
transmitting a first instruction to the computing device to activate a first communication interface, wherein the first communication interface is to be used by the computing device to communicate with the health sensor; and
transmitting a second instruction to the computing device to perform a scan for the one or more additional devices.
17. The non-transitory computer readable storage medium of claim 15, wherein the second user input comprises an identifier for the health sensor.
18. The non-transitory computer readable storage medium of claim 17, the operations further comprising:
obtaining the one or more instructions based on the identifier.
19. The non-transitory computer readable storage medium of claim 15, the operations further comprising:
receiving data from the health sensor, wherein the data is forwarded from the health sensor via the computing device; and
processing the data received from the health sensor, wherein the data comprises health data of a user of the computing device.
20. The non-transitory computer readable storage medium of claim 15, wherein the first user input is received in response to a voice call between a remote operator and a user of the computing device.
PCT/US2013/066966 2012-10-26 2013-10-25 Management, control and communication with sensors WO2014066854A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP13848885.3A EP2911577A4 (en) 2012-10-26 2013-10-25 Management, control and communication with sensors
HK16102374.4A HK1214116A1 (en) 2012-10-26 2016-03-01 Management, control and communication with sensors

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201261719277P 2012-10-26 2012-10-26
US61/719,277 2012-10-26
US14/062,688 US9526420B2 (en) 2012-10-26 2013-10-24 Management, control and communication with sensors
US14/062,688 2013-10-24

Publications (1)

Publication Number Publication Date
WO2014066854A1 true WO2014066854A1 (en) 2014-05-01

Family

ID=50545365

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/066966 WO2014066854A1 (en) 2012-10-26 2013-10-25 Management, control and communication with sensors

Country Status (4)

Country Link
US (1) US9526420B2 (en)
EP (1) EP2911577A4 (en)
HK (1) HK1214116A1 (en)
WO (1) WO2014066854A1 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140357215A1 (en) * 2013-05-30 2014-12-04 Avaya Inc. Method and apparatus to allow a psap to derive useful information from accelerometer data transmitted by a caller's device
KR101752305B1 (en) * 2013-05-30 2017-06-29 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Schemes for providing wireless communication
PL227800B1 (en) * 2013-10-02 2018-01-31 Fibar Group Spólka Akcyjna Device for detecting and signaling of the condition of water appearing on surfaces, preferably on the surfaces of room floors
KR20150089525A (en) * 2014-01-28 2015-08-05 삼성전자주식회사 Communication Message Operating Method And Electronic Device supporting the same
US20150242895A1 (en) * 2014-02-21 2015-08-27 Wendell Brown Real-time coupling of a request to a personal message broadcast system
JP2015162783A (en) * 2014-02-27 2015-09-07 ソニー株式会社 Information processor, information processing method and program
US10409454B2 (en) 2014-03-05 2019-09-10 Samsung Electronics Co., Ltd. Smart watch device and user interface thereof
US20170091412A1 (en) 2014-05-30 2017-03-30 Apple Inc. Systems and Methods for Facilitating Health Research Using a Personal Wearable Device With Multiple Pairing Configurations
US10282696B1 (en) 2014-06-06 2019-05-07 Amazon Technologies, Inc. Augmented reality enhanced interaction system
US9679152B1 (en) * 2014-07-24 2017-06-13 Wells Fargo Bank, N.A. Augmented reality security access
WO2016048345A1 (en) * 2014-09-26 2016-03-31 Hewlett Packard Enterprise Development Lp Computing nodes
AU2016261830B2 (en) * 2015-05-12 2019-01-17 Dexcom, Inc. Distributed system architecture for continuous glucose monitoring
WO2017115145A1 (en) 2015-12-31 2017-07-06 Delta Faucet Company Water sensor
KR101639970B1 (en) * 2016-01-26 2016-07-15 주식회사 이노피아테크 Apparatus and method for configuring a two-way channel from the media service
US10542075B2 (en) * 2016-02-24 2020-01-21 Nokia Technologies Oy Method and apparatus for configuration for monitoring patient information
US11810032B2 (en) * 2016-03-16 2023-11-07 Triax Technologies, Inc. Systems and methods for low-energy wireless applications using networked wearable sensors
CN107547818A (en) * 2016-06-29 2018-01-05 福建星网锐捷通讯股份有限公司 The building equipment means of communication and system based on cloud call center
US11032855B2 (en) * 2016-10-18 2021-06-08 Dexcom, Inc. System and method for communication of analyte data
CN115580842A (en) * 2016-10-18 2023-01-06 德克斯康公司 Analyte data communication system and method
EP3549386B1 (en) * 2016-11-30 2023-12-27 Nokia Technologies Oy Transfer of sensor data
US20190051144A1 (en) 2017-07-27 2019-02-14 NXT-ID, Inc. Social Network for Responding to Event-Driven Notifications
US11382511B2 (en) 2017-07-27 2022-07-12 Logicmark, Inc. Method and system to reduce infrastructure costs with simplified indoor location and reliable communications
US11158179B2 (en) 2017-07-27 2021-10-26 NXT-ID, Inc. Method and system to improve accuracy of fall detection using multi-sensor fusion
CN107341360A (en) * 2017-08-17 2017-11-10 马婉婷 A kind of health services system and its method based on Intelligent bracelet
US10943680B1 (en) 2017-09-07 2021-03-09 Massachusetts Mutual Life Insurance Company Intelligent health-based blockchain
US10748656B1 (en) * 2019-03-12 2020-08-18 Harmonize Inc. Population health platform
GB2600936A (en) * 2020-11-11 2022-05-18 Spatialcortex Tech Limited System and method for human motion monitoring
EP4260920A1 (en) 2020-12-10 2023-10-18 Panasonic Intellectual Property Management Co., Ltd. Robot control method and information provision method
US11468992B2 (en) 2021-02-04 2022-10-11 Harmonize Inc. Predicting adverse health events using a measure of adherence to a testing routine

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009036333A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Dynamic pairing of patients to data collection gateways
US20110090086A1 (en) 2007-10-22 2011-04-21 Kent Dicks Systems for personal emergency intervention
US8823490B2 (en) * 2008-12-15 2014-09-02 Corventis, Inc. Patient monitoring systems and methods
US8190651B2 (en) * 2009-06-15 2012-05-29 Nxstage Medical, Inc. System and method for identifying and pairing devices
US9462444B1 (en) 2010-10-04 2016-10-04 Nortek Security & Control Llc Cloud based collaborative mobile emergency call initiation and handling distribution system
US20120182939A1 (en) 2011-01-14 2012-07-19 Qualcomm Incorporated Telehealth wireless communication hub and service platform system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040172290A1 (en) * 2002-07-15 2004-09-02 Samuel Leven Health monitoring device
WO2008098346A1 (en) * 2007-02-16 2008-08-21 Hongyue Luo Wearable mini-size intelligent healthcare system
US20120185267A1 (en) * 2010-01-22 2012-07-19 Deka Products Limited Partnership System, Method, and Apparatus for Electronic Patient Care
US20120266251A1 (en) * 2010-10-15 2012-10-18 Roche Diagnostics Operations, Inc. Systems and methods for disease management
US20130141235A1 (en) * 2011-06-10 2013-06-06 Aliphcom General health and wellness management method and apparatus for a wellness application using data associated with data-capable band

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2911577A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107172718A (en) * 2017-03-30 2017-09-15 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN107172718B (en) * 2017-03-30 2020-08-25 联想(北京)有限公司 Information processing method and electronic equipment

Also Published As

Publication number Publication date
US9526420B2 (en) 2016-12-27
EP2911577A1 (en) 2015-09-02
HK1214116A1 (en) 2016-07-22
EP2911577A4 (en) 2016-11-02
US20140118159A1 (en) 2014-05-01

Similar Documents

Publication Publication Date Title
US9526420B2 (en) Management, control and communication with sensors
US9143600B2 (en) Single button mobile telephone using server-based call routing
US20230360792A1 (en) System and method for monitoring activities through portable devices
US8907783B2 (en) Multiple-application attachment mechanism for health monitoring electronic devices
US10051410B2 (en) Assist device and system
US20190080056A1 (en) Systems and methods for remote patient monitoring and event detection
WO2015081736A1 (en) Processing method and terminal for providing health support for users
WO2011056812A1 (en) Systems and devices for emergency tracking and health monitoring
CN104113618A (en) Flexible screen based wearable monitoring device
CN110914918B (en) System and method for health monitoring and providing emergency support
CA3065096C (en) Adaptation of the auditory output of an electronic digital assistant in accordance with an indication of the acoustic environment
Pathinarupothi et al. Multi-layer architectures for remote health monitoring
EP2809057A1 (en) Method and apparatus to allow a PSAP to derive useful information from accelerometer data transmitted by a caller's device
CN104123818A (en) Health monitoring rescue system and first aid method
US10049420B1 (en) Digital assistant response tailored based on pan devices present
WO2019132681A1 (en) Methods and systems for generating time-synchronized audio messages of different content in a talkgroup
US20170249823A1 (en) System for Tracking Wellness and Scheduling of Caregiving
KR20220032662A (en) System for providing emergency alarming service using voice message
KR20170095050A (en) Method and system for recognizing and notifying emergency situation
JP6467558B2 (en) Remote initiation of tasks in idle wireless computing devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13848885

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2013848885

Country of ref document: EP