WO2016142351A1 - Robotic companion sensitive to physiological data gathered by wearable devices - Google Patents

Robotic companion sensitive to physiological data gathered by wearable devices

Info

Publication number
WO2016142351A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
label
instructions
wearable device
robotic
Application number
PCT/EP2016/054824
Other languages
French (fr)
Inventor
John Cronin
Joseph BODKIN
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2016142351A1

Classifications

    • G06N 3/008: Artificial life, i.e. computing arrangements simulating life, based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • G06N 20/00: Machine learning
    • G06N 5/00: Computing arrangements using knowledge-based models
    • G16H 20/10: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to drugs or medications, e.g. for ensuring correct administration to patients
    • G16H 40/67: ICT specially adapted for the management or operation of medical equipment or devices, for remote operation
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • Various embodiments described here relate generally to the field of wearable technology.
  • various embodiments are directed to a robotic companion capable of performing actions as a function of multiple wearable device sensor readings.
  • the present disclosure is directed to controlling a robotic companion as a function of one or more wearable device sensor readings collected by a wearable device.
  • robotic companions can be made to perform actions in a realistic and interesting way such that users can receive meaningful and varied responses from the companion based on wearable device sensor readings collected by a wearable device worn by such users.
  • a robotic companion can be made that recognizes which actions work for a particular user by monitoring the user's physiological characteristics and modifies its behavior to optimize a desired reaction from a user.
  • a robotic companion may detect that a user's pulse is too high and take actions intended to reduce the user's pulse.
  • Various embodiments are directed to a method performed by a robotic companion device for determining behavior based on a user's wearable device, the method including: wirelessly receiving, from a wearable device of the user, information regarding the user's current physiological state; determining, based on the received information, a label identifying the user's current physiological state; identifying a first activity associated with the label; and engaging an autonomous interaction with the user according to the first activity.
  • Various embodiments are directed to a non-transitory machine-readable storage medium encoded with instructions for execution by a robotic companion device for determining behavior based on a user's wearable device, the non-transitory machine-readable storage medium including: instructions for wirelessly receiving, from a wearable device of the user, information regarding the user's current physiological state; instructions for determining, based on the received information, a label identifying the user's current physiological state; instructions for identifying a first activity associated with the label; and instructions for engaging an autonomous interaction with the user according to the first activity.
  • Various embodiments are directed to a robotic companion device for determining behavior based on a user's wearable device, the robotic companion device including: a wireless communications interface configured to receive, from a wearable device of the user, information regarding the user's current physiological state; a memory device, and a processor configured to determine, based on the received information, a label identifying the user's current physiological state; identify a first activity associated with the label; and engage an autonomous interaction with the user according to the first activity.
  • the information regarding the user's current state includes at least one physiological parameter.
  • Various embodiments additionally include receiving additional physiological parameters; determining whether the label is no longer applicable to the user based on at least one of the physiological parameters and the additional physiological parameters; and when the label is no longer applicable, ceasing interaction with the user according to the at least one activity after a time period has elapsed.
  • Various embodiments additionally include setting the time period in proportion to the duration of applicability of the label.
  • Various embodiments additionally include receiving additional physiological parameters; determining whether the label is no longer applicable to the user based on at least one of the physiological parameters and the additional physiological parameters; determining that the label remains applicable after a predetermined time period; based on determining that the label remains applicable after the predetermined time period, identifying a second activity associated with the label; and engaging an additional autonomous interaction with the user according to the second activity.
  • determining the label includes applying a learned model associated with the label to the received at least one physiological parameter to determine whether the label is applicable to the user's current state.
  • Various embodiments additionally include receiving feedback indicating the correctness of the label to the user's current state; creating a new training example from the at least one physiological parameter and the received feedback; retraining the learned model based on the new training example using a machine learning process.
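  • By way of illustration only, the claimed control loop above (receive physiological information, determine a label, identify an activity, engage the user) might be sketched in Python as follows; the thresholds, label names, and helpers (determine_label, control_step, ACTIVITY_TABLE, engage) are hypothetical and not part of the disclosure:

      # Hypothetical sketch of the claimed loop: receive physiological data from a
      # wearable, map it to a label, look up an activity, and engage the user.
      from typing import Callable, Dict, Optional

      # Label -> first activity associated with that label (illustrative values)
      ACTIVITY_TABLE: Dict[str, str] = {
          "anxious": "purr",
          "depressed": "play_fetch",
      }

      def determine_label(params: Dict[str, float]) -> Optional[str]:
          """Derive a label for the user's current physiological state."""
          if params.get("pulse", 0.0) > 100.0:      # assumed threshold
              return "anxious"
          if params.get("activity_level", 1.0) < 0.2:
              return "depressed"
          return None

      def control_step(params: Dict[str, float], engage: Callable[[str], None]) -> None:
          label = determine_label(params)           # determine label from received info
          if label is None:
              return
          activity = ACTIVITY_TABLE.get(label)      # identify first activity for label
          if activity is not None:
              engage(activity)                      # engage autonomous interaction

      if __name__ == "__main__":
          control_step({"pulse": 112.0}, lambda a: print(f"engaging: {a}"))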
  • FIG. 1 is a flow diagram illustrating a method of enabling a robotic companion to perform an action as a function of a user's wearable device sensor readings
  • FIG. 2 is a block diagram of a system that can be used to implement the method of FIG. 1;
  • FIGS. 3A-D are representative screenshots depicting various aspects of an exemplary companion interface
  • FIG. 4 is a flow diagram illustrating a method of controlling a robotic companion
  • FIG. 5 illustrates particular implementations of various steps of a method of controlling a robotic companion
  • FIG. 6 is a table containing an example of information that may be stored in a wearable device database
  • FIG. 7 is a block diagram of an example computing device that may implement various ones of the features and processes of the present disclosure
  • FIG. 8A is a flow diagram illustrating a wearable device decision method
  • FIG. 8B is a table containing an example of information that may be stored in a wearable device rules database
  • FIG. 9 is a flow diagram illustrating a robotic companion base method
  • FIG. 10 is a flow diagram illustrating a robotic companion interactive method
  • FIG. 11A is a flow diagram illustrating a robotic companion execution method
  • FIG. 11B is a table containing an example of information that may be stored in a robotic companion program database
  • FIG. 12 is a table containing an example of information that may be stored in a robotic companion interactive reference table
  • FIG. 13 is a flow diagram illustrating a companion care software method
  • FIG. 14 is a graph illustrating an example of how a robotic companion may behave as a function of an amount of time during which a user exhibits a particular state
  • the present disclosure is directed to a system enabling one or more of a robotic companion's actions to be determined as a function of a user's wearable device sensor readings such that the robotic companion may provide the user with visual, aural, and/or other types of feedback when a sensor reading meets or breaks one or more rules.
  • aspects of the present disclosure are directed to systems, methods, and software for enabling a robotic companion to perform an action as a function of a user's wearable device sensor readings.
  • a user may specify one or more rules that determine how a robotic companion will respond as a function of one or more sensor readings received from a wearable device.
  • FIG. 1 illustrates a method 100 that can be implemented by a robotic companion or other supporting device in order to achieve the aforementioned functionality.
  • The method includes: a step 105 of monitoring one or more first wearable device sensor readings; a step 110 of comparing the one or more first readings to one or more predetermined rules; a step 115 of, when one or more of the first readings violates one or more of the rules, assessing at least one of a type and a severity of a health condition as a function of the one or more first readings that violate the one or more rules; a step 120 of causing the robotic companion to perform a first action as a function of the at least one of the type and the severity of the health condition; a step 125 of monitoring one or more second wearable device sensor readings; a step 130 of reassessing the at least one of the type and the severity of the health condition as a function of the second one or more wearable device sensor readings; a step 135 of determining a duration of time that the at least one of the type and the severity of the health condition persists; and a step 140 of causing the robotic companion to perform a second action as a function of the duration.
  • Monitoring one or more first wearable device sensor readings at step 105 may entail requesting, retrieving, and/or automatically receiving sensor readings from one or more wearable devices, which may include, for example, a smart phone, a pedometer, and/or a heart rate monitor, among others.
  • Comparing the one or more first readings to one or more predetermined rules at step 110 may include comparing a reading to one or more upper or lower limits, among other types of rules that can be set.
  • Assessing a type and severity of a health condition at step 115 may entail, for example, determining that a user is experiencing elevated levels of stress as a result of wearable device sensor readings that indicate that the user's pulse rate is higher than average.
  • Causing the robotic companion to perform a first action at step 120 may include, for example, causing the robotic companion to make a purring sound to attempt to calm the user.
  • Monitoring one or more second wearable device sensor readings at step 125 may include, for example, comparing a user's blood pressure or other sensed physiological characteristic to one or more rules.
  • Reassessing at step 130 may entail, for example, determining that the user is still experiencing high levels of stress given the results of the comparison of the user's blood pressure or other sensed physiological characteristic readings with the one or more rules (e.g., above or below a certain level).
  • Determining a duration of time that the health condition (in this example, stress) persists at step 135 may include monitoring a clock or timer to determine an amount of time that elapses from the time the condition was first recognized (in this example, when the elevated pulse rate is first detected) to the reassessment (in this example, the determination that the user's blood pressure is elevated). Based on that amount of time, at step 140 the robotic companion may then perform a second action different from the first. For example, the robot may determine that making a purring sound has not caused the user to experience a reduction in stress and, as such, may perform a different action, such as jumping, meowing, or wagging its tail, among others.
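  • A minimal sketch of the FIG. 1 flow described above, assuming an illustrative pulse rule and action names (PULSE_UPPER_LIMIT, monitor, and act are hypothetical, not values from the disclosure):

      import time

      PULSE_UPPER_LIMIT = 100  # assumed rule: pulse readings above this violate it

      def method_100(monitor, act):
          reading = monitor()                               # step 105: first reading
          if reading <= PULSE_UPPER_LIMIT:                  # step 110: compare to rule
              return
          severity = "high" if reading > 120 else "mild"    # step 115: assess severity
          act("purr", severity)                             # step 120: first action
          onset = time.monotonic()
          reading = monitor()                               # step 125: second reading
          if reading > PULSE_UPPER_LIMIT:                   # step 130: reassess
              persisted = time.monotonic() - onset          # step 135: duration
              # step 140: second, different action chosen as a function of duration
              act("meow" if persisted < 60 else "jump", severity)

      if __name__ == "__main__":
          readings = iter([110.0, 105.0])
          method_100(lambda: next(readings), lambda action, sev: print(action, sev))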
  • FIG. 2 illustrates a system 200 that can be used to implement the method of FIG. 1.
  • system 200 may include a wearable device 204 with memory 208, which may contain wearable device base software 212, a companion graphical user interface or "GUI" 216, a wearable device database 220, wearable device decision software 224, and a wearable device rules database 228.
  • Wearable device 204 may further include a clock 232, one or more sensors 236, a display 240, a power supply 244, and a communications interface or "comm" 248, which may utilize, e.g., low-power Bluetooth™ and/or any other appropriate standard of communication.
  • the sensor devices 236 may sense physiological parameters about the user.
  • the sensor devices 236 may include accelerometers, conductance sensors, optical sensors, temperature sensors, microphones, cameras, etc. These or other sensors may be useful for sensing, computing, estimating, or otherwise acquiring physiological parameters descriptive of the wearer such as, for example, steps taken, walking/running distance, standing hours, heart rate, respiratory rate, blood pressure, stress level, body temperature, calories burned, resting energy expenditure, active energy expenditure, height, weight, sleep metrics, etc.
  • System 200 may further include a robotic companion 252 with a memory 256 containing robotic companion base software 260, robotic companion interactive software 264, a robotic companion interactive reference table 268, robotic companion execution software 272, a robotic companion program database 276, and robotic companion care software 280.
  • Robotic companion 252 may further include one or more sensors 282, a processor 284, a camera 286, a microphone or "mic" 288, and a communications interface or "comm" 290.
  • the sensors 282 may include any of those sensors described above in relation to the wearable device though, as will be apparent, the location of such sensors 282 within the robotic companion rather than within a carried or otherwise worn device will cause the data reported by the sensors to reflect other parameters.
  • robotic companion 252 may interact with wearable device 204 and take actions based upon wearable device sensor readings in order to provide a user with visual and/or other types of feedback from the robotic companion, e.g., when one or more sensor readings deviate from a predetermined range.
  • the robotic companion 252 may be virtually any machine that autonomously performs one or more actions within a physical environment of a user.
  • the robotic companion 252 may be a toy such as a robotic cat or other animal.
  • the robotic companion 252 may perform actions such as navigating and moving around its environment, outputting audible interactions (e.g., meowing or purring sounds), physically interacting with the user (e.g., rubbing against the user's leg, "licking" the user's face), and performing other simulations of pet behavior (e.g., running around the room, jumping, laying down for a "nap," etc.).
  • the robotic companion 252 may perform more useful functions beyond user entertainment.
  • the robotic companion 252 may fetch objects, pour drinks, prepare food, vacuum, fold laundry, or perform other household tasks autonomously.
  • the robotic companion 252 may perform health-related tasks such as medication dispensation or administration.
  • Various additional functions capable of performance by a robotic companion with respect to a user or the surrounding environment will be apparent.
  • a robotic companion need not be ambulatory and, in some embodiments, may be a stationary device; for example, the robotic companion may be a stationary coffee machine that autonomously decides when to brew a cup or pot of coffee and how to brew the coffee (e.g., which beans to grind, whether and how much milk to add, whether and how much sugar or other sweetener to add, or latte art to create on a prepared cup).
  • processor will be understood to encompass various hardware devices such as, for example, microprocessors, field-programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), and other hardware devices capable of performing the various functions described herein as being performed by the wearable device 204, robotic companion 252, or other device.
  • the respective memory devices 208, 256 may include various devices such as L1/L2/L3 cache, system memory, or storage devices.
  • non-transitory machine-readable storage medium will be understood to refer to both volatile memory (e.g., SRAM and DRAM) and non-volatile memory (e.g., flash, magnetic, and optical memory) devices, but to exclude mere transitory signals. While various embodiments may be described herein with respect to software or instructions "performing" various functions, it will be understood that such functions will actually be performed by hardware devices such as a processor executing the software or instructions in question. In some embodiments, such as embodiments utilizing one or more ASICs, various functions described herein may be hardwired into the hardware operation; in such embodiments, the software or instructions corresponding to such functionality may be omitted.
  • FIGS. 3A-3D show various examples of a companion GUI 216 and related components.
  • a companion GUI 216 may include buttons for viewing and/or managing a user profile (see, e.g., FIG. 3B), wearable device data (see, e.g., FIG. 3C), robotic companion actions (see, e.g., FIG. 3D), and raw data from wearable device database 220.
  • User profile information may be displayed in companion GUI 216 as shown in FIG. 3A, and may include information such as a user ID and one or more commonly or currently connected robotic companions. For example, in FIG. 3A, CatXXX1 and DogXXX2 robotic companions are listed.
  • FIG. 3B shows an example of a user profile interface 305 that a user can configure based on their preferences.
  • the user has selected a medical schedule, which may be stored on wearable device 204 such that robotic companion 252 can recognize it, and a carry medicine option, which may enable the robotic companion to carry one or more medicines.
  • a user can configure user profile 305 such that robotic companion 252 will store wearable device data instead of or in addition to storing it on wearable device 204.
  • the user can also select an option that allows data to be sent to others.
  • a user can choose one or more different robotic companions to carry out various actions, such as, for example, a cat, a dog, or a fish, among others.
  • the user profile may also include robotic companion activity duration controls that allow the user or an administrator to input a desired amount of time for various actions, which may be specified by one or more programs, to be performed.
  • the user profile interface may further include a select and save button allowing the user to select and save their settings.
  • FIG. 3C shows an example of a wearable device data interface 310.
  • a user can select one or more physiological parameters they would like the wearable device to store in wearable device database 220 and/or to send (or have sent) to one or more robotic companions.
  • the wearable device data interface may include a priority list enabling a user to specify relative priorities of multiple sensors that may be associated with one or more actions.
  • FIG. 3D shows an example of a robotic companion actions interface 315. As shown, a user can select one or more actions for a robotic companion to perform; for each robotic companion, these actions will be dependent on the nature of the robotic companion, i.e., whether it is a robotic cat, robotic dog, or robotic bird, among others.
  • FIG. 4 shows an example of an overall software process 400 that can be used to implement interaction between one or more wearable devices and one or more robotic companions whose actions are based upon a user's wearable device sensor readings.
  • process 400 may begin with wearable device base software 212 executing and collecting sensor readings and then storing related data in wearable device database 220.
  • the wearable device base software 212 may further extract or otherwise obtain additional parameters (e.g., physiological parameters) from the raw data collected from the sensors.
  • the wearable device base software 212 may process color data over time received from an optical sensor on the underside of a watch device to extract a pulse parameter.
  • the wearable device base software 212 may obtain accelerometer data and transmit the data (e.g., in raw or processed form such as, for example, accumulated or average accelerometer data over a time window) to another device (e.g. a separate mobile device with greater processing capability or a remote server or virtual machine) for processing. The other device may then extract additional parameters (e.g., energy expenditure) to be returned to the wearable device or robotic companion. After obtaining relevant physiological or other parameters, the wearable device base software 212 may provide one or more subsets of such data to wearable device decision software 224.
  • Wearable device decision software 224 may then compare wearable device database 220 with wearable device rules database 228, and if any of the sensor readings meet or deviate from the rules, depending on the nature of the rules, the decision software may communicate with companion GUI 216 to ascertain user-selected options and send information to robotic companion base software 260 based on the options, rules, and/or sensor readings.
  • Robotic companion base software 260 may then provide that information to robotic companion interactive software 264, which may compare, e.g., wearable device database 220 with robotic companion interactive reference table 268. If an action is required of the robotic companion based on that comparison, then interactive software 264 may send an indication of such an action to robotic companion execution software 272, which may compare the indication with robotic companion program database 276.
  • Robotic companion program database 276 may then send a program file enabling the robotic companion to execute the action to execution software 272, after which the robotic companion may execute the action and/or the process may return to interactive software 264.
  • robotic companion execution software 272 may execute robotic companion care software 280 in order to provide more realistic and varied behavior.
  • FIG. 5 shows an exemplary method 500 that wearable device base software 212 may be configured to perform.
  • method 500 begins by executing routine operations (step 505), such as collecting sensor data, running data analysis, and reporting and/or storing the sensor data and/or results of the analysis.
  • Method 500 then stores the raw data in wearable device database 220 (step 510) and then sends wearable device database 220 to wearable device decision software 224 (step 515).
  • FIG. 6 shows an example of a wearable device database 220.
  • Such a database may include one or more wearable IDs, dates, times, indications of sensors that have produced one or more readings, raw data associated with such one or more readings, and one or more associated labels.
  • the labels can be added after wearable device decision software 224 compares wearable device database 220 with wearable device rules database 228. If the resulting sensor reading is lower or higher than a pre-selected rule, a label may be added to that specific sensor reading as a way to determine emotion.
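  • A rough sketch of this rule-based labeling, assuming hypothetical sensor names, bounds, and a Rule structure (none of which are specified by the disclosure):

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Rule:
          low: Optional[float]   # reading below this violates the rule
          high: Optional[float]  # reading above this violates the rule
          label: str             # label attached to the reading when violated

      RULES = {  # sensor name -> rule (values are illustrative)
          "heart_rate": Rule(low=50.0, high=100.0, label="anxious"),
          "skin_conductance": Rule(low=None, high=8.0, label="stressed"),
      }

      def label_reading(sensor: str, raw: float) -> Optional[str]:
          rule = RULES.get(sensor)
          if rule is None:
              return None
          if (rule.low is not None and raw < rule.low) or (
                  rule.high is not None and raw > rule.high):
              return rule.label  # reading outside the pre-selected bounds
          return None

      print(label_reading("heart_rate", 112.0))  # -> "anxious"
      print(label_reading("heart_rate", 72.0))   # -> None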
  • FIG. 7 illustrates an exemplary wearable computing device 700 that may implement various features and/or processes of the present disclosure, such as the features and processes illustrated in other figures of this disclosure, as well as features and processes that would be apparent to those of ordinary skill in the art after reading this entire disclosure.
  • computing device 700 may include a memory interface 704, one or more data processors, image processors and/or central processing units 708, and a peripherals interface 712.
  • Memory interface 704, one or more processors 708, and/or peripherals interface 712 may be separate components or may be integrated in one or more integrated circuits.
  • the various components in computing device 700 may be coupled by one or more communication buses or signal lines.
  • Sensors, devices, and subsystems may be coupled to peripherals interface 712 to facilitate one or more functionalities.
  • a motion sensor 716, a light sensor 720, and a proximity sensor 724 may be coupled to peripherals interface 712 to facilitate orientation, lighting, and/or proximity functions.
  • Other sensors 728 may also be connected to peripherals interface 712, such as a global navigation satellite system (GNSS) (e.g., GPS receiver), a temperature sensor, a biometric sensor, and/or one or more other sensing devices, to facilitate related functionalities.
  • a camera subsystem 732 and an optical sensor 736 may be utilized to facilitate camera functions, such as recording images and/or video.
  • Camera subsystem 732 and optical sensor 736 may be used to collect images of a user to be used during authentication of a user, e.g., by performing facial recognition analysis.
  • Communication functions may be facilitated through one or more wireless communication subsystems 740, which may include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters.
  • the specific design and implementation of communication subsystem 740 may depend on the communication network(s) over which computing device 700 is intended to operate.
  • computing device 700 may include communication subsystems 740 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi™ or WiMax™ network, and/or a Bluetooth™ network.
  • wireless communication subsystems 740 may include hosting protocols such that one or more devices 700 may be configured as a base station for other wireless devices.
  • An audio subsystem 744 may be coupled to a speaker 748 and a microphone 752 to facilitate voice-enabled functions, such as speaker recognition, voice replication, digital recording, and/or telephony functions. Audio subsystem 744 may be configured to facilitate processing voice commands, voice-printing, and voice authentication.
  • I/O subsystem 756 may include a touch-surface controller 760 and/or other input controller(s) 764.
  • Touch-surface controller 760 may be coupled to a touch surface 768.
  • Touch surface 768 and touch-surface controller 760 may, for example, detect contact and movement or a lack thereof using one or more of any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and/or surface acoustic wave technologies, optionally as well as other proximity sensor arrays and/or other elements for determining one or more points of contact with touch surface 768.
  • Other input controller(s) 764 may be coupled to other input/control devices 772, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus.
  • One or more related buttons or other controls may include one or more sets of up/down buttons for volume and/or amplitude control of speaker 748 and/or microphone 752. Using one or more buttons or other controls, a user may activate a voice control, or voice command, module that enables the user to speak commands into microphone 752 to cause device 700 to execute the spoken command.
  • the user may customize functionality of one or more buttons or other controls.
  • Touch surface 768 may, for example, also be used to implement virtual or soft buttons and/or a keyboard.
  • computing device 700 may present recorded audio and/or video files, such as MP3, AAC, and/or MPEG files.
  • computing device 700 may include the functionality of an MP3 player, such as an iPod™.
  • Computing device 700 may, therefore, include a 36-pin connector that is compatible with related iPod™ hardware. Other input/output and control devices may also be used.
  • memory interface 704 may be coupled to one or more types of memory 776.
  • Memory 776 may include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR).
  • Memory 776 may store an operating system 780, such as Darwin™, RTXC, LINUX, UNIX, OS X™, WINDOWS™, and/or an embedded operating system such as VxWorks.
  • Operating system 780 may include instructions for handling basic system services and/or for performing hardware dependent tasks.
  • operating system 780 may include a kernel (e.g., UNIX kernel). Further, in some implementations, operating system 780 may include instructions for performing voice authentication.
  • Memory 776 may also store communication instructions 782 to facilitate communicating with one or more additional devices, one or more computers, and/or one or more servers.
  • memory 776 may include: graphical user interface instructions 784 to facilitate graphic user interface processing; sensor processing instructions 786 to facilitate sensor- related processing and functions; phone instructions 788 to facilitate phone-related processes and functions; electronic messaging instructions 790 to facilitate electronic-messaging related processes and functions; web browsing instructions 792 to facilitate web browsing-related processes and functions; media processing instructions 794 to facilitate media processing-related processes and functions; GNSS/Navigation instructions 796 to facilitate GNSS and navigation-related processes and instructions; and/or camera instructions 797 to facilitate camera-related processes and functions.
  • Memory 776 may store other software instructions 798 to facilitate other processes and functions.
  • other software instructions 798 may include instructions for counting steps the user takes when device 700 is worn.
  • Memory 776 may also store other software instructions (not shown), such as web video instructions to facilitate web video-related processes and functions and/or web shopping instructions to facilitate web shopping-related processes and functions.
  • media processing instructions 794 may be divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing- related processes and functions, respectively.
  • An activation record and International Mobile Equipment Identity (IMEI) 799 or similar hardware identifier may also be stored in memory 776.
  • Each of the above identified instructions and applications may correspond to a set of instructions for performing one or more functions described herein. These instructions need not necessarily be implemented as separate software programs, procedures, or modules. Memory 776 may include additional instructions or fewer instructions. Further, various functions of computing device 700 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
  • FIG. 8A shows an exemplary method 800 that wearable device decision software 224 may be configured to perform.
  • method 800 begins by receiving a wearable device database 220 or a portion thereof from wearable device base software 212.
  • Method 800 may then compare wearable device database 220 to wearable device rules database 228, and, as a function of that comparison, the method may determine whether wearable device rules database 228 requires that wearable device database (or a portion thereof) 220 be sent to a robotic companion. If not, then method 800 may terminate. However, if the determination is positive, then method 800 may initiate a communications interface, after which the method may determine whether the appropriate robotic companion is available. If yes, then the method may apply the companion GUI action the user has selected in robotic companion actions interface 315, then send the raw data or other extracted parameters to robotic companion base software 260.
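  • One way method 800 might look in code, offered as a non-authoritative sketch; companion_available, send, gui_selected_action, and the predicate-style rules are all assumed names rather than elements of the disclosure:

      # Sketch of decision method 800: decide whether readings warrant sending
      # data to a robotic companion, and send only if the companion is reachable.
      def method_800(readings, rules, companion_available, send, gui_selected_action):
          violations = {
              sensor: value
              for sensor, value in readings.items()
              if sensor in rules and not rules[sensor](value)  # rule returns True if OK
          }
          if not violations:
              return                      # nothing to report; method terminates
          if not companion_available():
              return                      # appropriate robotic companion not reachable
          payload = {"violations": violations, "action": gui_selected_action}
          send(payload)                   # forward to robotic companion base software

      method_800(
          {"heart_rate": 115.0},
          {"heart_rate": lambda v: 50.0 <= v <= 100.0},
          companion_available=lambda: True,
          send=print,
          gui_selected_action="purr",
      )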
  • FIG. 8B shows an example of wearable device rules database 228, which may include a list of one or more sensors, one or more rules associated with such sensors, and a location and/or device to which the wearable device database 220 may be sent when the rule is violated (e.g., a robotic cat).
  • the wearable device may transmit the determined label to the robotic companion which, in turn, may act based on this label (e.g., the robotic companion may choose to engage in different actions for a depressed user versus an anxious user).
  • one or more models may be trained (e.g., using a machine learning algorithm such as logistic regression on a data set gathered from a test population) to use multiple types of sensor data and extracted parameters to determine whether an actionable label applies to the user.
  • the model(s) may be adapted to the particular user by receiving feedback as to the accuracy of each given prediction (e.g., manually input by the user or inferred based on subsequent user interactions with the robotic companion), creating additional or replacement training examples for inclusion in the training data set, and retraining the model(s) using the same machine learning algorithm. For example, if a current model identifies the user as being depressed, the robotic companion begins to play fetch with the user, and the user is then sensed to not be depressed (e.g., the existing model immediately thereafter no longer classifies the user as depressed based on physiological parameters, or the robotic companion observes an interaction pattern that is not known to fit with the label, such as immediately and enthusiastically engaging in the game of fetch without coaxing by the robotic companion, etc.), the sensor data and other physiological parameters that led to the initial conclusion may be labeled (e.g., "not depressed") as a new training example.
  • the training algorithm may be re-run to adapt the model(s) to the specific user.
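  • A hedged sketch of this feedback-driven retraining using logistic regression (one algorithm the disclosure names as an example); the feature layout, data, and incorporate_feedback helper are illustrative assumptions:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Initial training set from a test population: [pulse, sleep_hours] -> depressed?
      X = [[62.0, 8.0], [95.0, 4.5], [70.0, 7.5], [100.0, 4.0]]
      y = [0, 1, 0, 1]

      model = LogisticRegression().fit(np.array(X), np.array(y))

      def incorporate_feedback(features, correct_label):
          """Add a corrected example and retrain to adapt the model to this user."""
          global model
          X.append(features)
          y.append(correct_label)
          model = LogisticRegression().fit(np.array(X), np.array(y))

      sample = [96.0, 5.0]
      print("before:", model.predict([sample])[0])  # model may classify as depressed (1)
      incorporate_feedback(sample, 0)               # user feedback: not depressed
      print("after:", model.predict([sample])[0])   # model adapts toward this user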
  • FIG. 9 shows an exemplary method 900 that robotic companion base software 260 may be configured to perform.
  • method 900 begins at step 905 with a determination of whether the robotic companion received a connection to a wearable device. If not, then method 900 may end; however, if the determination is positive, then the method may proceed to step 910, at which it may receive the wearable device database 220 (or a portion thereof) from wearable device decision software 224. Method 900 may then proceed to step 915, at which it may send wearable device database 220 to the robotic companion interactive software 264, cause robotic companion interactive software 264 to execute at step 920, and then return to step 905 to determine whether the wearable device is connected.
  • FIG. 10 shows an exemplary method 1000 that robotic companion interactive software 264 may be configured to perform.
  • method 1000 begins at step 1005 by receiving wearable device database 220 from robotic companion base software 260 and then, at step 1010, comparing the raw data with robotic companion interactive reference table 268. If there is no match found and/or no action required, method 1000 may return to initial step 1005. However, if a match requires an action, then method 1000 may proceed to step 1020 to determine whether multiple activities are required by determining whether, for example, one or more sensors have violated two or more rules associated with separate activities.
  • method 1000 may proceed to step 1025 to determine which action should be performed and/or which label to associate with the user (e.g., depressed, anxious, etc.). If more than one action is required, method 1000 may first, before proceeding to step 1025, proceed to step 1030 to determine relative priorities for the actions and then, starting with the action having the highest priority, proceed to determine which action should be performed and/or which label to associate with the user at step 1025. The label may be determined in step 1025 based on physiological parameters of the user as described above.
  • step 1025 may simply correspond to reading the label from the received message.
  • the robotic companion may revise the received label based on parameters available via the sensors that are local to the robotic companion (e.g., a camera mounted on the robotic companion may be used as a sensor to gauge the user's pulse, respiratory rate, and body language to confirm whether the user is indeed anxious).
  • Method 1000 may then send information related to the action and/or the label to execution software 272 at steps 1035 and/or 1040 and cause interactive software 264 to run robotic companion execution software 272, which may cause the robotic companion to perform the prescribed action at step 1045. If more than one activity is required, method 1000 may determine such at step 1050, return to determine relative priorities of any remaining actions at step 1030, and proceed through the method as described for each remaining action from highest priority to lowest priority. If no further activities are required to be performed, then method 1000 may return to receiving (or waiting to receive) wearable device database 220 at step 1005.
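  • As a non-authoritative sketch, the multi-activity branch of method 1000 (steps 1020-1050) might order and execute actions by priority as follows; the priority values and the execute callable are assumptions:

      import heapq

      def run_required_actions(triggered, priorities, execute):
          # triggered: action names whose rules were violated (step 1020)
          heap = [(priorities.get(a, 99), a) for a in triggered]  # step 1030: prioritize
          heapq.heapify(heap)
          while heap:                                 # steps 1025/1035-1050 loop
              _, action = heapq.heappop(heap)         # highest priority first
              execute(action)                         # step 1045: perform the action

      run_required_actions(
          ["wag_tail", "purr"], {"purr": 1, "wag_tail": 2}, execute=print
      )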
  • FIG. 11A shows an exemplary method 1100 that robotic companion execution software 272 may be configured to perform.
  • method 1100 begins in step 1105 by receiving an indication of an action from robotic companion interactive software 264.
  • Method 1100 may then compare the action with robotic companion program database 276 in step 1110, execute a matching program file from the robotic companion program database 276 in step 1115 (optionally proceeding to execute robotic companion care software 280 such as that shown in FIG. 13 to render more realistic and varied actions), and then return to robotic companion interactive software 264 in step 1120.
  • robotic companion program database 276 may include a list of one or more labels/inputs, one or more robotic companions, one or more actions associated with such companions, and one or more program files corresponding to each action.
  • In some embodiments, program files with higher numbers (e.g., program file 3 vs. program file 1) correspond to a higher frequency of the associated activity. For example, if the activity is "meow," then "File.Meow-xxx2" may cause the robotic companion to meow twice as frequently as "File.Meow-xxx1" but half as frequently as "File.Meow-xxx3," although other implementations could be used.
  • This functionality informs the user that the sensor readings still violate one or more rules in wearable device rules database 228.
  • execution software 272 may match one or more actions in program database 276 with the indication of an action received from interactive software 264.
  • While FIG. 11B defines discrete robotic companion behaviors among multiple "program files" (e.g., .jar files), other arrangements are possible. For example, a robotic companion's elementary actions (e.g., walking, making a sound, jumping, etc.) may serve as building blocks, and the activities triggered by different labels or physiological parameters may be defined in terms of a script of these elementary actions (passing in relevant parameters such as a target location for walking or a sound type to emit) or in terms of an end goal that the software may determine how to achieve using the available elementary actions. A sketch of the scripted approach follows.
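  • A short sketch of such scripted activities, with hypothetical elementary actions and scripts (ELEMENTARY_ACTIONS and ACTIVITY_SCRIPTS are illustrative, not from the disclosure):

      ELEMENTARY_ACTIONS = {
          "walk": lambda target: print(f"walking to {target}"),
          "sound": lambda kind: print(f"emitting {kind} sound"),
          "jump": lambda height: print(f"jumping {height} cm"),
      }

      # An activity is a script: a sequence of (elementary action, argument) pairs.
      ACTIVITY_SCRIPTS = {
          "comfort_user": [("walk", "user"), ("sound", "purr")],
          "energize_user": [("jump", 20), ("sound", "meow"), ("walk", "toy")],
      }

      def run_activity(name):
          for action, arg in ACTIVITY_SCRIPTS[name]:
              ELEMENTARY_ACTIONS[action](arg)   # dispatch with relevant parameters

      run_activity("comfort_user")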
  • FIG. 12 shows an example of robotic companion interactive reference table 268.
  • a table may define correspondences between one or more sensor readings 1204 and one or more selected actions 1208.
  • a user can specify actions 1208 they would like the robotic companion to run depending on different sensor readings 1204.
  • the robotic companion may be affected by a determined label for the user (through application of the robotic companion program database 276) or affected directly by raw sensor data or extracted parameters (through application of the robotic companion interactive reference table 268).
  • FIG. 13 shows an exemplary method 1300 that robotic companion care software 280 may be configured to perform.
  • method 1300 allows the robotic companion to run a first program file in step 1305 then determine whether a new reading from the same sensor still violates one or more rules in step 1310. If the new reading does not violate any rules, method 1300 may run the same program file for an additional time period in step 1315 dependent on how long the user was in the emotional state. For example, if a user was in an anxious state for twelve minutes, then the robotic companion's associated activity may last an additional 10 minutes. Method 1300 may then return to the execution software 272.
  • If the new reading still violates one or more rules, method 1300 may run the second program file in step 1320, which may, for example, perform the same activity, e.g., purring, but at a higher frequency. Again, this time in step 1325, the method 1300 determines whether the most recent sensor readings (or other extracted parameters) are still outside of the defined rules. If not, the method 1300 continues to run the second program in step 1330. Otherwise, the method 1300 proceeds to step 1335, where the robotic companion proceeds to execute the third program for a preselected time and check, in step 1340, whether the parameters have normalized to fall within the rules.
  • If the parameters have normalized, method 1300 proceeds to run, in step 1345, the third program file for an additional amount of time (e.g., 15 minutes). If new sensor readings are still violating the rules after the third program file, method 1300 may change the robotic companion's activity, in step 1350, and repeat the process until new readings no longer violate any rules.
  • FIG. 14 shows a graph 1400 illustrating an example of one way in which companion care software 280 may perform.
  • the robotic companion may perform an activity for a particular duration of time even after the associated sensor data no longer violates any rules. For example, if the user is in an anxious state for 12 minutes and then new sensor data from one or more sensors associated with anxiousness indicates that the user is no longer anxious (e.g., when the sensor readings no longer violate any rules), the robotic companion may perform the same activity for an additional 10 minutes.
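  • The proportional continuation could be computed as in this sketch; the 10/12 ratio simply mirrors the example above, and the linear scaling is an assumption rather than a rule stated in the disclosure:

      EXTRA_TIME_RATIO = 10 / 12  # e.g., 12 minutes anxious -> 10 extra minutes

      def extra_activity_minutes(label_duration_minutes: float) -> float:
          """Time to keep running the current program after the label clears."""
          return EXTRA_TIME_RATIO * label_duration_minutes

      print(extra_activity_minutes(12.0))  # -> 10.0
      print(extra_activity_minutes(6.0))   # -> 5.0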
  • FIG. 15 shows a particular method 1500 that can be used to implement one or more robotic companions whose actions are based upon a user's wearable device sensor readings.
  • Such a method may involve providing a wearable device with memory, which may contain wearable device base software 212, a companion GUI 216, a wearable device database 220, wearable device decision software 224, and a wearable device rules database 228.
  • the method may further involve providing the wearable device with a clock, one or more sensors, a display, a power supply, and a communications interface.
  • Such a method may further involve providing a robotic companion with a memory containing robotic companion base software 260, robotic companion interactive software 264, a robotic companion interactive reference table 268, robotic companion execution software 272, and a robotic companion program database 276.
  • the method may further include providing the robotic companion with one or more sensors, a processor, a camera, a microphone, and a communications interface.
  • the method may then entail executing the companion GUI 216 to determine user options, select wearable device data to be sent to one or more robotic companions, and to specify one or more robotic companion actions.
  • the method may then involve executing wearable device base software 212 to collect sensor data, storing raw data in wearable device database 220, and sending some or all of the database to wearable device decision software 224.
  • the method may then involve executing wearable device decision software 224 to compare wearable device database 220 to the wearable device rules in order to determine whether one or more sensor readings violate one or more rules for that particular sensor. If a reading violates a rule, then wearable device database 220 may be sent to the robotic companion.
  • the method may then involve executing robotic companion base software 260 to determine whether a wearable device has been connected to the robotic companion.
  • the method may involve executing robotic companion interactive software 264 to match the wearable device raw data with one or more elements of robotic companion interactive reference table 268 in order to determine whether the sensor readings from the wearable warrant an action from the robotic companion.
  • If so, the method may then involve executing robotic companion execution software 272 to search robotic companion program database 276 and run one or more program files associated with the robotic companion action.
  • any one or more of the aspects and embodiments described herein may be conveniently implemented using one or more machines (e.g., one or more computing devices that are utilized as a user computing device for an electronic document, one or more server devices, such as a document server, etc.) programmed according to the teachings of the present specification, as will be apparent to those of ordinary skill in the computer art.
  • Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those of ordinary skill in the software art.
  • Aspects and implementations discussed above employing software and/or software modules may also include appropriate hardware for assisting in the implementation of the machine executable instructions of the software and/or software module.
  • Such software may be a computer program product that employs a machine-readable storage medium.
  • a machine-readable storage medium may be any medium that is capable of storing and/or encoding a sequence of instructions for execution by a machine (e.g., a computing device) and that causes the machine to perform any one of the methodologies and/or embodiments described herein.
  • Examples of a machine-readable storage medium include, but are not limited to, a magnetic disk, an optical disc (e.g., CD, CD-R, DVD, DVD-R, etc.), a magneto-optical disk, a read-only memory "ROM" device, a random access memory "RAM" device, a magnetic card, an optical card, a solid-state memory device, an EPROM, an EEPROM, and any combinations thereof.
  • a machine-readable medium, as used herein, is intended to include a single medium as well as a collection of physically separate media, such as, for example, a collection of compact discs or one or more hard disk drives in combination with a computer memory.
  • a machine-readable storage medium does not include transitory forms of signal transmission.
  • Such software may also include information (e.g., data) carried as a data signal on a data carrier, such as a carrier wave.
  • machine-executable information may be included as a data-carrying signal embodied in a data carrier in which the signal encodes a sequence of instruction, or portion thereof, for execution by a machine (e.g., a computing device) and any related information (e.g., data structures and data) that causes the machine to perform any one of the methodologies and/or embodiments described herein.
  • Examples of a computing device include, but are not limited to, an electronic book reading device, a computer workstation, a terminal computer, a server computer, a handheld device (e.g., a tablet computer, a smartphone, etc.), a web appliance, a network router, a network switch, a network bridge, any machine capable of executing a sequence of instructions that specify an action to be taken by that machine, and any combinations thereof.
  • a computing device may include and/or be included in a kiosk.
  • FIG. 16 shows a diagrammatic representation of one embodiment of a computing device in the exemplary form of a computer system 1600 within which a set of instructions for causing a control system, such as any one or more of various systems of the present disclosure, such as the systems illustrated in other figures of this disclosure, as well as systems that would be apparent to those of ordinary skill in the art after reading this entire disclosure, to perform any one or more of the aspects and/or methodologies of the present disclosure may be executed. It is also contemplated that multiple computing devices may be utilized to implement a specially configured set of instructions for causing one or more of the devices to perform any one or more of the aspects and/or methodologies of the present disclosure.
  • Computer system 1600 includes a processor 1604 and a memory 1608 that communicate with each other, and with other components, via a bus 1612.
  • Bus 1612 may include any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures.
  • Memory 1608 may include various components (e.g., machine-readable media) including, but not limited to, a random access memory component, a read only component, and any combinations thereof.
  • a basic input/output system 1616 (BIOS), including basic routines that help to transfer information between elements within computer system 1600, such as during start-up, may be stored in memory 1608.
  • Memory 1608 may also include (e.g., stored on one or more machine-readable media) instructions (e.g., software) 1620 embodying any one or more of the aspects and/or methodologies of the present disclosure.
  • memory 1608 may further include any number of program modules including, but not limited to, an operating system, one or more application programs, other program modules, program data, and any combinations thereof.
  • Computer system 1600 may also include a storage device 1624.
  • Examples of a storage device include, but are not limited to, a hard disk drive, a magnetic disk drive, an optical disc drive in combination with an optical medium, a solid-state memory device, and any combinations thereof.
  • Storage device 1624 may be connected to bus 1612 by an appropriate interface (not shown).
  • Example interfaces include, but are not limited to, SCSI, advanced technology attachment (ATA), serial ATA, universal serial bus (USB), IEEE 1394 (FIREWIRE), and any combinations thereof.
  • storage device 1624 (or one or more components thereof) may be removably interfaced with computer system 1600 (e.g., via an external port connector (not shown)).
  • storage device 1624 and an associated machine-readable medium 1628 may provide nonvolatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for computer system 1600.
  • software 1620 may reside, completely or partially, within machine-readable medium 1628. In another example, software 1620 may reside, completely or partially, within processor 1604.
  • Computer system 1600 may also include an input device 1632.
  • a user of computer system 1600 may enter commands and/or other information into computer system 1600 via input device 1632.
  • Examples of an input device 1632 include, but are not limited to, an alphanumeric input device (e.g., a keyboard), a pointing device, a joystick, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), a cursor control device (e.g., a mouse), a touchpad, an optical scanner, a video capture device (e.g., a still camera, a video camera), a touchscreen, and any combinations thereof.
  • Input device 1632 may be interfaced to bus 1612 via any of a variety of interfaces (not shown) including, but not limited to, a serial interface, a parallel interface, a game port, a USB interface, a FIREWIRE interface, a direct interface to bus 1612, and any combinations thereof.
  • Input device 1632 may include a touch screen interface that may be a part of or separate from display 1636, discussed further below.
  • Input device 1632 may be utilized as a user selection device for selecting one or more graphical representations in a graphical interface as described above.
  • A user may also input commands and/or other information to computer system 1600 via storage device 1624 (e.g., a removable disk drive, a flash drive, etc.) and/or network interface device 1640.
  • A network interface device, such as network interface device 1640, may be utilized for connecting computer system 1600 to one or more of a variety of networks, such as network 1644, and one or more remote devices 1648 connected thereto. Examples of a network interface device include, but are not limited to, a network interface card (e.g., a mobile network interface card, a LAN card), a modem, and any combination thereof.
  • Examples of a network include, but are not limited to, a wide area network (e.g., the Internet, an enterprise network), a local area network (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a data network associated with a telephone/voice provider (e.g., a mobile communications provider data and/or voice network), a direct connection between two computing devices, and any combinations thereof.
  • A network, such as network 1644, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used.
  • Information may be communicated to and/or from computer system 1600 via network interface device 1640.
  • Computer system 1600 may further include a video display adapter 1652 for communicating a displayable image to a display device, such as display device 1636.
  • Examples of a display device include, but are not limited to, a liquid crystal display (LCD), a cathode ray tube (CRT), a plasma display, a light emitting diode (LED) display, and any combinations thereof.
  • Display adapter 1652 and display device 1636 may be utilized in combination with processor 1604 to provide graphical representations of aspects of the present disclosure.
  • Computer system 1600 may include one or more other peripheral output devices including, but not limited to, an audio speaker, a printer, and any combinations thereof.
  • Peripheral output devices may be connected to bus 1612 via a peripheral interface 1656.
  • Examples of a peripheral interface include, but are not limited to, a serial port, a USB connection, a FIREWIRE connection, a parallel connection, and any combinations thereof.
  • Various example embodiments may be implemented in hardware or firmware.
  • Various exemplary embodiments may be implemented as instructions stored on a machine-readable storage medium, which may be read and executed by at least one processor to perform the operations described in detail herein.
  • A machine-readable storage medium may include any mechanism for storing information in a form readable by a machine, such as a personal or laptop computer, a server, or other computing device.
  • A machine-readable storage medium may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and similar storage media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Primary Health Care (AREA)
  • Evolutionary Computation (AREA)
  • Epidemiology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Chemical & Material Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Medicinal Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Robotics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A system enables one or more of a robotic companion's actions to be determined as a function of a user's wearable device sensor readings, such that the robotic companion may provide the user with visual, aural, and/or other types of feedback when a sensor reading meets or breaks one or more rules. At a high level, aspects of the present disclosure are directed to systems, methods, and software for enabling a robotic companion to perform an action as a function of a user's wearable device sensor readings. In an embodiment, a user may specify one or more rules that determine how a robotic companion will respond as a function of one or more sensor readings received from a wearable device.

Description

ROBOTIC COMPANION SENSITIVE TO PHYSIOLOGICAL DATA GATHERED BY WEARABLE DEVICES
RELATED APPLICATION DATA
[0001] This application claims the benefit of priority of U.S. Provisional Patent Application Serial No. 62/130,201, filed on March 9, 2015, and titled "ROBOTIC COMPANION CAPABLE OF PERFORMING ACTIONS AS A FUNCTION OF WEARABLE DEVICE SENSOR READINGS," which is incorporated by reference herein in its entirety for all purposes.
TECHNICAL FIELD
[0002] Various embodiments described herein relate generally to the field of wearable technology. In particular, but not exclusively, various embodiments are directed to a robotic companion capable of performing actions as a function of multiple wearable device sensor readings.
BACKGROUND
[0003] Various robotic companions have been implemented ranging from simple toys to artificially intelligent and situationally aware androids. In the race to develop highly useful and marketable robotic companions, various combinations of sensors and corresponding automated responses to resulting sensor readings have been implemented. However, such automated responses are mechanical and predictable in nature, which can lead to decreased interest in and use of robotic companions.
SUMMARY
[0004] In some aspects, the present disclosure is directed to controlling a robotic companion as a function of one or more wearable device sensor readings collected by a wearable device. By using one or more aspects of the present disclosure, robotic companions can be made to perform actions in a realistic and interesting way such that users can receive meaningful and varied responses from the companion based on wearable device sensor readings collected by a wearable device worn by such users. Through use of the methods herein, a robotic companion can be made that recognizes which actions work for a particular user by monitoring the user's physiological characteristics and modifies its behavior to optimize a desired reaction from the user. As a simple example, a robotic companion may detect that a user's pulse is too high and take actions intended to reduce the user's pulse.
[0005] Various embodiments are directed to a method performed by a robotic companion device for determining behavior based on a user's wearable device, the method including: wirelessly receiving, from a wearable device of the user, information regarding the user's current physiological state; determining, based on the received information, a label identifying the user's current physiological state; identifying a first activity associated with the label; and engaging an autonomous interaction with the user according to the first activity.
[0006] Various embodiments are directed to a non-transitory machine-readable storage medium encoded with instructions for execution by a robotic companion device for determining behavior based on a user's wearable device, the non-transitory machine-readable storage medium including: instructions for wirelessly receiving, from a wearable device of the user, information regarding the user's current physiological state; instructions for determining, based on the received information, a label identifying the user's current physiological state; instructions for identifying a first activity associated with the label; and instructions for engaging an autonomous interaction with the user according to the first activity.
[0007] Various embodiments are directed to a robotic companion device for determining behavior based on a user's wearable device, the robotic companion device including: a wireless communications interface configured to receive, from a wearable device of the user, information regarding the user's current physiological state; a memory device, and a processor configured to determine, based on the received information, a label identifying the user's current physiological state; identify a first activity associated with the label; and engage an autonomous interaction with the user according to the first activity.
[0008] Various embodiments are described wherein the information regarding the user's current state includes the label.
[0009] Various embodiments are described wherein the information regarding the user's current state includes at least one physiological parameter.
[0010] Various embodiments additionally include receiving additional physiological parameters; determining whether the label is no longer applicable to the user based on at least one of the physiological parameters and the additional physiological parameters; and when the label is no longer applicable, ceasing interaction with the user according to the at least one activity after a time period has elapsed.
[0011] Various embodiments additionally include setting the time period in proportion to the duration of applicability of the label.
[0012] Various embodiments additionally include receiving additional physiological parameters; determining whether the label is no longer applicable to the user based on at least one of the physiological parameters and the additional physiological parameters; determining that the label remains applicable after a predetermined time period; based on determining that the label remains applicable after the predetermined time period, identifying a second activity associated with the label; and engaging an additional autonomous interaction with the user according to the second activity.
[0013] Various embodiments are described wherein determining the label includes applying a learned model associated with the label to the received at least one physiological parameter to determine whether the label is applicable to the user's current state.
[0014] Various embodiments additionally include receiving feedback indicating the correctness of the label with respect to the user's current state; creating a new training example from the at least one physiological parameter and the received feedback; and retraining the learned model based on the new training example using a machine learning process.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] In order to better understand various example embodiments, reference is made to the accompanying drawings, wherein:
[0016] FIG. 1 is a flow diagram illustrating a method of enabling a robotic companion to perform an action as a function of a user's wearable device sensor readings;
[0017] FIG. 2 is a block diagram of a system that can be used to implement the method of FIG. 1;
[0018] FIGS. 3A-D are representative screenshots depicting various aspects of an exemplary companion interface;
[0019] FIG. 4 is a flow diagram illustrating a method of controlling a robotic companion;
[0020] FIG. 5 illustrates particular implementations of various steps of a method of controlling a robotic companion;
[0021] FIG. 6 is a table containing an example of information that may be stored in a wearable device database;
[0022] FIG. 7 is a block diagram of an example computing device that may implement various ones of the features and processes of the present disclosure;
[0023] FIG. 8A is a flow diagram illustrating a wearable device decision method;
[0024] FIG. 8B is a table containing an example of information that may be stored in a wearable device rules database;
[0025] FIG. 9 is a flow diagram illustrating a robotic companion base method;
[0026] FIG. 10 is a flow diagram illustrating a robotic companion interactive method;
[0027] FIG. 11A is a flow diagram illustrating a robotic companion execution method;
[0028] FIG. 11B is a table containing an example of information that may be stored in a robotic companion program database;
[0029] FIG. 12 is a table containing an example of information that may be stored in a robotic companion interactive reference table;
[0030] FIG. 13 is a flow diagram illustrating a companion care software method;
[0031] FIG. 14 is a graph illustrating an example of how a robotic companion may behave as a function of an amount of time during which a user exhibits a particular state;
[0032] FIG. 15 is a flow diagram illustrating a particular method that can be used to implement one or more robotic companions whose actions are based upon a user's wearable device sensor readings; and
[0033] FIG. 16 is a block diagram of a computing system that can be used to implement any one or more of the methodologies disclosed herein and any one or more portions thereof.
[0034] The drawings are not necessarily to scale and may be illustrated by phantom lines, diagrammatic representations and fragmentary views. In certain instances, details that are not necessary for an understanding of the embodiments or that render other details difficult to perceive may have been omitted.
DETAILED DESCRIPTION
[0035] The description and drawings presented herein illustrate various principles. It will be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody these principles and are included within the scope of this disclosure. As used herein, the term "or" refers to a non-exclusive or (i.e., and/or), unless otherwise indicated (e.g., "or else" or "or in the alternative"). Additionally, the various embodiments described herein are not necessarily mutually exclusive and may be combined to produce additional embodiments that incorporate the principles described herein.
[0036] In view of the foregoing, it would be beneficial to enable a robotic companion to simulate an attunement to a user's current condition (e.g., emotional or health state). Accordingly, the present disclosure is directed to a system enabling one or more of a robotic companion's actions to be determined as a function of a user's wearable device sensor readings such that the robotic companion may provide the user with visual, aural, and/or other types of feedback when a sensor reading meets or breaks one or more rules. At a high level, aspects of the present disclosure are directed to systems, methods, and software for enabling a robotic companion to perform an action as a function of a user's wearable device sensor readings. In an embodiment, a user may specify one or more rules that determine how a robotic companion will respond as a function of one or more sensor readings received from a wearable device.
[0037] FIG. 1 illustrates a method 100 that can be implemented by a robotic companion or other supporting device in order to achieve the aforementioned functionality. As shown, method 100 includes: a step 105 of monitoring one or more first wearable device sensor readings; a step 110 of comparing the one or more first readings to one or more predetermined rules; a step 115 of, when one or more of the first readings violates one or more of the rules, assessing at least one of a type and a severity of a health condition as a function of the one or more first readings that violate the one or more rules; a step 120 of causing the robotic companion to perform a first action as a function of the at least one of the type and the severity of the health condition; a step 125 of monitoring one or more second wearable device sensor readings; a step 130 of reassessing the at least one of the type and the severity of the health condition as a function of the second one or more wearable device sensor readings; a step 135 of determining a duration of time that the at least one of the type and the severity of the health condition persists as a function of said reassessing; and a step 140 of causing the robotic companion to perform a second action different from the first action as a function of the duration of time.
[0038] Monitoring one or more first wearable device sensor readings at step 105 may entail requesting, retrieving, and/or automatically receiving sensor readings from one or more wearable devices, which may include, for example, a smart phone, a pedometer, and/or a heart rate monitor, among others. Comparing the one or more first readings to one or more predetermined rules at step 110 may include comparing a reading to one or more upper or lower limits, among other types of rules that can be set. Assessing a type and severity of a health condition at step 115 may entail, for example, determining that a user is experiencing elevated levels of stress as a result of wearable device sensor readings that indicate that the user's pulse rate is higher than average. Causing the robotic companion to perform a first action at step 120 may include, for example, causing the robotic companion to make a purring sound to attempt to calm the user. Monitoring one or more second wearable device sensor readings at step 125 may include, for example, comparing a user's blood pressure or other sensed physiological characteristic to one or more rules. Reassessing at step 130 may entail, for example, determining that the user is still experiencing high levels of stress given the results of the comparison of the user's blood pressure or other sensed physiological characteristic readings with the one or more rules (e.g., above or below a certain level). Determining a duration of time that the health condition (in this example, stress) persists at step 135 may include monitoring a clock or timer to determine an amount of time that elapses from the time the condition was first recognized (in this example, when the elevated pulse rate is first detected) to the reassessment (in this example, the determination that the user's blood pressure is elevated). Based on that amount of time, at step 140 the robotic companion may then perform a second action different from the first. For example, the robot may determine that making a purring sound has not caused the user to experience a reduction in stress and, as such, may perform a different action, such as jumping, meowing, or wagging its tail, among others. Although specific examples have been provided above, one of ordinary skill in the art will readily understand after reading this disclosure in its entirety the full spectrum of possibilities enabled and embodied herein. By using a system like that of FIG. 2 to implement a method like that of FIG. 1, users can quickly and easily utilize a robotic companion that can perform an action as a function of a user's wearable device sensor readings.
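By way of illustration only, the control flow of method 100 might be sketched as follows in Python. The helper callables read_sensors() and perform_action(), the single pulse rule, and all timing constants are assumptions of this sketch and are not defined by this disclosure.

    import time

    RULES = {"pulse": (45, 220)}  # illustrative lower/upper limits per sensor

    def violates(readings, rules):
        # Return the sensors whose readings fall outside their allowed range.
        return {s: v for s, v in readings.items()
                if s in rules and not rules[s][0] <= v <= rules[s][1]}

    def companion_loop(read_sensors, perform_action, escalate_after=60.0):
        readings = read_sensors()                            # step 105: first readings
        if not violates(readings, RULES):                    # step 110: compare to rules
            return
        condition = ("stress",                               # step 115: assess type and
                     "high" if readings["pulse"] > 220 else "low")  # severity
        perform_action("purr", condition)                    # step 120: first action
        started = time.monotonic()
        while violates(read_sensors(), RULES):               # steps 125/130: reassess
            if time.monotonic() - started > escalate_after:  # step 135: duration check
                perform_action("meow", condition)            # step 140: second action
                return
            time.sleep(5)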
[0039] FIG. 2 illustrates a system 200 that can be used to implement the method of FIG. 1. As shown, system 200 may include a wearable device 204 with memory 208, which may contain wearable device base software 212, a companion graphical user interface or "GUI" 216, a wearable device database 220, wearable device decision software 224, and a wearable device rules database 228. Wearable device 204 may further include a clock 232, one or more sensors 236, a display 240, a power supply 244, and a communications interface or "comm" 248, which may utilize, e.g., low power Bluetooth™ and/or any other appropriate standard of communication. In some embodiments, the sensor devices 236 may sense physiological parameters about the user. For example, the sensor devices 236 may include accelerometers, conductance sensors, optical sensors, temperature sensors, microphones, cameras, etc. These or other sensors may be useful for sensing, computing, estimating, or otherwise acquiring physiological parameters descriptive of the wearer such as, for example, steps taken, walking/running distance, standing hours, heart rate, respiratory rate, blood pressure, stress level, body temperature, calories burned, resting energy expenditure, active energy expenditure, height, weight, sleep metrics, etc.
[0040] System 200 may further include a robotic companion 252 with a memory 256 containing robotic companion base software 260, robotic companion interactive software 264, a robotic companion interactive reference table 268, robotic companion execution software 272, a robotic companion program database 276, and robotic companion care software 280. Robotic companion 252 may further include one or more sensors 282, a processor 284, a camera 286, a microphone or "mic" 288, and a communications interface or "comm" 290. The sensors 282 may include any of those sensors described above in relation to the wearable device, though, as will be apparent, the location of such sensors 282 within the robotic companion rather than within a carried or otherwise worn device will cause the data reported by the sensors to reflect other parameters. In some cases, these parameters are not the user's physiological parameters, while in other cases they may reflect the user's physiological parameters only at intermittent times, such as when the user is touching, carrying, or otherwise interacting with the robotic companion 252. In use, robotic companion 252 may interact with wearable device 204 and take actions based upon wearable device sensor readings in order to provide a user with visual and/or other types of feedback from the robotic companion, e.g., when one or more sensor readings deviate from a predetermined range.
[0041] The robotic companion 252 may be virtually any machine that autonomously performs one or more actions within a physical environment of a user. For example, the robotic companion 252 may be a toy such as a robotic cat or other animal. In such an embodiment, the robotic companion 252 may perform actions such as navigating and moving around its environment, outputting audible interactions (e.g., meowing or purring sounds), physically interacting with the user (e.g., rubbing against the user's leg, "licking" the user's face), and performing other simulations of pet behavior (e.g., running around the room, jumping, lying down for a "nap," etc.). In some embodiments, the robotic companion 252 may perform more useful functions beyond user entertainment. For example, the robotic companion 252 may fetch objects, pour drinks, prepare food, vacuum, fold laundry, or perform other household tasks autonomously. As another example, the robotic companion 252 may perform health-related tasks such as medication dispensation or administration. Various additional functions capable of performance by a robotic companion with respect to a user or the surrounding environment will be apparent. It will also be understood that a robotic companion need not be ambulatory and, in some embodiments, may be a stationary device; for example, the robotic companion may be a stationary coffee machine that autonomously decides when to brew a cup or pot of coffee and how to brew the coffee (e.g., which beans to grind, whether and how much milk to add, whether and how much sugar or other sweetener to add, or what latte art to create on a prepared cup).
[0042] As used herein, the term "processor" will be understood to encompass various hardware devices such as, for example, microprocessors, field-programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), and other hardware devices capable of performing the various functions described herein as being performed by the wearable device 204, robotic companion 252, or other device. Further, the respective memory devices 208, 256 may include various devices such as L1/L2/L3 cache, system memory, or storage devices. As used herein, the term "non-transitory machine-readable storage medium" will be understood to refer to both volatile memory (e.g., SRAM and DRAM) and non-volatile memory (e.g., flash, magnetic, and optical memory) devices, but to exclude mere transitory signals. While various embodiments may be described herein with respect to software or instructions "performing" various functions, it will be understood that such functions will actually be performed by hardware devices such as a processor executing the software or instructions in question. In some embodiments, such as embodiments utilizing one or more ASICs, various functions described herein may be hardwired into the hardware operation; in such embodiments, the software or instructions corresponding to such functionality may be omitted.
[0043] FIGS. 3A-3D show various examples of a companion GUI 216 and related components. As shown in FIG. 3A, a companion GUI 216 may include buttons for viewing and/or managing a user profile (see, e.g., FIG. 3B), wearable device data (see, e.g., FIG. 3C), robotic companion actions (see, e.g., FIG. 3D), and raw data from wearable device database 220. User profile information may be displayed in companion GUI 216 as shown in FIG. 3A, and may include information such as a user ID and one or more commonly or currently connected robotic companions. For example, in FIG. 3A, CatXXX1 and DogXXX2 robotic companions are listed. FIG. 3B shows an example of a user profile interface 305 that a user can configure based on their preferences. In the example shown in FIG. 3B, the user has selected a medical schedule, which may be stored on wearable device 204 such that robotic companion 252 can recognize it, and a carry medicine option, which may enable the robotic companion to carry one or more medicines. Although not selected in FIG. 3B, a user can configure user profile 305 such that robotic companion 252 will store wearable device data instead of or in addition to storing it on wearable device 204. The user can also select an option that allows data to be sent to others. By using the robotic companions button shown in FIG. 3B, a user can choose one or more different robotic companions to carry out various actions, such as, for example, a cat, a dog, or a fish, among others. The user profile may also include robotic companion activity duration controls that allow the user or an administrator to input a desired amount of time for various actions, which may be specified by one or more programs, to be performed. The user profile interface may further include a select and save button allowing the user to select and save their settings.
[0044] FIG. 3C shows an example of a wearable device data interface 310. By manipulating aspects of this interface, a user can select one or more physiological parameters they would like the wearable device to store in wearable device database 220 and/or to send (or have sent) to one or more robotic companions. As shown, the wearable device data interface may include a priority list enabling a user to specify relative priorities of multiple sensors that may be associated with one or more actions. FIG. 3D shows an example of a robotic companion actions interface 315. As shown, a user can select one or more actions for a robotic companion to perform; for each robotic companion, these actions will be dependent on the nature of the robotic companion, i.e., whether it is a robotic cat, robotic dog, or robotic bird, among others.
[0045] FIG. 4 shows an example of an overall software process 400 that can be used to implement interaction between one or more wearable devices and one or more robotic companions whose actions are based upon a user's wearable device sensor readings. As shown, process 400 may begin with wearable device base software 212 executing and collecting sensor readings and then storing related data in wearable device database 220. In some embodiments, the wearable device base software 212 may further extract or otherwise obtain additional parameters (e.g., physiological parameters) from the raw data collected from the sensors. For example, the wearable device base software 212 may process color data over time received from an optical sensor on the underside of a watch device to extract a pulse parameter. As another example, the wearable device base software 212 may obtain accelerometer data and transmit the data (e.g., in raw or processed form such as, for example, accumulated or average accelerometer data over a time window) to another device (e.g., a separate mobile device with greater processing capability or a remote server or virtual machine) for processing. The other device may then extract additional parameters (e.g., energy expenditure) to be returned to the wearable device or robotic companion. After obtaining relevant physiological or other parameters, the wearable device base software 212 may provide one or more subsets of such data to wearable device decision software 224.
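As a non-authoritative illustration of the pulse-extraction example above (the disclosure does not specify an algorithm), a naive peak-counting approach over a window of optical samples might look like this; the function name and the sampling figures are assumptions:

    def estimate_pulse_bpm(samples, sample_rate_hz):
        # Mean-centre the optical intensity signal, count upward zero-crossings
        # as beats, and convert the count to beats per minute. Crude, but it
        # illustrates turning raw sensor data into a physiological parameter.
        mean = sum(samples) / len(samples)
        centred = [s - mean for s in samples]
        beats = sum(1 for a, b in zip(centred, centred[1:]) if a <= 0 < b)
        seconds = len(samples) / sample_rate_hz
        return beats * 60.0 / seconds

For example, ten seconds of green-channel intensity sampled at 25 Hz from a wearer with a 72 bpm pulse would yield approximately 72.0.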
[0046] Wearable device decision software 224 may then compare wearable device database 220 with wearable device rules database 228, and if any of the sensor readings meet or deviate from the rules, depending on the nature of the rules, the decision software may communicate with companion GUI 216 to ascertain user-selected options and send information to robotic companion base software 260 based on the options, rules, and/or sensor readings. Robotic companion base software 260 may then provide that information to robotic companion interactive software 264, which may compare, e.g., wearable device database 220 with robotic companion interactive reference table 268. If an action is required of the robotic companion based on that comparison, then interactive software 264 may send an indication of such an action to robotic companion execution software 272, which may compare the indication with robotic companion program database 276. Robotic companion program database 276 may then send a program file enabling the robotic companion to execute the action to execution software 272, after which the robotic companion may execute the action and/or the process may return to interactive software 264. Under some circumstances, robotic companion execution software 272 may execute robotic companion care software 280 in order to provide more realistic and varied behavior. When an action is no longer necessary, which may be the case after a user has, e.g., taken their scheduled medication, then process 400 may end.
[0047] FIG. 5 shows an exemplary method 500 that wearable device base software 212 may be configured to perform. In this example, method 500 begins by executing routine operations (step 505), such as collecting sensor data, running data analysis, and reporting and/or storing the sensor data and/or results of the analysis. Method 500 then stores the raw data in wearable device database 220 (step 510) and then sends wearable device database 220 to wearable device decision software 224 (step 515). FIG. 6 shows an example of a wearable device database 220. Such a database may include one or more wearable IDs, dates, times, indications of sensors that have produced one or more readings, raw data associated with such one or more readings, and one or more associated labels. In some embodiments, the labels can be added after wearable device decision software 224 compares wearable device database 220 with wearable device rules database 228. If the resulting sensor reading is lower or higher than a pre-selected rule, a label may be added to that specific sensor reading as a way to determine emotion.
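One plausible in-memory shape for such records, with field names inferred from the columns just described (the names themselves are assumptions, not part of this disclosure), is:

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class WearableRecord:
        wearable_id: str              # e.g., an identifier for wearable device 204
        timestamp: datetime           # date and time of the reading
        sensor: str                   # e.g., "pulse"
        raw_value: float              # the raw reading itself
        label: Optional[str] = None   # added after comparison with the rules

    record = WearableRecord("WDxxx1", datetime.now(), "pulse", 230.0)
    if record.raw_value > 220:        # an illustrative rule from the description
        record.label = "anxious"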
Example System Architecture for a Wearable Device
[0048] FIG. 7 illustrates an exemplary wearable computing device 700 that may implement various features and/or processes of the present disclosure, such as the features and processes illustrated in other figures of this disclosure, as well as features and processes that would be apparent to those of ordinary skill in the art after reading this entire disclosure. As shown, computing device 700 may include a memory interface 704, one or more data processors, image processors and/or central processing units 708, and a peripherals interface 712. Memory interface 704, one or more processors 708, and/or peripherals interface 712 may be separate components or may be integrated in one or more integrated circuits. The various components in computing device 700 may be coupled by one or more communication buses or signal lines.
[0049] Sensors, devices, and subsystems may be coupled to peripherals interface 712 to facilitate one or more functionalities. For example, a motion sensor 716, a light sensor 720, and a proximity sensor 724 may be coupled to peripherals interface 712 to facilitate orientation, lighting, and/or proximity functions. Other sensors 728 may also be connected to peripherals interface 712, such as a global navigation satellite system (GNSS) (e.g., GPS receiver), a temperature sensor, a biometric sensor, and/or one or more other sensing devices, to facilitate related functionalities.
[0050] A camera subsystem 732 and an optical sensor 736, e.g., a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, may be utilized to facilitate camera functions, such as recording images and/or video. Camera subsystem 732 and optical sensor 736 may be used to collect images of a user to be used during authentication of a user, e.g., by performing facial recognition analysis.
[0051] Communication functions may be facilitated through one or more wireless communication subsystems 740, which may include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of communication subsystem 740 may depend on the communication network(s) over which computing device 700 is intended to operate. For example, computing device 700 may include communication subsystems 740 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi™ or WiMax™ network, and/or a Bluetooth™ network. In particular, wireless communication subsystems 740 may include hosting protocols such that one or more devices 700 may be configured as a base station for other wireless devices.
[0052] An audio subsystem 744 may be coupled to a speaker 748 and a microphone 752 to facilitate voice-enabled functions, such as speaker recognition, voice replication, digital recording, and/or telephony functions. Audio subsystem 744 may be configured to facilitate processing voice commands, voice-printing, and voice authentication.
[0053] I/O subsystem 756 may include a touch-surface controller 760 and/or other input controller(s) 764. Touch-surface controller 760 may be coupled to a touch surface 768. Touch surface 768 and touch-surface controller 760 may, for example, detect contact and movement or a lack thereof using one or more of any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and/or surface acoustic wave technologies, optionally as well as other proximity sensor arrays and/or other elements for determining one or more points of contact with touch surface 768.
[0054] Other input controller(s) 764 may be coupled to other input/control devices 772, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. One or more related buttons or other controls (not shown) may include one or more sets of up/down buttons for volume and/or amplitude control of speaker 748 and/or microphone 752. Using the same or similar buttons or other controls, a user may activate a voice control, or voice command, module that enables the user to speak commands into microphone 752 to cause device 700 to execute the spoken command. The user may customize functionality of one or more buttons or other controls. Touch surface 768 may, for example, also be used to implement virtual or soft buttons and/or a keyboard.
[0055] In some implementations, computing device 700 may present recorded audio and/or video files, such as MP3, AAC, and/or MPEG files. In some implementations, computing device 700 may include the functionality of an MP3 player, such as an iPod™. Computing device 700 may, therefore, include a 36-pin connector that is compatible with related iPod™ hardware. Other input/output and control devices may also be used.
[0056] As shown, memory interface 704 may be coupled to one or more types of memory 776. Memory 776 may include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). Memory 776 may store an operating system 780, such as Darwin™, RTXC, LINUX, UNIX, OS X™, WINDOWS™, and/or an embedded operating system such as VxWorks. Operating system 780 may include instructions for handling basic system services and/or for performing hardware-dependent tasks. In some implementations, operating system 780 may include a kernel (e.g., UNIX kernel). Further, in some implementations, operating system 780 may include instructions for performing voice authentication.
[0057] Memory 776 may also store communication instructions 782 to facilitate communicating with one or more additional devices, one or more computers, and/or one or more servers. Additionally or alternatively, memory 776 may include: graphical user interface instructions 784 to facilitate graphic user interface processing; sensor processing instructions 786 to facilitate sensor-related processing and functions; phone instructions 788 to facilitate phone-related processes and functions; electronic messaging instructions 790 to facilitate electronic-messaging related processes and functions; web browsing instructions 792 to facilitate web browsing-related processes and functions; media processing instructions 794 to facilitate media processing-related processes and functions; GNSS/Navigation instructions 796 to facilitate GNSS and navigation-related processes and functions; and/or camera instructions 797 to facilitate camera-related processes and functions. Memory 776 may store other software instructions 798 to facilitate other processes and functions. For example, other software instructions 798 may include instructions for counting steps the user takes when device 700 is worn.
[0058] Memory 776 may also store other software instructions (not shown), such as web video instructions to facilitate web video-related processes and functions and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, media processing instructions 794 may be divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) 799 or similar hardware identifier may also be stored in memory 776.
[0059] Each of the above identified instructions and applications may correspond to a set of instructions for performing one or more functions described herein. These instructions need not necessarily be implemented as separate software programs, procedures, or modules. Memory 776 may include additional instructions or fewer instructions. Further, various functions of computing device 700 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
[0060] FIG. 8A shows an exemplary method 800 that wearable device decision software 224 may be configured to perform. In this example, method 800 begins by receiving wearable device database 220 (or a portion thereof) from wearable device base software 212. Method 800 may then compare wearable device database 220 to wearable device rules database 228, and, as a function of that comparison, the method may determine whether wearable device rules database 228 requires that wearable device database 220 (or a portion thereof) be sent to a robotic companion. If not, then method 800 may terminate. However, if the determination is positive, then method 800 may initiate a communications interface, after which the method may determine whether the appropriate robotic companion is available. If yes, then the method may apply the companion GUI action the user has selected in robotic companion actions interface 315, then send the raw data or other extracted parameters to robotic companion base software 260.
[0061] FIG. 8B shows an example of wearable device rules database 228, which may include a list of one or more sensors, one or more rules associated with such sensors, and a location and/or device to which the wearable device database 220 may be sent when the rule is violated (e.g., a robotic cat). For example, as shown in FIG. 8B, if the pulse is less than 45 or greater than 220, then wearable device database 220 (or a portion thereof such as, for example, raw or extracted parameters acquired within a particular time window or that have not yet been sent to the robotic companion) will be sent to the robotic cat if it is available. As shown in the label columns, if the pulse is less than 45, system 200 may consider the user to be depressed; similarly, if the pulse is greater than 220, the system may consider the user to be anxious. As will be understood by those of ordinary skill after reading this disclosure in its entirety, various sensors, rules, locations, and labels can be used other than those explicitly shown in FIG. 8B. In some embodiments, rather than or in addition to sending parameters to the robotic companions, the wearable device may transmit the determined label to the robotic companion which, in turn, may act based on this label (e.g., the robotic companion may choose to engage in different actions for a depressed user versus an anxious user).
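To make the rules-table idea concrete, the FIG. 8B example could be modelled as data; the thresholds, labels, and destination below come from the example above, while the list-of-tuples layout and function name are assumptions of this sketch:

    RULES_DB = [
        # (sensor, violated-predicate, destination, label)
        ("pulse", lambda v: v < 45,  "robotic_cat", "depressed"),
        ("pulse", lambda v: v > 220, "robotic_cat", "anxious"),
    ]

    def evaluate(sensor, value):
        # Return (destination, label) for the first violated rule, else None.
        for rule_sensor, violated, destination, label in RULES_DB:
            if rule_sensor == sensor and violated(value):
                return destination, label
        return None

    assert evaluate("pulse", 230) == ("robotic_cat", "anxious")
    assert evaluate("pulse", 70) is None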
[0062] While various examples described herein relate to an emotional state of the user, it will be apparent that various labels may relate, instead, to different aspects of a user's state and that such labels may be useful for different types of robotic companions. For example, where the robotic companion is capable of providing a drink to the user, a label of "thirsty" may be helpful. As another example, where the robotic companion can provide coffee, a label of "sleepy" or "low caffeine" may be useful to determine whether to provide coffee. As yet another example, for a robotic companion that is able to provide or fetch medical assistance, labels related to different medical states (e.g., "unconscious," "bleeding," "heart attack," or simply "assistance needed") may be useful.
[0063] Additionally, it will be apparent that alternate approaches to assigning actionable labels (e.g., "depressed" or "anxious") to a user to be shared with the robotic companion may be applied other than the simple rules-based approach. For example, in some alternative embodiments, one or more models may be trained (e.g., using a machine learning algorithm such as logistic regression on a data set gathered from a test population) to use multiple types of sensor data and extracted parameters to determine whether an actionable label applies to the user. In some such embodiments, the model(s) may be adapted to the particular user by receiving feedback as to the accuracy of each given prediction (e.g., manually input by the user or inferred based on subsequent user interactions with the robotic companion), creating additional or replacement training examples for inclusion in the training data set, and retraining the model(s) using the same machine learning algorithm. For example, if a current model identifies the user as being depressed, the robotic companion begins to play fetch with the user, and the user is then sensed to not be depressed (e.g., the existing model immediately thereafter no longer classifies the user as depressed based on physiological parameters, the robotic companion observes an interaction pattern that is not known to fit with the label such as immediately and enthusiastically engaging in the game of fetch without coaxing by the robotic companion, etc.), the sensor data and other physiological parameters that led to the initial conclusion may be labeled (e.g., "not depressed") as a new training example. By collecting a set of training examples specific to the user, the training algorithm may be re-run to adapt the model(s) to the specific user.
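A minimal sketch of this per-user retraining loop follows, using scikit-learn's LogisticRegression as the stand-in learning algorithm (the disclosure names logistic regression only as one example); the features (pulse, body temperature), the seed data, and the feedback interface are invented for illustration:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical seed set: rows of (pulse, body temperature), 1 = "depressed".
    X = np.array([[42.0, 97.1], [75.0, 98.6], [40.0, 96.9], [80.0, 98.4]])
    y = np.array([1, 0, 1, 0])
    model = LogisticRegression().fit(X, y)

    def on_feedback(parameters, predicted_label, label_was_correct):
        # Fold one piece of user-specific feedback into the training set
        # as a new example, then retrain with the same algorithm.
        global X, y, model
        true_label = predicted_label if label_was_correct else 1 - predicted_label
        X = np.vstack([X, parameters])
        y = np.append(y, true_label)
        model = LogisticRegression().fit(X, y)

    on_feedback(np.array([44.0, 97.3]), predicted_label=1, label_was_correct=False)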
[0064] FIG. 9 shows an exemplary method 900 that robotic companion base software 260 may be configured to perform. In this example, method 900 begins at step 905 with a determination of whether the robotic companion received a connection to a wearable device. If not, then method 900 may end; however, if the determination is positive, then the method may proceed to step 910, at which it may receive the wearable device database 220 (or a portion thereof) from wearable device decision software 224. Method 900 may then proceed to step 915, at which it may send wearable device database 220 to the robotic companion interactive software 264, cause robotic companion interactive software 264 to execute at step 920, and then return to step 905 to determine whether the wearable device is connected.
[0065] FIG. 10 shows an exemplary method 1000 that robotic companion interactive software 264 may be configured to perform. In this example, method 1000 begins at step 1005 by receiving wearable device database 220 from robotic companion base software 260 and then, at step 1010, comparing the raw data with robotic companion interactive reference table 268. If there is no match found and/or no action required, method 1000 may return to initial step 1005. However, if a match requires an action, then method 1000 may proceed to step 1020 to determine whether multiple activities are required by determining whether, for example, one or more sensors have violated two or more rules associated with separate activities. If only one action is required, method 1000 may proceed to step 1025 to determine which action should be performed and/or which label to associate with the user (e.g., depressed, anxious, etc.). If more than one action is required, method 1000 may first, before proceeding to step 1025, proceed to step 1030 to determine relative priorities for the actions and then, starting with the action having the highest priority, proceed to determine which action should be performed and/or which label to associate with the user at step 1025. The label may be determined in step 1025 based on physiological parameters of the user as described above with respect to FIG. 8B (e.g., using a rule-based approach, a machine-learning approach, etc.). In some embodiments, such as those wherein the wearable device determines and provides the label to the robotic companion, step 1025 may simply correspond to reading the label from the received message. Alternatively, in some such embodiments, the robotic companion may revise the received label based on parameters available via the sensors that are local to the robotic companion (e.g., a camera mounted on the robotic companion may be used as a sensor to gauge the user's pulse, respiratory rate, and body language to confirm whether the user is indeed anxious).
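The label-revision idea at the end of step 1025 can be reduced, in its simplest form, to a cross-check such as the following; the 220 bpm threshold echoes the FIG. 8B example, and the function and parameter names are illustrative assumptions:

    def revise_label(received_label, camera_pulse_bpm):
        # Keep the wearable's "anxious" label only if the companion's own
        # camera-derived pulse estimate is consistent with it.
        if received_label == "anxious" and camera_pulse_bpm <= 220:
            return None   # local evidence contradicts the wearable; drop the label
        return received_label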
[0066] Method 1000 may then send information related to the action and/or the label to execution software 272 at steps 1035 and/or 1040 and cause interactive software 264 to run robotic companion execution software 272, which may cause the robotic companion to perform the prescribed action at step 1045. If more than one activity is required, method 1000 may determine such at step 1050, return to determine relative priorities of any remaining actions at step 1030, and proceed through the method as described for each remaining action from highest priority to lowest priority. If no further activities are required to be performed, then method 1000 may return to receiving (or waiting to receive) wearable device database 220 at step 1005.
[0067] FIG. 11A shows an exemplary method 1100 that robotic companion execution software 272 may be configured to perform. In this example, method 1100 begins in step 1105 by receiving an indication of an action from robotic companion interactive software 264. Method 1100 may then compare the action with robotic companion program database 276 in step 1110, execute a matching program file from the robotic companion program database 276 in step 1115 (optionally proceeding to execute robotic companion care software 280 such as that shown in FIG. 13 to render more realistic and varied actions), and then return to robotic companion interactive software 264 in step 1120. As shown in FIG. 11B, robotic companion program database 276 may include a list of one or more labels/inputs, one or more robotic companions, one or more actions associated with such companions, and one or more program files corresponding to each action. In the example of FIG. 11B, program files with higher numbers (e.g., program file 3 vs. program file 1) correspond to higher frequency of the associated activity. For example, if the activity is "meow," then "File.Meow-xxx2" may cause the robotic companion to meow twice as frequently as "File.Meow-xxx1" but half as frequently as "File.Meow-xxx3," although other implementations could be used. This functionality informs the user that the sensor readings are still outside the rules in wearable device rules database 228. In some embodiments, execution software 272 may match one or more actions in program database 276 with the indication of an action received from interactive software 264.
[0068] While the embodiment of FIG. 11B defines discrete robotic companion behaviors among multiple "program files" (e.g., .jar files), it will be apparent that various alternative software architectures may be used to define behaviors for a robotic companion. For example, in some embodiments, a robotic companion's elementary actions (e.g., walking, making a sound, jumping, etc.) may be defined as subroutines in the base software, while the activities triggered by different labels or physiological parameters may be defined in terms of a script of these elementary actions (passing in relevant parameters such as a target location for walking or a sound type to emit) or in terms of an end goal that the software may determine how to achieve using the available elementary actions. As yet another alternative, the records in the database may call one or more functions defining these elementary actions directly (e.g., in the case of a robotic companion coffee machine, brew_latte(strength=strong, art=image.file)).
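As a sketch of the script-of-elementary-actions alternative (class, method, and script names below are assumptions, not part of this disclosure), elementary actions become subroutines and an activity becomes a data-driven script over them:

    class Companion:
        # Elementary actions defined as subroutines in the base software.
        def walk_to(self, target): print(f"walking to {target}")
        def make_sound(self, sound_type): print(f"sound: {sound_type}")
        def jump(self): print("jumping")

    # An activity triggered by a label is a script of (action, parameters).
    CALM_USER = [
        ("walk_to", {"target": "user"}),
        ("make_sound", {"sound_type": "purr"}),
    ]

    def run_script(companion, script):
        for name, kwargs in script:
            getattr(companion, name)(**kwargs)  # dispatch to the elementary action

    run_script(Companion(), CALM_USER)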
[0069] FIG. 12 shows an example of robotic companion interactive reference table 268. As shown, such a table may define correspondences between one or more sensor readings 1204 and one or more selected actions 1208. By utilizing a table like that of FIG. 12, a user can specify actions 1208 they would like the robotic companion to run depending on different sensor readings 1204. As shown for example in FIG. 12, if the user's pulse is less than 45 or greater than 220, the robotic cat will meow, and if the user's temperature is below 94 or above 202, then the robotic cat will jump. Thus, the behavior of the robotic companion may be affected by a determined label for the user (through application of the robotic companion program database 276) or affected directly by raw sensor data or extracted parameters (through application of the robotic companion interactive reference table 268).
[0070] FIG. 13 shows an exemplary method 1300 that robotic companion care software 280 may be configured to perform. In this example, method 1300 allows the robotic companion to run a first program file in step 1305 and then determine whether a new reading from the same sensor still violates one or more rules in step 1310. If the new reading does not violate any rules, method 1300 may run the same program file for an additional time period in step 1315, dependent on how long the user was in the emotional state. For example, if a user was in an anxious state for 12 minutes, then the robotic companion's associated activity may last an additional 10 minutes. Method 1300 may then return to the execution software 272.
[0071] If, however, the new sensor reading still violates one or more rules, then method 1300 may run the second program file in step 1320, which may, for example, perform the same activity, e.g., purring, but at a higher frequency. In step 1325, the method 1300 again determines whether the most recent sensor readings (or other extracted parameters) are still outside of the defined rules. If not, the method 1300 continues to run the second program in step 1330. Otherwise, the method 1300 proceeds to step 1335, where the robotic companion executes the third program for a preselected time and checks, in step 1340, whether the parameters have normalized to fall within the rules. If so, the method 1300 proceeds to run, in step 1345, the third program file for an additional amount of time (e.g., 15 minutes). If, after the third program file, new sensor readings still violate the rules, method 1300 may change the robotic companion's activity, in step 1350, and repeat the process until new readings no longer violate any rules.
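A compact sketch of this escalation pattern follows, modelling the program files as Python callables; the loop structure and all timing logic are assumptions of the sketch rather than a definitive reading of FIG. 13 (the proportional extension echoes FIG. 14, discussed next):

    import time

    def care_loop(program_files, rules_still_violated, extension_for):
        # Run successive program files (e.g., purr, purr faster, purr fastest)
        # while the rules remain violated; once readings normalise, keep the
        # current activity going for an extra period based on how long the
        # state lasted, then stop.
        state_began = time.monotonic()
        for run_program in program_files:        # first, second, third file...
            run_program()
            if not rules_still_violated():
                state_duration = time.monotonic() - state_began
                time.sleep(extension_for(state_duration))  # e.g., 10 min after 12 min
                return
        # All program files exhausted and the rules are still violated: a caller
        # would switch to a different activity and repeat (cf. step 1350).

A call such as care_loop([purr, purr_faster, purr_fastest], check_rules, lambda d: min(0.8 * d, 600)) would cap the extension at ten minutes under these assumptions; the callables named here are hypothetical.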
[0072] FIG. 14 shows a graph 1400 illustrating an example of one way in which companion care software 280 may perform. As shown, the robotic companion may perform an activity for a particular duration of time even after the associated sensor data no longer violates any rules. For example, if the user is in an anxious state for 12 minutes and then new sensor data from one or more sensors associated with anxiousness indicates that the user is no longer anxious (e.g., when the sensor readings no longer violate any rules), the robotic companion may perform the same activity for an additional 10 minutes.
[0073] FIG. 15 shows a particular method 1500 that can be used to implement one or more robotic companions whose actions are based upon a user's wearable device sensor readings. Such a method may begin with providing a wearable device with memory, which may contain wearable device base software 212, a companion GUI 216, a wearable device database 220, wearable device decision software 224, and a wearable device rules database 228. The method may further involve providing the wearable device with a clock, one or more sensors, a display, a power supply, and a communications interface. Such a method may further involve providing a robotic companion with a memory containing robotic companion base software 260, robotic companion interactive software 264, a robotic companion interactive reference table 268, robotic companion execution software 272, and a robotic companion program database 276. The method may further include providing the robotic companion with one or more sensors, a processor, a camera, a microphone, and a communications interface. The method may then entail executing the companion GUI 216 to determine user options, select wearable device data to be sent to one or more robotic companions, and to specify one or more robotic companion actions. The method may then involve executing wearable device base software 212 to collect sensor data, storing raw data in wearable device database 220, and sending some or all of the database to wearable device decision software 224. The method may then involve executing wearable device decision software 224 to compare wearable device database 220 to the wearable device rules in order to determine whether one or more sensor readings violate one or more rules for that particular sensor. If a reading violates a rule, then wearable device database 220 may be sent to the robotic companion. The method may then involve executing robotic companion base software 260 to determine whether a wearable device has been connected to the robotic companion. Next, the method may involve executing robotic companion interactive software 264 to match the wearable device raw data with one or more elements of robotic companion interactive reference table 268 in order to determine whether the sensor readings from the wearable warrant an action from the robotic companion. After an action by a robotic companion has been determined to be warranted by robotic companion interactive software 264, the method may then involve executing robotic companion execution software 272 to search robotic companion program database 276 and run one or more program files associated with the robotic companion action.
[0074] It is to be noted that any one or more of the aspects and embodiments described herein may be conveniently implemented using one or more machines (e.g., one or more computing devices that are utilized as a user computing device for an electronic document, one or more server devices, such as a document server, etc.) programmed according to the teachings of the present specification, as will be apparent to those of ordinary skill in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those of ordinary skill in the software art. Aspects and implementations discussed above employing software and/or software modules may also include appropriate hardware for assisting in the implementation of the machine executable instructions of the software and/or software module.
[0075] Such software may be a computer program product that employs a machine-readable storage medium. A machine-readable storage medium may be any medium that is capable of storing and/or encoding a sequence of instructions for execution by a machine (e.g., a computing device) and that causes the machine to perform any one of the methodologies and/or embodiments described herein. Examples of a machine-readable storage medium include, but are not limited to, a magnetic disk, an optical disc (e.g., CD, CD-R, DVD, DVD-R, etc.), a magneto-optical disk, a read-only memory "ROM" device, a random access memory "RAM" device, a magnetic card, an optical card, a solid-state memory device, an EPROM, an EEPROM, and any combinations thereof. A machine-readable medium, as used herein, is intended to include a single medium as well as a collection of physically separate media, such as, for example, a collection of compact discs or one or more hard disk drives in combination with a computer memory. As used herein, a machine-readable storage medium does not include transitory forms of signal transmission.
[0076] Such software may also include information (e.g., data) carried as a data signal on a data carrier, such as a carrier wave. For example, machine-executable information may be included as a data-carrying signal embodied in a data carrier in which the signal encodes a sequence of instructions, or portion thereof, for execution by a machine (e.g., a computing device) and any related information (e.g., data structures and data) that causes the machine to perform any one of the methodologies and/or embodiments described herein.
[0077] Examples of a computing device include, but are not limited to, an electronic book reading device, a computer workstation, a terminal computer, a server computer, a handheld device (e.g., a tablet computer, a smartphone, etc.), a web appliance, a network router, a network switch, a network bridge, any machine capable of executing a sequence of instructions that specify an action to be taken by that machine, and any combinations thereof. In one example, a computing device may include and/or be included in a kiosk.
[0078] FIG. 16 shows a diagrammatic representation of one embodiment of a computing device in the exemplary form of a computer system 1600 within which a set of instructions may be executed for causing a control system (such as any one or more of the systems illustrated in other figures of this disclosure, as well as systems that would be apparent to those of ordinary skill in the art after reading this entire disclosure) to perform any one or more of the aspects and/or methodologies of the present disclosure. It is also contemplated that multiple computing devices may be utilized to implement a specially configured set of instructions for causing one or more of the devices to perform any one or more of the aspects and/or methodologies of the present disclosure. Computer system 1600 includes a processor 1604 and a memory 1608 that communicate with each other, and with other components, via a bus 1612. Bus 1612 may include any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures.
[0079] Memory 1608 may include various components (e.g., machine-readable media) including, but not limited to, a random access memory component, a read only component, and any combinations thereof. In one example, a basic input/output system 1616 (BIOS), including basic routines that help to transfer information between elements within computer system 1600, such as during start-up, may be stored in memory 1608. Memory 1608 may also include (e.g., stored on one or more machine-readable media) instructions (e.g., software) 1620 embodying any one or more of the aspects and/or methodologies of the present disclosure. In another example, memory 1608 may further include any number of program modules including, but not limited to, an operating system, one or more application programs, other program modules, program data, and any combinations thereof.
[0080] Computer system 1600 may also include a storage device 1624. Examples of a storage device (e.g., storage device 1624) include, but are not limited to, a hard disk drive, a magnetic disk drive, an optical disc drive in combination with an optical medium, a solid-state memory device, and any combinations thereof. Storage device 1624 may be connected to bus 1612 by an appropriate interface (not shown). Example interfaces include, but are not limited to, SCSI, advanced technology attachment (ATA), serial ATA, universal serial bus (USB), IEEE 1394 (FIREWIRE), and any combinations thereof. In one example, storage device 1624 (or one or more components thereof) may be removably interfaced with computer system 1600 (e.g., via an external port connector (not shown)). Particularly, storage device 1624 and an associated machine-readable medium 1628 may provide nonvolatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for computer system 1600. In one example, software 1620 may reside, completely or partially, within machine-readable medium 1628. In another example, software 1620 may reside, completely or partially, within processor 1604.
[0081] Computer system 1600 may also include an input device 1632. In one example, a user of computer system 1600 may enter commands and/or other information into computer system 1600 via input device 1632. Examples of an input device 1632 include, but are not limited to, an alphanumeric input device (e.g., a keyboard), a pointing device, a joystick, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), a cursor control device (e.g., a mouse), a touchpad, an optical scanner, a video capture device (e.g., a still camera, a video camera), a touchscreen, and any combinations thereof. Input device 1632 may be interfaced to bus 1612 via any of a variety of interfaces (not shown) including, but not limited to, a serial interface, a parallel interface, a game port, a USB interface, a FIREWIRE interface, a direct interface to bus 1612, and any combinations thereof. Input device 1632 may include a touch screen interface that may be a part of or separate from display 1636, discussed further below. Input device 1632 may be utilized as a user selection device for selecting one or more graphical representations in a graphical interface as described above.
[0082] A user may also input commands and/or other information to computer system 1600 via storage device 1624 (e.g., a removable disk drive, a flash drive, etc.) and/or network interface device 1640. A network interface device, such as network interface device 1640, may be utilized for connecting computer system 1600 to one or more of a variety of networks, such as network 1644, and one or more remote devices 1648 connected thereto. Examples of a network interface device include, but are not limited to, a network interface card (e.g., a mobile network interface card, a LAN card), a modem, and any combination thereof. Examples of a network include, but are not limited to, a wide area network (e.g., the Internet, an enterprise network), a local area network (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a data network associated with a telephone/voice provider (e.g., a mobile communications provider data and/or voice network), a direct connection between two computing devices, and any combinations thereof. A network, such as network 1644, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used. Information (e.g., data, software 1620, etc.) may be communicated to and/or from computer system 1600 via network interface device 1640.
[0083] Computer system 1600 may further include a video display adapter 1652 for communicating a displayable image to a display device, such as display device 1636. Examples of a display device include, but are not limited to, a liquid crystal display (LCD), a cathode ray tube (CRT), a plasma display, a light emitting diode (LED) display, and any combinations thereof. Display adapter 1652 and display device 1636 may be utilized in combination with processor 1604 to provide graphical representations of aspects of the present disclosure. In addition to a display device, computer system 1600 may include one or more other peripheral output devices including, but not limited to, an audio speaker, a printer, and any combinations thereof. Such peripheral output devices may be connected to bus 1612 via a peripheral interface 1656. Examples of a peripheral interface include, but are not limited to, a serial port, a USB connection, a FIREWIRE connection, a parallel connection, and any combinations thereof.
[0084] It should be apparent from the foregoing description that various example embodiments may be implemented in hardware or firmware. Furthermore, various exemplary embodiments may be implemented as instructions stored on a machine-readable storage medium, which may be read and executed by at least one processor to perform the operations described in detail herein. A machine-readable storage medium may include any mechanism for storing information in a form readable by a machine, such as a personal or laptop computer, a server, or other computing device. Thus, a machine-readable storage medium may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and similar storage media.
[0085] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles disclosed herein. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in machine readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0086] Although the various exemplary embodiments have been described in detail with particular reference to certain exemplary aspects thereof, it should be understood that various other embodiments are possible, the details of which are capable of modifications in various respects. As is readily apparent to those skilled in the art, variations and modifications can be effected while remaining within the spirit and scope of the invention. Accordingly, the foregoing disclosure, description, and figures are for illustrative purposes only and do not in any way limit the invention, which is defined only by the claims.

Claims

What is claimed is:
1. A method performed by a robotic companion device for determining behavior based on a user's wearable device, the method comprising:
wirelessly receiving, from a wearable device of the user, information regarding the user's current physiological state;
determining, based on the received information, a label identifying the user's current physiological state;
identifying a first activity associated with the label; and
engaging an autonomous interaction with the user according to the first activity.
2. The method of claim 1, wherein the information regarding the user's current state comprises the label.
3. The method of claim 1, wherein the information regarding the user's current state comprises at least one physiological parameter.
4. The method of claim 3, further comprising:
receiving additional physiological parameters;
determining whether the label is no longer applicable to the user based on at least one of the physiological parameters and the additional physiological parameters; and
when the label is no longer applicable, ceasing interaction with the user according to the first activity after a time period has elapsed.
5. The method of claim 4, further comprising setting the time period in proportion to the duration of applicability of the label.
6. The method of claim 3, further comprising:
receiving additional physiological parameters;
determining whether the label is no longer applicable to the user based on at least one of the physiological parameters and the additional physiological parameters;
determining that the label remains applicable after a predetermined time period;
based on determining that the label remains applicable after the predetermined time period, identifying a second activity associated with the label; and
engaging an additional autonomous interaction with the user according to the second activity.
7. The method of claim 3, wherein determining the label comprises applying a learned model associated with the label to the received at least one physiological parameter to determine whether the label is applicable to the user's current state.
8. The method of claim 7, further comprising:
receiving feedback indicating the correctness of the label to the user's current state;
creating a new training example from the at least one physiological parameter and the received feedback; and
retraining the learned model based on the new training example using a machine learning process.
9. A non-transitory machine-readable storage medium encoded with instructions for execution by a robotic companion device for determining behavior based on a user's wearable device, the non-transitory machine-readable storage medium comprising:
instructions for wirelessly receiving, from a wearable device of the user, information regarding the user's current physiological state;
instructions for determining, based on the received information, a label identifying the user's current physiological state;
instructions for identifying a first activity associated with the label; and
instructions for engaging an autonomous interaction with the user according to the first activity.
10. The non-transitory machine-readable storage medium of claim 9, wherein the information regarding the user's current state comprises the label.
11. The non-transitory machine-readable storage medium of claim 9, wherein the information regarding the user's current state comprises at least one physiological parameter.
12. The non-transitory machine-readable storage medium of claim 11, further comprising:
instructions for receiving additional physiological parameters;
instructions for determining whether the additional physiological parameters indicate that the label is no longer applicable to the user; and
instructions for, when the label is no longer applicable, ceasing interaction with the user according to the first activity after a time period has elapsed.
13. The non-transitory machine-readable storage medium of claim 12, further comprising instructions for setting the time period in proportion to the duration of applicability of the label.
14. The non-transitory machine-readable storage medium of claim 11, further comprising:
instructions for receiving additional physiological parameters;
instructions for determining whether the additional physiological parameters indicate that the label is no longer applicable to the user;
instructions for determining that the label remains applicable after a predetermined time period;
instructions for, based on determining that the label remains applicable after the predetermined time period, identifying a second activity associated with the label; and
instructions for engaging an additional autonomous interaction with the user according to the second activity.
15. The non-transitory machine-readable storage medium of claim 11, wherein the instructions for determining the label comprise instructions for applying a learned model associated with the label to the received at least one physiological parameter to determine whether the label is applicable to the user's current state.
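For concreteness, the learned-model labeling of claim 7 and the feedback-driven retraining of claim 8 might be sketched as follows. This is an illustrative sketch only and forms no part of the claims; the classifier choice (scikit-learn logistic regression), feature layout, and all names are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

class LabelModel:
    """One learned model per label, e.g., 'anxious' (illustrative)."""

    def __init__(self, X, y):
        # Initial training set must contain both positive and negative examples.
        self.X, self.y = list(X), list(y)
        self.clf = LogisticRegression().fit(np.array(self.X), np.array(self.y))

    def label_applies(self, params):
        """Apply the learned model to physiological parameters (cf. claim 7)."""
        return bool(self.clf.predict(np.array([params]))[0])

    def add_feedback(self, params, predicted, was_correct):
        """Create a new training example from feedback and retrain (cf. claim 8)."""
        true_label = predicted if was_correct else not predicted
        self.X.append(list(params))
        self.y.append(true_label)
        self.clf = LogisticRegression().fit(np.array(self.X), np.array(self.y))
```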

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562130201P 2015-03-09 2015-03-09
US62/130,201 2015-03-09
EP15176440.4 2015-07-13
EP15176440 2015-07-13

Publications (1)

Publication Number Publication Date
WO2016142351A1 true WO2016142351A1 (en) 2016-09-15

Family

ID=53717892

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/054824 WO2016142351A1 (en) 2015-03-09 2016-03-07 Robotic companion sensitive to physiological data gathered by wearable devices

Country Status (1)

Country Link
WO (1) WO2016142351A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020161480A1 (en) * 2001-02-16 2002-10-31 Sanyo Electric Co., Ltd. Robot system and robot
EP1618842A1 (en) * 2004-07-20 2006-01-25 Sharp Kabushiki Kaisha Medical information detection apparatus and health management system using the medical information detection apparatus
US20080214903A1 (en) * 2005-02-22 2008-09-04 Tuvi Orbach Methods and Systems for Physiological and Psycho-Physiological Monitoring and Uses Thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHANGCHUN LIU ET AL: "Online Affect Detection and Robot Behavior Adaptation for Intervention of Children With Autism", IEEE TRANSACTIONS ON ROBOTICS, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 24, no. 4, 12 August 2008 (2008-08-12), pages 883 - 896, XP011232784, ISSN: 1552-3098, DOI: 10.1109/TRO.2008.2001362 *
IOLANDA LEITE ET AL: "Sensors in the wild: Exploring electrodermal activity in child-robot interaction", HUMAN-ROBOT INTERACTION (HRI), 2013 8TH ACM/IEEE INTERNATIONAL CONFERENCE ON, IEEE, 3 March 2013 (2013-03-03), pages 41 - 48, XP032345270, ISBN: 978-1-4673-3099-2, DOI: 10.1109/HRI.2013.6483500 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10769418B2 (en) 2017-01-20 2020-09-08 At&T Intellectual Property I, L.P. Devices and systems for collective impact on mental states of multiple users
US20180218268A1 (en) * 2017-01-30 2018-08-02 International Business Machines Corporation System, method and computer program product for sensory stimulation to ameliorate a cognitive state
US11205127B2 (en) * 2017-01-30 2021-12-21 International Business Machines Corporation Computer program product for sensory stimulation to ameliorate a cognitive state
WO2019040196A1 (en) * 2017-08-23 2019-02-28 Sony Interactive Entertainment Inc. Continual selection of scenarios based on identified tags describing contextual environment of a user for execution by an artificial intelligence model of the user by an autonomous personal companion
US20190065960A1 (en) * 2017-08-23 2019-02-28 Sony Interactive Entertainment Inc. Continual selection of scenarios based on identified tags describing contextual environment of a user for execution by an artificial intelligence model of the user by an autonomous personal companion
CN111201539A (en) * 2017-08-23 2020-05-26 索尼互动娱乐股份有限公司 Continuously selecting, by an autonomous personal companion, a scene for execution by an artificial intelligence model of a user based on identified tags describing the contextual environment of the user
EP3673416A1 (en) * 2017-08-23 2020-07-01 Sony Interactive Entertainment Inc. Continual selection of scenarios based on identified tags describing contextual environment of a user for execution by an artificial intelligence model of the user by an autonomous personal companion
JP2020531999A (en) * 2017-08-23 2020-11-05 株式会社ソニー・インタラクティブエンタテインメント Continuous selection of scenarios based on identification tags that describe the user's contextual environment for the user's artificial intelligence model to run by the autonomous personal companion
US11568265B2 (en) 2017-08-23 2023-01-31 Sony Interactive Entertainment Inc. Continual selection of scenarios based on identified tags describing contextual environment of a user for execution by an artificial intelligence model of the user by an autonomous personal companion

Legal Events

Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16712235; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16712235; Country of ref document: EP; Kind code of ref document: A1)