US20030194205A1 - Life support apparatus and method and method for providing advertisement information - Google Patents
Info
- Publication number
- US20030194205A1 (application US10/428,065; US42806503A)
- Authority
- US
- United States
- Prior art keywords
- information
- user
- situation
- stress
- advertisement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
- A61B5/6815—Ear
- A61B5/6817—Ear canal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4029—Detecting, measuring or recording for evaluating the nervous system for evaluating the peripheral nervous systems
- A61B5/4035—Evaluating the autonomic nervous system
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
- A61B5/6815—Ear
- A61B5/6816—Ear lobe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7475—User input or interface means, e.g. keyboard, pointing device, joystick
- A61B5/749—Voice-controlled interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0207—Discounts or incentives, e.g. coupons or rebates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0242—Determining effectiveness of advertisements
- G06Q30/0245—Surveys
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0277—Online advertisement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0013—Medical image data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02416—Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
- A61B5/02427—Details of sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02438—Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1112—Global tracking of patients, e.g. by using GPS
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7253—Details of waveform analysis characterised by using transforms
- A61B5/726—Details of waveform analysis characterised by using transforms using Wavelet transforms
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S128/00—Surgery
- Y10S128/903—Radio telemetry
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S128/00—Surgery
- Y10S128/92—Computer assisted medical diagnostics
Definitions
- the present invention relates to a wearable type life support apparatus that measures and determines various states of a user by using a wearable device and gives life support through an information service, such as medical administration or personal navigation, according to the user's situation; it also relates to a life support method and an advertisement providing method.
- Stress originally means stimuli from the external world (Yoshinosuke Tadai, “What is Stress?”, Kodansha Bluebacks).
- here, stress is taken to also include the adaptive reaction against such stimuli. When the adaptive reaction exceeds human limitations, various diseases or mental disorders are said to occur. Alternatively, these disorders presumably occur when the sympathetic nerve and the parasympathetic nerve become unbalanced due to a change in the rhythm of life.
- advertisement display according to the Internet use situation of a user, such as banner advertisements on the Internet, is already widely used.
- however, information providing tailored to each scene of the user's daily life, and business that uses such an information providing service for advertisement, have not been realized yet.
- the present invention provides a system which can grasp the situation of a user by various sensors, grasp the degree of stress corresponding to each situation, and offer a service such as occasional lifestyle improvement or relaxation, using the concept of a wearable computer always attached to the user.
- the system is applied to consumer marketing or advertisement display business.
- a life support apparatus comprising a vital information sensor attached to a body to acquire vital information of a user, a behavior information sensor attached to the body to acquire behavior information of the user, a situation recognition device which recognizes the user's situation based on the behavior information acquired by the behavior information sensor and the vital information acquired by the vital information sensor to generate user's situation information, a database which stores stress management information prepared in advance, an information search device which searches the database for stress management information corresponding to the user's situation information, and an information presentation device which presents the stress management information obtained by the information search device to the user.
- the vital information and behavior information of the user are acquired, the user's situation is recognized on the basis of the acquired behavior information, corresponding information is obtained from pieces of information for dealing with stress, which are prepared in advance, using the acquired user's situation information as a key, and the obtained information is presented to the user.
- the stress situation is determined in daily life, and optimum service information for stress elimination or care is provided to the user in accordance with the situation, thereby enabling life advice contributing to user's healthcare.
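As an illustration of the flow just described (acquire vital and behavior information, recognize the situation, search the database, present matching advice), the following sketch uses invented situations, thresholds, and database entries; the patent does not specify any of these concrete values.

```python
# Illustrative sketch of the recognize -> search -> present pipeline.
# All thresholds, labels, and database entries are hypothetical.

STRESS_DB = {
    # (activity, stress level) -> stress management advice prepared in advance
    ("working", "high"): "Take a short break and try deep breathing.",
    ("walking", "high"): "Slow your pace; consider a rest stop nearby.",
    ("resting", "low"): "Condition is good. No action needed.",
}

def recognize_situation(pulse_bpm, accel_magnitude):
    """Very crude stand-in for the situation recognition device."""
    activity = "walking" if accel_magnitude > 1.2 else "resting"
    if activity == "resting" and pulse_bpm > 90:
        activity = "working"  # elevated pulse while still: assume desk work
    stress = "high" if pulse_bpm > 90 else "low"
    return activity, stress

def support_message(pulse_bpm, accel_magnitude):
    # Search the database using the situation information as a key.
    situation = recognize_situation(pulse_bpm, accel_magnitude)
    return STRESS_DB.get(situation, "No matching advice.")

print(support_message(pulse_bpm=105, accel_magnitude=0.4))
```

In the actual apparatus the recognition step would combine pulse, body temperature, skin resistance, and acceleration data rather than these two toy inputs.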
- a life support apparatus comprising a user information sensor attached to a body to acquire information representing a user's situation, a situation recognition device which recognizes the user's situation based on user information acquired by the user information sensor, a transceiver device which transmits the information of the user's situation recognized by the situation recognition device and receives external information transmitted from an external apparatus, and a presentation device which presents the external information received by the transceiver device to the user, the external information including an advertisement appropriate for the user, which is sent from the external apparatus in correspondence with the user's situation information.
- information representing the physical situation of the user is acquired, the user's situation is recognized on the basis of the acquired information, and advertisement information corresponding to the recognized user's situation can be obtained from a server that holds various kinds of advertisement information corresponding to physical situations, and presented to the user.
- the stress situation is determined in daily life, optimum service information for stress elimination or care is provided to the user in accordance with the situation in consideration of the time and circumstances, and the user is prompted to use the service, thereby enabling life advice contributing to the commercial effect and user's healthcare.
- a life support apparatus comprising various kinds of information presentation media for a voice or text message, a user information sensor attached to a body to acquire user information representing a user's situation, a situation recognition device which recognizes the user's situation based on user information acquired by the user information sensor to generate user's situation information, a communication device connected to the situation recognition device and communicating with external equipment, a situation information conversion device which selects an optimum message presentation medium from the information presentation media in accordance with the user's situation information and converts the situation information into a form corresponding to the optimum message presentation medium, in order to present call message information sent from the message sender and received by the communication device to the user, and an answer transmission device which transmits the user's situation information converted by the situation information conversion device to the message sender.
- upon reception of incoming message information addressed to the user, the call message is converted into a form that uses an appropriate medium corresponding to the user's situation recognized by the situation recognition device, and is presented.
- the answer transmission device transmits the user's situation information converted by the situation information conversion device to the message sender.
- the user's current situation is determined in daily life so that an incoming call can be handled in accordance with the user's situation at that time, in consideration of the specific time and circumstances, without the user having to worry about it, thereby enabling life advice that imposes no stress on the user.
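The situation-dependent answering described above can be illustrated with a small selection table: one mapping from the recognized situation to the answer device (cf. FIG. 19) and one per-sender allowance for how much situation detail may be disclosed (cf. FIG. 18). All situations, media names, and rules below are invented for illustration.

```python
# Hypothetical mapping from the user's recognized situation to the
# presentation medium used for an incoming call (cf. FIG. 19).
ANSWER_DEVICE = {
    "in_meeting": "text_on_wristwatch_display",
    "driving":    "voice_via_headset",
    "sleeping":   "automatic_answering_message",
    "free":       "handyphone_ring",
}

# Hypothetical per-sender allowance (cf. FIG. 18): how much situation
# detail may be returned to each class of caller.
DISCLOSURE = {"family": "full", "coworker": "partial", "unknown": "none"}

def route_incoming_call(situation, sender_class):
    """Choose the answer medium and the reply sent back to the caller."""
    medium = ANSWER_DEVICE.get(situation, "handyphone_ring")
    detail = DISCLOSURE.get(sender_class, "none")
    if detail == "none":
        reply = "The user cannot answer now."
    elif detail == "partial":
        reply = "The user is busy and will call back."
    else:
        reply = f"The user is currently {situation.replace('_', ' ')}."
    return medium, reply

print(route_incoming_call("in_meeting", "family"))
```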
- an advertisement information providing method comprising preparing a server which holds various kinds of advertisement information corresponding to physical situations, and extracting optimum advertisement information corresponding to a situation of a user to present the optimum advertisement information to the user.
- the server which holds various kinds of advertisement information corresponding to physical situations is prepared, the physical situation of the user is detected, and optimum advertisement information is obtained from the server in correspondence with the user's situation and presented to the user.
- the user's current situation is determined in daily life, and advertisement information such as optimum merchandise to deal with stress can be presented to the user on the basis of the determination result in consideration of the user's situation at that time. This contributes to user's stress care and healthcare, and commercially effective advertisements can be provided.
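A minimal sketch of the server-side lookup follows. The situation labels and advertisement entries are invented; the patent only specifies that the server holds advertisement information keyed by physical situations.

```python
# Hypothetical server-side table mapping a detected physical situation
# to advertisement information.
AD_DATABASE = {
    "fatigued": "Vitamin drink: 10% off at a nearby store",
    "stressed": "Relaxation salon: discount coupon available",
    "after_exercise": "Sports drink campaign in your area",
}

def extract_advertisement(user_situation, default="(no advertisement)"):
    """Extract the optimum advertisement for the user's situation."""
    return AD_DATABASE.get(user_situation, default)

print(extract_advertisement("fatigued"))
```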
- FIG. 1 is a schematic block diagram showing a wearable type life support apparatus according to the first embodiment of the present invention.
- FIG. 2 is a flow chart showing the processing procedure of the wearable type life support apparatus according to the first embodiment of the present invention.
- FIGS. 3A to 3C are schematic views showing the principle of human posture recognition used in the present invention.
- FIG. 4 is a flow chart showing action and posture recognition processing used in the embodiment of the present invention.
- FIG. 5 is a view for explaining the structure of a reference sensor information corpus related to stress, which is used in the embodiment of the present invention.
- FIG. 6 is a view for explaining the dialogue structure for a situation registered in the sensor information corpus used in the embodiment of the present invention.
- FIGS. 7A, 7B, and 7C are views for explaining behavior-related pulse rate trend graph display and a behavior input window for an abnormal value, which are used in the embodiment of the present invention.
- FIGS. 8A and 8B are views for explaining a display window of vital information related to a behavior, which is used in the embodiment of the present invention.
- FIGS. 9A and 9B are views for explaining situation-dependent advertisement display and an online shopping window according to the display, which are used in the embodiment of the present invention.
- FIGS. 10A, 10B, and 10C are views for explaining advertisement display corresponding to the degree of user's fatigue/stress and a road map window, which are used in the embodiment of the present invention.
- FIGS. 11A, 11B, and 11C are views for explaining advertisement displays corresponding to the behavior information of the user and a road map, which are used in the second embodiment of the present invention.
- FIG. 12 is a view for explaining the structure of a regional sensor information corpus related to stress, which is used in the embodiment of the present invention.
- FIGS. 13A and 13B are a view and flow chart, respectively, showing collection of information of a person who passes by a convenience store and advertisement display, which are used in the embodiment of the present invention.
- FIG. 14 is a view showing the structure of address book data including degree-of-stress information, which is used in the embodiment of the present invention.
- FIG. 15 is a view for explaining the structure of a relational database of a schedule/task list, the degree of stress, and the degree of fatigue, which is used in the embodiment of the present invention.
- FIG. 16 is a flow chart of processing of returning a situation-dependent automatic answering telephone message to a handyphone according to the third embodiment of the present invention.
- FIG. 17 is a flow chart of posture/action recognition based on peak detection of a waveform as a function of time, which is used in the embodiment of the present invention.
- FIG. 18 is a view showing an allowable range setting table of answer contents for each message sender, which is used in the embodiment of the present invention.
- FIG. 19 is a view showing an answer device setting table for each user's situation, which is used in the embodiment of the present invention.
- FIG. 20 is a view showing message display window transition that is used in the embodiment of the present invention.
- a wearable type life support apparatus will be described here with reference to FIG. 1. The apparatus can give life advice useful for healthcare and medical administration by monitoring the stress applied to the body of a user and, when stress occurs, relaxing it to suppress the mental and physical damage the stress causes; it can also give navigation that utilizes the stress to improve the ability of the user.
- the wearable type life support apparatus shown in FIG. 1 has a main module 101 , sensor modules 102 , an acceleration sensor module 103 , a display 104 , a wrist watch type display 105 , a headset 106 , and a handyphone 107 .
- the main module 101 is a compact and lightweight computer such as a wearable computer, which has a function of analyzing collected vital information to grasp the degree of stress and providing various kinds of supports in accordance with the degree of stress.
- the main module 101 also has functions of processing collected data, sending the processed data to the database of a center, executing desired processing using information obtained from the database, and transmitting/receiving information or a control command to/from the headset 106 , display 104 , or handyphone 107 .
- the main module 101 is formed from a memory 1011 and a CPU 1012 .
- Application programs and control programs for implementing the above-described functions and an OS (Operating System) as the basic software of the computer are stored in the memory 1011 .
- the CPU 1012 executes these programs to realize various desired processing operations.
- the main module 101 also has a calendar/timepiece function such that collected or processed information can be managed with a time stamp.
- the main module 101 also has, e.g., a function of synthesizing a character string prepared as text data into a voice and outputting a voice signal, a function of recognizing a voice signal and converting it into text data, and a function of collating data.
- the main module 101 has, e.g., a Bluetooth chip 1013 for executing communication between modules using Bluetooth, an international-standard short-distance radio communication technology which has received a great deal of attention in recent years.
- the main module 101 can store data to be handled in the system, systematically manage the entire system, execute data communication between the modules, and communicate with a home server and management server (not shown).
- the sensor modules 102 collect and transmit vital signals and are connected to vital signal detection sensors such as a pulse sensor 1026 for detecting the pulse of a human body, a thermosensor 1027 for detecting the body temperature of the human body, and a GSR (Galvanic Skin Reflex) electrode 1028 for detecting the skin resistance of the human body.
- Each sensor module 102 comprises a preprocessor 1025 that amplifies and preprocesses the detection signal from each sensor, an A/D converter 1024 that converts the sensor detection signal preprocessed by the preprocessor 1025 into digital data, a CPU 1022 that executes various control operations and data processing, and a memory 1021 .
- Each sensor module 102 also incorporates a Bluetooth chip 1023 to execute data communication with the main module 101 .
- the structures from the sensors 1026 , 1027 , and 1028 to the sensor modules 102 are divided for the respective sensors. However, the structures for the respective sensors may be integrated into a single sensor module 102 . Processing operations in each sensor and module 102 may be integrated.
- a microcontroller incorporating an A/D conversion function (e.g., the PIC16F877 available from Microchip Technology) may be used as the CPU 1022 without preparing a separate A/D converter.
- the preprocessor 1025 not only amplifies the signal detected by each sensor by an appropriate gain but also incorporates a filter circuit that performs high-pass filter processing depending on the type of signal or low-pass filter (anti-aliasing filter) processing in accordance with the band of each signal.
- Each sensor has a plurality of channels as needed.
- the handyphone 107 is a normal handyphone having a liquid crystal display panel, a plurality of operation buttons including dial keys, and a transceiver, and inputs/outputs voice.
- the handyphone 107 also incorporates a Bluetooth chip and can communicate with the main module 101 . With this arrangement, voice input/output and cursor control by cursor keys can be performed.
- the display 104 is a display terminal formed from a portable liquid crystal display panel which displays text data or an image and exclusively constructed for display.
- the display 104 has a Bluetooth chip 1041 and can control display contents upon receiving display data and the like from the main module 101 through the Bluetooth chips 1013 and 1041 .
- the headset 106 is an input/output terminal used on the user's head, i.e., a headset incorporating a Bluetooth chip and CCD camera (solid state image sensor) as well as a headphone (or earphone) and microphone.
- the headset 106 is a device for voice/image interface.
- the headset 106 also incorporates a Bluetooth chip to transmit and receive a voice signal and transmit an image.
- the headset 106 can be used simultaneously together with the handyphone 107 .
- the wrist watch type display 105 is a liquid crystal display panel having a wrist watch shape used on the user's arm.
- the wrist watch type display 105 incorporates a Bluetooth chip to transmit/receive data or command to/from the main module 101 .
- This apparatus assumes digital communication by Bluetooth.
- a radio communication device of any other scheme or a scheme of performing D/A conversion and transferring a voice signal to the headphone by FM modulation may be employed.
- a voice signal may be transferred not by radio communication but by cable connection.
- An image may be acquired by a digital camera attached independently of the headset 106 .
- FIG. 2 is a flow chart showing the flow of operation of the system according to the present invention having the arrangement shown in FIG. 1. The operation will be described with reference to the flow chart shown in FIG. 2.
- the user carries the main module 101 , sensor modules 102 , handyphone 107 , display 104 , and headset 106 .
- the pulse sensor 1026 , thermosensor 1027 , GSR electrode 1028 , and acceleration sensor 1036 are set on the user, and then, the system is activated to start operation (step S 201 in FIG. 2).
- When the sensors are set and activated, they start to detect vital signals. As a result, a pulse rate detection signal by the pulse sensor 1026 , a temperature detection signal by the thermosensor 1027 , a galvanic skin reflex detection signal by the GSR electrode 1028 , and an acceleration measurement signal by the acceleration sensor 1036 are obtained (step S 202 in FIG. 2).
- the measurements are done continuously, periodically (every minute, every 10 minutes, or the like), or in accordance with a measurement instruction from the main module 101 or a user's instruction.
- the analog detection signals obtained by the sensors 1026 , 1027 , and 1028 are amplified, filtered, and A/D-converted by the sensor modules 102 .
- the A/D-converted data are transferred to the main module 101 through a short-distance radio device such as the Bluetooth chip 1023 .
- the main module 101 processes the measurement data by a preset logic, thereby determining the user's situation.
- the main module 101 recognizes the action (behavior) or posture of the user on the basis of acceleration information obtained from the acceleration sensor 1036 (step S 203 in FIG. 2).
- The action/posture recognition method in step S 203 is shown in the action recognition flow chart of FIG. 4.
- the acceleration information is obtained by attaching, e.g., a three-dimensional acceleration sensor to a predetermined portion of the human body as the acceleration sensor 1036 , thereby measuring the posture and action.
- the three-dimensional acceleration sensor 1036 can be formed by perpendicularly laying out two two-dimensional acceleration sensors such as “ADXL202JC” available from Analog Devices Corp.
- the three-dimensional acceleration sensor 1036 is attached to, e.g., the waist to measure the motion of the body center (trunk) portion.
- the tilt of the sensor is obtained from a DC component, i.e., an output obtained by passing the acceleration waveform from the acceleration sensor 1036 through a low-pass filter, thereby detecting the posture.
- the sensor 1036 is attached to the joint portion of the base of the femoral region of a user P.
- An angle is obtained from the vertical and horizontal components of the DC component, and the posture can be recognized on the basis of the angle: when the sensor is almost horizontal, the user is lying on his/her back (FIG. 3C) or on his/her face; when the sensor is almost vertical, the user stands upright (FIG. 3A); and when the sensor has an angle therebetween, the user is sitting (FIG. 3B).
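The angle-based classification above can be sketched as follows. This is an illustrative reconstruction: the function name and the threshold angles (`lying_max_deg`, `standing_min_deg`) are assumptions, not values given in the text.

```python
import math

def classify_posture(dc_vertical, dc_horizontal,
                     lying_max_deg=20.0, standing_min_deg=70.0):
    """Classify posture from the DC (gravity) component of a trunk-mounted
    acceleration sensor. Threshold angles are illustrative assumptions."""
    # Angle of the sensor axis from horizontal, in degrees.
    angle = math.degrees(math.atan2(abs(dc_vertical), abs(dc_horizontal)))
    if angle <= lying_max_deg:
        return "lying"      # sensor almost horizontal
    if angle >= standing_min_deg:
        return "standing"   # sensor almost vertical
    return "sitting"        # intermediate tilt
```

In practice the DC components would come from low-pass filtering the acceleration waveform, as described above.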
- the action can be identified from the frequency component and variation pattern of an AC component.
- the fundamental frequency of walking is 60 to 100 (times/min), and that of running is 120 to 180 (times/min).
- the fundamental frequency component is acquired by performing frequency analysis (FFT (Fast Fourier Transform)) for the detected signal (S 401 in FIG. 4) or by detecting waveform peaks and peak interval. The powers in the respective bands are compared, thereby recognizing walking or running.
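A minimal sketch of this band-power comparison, assuming a waist-mounted sensor and the fundamental-frequency ranges quoted above (walking 60 to 100 times/min ≈ 1.0 to 1.7 Hz; running 120 to 180 times/min ≈ 2.0 to 3.0 Hz). The function name and the exact band edges in hertz are illustrative.

```python
import numpy as np

def classify_gait(signal, fs, walk_band=(1.0, 1.7), run_band=(2.0, 3.0)):
    """Classify walking vs. running by comparing spectral power in the
    walking and running fundamental-frequency bands of an acceleration
    trace sampled at fs Hz."""
    # Power spectrum of the mean-removed (AC) component.
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def band_power(lo, hi):
        return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

    return "running" if band_power(*run_band) > band_power(*walk_band) else "walking"
```

As the text notes, peak detection along the time axis is an alternative to the FFT for obtaining the fundamental frequency.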
- the acceleration sensor 1036 is attached not to the waist but to a leg portion, e.g., the femoral region of a leg, the acceleration during walking is maximized by vibration when the foot touches the ground. However, during running (running fast), the acceleration is maximized by vertical movement of the waist when the feet touch the ground. For this reason, for walking, the fundamental frequency must be further halved. Alternatively, since the vertical amplitude for running is larger by twice or more than that for walking, the amplitude values at the time of peak detection are compared, thereby recognizing walking or running.
- The frequency analysis is not limited to the FFT; another spectrum analysis method such as the wavelet transform may be used.
- pattern matching between the waveforms of fundamental frequencies may be performed to recognize “running”, “walking”, “ascending stairs”, or “descending stairs”.
- Simple peak detection may be performed, and the number of steps may be measured from the period.
- peak detection may be performed along the time axis to obtain the walking or running pitch.
- Posture/action recognition is done by the main module 101 .
- posture/action recognition may be done by the sensor modules 102 , and resultant status data (posture and action) may be transmitted periodically or when a change occurs.
- Indoors, the main module 101 communicates with a radio tag (e.g., a Bluetooth chip) prepared in each room to detect the location. Outdoors, a position information service of a handyphone (or PHS) or a GPS (not shown) is used to detect the location.
- The determination processing in step S 204 is executed to check whether the pulse rate, body temperature, GSR (Galvanic Skin Reflex), posture, action, or voice has changed. If NO in step S 204, the flow returns to step S 203 in FIG. 2. If YES in step S 204, the flow advances to processing in step S 205.
- For the determination in step S 204, the pieces of information of the pulse rate, body temperature, GSR, posture, action, or voice are necessary.
- pieces of vital information such as the pulse rate, body temperature, and GSR are measured simultaneously with the above-described user's behavior state detection. The measuring method will be described below.
- the pulse rate is obtained by the pulse sensor 1026 .
- the pulse sensor 1026 detects a pulse by photoelectrically sensing a change in bloodstream through peripheral blood vessels in, e.g., a finger, wrist, or ear as a part to be measured.
- a portion where blood vessels concentrate is irradiated with light using, as a light source, an incandescent lamp or LED (Light Emitting Diode) capable of emitting light having an absorption wavelength of hemoglobin contained in blood in a large quantity.
- the transmitted or reflected light is received by a photodiode as a photoelectric element, photoelectrically converted, and measured.
- a potential waveform reflecting the light absorption by hemoglobin flowing in the bloodstream is obtained as a detection signal from the photoelectric element (e.g., a photodiode) of the pulse sensor 1026 .
- This signal is amplified by the preprocessor 1025 , filtered, converted into digital data by the A/D converter 1024 , transmitted from the sensor module 102 through the Bluetooth chip 1023 , and thus received by the main module 101 as potential waveform data as pulse rate data.
- the main module 101 analyzes the peak interval or frequency of the potential waveform of the received pulse data and calculates the pulse rate from the peak frequency. The analysis and calculation are done by the CPU 1012 .
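The peak-interval analysis might be realized as below. This is a hedged sketch: the threshold-crossing beat detector and the fixed threshold are assumptions standing in for whatever logic the CPU 1012 actually applies.

```python
def pulse_rate_from_waveform(samples, fs, threshold=0.5):
    """Estimate the pulse rate (beats/min) from a pulse-sensor potential
    waveform by simple threshold-crossing peak detection. The fixed
    threshold is an illustrative assumption; a real implementation would
    adapt it to the signal amplitude."""
    # Each rising crossing of the threshold marks one beat.
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < threshold <= samples[i]]
    if len(crossings) < 2:
        return None  # not enough beats to estimate an interval
    # Mean interval between successive beats, in seconds.
    intervals = [(b - a) / fs for a, b in zip(crossings, crossings[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval
```

Frequency analysis of the waveform (finding the peak frequency) would give an equivalent estimate, as the text states.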
- the pulse sensor 1026 can have the shape of an earring, ring, or wrist watch, and any shape can be employed.
- the pulse sensor 1026 may be incorporated in the headset 106 shown in FIG. 1 such that the light source (incandescent lamp or LED) and photoelectric element (photodiode or CdS cell) are arranged on the front and rear sides of an earlobe.
- the light emitting and photoelectric elements may be incorporated in a ring or wrist watch, and the sensor may be incorporated in each module.
- When the resultant digital signals are received by the main module 101 , the blood pressure or the elastic modulus of blood vessels can be obtained from the difference between the waveforms.
- the blood sugar value can be measured using the reflected light.
- the heartbeat rate may be calculated using an electrocardiogram on the basis of the peak interval or peak frequency obtained from frequency analysis (this method is medically stricter).
- the pulse rate value, blood pressure value, and blood sugar value are always measured and stored in the memory 1011 of the main module 101 .
- measurements are performed periodically or at an arbitrary time in accordance with an instruction from the main module 101 to store data.
- To measure the body temperature, the thermosensor 1027 is used.
- the thermosensor 1027 is formed from a detection device such as a thermocoupler or thermistor.
- the detection device is brought into contact with the body surface of the user, and the output from the detection device is converted into a temperature in accordance with the characteristics of the sensor.
- To measure the GSR, a pair of electrodes is attached to the body surface of the user at a predetermined interval, a weak current is supplied across the electrodes, the potential difference and current value are measured, and the resistance value is calculated from the measurement values.
- drift components are removed from a waveform corresponding to the measurement result obtained from the two electrodes, and then, the amplitude of the leading edge and the number of leading edges are acquired. The drift components are acquired from the average value of the waveform.
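The resistance calculation and drift removal described above can be illustrated as follows. The moving-average window length is an assumed parameter; the text only says that drift is taken as the average value of the waveform.

```python
def skin_resistance(voltage_v, current_a):
    """Skin resistance in ohms from the measured potential difference and
    the weak supplied current (Ohm's law)."""
    return voltage_v / current_a

def remove_drift(waveform, window=10):
    """Subtract a moving average (the drift component described above) so
    that only the faster GSR responses remain. The window length is an
    illustrative assumption."""
    detrended = []
    for i, v in enumerate(waveform):
        lo = max(0, i - window + 1)
        mean = sum(waveform[lo:i + 1]) / (i + 1 - lo)
        detrended.append(v - mean)
    return detrended
```

The amplitude and count of leading edges would then be taken from the detrended waveform.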
- analog (voltage) data from the acceleration sensor 1036 is also A/D-converted and stored in the memory 1011 . These data are linked with each other by giving measurement times to the respective data or recording the data in the same record.
- the pieces of vital information are obtained in this way. If the vital information or posture, action, or voice information changes, the CPU 1012 of the main module 101 acquires current schedule data by processing in step S 205 .
- a change means that the vital information (pulse rate, body temperature, or GSR) abruptly changes or becomes abnormal (for example, the pulse rate is 120 or more, or the body temperature is 37° C. or more), or the action information represents a status change such as “the user stops walking”.
- the CPU 1012 of the main module 101 acquires schedule data including the change time by, e.g., PIM (Personal Information Manager) software bundled with and compatible with the OS of the main module 101 (e.g., the application “Microsoft Outlook 2000” if the OS (Operating System) of the main module 101 is “Windows” available from Microsoft) (step S 205 in FIG. 2).
- Consistency between the pieces of information and the schedule is checked (step S 206 in FIG. 2), and any inconsistencies and deficient information are acquired from the user by speech dialogue and supplemented (step S 207 in FIG. 2).
- the headset 106 receives the voice signal through the Bluetooth chip of its own, transfers the voice signal to the headphone, and causes it to output voice.
- the user who is wearing the headset 106 can hear the question from the main module 101 , “What are you doing now?”
- the user answers this question with his/her current situation by speaking. For example, “ascending stairs” or “standing up from a chair”
- the user's voice is converted into a voice signal by a microphone 1061 of the headset 106 , and the headset 106 transmits the voice signal through the Bluetooth chip of its own by radio.
- the main module 101 of the user receives, through the Bluetooth chip 1013 , the voice signal transmitted by radio.
- the CPU 1012 of the main module 101 executes voice recognition processing for the voice signal and grasps the contents.
- the CPU 1012 of the main module 101 acquires, from a database DB 1 , the user's current schedule data managed by the software (step S 205 in FIG. 2).
- the schedule is prepared in advance in accordance with the behavior plan of the user by specifically setting dates, times, and contents.
- the CPU 1012 of the main module 101 collates the behavior data recognized from the acceleration with the schedule data (step S 206 in FIG. 2). If the collation fails, a dialogue for checking may be carried out, and the expectation result is corrected on the basis of the result of the dialogue. Conversely, when the user remains still for a long time, the schedule is collated to check whether this behavior poses a problem. If the collation fails, the CPU inquires of the user.
- This inquiry is also done by, e.g., speech dialogue.
- Collation with the schedule may be triggered by vital information. For example, when the pulse rate increases at the scheduled desk work time, the behavior has possibly changed, and the CPU asks the user, e.g., “Are you walking or running?” If it is determined as a result of this check that the user is doing desk work, the increase in pulse rate is supposed to be caused by a mental or morbid factor. The main module 101 asks the user “Are you feeling unwell?” through the headset 106 to check whether the user feels stress.
- the CPU 1012 of the main module 101 recognizes that the user's illness is serious. In this case, under the control of the CPU 1012 , the main module 101 searches for medical information registered in advance, controls the handyphone 107 to execute dial call origination, and notifies the family physician of the emergency by, e.g., transmitting a voice message or a mail message prepared for emergency from the handyphone 107 , or alarms those who are around the user.
- the CPU 1012 of the main module 101 estimates the situation or life behavior on the basis of the measurement data, action, and schedule (step S 207 in FIG. 2).
- the CPU searches a personal sensor information corpus DB 2 in the terminal (main module 101 ) for sensor information with the same conditions on the basis of the obtained behavior information (where the user is and what the user is doing) and date/time data of the user, and compares the obtained sensor information with the measured sensor information to determine whether the value or change trend has a significant difference.
- the CPU 1012 of the main module 101 measures the degree of stress from changes in pulse rate, body temperature, and GSR corresponding to the life behavior and situation (step S 208 in FIG. 2).
- the standard range of each piece of vital information is held in the memory 1011 as a parameter in correspondence with each piece of behavior information, and each piece of vital information is compared with its standard range.
- If the value falls within the standard range, it is determined to be normal; otherwise, it is determined to be abnormal.
- Each parameter may be automatically set on the basis of data in normal state.
- the pattern (waveform) of a change in vital information for a certain behavior is stored, a correlation coefficient with respect to the pattern is acquired, and abnormality is determined when the correlation coefficient is equal to or smaller than a set value.
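The correlation-based abnormality check can be sketched like this. The 0.8 cut-off is an illustrative assumption standing in for the "set value" mentioned above.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def is_abnormal(measured, stored_pattern, min_r=0.8):
    """Flag the measurement as abnormal when its correlation with the stored
    normal-state pattern falls at or below the set value. The 0.8 cut-off is
    an assumed parameter."""
    return pearson_r(measured, stored_pattern) <= min_r
```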
- In an abnormal state, the degree of stress becomes higher than that in the normal state due to, e.g., disturbance. With this processing, whether the degree of stress is normal or abnormal can be detected for each behavior.
- FIGS. 7A, 7B, and 7C are views showing displayed vital information/behavior display windows.
- a pulse rate trend graph is displayed on the monitor window every moment, indicating that the pulse rate abruptly increases during walking. Such an abrupt increase deviates from the normal pattern and can be determined as abnormality.
- This pattern can be estimated as a running state (the user is running).
- a question window “A change in measurement data is detected. You seem to be running, and the pulse rate is higher than usual. What is the matter with you?” is presented on the display panel, and the user is requested to answer the question.
- Answer examples such as “I'm running not to be late for work in the afternoon”, “for training”, and “being chased” are prepared and displayed, and the user is made to select one of them. If the user selects “I'm running not to be late for work in the afternoon”, it can be determined that “the pulse rate has increased because the user is in a hurry”, and consequently, it can be detected that “the degree of stress is+(plus)” (FIG. 7C). Even when the main module 101 executes such processing, whether the degree of stress is normal or abnormal can be detected for each behavior.
- The dialogue structure of the speech dialogue used at this time (step S 208 in FIG. 2) is built by processing in the main module 101 in accordance with the user's situation, or a dialogue structure stored in the past in the personal sensor information corpus DB 2 serving as a material database is acquired together with sensor information. This will be described below in more detail.
- the reference sensor information corpus DB 2 has, in one record, environment (season, time, place, posture, action, behavior, and expected behavior), physical information (pulse rate, body temperature, GSR, and voice pitch), degree of stress, and dialogue structure.
- The similarity between the environment and physical information of each record and the measurement data (vital information) obtained from the user is obtained, and the degree of stress is calculated using an evaluation function.
- A record whose similarity is equal to or larger than a certain reference value is recognized as a record that represents the user's situation, and its degree of stress and a dialogue structure for coping with the stress are acquired (step S 210 in FIG. 2).
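One possible shape of the similarity evaluation and record selection is sketched below. The feature weighting, the numeric/categorical scoring rules, and the reference value of 0.7 are all assumptions; the text only states that a similarity is computed with an evaluation function and compared against a reference value.

```python
def situation_similarity(record, observed, weights=None):
    """Weighted similarity between a corpus record and the observed
    environment/physical data. Feature names and weights are illustrative."""
    weights = weights or {k: 1.0 for k in observed}
    score, total = 0.0, 0.0
    for key, value in observed.items():
        w = weights.get(key, 1.0)
        total += w
        ref = record.get(key)
        if isinstance(value, (int, float)) and isinstance(ref, (int, float)):
            # Numeric features: 1 at equality, falling off with relative error.
            score += w * max(0.0, 1.0 - abs(value - ref) / max(abs(ref), 1.0))
        else:
            # Categorical features (place, behavior, ...): exact match only.
            score += w * (1.0 if value == ref else 0.0)
    return score / total if total else 0.0

def best_matching_record(corpus, observed, min_score=0.7):
    """Return the highest-scoring record whose similarity reaches the
    reference value; its stored degree of stress can then be read out."""
    best = max(corpus, key=lambda r: situation_similarity(r, observed))
    return best if situation_similarity(best, observed) >= min_score else None
```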
- the user may be asked a question “You seem to be considerably tired”, “You seem to be tired a little”, or “Are you tired?”
- the degree of stress may be corrected for the user on the basis of an answer from the user, and the correction result may be reflected on the corpus DB 2 .
- the main module 101 registers the dialogue result in the sensor information corpus DB 2 as a dialogue structure for a specific situation by processing in the CPU 1012 .
- a dialogue structure as shown in FIG. 6 is registered as a dialogue result.
- a dialogue structure for the situation is registered in the sensor information corpus DB 2 with contents: System: “Your pulse rate is rising before the meeting. Are you planning a presentation?” → User: “Yes, I have an important presentation. I feel stressed.” → System: “Breathe deeply and relax, or how about something to drink?” → User: “OK.”
- the degree of stress may be detected by continuously analyzing the frequency component of the user's voice.
- the characteristic feature of the degree of stress appears in the frequency component and time-axis component of voice so that, for example, the frequency of the generated voice becomes higher than usual.
- the degree of stress can be detected by continuously analyzing the frequency component of the user's voice during the dialogue. Hence, when the degree of stress is measured by voice frequency analysis, the degree of stress can be more accurately measured (step S 211 in FIG. 2).
- When such a feature is detected, it can be determined that the degree of stress is high.
- For a person who causes the user stress, subjective data for that person is stored in the address book of the PIM software. The determination is done on the basis of vital information (pulse rate, GSR, and the like) when the user meets the person. If the pulse rate is high or the integrated value of the GSR becomes large while speaking with that person, items “person (name)”, “address”, “telephone number”, . . . , “degree of stress” are stored in the address book having the structure shown in FIG. 14 as data of a person toward whom the user feels stress.
- When the user meets the person, the person is recognized from an image or by inputting the name by voice recognition, whereby the user's PIM data is obtained from the database DB 1 to acquire the degree-of-stress data for that person.
- the emotion of the person is recognized from the speech and behavior of the person, the degree of stress is acquired even from the current vital information, and the degree of stress of the user is determined by combination of these data.
- the degree of stress is set as frequency data, and the data are averaged every time the user meets that person.
- the expected degree of stress becomes high for a person toward whom the user habitually feels stress.
- the situation when the user meets a person (e.g., schedule information such as “ordinary meeting”) is recorded in linkage with the degree of stress, and the degree of stress corresponding to each situation is stored in the corpus having the structure shown in FIG. 15.
- When future schedule data and participants (persons) are input, the expected degree of stress is calculated on the basis of a predetermined degree-of-stress formula, so the user can be advised before the meeting to control the stress to some extent.
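The "predetermined degree-of-stress formula" is not given in the text; the following is one plausible, hedged form that combines the situation's stored stress with the per-person stress values recorded in the address book. Both the combination rule and the equal weighting are assumptions.

```python
def expected_stress(participants, situation_stress, person_stress):
    """Expected degree of stress for a scheduled event: the average of the
    situation's stored stress and the mean per-person stress from the
    address-book data. Illustrative formula only."""
    if not participants:
        return situation_stress
    person_avg = sum(person_stress.get(p, 0.0) for p in participants) / len(participants)
    return (situation_stress + person_avg) / 2.0
```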
- a distance sensor (e.g., an ultrasonic distance sensor or infrared distance sensor) may also be used to measure the user's surroundings, and the result may be converted into a degree of stress.
- Another factor for stress build-up is a bad smell or strong smell.
- the intensity and kind of peripheral smell may be recorded using an odor sensor and converted into a degree of stress.
- the schedule, task, and corresponding degree of stress are stored in the corpus shown in FIG. 15, which has items “season”, “day of week”, “schedule/task (To-Do)”, “content/volume”, . . . , “degree of stress”, and “degree of fatigue”.
- the schedule and task are freely input.
- keyword search is used, and the closest item is obtained in consideration of other situation data.
- the degree of stress and subjective information of the user are acquired by the above device, deficient data are supplemented and corrected, and the subjective information is recorded (step S 212 in FIG. 2).
- When the degree of stress is more than a predetermined threshold value and it is determined that the user is stressed, the data of the degree of stress is transmitted to the information providing service agent together with the data of the user's situation, and an information providing service appropriate for the user is offered on the basis of the data (step S 213 in FIG. 2).
- Potential service menus are
- the user is inquired about the service menu at the start time of use of the terminal (in this case, the start time of use of the main module 101 and the like) or when the main module 101 and the like are powered on.
- the user can set the service menu as he/she likes.
- the system links to a content distribution service agent for music or the like, extracts optimum contents for the user from the database of the service agent on the basis of the data of the user's situation (where the user is and what the user is doing) and the data of the degree of stress (the degree of fatigue and whether the user is being stressed), and presents candidates to the user.
- a confirmation message “Playback of this content costs ⁇ . OK?” is displayed.
- the system buys and downloads the content and displays or plays back the stream of data.
- a questionnaire of the result is acquired and fed back to the database.
- a service is offered to navigate the user such that the maximum efficiency can be obtained within the allowable range while allowing stress to some extent.
- Various kinds of events are prepared in accordance with the situation of each user. For example, “professional sports player”, “amateur sports player”, “examination”, “presentation”, and the like are prepared.
- the CPU 1012 of the main module 101 sets navigation menus from the service start day to the actual event and during the event and executes the service.
- the menus may be continuously set in a scale with which relaxation can be obtained at maximum efficiency. In this case, break and relaxation necessary for maximizing the efficiency are provided. If the service mainly aims at eliminating stress, the amount of break and relaxation to be provided is increased.
- the relaxation service is provided at a timing according to the measured user's situation: while the user should keep concentrating, no relaxation service is provided; when a break is appropriate, the service is provided with relaxation advice.
- control may be performed such that a parameter that reflects a change in sympathetic and parasympathetic nerves, e.g., a fluctuation in heart rate, is measured, when the sympathetic nerve is active, an advice menu for maximizing efficiency under that situation is displayed, and when the parasympathetic nerve is active, the relaxation service is provided because a break is necessary.
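One common way to obtain a parameter reflecting sympathetic/parasympathetic balance from heart-rate fluctuation is the LF/HF spectral-power ratio of the RR-interval series. The text names no specific algorithm, so the method choice, band edges, and decision rule below are assumptions.

```python
import numpy as np

def autonomic_balance(rr_intervals_s, fs=4.0,
                      lf_band=(0.04, 0.15), hf_band=(0.15, 0.4)):
    """Estimate whether sympathetic or parasympathetic activity dominates
    from heart-rate fluctuation, using the LF/HF spectral-power ratio of
    the RR-interval series. Band edges and the comparison rule are
    illustrative assumptions."""
    # Resample the irregular RR series onto a uniform time grid.
    times = np.cumsum(rr_intervals_s)
    grid = np.arange(times[0], times[-1], 1.0 / fs)
    uniform = np.interp(grid, times, rr_intervals_s)
    uniform = uniform - uniform.mean()

    power = np.abs(np.fft.rfft(uniform)) ** 2
    freqs = np.fft.rfftfreq(len(uniform), d=1.0 / fs)

    def band(lo, hi):
        return power[(freqs >= lo) & (freqs < hi)].sum()

    lf, hf = band(*lf_band), band(*hf_band)
    return "sympathetic" if lf > hf else "parasympathetic"
```

Under this sketch, the efficiency-maximizing advice menu would be shown when the result is "sympathetic", and the relaxation service when it is "parasympathetic".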
- the type of stress may be estimated from the user's situation, and a service according to the type of stress may be provided.
- the type of stress can be determined, and how to reflect the stress on the service can be determined.
- window information which allows the user to check the history for a predetermined past period from the current time may be generated on the basis of the monitor results of the vital information, posture/action information, and the like, sent to the terminal, and displayed on the user's display such that the user can refer to the history.
- as the life behavior information in this example, [walk (walking)] move to dining room for lunch, [sit (sitting)] meal, and [walk (walking)] finish lunch and move to private room are displayed.
- the current behavior state (running) and current physical condition (pulse rate is high) are displayed together with operation buttons and the like.
- the graph of vital signals at that time is displayed, and the user can see the transition state (FIG. 8B).
- the vital signal graph shows the pulse wave and pulse rate, though any other vital signal can be displayed.
- pieces of vital information of the user including the pulse rate, body temperature, and GSR, are acquired, and the posture information of the user is also acquired.
- a change in user's physical condition is detected from the pieces of information.
- the pieces of information are collated with the behavior schedule of the user; when the change in the user's body does not match the behavior schedule, the degree of stress is measured, and a service or advice for eliminating or relaxing the stress, or for maximizing the user's ability by using the stress, is provided to the user in accordance with the degree of stress and the user's situation.
- a life support apparatus which can relax stress and suppress mental and physical damage by the stress to achieve effective medical administration and healthcare can be provided.
- a life support apparatus capable of increasing the ability by taking advantage of stress can be provided.
- the physical and mental conditions of a user are always grasped, and an advertisement corresponding to the conditions is displayed through a wearable computer.
- the hardware configuration to be used is the same as that shown in the block diagram of FIG. 1.
- the behavior information, vital information, and degree-of-stress information of the user are acquired by the same arrangement and method as in the first embodiment in the above-described way. After the pieces of information are acquired, an advertisement genre corresponding to the user's situation is estimated in a wearable computer (main module 101 in FIG. 1) on the basis of the acquired information, and data of the selected advertisement genre is transferred to an advertisement service agent.
- the main module 101 has a Bluetooth chip 1013 for radio communication.
- the server of an advertisement service agent is connected to the network.
- the server of the advertisement service agent distributes, through the network, advertisements corresponding to genre data sent from the main module 101 of the user.
- the advertisement distribution is done in a form of mail, voice message, banner on a display, or the like, and the advertisements are exchanged through the Bluetooth.
- optimum contents are provided and displayed on the device on the user side in accordance with the current situation of the user at the most effective timing for advertisement. For example, if the user is on his/her way to the office, an advertisement related to the work of the day is provided and displayed, as shown in FIGS. 11A and 11B. If the user is on his/her way to home, an advertisement related to hobby or articles for daily use including foods and clothes, or an advertisement of a supermarket near the route to home is provided and displayed before the purchase chance of the user is lost.
- the display contents and display medium can be switched in accordance with the user's mental condition (whether the degree of stress is high or low). For example, when the degree of stress is high, and the user's mind is occupied with his/her own affairs, advertisement distribution is stopped. Advertisements are distributed later when the user is relaxed.
- the advertisement distribution agent selects contents to be distributed.
- the terminal side (wearable computer) may have a function of selecting an advertisement to be received and displayed in accordance with the situation of the user himself/herself.
- the server of the service agent may execute the filtering in accordance with setting information from the user and distribute advertisements.
- the display medium may be switched in accordance with the user's situation. For example, when it is recognized from behavior information that the user is walking, an advertisement to be provided is presented not by character and image data but by voice data.
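The medium switch just described can be sketched as a dispatch on the recognized behavior. The function name and medium labels below are illustrative assumptions:

```python
# Hypothetical sketch of switching the advertisement presentation medium
# by the user's recognized behavior, per the description above.
def choose_ad_medium(behavior: str) -> str:
    """Pick a presentation medium appropriate to the user's behavior."""
    if behavior == "walking":
        return "voice"           # eyes busy: present the ad by voice data
    if behavior == "sleeping":
        return "none"            # hold distribution until the user is active
    return "text_and_image"      # default: character/image data on a display
```

In a real system this choice would be made just before distribution, so that the same advertisement content can be rendered differently from moment to moment.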
- personal behavior data and the subjective information associated with it may be collected and used for marketing/consulting business. Since the wearable computer (main module 101 ) stores personal behavior data and the associated subjective information, the agent acquires these pieces of information as provided by the user. Since they are personal information, they must be provided only with the user's consent. For this reason, the pieces of information are transmitted to the service side only by a transmission operation performed by the user himself/herself.
- the pieces of collected information must correspond to the consulting service to be provided.
- pieces of information must be collected from persons who have come to or near the convenience store.
- as shown in FIG. 13A, to collect data of a person P who has arrived in the neighborhood (area A) of the store, Bluetooth radio tags are installed at and near the store, and a connection is made via Bluetooth with any person who enters that area (step S 1301 in FIG. 13B).
- the system then inquires of the person, presents the types and distribution destinations of the data, and obtains the user's consent as to whether the data may be sold (step S 1302 in FIG. 13B).
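The inquiry/consent step (S 1301 to S 1302) can be sketched as follows. The class, field, and function names are hypothetical, introduced only to make the flow concrete:

```python
# Hypothetical sketch of the consent step: after a Bluetooth connection is
# made with a person entering area A, the system presents what data would
# be collected and where it would go, and records the person's decision.
from dataclasses import dataclass

@dataclass
class ConsentRequest:
    data_types: list      # e.g., ["behavior", "degree_of_stress"]
    destination: str      # e.g., "store marketing server"

def obtain_consent(request: ConsentRequest, user_answer: bool) -> dict:
    """Return a record of the inquiry and the user's decision."""
    return {
        "data_types": request.data_types,
        "destination": request.destination,
        "consented": user_answer,
    }

record = obtain_consent(
    ConsentRequest(["behavior", "degree_of_stress"], "store server"), True)
# Only if record["consented"] is True may the data be collected and sold.
```

Keeping an explicit consent record per person is an assumption of this sketch, but it mirrors the patent's requirement that personal data be provided only by the user's will.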
- the data can be used for marketing and consulting services in terms of stress, e.g., the data can be reflected on the stock of pieces of merchandise which are useful to get rid of the stress or advertisement distribution for sales campaign in correspondence with the degree of stress of each person.
- the merchandise display can be controlled for each time zone.
- following steps S 1304 and S 1305 in FIG. 13B, the advertisements of the merchandise and store can be displayed (steps S 1306 and S 1307 in FIG. 13B).
- based on the determination in step S 1305 in FIG. 13B, an advertisement that recommends a candy or food of his/her favorite is distributed (steps S 1306 and S 1307 in FIG. 13B) and displayed on the wearable computer of that person (step S 1308 in FIG. 13B) to stimulate his/her will to buy.
- a system capable of promoting sale can be built and operated.
- an agent that sells healthy foods lends or sells at a low price a wearable computer set as shown in FIG. 1 to a consumer.
- when the consumer uses the wearable computer set, the behavior information, degree of stress, and health state of the user are measured by the above-described device.
- the consumer transmits the data of his/her own to the health advice service agent.
- the service agent receives the medical examination result.
- the pieces of information are periodically collected and automatically transmitted to the service agent, or the agent accesses the terminal of each user to collect information and determine the situation. It is determined whether the situation allows presentation of an answer to the user, and the examination result is transmitted from the system of the service agent using a medium according to the situation.
- the user side receives the transmitted examination result by the user's terminal.
- a banner advertisement of a healthy food or medicine (FIGS. 9A and 10A) related to the examination result is displayed, or the homepage of an online shopping service is displayed (FIG. 9B).
- a banner advertisement or a page of online shopping that recommends a vitamin C tablet or medicine for promoting nutrition, such as a nutritive drink, is displayed.
- a coupon service is executed to offer “a point service for use of the service as a benefit” or “a special coupon (e.g., free drink ticket or the like) as a benefit”.
- a store guide map display button is displayed on the banner advertisement (FIG. 10A).
- a map (FIG. 10B) of the store or voice guidance (FIG. 10C) can be effectively given.
- the life support apparatus uses a wearable computer having Bluetooth as a short-distance radio communication device and a function of collecting vital information and behavior information.
- the user carries the apparatus and uses it for his/her healthcare against stress.
- pieces of stress-related information are collected from a pedestrian to the server through a radio tag (e.g., a Bluetooth chip) on the street and the network.
- a content for commercially advertising a measure recommended to the user is distributed to the user to keep him/her informed.
- a system capable of realizing healthcare of the user and providing a commercial effect can be built.
- a system which can be effectively used for business by analyzing pieces of information of the behavior of a user and information related to stress, which are collected in the server, and using the information for consulting and marketing services can be built.
- a life support apparatus which presents a user's situation acquired using the above user's situation recognition device by a means optimum for each inquiring medium in response to an external inquiry from a handyphone, mail, or beeper is provided.
- the hardware configuration is the same as in FIG. 1, and the behavior information, vital information, and degree-of-stress information of a user can be acquired by the same arrangement as in the first embodiment.
- FIG. 16 is a flow chart showing the processing.
- the behavior information, vital information, and degree-of-stress information of the user are acquired by the main module 101 in accordance with the same arrangement and method as in the first embodiment.
- a CPU 1012 of the main module 101 looks up a set mode table shown in FIG. 18 (step S 1603 in FIG. 16).
- the CPU 1012 of the main module 101 starts user's situation recognition processing in accordance with the condition of the set mode table unless the answer is inhibited.
- Various data (vital information) collected by an acceleration sensor module 103 and sensor module 102 of the user are transmitted to the main module 101 .
- the CPU 1012 of the main module 101 recognizes the user's situation, as in the first and second embodiments, and accesses the handyphone 107 through the Bluetooth chip on the basis of the information.
- the CPU 1012 extracts publishable information from the set mode table incorporated in the handyphone 107 and creates voice presentation text by combining the information (step S 1611 in FIG. 16).
- This text may be displayed on the handyphone of the user (callee), and a window for inquiring of the user about whether the answer can be transmitted may be displayed. When the user selects “YES”, the answer is transmitted. This prevents the situation from being carelessly externally transmitted.
- the example of the table shown in FIG. 19 has the following meaning.
- the incoming call notification is a “voice message”, and a “voice message” is output for message display.
- the incoming call notification is done by “vibration”, and message display is done by “text display on a wrist watch type display 105 ”.
- the incoming call notification is done by “vibration”, and message display is done by “display 104 ”.
- the location is “indoor”, and the action is “—(arbitrary)”, the incoming call notification is done by “vibration”, and message display is “not performed”.
- notification may be executed in several steps: for example, the user is notified of only the incoming call by a voice message, and to display the contents as text, the user is notified by a voice message of the medium or device where the details are to be displayed.
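A table like the one FIG. 19 describes, keyed on the user's location and action, can be sketched as follows. The concrete entries and names here are illustrative; only the (location, action) → (notification, display) shape is taken from the description above:

```python
# Hypothetical reconstruction of an answer-device setting table like
# FIG. 19. "any" stands for the wildcard ("—(arbitrary)") action.
MODE_TABLE = {
    # (location, action): (incoming-call notification, message display)
    ("outdoor", "walking"): ("voice_message", "voice_message"),
    ("train",   "any"):     ("vibration", "wristwatch_text"),   # display 105
    ("office",  "working"): ("vibration", "display_104"),
    ("indoor",  "any"):     ("vibration", "none"),
}

def lookup_mode(location: str, action: str):
    """Return (notification, display) for the situation, trying the exact
    action first and then the wildcard 'any' entry for that location."""
    return MODE_TABLE.get(
        (location, action),
        MODE_TABLE.get((location, "any"), ("vibration", "none")))
```

The two-stage lookup lets a row with an arbitrary action (like the "indoor" row described above) apply whenever no more specific row matches.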
- the CPU 1012 of the main module 101 creates and displays notification text “I'm on the Toyoko Line between Jiyugaoka and Toritsu-daigaku. Will arrive at Shibuya in about 10 minutes” on the basis of situation information (step ST 3 in FIG. 20).
- when the user checks the text and selects “transmit”, the text is converted in accordance with the medium and transmitted to the caller (steps ST 4 and ST 5 in FIG. 20).
- a mode for editing the situation information is set (step ST 6 in FIG. 20).
- “Jiyugaoka—Toritsu-daigaku” in the above example can be changed by the user to, e.g., “Nakameguro—Daikanyama” (step ST 7 in FIG. 20).
- the CPU 1012 of the main module 101 automatically changes the message “10 minutes” as the required time for “Jiyugaoka—Toritsu-daigaku” to “5 minutes” as the required time for “Nakameguro—Daikanyama” (step ST 8 in FIG. 20). This can be easily implemented by preparing a section required time table in advance, and when the section is changed in the edit mode, obtaining a corresponding required time by looking up the table.
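The section required-time table described above (step ST 8) can be sketched with a dictionary lookup. The times and the helper function are illustrative assumptions:

```python
# Hypothetical sketch of the section required-time table: when the user
# edits the section name in the message, the required time is updated by
# looking up the table. The minute values are illustrative.
SECTION_TIME_MIN = {
    ("Jiyugaoka", "Toritsu-daigaku"): 10,
    ("Nakameguro", "Daikanyama"): 5,
}

def rewrite_eta(message: str, old: tuple, new: tuple) -> str:
    """Replace the section and its required time in a notification text."""
    msg = message.replace(f"{old[0]} and {old[1]}", f"{new[0]} and {new[1]}")
    return msg.replace(f"{SECTION_TIME_MIN[old]} minutes",
                       f"{SECTION_TIME_MIN[new]} minutes")

msg = "I'm between Jiyugaoka and Toritsu-daigaku. Will arrive in about 10 minutes"
edited = rewrite_eta(msg, ("Jiyugaoka", "Toritsu-daigaku"),
                     ("Nakameguro", "Daikanyama"))
```

As the text notes, preparing such a table in advance is what makes the automatic "10 minutes" → "5 minutes" rewrite easy to implement.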
- when “replace” is selected in step ST 3 , the flow advances to step ST 9 in FIG. 20 to allow whole-message replacement, so any situation can be set.
- when, e.g., “in meeting” is selected (step ST 10 in FIG. 20), text representing that the user is in a meeting in the office can be created independently of the actual situation (steps ST 11 and ST 12 in FIG. 20).
- when the user checks the text and selects “transmit” (steps ST 11 and ST 12 in FIG. 20), the text is converted in accordance with the medium and transmitted to the caller.
- the edit contents are changed in accordance with the user's situation and detectable range. For example, if “in meeting” is selected, the caller is notified of the time of end of meeting (the time of meeting is detected from the schedule).
- the life support apparatus uses a wearable computer which has Bluetooth as a short-distance radio communication device, has a function of collecting vital information and behavior information, and can hold the user's schedule information and recognize the behavior state from that information.
- This apparatus grasps the behavior state of the user, and upon reception of an incoming call at the handyphone or the like, selects an optimum method of dealing with the incoming call from the current behavior state of the user. Even when an incoming call is detected on the train or during a meeting, an optimum response method for that situation is automatically selected. For this reason, the user can respond to the caller without troubling those around him/her. Hence, the user can optimally cope with an incoming call or mail without any stress.
- the feedback medium may be changed in accordance with the user's situation on the basis of the measured and recognized behavior. For example, when the user is walking, a voice message is output. During work, the message is displayed on the display window. When the user is sleeping, no message is output, though in case of emergency, those who are around the user are notified of the emergency as well as the user, or a message is transmitted to the family physician or security agent. In addition, the user may be notified of the emergency by strong vibration or voice to make him/her recognize the emergency level of the information.
- a message for prompting the user to measure data is displayed in accordance with the measurement schedule.
- a follow message is periodically displayed.
- Bluetooth is used for communication between modules, though any other method can be used as long as communication at the personal level is possible.
- a technique called PAN (Personal Area Network) is known for such personal-level communication.
- Communication between modules may be executed using this technique.
- alternatively, an IrDA (infrared communication) interface may be used.
- in the above examples, communication between modules is executed as radio communication.
- instead, cable connection may be performed using, e.g., RS232C, one of the standard interfaces for serial communication.
- as the transfer condition, pieces of vital information before and after a change in action may be transferred, the transfer rate may be raised (the priority level may be increased), or the time resolving power may be increased.
- that is, the time resolving power for data to be measured is increased when needed, and otherwise the data are transferred at a low resolving power.
- the type of acquired information may be controlled. For example, an electrocardiogram is acquired in a high load state, and only a pulse rate is acquired in a low load state.
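The load-dependent control of what is measured and how finely, described in the three bullets above, can be sketched as follows. The sample rates and field names are illustrative assumptions:

```python
# Hypothetical sketch of load-dependent measurement control: in a
# high-load state the richer signal (electrocardiogram) is acquired at
# high time resolution and priority; otherwise only the pulse rate is
# acquired at low resolution. Rates are illustrative, not from the patent.
def measurement_plan(load: str) -> dict:
    """Return what to measure and how finely, given the load state."""
    if load == "high":
        return {"signal": "electrocardiogram",
                "sample_hz": 200,        # fine time resolving power
                "priority": "high"}      # raised transfer priority
    return {"signal": "pulse_rate",
            "sample_hz": 1,              # coarse resolving power
            "priority": "normal"}
```

Centralizing the decision in one function makes it easy to add further conditions, such as transferring extra samples before and after a change in action.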
- wearable sensor modules may be used to acquire data when the sensors are attached, and the sensors on the environment side may be used to acquire data when the sensors are detached.
- an energization type attached/detached status detection sensor in a sensor module detects whether the module is attached to the user. If the sensor is a potential or resistance detection sensor, detachment is detected when the resistance is infinite or the electrodes are open; alternatively, a check signal is repeatedly transmitted from the main module, and detachment is detected when no check signal response is received. If the sensor is detached, the main module searches the environmental network for a sensor capable of acquiring vital information and environmental information of the user, and if one is found, the main module connects to it to acquire data. If no sensor is found, a message “no sensor” is presented to the user and recorded in the data together with the reason.
- using the pulse sensor when the user is taking a bath, the data is switched from the electrocardiogram to the pulse rate obtained while the user is in the bathtub.
- alternatively, data is received as an electrocardiogram from an electrode attached to the bedclothes, or a variation in breathing is detected (from an image).
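The detached-sensor fallback described above can be sketched in two steps: detect detachment, then pick a substitute source. The function names and sensor labels are hypothetical:

```python
# Hypothetical sketch of the detached-sensor fallback: detachment is
# inferred from an open circuit (infinite resistance) or a missing check
# signal, and the main module then searches the environment-side network
# for a substitute sensor.
import math

def is_detached(resistance_ohm: float, check_signal_seen: bool) -> bool:
    """Detached if the electrodes are open (infinite resistance) or the
    check signal from the main module goes unanswered."""
    return math.isinf(resistance_ohm) or not check_signal_seen

def pick_source(detached: bool, environment_sensors: list) -> str:
    if not detached:
        return "wearable_sensor"
    if environment_sensors:
        return environment_sensors[0]   # e.g., electrode in the bedclothes
    return "no sensor"                  # presented to the user and logged

source = pick_source(is_detached(float("inf"), True), ["bedclothes_electrode"])
```

This mirrors the bath and bedclothes examples: when the wearable electrode is off the body, measurement silently continues from whatever environment-side sensor can be found.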
- measurement data is A/D-converted, and the situation is determined on the basis of a digital signal.
- this processing may be executed using an analog signal.
- the degree of stress is grasped, without troubling the user, on the basis of the actual behavior history and vital information, in accordance with the motion information measured from the user and schedule data, thereby navigating the user's life in a desired direction, e.g., relieving stress or making it possible to work at maximum efficiency.
- since the pieces of information are collected in units of regions, they can be used for marketing in each region.
Abstract
A life support apparatus comprising a vital information sensor attached to a body to acquire vital information of a user, a behavior information sensor attached to the body to acquire behavior information of the user, a situation recognition device which recognizes a user's situation based on the behavior information acquired by the behavior information sensor and the vital information acquired by the vital information sensor to generate user's situation information, a database which stores stress management information prepared in advance, an information search device which searches the database for stress management information corresponding to the user's situation information, and an information presentation device which presents the stress management information obtained by the information search device to the user.
Description
- This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2000-163793, filed May 31, 2000, the entire contents of which are incorporated herein by reference.
- 1. Field of the Invention
- The present invention relates to a wearable type life support apparatus for measuring and determining various states of a user by using a wearable device and giving life support by an information service such as medical administration or personal navigation according to the user's situation, a life support method, and an advertisement providing method.
- 2. Description of the Related Art
- As society becomes more complex, stress in daily life is said to be one factor in various problems of modern society, because stress adversely affects people's health by, e.g., causing lifestyle-related diseases such as heart disease or mental diseases such as depression, and also triggers crime.
- “Stress” originally means stimuli from the external world (Yoshinosuke Tadai, “What is Stress?”, Kodansha Bluebacks). Currently, “stress” is taken to also include adaptive reaction against stress. When the adaptive reaction exceeds human limitations, various diseases or mental disorders are said to occur. Alternatively, these disorders occur presumably when the sympathetic nerve and the parasympathetic nerve become unbalanced due to a change in rhythm of life.
- How to cope with stress is important for people today. The best measure against stress is stress control, i.e., to eliminate stress. Various methods are recommended to this end: “to see or listen to images or music that prompts relaxation”, “to do anything one likes”, and “to shout loudly”.
- However, a person who is pressed with business every day is often unconscious of the buildup of stress. He/she is heavily fatigued, and in the worst case, comes to a sudden death. It is therefore important to realize and control stress before it becomes serious or prevent stress from building up.
- As the importance of measures against stress is recognized, development of apparatuses for acquiring vital information of a user and measuring and managing user's stress is an urgent necessity. Several apparatuses have already been proposed.
- However, since a person experiences stress in a variety of situations in daily life, the vital information to be measured changes greatly depending on external situations, such as the peripheral environment and the position of the person, as well as internal situations such as action and mental condition. For this reason, it is difficult to accurately grasp the user's stress state unless the vital information is analyzed and determined in association with the user's behavior.
- To make the user realize stress, he/she must be notified when or immediately after stress builds up. Otherwise, the user cannot be aware of stress, and if the user is unaware of the stress, life style and the like can hardly be improved.
- Hence, demand has arisen for the development of a system which can determine the stress situation in daily life and notify a user of it so as to assist stress control.
- For stress control, it is important in terms of care to provide the user with an optimum measure to cope with the situation. However, such a technique is still absent.
- For portable type information communication devices such as handyphones, ringing tones or conversing voice at a public place as in a train poses a problem. To solve this problem, a technique of setting a vibrator call mode as a “manner” mode or receiving a message by an automatic answering telephone function is already widely used. However, since urgent contact is sometimes required, a mechanism capable of easily communicating a situation or message without voice communication is required.
- In the world of the Internet, advertisement display (banner advertisement) according to the Internet use situation of a user, such as Internet advertisement, is widely used. However, information providing according to each scene of user's daily life and business that uses the information providing service for advertisement have not been realized yet.
- It is an object of the present invention to provide a life support apparatus and method which can determine the stress situation in daily life and notify a user of it to cause the user to realize the stress or can support the user of a method of eliminating stress or care against the factor that has caused the stress, on the basis of the situation of the user.
- The present invention provides a system which can grasp the situation of a user by various sensors, grasp the degree of stress corresponding to each situation, and offer a service such as occasional lifestyle improvement or relaxation using the concept of a wearable computer always attached to the user. In addition, the system is applied to consumer marketing or advertisement display business.
- According to a first aspect of the present invention, there is provided a life support apparatus comprising a vital information sensor attached to a body to acquire vital information of a user, a behavior information sensor attached to the body to acquire behavior information of the user, a situation recognition device which recognizes a user's situation based on the behavior information acquired by the behavior information sensor and the vital information acquired by the vital information sensor to generate user's situation information, a database which stores stress management information prepared in advance, an information search device which searches the database for stress management information corresponding to the user's situation information, and an information presentation device which presents the stress management information obtained by the information search device to the user.
- According to this arrangement, the vital information and behavior information of the user are acquired, the user's situation is recognized on the basis of the acquired behavior information, corresponding information is obtained from pieces of information for dealing with stress, which are prepared in advance, using the acquired user's situation information as a key, and the obtained information is presented to the user.
- The stress situation is determined in daily life, and optimum service information for stress elimination or care is provided to the user in accordance with the situation, thereby enabling life advice contributing to user's healthcare.
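The first-aspect pipeline (recognize the situation, then use it as a key into a prepared stress-management database) can be sketched as follows. The threshold, database entries, and function names are illustrative assumptions, not the patent's actual recognition method:

```python
# Hypothetical end-to-end sketch of the first aspect: a crude situation
# recognition step followed by a keyed lookup in a stress-management
# database prepared in advance. All entries are illustrative.
STRESS_DB = {
    ("working", "high_stress"): "Take a short break and stretch.",
    ("commuting", "high_stress"): "Try slow breathing until your stop.",
}

def recognize_situation(pulse_rate: int, behavior: str) -> tuple:
    """Very crude stand-in for the situation recognition device:
    combine a behavior label with a thresholded vital sign."""
    stress = "high_stress" if pulse_rate > 100 else "low_stress"
    return (behavior, stress)

def support_message(pulse_rate: int, behavior: str) -> str:
    """Search the database using the situation information as a key."""
    key = recognize_situation(pulse_rate, behavior)
    return STRESS_DB.get(key, "No advice for this situation.")
```

The point of the sketch is the division into the three claimed devices: recognition produces a situation key, the search device indexes the database with it, and the presentation device shows the result.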
- According to a second aspect of the present invention, there is provided a life support apparatus comprising a user information sensor attached to a body to acquire information representing a user's situation, a situation recognition device which recognizes the user's situation based on user information acquired by the user information sensor, a transceiver device which transmits the information of the user's situation recognized by the situation recognition device and receives external information transmitted from an external apparatus, and a presentation device which presents the external information received by the transceiver device to the user, the external information including an advertisement appropriate for the user, which is sent from the external apparatus in correspondence with the user's situation information.
- According to this arrangement, information representing the physical situation of the user is acquired, the user's situation is recognized on the basis of the acquired information, and advertisement information corresponding to the recognized user's situation can be obtained from the server that holds various kinds of advertisement information corresponding to physical situations, and presented to the user.
- The stress situation is determined in daily life, optimum service information for stress elimination or care is provided to the user in accordance with the situation in consideration of the time and circumstances, and the user is prompted to use the service, thereby enabling life advice contributing to the commercial effect and user's healthcare.
- According to a third aspect of the present invention, there is provided a life support apparatus comprising various kinds of information presentation media for a voice or text message, a user information sensor attached to a body to acquire user information representing a user's situation, a situation recognition device which recognizes the user's situation based on user information acquired by the user information sensor to generate user's situation information, a communication device connected to the situation recognition device and communicating with external equipment, a situation information conversion device which selects an optimum message presentation medium from the information presentation media in accordance with the user's situation information and converts the situation information into a form corresponding to the optimum message presentation medium, to present call message information sent from the message sender and received by the communication device to the user, and an answer transmission device which transmits the user's situation information converted by the situation information conversion device to the message sender.
- In this arrangement, upon reception of incoming message information addressed to the user, the call message is converted into a form that uses an appropriate medium corresponding to the user's situation recognized by the situation recognition device, and presented. The answer transmission device transmits the user's situation information converted by the situation information conversion device to the message sender.
- The user's current situation is determined in daily life to cope with an incoming call in accordance with the user's situation at that time in consideration of specific time and circumstances such that the user need not worry about it, thereby enabling life advice without imposing any stress on the user.
- According to a fourth aspect of the present invention, there is provided an advertisement information providing method comprising preparing a server which holds various kinds of advertisement information corresponding to physical situations, and extracting optimum advertisement information corresponding to a situation of a user to present the optimum advertisement information to the user.
- According to the present invention, the server which holds various kinds of advertisement information corresponding to physical situations is prepared, the physical situation of the user is detected, and optimum advertisement information is obtained from the server in correspondence with the user's situation and presented to the user.
- The user's current situation is determined in daily life, and advertisement information such as optimum merchandise to deal with stress can be presented to the user on the basis of the determination result in consideration of the user's situation at that time. This contributes to user's stress care and healthcare, and commercially effective advertisements can be provided.
- FIG. 1 is a schematic block diagram showing a wearable type life support apparatus according to the first embodiment of the present invention;
- FIG. 2 is a flow chart showing the processing procedure of the wearable type life support apparatus according to the first embodiment of the present invention;
- FIGS. 3A to 3C are schematic views showing the principle of human posture recognition used in the present invention;
- FIG. 4 is a flow chart showing action and posture recognition processing used in the embodiment of the present invention;
- FIG. 5 is a view for explaining the structure of a reference sensor information corpus related to stress, which is used in the embodiment of the present invention;
- FIG. 6 is a view for explaining the dialogue structure for a situation registered in the sensor information corpus used in the embodiment of the present invention;
- FIGS. 7A, 7B, and 7C are views for explaining behavior-related pulse rate trend graph display and a behavior input window for an abnormal value, which are used in the embodiment of the present invention;
- FIGS. 8A and 8B are views for explaining a display window of vital information related to a behavior, which is used in the embodiment of the present invention;
- FIGS. 9A and 9B are views for explaining situation-dependent advertisement display and an online shopping window according to the display, which are used in the embodiment of the present invention;
- FIGS. 10A, 10B, and 10C are views for explaining advertisement display corresponding to the degree of user's fatigue/stress and a road map window, which are used in the embodiment of the present invention;
- FIGS. 11A, 11B, and 11C are views for explaining advertisement displays corresponding to the behavior information of the user and a road map, which are used in the second embodiment of the present invention;
- FIG. 12 is a view for explaining the structure of a regional sensor information corpus related to stress, which is used in the embodiment of the present invention;
- FIGS. 13A and 13B are a view and flow chart, respectively, showing collection of information of a person who passes by a convenience store and advertisement display, which are used in the embodiment of the present invention;
- FIG. 14 is a view showing the structure of address book data including degree-of-stress information, which is used in the embodiment of the present invention;
- FIG. 15 is a view for explaining the structure of a relational database of a schedule/task list, the degree of stress, and the degree of fatigue, which is used in the embodiment of the present invention;
- FIG. 16 is a flow chart of processing of returning a situation-dependent automatic answering telephone message to a handyphone according to the third embodiment of the present invention;
- FIG. 17 is a flow chart of posture/action recognition based on peak detection of a waveform as a function of time, which is used in the embodiment of the present invention;
- FIG. 18 is a view showing an allowable range setting table of answer contents for each message sender, which is used in the embodiment of the present invention;
- FIG. 19 is a view showing an answer device setting table for each user's situation, which is used in the embodiment of the present invention; and
- FIG. 20 is a view showing message display window transition that is used in the embodiment of the present invention.
- The embodiments of the present invention will be described below in detail with reference to the accompanying drawing.
- (First Embodiment)
- A wearable type life support apparatus which can give a life advice effectively used for healthcare and medical administration by monitoring stress applied on the body of a user, and when stress occurs, relaxing the stress to suppress mental and physical damage by the stress, and can also give navigation by utilizing the stress to improve the ability of the user will be described here with reference to FIG. 1.
- The wearable type life support apparatus shown in FIG. 1 has a
main module 101, sensor modules 102, an acceleration sensor module 103, a display 104, a wristwatch type display 105, a headset 106, and a handyphone 107. - Of these components, the
main module 101 is a compact and lightweight computer such as a wearable computer, which has a function of analyzing collected vital information to grasp the degree of stress and providing various kinds of support in accordance with the degree of stress. The main module 101 also has functions of processing collected data, sending the processed data to the database of a center, executing desired processing using information obtained from the database, and transmitting/receiving information or a control command to/from the headset 106, display 104, or handyphone 107. - The
main module 101 is formed from a memory 1011 and a CPU 1012. Application programs and control programs for implementing the above-described functions and an OS (Operating System) as the basic software of the computer are stored in the memory 1011. The CPU 1012 executes these programs to realize various desired processing operations. The main module 101 also has a calendar/timepiece function such that collected or processed information can be managed with a time stamp. - The
main module 101 also has, e.g., a function of synthesizing a character string prepared as text data into a voice and outputting a voice signal, a function of recognizing a voice signal and converting it into text data, and a function of collating data. The main module 101 has, e.g., a Bluetooth chip 1013 for executing communication between modules using Bluetooth as an international standard short-distance radio communication device which has received a great deal of attention in recent years. The main module 101 can store data to be handled in the system, systematically manage the entire system, execute data communication between the modules, and communicate with a home server and management server (not shown). - The
sensor modules 102 collect and transmit vital signals and are connected to vital signal detection sensors such as a pulse sensor 1026 for detecting pulses of a human body, a thermosensor 1027 for detecting the body temperature of the human body, and a GSR (Galvanic Skin Reflex) electrode 1028 for detecting the skin resistance of the human body. Each sensor module 102 comprises a preprocessor 1025 that amplifies and preprocesses the detection signal from each sensor, an A/D converter 1024 that converts the sensor detection signal preprocessed by the preprocessor 1025 into digital data, a CPU 1022 that executes various control operations and data processing, and a memory 1021. Each sensor module 102 also incorporates a Bluetooth chip 1023 to execute data communication with the main module 101. - The structures from the
sensors to the sensor modules 102 are divided for the respective sensors. However, the structures for the respective sensors may be integrated into a single sensor module 102. Processing operations in each sensor and module 102 may be integrated. A microcontroller (e.g., a PIC16F877 available from Microchip Technology) incorporating an A/D conversion function may be used as the CPU 1022 without preparing a separate A/D converter. - The
preprocessor 1025 not only amplifies the signal detected by each sensor by an appropriate gain but also incorporates a filter circuit that performs high-pass filter processing depending on the type of signal or low-pass filter (anti-aliasing filter) processing in accordance with the band of each signal. Each sensor has a plurality of channels as needed. - The
handyphone 107 is a normal handyphone having a liquid crystal display panel, a plurality of operation buttons including dial keys, and a transceiver, and it inputs/outputs voice. The handyphone 107 also incorporates a Bluetooth chip and can communicate with the main module 101. With this arrangement, voice input/output and cursor control by cursor keys can be performed. - The
display 104 is a display terminal formed from a portable liquid crystal display panel which displays text data or an image and is constructed exclusively for display. The display 104 has a Bluetooth chip 1041 and can control display contents upon receiving display data and the like from the main module 101 through the Bluetooth chips 1013 and 1041. - The
headset 106 is an input/output terminal used on the user's head, i.e., a headset incorporating a Bluetooth chip and a CCD camera (solid state image sensor) as well as a headphone (or earphone) and a microphone. The headset 106 is a device for the voice/image interface; its Bluetooth chip is used to transmit and receive voice signals and to transmit images. The headset 106 can be used simultaneously together with the handyphone 107. - The wrist
watch type display 105 is a liquid crystal display panel having a wrist watch shape used on the user's arm. The wristwatch type display 105 incorporates a Bluetooth chip to transmit/receive data or commands to/from the main module 101. - This apparatus assumes digital communication by Bluetooth. However, a radio communication device of any other scheme, or a scheme of performing D/A conversion and transferring a voice signal to the headphone by FM modulation, may be employed. A voice signal may be transferred not by radio communication but by cable connection. An image may be acquired by a digital camera attached independently of the
headset 106. - The function of the system with the above arrangement will be described next.
- FIG. 2 is a flow chart showing the flow of operation of the system according to the present invention having the arrangement shown in FIG. 1. The operation will be described with reference to the flow chart shown in FIG. 2. The user carries the
main module 101, sensor modules 102, handyphone 107, display 104, and headset 106. The pulse sensor 1026, thermosensor 1027, GSR electrode 1028, and acceleration sensor 1036 are set on the user, and then, the system is activated to start operation (step S201 in FIG. 2). - When the sensors are set and activated, they start to detect a vital signal. As a result, a pulse rate detection signal by the
pulse sensor 1026, a temperature detection signal by the thermosensor 1027, a galvanic skin reflex detection signal by the GSR electrode 1028, and an acceleration measurement signal by the acceleration sensor 1036 are obtained (step S202 in FIG. 2). The measurements are done continuously, periodically (every minute, every 10 minutes, or the like), or in accordance with a measurement instruction from the main module 101 or a user's instruction. - The analog detection signals obtained by the
sensors are converted into digital data by the A/D converters 1024 of the sensor modules 102. The A/D-converted data are transferred to the main module 101 through a short-distance radio device such as the Bluetooth chip 1023. - The
main module 101 processes the measurement data according to preset logic, thereby determining the user's situation. - First, the
main module 101 recognizes the action (behavior) or posture of the user on the basis of acceleration information obtained from the acceleration sensor 1036 (step S203 in FIG. 2). - The action/posture recognition method in step S203 is shown in the action recognition flow chart of FIG. 4.
- <Action/Posture Recognition in S203>
- Referring to FIG. 4, the acceleration information is obtained by attaching, e.g., a three-dimensional acceleration sensor to a predetermined portion of the human body as the
acceleration sensor 1036, thereby measuring the posture and action. The three-dimensional acceleration sensor 1036 can be formed by perpendicularly laying out two two-dimensional acceleration sensors such as “ADXL202JC” available from Analog Devices Corp. The three-dimensional acceleration sensor 1036 is attached to, e.g., the waist to measure the motion of the body center (trunk) portion. - As shown in FIG. 3A, 3B, or3C, the tilt of the sensor is obtained from a DC component, i.e., an output obtained by passing the acceleration waveform from the
acceleration sensor 1036 through a low-pass filter, thereby detecting the posture. - For example, the
sensor 1036 is attached to the joint portion of the base of the femoral region of a user P. An angle is obtained from the vertical and horizontal components of the DC component, and the posture can be recognized on the basis of the angle: when the sensor is almost horizontal, the user is lying on his/her back (FIG. 3C) or on his/her face, when the sensor is almost vertical, the user stands upright (FIG. 3A), and the sensor has an angle therebetween, the user is sitting (FIG. 3B). - The action (walking, running, bicycle, car, train, or the like) can be identified from the frequency component and variation pattern of an AC component. For example, the fundamental frequency of walking is 60 to 100 (times/min), and that of running is 120 to 180 (times/min). Hence, the fundamental frequency component is acquired by performing frequency analysis (FFT (Fast Fourier Transform)) for the detected signal (S401 in FIG. 4) or by detecting waveform peaks and peak interval. The powers in the respective bands are compared, thereby recognizing walking or running.
- If the
acceleration sensor 1036 is attached not to the waist but to a leg portion, e.g., the femoral region of a leg, the acceleration during walking is maximized by vibration when the foot touches the ground. However, during running (running fast), the acceleration is maximized by vertical movement of the waist when the feet touch the ground. For this reason, for walking, the fundamental frequency must be further halved. Alternatively, since the vertical amplitude for running is larger by twice or more than that for walking, the amplitude values at the time of peak detection are compared, thereby recognizing walking or running. - The above methods may be combined, or either method may be used.
- In the flow chart shown in FIG. 4, FFT is used for analysis processing. However, the present invention is not limited to this. Another spectrum analysis device such as wavelet transformation may be used. Alternatively, pattern matching between the waveforms of fundamental frequencies may be performed to recognize “running”, “walking”, “ascending stairs”, or “descending stairs”. Simple peak detection may be performed, and the number of steps may be measured from the period. Alternatively, as shown in FIG. 17, peak detection may be performed along the time axis to obtain the walking or running pitch.
- Posture/action recognition is done by the
main module 101. However, posture/action recognition may be done by thesensor modules 102, and resultant status data (posture and action) may be transmitted periodically or when a change occurs. - To recognize a position indoors, the
main module 101 communicates with a radio tag (e.g., a Bluetooth chip) prepared in each room to detect the location. Outdoors, a position information service of a handyphone (or PHS) or a GPS (not shown) is used to detect the location. - With the above operation, action (behavior) of the user can be recognized from the acceleration information. When the above processing is ended, the flow advances to determination processing in step S204.
- The determination processing in step S204 is executed to check whether the pulse rate, body temperature, GSR (Galvanic Skin Reflex), posture, action, or voice has changed. If NO in step S204, the flow returns to step S203 in FIG. 2. If YES in step S204, the flow advances to processing in step S205.
- For the determination processing in step S204, the pieces of information of the pulse rate, body temperature, GSR, posture, action, or voice are necessary. Of these pieces of information, pieces of vital information such as the pulse rate, body temperature, and GSR are measured simultaneously with the above-described user's behavior state detection. The measuring method will be described below.
- <Vital Information Measurement>
- The pulse rate is obtained by the
pulse sensor 1026. Thepulse sensor 1026 detects a pulse by photoelectrically sensing a change in bloodstream through peripheral blood vessels in, e.g., a finger, wrist, or ear as a part to be measured. A portion where blood vessels concentrate is irradiated with light using, as a light source, an incandescent lamp or LED (Light Emitting Diode) capable of emitting light having an absorption wavelength of hemoglobin contained in blood in a large quantity. The transmitted or reflected light is received by a photodiode as a photoelectric element, photoelectrically converted, and measured. - A potential waveform on which the influence of light absorption by hemoglobin that flows in bloodstream is reflected is obtained as a detection signal from the component, e.g., photodiode of the
pulse sensor 1026. This signal is amplified by thepreprocessor 1025, filtered, converted into digital data by the A/D converter 1024, transmitted from thesensor module 102 through theBluetooth chip 1023, and thus received by themain module 101 as potential waveform data as pulse rate data. - The
main module 101 analyzes the peak interval or frequency of the potential waveform of the received pulse data and calculates the pulse rate from the peak frequency. The analysis and calculation are done by theCPU 1012. - The
pulse sensor 1026 can have the shape of an earring, ring, or wrist watch, and any shape can be employed. Alternatively, thepulse sensor 1026 may be incorporated in theheadset 106 shown in FIG. 1 such that the light emitting diode (incardescent lamp or LED) and photoelectric element (photodiode) or CdS cell are arranged on the front and rear sides of an earlobe. The light emitting and photoelectric elements may be incorporated in a ring or wrist watch, and the sensor may be incorporated in each module. - When, e.g., two
pulse sensors 1026 are set at a predetermined interval to measure two waveforms, the resultant digital signals are received by themain module 101, the blood pressure or the elastic modulus of blood vessels can be obtained from the difference between the waveforms. - In addition, when LEDs for two wavelengths, i.e., absorption wavelength of oxyhemoglobin and that of reduced hemoglobin are used to irradiate blood vessels with light, and reflected light is measured, the oxygen saturation in the artery can be calculated.
- Alternatively, when blood vessels are irradiated with light from an LED having an absorption wavelength of glucose, the blood sugar value can be measured using the reflected light.
- In measuring the pulse rate, the heartbeat rate may be calculated using an electrocardiogram on the basis of the peak interval or peak frequency obtained from frequency analysis (this method is medically stricter).
- The pulse rate value, blood pressure value, and blood sugar value are always measured and stored in the
memory 1011 of themain module 101. Alternatively, measurements are performed periodically or at an arbitrary time in accordance with an instruction from themain module 101 to store data. - To measure the body temperature, the
thermosensor 1027 is used. Thethermosensor 1027 is formed from a detection device such as a thermocoupler or thermistor. The detection device is brought into contact with the body surface of the user, and the output from the detection device is converted into a temperature in accordance with the characteristics of the sensor. - To measure the GSR, a pair of electrodes are attached to the body surface of the user at a predetermined interval, a weak current is supplied across the electrodes, the potential difference and current value are measured, and the resistance value is calculated using the measurement values. In the measurement, drift components are removed from a waveform corresponding to the measurement result obtained from the two electrodes, and then, the amplitude of the leading edge and the number of leading edges are acquired. The drift components are acquired from the average value of the waveform.
- These data are also converted into digital data, transmitted to the
main module 101 by radio, and stored in thememory 1011 of themain module 101, like the output from thepulse sensor 1026. - Together with these measurement values, analog (voltage) data from the
acceleration sensor 1036 is also A/D-converted and stored in thememory 1011. These data are linked with each other by giving measurement times to the respective data or recording the data in the same record. - The pieces of vital information are obtained in this way. If the vital information or posture, action, or voice information changes, the
CPU 1012 of themain module 101 acquires current schedule data by processing in step S205. - The presence/absence of a change in vital information or posture or action information is determined on the basis of the following reference.
- In the measurements continued in the above-described manner, a change means that the vital information (pulse rate, body temperature, or GSR) abruptly changes or becomes abnormal (for example, the pulse rate is “120” or more, or the body temperature is “37° C.” or more), or the action information represents a status change such as “the user stops walking”. When such a change is detected (step S204 in FIG. 2), the
CPU 1012 of themain module 101 acquires schedule data including the change time by, e.g., PIM (Personal Information Manager) software that belongs and is compatible to the OS of the main module 101 (by, e.g., application “Microsoft Outlook 2000” if the OS (Operating System) of themain module 101 is “Windows” available from Microsoft) (step S205 in FIG. 2). - Consistency between the pieces of information and the schedule is checked (step S206 in FIG. 2), and any inconsistencies and deficient information are acquired from the user by speech dialogue and supplemented (step S207 in FIG. 2).
- The method of acquiring and supplementing information from the user by speech dialogue will be described below.
- Assume that the absolute values of the AC component outputs from the
acceleration sensor 1036 in the three axial directions (x-axis, y-axis, and z-axis) fall outside a preset range. TheCPU 1012 of themain module 101 determines that “the user is acting” because the absolute values of the AC components in the three axial directions exceed the set value, and asks the user “What are you doing now?” and executes voice recognition for the answer, thereby inputting behavior information. - More specifically, text data “What are you doing now?” is prepared as question data, and this data is synthesized into a voice signal and transmitted to the
headset 106 through theBluetooth chip 1013. - The
headset 106 receives the voice signal through the Bluetooth chip of its own, transfers the voice signal to the headphone, and causes it to output voice. The user who is wearing theheadset 106 can hear the question from themain module 101, “What are you doing now?” - The user answers this question with his/her current situation by speaking. For example, “ascending stairs” or “standing up from a chair” The user's voice is converted into a voice signal by a
microphone 1061 of theheadset 106, and theheadset 106 transmits the voice signal through the Bluetooth chip of its own by radio. Themain module 101 of the user receives, through theBluetooth chip 1013, the voice signal transmitted by radio. TheCPU 1012 of themain module 101 executes voice recognition processing for the voice signal and grasps the contents. - Using the PIM software, the
CPU 1012 of themain module 101 acquires, from a database DB1, the user's current schedule data managed by the software (step S205 in FIG. 2). The schedule is prepared in advance in accordance with the behavior plan of the user by specifically setting dates, times, and contents. - The
CPU 1012 of themain module 101 collates the behavior data recognized from the acceleration with the schedule data (step S206 in FIG. 2). If the collation fails, a dialogue for checking it may be done to correct the expectation result on the basis of the result of dialogue. Conversely, when the user is standing still for a long time, it is checked by collating the schedule whether the behavior has no problem. If the collation fails, the CPU inquires of the user. - This inquiry is also done by, e.g., speech dialogue.
- Collation with the schedule may be triggered by vital information. For example, when the pulse rate increases at the scheduled desk work time, the behavior possibly changes, and the CPU asks the user, e.g., “Are you walking or running?” If it is determined as a result of check that the user is at desk working, the increase in pulse rate is supposed to be caused by a mental or morbid factor. The
main module 101 asks the user “Are you feeling unwell?” through theheadset 106 to check whether the user feels stress. - If no answer to this question is received from the user, the
CPU 1012 of themain module 101 recognizes that the user's illness is serious. In this case, under the control of theCPU 1012, themain module 101 searches for medical information registered in advance, controls thehandyphone 107 to execute dial call origination, and notifies the family physician of the emergency by, e.g., transmitting a voice message or a mail message prepared for emergency from thehandyphone 107, or alarms those who are around the user. - The
CPU 1012 of themain module 101 estimates the situation or life behavior on the basis of the measurement data, action, and schedule (step S207 in FIG. 2). - That is, the CPU searches a personal sensor information corpus DB2 in the terminal (main module 101) for sensor information with the same conditions on the basis of the obtained behavior information (where the user is and what the user is doing) and date/time data of the user, and compares the obtained sensor information with the measured sensor information to determine whether the value or change trend has a significant difference.
- The
CPU 1012 of themain module 101 measures the degree of stress from changes in pulse rate, body temperature, and GSR corresponding to the life behavior and situation (step S208 in FIG. 2). - In the
main module 101, the standard range of each vital information is held in thememory 1011 as a parameter in correspondence with each behavior information, and each vital information is compared with the standard range. When the information to be measured falls within the standard range, the value is determined to be normal. When the information falls outside the standard range, the value is determined to be abnormal. Each parameter may be automatically set on the basis of data in normal state. Alternatively, the pattern (waveform) of a change in vital information for a certain behavior is stored, a correlation coefficient with respect to the pattern is acquired, and abnormality is determined when the correlation coefficient is equal to or smaller than a set value. When the value deviates from the normal value, it can be determined that the degree of stress becomes higher than that in the normal state due to, e.g., disturbance. With this processing, whether the degree of stress is normal or abnormal can be detected for each behavior. - Even in the following case, whether the degree of stress is normal or abnormal can be detected. For example, FIGS. 7A, 7B, and7C are views showing displayed vital information/behavior display windows. As shown in FIG. 7A, a pulse rate trend graph is displayed on the monitor window every moment, indicating that the pulse rate abruptly increases during walking. Such an abrupt increase deviates from the normal pattern and can be determined as abnormality. This pattern can be estimated as a running state (the user is running). Hence, as shown in FIG. 7B, a question window, “A change in measurement data is detected. You seem to be running, and the pulse rate is higher than usual. What is the matter with you?” is presented on the display panel, and the user is requested to answer the question. 
Answer examples such as “I'm running not to be late for work in the afternoon”, “for training”, and “being chased” are prepared and displayed, and the user is made to select one of them. If the user selects “I'm running not to be late for work in the afternoon”, it can be determined that “the pulse rate has increased because the user is in a hurry”, and consequently, it can be detected that “the degree of stress is+(plus)” (FIG. 7C). Even when the
main module 101 executes such processing, whether the degree of stress is normal or abnormal can be detected for each behavior. - Next, to grasp the contents of the user's feeling for the degree of stress, the user is asked about subjective information by speech dialogue (step S208 in FIG. 2). The dialogue structure of the speech dialogue used at this time is built by processing in the
main module 101 in accordance with the user's situation, or a dialogue structure stored in the past in the personal sensor information corpus DB2 serving as a material database is acquired together with sensor information. This will be described below in more detail. - <Dialogue Structure Acquisition Method>
- A method of acquiring the dialogue structure from the sensor information corpus DB2 will be described with reference to FIG. 5. As shown in FIG. 5, the reference sensor information corpus DB2 has, in one record, environment (season, time, place, posture, action, behavior, and expected behavior), physical information (pulse rate, body temperature, GSR, and voice pitch), degree of stress, and dialogue structure. The similarity between the environment and physical information and the measurement data (vital information) obtained from the user is obtained, and the degree of stress is calculated using an evaluation function. A value equal to or larger than a certain reference value is recognized as a record that represents the user's situation, and the degree of stress and a dialogue structure for coping with the stress are acquired (step S210 in FIG. 2).
- For the degree of stress determined (acquired from the average sensor information corpus), the user may be asked a question “You seem to be considerably tired”, “You seem to be tired a little”, or “Are you tired?” The degree of stress may be corrected for the user on the basis of an answer from the user, and the correction result may be reflected on the corpus DB2.
- For example, assume that the pulse rate increases before a meeting. The system grasps this situation and asks the user a question “Your pulse rate is rising before meeting. Are you planning presentation?” The user returns to the system an answer “Yes, I have important presentation. I feel stressful”. Upon receiving this answer, the system side gives an advice to the user to get rid of the stress, “Breathe deeply and relax, or how about something to drink?”, and the user replies “OK”.
- The
main module 101 registers the dialogue result in the sensor information corpus DB2 as a dialogue structure for a specific situation by processing in theCPU 1012. In this example, a dialogue structure as shown in FIG. 6 is registered as a dialogue result. For a situation “dialogue structure: pulse rate rises before meeting”, a dialogue structure for the situation is registered in the sensor information corpus DB2 with contents “System: “Your pulse rate is rising before meeting. Are you planning presentation?””→“User: “Yes, I have important presentation. I feel stressful”” “System: “Breathe deeply and relax or how about something to drink?””→“User: “OK”” - The degree of stress may be detected by continuously analyzing the frequency component of the user's voice. As a characteristic feature of a human voice, the characteristic feature of the degree of stress appears in the frequency component and time-axis component of voice so that, for example, the frequency of the generated voice becomes higher than usual. On the basis of this fact, the degree of stress can be detected by continuously analyzing the frequency component of the user's voice during the dialogue. Hence, when the degree of stress is measured by voice frequency analysis, the degree of stress can be more accurately measured (step S211 in FIG. 2).
- Alternatively, if the user feels difficulty (pressured) to speak with someone or one of participants of the current meeting, it is determined that the degree of stress is high. First, subjective data for that person is stored in the address book of the PIM software. The determination is done on the basis of vital information (pulse rate, GSR, and the like) when the user meets the person. If the pulse rate is high or the integrated value of GSR becomes large during speaking with that person, items “person (name)”, “address”, “telephone number”, . . . , “degree of stress” are stored in the address book having a structure shown in FIG. 14 as data of a person for which the user feels stressful.
- When the user meets the person, the person is recognized from the image or by inputting the name by voice recognition whereby the user's PIM data is obtained from the database DB1 to acquire the data of the degree of stress for that person. In addition, the emotion of the person is recognized from the speech and behavior of the person, the degree of stress is acquired even from the current vital information, and the degree of stress of the user is determined by combination of these data.
- The degree of stress is set as frequency data, and the data are averaged every time the user meets that person. The expected degree of stress becomes high for a person for which the user habitually feels stressful.
- Alternatively, the situation when the user meets a person (e.g., schedule information such as “ordinary meeting” is recorded in linkage with the degree of stress, and the degree of stress corresponding to each situation is stored in the corpus having a structure shown in FIG. 15. When schedule data in the future and participants (persons) are input, the expected degree of stress is calculated on the basis of a predetermined degree-of-stress formula, so the user is advised before the meeting to control the stress to some extent.
- A person sometimes feels stressful depending on the distance from another person. This is understood as a concept “personal space”. The distance changes even depending on the mental condition of a person. When the degree of stress by this concept is also measured, the degree of stress can be more practically reflected. More specifically, the distance from a person is measured using a distance sensor (e.g., an ultrasonic distance sensor or infrared distance sensor) attached to the user, recorded in association with the situation such as the name of the person and date/time/place. The personal space for each situation is also measured, and the degree of stress is counted for each situation in accordance with the person or the time while the person is in the personal space.
- Another factor for stress build-up is a bad smell or strong smell. The intensity and kind of peripheral smell may be recorded using an odor sensor and converted into a degree of stress.
- Still another factor for stress build-up is time. The degree of stress becomes high when the schedule is tight or when the user has a work with time limit. The degree of stress is linked with task (To-Do) data or an event of schedule and recorded, as shown in FIG. 15. As the time limit nears, it is determined that the degree of stress is high.
- The schedule, task, and corresponding degree of stress are stored in the corpus shown in FIG. 15, which has items “season”, “day of week”, “schedule/task (To-Do)”, “content/volume”, . . . , “degree of stress”, and “degree of fatigue”. The schedule and task are freely input. To search the corpus for an item name, keyword search is used, and the closest item is obtained in consideration of other situation data.
- The degree of stress and subjective information of the user are acquired by the above device, deficient data are supplemented and corrected, and the subjective information is recorded (step S212 in FIG. 2). When the degree of stress is more than a predetermined threshold value, and it is determined that the user is stressed, the data of the degree of stress is transmitted to the information providing service agent together with the data of the user's situation, and an information providing service appropriate for the user is offered on the basis of the data (step S213 in FIG. 2). Potential service menus are
- 1. Distribution of music, image, and short-short story (relaxation)
- 2. Advice/navigation for event (to increase concentration)
- 3. Combined service of 1 and 2
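The threshold test of steps S212/S213 and the three service menus above can be sketched as follows. The threshold value 0.7 and the payload layout are assumptions for illustration; the specification only states that a predetermined threshold is used.

```python
STRESS_THRESHOLD = 0.7  # assumed value for this sketch

SERVICE_MENUS = {
    1: "distribution of music/image/story (relaxation)",
    2: "advice/navigation for event (concentration)",
    3: "combined service of 1 and 2",
}

def select_service(degree_of_stress, situation, menu=1):
    """Return a request for the information providing service agent,
    or None when the user is not judged to be stressed (step S212)."""
    if degree_of_stress <= STRESS_THRESHOLD:
        return None
    # Step S213: transmit degree of stress together with situation data.
    return {
        "service": SERVICE_MENUS[menu],
        "degree_of_stress": degree_of_stress,
        "situation": situation,  # where the user is / what the user is doing
    }
```

The `menu` argument stands in for the user's selection made when the terminal is powered on.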
- The user is inquired about the service menu at the start time of use of the terminal (in this case, the start time of use of the
main module 101 and the like) or when the main module 101 and the like are powered on. The user can set the service menu as he/she likes. - For the service contents, if the user selects, e.g., the relaxation course, and the user is stressed, a voice message "You seem to be tired. How about music? Please look at the display" is presented to the user, and a music list is displayed on the portable display such as the
display 104 or wristwatch type display 105. - In this case, the system links to a content distribution service agent for music or the like, extracts the optimum contents for the user from the database of the service agent on the basis of the data of the user's situation (where the user is and what the user is doing) and the data of the degree of stress (the degree of fatigue and whether the user is being stressed), and presents candidates to the user. When the user selects a content from the list, a confirmation message "Playback of this content costs ¥◯◯. OK?" is displayed. When the user inputs an acknowledgment, the system buys and downloads the content and displays or plays back the data stream. Questionnaire results are acquired and fed back to the database.
- For the "advice/navigation for event" course, a service is offered to navigate the user such that the maximum efficiency can be obtained within the allowable range while allowing stress to some extent. Various kinds of events are prepared in accordance with the situation of each user. For example, "professional sports player", "amateur sports player", "examination", "presentation", and the like are prepared. When the date of the actual event is set, the
CPU 1012 of the main module 101 sets navigation menus from the service start day to the actual event and during the event and executes the service. - The menus may be set continuously on a scale with which relaxation can be obtained at maximum efficiency. In this case, the break and relaxation necessary for maximizing efficiency are provided. If the service mainly aims at eliminating stress, the amount of break and relaxation to be provided is increased.
- The relaxation service is provided at a timing according to the measured user's situation. When the user is acting toward the target event or a given task (e.g., studying for an examination), no relaxation service is provided. When the behavior continues and the user starts feeling tired, the service provides relaxation advice. Alternatively, a parameter that reflects a change in the sympathetic and parasympathetic nerves, e.g., a fluctuation in heart rate, may be measured; when the sympathetic nerve is active, an advice menu for maximizing efficiency under that situation is displayed, and when the parasympathetic nerve is active, the relaxation service is provided because a break is necessary.
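The heart-rate-fluctuation branch described above might be sketched as below. The use of an LF/HF spectral-power ratio as the sympathetic/parasympathetic indicator and the threshold 1.5 are assumptions of this sketch, not details given in the specification.

```python
def choose_advice(lf_power, hf_power, ratio_threshold=1.5):
    """Decide between efficiency advice and the relaxation service from a
    heart-rate-variability proxy.

    A high LF/HF ratio is taken here as a sign of sympathetic dominance;
    a low ratio as parasympathetic dominance (a break is necessary).
    Both the proxy and the threshold are illustrative assumptions.
    """
    ratio = lf_power / hf_power
    if ratio >= ratio_threshold:
        return "efficiency-advice"   # sympathetic nerve active
    return "relaxation-service"      # parasympathetic nerve active
```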
- Alternatively, since the control method changes depending on whether stress is preferred or undesirable for the user, the type of stress may be estimated from the user's situation, and a service according to the type of stress may be provided.
- <Method of Determining Type of Stress>
- The method of determining the type of stress will be described. Stress is detected by the method that was already described above. After that, indices that represent whether the stress is preferred or undesirable for the user are continuously expressed as numerical values, evaluated by the user, and stored in the personal sensor information corpus DB2. In the same situation, it is determined whether “navigation for eliminating the stress is to be performed” or “navigation for maximizing the ability is to be performed”.
- With the above processing, the type of stress can be determined, and how to reflect the stress on the service can be determined.
- Since the vital information and behavior information of the user are always acquired, window information which allows the user to check the history for a predetermined past period from the current time may be generated on the basis of the monitor results of the vital information, posture/action information, and the like, sent to the terminal, and displayed on the user's display such that the user can refer to the history. In the example shown in FIG. 8A, action life information for a predetermined past period from the current time (in this example, [walk (walking)] move to dining room for lunch, [sit (sitting)] meal, [walk (walking)] finish lunch and move to private room), or the current behavior state (running) and current physical condition (pulse rate is high), is displayed together with operation buttons and the like. In this example, when a desired one of the pieces of displayed action life information is selected on the window, the graph of vital signals at that time is displayed, and the user can see the transition state (FIG. 8B). In this example, the vital signal graph shows the pulse wave and pulse rate, though any other vital signal can be displayed.
- In the above first embodiment, pieces of vital information of the user, including the pulse rate, body temperature, and GSR, are acquired, and the posture information of the user is also acquired. A change in user's physical condition is detected from the pieces of information. When the pieces of information are collated with the behavior schedule of the user, and the change in user's body does not match the behavior schedule, the degree of stress is measured, and a service or advice for eliminating or relaxing the stress, or maximizing the ability by using the stress is provided to the user in accordance with the degree of stress and the user's situation.
- Hence, a life support apparatus which can relax stress and suppress mental and physical damage by the stress to achieve effective medical administration and healthcare can be provided. In addition, a life support apparatus capable of increasing the ability by taking advantage of stress can be provided.
- The second embodiment will be described next, in which, when a user feels stressed, a measure recommended to the user can be commercially advertised to realize both healthcare of the user and a commercial effect, or pieces of information related to the stress are collected from the user and used for consulting or marketing business to effectively use the information for business.
- (Second Embodiment)
- In the second embodiment, in providing a service or presenting information, the physical and mental conditions of a user are always grasped, and an advertisement corresponding to the conditions is displayed through a wearable computer. The hardware configuration to be used is the same as that shown in the block diagram of FIG. 1.
- The behavior information, vital information, and degree-of-stress information of the user are acquired by the same arrangement and method as in the first embodiment in the above-described way. After the pieces of information are acquired, an advertisement genre corresponding to the user's situation is estimated in a wearable computer (
main module 101 in FIG. 1) on the basis of the acquired information, and data of the selected advertisement genre is transferred to an advertisement service agent. - As described above, the
main module 101 has a Bluetooth chip 1013 for radio communication. As a network using Bluetooth is built, many radio tags (Bluetooth chips) serving as Bluetooth transmitting/receiving sections are installed at various places such as on a street, at a station, or in a building, and the main module 101 communicates with a radio tag, thereby communicating with the network. The server of an advertisement service agent is connected to the network. The server of the advertisement service agent distributes, through the network, advertisements corresponding to genre data sent from the main module 101 of the user. The advertisement distribution is done in the form of mail, a voice message, a banner on a display, or the like, and the advertisements are exchanged via Bluetooth. - For advertisement distribution on the server side, optimum contents are provided and displayed on the device on the user side in accordance with the current situation of the user at the most effective timing for advertisement. For example, if the user is on his/her way to the office, an advertisement related to the work of the day is provided and displayed, as shown in FIGS. 11A and 11B. If the user is on his/her way home, an advertisement related to a hobby or articles for daily use including foods and clothes, or an advertisement of a supermarket near the route home, is provided and displayed before the purchase chance of the user is lost.
- The display contents and display medium can be switched in accordance with the user's mental condition (whether the degree of stress is high or low). For example, when the degree of stress is high, and the user's mind is occupied with his/her own affairs, advertisement distribution is stopped. Advertisements are distributed later when the user is relaxed.
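The deferral of advertisements while the degree of stress is high, as described above, can be sketched as a small gate. The threshold value and the queueing policy are illustrative assumptions; the class and method names are hypothetical.

```python
from collections import deque

class AdGate:
    """Hold advertisements back while the degree of stress is high and
    release them once the user is relaxed (threshold is an assumption)."""

    def __init__(self, threshold=0.7):
        self.threshold = threshold
        self.deferred = deque()   # ads held back while the user is stressed

    def deliver(self, ad, degree_of_stress):
        """Return the ads to display now; defer when the user is stressed."""
        if degree_of_stress > self.threshold:
            self.deferred.append(ad)
            return []
        released = list(self.deferred) + [ad]
        self.deferred.clear()
        return released
```

Switching the display medium (e.g., voice instead of text while walking) could be layered on top of the same gate.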
- In the above description, the advertisement distribution agent selects contents to be distributed. However, the terminal side (wearable computer) may have a function of selecting an advertisement to be received and displayed in accordance with the situation of the user himself/herself. Alternatively, the server of the service agent may execute the filtering in accordance with setting information from the user and distribute advertisements.
- Not only the contents to be displayed but also the display medium may be switched in accordance with the user's situation. For example, when it is recognized from behavior information that the user is walking, an advertisement to be provided is presented not by character and image data but by voice data.
- As another example, personal behavior data and subjective information for the data may be collected and used for marketing/consulting business. Since the wearable computer (main module 101) stores personal behavior data and subjective information for the data, the agent acquires the pieces of information provided from the user. Since the pieces of information are personal information, they must be provided based on the user's will. For this reason, the pieces of information are transmitted to the service side by a transmission operation performed by the user himself/herself.
- For a consulting service, the pieces of collected information must correspond to the consulting service to be provided. For example, for a consulting service to a convenience store, pieces of information must be collected from persons who have come to or near the convenience store. As shown in FIG. 13A, to collect data of a person P who has arrived in the neighborhood (area A) of the store, Bluetooth radio tags are installed at and near the store and connected to a person who has entered that area using Bluetooth (step S1301 in FIG. 13B). When connection is successfully established, the system inquires of the person, presents the types and distribution destination of the data, and obtains the user's consent about whether the data may be sold (step S1302 in FIG. 13B).
- For easy data collection, an information charge is paid to the user who has communicated at that time, or a special coupon is offered to the user (step S1303 in FIG. 13B). In addition, to survey the stores where residents near the convenience store buy, pieces of information on commercial areas familiar to the residents are collected. For the collected data, when only anonymous data is used, as shown in FIG. 12, and the user is informed that only anonymous data is used, the collected data can be sold as merchandise without any serious problem.
- By collecting information in the above way, personal behavior data of many persons who live/work around the store and subjective stress information for the behavior data are collected, and therefore, the merchandise display contents at the convenience store can be easily optimized.
- As described above, when a mechanism is built to collect, using the wearable computer, pieces of information on the behavior of a user and information on the degree of stress caused by the behavior, and to gather the collected data in the server on the network using Bluetooth, the data can be used for marketing and consulting services in terms of stress; e.g., the data can be reflected in stocking pieces of merchandise which are useful to get rid of stress, or in advertisement distribution for a sales campaign in correspondence with the degree of stress of each person. For commercial use, since statistics of persons who pass near a store can be acquired and the actual conditions of consumers can be grasped, the merchandise display can be controlled for each time zone.
- Conversely, when a person suited to the merchandise display contents comes (steps S1304 and S1305 in FIG. 13B), the advertisements of the merchandise and store can be displayed (steps S1306 and S1307 in FIG. 13B). For example, when a person who happens to pass the store seems to be stressed (step S1305 in FIG. 13B), an advertisement that recommends a candy or food of his/her liking is distributed (steps S1306 and S1307 in FIG. 13B) and displayed on the wearable computer of that person (step S1308 in FIG. 13B) to stimulate his/her will to buy. A system capable of promoting sales can thus be built and operated.
- For such marketing survey, not only information at a specific point as described above but also information of a person near a surveyor may be collected by giving a wearable computer to the surveyor.
- Another embodiment will be described below. For example, an agent that sells healthy foods lends or sells at a low price a wearable computer set as shown in FIG. 1 to a consumer. Using the wearable computer set, the behavior information, degree of stress, and health state of the user are measured by the above-described device.
- The consumer (user) transmits his/her own data to the health advice service agent and receives a medical examination result from the service agent. Alternatively, the pieces of information are periodically collected and automatically transmitted to the service agent, or the agent accesses the terminal of each user to collect information and determine the situation. It is determined whether the situation allows presentation of an answer to the user, and the examination result is transmitted from the system of the service agent using a medium according to the situation. The user side receives the transmitted examination result at the user's terminal.
- Simultaneously, a banner advertisement of a healthy food or medicine (FIGS. 9A and 10A) related to the examination result is displayed, or the homepage of an online shopping service is displayed (FIG. 9B). For example, for a user who feels tired, a banner advertisement or an online shopping page that recommends a vitamin C tablet or a medicine for promoting nutrition, such as a nutritive drink, is displayed. To actively obtain the sales promotion effect, a coupon service is executed to offer "a point service for use of the service as a benefit" or "a special coupon (e.g., a free drink ticket or the like) as a benefit". When the advertiser is a store, a store guide map display button is displayed on the banner advertisement (FIG. 10A). When the user operates this button, a map of the store (FIG. 10B) or a voice guidance (FIG. 10C) can be effectively given.
- As described above, the life support apparatus according to the second embodiment uses a wearable computer having Bluetooth as a short-distance radio communication device and a function of collecting vital information and behavior information. The user carries the apparatus and uses it for his/her healthcare against stress. In addition, pieces of information on the stress are collected from pedestrians into the server through radio tags (e.g., Bluetooth chips) on the street and the network. When the user feels stressed, to eliminate the stress, a content commercially advertising a measure recommended to the user is distributed to the user to keep him/her informed. With this arrangement, a system capable of realizing healthcare of the user and providing a commercial effect can be built. In addition, a system which can be effectively used for business by analyzing the pieces of information on the behavior of a user and the information related to stress, which are collected in the server, and using the information for consulting and marketing services can be built.
- (Third Embodiment)
- In the third embodiment, a life support apparatus which presents a user's situation acquired using the above user's situation recognition device by a means optimum for each inquiring medium in response to an external inquiry from a handyphone, mail, or beeper is provided.
- In this embodiment as well, the hardware configuration is the same as in FIG. 1, and the behavior information, vital information, and degree-of-stress information of a user can be acquired by the same arrangement as in the first embodiment.
- An example in which a real-time voice (telephone) message is received by a
handyphone 107 will be described. - FIG. 16 is a flow chart showing the processing.
- When a terminal (main module101) is activated, the behavior information, vital information, and degree-of-stress information of the user are acquired by the
main module 101 in accordance with the same arrangement and method as in the first embodiment. In the case of an incoming call at the handyphone 107 (steps S1601 and S1602 in FIG. 16), the CPU 1012 of the main module 101 looks up a set mode table shown in FIG. 18 (step S1603 in FIG. 16). - In the set mode table shown in FIG. 18, information representing "whether an answer to an incoming call can be sent (YES/NO)", "the contents to be returned", "whether the fact is to be transmitted", and the like can be set for each category of caller and for each person. Several types of such tables are stored in the
handyphone 107 in advance. When the user selects and sets the table on the handyphone 107, the main module 101 can acquire and use the table information. - The
CPU 1012 of the main module 101 starts the user's situation recognition processing in accordance with the conditions of the set mode table unless the answer is inhibited. Various data (vital information) collected by an acceleration sensor module 103 and sensor module 102 of the user are transmitted to the main module 101. Upon receiving these data, the CPU 1012 of the main module 101 recognizes the user's situation, as in the first and second embodiments, and accesses the handyphone 107 through the Bluetooth chip on the basis of the information. The CPU 1012 extracts publishable information from the set mode table incorporated in the handyphone 107 and creates voice presentation text by combining the information (step S1611 in FIG. 16). - For example, assume that it is determined that the user is on a train by situation recognition on the basis of information from the
acceleration sensor module 103 and the schedule, and that the user is in the section between Jiyugaoka and Nakameguro (Toyoko Line) from position information (for outdoors, the location is detected using the position information service of the handyphone (PHS) or a GPS (not shown)). According to the user's schedule, a friend is already waiting for him/her in Shibuya now, so the text "I'm on the train, and will arrive at Shibuya in 10 minutes" is created. This text is synthesized into a voice message and returned to the caller as an answer from the handyphone (step S1612 in FIG. 16). - This text may be displayed on the handyphone of the user (callee), and a window for inquiring of the user about whether the answer can be transmitted may be displayed. When the user selects "YES", the answer is transmitted. This prevents the situation from being carelessly transmitted to the outside.
- Alternatively, upon reception of an incoming call, information such as the name of the caller is displayed on a
portable display 104 or the display section of the handyphone 107, as shown in FIG. 20, and an answering message is selected and input on the window. For the notification medium, the table shown in FIG. 19 is set, and a notification is sent in accordance with the table. - The example of the table shown in FIG. 19 has the following meaning. When the location is "outdoor" and the action is "walk", the incoming call notification is a "voice message", and a "voice message" is output for message display. When the location is "on train" and the action is "stand", the incoming call notification is done by "vibration", and message display is done by "text display on a wristwatch type display 105". When the location is "on train" and the action is "sit", the incoming call notification is done by "vibration", and message display is done by "display 104". When the location is "indoor" and the action is "—(arbitrary)", the incoming call notification is done by "vibration", and message display is "not performed". - If notification is to be executed in several steps, for example, the user is notified of only the incoming call by a voice message, and to display the contents as text, the user is notified by a voice message of the medium or device where the details are to be displayed.
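The FIG. 19-style lookup of notification media from the pair (location, action) can be sketched as a small table. The wildcard fallback mirrors the "(arbitrary)" row, the values merely echo the example rows in the text, and the function name is hypothetical.

```python
# (location, action) -> (incoming-call notification, message display)
# None as action means "arbitrary"; None as display means "not performed".
NOTIFY_TABLE = {
    ("outdoor", "walk"): ("voice message", "voice message"),
    ("on train", "stand"): ("vibration", "text on wristwatch display 105"),
    ("on train", "sit"): ("vibration", "display 104"),
    ("indoor", None): ("vibration", None),
}

def notification_media(location, action):
    """Look up the notification media; fall back to the wildcard row
    for the given location, then to a vibration-only default."""
    if (location, action) in NOTIFY_TABLE:
        return NOTIFY_TABLE[(location, action)]
    return NOTIFY_TABLE.get((location, None), ("vibration", None))
```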
- When the notification is displayed on the display, and the user selects, e.g., "situation notification" (step ST1 in FIG. 20), the
CPU 1012 of the main module 101 creates and displays the notification text "I'm on the Toyoko Line between Jiyugaoka and Toritsu-daigaku. Will arrive at Shibuya in about 10 minutes" on the basis of the situation information (step ST3 in FIG. 20). When the user checks the text and selects "transmit", the text is converted in accordance with the medium and transmitted to the caller (steps ST4 and ST5 in FIG. 20). - On the other hand, if the user selects "edit" in step ST3 of FIG. 20, a mode for editing the situation information is set (step ST6 in FIG. 20). In the edit mode, for example, "Jiyugaoka—Toritsu-daigaku" in the above example can be changed by the user to, e.g., "Nakameguro—Daikanyama" (step ST7 in FIG. 20). In accordance with this change, the
CPU 1012 of the main module 101 automatically changes the message "10 minutes" as the required time for "Jiyugaoka—Toritsu-daigaku" to "5 minutes" as the required time for "Nakameguro—Daikanyama" (step ST8 in FIG. 20). This can be easily implemented by preparing a section required-time table in advance and, when the section is changed in the edit mode, obtaining the corresponding required time by looking up the table. - When "replace" is selected in step ST3, the flow advances to step ST9 in FIG. 20 to allow whole message replacement, so any situation can be set. In this state, when, e.g., "in meeting" is selected (step ST10 in FIG. 20), text that represents that the user is in a meeting in the office can be created independently of the actual situation (steps ST11 and ST12 in FIG. 20). When the user checks the text and selects "transmit", the text is converted in accordance with the medium and transmitted to the caller.
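The section required-time table used in the edit mode can be sketched as follows. The minute values mirror the example in the text and are illustrative, not actual timetable data; the function name is hypothetical.

```python
# Required time (minutes) to Shibuya from each Toyoko Line section.
# Values echo the example in the text; they are not timetable data.
SECTION_TIME = {
    "Jiyugaoka - Toritsu-daigaku": 10,
    "Nakameguro - Daikanyama": 5,
}

def arrival_message(section):
    """Build the answering text; when the user edits the section in the
    edit mode, looking up the table updates the required time."""
    minutes = SECTION_TIME[section]
    return (f"I'm on the Toyoko Line between {section}. "
            f"Will arrive at Shibuya in about {minutes} minutes")
```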
- The edit contents are changed in accordance with the user's situation and detectable range. For example, if “in meeting” is selected, the caller is notified of the time of end of meeting (the time of meeting is detected from the schedule).
- When such edit operation cannot be performed because of the user's situation, the user is inquired by voice, or an answer is automatically sent in accordance with the conditions set in the table in advance.
- As described above, the life support apparatus according to the third embodiment uses a wearable computer which has Bluetooth as a short-distance radio communication device, has a function of collecting vital information and behavior information, and can hold the user's schedule information and recognize the behavior state from that information. This apparatus grasps the behavior state of the user and, upon reception of an incoming call at the handyphone or the like, selects an optimum method of dealing with the incoming call from the current behavior state of the user. Even when an incoming call is detected on a train or during a meeting, an optimum response method for that situation is automatically selected. For this reason, the user can respond to the caller without troubling those around the user. Hence, the user can optimally cope with an incoming call or mail without any stress.
- Especially, compared with the conventional "manner" mode of a handyphone, when the fact that the user cannot respond to the call can be presented to the caller together with the current situation, the caller can call the user again at a timing convenient for the user. This can be implemented by the third embodiment. In addition, an essential conflict of the conventional handyphone, i.e., that the callee must answer the phone to explain that he/she cannot speak right now, can be solved.
- Various embodiments have been described above. In the embodiments of the present invention, information is presented to the user by voice synthesis. However, the present invention is not limited to this, and characters or images may be displayed on a head-mounted display (goggle type display), pendant-type display, or wristwatch type display. A wristwatch type display or handyphone may incorporate a vibrator. When a message for the user is received, the user may be notified of the reception of the message by actuating the vibrator.
- The feedback medium may be changed in accordance with the user's situation on the basis of the measured and recognized behavior. For example, when the user is walking, a voice message is output. During work, the message is displayed on the display window. When the user is sleeping, no message is output, though in case of emergency, those who are around the user are notified of the emergency as well as the user, or a message is transmitted to the family physician or security agent. In addition, the user may be notified of the emergency by strong vibration or voice to make him/her recognize the emergency level of the information.
- When a state that the user cannot deal with by himself/herself is detected (e.g., when the user's disease is serious), a plurality of terminals in the neighborhood are notified of that state. In this case, many people in the neighborhood can recognize the emergency, and a system capable of quickly coping with an emergency in the aged society or a single aged household can be built. It is also useful to convert the message in accordance with the terminal to transmit information as to who is originating the emergency message and the place of origination, or to set emergency levels and, as the emergency level rises, sound an alarm at a larger volume.
- If the user himself/herself must measure data (e.g., when automatic measurement or data transfer is impossible), a message prompting the user to measure data is displayed in accordance with the measurement schedule. When no measurement is done, a follow-up message is periodically displayed. With this arrangement, the system can be prevented from functioning irregularly for a long time because no measurement result is obtained. The manner in which the message is displayed is preferably adjusted interactively.
- In the above-described embodiments, Bluetooth is used for communication between modules, though any other method can be used as long as communication at the personal level is possible. A technique (PAN: Personal Area Network) of using the body as a conductor and exchanging an electrical signal has also been developed. Communication between modules may be executed using this technique. IrDA (infrared communication interface) can also be used. In the embodiments, communication between modules is executed as radio communication. However, cable connection may be performed using, e.g., RS232C as one of the standard interfaces for serial communication.
- As for the transfer condition, pieces of vital information before and after a change in action may be transferred, the transfer rate may be raised (the priority level may be increased), or the time resolving power may be increased. For example, when it is determined that the degree of physical action is high on the basis of the output from the acceleration sensor, or for a behavior for which considerable stress is expected by the above-described stress detection algorithm, the time resolving power for data to be measured is increased; otherwise, the data are transferred at a low resolving power. The type of acquired information may also be controlled. For example, an electrocardiogram is acquired in a high load state, and only a pulse rate is acquired in a low load state.
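The load-dependent choice of signal type and time resolving power described above can be sketched as below. The activity threshold and the sampling rates are assumptions for illustration; only the electrocardiogram/pulse-rate split comes from the text.

```python
def measurement_plan(activity_level, high_load=0.6):
    """Pick what to measure and at what time resolution from an estimated
    physical load in [0, 1] (threshold and rates are assumptions).

    High load: full electrocardiogram at high resolution, high priority.
    Low load: pulse rate only, transferred at low resolution.
    """
    if activity_level >= high_load:
        return {"signal": "electrocardiogram", "sample_hz": 200,
                "priority": "high"}
    return {"signal": "pulse rate", "sample_hz": 1, "priority": "low"}
```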
- In an arrangement in which the sensor modules and main module have attached sensors, respectively, and sensors for acquiring the same data are also prepared on the environment side, wearable sensor modules may be used to acquire data when the sensors are attached, and the sensors on the environment side may be used to acquire data when the sensors are detached.
- To implement this system, e.g., an energization type attached/detached status detection sensor in a sensor module is attached to the user. If the sensor is a potential or resistance detection sensor, detachment is detected when the resistance is infinite or the electrodes are open; alternatively, a check signal is transmitted from the main module to repeat detection, and detachment is detected when no check signal is received. If the sensor is detached, the main module searches the environmental network for a sensor capable of acquiring vital information and environmental information of the user, and if a sensor is found, the main module connects to the sensor to acquire data. If no sensor is found, a message "no sensor" is presented to the user and recorded in the data together with the reason. For, e.g., the pulse sensor, when the user is taking a bath, the data is switched to the pulse rate derived from the electrocardiogram obtained while the user is in the bathtub. When the user is sleeping, data is received as an electrocardiogram from an electrode attached to the bedclothes, or a variation is detected from a variation in breath (detected from an image).
- In measuring on the environment side, if the communication state with the wearable device degrades, data are stored on the network side. When the connection state recovers, the data are transmitted to the wearable device. However, if an emergency occurs for the user, an alarm is directly output.
- In the embodiments, measurement data is A/D-converted, and the situation is determined on the basis of a digital signal. However, this processing may be executed using an analog signal.
- As described above, it is an object of the present invention to provide a life support apparatus and method which can determine the stress situation in daily life and notify a user of it to make the user aware of the stress, or can support the user with a method of eliminating the stress or with care against the factor that has caused the stress, on the basis of the situation of the user.
- It is also an object of the present invention to provide a life support apparatus and method which can determine the stress situation in daily life and, on the basis of that situation and in consideration of the specific time and circumstances, provide a user with optimum service information for stress elimination or care, and prompt the user to use the service information, thereby contributing to a commercial effect and the healthcare of the user.
- Today, handyphones and the like are in wide proliferation, and such a portable communication device is one of the necessities that a person cannot dispense with because of the convenience of allowing contact and communication anytime and anywhere. However, an incoming call at such a portable communication device may trouble those around the user depending on the user's situation, so the user must cope with the incoming call under constraint. This causes stress on the user and therefore must be solved.
- Further, it is an object of the present invention to provide a life support apparatus and method which allow a user to select an optimum method of dealing with an incoming call at a handyphone or the like on the basis of the user's current behavior, and which can automatically select an optimum method of responding to an incoming call even in a train or during a meeting, thereby making it possible to cope with an incoming call optimally without troubling those around the user and without letting stress build up in the user.
- As has been described above, according to the present invention, in a wearable life support apparatus, the degree of stress is grasped, without troubling the user, on the basis of the user's actual behavior history and vital information in accordance with the motion information measured from the user and schedule data, thereby navigating the user's life in a desired direction, e.g., relieving the stress or making it possible to operate at maximum efficiency. When these pieces of information are collected in units of regions, they can also be used for marketing in each region.
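The region-level aggregation mentioned above can be sketched as a simple grouped average. This is a minimal sketch under assumed field names; the record layout is not specified in the patent text.

```python
from collections import defaultdict

def average_stress_by_region(records):
    """records: iterable of (region, stress_level) pairs.
    Returns the mean stress level per region, the kind of aggregate
    that could feed regional marketing analysis. Illustrative only."""
    totals = defaultdict(lambda: [0.0, 0])  # region -> [sum, count]
    for region, stress in records:
        totals[region][0] += stress
        totals[region][1] += 1
    return {region: s / n for region, (s, n) in totals.items()}
```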
- Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Claims (20)
1. A life support apparatus comprising:
a vital information sensor attached to a body to acquire vital information of a user;
a behavior information sensor attached to the body to acquire behavior information of the user;
a situation recognition device configured to recognize a user's situation based on the behavior information acquired by said behavior information sensor and the vital information acquired by said vital information sensor to generate user's situation information;
an information generation device configured to generate stress management information corresponding to the user's situation information; and
an information presentation device configured to present the stress management information generated by said information generation device to the user.
2. A life support apparatus comprising:
a vital information sensor attached to a body to acquire vital information of a user;
a behavior information sensor attached to the body to acquire behavior information of the user;
a situation recognition device configured to recognize a user's situation based on the behavior information acquired by said behavior information sensor and the vital information acquired by said vital information sensor to generate user's situation information;
a database configured to store stress management information prepared in advance;
an information search device configured to search said data base for stress management information corresponding to the user's situation information; and
an information presentation device configured to present the stress management information obtained by said information search device to the user.
3. An apparatus according to claim 2 , further comprising a measurement condition control device configured to control a measurement condition of said vital information sensor in accordance with the user's situation recognized by said situation recognition device.
4. An apparatus according to claim 3 , which further comprises a setting device configured to set presentation contents and procedure to be presented as the stress management information to said information presentation device, and wherein
said information search device searches for relaxation information for eliminating stress and navigation information for controlling the stress and increasing operation efficiency, and
said information presentation device offers an information providing service for providing the relaxation information and navigation information in accordance with the user's situation and the presentation contents and procedure set by said setting device.
5. An apparatus according to claim 2 , wherein the information obtained by said information search device includes stress management information useful for improvement of the user's situation recognized by said situation recognition device.
6. An apparatus according to claim 5 , which further comprises a setting device configured to set presentation contents and procedure to be presented to said information presentation device, and wherein
said information search device searches for relaxation information for eliminating stress and navigation information for controlling the stress and increasing operation efficiency, and
said information presentation device offers an information providing service for providing the information obtained by said information search device, in accordance with the user's situation and the presentation contents and procedure set by said setting device.
7. An apparatus according to claim 6 , wherein said information presentation device provides information of a schedule and task of the user.
8. An apparatus according to claim 6 , wherein said information presentation device presents an active situation of sympathetic and parasympathetic nerves.
9. A life support apparatus communicating with an external apparatus, comprising:
a user information sensor attached to a body to acquire information representing a user's situation;
a situation recognition device configured to recognize the user's situation based on user information acquired by said user information sensor;
a transceiver device configured to transmit the information of the user's situation recognized by said situation recognition device and receive external information transmitted from the external apparatus; and
a presentation device configured to present the external information received by said transceiver device to the user, the external information including an advertisement appropriate for the user, which is sent from the external apparatus in correspondence with the user's situation information.
10. A life support apparatus comprising:
a server having a communication device configured to receive user's situation information, and configured to hold various kinds of advertisement information and offer an advertisement distribution service; and
an advertisement search device configured to search said server for advertisement information appropriate for a situation of a user, based on the user's situation information received by said communication device, and supply searched advertisement information to said communication device to distribute the searched advertisement information to the user.
11. A life support apparatus comprising:
a server configured to hold various kinds of advertisement information and offer an advertisement distribution service;
a user information sensor attached to a body to acquire user information representing a user's situation;
a situation recognition device configured to recognize the user's situation based on the user information acquired by said user information sensor to generate user's situation information;
a transceiver device configured to transmit the user's situation information and receive the advertisement information from said server;
an advertisement search device configured to search said server for advertisement information appropriate for the user's situation, based on the user's situation information, and extract searched advertisement information from said server; and
a presentation device configured to present the searched advertisement information to the user.
12. An apparatus according to claim 11 , wherein the user information includes behavior information of the user.
13. An apparatus according to claim 11 , wherein the user information includes vital information and behavior information of the user.
14. A life support apparatus communicating with an external apparatus, comprising:
a user information sensor attached to a body to acquire user information representing a user's situation;
a situation recognition device configured to recognize the user's situation based on the user information to generate user's situation information;
a receiver device configured to receive an inquiry for the user by a voice or text message from the external apparatus;
a presentation device configured to present information;
a presentation control device configured to control said presentation device in accordance with the user's situation in response to the inquiry;
an answer information generation device configured to generate answer information containing the user's situation for the inquiry information as text or voice information; and
an answer information transmission device configured to transmit the answer information generated by said answer information generation device to the inquirer.
15. An apparatus according to claim 14 , wherein the user information includes behavior information of the user.
16. A life support apparatus communicating with external equipment including a message sender, comprising:
various kinds of information presentation media for a voice or text message;
a user information sensor attached to a body to acquire user information representing a user's situation;
a situation recognition device configured to recognize the user's situation based on user information acquired by said user information sensor to generate user's situation information;
a communication device connected to said situation recognition device and configured to communicate with the external equipment;
a situation information conversion device configured to select an optimum message presentation medium from said information presentation media in accordance with the user's situation information and convert the situation information into a form corresponding to the optimum message presentation medium, to present call message information sent from the message sender and received by said communication device to the user; and
an answer transmission device configured to transmit the user's situation information converted by said situation information conversion device to the message sender.
17. A life support apparatus comprising:
a plurality of information presentation devices configured to present various kinds of information, respectively;
a user information sensor attached to a body to acquire user information representing a user's situation;
a situation recognition device configured to recognize the user's situation based on the user information acquired by said user information sensor to generate user's situation information;
a communication device connected to said situation recognition device and configured to transmit the user's situation information and receive information corresponding thereto; and
a situation information conversion device configured to select an optimum information presentation device from said information presentation devices in accordance with the user's situation information and convert the received information into a form corresponding to the selected optimum information presentation device to present the converted information to the user.
18. A life support method comprising:
acquiring vital information and behavior information of a user;
recognizing a user's situation based on the acquired behavior information and vital information to obtain user's situation information;
searching for information corresponding to the user's situation from stress management information prepared in advance, using the user's situation information; and
presenting to the user the information searched in the searching step.
19. An advertisement information providing method comprising:
acquiring information representing a physical situation of a user;
recognizing a user's situation based on the acquired information to obtain user's situation information; and
extracting advertisement information corresponding to the user's situation information from a server, which holds various kinds of advertisement information corresponding to physical situations, to present the extracted advertisement information to the user.
20. An advertisement information providing method comprising:
preparing a server which holds various kinds of advertisement information corresponding to physical situations; and
extracting optimum advertisement information corresponding to a situation of a user to present the optimum advertisement information to the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/428,065 US20030194205A1 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method and method for providing advertisement information |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000-163793 | 2000-05-31 | ||
JP2000163793A JP2001344352A (en) | 2000-05-31 | 2000-05-31 | Life assisting device, life assisting method and advertisement information providing method |
US09/866,828 US6607484B2 (en) | 2000-05-31 | 2001-05-30 | Behavior and stress management recognition apparatus |
US10/428,065 US20030194205A1 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method and method for providing advertisement information |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/866,828 Division US6607484B2 (en) | 2000-05-31 | 2001-05-30 | Behavior and stress management recognition apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030194205A1 true US20030194205A1 (en) | 2003-10-16 |
Family
ID=18667499
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/866,828 Expired - Fee Related US6607484B2 (en) | 2000-05-31 | 2001-05-30 | Behavior and stress management recognition apparatus |
US10/427,922 Expired - Fee Related US6942615B2 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method for providing advertisement information |
US10/428,065 Abandoned US20030194205A1 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method and method for providing advertisement information |
US10/427,936 Abandoned US20030195398A1 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method and method for providing advertisement information |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/866,828 Expired - Fee Related US6607484B2 (en) | 2000-05-31 | 2001-05-30 | Behavior and stress management recognition apparatus |
US10/427,922 Expired - Fee Related US6942615B2 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method for providing advertisement information |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/427,936 Abandoned US20030195398A1 (en) | 2000-05-31 | 2003-05-02 | Life support apparatus and method and method for providing advertisement information |
Country Status (2)
Country | Link |
---|---|
US (4) | US6607484B2 (en) |
JP (1) | JP2001344352A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060230108A1 (en) * | 2005-04-07 | 2006-10-12 | Olympus Corporation | Information display system |
EP1785088A1 (en) * | 2005-11-14 | 2007-05-16 | Congener Wellness Corp. | A system and method for the management and control of cardiovascular related diseases, such as hypertension |
US20070118026A1 (en) * | 2005-11-09 | 2007-05-24 | Kabushiki Kaisha Toshiba | Apparatus, system, and method for lighting control, and computer program product |
WO2011135386A1 (en) | 2010-04-27 | 2011-11-03 | Christian Berger | Apparatus for determining and storing the excitement level of a human individual, comprisind ecg electrodes and a skin resistance monitor |
US8115635B2 (en) | 2005-02-08 | 2012-02-14 | Abbott Diabetes Care Inc. | RF tag on test strips, test strip vials and boxes |
US8795184B2 (en) | 2010-07-12 | 2014-08-05 | Rohm Co., Ltd. | Wireless plethysmogram sensor unit, a processing unit for plethysmogram and a plethysmogram system |
US20140249429A1 (en) * | 2006-05-24 | 2014-09-04 | Bao Tran | Fitness monitoring |
EP3382716A1 (en) * | 2017-03-30 | 2018-10-03 | Tanita Corporation | Information processing device, information processing method, and storage medium |
CN109473159A (en) * | 2018-12-29 | 2019-03-15 | Oppo广东移动通信有限公司 | Information-pushing method and Related product |
US11331019B2 (en) | 2017-08-07 | 2022-05-17 | The Research Foundation For The State University Of New York | Nanoparticle sensor having a nanofibrous membrane scaffold |
US12005811B2 (en) | 2018-11-29 | 2024-06-11 | Ts Tech Co., Ltd. | Seat system |
Families Citing this family (341)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL130818A (en) | 1999-07-06 | 2005-07-25 | Intercure Ltd | Interventive-diagnostic device |
US6527711B1 (en) | 1999-10-18 | 2003-03-04 | Bodymedia, Inc. | Wearable human physiological data sensors and reporting system therefor |
US7212829B1 (en) | 2000-02-28 | 2007-05-01 | Chung Lau | Method and system for providing shipment tracking and notifications |
US7218938B1 (en) | 2002-04-24 | 2007-05-15 | Chung Lau | Methods and apparatus to analyze and present location information |
US7366522B2 (en) | 2000-02-28 | 2008-04-29 | Thomas C Douglass | Method and system for location tracking |
US6975941B1 (en) | 2002-04-24 | 2005-12-13 | Chung Lau | Method and apparatus for intelligent acquisition of position information |
US7321774B1 (en) | 2002-04-24 | 2008-01-22 | Ipventure, Inc. | Inexpensive position sensing device |
US7403972B1 (en) | 2002-04-24 | 2008-07-22 | Ip Venture, Inc. | Method and system for enhanced messaging |
US20060122474A1 (en) | 2000-06-16 | 2006-06-08 | Bodymedia, Inc. | Apparatus for monitoring health, wellness and fitness |
US7689437B1 (en) * | 2000-06-16 | 2010-03-30 | Bodymedia, Inc. | System for monitoring health, wellness and fitness |
EP1662989B1 (en) * | 2000-06-16 | 2014-09-03 | BodyMedia, Inc. | System for monitoring and managing body weight and other physiological conditions including iterative and personalized planning, intervention and reporting capability |
JP2002006785A (en) * | 2000-06-20 | 2002-01-11 | Nec Yonezawa Ltd | Advertisement offering method and system, data processing method, information storage medium |
CA2413148C (en) | 2000-06-23 | 2010-08-24 | Bodymedia, Inc. | System for monitoring health, wellness and fitness |
US7085719B1 (en) * | 2000-07-13 | 2006-08-01 | Rockwell Electronics Commerce Technologies Llc | Voice filter for normalizing an agents response by altering emotional and word content |
JP3936833B2 (en) * | 2000-08-28 | 2007-06-27 | 株式会社日立製作所 | Body motion sensing device and body motion sensing system |
JP2002101156A (en) * | 2000-09-22 | 2002-04-05 | Sony Corp | Portable telephone and method for processing voice |
KR100401012B1 (en) * | 2000-12-15 | 2003-10-30 | 김연경 | A music providing system having music selecting function by human feeling and a music providing method using thereof |
FI110560B (en) * | 2000-12-27 | 2003-02-14 | Nokia Corp | Grouping of wireless communication terminals |
KR20020092420A (en) * | 2001-02-07 | 2002-12-11 | 마츠시타 덴끼 산교 가부시키가이샤 | Biological information processing system, terminal, biological information processor, biological information processing method, and program |
SG126677A1 (en) * | 2001-06-26 | 2006-11-29 | Meng Ting Choon | Method and device for measuring blood sugar level |
NZ526298A (en) * | 2001-08-06 | 2004-10-29 | Index Corp | Device and method for judging dog's feelings from cry vocal character analysis |
JP3947959B2 (en) * | 2001-10-02 | 2007-07-25 | カシオ計算機株式会社 | Song data delivery apparatus and song data delivery method |
DE10152300A1 (en) * | 2001-10-26 | 2003-05-08 | Daniel Woehl | System for storing patient data with data input and output devices |
AUPR875101A0 (en) * | 2001-11-08 | 2001-11-29 | Mondo Medical Limited | Monitoring system |
JP2003204942A (en) * | 2002-01-16 | 2003-07-22 | Yamaha Motor Co Ltd | Biological state-related information-providing system |
JP2003220039A (en) * | 2002-01-29 | 2003-08-05 | Junichi Ninomiya | Remote observation system and remote observation method for physical abnormality based on biological information and acceleration information of patient |
US8043213B2 (en) | 2002-12-18 | 2011-10-25 | Cardiac Pacemakers, Inc. | Advanced patient management for triaging health-related data using color codes |
US7983759B2 (en) | 2002-12-18 | 2011-07-19 | Cardiac Pacemakers, Inc. | Advanced patient management for reporting multiple health-related parameters |
US7468032B2 (en) * | 2002-12-18 | 2008-12-23 | Cardiac Pacemakers, Inc. | Advanced patient management for identifying, displaying and assisting with correlating health-related data |
US7043305B2 (en) | 2002-03-06 | 2006-05-09 | Cardiac Pacemakers, Inc. | Method and apparatus for establishing context among events and optimizing implanted medical device performance |
US20040122294A1 (en) | 2002-12-18 | 2004-06-24 | John Hatlestad | Advanced patient management with environmental data |
US8391989B2 (en) | 2002-12-18 | 2013-03-05 | Cardiac Pacemakers, Inc. | Advanced patient management for defining, identifying and using predetermined health-related events |
US20040122487A1 (en) | 2002-12-18 | 2004-06-24 | John Hatlestad | Advanced patient management with composite parameter indices |
JP2003275183A (en) * | 2002-03-25 | 2003-09-30 | Matsushita Electric Ind Co Ltd | Biological information detection sensor and sensor control device |
JP2003296782A (en) * | 2002-03-29 | 2003-10-17 | Casio Comput Co Ltd | Device and program for recording action |
JP3821744B2 (en) * | 2002-03-29 | 2006-09-13 | 株式会社東芝 | Life support system |
US9049571B2 (en) | 2002-04-24 | 2015-06-02 | Ipventure, Inc. | Method and system for enhanced messaging |
US9182238B2 (en) | 2002-04-24 | 2015-11-10 | Ipventure, Inc. | Method and apparatus for intelligent acquisition of position information |
JP2004008573A (en) * | 2002-06-07 | 2004-01-15 | Seiko Instruments Inc | Remote medical checkup apparatus, diagnostic method therefor and remote medical checkup program |
JP3952870B2 (en) * | 2002-06-12 | 2007-08-01 | 株式会社東芝 | Audio transmission apparatus, audio transmission method and program |
JP2004024551A (en) * | 2002-06-26 | 2004-01-29 | Renesas Technology Corp | Semiconductor device for sensor system |
CN100455255C (en) | 2002-08-09 | 2009-01-28 | 因特尔丘尔有限公司 | Generalized metronome for modification of biorhythmic activity |
US7020508B2 (en) * | 2002-08-22 | 2006-03-28 | Bodymedia, Inc. | Apparatus for detecting human physiological and contextual information |
US20070100666A1 (en) * | 2002-08-22 | 2007-05-03 | Stivoric John M | Devices and systems for contextual and physiological-based detection, monitoring, reporting, entertainment, and control of other devices |
JP3912518B2 (en) * | 2002-09-10 | 2007-05-09 | ソニー株式会社 | Service providing system and method |
US20070055566A1 (en) * | 2005-09-02 | 2007-03-08 | Aws Convergence Technologies, Inc. | System, method, apparatus and computer media for user control of advertising |
BR0315229A (en) | 2002-10-09 | 2005-08-30 | Bodymedia Inc | Apparatus for detecting, receiving, derived from, and presenting human physiological and contextual information. |
KR20040032451A (en) | 2002-10-09 | 2004-04-17 | 삼성전자주식회사 | Mobile device having health care function and method of health care using the same |
US20090177068A1 (en) * | 2002-10-09 | 2009-07-09 | Stivoric John M | Method and apparatus for providing derived glucose information utilizing physiological and/or contextual parameters |
US8672852B2 (en) | 2002-12-13 | 2014-03-18 | Intercure Ltd. | Apparatus and method for beneficial modification of biorhythmic activity |
US20040116781A1 (en) * | 2002-12-17 | 2004-06-17 | International Business Machines Corporation | Behavior based life support generating new behavior patterns from historical behavior indicators |
US20040116783A1 (en) * | 2002-12-17 | 2004-06-17 | International Business Machines Corporation | Behavior based life support generating new behavior patterns from behavior indicators for a user |
US7378955B2 (en) * | 2003-01-03 | 2008-05-27 | Cardiac Pacemakers, Inc. | System and method for correlating biometric trends with a related temporal event |
JP4158533B2 (en) * | 2003-01-21 | 2008-10-01 | ソニー株式会社 | Method and apparatus for recording, transmitting or reproducing data |
JP2004252604A (en) * | 2003-02-19 | 2004-09-09 | Okinawa Pref Gov | Sightseeing estimation method and program therefor, and health recreation type sightseeing touring method using them |
JP3760920B2 (en) * | 2003-02-28 | 2006-03-29 | 株式会社デンソー | Sensor |
JP4270911B2 (en) * | 2003-03-10 | 2009-06-03 | 富士通株式会社 | Patient monitoring device |
JP2004310159A (en) * | 2003-04-02 | 2004-11-04 | Omron Corp | System and method for providing event |
JP2004318503A (en) * | 2003-04-16 | 2004-11-11 | Toshiba Corp | Device, method and program for supporting action management |
US7182738B2 (en) | 2003-04-23 | 2007-02-27 | Marctec, Llc | Patient monitoring apparatus and method for orthosis and other devices |
AU2003902187A0 (en) * | 2003-05-08 | 2003-05-22 | Aimedics Pty Ltd | Patient monitor |
AU2004236368B2 (en) * | 2003-05-08 | 2011-08-04 | University Of Technology, Sydney | Patient moniter |
JP2005021255A (en) * | 2003-06-30 | 2005-01-27 | Sony Corp | Control device and control method |
JP2005031840A (en) * | 2003-07-09 | 2005-02-03 | Seiko Instruments Inc | Emergency notifying device |
JP4085926B2 (en) | 2003-08-14 | 2008-05-14 | ソニー株式会社 | Information processing terminal and communication system |
KR100601932B1 (en) * | 2003-09-04 | 2006-07-14 | 삼성전자주식회사 | Method and apparatus for training control using biofeedback |
JP3968522B2 (en) * | 2003-10-06 | 2007-08-29 | ソニー株式会社 | Recording apparatus and recording method |
US7107180B2 (en) * | 2003-11-14 | 2006-09-12 | Ossur Hf | Method and system for determining an activity level in an individual |
TWI244851B (en) * | 2003-11-21 | 2005-12-01 | Benq Corp | Bluetooth earphone device for measuring body temperature |
US20050154264A1 (en) * | 2004-01-08 | 2005-07-14 | International Business Machines Corporation | Personal stress level monitor and systems and methods for using same |
US20050163302A1 (en) * | 2004-01-22 | 2005-07-28 | Mock Von A. | Customer service system and method using physiological data |
US20050195079A1 (en) * | 2004-03-08 | 2005-09-08 | David Cohen | Emergency situation detector |
JP4569134B2 (en) * | 2004-03-11 | 2010-10-27 | トヨタ自動車株式会社 | Emotion induction device and emotion induction method |
US8725244B2 (en) | 2004-03-16 | 2014-05-13 | Medtronic, Inc. | Determination of sleep quality for neurological disorders |
US7717848B2 (en) * | 2004-03-16 | 2010-05-18 | Medtronic, Inc. | Collecting sleep quality information via a medical device |
US7792583B2 (en) | 2004-03-16 | 2010-09-07 | Medtronic, Inc. | Collecting posture information to evaluate therapy |
WO2005092177A1 (en) | 2004-03-22 | 2005-10-06 | Bodymedia, Inc. | Non-invasive temperature monitoring device |
JP3987053B2 (en) * | 2004-03-30 | 2007-10-03 | 株式会社東芝 | Sleep state determination device and sleep state determination method |
US7378939B2 (en) * | 2004-03-30 | 2008-05-27 | Sengupta Uttam K | Method and apparatus for providing proximity based authentication, security, and notification in a wireless system |
JP4216810B2 (en) * | 2004-03-30 | 2009-01-28 | 株式会社東芝 | Biological information measuring device |
JP3981961B2 (en) * | 2004-03-31 | 2007-09-26 | 拓也 岡本 | Infant prone sleep detection device |
US8135473B2 (en) | 2004-04-14 | 2012-03-13 | Medtronic, Inc. | Collecting posture and activity information to evaluate therapy |
US20050240571A1 (en) * | 2004-04-23 | 2005-10-27 | Honeywell International Inc. | System and method for automatically gathering information relating to an actor in an environment |
JP2005315802A (en) * | 2004-04-30 | 2005-11-10 | Olympus Corp | User support device |
JP4838499B2 (en) | 2004-05-21 | 2011-12-14 | オリンパス株式会社 | User support device |
US20060015032A1 (en) * | 2004-07-14 | 2006-01-19 | Linda Gordon | Non-invasive method for measuring changes in vascular reactivity |
US20100286488A1 (en) * | 2004-08-27 | 2010-11-11 | Moshe Cohen | Method and system for using a mobile device as a portable personal terminal for medical information |
US9820658B2 (en) | 2006-06-30 | 2017-11-21 | Bao Q. Tran | Systems and methods for providing interoperability among healthcare devices |
US8172761B1 (en) | 2004-09-28 | 2012-05-08 | Impact Sports Technologies, Inc. | Monitoring device with an accelerometer, method and system |
JP4794846B2 (en) | 2004-10-27 | 2011-10-19 | キヤノン株式会社 | Estimation apparatus and estimation method |
US20080098074A1 (en) * | 2004-11-03 | 2008-04-24 | Robert Hurling | Method and Apparatus for Motivation Enhancement |
JP4665490B2 (en) * | 2004-11-19 | 2011-04-06 | 株式会社日立製作所 | Life support device |
CN101198277B (en) * | 2005-02-22 | 2011-06-15 | 海尔思-斯玛特有限公司 | Systems for physiological and psycho-physiological monitoring |
JP4964477B2 (en) * | 2005-02-23 | 2012-06-27 | パナソニック株式会社 | Biological information detection apparatus and method |
JP4005089B2 (en) * | 2005-03-07 | 2007-11-07 | 株式会社東芝 | Communication recording system |
JP4277817B2 (en) * | 2005-03-10 | 2009-06-10 | 富士ゼロックス株式会社 | Operation history display device, operation history display method and program |
JP2006263356A (en) * | 2005-03-25 | 2006-10-05 | Konica Minolta Sensing Inc | Bioinformation measuring apparatus |
JP4421507B2 (en) * | 2005-03-30 | 2010-02-24 | 株式会社東芝 | Sleepiness prediction apparatus and program thereof |
US20080072153A1 (en) * | 2005-06-10 | 2008-03-20 | Chang-Ming Yang | Method and Earphone-Microphone Device for Providing Wearable-Based Interaction |
JP2007026429A (en) * | 2005-06-13 | 2007-02-01 | Matsushita Electric Ind Co Ltd | Guidance apparatus |
US20060293838A1 (en) * | 2005-06-13 | 2006-12-28 | Kakuya Yamamoto | Guidance apparatus |
JP4686281B2 (en) * | 2005-07-06 | 2011-05-25 | 株式会社東芝 | Respiratory state determination device, respiratory state measurement method, and respiratory state determination program |
US8033996B2 (en) * | 2005-07-26 | 2011-10-11 | Adidas Ag | Computer interfaces including physiologically guided avatars |
JP4622749B2 (en) * | 2005-08-31 | 2011-02-02 | 株式会社デンソー | Vehicle data collection device, vehicle driving support device, and vehicle safe driving support system |
JP4877909B2 (en) * | 2005-09-15 | 2012-02-15 | シャープ株式会社 | Motion measuring device |
US20070080812A1 (en) * | 2005-09-23 | 2007-04-12 | David Perlman | Awareness enhancement and monitoring devices for the treatment of certain impulse control disorders |
US20070106127A1 (en) * | 2005-10-11 | 2007-05-10 | Alman Brian M | Automated patient monitoring and counseling system |
CN101313344B (en) * | 2005-12-20 | 2010-05-19 | 松下电器产业株式会社 | Contents presentation device, and contents presentation method |
US8323191B2 (en) * | 2005-12-23 | 2012-12-04 | Koninklijke Philips Electronics N.V. | Stressor sensor and stress management system |
JP4509042B2 (en) * | 2006-02-13 | 2010-07-21 | 株式会社デンソー | Hospitality information provision system for automobiles |
JP2007251741A (en) * | 2006-03-17 | 2007-09-27 | Nec Corp | Bidirectional communication image distributing system, cm content server, and bidirectional communication image distributing method to be used for them |
EP1998666B1 (en) * | 2006-03-21 | 2014-01-01 | Koninklijke Philips N.V. | Indication of the condition of a user |
EP1998849B1 (en) | 2006-03-24 | 2014-12-24 | Medtronic, Inc. | Collecting gait information for evaluation and control of therapy |
US8112293B2 (en) | 2006-03-24 | 2012-02-07 | Ipventure, Inc | Medical monitoring system |
JP2007267966A (en) * | 2006-03-31 | 2007-10-18 | Railway Technical Res Inst | Abnormal behavior suppressing device |
JP2007280485A (en) | 2006-04-05 | 2007-10-25 | Sony Corp | Recording device, reproducing device, recording and reproducing device, recording method, reproducing method, recording and reproducing method, and recording medium |
US8644396B2 (en) * | 2006-04-18 | 2014-02-04 | Qualcomm Incorporated | Waveform encoding for wireless applications |
CN101433052B (en) | 2006-04-26 | 2013-04-24 | 高通股份有限公司 | Dynamic distribution of device functionality and resource management |
US8289159B2 (en) | 2006-04-26 | 2012-10-16 | Qualcomm Incorporated | Wireless localization apparatus and method |
US8406794B2 (en) | 2006-04-26 | 2013-03-26 | Qualcomm Incorporated | Methods and apparatuses of initiating communication in wireless networks |
US8968195B2 (en) | 2006-05-12 | 2015-03-03 | Bao Tran | Health monitoring appliance |
US9060683B2 (en) | 2006-05-12 | 2015-06-23 | Bao Tran | Mobile wireless appliance |
US8323189B2 (en) | 2006-05-12 | 2012-12-04 | Bao Tran | Health monitoring appliance |
US7539533B2 (en) | 2006-05-16 | 2009-05-26 | Bao Tran | Mesh network monitoring appliance |
KR100786817B1 (en) | 2006-06-12 | 2007-12-18 | 주식회사 헬스피아 | System and Method for informing emergency state |
JP2008009501A (en) * | 2006-06-27 | 2008-01-17 | Olympus Corp | Charging method |
US20070299323A1 (en) * | 2006-06-27 | 2007-12-27 | Martijn Wilco Arns | Apparatus for measuring one or more physiological functions of a body and a method using the same |
EP2032020A2 (en) * | 2006-06-28 | 2009-03-11 | Endo-Rhythm Ltd. | Lifestyle and eating advisor based on physiological and biological rhythm monitoring |
EP2043502B1 (en) * | 2006-07-13 | 2014-03-12 | St. Jude Medical AB | Information management in devices implanted in a patient |
US7720954B2 (en) | 2006-08-03 | 2010-05-18 | Citrix Systems, Inc. | Method and appliance for using a dynamic response time to determine responsiveness of network services |
US9514436B2 (en) | 2006-09-05 | 2016-12-06 | The Nielsen Company (Us), Llc | Method and system for predicting audience viewing behavior |
WO2008055078A2 (en) * | 2006-10-27 | 2008-05-08 | Vivometrics, Inc. | Identification of emotional states using physiological responses |
US8672843B2 (en) * | 2006-11-27 | 2014-03-18 | Qtc Management, Inc. | Automated protocol for determining psychiatric disability |
US8157730B2 (en) | 2006-12-19 | 2012-04-17 | Valencell, Inc. | Physiological and environmental monitoring systems and methods |
US8652040B2 (en) | 2006-12-19 | 2014-02-18 | Valencell, Inc. | Telemetric apparatus for health and environmental monitoring |
US20080159514A1 (en) * | 2006-12-29 | 2008-07-03 | Motorola, Inc. | Telecommunication device |
JP2008167818A (en) * | 2007-01-09 | 2008-07-24 | Konica Minolta Sensing Inc | Biological information measuring apparatus and biometric information measuring system |
US20080208015A1 (en) * | 2007-02-09 | 2008-08-28 | Morris Margaret E | System, apparatus and method for real-time health feedback on a mobile device based on physiological, contextual and self-monitored indicators of mental and physical health states |
EP2750098A3 (en) * | 2007-02-16 | 2014-08-06 | BodyMedia, Inc. | Systems and methods for understanding and applying the physiological and contextual life patterns of an individual or set of individuals |
KR101261179B1 (en) | 2007-03-23 | 2013-05-09 | 퀄컴 인코포레이티드 | Multi-sensor data collection and/or processing |
US10178965B2 (en) * | 2007-06-22 | 2019-01-15 | Ipventure, Inc. | Activity monitoring system for pregnant women |
KR101435680B1 (en) * | 2007-09-11 | 2014-09-02 | 삼성전자주식회사 | Method for analyzing stress based on biometric signal measured multiple |
US8251903B2 (en) | 2007-10-25 | 2012-08-28 | Valencell, Inc. | Noninvasive physiological analysis using excitation-sensor modules and related devices and methods |
JP5061931B2 (en) * | 2008-02-04 | 2012-10-31 | ソニー株式会社 | Information processing apparatus and information processing method |
JP5007681B2 (en) * | 2008-02-14 | 2012-08-22 | ソニー株式会社 | Broadcast system |
US8529457B2 (en) * | 2008-02-22 | 2013-09-10 | Koninklijke Philips N.V. | System and kit for stress and relaxation management |
TWI368188B (en) * | 2008-03-18 | 2012-07-11 | Univ Nat Taiwan | Intra-body biomedical communication system (IBC) and method of use thereof |
US20090264711A1 (en) * | 2008-04-17 | 2009-10-22 | Motorola, Inc. | Behavior modification recommender |
US20090265437A1 (en) * | 2008-04-22 | 2009-10-22 | Eric Lucas | System and method for identifying and modifying influencers and stressors |
JP2009265852A (en) * | 2008-04-23 | 2009-11-12 | Sharp Corp | Answer storage portable terminal and answer storage method |
JP2009301430A (en) * | 2008-06-16 | 2009-12-24 | Ricoh Co Ltd | Working state management device, working state management method, working state management program and working state management system |
US20100016704A1 (en) * | 2008-07-16 | 2010-01-21 | Naber John F | Method and system for monitoring a condition of an eye |
JP5522338B2 (en) * | 2008-10-28 | 2014-06-18 | 日本電気株式会社 | Situation judging device, situation judging system, method and program thereof |
US8126542B2 (en) * | 2008-12-10 | 2012-02-28 | Somaxis, Inc. | Methods for performing physiological stress tests |
US8441356B1 (en) | 2009-02-16 | 2013-05-14 | Handhold Adaptive, LLC | Methods for remote assistance of disabled persons |
US8494507B1 (en) | 2009-02-16 | 2013-07-23 | Handhold Adaptive, LLC | Adaptive, portable, multi-sensory aid for the disabled |
US8700111B2 (en) | 2009-02-25 | 2014-04-15 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
US8788002B2 (en) | 2009-02-25 | 2014-07-22 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
US9750462B2 (en) | 2009-02-25 | 2017-09-05 | Valencell, Inc. | Monitoring apparatus and methods for measuring physiological and/or environmental conditions |
US20100256524A1 (en) | 2009-03-02 | 2010-10-07 | Seventh Sense Biosystems, Inc. | Techniques and devices associated with blood sampling |
US9041541B2 (en) * | 2010-01-28 | 2015-05-26 | Seventh Sense Biosystems, Inc. | Monitoring or feedback systems and methods |
KR20120000564A (en) * | 2009-04-24 | 2012-01-02 | 어드밴스드 브레인 모니터링, 아이엔씨. | Adaptive performance trainer |
CA3030271C (en) | 2009-10-08 | 2021-08-17 | Delos Living, Llc | Led lighting system |
US8902050B2 (en) * | 2009-10-29 | 2014-12-02 | Immersion Corporation | Systems and methods for haptic augmentation of voice-to-text conversion |
US8585588B2 (en) | 2009-11-18 | 2013-11-19 | Nohands, Llc | Method and system for preventing virus-related obesity and obesity related diseases |
KR101303648B1 (en) * | 2009-12-08 | 2013-09-04 | 한국전자통신연구원 | Sensing Device of Emotion Signal and method of the same |
DE102010001162A1 (en) * | 2010-01-22 | 2011-07-28 | Vodafone Holding GmbH, 40213 | Detecting an identifier of a communication subscriber in a communication terminal |
KR101438145B1 (en) | 2010-04-30 | 2014-09-04 | 이마테크 인크. | Risk evaluation system using people as sensors |
CA2803462A1 (en) * | 2010-06-22 | 2011-12-29 | Gili Medical Ltd. | Improved system and method for detecting symptoms of hypoglycemia |
WO2011163347A2 (en) | 2010-06-23 | 2011-12-29 | Seventh Sense Biosystems, Inc. | Sampling devices and methods involving relatively little pain |
JP5639805B2 (en) * | 2010-07-13 | 2014-12-10 | ローム株式会社 | Mobile device |
JP2013538069A (en) | 2010-07-16 | 2013-10-10 | セブンス センス バイオシステムズ,インコーポレーテッド | Low pressure environment for fluid transfer devices |
US20130158482A1 (en) | 2010-07-26 | 2013-06-20 | Seventh Sense Biosystems, Inc. | Rapid delivery and/or receiving of fluids |
WO2012021801A2 (en) | 2010-08-13 | 2012-02-16 | Seventh Sense Biosystems, Inc. | Systems and techniques for monitoring subjects |
FR2965468A1 (en) * | 2010-10-04 | 2012-04-06 | Zenkko | System for measuring cardiac activity e.g. heart rate, and/or respiratory activity of heart of person, has smartphone type portable terminal provided with screen displaying data representative of cardiac and/or respiratory activity |
JP2012078273A (en) * | 2010-10-05 | 2012-04-19 | Casio Comput Co Ltd | Information processing apparatus, method and program |
WO2012061707A2 (en) * | 2010-11-04 | 2012-05-10 | The Cleveland Clinic Foundation | Handheld biofeedback device and method for self-regulating at least one physiological state of a subject |
US8808202B2 (en) | 2010-11-09 | 2014-08-19 | Seventh Sense Biosystems, Inc. | Systems and interfaces for blood sampling |
US10201296B2 (en) | 2010-11-11 | 2019-02-12 | Ascensia Diabetes Care Holdings Ag | Apparatus, systems, and methods adapted to transmit analyte data having common electronic architecture |
US20120130196A1 (en) * | 2010-11-24 | 2012-05-24 | Fujitsu Limited | Mood Sensor |
US20120158520A1 (en) * | 2010-12-16 | 2012-06-21 | Qualcomm Incorporated | Context aware advertisement delivery |
US8634852B2 (en) * | 2011-01-04 | 2014-01-21 | Qualcomm Incorporated | Camera enabled headset for navigation |
JP2011138530A (en) * | 2011-01-26 | 2011-07-14 | Olympus Corp | Information display system |
US8888701B2 (en) | 2011-01-27 | 2014-11-18 | Valencell, Inc. | Apparatus and methods for monitoring physiological data during environmental interference |
AU2011358630A1 (en) * | 2011-02-09 | 2013-09-12 | Massachusetts Institute Of Technology | Wearable vital signs monitor |
JP2012196332A (en) * | 2011-03-22 | 2012-10-18 | Omron Healthcare Co Ltd | Body motion detecting device and method for controlling the same |
US20130158468A1 (en) | 2011-12-19 | 2013-06-20 | Seventh Sense Biosystems, Inc. | Delivering and/or receiving material with respect to a subject surface |
EP2701601B1 (en) | 2011-04-29 | 2017-06-07 | Seventh Sense Biosystems, Inc. | Devices and methods for collection and/or manipulation of blood spots or other bodily fluids |
EP3106092A3 (en) | 2011-04-29 | 2017-03-08 | Seventh Sense Biosystems, Inc. | Systems and methods for collecting fluid from a subject |
CN103874460B (en) | 2011-04-29 | 2016-06-22 | 第七感生物系统有限公司 | A kind of device for receiving blood or other material from the skin of subject |
US9189599B2 (en) | 2011-05-13 | 2015-11-17 | Fujitsu Limited | Calculating and monitoring a composite stress index |
US8529447B2 (en) * | 2011-05-13 | 2013-09-10 | Fujitsu Limited | Creating a personalized stress profile using renal doppler sonography |
US8622901B2 (en) * | 2011-05-13 | 2014-01-07 | Fujitsu Limited | Continuous monitoring of stress using accelerometer data |
US8617067B2 (en) * | 2011-05-13 | 2013-12-31 | Fujitsu Limited | Continuous monitoring of stress using environmental data |
US8725462B2 (en) * | 2011-05-13 | 2014-05-13 | Fujitsu Limited | Data aggregation platform |
US9173567B2 (en) * | 2011-05-13 | 2015-11-03 | Fujitsu Limited | Triggering user queries based on sensor inputs |
US8540629B2 (en) * | 2011-05-13 | 2013-09-24 | Fujitsu Limited | Continuous monitoring of stress using a stress profile created by renal doppler sonography |
JP2012248072A (en) | 2011-05-30 | 2012-12-13 | Sony Corp | Information processing apparatus, information processing method, and program |
US20120316455A1 (en) * | 2011-06-10 | 2012-12-13 | Aliphcom | Wearable device and platform for sensory input |
US20120316456A1 (en) * | 2011-06-10 | 2012-12-13 | Aliphcom | Sensory user interface |
US9069380B2 (en) | 2011-06-10 | 2015-06-30 | Aliphcom | Media device, application, and content management using sensory input |
US9109902B1 (en) | 2011-06-13 | 2015-08-18 | Impact Sports Technologies, Inc. | Monitoring device with a pedometer |
US8442500B2 (en) * | 2011-06-21 | 2013-05-14 | Qualcomm Incorporated | Relevant content delivery |
JP5608610B2 (en) * | 2011-06-28 | 2014-10-15 | ヤフー株式会社 | Portable advertisement display device, method and program |
WO2013016007A2 (en) | 2011-07-25 | 2013-01-31 | Valencell, Inc. | Apparatus and methods for estimating time-state physiological parameters |
US8799506B2 (en) * | 2011-08-01 | 2014-08-05 | Infosys Limited | System using personalized values to optimize content provided to user |
EP3222210B1 (en) | 2011-08-02 | 2024-09-25 | Yukka Magic LLC | Systems and methods for variable filter adjustment by heart rate metric feedback |
EP2575064A1 (en) * | 2011-09-30 | 2013-04-03 | General Electric Company | Telecare and/or telehealth communication method and system |
US10548490B2 (en) * | 2012-03-01 | 2020-02-04 | Pixart Imaging Inc. | Physiological detection device and operating method thereof |
WO2013144854A1 (en) * | 2012-03-27 | 2013-10-03 | Koninklijke Philips N.V. | Selection of ambient stimuli |
JP6027716B2 (en) * | 2012-04-03 | 2016-11-16 | 旭光電機株式会社 | Wearable user status information acquisition device |
JP5895716B2 (en) * | 2012-06-01 | 2016-03-30 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US9824601B2 (en) * | 2012-06-12 | 2017-11-21 | Dassault Systemes | Symbiotic helper |
JP2015534701A (en) | 2012-08-28 | 2015-12-03 | デロス リビング エルエルシーDelos Living Llc | Systems, methods, and articles for promoting wellness associated with living environments |
JP6041613B2 (en) * | 2012-10-11 | 2016-12-14 | 株式会社Nttドコモ | Context information storage device, context information storage method, and context information storage program |
EP2911579B1 (en) * | 2012-10-23 | 2020-12-09 | Koninklijke Philips N.V. | Stress-measuring system |
US11157436B2 (en) | 2012-11-20 | 2021-10-26 | Samsung Electronics Company, Ltd. | Services associated with wearable electronic device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US9477313B2 (en) | 2012-11-20 | 2016-10-25 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving outward-facing sensor of device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US8994827B2 (en) | 2012-11-20 | 2015-03-31 | Samsung Electronics Co., Ltd | Wearable electronic device |
US10423214B2 (en) * | 2012-11-20 | 2019-09-24 | Samsung Electronics Company, Ltd | Delegating processing from wearable electronic device |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US9526437B2 (en) | 2012-11-21 | 2016-12-27 | i4c Innovations Inc. | Animal health and wellness monitoring using UWB radar |
US9865176B2 (en) | 2012-12-07 | 2018-01-09 | Koninklijke Philips N.V. | Health monitoring system |
JP5763110B2 (en) * | 2013-01-08 | 2015-08-12 | ビッグローブ株式会社 | Transmission / reception system, computer, transmission / reception apparatus, transmission / reception method, and program |
WO2014116924A1 (en) | 2013-01-28 | 2014-07-31 | Valencell, Inc. | Physiological monitoring devices having sensing elements decoupled from body motion |
US10448874B2 (en) * | 2013-03-12 | 2019-10-22 | Koninklijke Philips N.V. | Visit duration control system and method |
US10149617B2 (en) * | 2013-03-15 | 2018-12-11 | i4c Innovations Inc. | Multiple sensors for monitoring health and wellness of an animal |
US20150091736A1 (en) * | 2013-09-30 | 2015-04-02 | Evergreen Enterprises Of Virginia, Llc | Flag that plays sounds with detected motion |
JP6260190B2 (en) * | 2013-10-17 | 2018-01-17 | カシオ計算機株式会社 | Electronic device, setting method executed by computer controlling electronic device, and program |
JP6233776B2 (en) * | 2013-10-22 | 2017-11-22 | 公立大学法人首都大学東京 | Psychosomatic state determination apparatus and psychosomatic state determination program |
US10478075B2 (en) | 2013-10-25 | 2019-11-19 | Qualcomm Incorporated | System and method for obtaining bodily function measurements using a mobile device |
JP6247803B2 (en) * | 2013-12-26 | 2017-12-13 | 株式会社トヨタマップマスター | ADVERTISEMENT DISTRIBUTION SYSTEM, ADVERTISEMENT DISTRIBUTION SERVER DEVICE AND METHOD, COMPUTER PROGRAM FOR DISTRIBUTING ADVERTISEMENT, AND RECORDING MEDIUM CONTAINING COMPUTER PROGRAM |
EP3797680B1 (en) | 2014-01-10 | 2024-05-15 | Ascensia Diabetes Care Holdings AG | End user medical devices and methods for end user medical devices |
WO2016021236A1 (en) * | 2014-08-07 | 2016-02-11 | 任天堂株式会社 | Information processing system, information processing device, information processing program, and information processing method |
WO2015107681A1 (en) * | 2014-01-17 | 2015-07-23 | 任天堂株式会社 | Information processing system, information processing server, information processing program, and information providing method |
JP6429463B2 (en) * | 2014-02-19 | 2018-11-28 | 国立大学法人信州大学 | Psychosomatic state related information providing apparatus, psychosomatic state related information providing method, and program |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
MX2016011107A (en) | 2014-02-28 | 2017-02-17 | Delos Living Llc | Systems, methods and articles for enhancing wellness associated with habitable environments. |
WO2015157582A1 (en) | 2014-04-11 | 2015-10-15 | Bayer Healthcare Llc | Wireless transmitter adapters for battery-operated biosensor meters and methods of providing same |
CN106797528B (en) | 2014-07-07 | 2020-11-17 | 安晟信医疗科技控股公司 | Method and apparatus for improved low energy data communication |
US9179849B1 (en) | 2014-07-25 | 2015-11-10 | Impact Sports Technologies, Inc. | Mobile plethysmographic device |
US9538921B2 (en) | 2014-07-30 | 2017-01-10 | Valencell, Inc. | Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same |
EP4360552A3 (en) | 2014-08-06 | 2024-07-10 | Yukka Magic LLC | Optical physiological sensor modules with reduced signal noise |
US11974847B2 (en) | 2014-08-07 | 2024-05-07 | Nintendo Co., Ltd. | Information processing system, information processing device, storage medium storing information processing program, and information processing method |
US10617342B2 (en) | 2014-09-05 | 2020-04-14 | Vision Service Plan | Systems, apparatus, and methods for using a wearable device to monitor operator alertness |
US11918375B2 (en) | 2014-09-05 | 2024-03-05 | Beijing Zitiao Network Technology Co., Ltd. | Wearable environmental pollution monitor computer apparatus, systems, and related methods |
US10448867B2 (en) | 2014-09-05 | 2019-10-22 | Vision Service Plan | Wearable gait monitoring apparatus, systems, and related methods |
US9794653B2 (en) | 2014-09-27 | 2017-10-17 | Valencell, Inc. | Methods and apparatus for improving signal quality in wearable biometric monitoring devices |
EP3204056B1 (en) * | 2014-10-06 | 2021-02-17 | JB Scientific, LLC | Apparatus for delivering a sequence of scents for the purpose of altering an individual's appetite |
JP2015062067A (en) * | 2014-10-07 | 2015-04-02 | 株式会社ニコン | Photographing lens, photographing device, and photographing system |
SE540710C2 (en) * | 2014-10-22 | 2018-10-16 | Trumphy Ltd C/O Jonas Patrik Trumphy | Security system for personal protection and method therefor |
US20170316117A1 (en) * | 2014-10-30 | 2017-11-02 | Philips Lighting Holding B.V. | Controlling the output of information using a computing device |
US20170319122A1 (en) * | 2014-11-11 | 2017-11-09 | Global Stress Index Pty Ltd | A system and a method for generating stress level and stress resilience level information for an individual |
US20180020968A1 (en) * | 2014-12-18 | 2018-01-25 | Koninklijke Philips N.V. | System, device, method and computer program for providing a health advice to a subject |
JP6761417B2 (en) * | 2014-12-19 | 2020-09-23 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Dynamic wearable device behavior based on schedule detection |
JP6508938B2 (en) * | 2014-12-24 | 2019-05-08 | 株式会社Nttドコモ | INFORMATION PROCESSING DEVICE, ACTION SUPPORT METHOD, AND PROGRAM |
WO2016115230A1 (en) | 2015-01-13 | 2016-07-21 | Delos Living Llc | Systems, methods and articles for monitoring and enhancing human wellness |
CN104510482B (en) * | 2015-01-14 | 2017-08-01 | 北京理工大学 | A kind of numeral performance sensorial data acquisition system |
US10215568B2 (en) | 2015-01-30 | 2019-02-26 | Vision Service Plan | Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete |
CN104811903A (en) * | 2015-03-25 | 2015-07-29 | 惠州Tcl移动通信有限公司 | Method for establishing communication group and wearable device capable of establishing communication group |
WO2016174206A1 (en) | 2015-04-29 | 2016-11-03 | Ascensia Diabetes Care Holdings Ag | Location-based wireless diabetes management systems, methods and apparatus |
WO2016178329A1 (en) * | 2015-05-07 | 2016-11-10 | ソニー株式会社 | Information processing system, control method, and storage medium |
JP6450000B2 (en) * | 2015-06-03 | 2019-01-09 | 株式会社日立システムズ | Support support system, support support method, and support support program |
US11766182B2 (en) * | 2015-06-05 | 2023-09-26 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Systems and methods for real-time signal processing and fitting |
US11061233B2 (en) | 2015-06-30 | 2021-07-13 | 3M Innovative Properties Company | Polarizing beam splitter and illuminator including same |
CN104905803B (en) * | 2015-07-01 | 2018-03-27 | 京东方科技集团股份有限公司 | Wearable electronic and its mood monitoring method |
JP6457346B2 (en) * | 2015-07-06 | 2019-01-23 | 日本電信電話株式会社 | Road surface understanding system, road surface understanding method, road surface understanding program |
JP6598570B2 (en) * | 2015-08-11 | 2019-10-30 | 日本光電工業株式会社 | Biological information measuring device and program |
JP6367166B2 (en) | 2015-09-01 | 2018-08-01 | 株式会社東芝 | Electronic apparatus and method |
JP6926369B2 (en) * | 2015-09-29 | 2021-08-25 | セコム株式会社 | Mobile monitoring terminals and programs |
WO2017070463A1 (en) | 2015-10-23 | 2017-04-27 | Valencell, Inc. | Physiological monitoring devices and methods that identify subject activity type |
US10945618B2 (en) | 2015-10-23 | 2021-03-16 | Valencell, Inc. | Physiological monitoring devices and methods for noise reduction in physiological signals based on subject activity type |
CN105426400A (en) * | 2015-10-29 | 2016-03-23 | 小米科技有限责任公司 | User matching degree determination method and apparatus |
JP2017086524A (en) * | 2015-11-11 | 2017-05-25 | セイコーエプソン株式会社 | Fatigue degree control device, fatigue degree control system and fatigue degree determination method |
JP6803140B2 (en) * | 2015-12-17 | 2020-12-23 | 株式会社イトーキ | Business support system |
JP6644398B2 (en) * | 2016-01-20 | 2020-02-12 | パイオニア株式会社 | Information processing apparatus, information processing method, and program |
US10417881B2 (en) * | 2016-05-02 | 2019-09-17 | Norman R. Byrne | Wireless status indicator light |
JP6019262B1 (en) * | 2016-05-16 | 2016-11-02 | ドリコス株式会社 | Supplement formulation support method, supplement supply system, and supplement supply apparatus |
JP2017213249A (en) * | 2016-06-01 | 2017-12-07 | セイコーエプソン株式会社 | Biological information display system, portable terminal device, wearable device, biological information display method, and biological information display program |
JP6638579B2 (en) * | 2016-06-30 | 2020-01-29 | アイシン・エィ・ダブリュ株式会社 | Information providing device and computer program |
MX366613B (en) | 2016-07-08 | 2019-07-15 | Norman R Byrne | Integrated wireless alert system. |
WO2018009736A1 (en) | 2016-07-08 | 2018-01-11 | Valencell, Inc. | Motion-dependent averaging for physiological metric estimating systems and methods |
JP6819109B2 (en) * | 2016-07-20 | 2021-01-27 | 日本電気株式会社 | Stress estimation device, stress estimation method, and stress estimation program |
JP6969554B2 (en) * | 2016-07-27 | 2021-11-24 | ソニーグループ株式会社 | Information processing systems, recording media, information processing methods, and programs |
US11338107B2 (en) | 2016-08-24 | 2022-05-24 | Delos Living Llc | Systems, methods and articles for enhancing wellness associated with habitable environments |
JP6986680B2 (en) * | 2016-08-29 | 2021-12-22 | パナソニックIpマネジメント株式会社 | Stress management system and stress management method |
JP6679453B2 (en) * | 2016-09-16 | 2020-04-15 | ヤフー株式会社 | Communication support program, communication support method, and mobile terminal device |
JP6945127B2 (en) | 2016-09-16 | 2021-10-06 | パナソニックIpマネジメント株式会社 | Stress management system, stress management method and computer program |
JP6509172B2 (en) * | 2016-09-23 | 2019-05-08 | 株式会社Msd | Stress monitor system and program |
JP6647629B2 (en) * | 2016-10-03 | 2020-02-14 | ドリコス株式会社 | Supplement blending support method, supplement supply system, and supplement supply device |
US20190275373A1 (en) * | 2016-12-22 | 2019-09-12 | Sony Corporation | Display control device, display control method, and computer program |
JP6618492B2 (en) * | 2017-02-06 | 2019-12-11 | ソフトバンク株式会社 | Data processing apparatus, data processing method, and program |
CN115137336A (en) * | 2017-02-28 | 2022-10-04 | 松下知识产权经营株式会社 | Processing method, system and storage medium |
EP3369374A1 (en) * | 2017-03-01 | 2018-09-05 | Koninklijke Philips N.V. | Method and apparatus for sending a message to a subject |
JP6391750B2 (en) * | 2017-04-13 | 2018-09-19 | フクダ電子株式会社 | Mobile device |
US9910298B1 (en) | 2017-04-17 | 2018-03-06 | Vision Service Plan | Systems and methods for a computerized temple for use with eyewear |
CN107358455A (en) * | 2017-05-22 | 2017-11-17 | 深圳市海睿广告有限公司 | Advertising equipment with health measuring function and the method from its acquisition health data |
JP6298919B1 (en) * | 2017-06-07 | 2018-03-20 | スマート ビート プロフィッツ リミテッド | Database construction method and database |
JP6988191B2 (en) * | 2017-06-20 | 2022-01-05 | 富士フイルムビジネスイノベーション株式会社 | Image forming device and program |
US10632278B2 (en) * | 2017-07-20 | 2020-04-28 | Bose Corporation | Earphones for measuring and entraining respiration |
JP2019022540A (en) * | 2017-07-21 | 2019-02-14 | 富士通株式会社 | Program, information processor, and stress evaluation method |
US11596795B2 (en) | 2017-07-31 | 2023-03-07 | Medtronic, Inc. | Therapeutic electrical stimulation therapy for patient gait freeze |
US11406788B2 (en) | 2017-08-08 | 2022-08-09 | Sony Corporation | Information processing apparatus and method |
CN110998559A (en) * | 2017-08-28 | 2020-04-10 | 索尼公司 | Information processing apparatus and information processing method |
WO2019046580A1 (en) | 2017-08-30 | 2019-03-07 | Delos Living Llc | Systems, methods and articles for assessing and/or improving health and well-being |
KR102503149B1 (en) * | 2017-11-30 | 2023-02-24 | 주식회사 라이프사이언스테크놀로지 | Apparatus and method for measuring biometric information |
JP7081129B2 (en) * | 2017-12-06 | 2022-06-07 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment and programs |
JP7031268B2 (en) * | 2017-12-08 | 2022-03-08 | 富士フイルムビジネスイノベーション株式会社 | Information transmission equipment and programs |
JP2021507366A (en) * | 2017-12-15 | 2021-02-22 | ソマティクス, インコーポレイテッド | Systems and methods for monitoring user health |
GB2584221B (en) * | 2017-12-28 | 2022-06-15 | Sayani Saleem | Wearable diagnostic device |
JP7006336B2 (en) * | 2018-02-06 | 2022-01-24 | 沖電気工業株式会社 | Information processing systems, information processing methods and programs |
JP7166801B2 (en) * | 2018-06-21 | 2022-11-08 | 三菱電機株式会社 | Information providing system, information processing device, information providing method, and information providing program |
JP7205092B2 (en) * | 2018-07-18 | 2023-01-17 | 富士フイルムビジネスイノベーション株式会社 | Information processing system, information processing device and program |
US10722128B2 (en) | 2018-08-01 | 2020-07-28 | Vision Service Plan | Heart rate detection system and method |
EP3850458A4 (en) | 2018-09-14 | 2022-06-08 | Delos Living, LLC | Systems and methods for air remediation |
JP2020048610A (en) * | 2018-09-21 | 2020-04-02 | 富士ゼロックス株式会社 | State evaluation system |
JP2020074864A (en) * | 2018-11-06 | 2020-05-21 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | Determination device, determination method and computer program |
JP7239807B2 (en) * | 2018-11-29 | 2023-03-15 | テイ・エス テック株式会社 | seat system |
CN109567779A (en) * | 2018-11-29 | 2019-04-05 | 与德科技有限公司 | Heart rate detection method, system and storage medium |
US11844163B2 (en) | 2019-02-26 | 2023-12-12 | Delos Living Llc | Method and apparatus for lighting in an office environment |
JP6611972B1 (en) * | 2019-03-05 | 2019-11-27 | 正通 亀井 | Advice presentation system |
US11898898B2 (en) | 2019-03-25 | 2024-02-13 | Delos Living Llc | Systems and methods for acoustic monitoring |
JP7227854B2 (en) * | 2019-06-03 | 2023-02-22 | トヨタホーム株式会社 | Outing plan proposal system |
JP2019164831A (en) * | 2019-06-05 | 2019-09-26 | 株式会社Revo | Information providing device, system, and program |
JP2020201536A (en) * | 2019-06-06 | 2020-12-17 | 富士ゼロックス株式会社 | Information processing apparatus, information processing system, and information processing program |
JPWO2020255630A1 (en) * | 2019-06-17 |
JP2019197564A (en) * | 2019-07-03 | 2019-11-14 | 株式会社東芝 | Wearable terminal, system, and method |
JP7385514B2 (en) * | 2020-03-23 | 2023-11-22 | シャープ株式会社 | Biometric information management device, biometric information management method, biometric information management program, and storage medium |
US11477583B2 (en) | 2020-03-26 | 2022-10-18 | Sonova Ag | Stress and hearing device performance |
JP7264859B2 (en) * | 2020-09-24 | 2023-04-25 | 本田技研工業株式会社 | Navigation system, recommended method of its search route, and program |
JP7048709B2 (en) * | 2020-11-27 | 2022-04-05 | 株式会社東芝 | System and method |
JP7113926B2 (en) * | 2020-12-04 | 2022-08-05 | 株式会社メタリアル | Glasses-type wearable terminal, advertisement display control method, advertisement display control program, advertisement providing device, advertisement providing method, advertisement providing program and advertisement providing system |
WO2022157874A1 (en) * | 2021-01-21 | 2022-07-28 | ソニーグループ株式会社 | Information processing apparatus, information processing method, and program |
WO2022201969A1 (en) * | 2021-03-24 | 2022-09-29 | 富士フイルム株式会社 | Ultrasonic system and method for controlling ultrasonic system |
WO2022254575A1 (en) * | 2021-06-01 | 2022-12-08 | 日本電気株式会社 | Stress factor estimation device, stress factor estimation method and storage medium |
JP7394510B2 (en) * | 2021-06-15 | 2023-12-08 | Lineヤフー株式会社 | Provision device, method and program |
WO2023032204A1 (en) * | 2021-09-06 | 2023-03-09 | 日本電気株式会社 | Recommendation device, system, method, and computer-readable medium |
JP2023151323A (en) * | 2022-03-31 | 2023-10-16 | オムロンヘルスケア株式会社 | Medical care assistance system, medical care assistance device, and program |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5596994A (en) * | 1993-08-30 | 1997-01-28 | Bro; William L. | Automated and interactive behavioral and medical guidance system |
US6083248A (en) * | 1995-06-23 | 2000-07-04 | Medtronic, Inc. | World wide patient location and data telemetry system for implantable medical devices |
US6102846A (en) * | 1998-02-26 | 2000-08-15 | Eastman Kodak Company | System and method of managing a psychological state of an individual using images |
US6139494A (en) * | 1997-10-15 | 2000-10-31 | Health Informatics Tools | Method and apparatus for an integrated clinical tele-informatics system |
US6198394B1 (en) * | 1996-12-05 | 2001-03-06 | Stephen C. Jacobsen | System for remote monitoring of personnel |
US6241684B1 (en) * | 1996-04-08 | 2001-06-05 | Seiko Epson Corporation | Exercise workout support device |
US6443890B1 (en) * | 2000-03-01 | 2002-09-03 | I-Medik, Inc. | Wireless internet bio-telemetry monitoring system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2512695B2 (en) | 1993-10-29 | 1996-07-03 | 株式会社スズケン | Mental health evaluation device |
JPH0922314A (en) | 1995-07-05 | 1997-01-21 | Sanyo Electric Co Ltd | Stress-adaptive controller |
JPH1071137A (en) | 1996-08-29 | 1998-03-17 | Omron Corp | Device and method for displaying degree of stress |
JPH10305016A (en) | 1997-05-08 | 1998-11-17 | Casio Comput Co Ltd | Behavior information providing system |
- 2000-05-31 JP JP2000163793A patent/JP2001344352A/en active Pending
- 2001-05-30 US US09/866,828 patent/US6607484B2/en not_active Expired - Fee Related
- 2003-05-02 US US10/427,922 patent/US6942615B2/en not_active Expired - Fee Related
- 2003-05-02 US US10/428,065 patent/US20030194205A1/en not_active Abandoned
- 2003-05-02 US US10/427,936 patent/US20030195398A1/en not_active Abandoned
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8223021B2 (en) | 2005-02-08 | 2012-07-17 | Abbott Diabetes Care Inc. | RF tag on test strips, test strip vials and boxes |
US8542122B2 (en) | 2005-02-08 | 2013-09-24 | Abbott Diabetes Care Inc. | Glucose measurement device and methods using RFID |
US8390455B2 (en) | 2005-02-08 | 2013-03-05 | Abbott Diabetes Care Inc. | RF tag on test strips, test strip vials and boxes |
US8358210B2 (en) | 2005-02-08 | 2013-01-22 | Abbott Diabetes Care Inc. | RF tag on test strips, test strip vials and boxes |
US8115635B2 (en) | 2005-02-08 | 2012-02-14 | Abbott Diabetes Care Inc. | RF tag on test strips, test strip vials and boxes |
US20060230108A1 (en) * | 2005-04-07 | 2006-10-12 | Olympus Corporation | Information display system |
US7996177B2 (en) | 2005-04-07 | 2011-08-09 | Olympus Corporation | Information display system |
US20070118026A1 (en) * | 2005-11-09 | 2007-05-24 | Kabushiki Kaisha Toshiba | Apparatus, system, and method for lighting control, and computer program product |
US20080294021A1 (en) * | 2005-11-14 | 2008-11-27 | Congener Wellness Corp. | System and Method for the Management or Control of Cardiovascular Related Diseases, Such as Hypertension |
WO2007054399A1 (en) * | 2005-11-14 | 2007-05-18 | Congener Wellness Corp. | A system and method for the management or control of cardiovascular related diseases, such as hypertension |
EP1785088A1 (en) * | 2005-11-14 | 2007-05-16 | Congener Wellness Corp. | A system and method for the management and control of cardiovascular related diseases, such as hypertension |
US20140249429A1 (en) * | 2006-05-24 | 2014-09-04 | Bao Tran | Fitness monitoring |
US9107586B2 (en) * | 2006-05-24 | 2015-08-18 | Empire Ip Llc | Fitness monitoring |
WO2011135386A1 (en) | 2010-04-27 | 2011-11-03 | Christian Berger | Apparatus for determining and storing the excitement level of a human individual, comprising ECG electrodes and a skin resistance monitor |
US8795184B2 (en) | 2010-07-12 | 2014-08-05 | Rohm Co., Ltd. | Wireless plethysmogram sensor unit, a processing unit for plethysmogram and a plethysmogram system |
EP3382716A1 (en) * | 2017-03-30 | 2018-10-03 | Tanita Corporation | Information processing device, information processing method, and storage medium |
US11331019B2 (en) | 2017-08-07 | 2022-05-17 | The Research Foundation For The State University Of New York | Nanoparticle sensor having a nanofibrous membrane scaffold |
US12005811B2 (en) | 2018-11-29 | 2024-06-11 | Ts Tech Co., Ltd. | Seat system |
CN109473159A (en) * | 2018-12-29 | 2019-03-15 | Oppo广东移动通信有限公司 | Information-pushing method and Related product |
Also Published As
Publication number | Publication date |
---|---|
US6942615B2 (en) | 2005-09-13 |
US20030195398A1 (en) | 2003-10-16 |
US20010049471A1 (en) | 2001-12-06 |
US6607484B2 (en) | 2003-08-19 |
JP2001344352A (en) | 2001-12-14 |
US20030204132A1 (en) | 2003-10-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6942615B2 (en) | 2005-09-13 | Life support apparatus and method for providing advertisement information |
US11696682B2 (en) | Mesh network personal emergency response appliance | |
US20200281480A1 (en) | Personal monitoring system | |
US9865176B2 (en) | Health monitoring system | |
US8684922B2 (en) | Health monitoring system | |
JP3846844B2 (en) | Body-mounted life support device | |
JP4283672B2 (en) | Device for monitoring health and health | |
US8968195B2 (en) | Health monitoring appliance | |
JP4327825B2 (en) | Body-worn life support device and method | |
US20150099941A1 (en) | Health monitoring appliance | |
JP2005536260A (en) | Device for detecting human physiological information and context information | |
KR20090017344A (en) | Portable device for managing user's health and method of managing user's health using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |