WO2018225838A1 - Information processing system - Google Patents
Information processing system
- Publication number
- WO2018225838A1 (PCT/JP2018/021936)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- avatar
- data
- state
- information processing
- Prior art date
Links
- 230000010365 information processing Effects 0.000 title claims abstract description 98
- 238000004891 communication Methods 0.000 claims abstract description 40
- 230000005540 biological transmission Effects 0.000 claims abstract description 8
- 230000036541 health Effects 0.000 claims description 23
- 230000006996 mental state Effects 0.000 claims description 22
- 230000000694 effects Effects 0.000 claims description 17
- 238000004458 analytical method Methods 0.000 claims description 13
- 230000008859 change Effects 0.000 claims description 12
- 238000012545 processing Methods 0.000 claims description 6
- 230000004044 response Effects 0.000 claims description 5
- 230000003862 health status Effects 0.000 claims description 2
- 238000003672 processing method Methods 0.000 claims 2
- 238000000034 method Methods 0.000 abstract description 18
- 230000008569 process Effects 0.000 abstract description 10
- 238000010586 diagram Methods 0.000 description 21
- 230000033001 locomotion Effects 0.000 description 21
- 238000013480 data collection Methods 0.000 description 18
- 238000010219 correlation analysis Methods 0.000 description 16
- 206010015037 epilepsy Diseases 0.000 description 13
- 230000006870 function Effects 0.000 description 10
- 230000008451 emotion Effects 0.000 description 8
- 210000005252 bulbus oculi Anatomy 0.000 description 7
- 230000008921 facial expression Effects 0.000 description 7
- 210000003128 head Anatomy 0.000 description 6
- 210000003205 muscle Anatomy 0.000 description 6
- 230000001276 controlling effect Effects 0.000 description 5
- 238000003384 imaging method Methods 0.000 description 5
- 230000036760 body temperature Effects 0.000 description 4
- 230000004424 eye movement Effects 0.000 description 3
- 230000014509 gene expression Effects 0.000 description 3
- 230000003183 myoelectrical effect Effects 0.000 description 3
- 210000000056 organ Anatomy 0.000 description 3
- 241001465754 Metazoa Species 0.000 description 2
- 210000001015 abdomen Anatomy 0.000 description 2
- 239000008280 blood Substances 0.000 description 2
- 210000004369 blood Anatomy 0.000 description 2
- 230000036772 blood pressure Effects 0.000 description 2
- 210000004556 brain Anatomy 0.000 description 2
- 230000007177 brain activity Effects 0.000 description 2
- 238000010276 construction Methods 0.000 description 2
- 201000010099 disease Diseases 0.000 description 2
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 2
- 210000001097 facial muscle Anatomy 0.000 description 2
- 210000001035 gastrointestinal tract Anatomy 0.000 description 2
- 238000013507 mapping Methods 0.000 description 2
- 230000003340 mental effect Effects 0.000 description 2
- 230000006855 networking Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000009877 rendering Methods 0.000 description 2
- 210000004761 scalp Anatomy 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000002618 waking effect Effects 0.000 description 2
- 206010037660 Pyrexia Diseases 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 210000000467 autonomic pathway Anatomy 0.000 description 1
- 230000017531 blood circulation Effects 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 230000000875 corresponding effect Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 239000003814 drug Substances 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 244000144972 livestock Species 0.000 description 1
- 230000007257 malfunction Effects 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 210000001747 pupil Anatomy 0.000 description 1
- 230000007115 recruitment Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02405—Determining heart rate variability
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/113—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
- A61B5/1135—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing by monitoring thoracic expansion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4809—Sleep detection, i.e. determining whether a subject is asleep or not
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4854—Diagnosis based on concepts of traditional oriental medicine
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Bio-feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7246—Details of waveform analysis using correlation, e.g. template matching or determination of similarity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7275—Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7282—Event detection, e.g. detecting unique waveforms indicative of a medical condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/744—Displaying an avatar, e.g. an animated cartoon character
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Definitions
- the present invention relates to an information processing system that collects and visualizes information related to a living body.
- Patent Document 1 discloses a technique for measuring emotion data for web-enabled applications. Specifically, the user's mental state while interacting with a rendering such as a website or video is inferred from physiological data such as electrodermal activity (EDA), accelerometer readings, and skin temperature, or from facial expressions and head gestures observed with a webcam, and this mental state information is associated with the rendering. Patent Document 1 further displays the mental state information using a visual representation such as an avatar. In Patent Document 2, electric potentials at a plurality of locations on the user's head, or the acceleration or angular velocity of the head, are detected; the movement of the head and/or the facial expression is estimated from these detection results; and the estimated facial expression, together with the head movement, is applied to an avatar and shown on a display.
- an avatar is used as a means of communication by displaying a user's mental state and facial expression on the avatar.
- the use of avatars is expected to become more diverse and sophisticated.
- the present invention has been made in view of such circumstances, and an object thereof is to provide an information processing system capable of constructing an avatar that more realistically reflects a user's state.
- an information processing system according to the present invention comprises: vital data acquisition means for acquiring vital data including at least a user's heart rate; a terminal device that transmits the vital data acquired by the vital data acquisition means in real time via a network; and an information processing server that processes the vital data transmitted from the terminal device.
- the information processing server includes at least a storage unit that stores the vital data; a user state estimation unit that estimates the user's state in real time based on the heart rate and on a heart rate variability calculated from the heart rate; an avatar data creation unit that creates display data for displaying an avatar reflecting at least the estimation result of the user state estimation unit; and a communication interface that transmits the display data to the requester.
- the user state estimation unit may estimate the mental state of the user
- the avatar data creation unit may create display data for the avatar that reflects the mental state of the user.
- the avatar data creation unit may create display data in which the expression or posture of the avatar is changed according to the mental state of the user.
- the avatar data creation unit may create display data in which the aura color or display range of the avatar is changed according to the mental state of the user.
- the user state estimation unit may estimate the health state of the user
- the avatar data creation unit may create display data for the avatar that reflects the health state of the user.
- the avatar data creation unit may create display data in which the color of the avatar is partially changed according to the health condition of the user.
- the avatar data creation unit may create display data in which the aura color or display range of the avatar is changed according to the health state of the user.
- the user state estimation unit may estimate an activity state of the user
- the avatar data creation unit may create display data for the avatar in which the activity state of the user is reflected.
- the avatar data creation unit may create display data in which the shape of the avatar is changed according to the activity state of the user.
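As an illustration of how the estimated states described above might drive the avatar's appearance, the following Python sketch maps mental, health, and activity states to display parameters. All state names and parameter keys here are illustrative assumptions, not taken from the disclosure.

```python
def avatar_display_params(mental, health, activity):
    """Map estimated user states to illustrative avatar display
    parameters (expression, aura colour, body colour, shape).
    Every state name and parameter value below is hypothetical."""
    expression = {"happy": "smile", "stressed": "frown"}.get(mental, "neutral")
    aura = {"happy": "gold", "stressed": "purple"}.get(mental, "white")
    # Partially change the avatar's colour according to health condition.
    body_color = "red_highlight" if health == "fever" else "normal"
    # Change the avatar's shape according to the activity state.
    shape = {"running": "lean", "sleeping": "curled"}.get(activity, "upright")
    return {"expression": expression, "aura": aura,
            "body_color": body_color, "shape": shape}

print(avatar_display_params("stressed", "fever", "running"))
```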
- the display data may be three-dimensional data including information on the inside of the avatar; in response to a transmission request from the request source, display data showing the inside of the avatar may be generated and transmitted to the request source.
- the vital data acquisition means may further acquire a type of vital data different from the heart rate, and the information processing server may include a correlation analysis unit that analyzes the correlation between the different types of vital data.
- the user state estimation unit may further estimate the user's state based on the analysis result of the correlation analysis unit.
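The correlation analysis between different types of vital data could, as one hypothetical realization, be a Pearson correlation over synchronized samples:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long
    series of synchronized vital-data samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative synchronized samples of two vital-data types.
heart_rate = [62, 70, 85, 95, 110]                # beats per minute
body_temp = [36.2, 36.3, 36.6, 36.8, 37.1]        # degrees Celsius
print(pearson(heart_rate, body_temp))             # strongly positive
```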
- according to the present invention, vital data including at least the user's heart rate is acquired, the user's state is estimated in real time based on the vital data, and avatar display data reflecting the estimation result is created. It is therefore possible to construct an avatar that more realistically reflects the user's state.
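A minimal Python sketch of this pipeline — heart rate and heart rate variability derived from RR intervals, then a coarse state estimate — is shown below. The RMSSD metric and the thresholds are illustrative assumptions, not values from the disclosure.

```python
from statistics import mean

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences,
    a common heart rate variability (HRV) metric."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def estimate_state(rr_intervals_ms):
    """Estimate a coarse user state from heart rate and HRV.
    The thresholds below are purely illustrative."""
    hr = 60000.0 / mean(rr_intervals_ms)   # heart rate in beats per minute
    hrv = rmssd(rr_intervals_ms)
    if hr > 100 and hrv < 20:
        return "stressed"
    if hr < 70 and hrv > 40:
        return "relaxed"
    return "neutral"

calm = [1000, 990, 1060, 1010, 950, 1040]  # ~60 bpm, high variability
print(estimate_state(calm))                # → relaxed
```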
- FIG. 1 is a system block diagram schematically showing an example of an information processing system according to an embodiment of the present invention, and FIG. 2 is a system block diagram schematically showing an example of the configuration of the user terminal shown in FIG. 1.
- FIG. 3 is a system block diagram schematically showing an example of the configuration of the information processing server shown in FIG. 1; a further schematic diagram illustrates the information stored in the user management database.
- FIG. 1 is a system block diagram schematically showing an example of an information processing system according to an embodiment of the present invention.
- the information processing system 1 includes a vital data collection unit 10 that collects vital data that is biometric information of a user, a user terminal 20, and an information processing server 30.
- the user terminal 20 and the information processing server 30 are connected via the communication network N (though the connection is not limited to this).
- the communication network N is constituted by the Internet, a LAN, a leased line, a telephone line, a corporate network, a mobile communication network, Bluetooth (registered trademark), Wi-Fi (Wireless Fidelity), other communication lines, or a combination of these, and may be wired or wireless.
- the vital data collection means 10 includes a plurality of devices that are attached to the user's body or are installed around the user's body and that monitor the user's body and collect vital data.
- the vital data collection means 10 includes a heart rate meter 11 that measures the user's heart rate, a pulse meter 12, a sphygmomanometer 13, a thermometer 14, a web camera 15 that captures the movement of the user's face and body, and a surface myoelectric potential sensor 16 that measures the movement of the user's muscles.
- One of each device may be provided, or a plurality of each may be provided. For example, measurement accuracy can be improved by attaching a plurality of pulse meters 12 to different locations on the user's body.
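One simple way the readings of several pulse meters could be combined for accuracy, shown here as an illustrative sketch, is a median, which tolerates a single badly placed sensor:

```python
from statistics import median

def fuse_pulse_readings(readings_bpm):
    """Combine simultaneous readings from several pulse meters.
    The median discards a single slipping or badly placed sensor."""
    return median(readings_bpm)

print(fuse_pulse_readings([72, 74, 73, 130]))  # one outlier sensor
```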
- a microphone that collects the user's voice, or a pedometer, may also be provided as vital data collection means 10.
- FIG. 2 is a system block diagram schematically showing an example of the configuration of the user terminal 20 in the information processing system according to the embodiment of the present invention.
- any terminal device that can exchange data with other communication devices via a communication network can be used as the user terminal 20, such as a tablet terminal, a personal computer (PC), a notebook PC, a smartphone, a mobile phone, or a personal digital assistant (PDA).
- a tablet terminal is used as the user terminal 20 by installing a dedicated application on it and executing that application.
- the user terminal 20 includes a communication interface 21, an input unit 22, a display unit 23, an imaging unit 24, a signal input / output unit 25, a storage unit 26, and a processor 27.
- the communication interface 21 is a hardware module for connecting the user terminal 20 to the communication network N and communicating with other terminals on the communication network N.
- the communication interface 21 is a modulation / demodulation device such as an ISDN modem, an ADSL modem, a cable modem, an optical modem, or a soft modem.
- the input unit 22 is an input device such as various operation buttons and a touch panel.
- the display unit 23 is, for example, a liquid crystal display or an organic EL display.
- the imaging unit 24 is a camera built in the tablet terminal.
- the signal input/output unit 25 is an interface that connects an external device to the user terminal 20 by wire (cable) or by wireless communication based on a standard such as Bluetooth (registered trademark), and transmits and receives signals to and from the external device.
- each device included in the vital data collection unit 10 is connected to the user terminal 20 via the signal input / output unit 25.
- the storage unit 26 is a logical device provided by the storage area of the physical device, and stores an operating system program, a driver program, various data, and the like used for processing of the user terminal 20.
- the physical device is a computer-readable recording medium such as a semiconductor memory.
- examples of the driver programs include a communication interface driver program for controlling the communication interface 21, an input device driver program for controlling the input unit 22, a display device driver program for controlling the display unit 23, an imaging device driver program for controlling the imaging unit 24, and various driver programs for controlling external devices connected to the signal input/output unit 25.
- the storage unit 26 stores a dedicated application program 261 that executes a predetermined operation in cooperation with the information processing server 30 by being executed by the processor 27.
- examples of the application program 261 include an application program for processing vital data collected by the vital data collection means 10 (vital information processing application), an application program for a social networking service (SNS application), and an application program for managing the user's health (health management application).
- the processor 27 includes an arithmetic and logic unit (such as a CPU) that handles arithmetic operations, logical operations, bit operations, and the like, together with various registers, and centrally controls each part of the user terminal 20 by executing the various programs stored in the storage unit 26.
- the various registers are, for example, a program counter, a data register, an instruction register, a general-purpose register, and the like.
- the processor 27 reads the application program 261 and functions as an application execution unit 271 for vital information processing, SNS, health management, and the like.
- such a user terminal 20 receives various vital data output from the vital data collection means 10 and transmits the vital data to the information processing server 30 via the communication network N constantly and in real time.
- the vital data collection means 10 is connected to the user terminal 20 and the vital data is transmitted to the information processing server 30 via the user terminal 20.
- alternatively, each vital data collection means 10 may be provided with a communication function, and the identification code (ID) of each vital data collection means 10 may be registered in the information processing server 30 in advance, so that each vital data collection means 10 transmits vital data directly to the information processing server 30.
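A sketch of how one sample of vital data might be packaged for real-time transmission, tagged with the pre-registered device ID, is given below; the payload fields and device name are assumptions for illustration only.

```python
import json
import time

def make_payload(device_id, vital_type, value):
    """Package one vital-data sample, tagged with the device's
    pre-registered identification code (ID) and a timestamp.
    The field names are illustrative, not from the disclosure."""
    return json.dumps({
        "device_id": device_id,   # ID registered in advance on the server
        "type": vital_type,       # e.g. "heart_rate", "blood_pressure"
        "value": value,
        "timestamp": time.time(),
    })

# In a real terminal, each sample would be sent to the information
# processing server over the communication network N as it arrives.
print(make_payload("hrm-001", "heart_rate", 72))
```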
- although one vital data collection means 10 and one user terminal 20 are illustrated, the present invention is not limited to this: two or more user terminals 20, each connected to vital data collection means 10, can be connected to the communication network N, and the information processing server 30 can be accessed from each user terminal 20 simultaneously.
- FIG. 3 is a system block diagram schematically showing an example of the configuration of the information processing server in the information processing system according to the embodiment of the present invention.
- the information processing server 30 is a server device that accumulates the vital data transmitted from the user terminal 20 (or the vital data collection means 10), estimates the user's state in real time based on the accumulated vital data, and, in response to a request from the user terminal 20, visualizes the user's state and provides it to the user terminal 20.
- the information processing server 30 is constituted by, for example, a host computer with high arithmetic processing capability, and exhibits its server function when a predetermined server program runs on that host computer. The number of computers constituting the information processing server 30 is not necessarily one; it may be composed of a plurality of computers distributed over the communication network N.
- the information processing server 30 includes a communication interface 31, a storage unit 32, and a processor 33.
- the communication interface 31 is a hardware module for connecting to the communication network N and communicating with other terminals on the communication network N.
- the communication interface 31 is a modulation / demodulation device such as an ISDN modem, an ADSL modem, a cable modem, an optical modem, or a soft modem.
- the storage unit 32 is a logical device provided by a storage area of a physical device including a computer-readable recording medium such as a disk drive or a semiconductor memory (ROM, RAM, etc.).
- the storage unit 32 may be constructed by mapping a plurality of physical devices to one logical device, or may be constructed by mapping one physical device to a plurality of logical devices.
- the storage unit 32 stores various programs including operating system programs and driver programs, and various data used during the execution of these programs. Specifically, the storage unit 32 stores an information processing program 321 to be executed by the processor 33, a user management database 322, a user information database 323, and a correlation information database 324.
- the information processing program 321 is a program that causes the processor 33 to execute the functions of accumulating the user's vital data and of visualizing and providing the user's state (mental state, health state, activity state, etc.) based on the accumulated vital data.
- FIG. 4 is a schematic diagram illustrating information stored in the user management database 322.
- the user management database 322 stores user account information including a user ID, a user name, a passcode, and the like, and information for managing access restrictions.
- the access restriction limits the range of information disclosed to other users when those users request to browse information related to the user. Depending on the relationship between the user and other users, the user can set the access restriction in steps, ranging from "disclose all (no access restriction)" to "disclose to no one other than the user".
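The tiered access-restriction check described above can be sketched as follows. This is a minimal illustration; the level names, data layout, and helper function are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the tiered access restriction described above.
# Tiers range from "disclose all" down to "nobody but the user"; the
# names below are assumed for illustration.
ACCESS_LEVELS = ["public", "friends_only", "private"]

def may_view_avatar(owner_id, viewer_id, access_level, friends):
    """Return True if the viewer may see the owner's avatar data."""
    if viewer_id == owner_id:
        return True                      # the user can always see their own avatar
    if access_level == "public":
        return True                      # no access restriction
    if access_level == "friends_only":
        return viewer_id in friends.get(owner_id, set())
    return False                         # "private": nobody but the user

friends = {"userA": {"userB", "userC"}}
print(may_view_avatar("userA", "userB", "friends_only", friends))  # True
print(may_view_avatar("userA", "userZ", "friends_only", friends))  # False
```

The server would consult such a check (step S213 in the avatar display sequence) before transmitting avatar data to a requester.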
- FIG. 5 is a schematic diagram illustrating information stored in the user information database 323.
- the user information database 323 stores, for each user ID, the user's basic information D1 such as date of birth, height, weight, and blood type, vital data D2, and user state information D3 representing the user's state estimated based on the vital data D2.
- the vital data D2 includes primary data directly acquired by the vital data collection means 10 and secondary data acquired from the primary data.
- the primary data include heart rate, pulse, blood pressure, body temperature, the movements of the face, scalp, and body muscles, the movements of the eyeballs and pupils, voice, and the like.
- the secondary data include the heart rate variability calculated from the heart rate, facial expressions and body poses calculated from the movements of the face, scalp, and body muscles, changes in diaphragm movement and spine extension calculated from the movements of the abdominal and back muscles, the fluctuation rate of eye movement calculated from eyeball movement, and changes in voice tone (volume, pitch, speed, etc.).
- These secondary data may be calculated on the user terminal 20 side and transmitted to the information processing server 30, or may be calculated on the information processing server 30 side.
- the vital data D2 are transmitted from the user terminal 20 in real time and accumulated in the information processing server 30. The information processing server 30 may therefore sequentially delete vital data D2 after a predetermined period (for example, several years) has elapsed since reception. Even in this case, the user state information derived from the vital data to be deleted may be retained.
- the user state information D3 includes information representing mental states such as emotions and stress level, health states such as fitness level and malfunctioning body parts, and activity states such as "sleeping", "awake", "eating", and "exercising". These pieces of information may be expressed as numerical levels, as character (or symbol) information, or as a combination of the two.
- FIG. 6 is a schematic diagram illustrating information stored in the correlation information database 324.
- the correlation information database 324 stores information (correlation information) that associates vital data with the user's state (mental state, health state, activity state).
- FIG. 6 shows, for example, that when the level of one type of vital data (data A) is "5" and the level of another type (data B) is "4", the user's state is "X1".
- for example, the heart rate, which is one type of vital data, varies with the physical condition (e.g., normal or feverish), the mental state (e.g., calm, nervous, or excited), and the activity state (e.g., at rest or during exercise).
- the heart rate interval is normal when it fluctuates to some extent, and fluctuations in the heart rate interval become smaller when the mind or body is under stress or when autonomic function is reduced.
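As one way to make this fluctuation concrete, the sketch below quantifies it as the standard deviation of beat-to-beat (RR) intervals, the common SDNN metric. The metric choice and the sample values are illustrative assumptions; the patent does not specify a formula.

```python
from statistics import stdev

def sdnn(rr_intervals_ms):
    """Standard deviation of RR (heartbeat) intervals in milliseconds.
    Larger values indicate healthy fluctuation; smaller values can
    accompany stress or reduced autonomic function."""
    return stdev(rr_intervals_ms)

relaxed = [810, 850, 790, 870, 820, 860]   # illustrative RR series, ms
stressed = [800, 805, 798, 802, 801, 799]  # fluctuation is much smaller

print(sdnn(relaxed) > sdnn(stressed))  # True: less variability under stress
```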
- therefore, by using the heart rate and the rate of change of the heart rate interval (the heart rate variability), it becomes possible to estimate to some extent the user's mental state (emotions and stress levels), health state (function levels of organs and the like), and activity state.
- further, by additionally using other vital data of the user, such as blood pressure, body temperature, eyeball movement captured by the camera, facial muscle movement (facial expression), changes in voice tone, body muscle movement (exercise), diaphragm movement, and spine extension, it is possible to increase the number of items of the user's state that can be estimated and to improve the estimation accuracy.
- for example, it also becomes possible to estimate an activity state such as sleeping or waking up.
- the information used when estimating the user's state need not be limited to vital data; a user state (estimation result) estimated on the basis of vital data may also be used. That is, another type of user state may be estimated based on both the vital data and an estimation result derived from vital data.
- for example, the user's mental state can be estimated in more detail based on the user's stress level, itself estimated from the heart rate and/or heart rate variability, combined with the movement of facial muscles.
- the correlation information database 324 stores one or more pieces of correlation information used when estimating the user state in this way.
- the correlation information need not take the form of a table; a function taking a plurality of types of vital data as variables, or a function taking vital data and an estimation result based on vital data as variables, may be stored as correlation information.
- the correlation information stored in the correlation information database 324 may be created in advance based on external information, or created based on the vital data accumulated in the information processing server 30. Correlation information created in advance from external information may also be updated based on the vital data accumulated in the information processing server 30.
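Both forms of correlation information mentioned above — a lookup table like FIG. 6 and a function taking vital data as variables — can be sketched as follows. The table entry (data A = 5, data B = 4 → "X1") comes from the description; every other value, and the scoring function, are hypothetical.

```python
# Table-form correlation information (cf. FIG. 6):
# (data A level, data B level) -> user state.
CORRELATION_TABLE = {
    (5, 4): "X1",   # example given in the description
    (2, 1): "X2",   # hypothetical additional entry
}

def estimate_state_table(level_a, level_b):
    """Look up the user state for a pair of vital-data levels."""
    return CORRELATION_TABLE.get((level_a, level_b), "unknown")

# Function-form correlation information: vital data as input variables.
def estimate_stress_level(heart_rate, hrv_rate):
    """Hypothetical function mapping heart rate and heart-rate-variability
    rate to a coarse stress level (higher HR plus lower HRV -> more stress)."""
    score = heart_rate / 60.0 - hrv_rate
    return "high" if score > 1.0 else "low"

print(estimate_state_table(5, 4))  # X1
```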
- the processor 33 includes an arithmetic logic unit (such as a CPU) that performs arithmetic, logic, and bit operations, together with various registers, and centrally controls each part of the information processing server 30 by executing the various programs stored in the storage unit 32.
- the various registers are, for example, a program counter, a data register, an instruction register, a general-purpose register, and the like. Further, the processor 33 implements a predetermined information processing function in cooperation with the user terminal 20 by executing the information processing program 321.
- the functional units realized by the processor 33 executing the information processing program 321 include an authentication management unit 331, a user information management unit 332, a user state estimation unit 333, an avatar data creation unit 334, and a correlation analysis unit 335.
- the authentication management unit 331 performs authentication when the user terminal 20 accesses the information processing server 30. Specifically, when the user terminal 20 requests access, the authentication management unit 331 requests the user terminal 20 to input a user ID and a passcode, refers to the user management database 322, and authenticates whether the user terminal 20 is permitted access.
- the user information management unit 332 manages the user information database 323 based on information transmitted from the user terminal 20.
- the user state estimation unit 333 estimates the user's state based on the vital data accumulated as vital data D2 and on the correlation information database 324.
- the avatar data creation unit 334 creates an avatar, a character displayed in Internet space as the user's alter ego, and creates display data (hereinafter referred to as avatar data) for displaying an avatar in which the user's vital data and the estimation result (user state) of the user state estimation unit 333 are reflected.
- the vital data reflected on the avatar, the type of state of the user, and the display method of the avatar are not particularly limited. A display example of the avatar will be described later.
- note that the vital data is transmitted from the user terminal 20 in real time, so the estimated user state changes from moment to moment.
- the avatar data creation unit 334 may create, as avatar data, 3D data including information representing the inside of the avatar, and may generate, in response to a request from the user terminal 20, display data for viewing the avatar from the inside (for example, within the digestive tract) or cross-sectional display data each time.
- the correlation analysis unit 335 analyzes the correlations among the vital data (input data) transmitted from the user terminal 20, and the correlation between the vital data (input data) and the estimation result (output data) of the user state estimation unit 333, thereby building a database of correlation information associating vital data with user states.
- FIG. 7 is a flowchart showing a correlation information database construction process executed by the correlation analysis unit 335.
- first, in step S10, the correlation analysis unit 335 acquires first information associated in advance with a user state, and one or more types of second information.
- as described above, heart rate variability is known to correlate with the user's stress level, so by associating heart rate variability with the stress level in advance, the heart rate variability can be used as the first information in step S10.
- as the second information, data other than heart rate variability is acquired, for example the movement of a specific body part such as the eyeballs, changes in voice tone, the expansion of the diaphragm, or the extension of the spine.
- the second information may be two or more types of information different from each other.
- the correlation analysis unit 335 then analyzes the correlation between the first information and the second information.
- for example, the correlations between heart rate variability and eyeball movement, between heart rate variability and changes in voice tone, and between heart rate variability and diaphragm movement are analyzed.
- next, the correlation analysis unit 335 determines whether the correlation between the first information and the second information is strong (step S12). For example, when the correlation coefficient between the two is greater than or equal to a predetermined value, the correlation is determined to be strong; when it is less than the predetermined value, the correlation is determined to be weak.
- when the correlation between the first information and the second information is weak (step S12: No), the correlation analysis unit 335 ends the process.
- on the other hand, when the correlation is strong (step S12: Yes), the correlation analysis unit 335 associates the second information with the user state previously associated with the first information, based on the analysis result (step S13). Specifically, a table associating the second information with the user state is created; alternatively, a function taking the second information as an input variable and the user state as an output value may be created. This makes it possible to estimate the user's state directly from the second information.
- in the example above, the user's stress level can be estimated from data on the user's eyeball movement, without going through the heart rate variability.
- the user's emotions can be estimated from changes in the tone of the voice.
- the degree of tension of the user can be estimated from the way the diaphragm swells and the spine stretches.
- the correlation analysis unit 335 accumulates the correlation information thus acquired, between the second information and the user state, in the correlation information database 324 (see FIG. 3), and then ends the process.
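The flow of steps S10 to S13 can be sketched as below: first information already tied to a user state, a correlation check against second information, and, when the correlation is strong, a direct mapping from the second information to the state. The use of the Pearson coefficient, the 0.8 threshold, and the sample series are illustrative assumptions, not requirements of the patent.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def associate_if_correlated(first, second, states, threshold=0.8):
    """Steps S11-S13 sketch: if |r| >= threshold, return a table mapping
    the second information directly to the user state; otherwise None."""
    r = pearson(first, second)
    if abs(r) < threshold:                # step S12: weak correlation -> end
        return None
    return dict(zip(second, states))      # step S13: second info -> state table

hrv = [10, 20, 30, 40]                    # first info, already tied to stress level
eye = [1.1, 2.0, 3.2, 3.9]                # second info: eye-movement fluctuation
stress = ["high", "mid", "low", "low"]    # states associated with the first info

table = associate_if_correlated(hrv, eye, stress)
print(table is not None)  # True: strong correlation, mapping created
```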
- as the first information, any information other than heart rate variability can be used, as long as it is associated with a user state.
- for example, once the movement of the eyeballs has been associated with the stress level, that eyeball movement can be used as new first information in step S10.
- then, by using yet other vital data as the second information in step S11 and analyzing its correlation with this new first information, the stress level can be associated with that other vital data through the eyeball movement.
- the target of correlation analysis is not limited to correlations among vital data; correlations between data arbitrarily input by the user and vital data, or between data estimated from vital data and vital data, may also be analyzed. Specific examples include correlating input data such as the user's date of birth, time and place of birth, blood type, DNA type, fortune-telling results (for example, Four Pillars of Destiny), or the user's own evaluation of vital data, with vital data (such as heart rate variability).
- the analysis result by the correlation analysis unit 335 is accumulated in the correlation information database 324.
- the correlation information accumulated through the analysis of the correlation analysis unit 335 may be used only for estimating the state of the user whose data was used for the analysis, or, if it can be generalized, may be used to estimate the states of other users.
- FIG. 8 is a sequence diagram of information collection processing executed in the information processing system 1 according to the embodiment of the present invention.
- when the user terminal 20 requests access to the information processing server 30 (step S101), the information processing server 30 requests the user terminal 20 for a user ID and passcode, or for new user registration (step S201).
- when new user registration is requested, the information processing server 30 issues a user ID and a passcode and newly creates the user's information (step S202).
- when the user ID and passcode are transmitted from the user terminal 20 (step S103) and the information processing server 30 succeeds in authentication (step S203), the user terminal 20 enters a logged-in state and can store vital data in the information processing server 30.
- when vital data is transmitted from the user terminal 20, the information processing server 30 receives it and stores it in the user information database 323 (step S204). Subsequently, the information processing server 30 estimates the user's state (mental state, health state, activity state) based on the accumulated vital data (step S205), and creates avatar data for displaying an avatar reflecting the user's state (step S206).
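The server-side part of this sequence (steps S204 to S206) can be sketched as a single handler: accumulate the vital data, estimate the state, and rebuild the avatar data. All names, and the toy heart-rate-only estimator, are assumptions for illustration.

```python
# Illustrative sketch of steps S204-S206: store vital data, estimate the
# user's state, and create avatar data reflecting that state.
user_info_db = {}   # stands in for the user information database 323

def estimate_state(vitals):
    """Toy estimator (assumption): classify by heart rate alone."""
    return "exercising" if vitals["heart_rate"] > 120 else "resting"

def on_vital_data(user_id, vitals):
    record = user_info_db.setdefault(user_id, {"vitals": [], "avatar": None})
    record["vitals"].append(vitals)                       # step S204: accumulate
    state = estimate_state(vitals)                        # step S205: estimate
    record["avatar"] = {"pose": state,                    # step S206: avatar data
                        "heart_rate": vitals["heart_rate"]}
    return record["avatar"]

print(on_vital_data("user1", {"heart_rate": 130}))
# {'pose': 'exercising', 'heart_rate': 130}
```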
- FIG. 9 is a sequence diagram of avatar display processing executed in the information processing system 1 according to the embodiment of the present invention.
- when the user terminal 20 requests access to the information processing server 30 (step S111), the information processing server 30 requests the user terminal 20 for a user ID and a passcode (step S211).
- when the user ID and passcode are transmitted from the user terminal 20 (step S112) and the information processing server 30 succeeds in authentication (step S212), the user terminal 20 enters a logged-in state. Note that if the logged-in state of the user terminal 20 has been maintained, steps S112, S211, and S212 are omitted.
- when the user terminal 20 requests the avatar data of a specific user from the information processing server 30 (step S113), the information processing server 30 refers to the user management database 322 and checks the access restriction set by the user whose avatar data was requested (step S213). It then transmits the requested avatar data to the user terminal 20 within the range permitted by the access restriction (step S214). For example, when the access restriction is set to "not disclosed to anyone other than the user", the information processing server 30 does not transmit the avatar data to any user other than its owner.
- FIG. 10 is a schematic diagram showing a display example of an avatar, here an avatar A1 that imitates the whole human body.
- for example, a heart model a11 may be superimposed on the avatar A1 and made to pulsate in time with the user's heartbeat.
- the overall color of the avatar A1 may be changed according to the user's body temperature.
- the estimated mental state of the user (for example, emotions or stress level) may be reflected in the expression and color (complexion) of the face a12 of the avatar A1.
- the estimated health state of the user may be reflected in the aura (halo) a13 of the avatar A1.
- for example, the display range of the aura a13 may be widened the healthier the user is, or the color of the aura a13 may be changed according to the stress level.
- the color of the part of the avatar A1 corresponding to an ailing part of the user's body may also be changed according to the degree of the ailment; for example, when the user has severe shoulder stiffness, the lightness of the shoulder portion a14 of the avatar A1 is reduced to indicate poor blood circulation.
- the pose of the avatar A1 may also be changed according to the movement of the user's muscles acquired by the surface myoelectric potential sensor 16 (see FIG. 1).
- further, the avatar data creation unit 334 may change the form of the avatar A1 in response to a request transmitted from the user terminal 20. For example, a slider a15 that can be moved by operating the input unit 22 is displayed on the display unit 23 of the user terminal 20, and when the slider a15 is moved, information indicating its position is transmitted to the information processing server 30.
- in response, the avatar data creation unit 334 creates avatar data reflecting the past vital data corresponding to the position of the slider a15 and transmits it to the user terminal 20. The user terminal 20 thus displays an avatar A1 reflecting the user's vital data at the desired time.
- in this way, the user can confirm time-series changes in, for example, their health condition.
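The slider behavior described above, mapping a slider position onto stored past vital data, can be sketched as follows; the linear position-to-index mapping and the record layout are assumptions.

```python
def record_for_slider(history, position):
    """Map a slider position in [0.0, 1.0] onto the stored vital-data
    history: 0.0 = oldest record, 1.0 = most recent record."""
    if not history:
        return None
    index = round(position * (len(history) - 1))
    return history[index]

history = [
    {"t": "2018-06-01", "heart_rate": 72},
    {"t": "2018-06-04", "heart_rate": 95},
    {"t": "2018-06-07", "heart_rate": 68},
]
print(record_for_slider(history, 0.0)["t"])  # 2018-06-01 (oldest)
print(record_for_slider(history, 1.0)["t"])  # 2018-06-07 (latest)
```

The selected record would then be fed to the avatar data creation step, so the displayed avatar reflects the chosen point in time.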
- when the user performs a predetermined operation (for example, a tap) on a region of the avatar A1 displayed on the display unit 23 of the user terminal 20, information indicating that the operated region has been selected is transmitted to the information processing server 30.
- the avatar data creation unit 334 creates avatar data representing the inside (for example, an organ) of the selected area and transmits it to the user terminal 20.
- the user terminal 20 displays the avatar A1 in which the user-desired internal area is exposed.
- the internal region may be displayed by showing a cross section of the avatar A1, or by displaying the image from a virtual small camera inserted inside the avatar A1.
- FIG. 11 is a schematic diagram showing another display example of the avatar, and shows an avatar A2 that imitates the user's head.
- for this avatar A2, a region representing the user's emotions (emotion region a21), a region representing the activity state of the right brain (right brain region a22), and a region representing the activity state of the left brain (left brain region a23) are provided, and the size, color, and the like of each region are changed according to the vital data and the estimated user state.
- vital data including at least the user's heart rate is acquired in real time, and avatar data is created based on the user's state estimated in real time from that vital data; it is therefore possible to construct an avatar that realistically reflects the user's state.
- in addition, the correlations among a plurality of vital data, and between the estimation result of the user state and the vital data, are analyzed, and the user's state is further estimated based on these correlations; this makes it possible to increase the number of items of the user's state that can be estimated and to improve the estimation accuracy.
- the information processing system 1 that displays the avatar based on the user's vital data as described above can be used in various applications.
- for example, an avatar can be used as a user profile by combining the information processing system 1 with a social networking service (SNS) such as "facebook (registered trademark)" or "LinkedIn (registered trademark)".
- FIGS. 12 and 13 are diagrams explaining an application example of the information processing system 1 in an SNS.
- the user A is connected to the users B, C, and D as “friends”.
- the user B is connected to the users E and F as “friends” in addition to the user A.
- in this application example, the mental state, health state, activity state, or aura of each of these users' avatars is quantified, and a statistical value is calculated along the users' friendship relations.
- suppose the aura values obtained by quantifying the auras of the avatars of users A to F are 2, 3, 8, 5, 6, and 4, respectively. In this case, the average aura value of each user's friends can be calculated, as illustrated in the figures.
- for example, user A can derive analyses such as having many mentally unsettled friends, or many unhealthy friends. Alternatively, a comparison can be made between user A's work network and private network.
- as the numerical values used for the analysis, vital data itself (such as heart rate) may be used in addition to the quantified values of aura, mental state, health state, activity state, and the like.
- here, an average value is used as the statistical value, but a median, a mode, or the like may be used instead.
- in addition to the analysis along friendship relations described above, the aura values and vital data may be obtained for followers, or for people who follow posted articles, and displayed in a form such as points or rankings. This allows analyses of followers such as: user A's followers have high stress levels, while user B's followers are healthy but not very active.
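Using the numbers given above (aura values 2, 3, 8, 5, 6, and 4 for users A to F, with user A's friends being B, C, and D), the per-user friend statistic can be computed as follows. The mean is used by default, with the median as one of the alternatives mentioned; the data structures are illustrative assumptions.

```python
from statistics import mean, median

# Aura values from the example: users A-F -> 2, 3, 8, 5, 6, 4.
aura = {"A": 2, "B": 3, "C": 8, "D": 5, "E": 6, "F": 4}
# Friendship relations from the example (A's friends are B, C, D;
# B's friends are A, E, F).
friends = {"A": ["B", "C", "D"], "B": ["A", "E", "F"]}

def friend_stat(user, stat=mean):
    """Statistic (mean by default) of the aura values of a user's friends."""
    return stat(aura[f] for f in friends[user])

print(round(friend_stat("A"), 2))  # 5.33: mean of 3, 8, 5
```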
- as another application example, an avatar may be presented to a company or to a prospective match partner as the profile of an applicant (a job applicant or a matchmaking applicant).
- the information processing system 1 can be used at a game site.
- for example, avatars may battle each other in a battle game.
- the information processing system 1 can be used in a health management application.
- for example, an index representing the user's health state, such as pulse or body temperature, can be read from the avatar displayed on the user terminal 20. It is also possible to display the organs with the avatar's abdomen opened, or to display the avatar's digestive tract from the inside. The user may also correct his or her own posture by looking at the posture of the avatar's whole body.
- in the embodiment above, a human is the user of the information processing system 1, but animals such as pets and livestock may also be users. That is, vital data collection means is attached to a dog or cat, and an animal avatar is created based on the collected vital data. In this case, a veterinarian can view the avatar and use it for examination.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Physiology (AREA)
- Psychiatry (AREA)
- Cardiology (AREA)
- Artificial Intelligence (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Hospice & Palliative Care (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- General Physics & Mathematics (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Educational Technology (AREA)
- Pulmonology (AREA)
- Human Computer Interaction (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Dermatology (AREA)
- Computer Networks & Wireless Communication (AREA)
Abstract
Description
Patent Document 2 discloses a technique of detecting electric potentials at a plurality of locations on a user's head, or the acceleration or angular velocity of the head, estimating the head movement and/or facial expression based on these detection results, and applying the estimated expression, together with the head movement, to an avatar displayed on a display.
FIG. 1 is a system block diagram schematically showing an example of the information processing system according to an embodiment of the present invention. As shown in FIG. 1, the information processing system 1 includes vital data collection means 10 that collects vital data, which is the user's biological information, a user terminal 20, and an information processing server 30. The user terminal 20 and the information processing server 30 are connected via a communication network N (though the invention is not limited to this).
10 vital data collection means
11 heart rate monitor
12 blood pressure monitor
13 thermometer
14 web camera
15 surface myoelectric potential sensor
20 user terminal
21 communication interface
22 input unit
23 display unit
24 imaging unit
25 signal input/output unit
26 storage unit
27 processor
30 information processing server
31 communication interface
32 storage unit
33 processor
261 application program
271 application execution unit
321 information processing program
322 user management database
323 user information database
324 correlation information database
331 authentication management unit
332 user information management unit
333 user state estimation unit
334 avatar data creation unit
335 correlation analysis unit
Claims (9)
- An information processing system comprising: vital data acquisition means for acquiring vital data of a user; a terminal device that transmits the vital data acquired by the vital data acquisition means via a communication network; and an information processing server that processes the vital data transmitted from the terminal device, wherein the information processing server comprises: a storage unit that stores the vital data; a user state estimation unit that analyzes a correlation between first vital data, which among the vital data includes a heart rate and a heart rate variability rate that is the fluctuation rate of the heart rate interval, and second vital data of a type different from the first vital data, and estimates the state of the user based on the result of the analysis; an avatar data creation unit that creates display data for displaying an avatar in which the state of the user estimated by the user state estimation unit is reflected; and a communication interface that, upon receiving a transmission request for the display data via a communication network, transmits the display data to the request source, whereby an avatar reflecting the state of the user can be displayed at the request source.
- An information processing server comprising: a storage unit that stores first vital data received from a terminal device via a communication network, the first vital data including a user's heart rate and a heart rate variability rate that is the fluctuation rate of the heart rate interval; a user state estimation unit that analyzes a correlation between the first vital data, including the heart rate and the heart rate variability rate, and second vital data of a type different from the first vital data, and estimates the state of the user based on the result of the analysis; an avatar data creation unit that creates display data for displaying an avatar in which the state of the user estimated by the user state estimation unit is reflected; and a communication interface that, upon receiving a transmission request for the display data via a communication network, transmits the display data to the request source, whereby an avatar reflecting the state of the user can be displayed at the request source.
- The information processing server according to claim 2, wherein the avatar data creation unit creates display data of the avatar in which past vital data corresponding to information input at the request source is reflected, and the communication interface transmits the display data of the avatar in which the past vital data is reflected to the request source, whereby time-series changes in the state of the user can be confirmed at the request source.
- The information processing server according to claim 2 or 3, wherein the user state estimation unit estimates the mental state of the user, and the avatar data creation unit creates display data of the avatar in which the mental state of the user is reflected.
- The information processing server according to claim 2 or 3, wherein the user state estimation unit estimates the health state of the user, and the avatar data creation unit creates display data of the avatar in which the health state of the user is reflected.
- The information processing server according to claim 2 or 3, wherein the user state estimation unit estimates the activity state of the user, and the avatar data creation unit creates display data of the avatar in which the activity state of the user is reflected.
- The information processing server according to any one of claims 2 to 6, wherein the display data is three-dimensional data including information on the inside of the avatar, and, in response to a transmission request for the display data from the request source, display data for displaying the inside of the avatar is created and transmitted to the request source.
- An information processing method in an information processing server comprising a processor and a storage unit, the method comprising, by the processor: receiving, from a terminal device via a communication network, first vital data including a user's heart rate and a heart rate variability rate that is the fluctuation rate of the heart rate interval; storing the received heart rate and heart rate variability rate in the storage unit; analyzing a correlation between the first vital data, including the heart rate and the heart rate variability rate, and second vital data of a type different from the first vital data, and estimating the state of the user based on the result of the analysis; creating display data for displaying an avatar in which the estimated state of the user is reflected; and, upon receiving a transmission request for the display data via a communication network, transmitting the display data to the request source, whereby an avatar reflecting the state of the user can be displayed at the request source.
- A program for causing a computer to execute: receiving, from a terminal device via a communication network, first vital data including a user's heart rate and a heart rate variability rate that is the fluctuation rate of the heart rate interval; storing the received heart rate and heart rate variability rate in the storage unit; analyzing a correlation between the first vital data, including the heart rate and the heart rate variability rate, and second vital data of a type different from the first vital data, and estimating the state of the user based on the result of the analysis; creating display data for displaying an avatar in which the estimated state of the user is reflected; and, upon receiving a transmission request for the display data via a communication network, transmitting the display data to the request source, thereby causing an avatar reflecting the state of the user to be displayed at the request source.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18813562.8A EP3636156A4 (en) | 2017-06-07 | 2018-06-07 | INFORMATION PROCESSING SYSTEM |
US16/620,404 US11816264B2 (en) | 2017-06-07 | 2018-06-07 | Vital data acquisition and three-dimensional display system and method |
CA3066704A CA3066704A1 (en) | 2017-06-07 | 2018-06-07 | Information processing system |
CN201880038038.4A CN110868930B (zh) | 2017-06-07 | 2018-06-07 | 信息处理系统 |
CN202310529501.8A CN116509348A (zh) | 2017-06-07 | 2018-06-07 | 信息处理系统及方法、信息处理服务器以及存储介质 |
KR1020207000382A KR102618437B1 (ko) | 2017-06-07 | 2018-06-07 | 정보 처리 시스템 |
CN202310530957.6A CN116548937A (zh) | 2017-06-07 | 2018-06-07 | 信息处理系统及方法、信息处理服务器以及存储介质 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017112845A (ja) | 2017-06-07 | Information processing system |
JP2017-112845 | 2017-06-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018225838A1 true WO2018225838A1 (ja) | 2018-12-13 |
Family
ID=62143929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/021936 WO2018225838A1 (ja) | Information processing system |
Country Status (7)
Country | Link |
---|---|
US (1) | US11816264B2 (ja) |
EP (1) | EP3636156A4 (ja) |
JP (1) | JP6325154B1 (ja) |
KR (1) | KR102618437B1 (ja) |
CN (3) | CN110868930B (ja) |
CA (1) | CA3066704A1 (ja) |
WO (1) | WO2018225838A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7140138B2 (ja) * | 2017-10-27 | 2022-09-21 | Sony Group Corporation | Information processing device, information processing method, program, and information processing system |
JP7288064B2 (ja) * | 2018-09-27 | 2023-06-06 | The Anti-Inflammaging Company AG | Visual virtual agent |
EP4331484A1 (en) * | 2022-08-31 | 2024-03-06 | ASICS Corporation | Mental/physical state evaluation system and mental/physical state evaluation method |
WO2024084580A1 (ja) * | 2022-10-18 | 2024-04-25 | Nippon Telegraph and Telephone Corporation | Somatosensory control device, method, and program |
JP7523614B2 (ja) | 2023-03-07 | 2024-07-26 | Capcom Co., Ltd. | Information processing system, information processing method, and program (technology for supporting a digital twin environment) |
CN116687362A (zh) * | 2023-06-15 | 2023-09-05 | China National Coal Group Co., Ltd. | Pre-shift rapid health screening system and method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013258555A (ja) * | 2012-06-12 | 2013-12-26 | Sony Computer Entertainment Inc | Head-mounted display, biological information management device, and biological information display method |
JP2016126500A (ja) | 2014-12-26 | 2016-07-11 | KDDI Corporation | Wearable terminal device and program |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000285377A (ja) | 1999-03-31 | 2000-10-13 | Matsushita Electric Ind Co Ltd | Emergency information display method and device |
US6354679B1 (en) | 2000-04-20 | 2002-03-12 | Caterpillar Inc. | Off-set symmetrical link and an associated subassembly for a track chain assembly |
JP2006120137A (ja) | 2001-02-19 | 2006-05-11 | Hitachi Kokusai Electric Inc | Image information reporting system |
JP2004062364A (ja) | 2002-07-26 | 2004-02-26 | Hitachi Ltd | Accident information processing system |
JP4277173B2 (ja) * | 2003-02-13 | 2009-06-10 | Sony Corporation | Reproduction method, reproduction device, and content distribution system |
JP2005057343A (ja) | 2003-08-05 | 2005-03-03 | Tokio Marine & Nichido Fire Insurance Co Ltd | Image data registration device and method |
US8002553B2 (en) * | 2003-08-18 | 2011-08-23 | Cardiac Pacemakers, Inc. | Sleep quality data collection and evaluation |
KR100646868B1 (ko) * | 2004-12-29 | 2006-11-23 | Samsung Electronics Co., Ltd. | Home control system using skin conductivity and heart rate information, and method thereof |
US8033996B2 (en) * | 2005-07-26 | 2011-10-11 | Adidas Ag | Computer interfaces including physiologically guided avatars |
US20090105560A1 (en) * | 2006-06-28 | 2009-04-23 | David Solomon | Lifestyle and eating advisor based on physiological and biological rhythm monitoring |
JP4506795B2 (ja) | 2007-08-06 | 2010-07-21 | Sony Corporation | Biological movement information display processing device and biological movement information processing system |
US8666672B2 (en) * | 2009-11-21 | 2014-03-04 | Radial Comm Research L.L.C. | System and method for interpreting a user's psychological state from sensed biometric information and communicating that state to a social networking site |
JP5804405B2 (ja) | 2010-01-27 | 2015-11-04 | Nintendo Co., Ltd. | Information processing program, information processing device, information processing method, and information processing system |
JP5480690B2 (ja) | 2010-03-26 | 2014-04-23 | MS&AD Research Institute Co., Ltd. | Insurance company system |
US20120083675A1 (en) | 2010-09-30 | 2012-04-05 | El Kaliouby Rana | Measuring affective data for web-enabled applications |
US20130115582A1 (en) * | 2010-06-07 | 2013-05-09 | Affectiva, Inc. | Affect based concept testing |
US20150206000A1 (en) * | 2010-06-07 | 2015-07-23 | Affectiva, Inc. | Background analysis of mental state expressions |
US9167991B2 (en) * | 2010-09-30 | 2015-10-27 | Fitbit, Inc. | Portable monitoring devices and methods of operating same |
JP6453751B2 (ja) * | 2012-05-23 | 2019-01-16 | Iphenotype LLC | Phenotype-integrated social search database and method |
US9652992B2 (en) * | 2012-10-09 | 2017-05-16 | Kc Holdings I | Personalized avatar responsive to user physical state and context |
US9199122B2 (en) * | 2012-10-09 | 2015-12-01 | Kc Holdings I | Personalized avatar responsive to user physical state and context |
WO2014145228A1 (en) * | 2013-03-15 | 2014-09-18 | Affectiva, Inc. | Mental state well being monitoring |
WO2014176478A1 (en) * | 2013-04-25 | 2014-10-30 | GM Global Technology Operations LLC | Scene awareness system for a vehicle |
WO2015108702A1 (en) * | 2014-01-14 | 2015-07-23 | Zsolutionz, LLC | Cloud-based initiation of customized exercise routine |
FR3017529B1 (fr) * | 2014-02-17 | 2021-04-23 | Vasile Zoicas | Method and system for monitoring the autonomic nervous system of a subject |
WO2015128740A2 (en) * | 2014-02-28 | 2015-09-03 | Eco-Fusion | Systems for predicting hypoglycemia and methods of use thereof |
KR20160080958A (ko) * | 2014-12-30 | 2016-07-08 | Samsung Electronics Co., Ltd. | User terminal device, method of driving a user terminal device, and computer-readable recording medium |
KR101656808B1 (ko) | 2015-03-20 | 2016-09-22 | Hyundai Motor Company | Accident information management device, vehicle including the same, and accident information management method |
US9930102B1 (en) * | 2015-03-27 | 2018-03-27 | Intuit Inc. | Method and system for using emotional state data to tailor the user experience of an interactive software system |
US10136856B2 (en) * | 2016-06-27 | 2018-11-27 | Facense Ltd. | Wearable respiration measurements system |
US10366624B2 (en) * | 2015-06-23 | 2019-07-30 | Rescon Ltd | Differentially weighted modifiable prescribed history reporting apparatus, systems, and methods for decision support and health |
PL3117766T3 (pl) * | 2015-07-16 | 2021-09-06 | Preventicus Gmbh | Processing of biological data |
EP3397151A4 (en) * | 2015-10-14 | 2019-09-25 | Synphne Pte Ltd. | SYSTEMS AND METHOD FOR ENABLING THE SELF-ADJUSTMENT OF THE MENTAL-BODY EMOTION CONDITION AND DEVELOPING FUNCTIONAL ABILITIES THROUGH BIOFEEDBACK AND AMBIENT MONITORING |
JP2019514603A (ja) * | 2016-05-09 | 2019-06-06 | Belun Technology Company Limited | Wearable device for healthcare and method therefor |
CN106126895A (zh) | 2016-06-20 | 2016-11-16 | Shanghai Langlang Information Technology Co., Ltd. | Mobile-terminal-based healthy lifestyle behavior management system and method |
WO2018191741A1 (en) * | 2017-04-14 | 2018-10-18 | Emfit Ltd. | Wearable sensor and system thereof |
EP3403574A1 (en) * | 2017-05-18 | 2018-11-21 | Preventicus GmbH | Device for reliable acquisition of photoplethysmographic data |
EP3809954A4 (en) * | 2018-05-22 | 2022-03-09 | LifeLens Technologies, Inc. | MONITORING OF PHYSIOLOGICAL PARAMETERS FOR SYNCHRONIZATION FEEDBACK TO IMPROVE A SUBJECT'S PERFORMANCE DURING AN ACTIVITY |
US20220095931A1 (en) * | 2020-09-30 | 2022-03-31 | Cardiac Pacemakers, Inc. | Breath classification systems and methods |
- 2017
  - 2017-06-07 JP JP2017112845A patent/JP6325154B1/ja active Active
- 2018
  - 2018-06-07 US US16/620,404 patent/US11816264B2/en active Active
  - 2018-06-07 CA CA3066704A patent/CA3066704A1/en active Pending
  - 2018-06-07 CN CN201880038038.4A patent/CN110868930B/zh active Active
  - 2018-06-07 CN CN202310530957.6A patent/CN116548937A/zh active Pending
  - 2018-06-07 WO PCT/JP2018/021936 patent/WO2018225838A1/ja unknown
  - 2018-06-07 EP EP18813562.8A patent/EP3636156A4/en active Pending
  - 2018-06-07 CN CN202310529501.8A patent/CN116509348A/zh active Pending
  - 2018-06-07 KR KR1020207000382A patent/KR102618437B1/ko active IP Right Grant
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013258555A (ja) * | 2012-06-12 | 2013-12-26 | Sony Computer Entertainment Inc | Head-mounted display, biological information management device, and biological information display method |
JP2016126500A (ja) | 2014-12-26 | 2016-07-11 | KDDI Corporation | Wearable terminal device and program |
Also Published As
Publication number | Publication date |
---|---|
CN116509348A (zh) | 2023-08-01 |
JP6325154B1 (ja) | 2018-05-16 |
CN110868930A (zh) | 2020-03-06 |
EP3636156A1 (en) | 2020-04-15 |
KR20200019673A (ko) | 2020-02-24 |
EP3636156A4 (en) | 2021-03-17 |
CN116548937A (zh) | 2023-08-08 |
CN110868930B (zh) | 2023-05-30 |
US20200133394A1 (en) | 2020-04-30 |
CA3066704A1 (en) | 2018-12-13 |
KR102618437B1 (ko) | 2023-12-28 |
JP2018202012A (ja) | 2018-12-27 |
US11816264B2 (en) | 2023-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6325154B1 (ja) | Information processing system | |
WO2018225839A1 (ja) | Database construction method | |
US20220130556A1 (en) | Health management apparatus and health management system | |
US20130172693A1 (en) | Diagnosing system for consciousness level measurement and method thereof | |
KR20150112423A (ko) | Virtual hospital system, method and device for creating a virtual hospital, and method of providing medical services using the same |
JP2019058227A (ja) | IoT measuring instrument, health SNS platform, and health service system |
JP7311118B2 (ja) | Emotion estimation method and emotion estimation system |
Jonas et al. | Designing a wearable IoT-based bladder level monitoring system for neurogenic bladder patients | |
CN109585027A (zh) | Medication recommendation method and device based on remote medical consultation |
Baig | Smart vital signs monitoring and novel falls prediction system for older adults | |
US20230066883A1 (en) | Predicting health or disease from user captured images or videos | |
EP4367609A1 (en) | Integrative system and method for performing medical diagnosis using artificial intelligence | |
JP7119755B2 (ja) | Health management device, health management method, and program |
WO2023171162A1 (ja) | Psychological state estimation device and psychological state estimation method |
O’Hara et al. | Introduction to special issue on body tracking and healthcare | |
JP7435965B2 (ja) | Information processing device, information processing method, learning model generation method, and program |
KR20230163822A (ko) | Non-face-to-face disease diagnosis method and device |
Guffey | Smartphone application self-tracking use and health | |
Mishra et al. | IoT ML Driven Holistic Health Monitoring and Fitness Assessment Empowering Proactive Wellbeing Management |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18813562 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 3066704 Country of ref document: CA |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 20207000382 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2018813562 Country of ref document: EP Effective date: 20200107 |