WO2022065154A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2022065154A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
behavior
score
behavior pattern
Application number
PCT/JP2021/033904
Other languages
French (fr)
Japanese (ja)
Inventor
規 高田
秀生 鶴
哲也 諏訪
翔平 大段
隆幸 菅原
Original Assignee
JVCKENWOOD Corporation (株式会社JVCケンウッド)
Priority claimed from JP2020160161A external-priority patent/JP2022053365A/en
Priority claimed from JP2020160244A external-priority patent/JP2022053411A/en
Priority claimed from JP2020160246A external-priority patent/JP2022053413A/en
Priority claimed from JP2020160245A external-priority patent/JP2022053412A/en
Application filed by JVCKENWOOD Corporation (株式会社JVCケンウッド)
Priority to EP21872284.1A priority Critical patent/EP4202820A1/en
Priority to CN202180058932.XA priority patent/CN116194914A/en
Publication of WO2022065154A1 publication Critical patent/WO2022065154A1/en
Priority to US18/187,816 priority patent/US20230222884A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 6/00 Tactile signalling systems, e.g. personal calling systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 99/00 Subject matter not provided for in other groups of this subclass
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 5/346 Analysis of electrocardiograms
    • A61B 5/349 Detecting specific parameters of the electrocardiograph cycle
    • A61B 5/352 Detecting R peaks, e.g. for synchronising diagnostic apparatus; Estimating R-R interval
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4029 Detecting, measuring or recording for evaluating the nervous system for evaluating the peripheral nervous systems
    • A61B 5/4035 Evaluating the autonomic nervous system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F 18/24133 Distances to prototypes
    • G06F 18/24137 Distances to cluster centroïds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management
    • G06Q 10/109 Time management, e.g. calendars, reminders, meetings or time accounting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • This disclosure relates to information processing devices, information processing methods, and programs.
  • Patent Document 1 describes a mobile phone device that detects user acceleration information and controls an operation mode using the detected acceleration information.
  • Patent Document 2 describes a motion information measurement system that measures motion information of a desired portion of a user and recognizes a motion state of the whole body.
  • The information processing apparatus according to one aspect of the present disclosure includes: a behavioral state sensor that detects behavioral state information related to the user's behavioral state; a behavior pattern information generation unit that, based on the behavioral state information, generates behavior pattern information in a multidimensional space whose coordinate axes are at least the parameters of the date and time, the place, and the time at which the behavioral state was detected; a behavior score calculation unit that calculates, as a behavior score, information about the size of the space containing a behavior pattern information group; and a behavior pattern specifying unit that specifies a behavior pattern information group whose behavior score is equal to or higher than a predetermined value as a behavior pattern of the user.
  • The information processing apparatus according to one aspect of the present disclosure includes: a behavioral state sensor that detects behavioral state information regarding the user's behavioral state; a biological sensor that detects biological information of the user; an autonomic nerve activity calculation unit that calculates the autonomic nerve activity of the user based on the biological information; and an output control unit that changes the intensity of output from an output unit according to the intensity of the autonomic nerve activity.
  • The information processing apparatus according to one aspect of the present disclosure includes: a behavioral state sensor that detects behavioral state information regarding the user's behavioral state; a biological sensor that detects biological information of the user; an autonomic nerve activity calculation unit that calculates the autonomic nerve activity of the user based on the biological information; and an autonomic nerve activity correction unit that corrects the autonomic nerve activity based on the country or region in which the behavior pattern of the user is specified.
  • The information processing method according to one aspect of the present disclosure includes: a step of detecting behavioral state information regarding a user's behavioral state; a step of generating, based on the behavioral state information, behavior pattern information in a multidimensional space whose coordinate axes are at least the parameters of the date and time, the place, and the time at which the behavioral state was detected; a step of calculating, as a behavior score, information about the size of the space containing a behavior pattern information group; and a step of specifying a behavior pattern information group existing in a space whose behavior score is equal to or higher than a predetermined value as a behavior pattern of the user.
  • The information processing method according to one aspect of the present disclosure includes: a step of detecting behavioral state information regarding the user's behavioral state; a step of detecting biological information of the user; a step of calculating the autonomic nerve activity of the user based on the biological information; and a step of changing the intensity of the output from the output unit according to the intensity of the autonomic nerve activity.
  • The information processing method according to one aspect of the present disclosure includes: a step of detecting behavioral state information regarding the user's behavioral state; a step of detecting biological information of the user; a step of calculating the autonomic nerve activity of the user based on the biological information; and a step of correcting the autonomic nerve activity based on the country or region in which the behavior pattern of the user is specified.
  • The program according to one aspect of the present disclosure causes a computer to execute: a step of detecting behavioral state information regarding a user's behavioral state; a step of generating, based on the behavioral state information, behavior pattern information in a multidimensional space whose coordinate axes are at least the parameters of the date and time, the place, and the time at which the behavioral state was detected; a step of calculating, as a behavior score, information about the size of the space containing a behavior pattern information group; and a step of specifying a behavior pattern information group whose behavior score is equal to or higher than a predetermined value as a behavior pattern of the user.
  • The program according to one aspect of the present disclosure causes a computer to execute: a step of detecting behavioral state information regarding the user's behavioral state; a step of detecting biological information of the user; a step of calculating the autonomic nerve activity of the user based on the biological information; and a step of changing the intensity of the output from the output unit according to the intensity of the autonomic nerve activity.
  • The program according to one aspect of the present disclosure causes a computer to execute: a step of detecting behavioral state information regarding the user's behavioral state; a step of detecting biological information of the user; a step of calculating the autonomic nerve activity of the user based on the biological information; and a step of correcting the autonomic nerve activity based on the country or region in which the behavior pattern of the user is specified.
  • According to the present disclosure, the behavior pattern of the user can be specified based on the information regarding the behavioral state of the user.
  • FIG. 1 is a schematic diagram schematically showing an information processing apparatus according to the first embodiment.
  • FIG. 2 is a block diagram showing a configuration example of the information processing apparatus according to the first embodiment.
  • FIG. 3 is a flowchart showing an example of the processing flow of the information processing apparatus according to the first embodiment.
  • FIG. 4 is a diagram for explaining a multidimensional space that generates behavior pattern information.
  • FIG. 5 is a diagram for explaining a format for storing an action pattern.
  • FIG. 6 is a flowchart showing an example of the processing flow of the information processing apparatus according to the second embodiment.
  • FIG. 7 is a diagram for explaining daily behavior and extraordinary behavior.
  • FIG. 8 is a block diagram showing a configuration example of the information processing apparatus according to the third embodiment.
  • FIG. 9 is a flowchart showing an example of the processing flow of the information processing apparatus according to the third embodiment.
  • FIG. 10 is a graph showing an example of a pulse wave.
  • FIG. 11 is a diagram for explaining a format for storing an action pattern.
  • FIG. 12 is a block diagram showing a configuration example of the information processing apparatus according to the fourth embodiment.
  • FIG. 13 is a diagram for explaining an example of correction data.
  • FIG. 14 is a flowchart showing an example of the processing flow of the information processing apparatus according to the fourth embodiment.
  • FIG. 15 is a diagram for explaining an example of correction data.
  • FIG. 16 is a flowchart showing an example of the processing flow of the information processing apparatus according to the fifth embodiment.
  • FIG. 17 is a block diagram showing a configuration example of the information processing apparatus according to the sixth embodiment.
  • FIG. 18A is a diagram for explaining an example of user history data.
  • FIG. 18B is a diagram for explaining an example of user history data.
  • FIG. 18C is a diagram for explaining an example of user history data.
  • FIG. 19 is a flowchart showing an example of the flow of the learning process according to the sixth embodiment.
  • FIG. 20 is a flowchart showing an example of the processing flow of the information processing apparatus according to the sixth embodiment.
  • FIG. 21 is a diagram for explaining a configuration example of the information processing system according to the seventh embodiment.
  • FIG. 22 is a block diagram showing a configuration example of the server device according to the seventh embodiment.
  • FIG. 23 is a flowchart showing an example of the processing flow of the server device according to the seventh embodiment.
  • FIG. 24 is a flowchart showing an example of the flow of the learning process according to the seventh embodiment.
  • FIG. 25 is a flowchart showing an example of the processing flow of the information processing apparatus according to the seventh embodiment.
  • FIG. 26 is a block diagram showing a configuration example of the information processing apparatus according to the eighth embodiment.
  • FIG. 27A is a diagram for explaining a method of relatively displaying the temporal transition of the activity score.
  • FIG. 27B is a diagram for explaining a method of relatively displaying the temporal transition of the activity score.
  • FIG. 27C is a diagram for explaining a method of relatively displaying the temporal transition of the activity score.
  • FIG. 27D is a diagram for explaining a method of relatively displaying the temporal transition of the activity score.
  • FIG. 27E is a diagram for explaining a method of relatively displaying the temporal transition of the activity score.
  • FIG. 28 is a diagram for explaining a method of outputting information showing the temporal transition of the activity score.
  • FIG. 1 is a schematic diagram of an information processing apparatus according to the first embodiment.
  • the information processing apparatus 10 is a so-called wearable device worn on the body of the user U.
  • the information processing device 10 includes a device 10A worn on the eyes of the user U, a device 10B worn on the ears of the user U, and a device 10C worn on the arm of the user U.
  • The device 10A worn on the eyes of the user U includes a display unit 26A, described later, that outputs a visual stimulus to the user U (displays an image); the device 10B worn on the ears of the user U includes a later-described audio output unit 26B that outputs an auditory stimulus (voice) to the user U; and the device 10C worn on the arm of the user U includes a later-described tactile stimulus output unit 26C that outputs a tactile stimulus to the user U.
  • the configuration of FIG. 1 is an example, and the number of devices and the mounting position on the user U may be arbitrary.
  • the information processing device 10 is not limited to a wearable device, and may be a device carried by the user U, for example, a so-called smartphone or tablet terminal.
  • FIG. 2 is a block diagram showing a configuration example of the information processing apparatus according to the first embodiment.
  • the information processing apparatus 10 includes an action state sensor 20, an input unit 22, an output unit 24, a communication unit 26, a storage unit 28, and a control unit 30.
  • the behavioral state sensor 20 is a sensor that detects behavioral state information regarding the behavioral state of the user U wearing the information processing device 10.
  • the behavior state information of the user U may include various information regarding the behavior of the user U.
  • the action state information of the user U may include at least information regarding the physical movement of the user U, the date and time of the action, the place of the action, and the time of the action.
  • the behavioral state sensor 20 includes a camera 20A, a microphone 20B, a GNSS receiver 20C, an acceleration sensor 20D, a gyro sensor 20E, an optical sensor 20F, a temperature sensor 20G, and a humidity sensor 20H.
  • The behavioral state sensor 20 may include any sensors for detecting behavioral state information: for example, it may include the camera 20A, the microphone 20B, the GNSS receiver 20C, the acceleration sensor 20D, the gyro sensor 20E, the optical sensor 20F, the temperature sensor 20G, and the humidity sensor 20H, or it may include other sensors.
  • the camera 20A is an image pickup device, and captures the periphery of the information processing device 10 by detecting visible light around the information processing device 10 (user U) as behavioral state information.
  • the camera 20A may be a video camera that captures images at predetermined frame rates.
  • The position and orientation of the camera 20A in the information processing apparatus 10 are arbitrary; for example, the camera 20A may be provided in the device 10A shown in FIG. 1 with its imaging direction aligned with the direction in which the face of the user U is facing.
  • the camera 20A can image an object in the line of sight of the user U, that is, an object within the field of view of the user U.
  • the number of cameras 20A is arbitrary, and may be singular or plural. If there are a plurality of cameras 20A, the information in the direction in which the cameras 20A are facing is also acquired.
  • the microphone 20B is a microphone that detects voice (sound wave information) around the information processing device 10 (user U) as behavioral state information.
  • the position, orientation, number, and the like of the microphone 20B provided in the information processing apparatus 10 are arbitrary. If there are a plurality of microphones 20B, information in the direction in which the microphones 20B are facing is also acquired.
  • the GNSS receiver 20C is a device that detects the position information of the information processing device 10 (user U) as the action state information.
  • The position information here refers to Earth coordinates.
  • The GNSS receiver 20C is a so-called GNSS (Global Navigation Satellite System) module, which receives radio waves from satellites and detects the position information of the information processing device 10 (user U).
  • the acceleration sensor 20D is a sensor that detects the acceleration of the information processing device 10 (user U) as behavioral state information, and detects, for example, gravity, vibration, and impact.
  • The gyro sensor 20E is a sensor that detects the rotation and orientation of the information processing device 10 (user U) as behavioral state information, using the principles of the Coriolis force, the Euler force, the centrifugal force, and the like.
  • the optical sensor 20F is a sensor that detects the intensity of light around the information processing device 10 (user U) as behavioral state information.
  • the optical sensor 20F can detect the intensity of visible light, infrared rays, and ultraviolet rays.
  • the temperature sensor 20G is a sensor that detects the temperature around the information processing device 10 (user U) as behavioral state information.
  • the humidity sensor 20H is a sensor that detects the humidity around the information processing device 10 (user U) as behavioral state information.
  • the input unit 22 is a device that accepts user operations, and may be, for example, a touch panel.
  • the output unit 24 outputs the output result of the information processing device 10.
  • the output unit 24 includes, for example, a display unit 24A for displaying an image and an audio output unit 24B for outputting audio.
  • the display unit 24A is, for example, a so-called HMD (Head Mounted Display).
  • the audio output unit 24B is a speaker that outputs audio.
  • the communication unit 26 is a module that communicates with an external device or the like, and may include, for example, an antenna or the like.
  • the communication method by the communication unit 26 is wireless communication in this embodiment, but the communication method may be arbitrary.
  • The storage unit 28 is a memory that stores various information such as the calculation results and programs of the control unit 30, and includes at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive).
  • the learning model 28A and the map data 28B are stored in the storage unit 28.
  • The learning model 28A is an AI model used to specify the type of the behavior state of the user U based on the detected behavioral state information.
  • the map data 28B is data including position information of existing buildings and natural objects, and can be said to be data in which the earth coordinates and actual buildings and natural objects are associated with each other.
  • the processing using the learning model 28A, the map data 28B, and the like will be described later.
  • The learning model 28A, the map data 28B, and the program for the control unit 30 stored in the storage unit 28 may instead be stored in a recording medium readable by the information processing device 10. Further, they are not limited to being stored in advance in the storage unit 28; the information processing device 10 may acquire them from an external device by communication when using these data.
  • the control unit 30 controls the operation of each unit of the information processing device 10.
  • the control unit 30 is realized by, for example, using a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like to execute a program stored in a storage unit (not shown) using a RAM or the like as a work area.
  • the control unit 30 may be realized by, for example, an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 30 may be realized by a combination of hardware and software.
  • the control unit 30 includes an action state information acquisition unit 40, an action pattern information generation unit 42, an action score calculation unit 44, an action pattern specifying unit 46, a memory control unit 48, and an output control unit 50.
  • the behavior state information acquisition unit 40 controls the behavior state sensor 20 to cause the behavior state sensor 20 to detect the behavior state information of the user U.
  • the behavior state information acquisition unit 40 acquires the behavior state information detected by the behavior state sensor 20.
  • the behavior pattern information generation unit 42 generates behavior pattern information based on the behavior status information acquired by the behavior status information acquisition unit 40.
  • the behavior pattern information generation unit 42 generates behavior pattern information in a multidimensional space whose coordinate axes are at least the date and time, place, and time parameters when the behavior state of the user U is detected, based on the behavior state information, for example.
  • the behavior score calculation unit 44 calculates the behavior score based on the behavior pattern information generated by the behavior pattern information generation unit 42.
  • The behavior score calculation unit 44, for example, groups the collected behavior pattern information into behavior pattern information groups, one for each region of the space in which its density exceeds a predetermined density.
  • the behavior score calculation unit 44 calculates the behavior score of the behavior pattern information group based on the space including the grouped behavior pattern information group. Specifically, the behavior score calculation unit 44 calculates, for example, information regarding the size of the space including the behavior pattern information group as the behavior score.
  • the behavior pattern specifying unit 46 specifies the behavior pattern of the user U based on the behavior score calculated by the behavior score calculation unit 44.
  • The behavior pattern specifying unit 46 determines that the behavior corresponding to a behavior pattern information group whose behavior score value is less than a predetermined threshold value is a behavior pattern of the user U.
  • The behavior pattern specifying unit 46 identifies the type of action the user U was performing based on, for example, the image data, voice data, position information, acceleration information, posture information, infrared and ultraviolet intensity information, temperature information, humidity information, and the like acquired by the behavior state information acquisition unit 40.
  • the memory control unit 48 controls the storage unit 28 to perform storage. Information about the behavior pattern of the user U specified by the behavior pattern specifying unit 46 is stored in the storage unit 28.
  • the storage control unit 48 stores information about the behavior pattern of the user U specified by the behavior pattern specifying unit 46 in the storage unit 28 in a predetermined format. The predetermined format will be described later.
  • the output control unit 50 controls the output unit 24 to output.
  • the output control unit 50 controls, for example, the display unit 24A to display information regarding an action pattern.
  • the output control unit 50 controls, for example, the voice output unit 24B to output information regarding the behavior pattern by voice.
  • FIG. 3 is a flowchart showing an example of the processing flow of the information processing apparatus 10 according to the first embodiment.
  • the control unit 30 acquires the behavior state information regarding the behavior of the user U from the behavior state sensor 20 (step S10). Specifically, the behavior state information acquisition unit 40 acquires image data obtained by capturing the periphery of the information processing apparatus 10 (user U) from the camera 20A. The behavior state information acquisition unit 40 acquires voice data obtained by collecting voices around the information processing device 10 (user U) from the microphone 20B. The behavior state information acquisition unit 40 acquires the position information of the information processing device 10 (user U) from the GNSS receiver 20C. The behavior state information acquisition unit 40 acquires the acceleration information of the information processing device 10 (user U) from the acceleration sensor 20D. The behavior state information acquisition unit 40 acquires the posture information of the information processing device 10 (user U) from the gyro sensor 20E.
  • the behavior state information acquisition unit 40 acquires the intensity information of infrared rays and ultraviolet rays around the information processing apparatus 10 (user U) from the optical sensor 20F.
  • the behavior state information acquisition unit 40 acquires temperature information around the information processing device 10 (user U) from the temperature sensor 20G.
  • the behavior state information acquisition unit 40 acquires humidity information around the information processing device 10 (user U) from the humidity sensor 20H.
  • The behavior state information acquisition unit 40 sequentially acquires such information at predetermined intervals.
  • The behavior state information acquisition unit 40 may acquire each item of behavioral state information at the same timing or at different timings. Further, the predetermined period until the next acquisition may be set arbitrarily, and may be the same or different for each item of behavioral state information.
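  • A minimal sketch of such periodic acquisition, assuming a hypothetical sensor interface in which each sensor exposes a read() method; the per-sensor polling periods below are illustrative, not taken from the publication:

```python
import time

# Assumed per-sensor polling periods in seconds (illustrative values only).
SENSOR_PERIODS_S = {
    "camera": 1.0, "microphone": 0.1, "gnss": 5.0,
    "acceleration": 0.02, "gyro": 0.02,
    "optical": 1.0, "temperature": 60.0, "humidity": 60.0,
}

def poll_sensors(sensors, stop_after_s=10.0):
    """Collect behavioral state information, polling each sensor at its
    own predetermined interval, which may differ between sensors."""
    next_due = {name: 0.0 for name in sensors}
    log = []  # (elapsed seconds, sensor name, reading)
    start = time.monotonic()
    while (now := time.monotonic() - start) < stop_after_s:
        for name, sensor in sensors.items():
            if now >= next_due[name]:
                log.append((now, name, sensor.read()))
                next_due[name] = now + SENSOR_PERIODS_S[name]
        time.sleep(0.005)  # avoid busy-waiting
    return log
```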
  • The action of the user U may include, in addition to the physical movement of the body itself, three elements: the date and time when the action is performed, the place where the action is performed, and the time for which the action is performed at that place.
  • The behavior of the user U may include not only movements of the body of the user U but also activities such as "playing golf", "watching a movie", and "shopping". Even if the body movements of the user U acquired by the action state information acquisition unit 40 are the same, the action performed may differ if the position information differs.
  • the control unit 30 generates the behavior pattern information of the user U (step S11). Specifically, the behavior pattern information generation unit 42 generates the behavior pattern information of the user U based on the behavior state information of the user U acquired by the behavior state information acquisition unit 40.
  • the control unit 30 groups the behavior pattern information group (step S12). Specifically, the behavior pattern information generation unit 42 groups each space in which the density of the behavior pattern information group in which the behavior pattern information is collected exceeds a predetermined density.
  • the control unit 30 calculates the behavior score (step S13). Specifically, the behavior score calculation unit 44 calculates the distance from the center to the end of the space including the behavior pattern information group as the behavior score. In other words, the behavior score calculation unit 44 calculates the size of the space including the behavior pattern information group as the behavior score.
  • FIG. 4 is a diagram for explaining a multidimensional space that generates behavior pattern information.
  • FIG. 4 shows a three-dimensional space whose coordinate axes are date and time, place, and time.
  • In FIG. 4, the date and time axis runs from 0:00 to 24:00, the place axis is the one-dimensional straight-line distance from the home, and the time axis is the duration from the start of the action to its end; however, the present disclosure is not limited to this.
  • the location may be a name, an address, or the like.
  • the behavior pattern information generation unit 42 generates behavior pattern information by plotting points P at predetermined time intervals in the three-dimensional space shown in FIG.
  • The behavior pattern information generation unit 42 plots the points P at intervals of, for example, 1 minute, but the present disclosure is not limited to this. Suppose, for example, that the user U watches a movie for "2 hours" in "around area A2" starting at "around noon". In this case, the behavior pattern information generation unit 42 plots points P at predetermined intervals, parallel to the date and time axis, from the intersection of "around noon", "around area A2", and "2 hours" up to "around 14:00".
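  • A short sketch of this plotting step under the axes of FIG. 4 (hour of day, straight-line distance from home, and action duration); the concrete start time and the 3 km stand-in for "around area A2" are hypothetical:

```python
from datetime import datetime, timedelta

def plot_behavior_points(start, duration_min, distance_from_home_km,
                         interval_min=1):
    """Plot points P for one action at fixed intervals, parallel to the
    date-and-time axis, in the (date and time, place, time) space."""
    points = []
    for i in range(0, duration_min + 1, interval_min):
        t = start + timedelta(minutes=i)
        points.append((t.hour + t.minute / 60.0,  # date-and-time axis (hour of day)
                       distance_from_home_km,      # place axis
                       duration_min / 60.0))       # time (duration) axis, in hours
    return points

# e.g. watching a movie for about 2 hours from around noon, ~3 km from home
pts = plot_behavior_points(datetime(2021, 9, 15, 12, 0), 120, 3.0)
```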
  • The behavior pattern specifying unit 46 can specify the behavior based on the image data, audio data, position information, acceleration information, posture information, infrared and ultraviolet intensity information, temperature information, humidity information, and the like around the user U. In the case of "purchasing groceries" as the action of the user U, there may usually be multiple candidate grocery stores, such as supermarkets and underground shopping malls of department stores.
  • In the three-dimensional space shown in FIG. 4, the behavior pattern information generation unit 42 groups, as the same behavior pattern, the points P gathered in a space where the density of the generated behavior pattern information group exceeds a predetermined density.
  • the behavior pattern information generation unit 42 scans the space S in which the behavior pattern information can be generated, for example, by using the unit space US.
  • the behavior pattern information generation unit 42 counts, for example, the number of points P included in the unit space US at each location in the space S.
  • the behavior pattern information generation unit 42 specifies, for example, a location having the largest number of points P included in the unit space US as the center of the behavior pattern information.
  • The behavior pattern information generation unit 42 then counts the number of points P in the unit spaces US around the unit space US with the largest count, and specifies as the same group G the unit spaces US containing points P down to about 60% of that largest count.
  • the action score calculation unit 44 calculates, for example, the length of the perpendicular line drawn from the center of the group G to any surface as the action score.
  • the action score calculation unit 44 calculates, for example, the volume of the group G as the action score.
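  • The unit-space scan, the roughly-60% grouping rule, and the volume-based behavior score described above could be sketched as follows; as a simplification, cells anywhere in the space holding at least 60% of the peak count are grouped, without enforcing that they surround the densest cell, and the cell size is illustrative:

```python
from collections import Counter

def group_and_score(points, cell=(1.0, 1.0, 0.25)):
    """Scan the space with a unit space (cell), find the densest cell as the
    center of the behavior pattern information, group cells holding at least
    about 60% of the peak count, and return the group's volume as the
    behavior score (smaller score = tighter cluster = more reliable)."""
    def cell_of(p):
        return tuple(int(p[i] // cell[i]) for i in range(3))

    counts = Counter(cell_of(p) for p in points)
    if not counts:
        return None, 0.0
    center, peak = counts.most_common(1)[0]
    group = {c for c, n in counts.items() if n >= 0.6 * peak}
    behavior_score = len(group) * (cell[0] * cell[1] * cell[2])
    return center, behavior_score
```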
  • the behavior pattern information generation unit 42 may specify a plurality of groups G in the three-dimensional space shown in FIG.
  • The behavior of the user U also includes infrequent purchasing behavior, for example "going to the dealer" to "purchase a car" or "going to the home center" to "purchase a chair". Such actions are considered to be plotted at scattered places, times, and dates in the three-dimensional space shown in FIG. 4. However, if, for example, the user U does "gymnastics" for "about 10 minutes" in a nearby "park" at 6 a.m. every day, the points plotted in the three-dimensional space will crowd together day by day. Even within the same park, the exact spot where the gymnastics is done may vary from day to day, and the duration may vary by about ±2 minutes around 10 minutes.
  • In this way, the behavior pattern information generation unit 42 can plot and generate behavior pattern information that is biased into a relatively large dense cluster in the three-dimensional space. By selecting relatively simple parameters such as date and time, place, and time as the axes, a dense cluster in the three-dimensional space, together with its characteristic fluctuations, can be treated as one behavior pattern.
  • the behavior pattern of the user U is shown in a three-dimensional space, but the present disclosure is not limited to this.
  • the behavior pattern of the user U may be generated in any multidimensional space.
  • the place is "straight line distance from home", but the place may be "latitude” and "longitude”.
  • the space in which the behavior pattern information is generated is a four-dimensional space.
  • the date and time are set to "0:00 to 24:00" to indicate one day, an axis having a scalar value of 7 units may be added as the "day of the week" to form a five-dimensional space.
  • the plot of the behavior pattern information group may change significantly between Monday and Friday and Saturday and Sunday. From the viewpoint of the axis of "day of the week", it becomes easy to distinguish between the behavior pattern of daily behavior and the behavior pattern of extraordinary behavior.
  • the behavior patterns of daily behavior and extraordinary behavior will be described later.
  • In the above description, the time is the duration from the start to the end of the action, but the present disclosure is not limited to this.
  • When the behavior is an intermittent, repeated event, such as "200 swings of the bat" or "how many steps were walked (or run)", a frequency (count) may be used instead of the time.
  • If the user U has a habit of exercising regularly, replacing the time with a frequency makes it more likely that all such behavior patterns can be grasped and displayed under a single "exercise" (movement) parameter, which is interesting data for the user U.
  • the control unit 30 determines whether or not the behavior score is less than the threshold value (step S14). Specifically, the behavior pattern specifying unit 46 determines whether or not the behavior score calculated by the behavior score calculation unit 44 in step S13 is less than a predetermined threshold value. If it is determined that the value is less than the threshold value (step S14; Yes), the process proceeds to step S15. If it is determined that the value is not less than the threshold value (step S14; No), the process proceeds to step S18.
  • The control unit 30 specifies an action pattern (step S15). Specifically, the behavior pattern specifying unit 46 identifies the behavior corresponding to the behavior pattern information group whose behavior score is less than the predetermined threshold value as the behavior pattern of the user U.
  • the control unit 30 identifies the type of behavioral state of the specified behavioral pattern (step S16). Specifically, the behavior pattern specifying unit 46 may identify the type of behavior state performed by the user U based on the behavior state information acquired by the behavior state information acquisition unit 40.
  • the behavior pattern specifying unit 46 may specify the behavior state of the user U by using, for example, the learning model 28A.
  • The learning model 28A is an AI (Artificial Intelligence) model built by learning from a plurality of data sets, where one data set pairs a detection result of the behavior state sensor 20 with information indicating the type of behavior state that the detection result indicates.
  • The behavior pattern specifying unit 46 inputs the detection result of the behavior state sensor 20 into the trained learning model 28A, acquires the information indicating the type of behavior state indicated by that detection result, and thereby identifies the type of behavior state.
  • the behavior pattern specifying unit 46 uses the learned learning model 28A to specify, for example, that the user U is playing golf, shopping, being in a movie theater, or the like.
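  • The inference step with the learned model 28A might look like the following sketch; the feature set and the scikit-learn-style predict() interface are assumptions, since the publication does not specify the model's inputs or API:

```python
# Assumed label set; the text mentions 16 to 64 predictable behavior types.
BEHAVIOR_LABELS = ["playing golf", "shopping", "watching a movie", "doing gymnastics"]

def identify_behavior_type(model, sensor_readings):
    """Feed a feature vector built from the behavior state sensor 20's
    detection results to the learned model and return the behavior type."""
    features = [
        sensor_readings["acceleration_rms"],    # hypothetical feature names
        sensor_readings["ambient_light_lux"],
        sensor_readings["temperature_c"],
        sensor_readings["distance_from_home_km"],
    ]
    class_index = model.predict([features])[0]  # scikit-learn-style call
    return BEHAVIOR_LABELS[class_index]
```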
  • the control unit 30 stores the action pattern in the storage unit 28 (step S17). Specifically, the memory control unit 48 records the behavior pattern specified in step S16 in a predetermined format.
  • FIG. 5 is a diagram for explaining a format for storing an action pattern.
  • A behavior pattern can be associated in advance with a place, with, for example, 16 to 64 types of behavior predictable from places: a "golf course" predicts "playing golf", a "movie theater" predicts "watching a movie", a "shopping center" predicts "shopping", and a "park" predicts "doing gymnastics".
  • Since the position information of the user U can be acquired from the GNSS receiver 20C of the behavior state sensor 20, the places visited by the user U can be registered over several weeks. Thereby, in the present embodiment, candidates for behavior patterns related to the lifestyle of the user U can be numbered in order.
  • The storage format F1 may include an area D1, an area D2, an area D3, an area D4, and an area D5.
  • In the area D1, a numerical value numbering the specified behavior pattern is stored.
  • The area D1 is composed of, for example, 3 bytes.
  • In the area D2, the number of dimensions of the space in which the behavior pattern information is plotted as a group is stored.
  • The area D2 is composed of, for example, 1 byte. In this case, the space can have up to 255 dimensions.
  • In the area D3, the behavior score R of the behavior pattern information group determined to be the behavior pattern of the user U is stored.
  • The behavior score R reflects the fluctuation (judgment error) in determining the behavior pattern; the smaller the value of the behavior score R, the higher the reliability of the behavior pattern.
  • The area D4 is a reserve area.
  • In the area D5, which is the last bit of the last 1-byte portion of the reserve area, an identifier indicating whether the behavior pattern is daily or extraordinary is stored.
  • A 0 is written for a behavior pattern of daily behavior, described later, and a 1 is written for a behavior pattern of extraordinary behavior.
  • The reserve area can be used when ancillary information is to be added for each occurrence of a behavior pattern.
  • The reserve area has, for example, an area of 6 bytes or more, and numerical information corresponding to each of the N dimensions (N is an arbitrary integer) may be written in it.
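  • A packing sketch for the storage format F1: the text fixes D1 at 3 bytes, D2 at 1 byte, a reserve area of 6 bytes or more, and the daily/extraordinary flag in the last bit of the reserve area's final byte, but it does not give the width of the score field D3, so a 4-byte float is assumed here:

```python
import struct

def pack_behavior_pattern(pattern_no, dims, score, extraordinary,
                          reserve=b"\x00" * 6):
    """Pack one behavior pattern record in storage format F1."""
    assert 0 <= pattern_no < 2**24 and 0 <= dims <= 255 and len(reserve) >= 6
    d1 = pattern_no.to_bytes(3, "big")   # D1: numbered behavior pattern
    d2 = bytes([dims])                   # D2: number of dimensions (up to 255)
    d3 = struct.pack(">f", score)        # D3: behavior score R (assumed float32)
    d4d5 = bytearray(reserve)            # D4/D5: reserve area
    # D5: last bit of the last reserve byte, 0 = daily, 1 = extraordinary.
    d4d5[-1] = (d4d5[-1] & 0xFE) | int(extraordinary)
    return d1 + d2 + d3 + bytes(d4d5)
```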
  • the control unit 30 determines whether or not there is another group (step S18). Specifically, the behavior score calculation unit 44 determines whether or not there is a grouped behavior pattern information group for which the behavior score should be calculated. If it is determined that there is another group (step S18; Yes), the process proceeds to step S13. If it is determined that there is no other group (step S18; No), the process proceeds to step S19.
  • the control unit 30 determines whether or not to end the process (step S19). Specifically, the control unit 30 determines that the process is terminated when it receives an operation to end the process, or when it receives an operation to turn off the power. If it is determined that the process is not completed (step S19; No), the process proceeds to step S10. When it is determined to end the process (step S19; Yes), the process of FIG. 3 is terminated.
  • As described above, the information processing apparatus 10 according to the first embodiment detects the behavioral state of the user U and generates a behavior pattern information group corresponding to the behavioral state in a multidimensional space. Thereby, the information processing apparatus 10 according to the first embodiment can specify the behavior pattern of the user U based on the behavior pattern information group.
  • FIG. 6 is a flowchart showing an example of the processing flow of the information processing apparatus according to the second embodiment. Since the configuration of the information processing apparatus according to the second embodiment is the same as that of the information processing apparatus 10 shown in FIG. 2, the description thereof will be omitted.
  • The information processing apparatus 10 according to the second embodiment determines whether the identified behavior pattern of the user U is a daily (routine) behavior or an extraordinary (non-routine) behavior.
  • FIG. 7 is a diagram for explaining daily behavior and extraordinary behavior.
  • the space SA shows, for example, a range grouped as the same behavior pattern.
  • the space SB has, for example, the same center as the space SA, and its volume is about 60% of the space SA.
  • the behavior pattern information included in the space SB is a behavior pattern in which the behavior score is less than the first threshold value.
  • the behavior pattern whose behavior score is less than the first threshold value is determined to be the behavior pattern of daily behavior.
  • the behavior pattern corresponding to the point P1 in the space SB is determined to be the behavior pattern of daily behavior.
  • the behavior pattern included in the space between the space SA and the space SB is a behavior pattern in which the behavior score is equal to or more than the first threshold and less than the second threshold.
  • the behavior pattern having the behavior score of the first threshold value or more and less than the second threshold value is determined to be the behavior pattern of the extraordinary behavior.
  • the behavior pattern corresponding to the point P2 in the space between the space SA and the space SB is determined to be the behavior pattern of extraordinary behavior.
  • the behavior pattern included in the space outside the space SA is an behavior pattern in which the behavior score is equal to or higher than the second threshold value.
  • A behavior pattern whose behavior score is equal to or higher than the second threshold value is excluded from the targets and is not included among the behavior patterns of the user.
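  • The two-threshold rule above reduces to a short classification step; the numeric thresholds are inputs, since the publication does not fix their values:

```python
def classify_behavior(score, first_threshold, second_threshold):
    """Classify a behavior pattern by its behavior score."""
    if score < first_threshold:
        return "daily"          # e.g. point P1 inside space SB
    if score < second_threshold:
        return "extraordinary"  # e.g. point P2 between spaces SB and SA
    return None                 # at or beyond the second threshold: excluded
```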
  • Since the processing of steps S20 to S23 is the same as the processing of steps S10 to S13 shown in FIG. 3, the description thereof will be omitted.
  • the control unit 30 determines whether or not the behavior score is less than the first threshold value (step S24). Specifically, the behavior pattern specifying unit 46 determines whether or not the behavior score calculated by the behavior score calculation unit 44 in step S23 is less than a predetermined first threshold value. If it is determined that the threshold value is less than the first threshold value (step S24; Yes), the process proceeds to step S25. If it is determined that the threshold value is not less than the first threshold value (step S24; No), the process proceeds to step S28.
  • The control unit 30 specifies a behavior pattern of daily behavior (step S25). Specifically, the behavior pattern specifying unit 46 identifies the behavior corresponding to the behavior pattern information group whose behavior score is less than the predetermined first threshold value as the behavior pattern of the daily behavior of the user U.
  • the control unit 30 identifies the type of behavioral state of the behavioral pattern of the specified daily behavior (step S26). Specifically, the behavior pattern specifying unit 46 may identify the type of the behavior state of the daily activity performed by the user U based on the behavior state information acquired by the behavior state information acquisition unit 40.
  • the control unit 30 stores the behavior pattern of daily activities in the storage unit 28 (step S27). Specifically, the memory control unit 48 records the behavior pattern of the daily behavior specified in step S25 in a predetermined format.
  • The control unit 30 determines whether or not the behavior score is equal to or greater than the first threshold value and less than the second threshold value (step S28). Specifically, the behavior pattern specifying unit 46 determines whether or not the behavior score calculated by the behavior score calculation unit 44 in step S23 is equal to or greater than the predetermined first threshold value and less than the second threshold value. If it is determined that the behavior score is equal to or greater than the first threshold value and less than the second threshold value (step S28; Yes), the process proceeds to step S29. If not (step S28; No), the process proceeds to step S32.
  • The control unit 30 specifies a behavior pattern of extraordinary behavior (step S29). Specifically, the behavior pattern specifying unit 46 identifies the behavior corresponding to the behavior pattern information group whose behavior score is equal to or greater than the predetermined first threshold value and less than the second threshold value as the behavior pattern of the extraordinary behavior of the user U.
  • the control unit 30 identifies the type of behavioral state of the behavioral pattern of the specified extraordinary behavior (step S30). Specifically, the behavior pattern specifying unit 46 may identify the type of behavioral state of the extraordinary behavior performed by the user U based on the behavioral state information acquired by the behavioral state information acquisition unit 40.
  • The control unit 30 stores the behavior pattern of the extraordinary behavior in the storage unit 28 (step S31). Specifically, the memory control unit 48 records the behavior pattern of the extraordinary behavior specified in step S29 in a predetermined format.
  • Since the processing of steps S32 and S33 is the same as the processing of steps S18 and S19 shown in FIG. 3, the description thereof will be omitted.
  • the information processing apparatus 10 determines whether the behavior pattern is a daily behavior or an extraordinary behavior based on the behavior score. Thereby, the information processing apparatus 10 according to the second embodiment can determine whether the same action is a daily routine or a new action.
  • The discrimination between daily behavior and extraordinary behavior can be used, for example, to judge whether a behavior pattern occurring during weekday commuting time is a routine behavior or an intentionally performed behavior, that is, whether the user takes an active interest in it. For example, for a user who commutes between the same two stations at a fixed time every day, walking toward the station at that time is routine and is not an active behavior of interest; statistically, such behavior pattern data can be ignored in calculations.
  • FIG. 8 is a block diagram showing a configuration example of the information processing apparatus according to the third embodiment.
  • The information processing apparatus 10a according to the third embodiment differs from the information processing apparatus 10 in that it includes a biological sensor 32 and in that its control unit 30a includes a biological information acquisition unit 52 and an activity score calculation unit 54.
  • the information processing apparatus 10a calculates an activity score indicating the degree to which the user U enjoys the action.
  • the biosensor 32 is a sensor that detects the biometric information of the user U.
  • the biosensor 32 may be provided at any position as long as it can detect the biometric information of the user U.
  • The biological information here is preferably not immutable information such as a fingerprint but, for example, information whose value changes according to the state of the user U.
  • the biometric information here is information about the autonomic nerve of the user U, that is, information whose value changes regardless of the intention of the user U.
  • the biological sensor 32 includes the pulse wave sensor 32A and detects the pulse wave of the user U as biological information.
  • the biosensor 32 may include an electroencephalogram sensor that detects the electroencephalogram of the user U.
  • the pulse wave sensor 32A is a sensor that detects the pulse wave of the user U.
  • the pulse wave sensor 32A may be, for example, a transmissive photoelectric sensor including a light emitting unit and a light receiving unit.
  • In the pulse wave sensor 32A, for example, the light emitting portion and the light receiving portion face each other with the fingertip of the user U interposed between them; the light receiving portion receives the light transmitted through the fingertip, and the pulse waveform may be measured by utilizing the fact that the greater the pressure of the pulse wave, the greater the blood flow.
  • However, the pulse wave sensor 32A is not limited to this, and may use any method capable of detecting a pulse wave.
  • the biometric information acquisition unit 52 controls the biometric sensor 32 to cause the biometric sensor 32 to detect biometric information.
  • the biological information acquisition unit 52 acquires the biological information detected by the biological sensor 32.
  • The activity score calculation unit 54 calculates the autonomic nerve activity level based on the biological information acquired by the biological information acquisition unit 52. The method of calculating the autonomic nerve activity will be described later.
  • the activity score calculation unit 54 calculates the activity score.
  • the activity score calculation unit 54 calculates the activity score based on the behavior score calculated by the behavior score calculation unit 44, the behavior pattern specified by the behavior pattern specifying unit 46, and the autonomic nerve activity level.
  • FIG. 9 is a flowchart showing an example of the processing flow of the information processing apparatus 10a according to the third embodiment.
  • Since the process of step S40 is the same as the process of step S10 shown in FIG. 3, the description thereof will be omitted.
  • the control unit 30a acquires the biometric information of the user U (step S41). Specifically, the biological information acquisition unit 52 controls the pulse wave sensor 32A of the biological sensor 32 to acquire the pulse wave information of the user U.
  • In the present embodiment, the autonomic nerve activity, which serves as a guideline indicating the degree of stress, relaxation, interest, and concentration in the psychological state of the user U, is calculated using the pulse wave information of the user U.
  • Since the processes of steps S42 to S47 are the same as the processes of steps S12 to S17 shown in FIG. 3, the description thereof will be omitted.
  • the control unit 30a calculates the activity score (step S48). Specifically, the activity score calculation unit 54 calculates the activity score of the user U based on the pulse wave information acquired in step S41.
  • FIG. 10 is a graph showing an example of a pulse wave.
  • the pulse wave is a waveform in which a peak called R wave WR appears at predetermined time intervals.
  • The pulse beat is caused by the spontaneous firing of pacemaker cells in the sinus node of the heart.
  • the pulse rhythm is strongly influenced by both the sympathetic and parasympathetic nerves.
  • Sympathetic nerves promote cardiac activity.
  • Parasympathetic nerves suppress cardiac activity.
  • The sympathetic nerve and the parasympathetic nerve act in opposition to each other; at or near rest, the parasympathetic nerve becomes dominant.
  • the pulse rate increases when sympathetic excitation causes secretion of adrenaline and decreases when parasympathetic excitation causes secretion of acetylcholine.
  • the RR interval is the interval between R waves WR that are consecutive in a time series. Heart rate variability is measured with the R wave, which is the apex of the QRS complex of the signal waveform, as one beat.
  • the fluctuation of the R wave interval of the electrocardiogram, that is, the fluctuation of the time interval of the RR interval shown in FIG. 10, is used as an autonomic nerve index.
  • the validity of using the time interval fluctuation of the RR interval as an autonomic nerve index has been reported by many medical institutions. Fluctuations in the RR interval increase at rest and decrease during stress.
  • one component of this fluctuation is a low-frequency component that appears near 0.1 Hz, which is derived from the modulation of sympathetic nervous system activity associated with the feedback regulation of blood pressure in the blood vessels.
  • the other is a high-frequency component tuned to respiration, which reflects respiratory sinus arrhythmia.
  • the high-frequency component reflects the direct interference of the vagal preganglionic neurons by the respiratory center, as well as the reflexes of lung stretch receptors and the baroreceptor reflexes of blood pressure changes due to respiration, and is regarded as a parasympathetic index that mainly affects the heart.
  • it can be said that the power spectrum of the low-frequency component indicates the activity of the sympathetic nerve, and the power spectrum of the high-frequency component indicates the activity of the parasympathetic nerve.
  • the fluctuation of the input pulse wave is obtained as the differential value of the RR interval values.
  • the activity score calculation unit 54 converts the data into time-series data at equal intervals by using cubic spline interpolation or the like.
  • the activity score calculation unit 54 performs orthogonal transformation of the differential value of the RR interval by fast Fourier transform or the like.
  • the activity score calculation unit 54 calculates the power spectrum of the high frequency component of the differential value of the RR interval value of the pulse wave and the power spectrum of the low frequency component.
  • the activity score calculation unit 54 calculates the sum of the power spectra of the high frequency components as RRHF.
  • the activity score calculation unit 54 calculates the sum of the power spectra of the low frequency components as RRLF.
  • the activity score calculation unit 54 calculates the autonomic nerve activity using the formula (1).
  • the activity score calculation unit 54 may be referred to as an autonomic nerve activity calculation unit.
  • AN is the autonomic nervous activity
  • RRHF is the sum of the power spectra of the high frequency components
  • RRLF is the sum of the power spectra of the low frequency components.
  • C1 and C2 are fixed values defined in order to suppress the divergence of the solution of AN.
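  • as a concrete illustration of the processing above, the following is a minimal sketch in Python of the path from RR intervals to the autonomic nerve activity AN. The exact form of formula (1) is not reproduced in this text, so the ratio AN = (RRHF + C1) / (RRLF + C2) is purely an assumed form showing how RRHF, RRLF, C1, and C2 could combine; the band limits are the conventional heart rate variability LF/HF bands, which the text does not specify.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def autonomic_activity(rr_times, rr_intervals, c1=1.0, c2=1.0, fs=4.0,
                       lf_band=(0.04, 0.15), hf_band=(0.15, 0.40)):
    # Fluctuation of the pulse wave: differential of the RR interval values.
    drr = np.diff(rr_intervals)
    t = np.asarray(rr_times)[1:]
    # Convert to time-series data at equal intervals by cubic spline interpolation.
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    drr_even = CubicSpline(t, drr)(grid)
    # Orthogonal transformation by fast Fourier transform.
    power = np.abs(np.fft.rfft(drr_even)) ** 2
    freqs = np.fft.rfftfreq(len(drr_even), d=1.0 / fs)
    # Sums of the power spectra of the low- and high-frequency components.
    rrlf = power[(freqs >= lf_band[0]) & (freqs < lf_band[1])].sum()
    rrhf = power[(freqs >= hf_band[0]) & (freqs < hf_band[1])].sum()
    # Assumed form of formula (1); C1 and C2 keep the value from diverging.
    return (rrhf + c1) / (rrlf + c2)
```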
  • the activity score calculation unit 54 calculates the activity score using the formula (2).
  • NS is the activity score
  • AP is the behavior pattern
  • R is the behavior score
  • AN is the autonomic nerve activity. That is, the activity score calculation unit 54 may calculate the activity score by using a function including the behavior pattern, the behavior score, and the autonomic nerve activity as parameters.
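  • since the text gives formula (2) only as a function of the behavior pattern AP, the behavior score R, and the autonomic nerve activity AN, the following sketch uses an assumed weighted-product form; the per-pattern weights are hypothetical and stand in for however formula (2) actually maps AP to a numeric factor.

```python
# Hypothetical weights mapping each behavior pattern AP to a numeric factor.
PATTERN_WEIGHTS = {"MP1": 1.0, "MP2": 0.8, "MP3": 1.2}

def activity_score(ap, r, an):
    # Assumed form of formula (2): NS = f(AP, R, AN).
    return PATTERN_WEIGHTS.get(ap, 1.0) * r * an
```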
  • the activity score calculation unit 54 may calculate the activity score by using, for example, a learning model.
  • the learning model is an AI model constructed by treating a behavior pattern, a behavior score, an autonomic nerve activity, and an activity score as one data set and learning a plurality of such data sets as teacher data.
  • the activity score calculation unit 54 inputs the behavior pattern, the behavior score, and the autonomic nerve activity into the trained learning model, acquires the information indicating the activity score, and thereby calculates the activity score.
  • the control unit 30a presents the activity score to the user U (step S49). Specifically, the output control unit 50 controls at least one of the display unit 24A and the voice output unit 24B, and presents the activity score to the user U.
  • the control unit 30a stores the behavior pattern and the activity score in the storage unit 28 (step S50). Specifically, the memory control unit 48 records the behavior pattern identified in step S46 and the activity score in a predetermined format.
  • FIG. 11 is a diagram for explaining a format for storing an action pattern.
  • the storage format F2 may include a region D1a, a region D2a, a region D3a, a region D4a, a region D5a, and a region D6a.
  • in the region D1a, a numerical value assigned to the specified behavior pattern is stored.
  • the region D1a is composed of, for example, 3 bytes.
  • in the area D2a, the number of dimensions of the space in which the behavior pattern information is plotted as a group is stored.
  • the area D2a is composed of, for example, 1 byte.
  • in the region D3a, the behavior score R of the behavior pattern information group determined to be the behavior pattern of the user U is stored.
  • the autonomic nerve activity is stored in the region D4a.
  • the region D4a is composed of, for example, 2 bytes.
  • the activity score is stored in the region D5a.
  • the region D5a is composed of, for example, 2 bytes.
  • the region D6a is a reserved region.
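  • the following is a hedged sketch of packing one record in the storage format F2. The byte widths of the regions D1a, D2a, D4a, and D5a come from the text; the widths of the region D3a (the behavior score R) and the reserved region D6a are not stated, so the 2-byte and 4-byte choices below, and the treatment of all values as unsigned integers, are assumptions.

```python
import struct

def pack_f2(pattern_id, dims, behavior_score, autonomic_activity, activity_score):
    return (
        pattern_id.to_bytes(3, "big")            # D1a: behavior pattern number (3 bytes)
        + struct.pack(">B", dims)                # D2a: number of dimensions (1 byte)
        + struct.pack(">H", behavior_score)      # D3a: behavior score R (assumed 2 bytes)
        + struct.pack(">H", autonomic_activity)  # D4a: autonomic nerve activity (2 bytes)
        + struct.pack(">H", activity_score)      # D5a: activity score (2 bytes)
        + bytes(4)                               # D6a: reserved region (assumed 4 bytes)
    )
```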
  • since the processes of steps S51 and S52 are the same as the processes of steps S18 and S19 shown in FIG. 3, respectively, the description thereof will be omitted.
  • the information processing apparatus 10a according to the third embodiment can calculate an activity score indicating the degree to which the user enjoys a behavior when the behavior specified as the behavior pattern of the user U is performed. Thereby, the information processing apparatus 10a according to the third embodiment can more appropriately specify the behavior pattern of the user U.
  • FIG. 12 is a block diagram showing a configuration example of the information processing apparatus according to the fourth embodiment.
  • the information processing apparatus 10b differs from the information processing apparatus 10a shown in FIG. 8 in that the storage unit 28a stores the correction data 28C and the control unit 30b includes the activity score correction unit 56.
  • the information processing apparatus 10b corrects the activity score according to the country or region.
  • the correction data 28C is data used by the activity score correction unit 56 to correct the activity score.
  • the correction data 28C is data in which, for example, an action pattern and a correction coefficient for multiplying an activity score according to a country or region are associated with each other.
  • FIG. 13 is a diagram for explaining an example of correction data.
  • FIG. 13 shows an action pattern MP1, an action pattern MP2, and an action pattern MP3 as action patterns. Further, as a country or region, area A1, area A2, and area A3 are shown.
  • the behavior pattern is conceptually shown as the behavior pattern MP1 or the like, but is actually shown concretely as “playing golf” or the like. Further, although the country or region is conceptually shown as area A1, in reality, a specific country name such as Japan or a specific region name such as Tokyo is shown.
  • in the area A1, the activity score of the behavior pattern MP1 is multiplied by 0.5, the activity score of the behavior pattern MP2 is multiplied by 0.2, and the activity score of the behavior pattern MP3 is multiplied by 0.1.
  • in the area A2, the activity score of the behavior pattern MP1 is multiplied by 0.2, the activity score of the behavior pattern MP2 is multiplied by 0.6, and the activity score of the behavior pattern MP3 is multiplied by 0.9.
  • in the area A3, the activity score of the behavior pattern MP1 is multiplied by 0.3, the activity score of the behavior pattern MP2 is multiplied by 0.7, and the activity score of the behavior pattern MP3 is multiplied by 0.5, as shown in FIG. 13.
  • the correction coefficient is generated, for example, by conducting in advance a questionnaire survey, for each country or region, on behavior patterns that can be assumed.
  • the correction coefficients are classified for each area as in areas A1 to A3, but the correction coefficients may be classified under predetermined conditions other than the area. For example, depending on the type of behavior pattern to be targeted, the correction coefficient may be classified according to age, gender, or the like.
  • the activity score correction unit 56 corrects the activity score calculated by the activity score calculation unit 54. Specifically, the activity score correction unit 56 corrects the activity score using the correction data 28C based on the country or region in which the behavior pattern is specified.
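  • a minimal sketch of this correction, assuming the correction data 28C is a (behavior pattern, area) to coefficient table; the coefficients below mirror the example of FIG. 13 described above.

```python
# Correction data: (behavior pattern, area) -> correction coefficient.
CORRECTION_DATA = {
    ("MP1", "A1"): 0.5, ("MP2", "A1"): 0.2, ("MP3", "A1"): 0.1,
    ("MP1", "A2"): 0.2, ("MP2", "A2"): 0.6, ("MP3", "A2"): 0.9,
    ("MP1", "A3"): 0.3, ("MP2", "A3"): 0.7, ("MP3", "A3"): 0.5,
}

def correct_activity_score(score, pattern, area):
    # Multiply the activity score by the coefficient for this pattern and area;
    # fall back to 1.0 (no correction) when no coefficient is registered.
    return score * CORRECTION_DATA.get((pattern, area), 1.0)
```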
  • FIG. 14 is a flowchart showing an example of the processing flow of the information processing apparatus 10b according to the fourth embodiment.
  • since the processes of steps S60 to S68 are the same as the processes of steps S40 to S48 shown in FIG. 9, the description thereof will be omitted.
  • the control unit 30b corrects the activity score (step S69). Specifically, the activity score correction unit 56 corrects the activity score calculated by the activity score calculation unit 54 by using the correction data 28C based on the position information specified by the behavior pattern identification unit 46.
  • the control unit 30b presents the corrected activity score to the user U (step S70). Specifically, the output control unit 50 controls at least one of the display unit 24A and the voice output unit 24B, and presents the corrected activity score to the user U.
  • the control unit 30b stores the behavior pattern and the corrected activity score in the storage unit 28 (step S71). Specifically, the memory control unit 48 records the behavior pattern specified in step S66 and the corrected activity score in a predetermined format.
  • since the processes of steps S72 and S73 are the same as the processes of steps S51 and S52 shown in FIG. 9, respectively, the description thereof will be omitted.
  • the information processing apparatus 10b according to the fourth embodiment corrects the activity score by multiplying the activity score by a correction coefficient according to the country or region. Thereby, the information processing apparatus 10b according to the fourth embodiment can more appropriately correct the activity score according to the country or region.
  • the activity score is corrected by using the correction data 28C in which the behavior pattern and the area are associated with each other.
  • the correction data does not have to be associated with an action pattern. That is, a dedicated correction coefficient may be set for each area.
  • the activity score correction unit 56 may correct the autonomic nerve activity calculated by the activity score calculation unit 54 based on the correction data.
  • the activity score correction unit 56 may also be referred to as an autonomic nerve activity correction unit.
  • FIG. 15 is a diagram for explaining an example of correction data. As shown in FIG. 15, in the correction data 28Ca, a correction coefficient is determined for each area. In the example shown in FIG. 15, the correction coefficient of the area A1 is 0.5, the correction coefficient of the area A2 is 0.2, and the correction coefficient of the area A3 is 0.3.
  • for example, for a user located in the area A1, the activity score correction unit 56 corrects the autonomic nerve activity by multiplying the calculated autonomic nerve activity by 0.5.
  • thereby, the autonomic nerve activity can be calculated more appropriately according to each region.
  • FIG. 16 is a flowchart showing an example of the processing flow of the information processing apparatus according to the fifth embodiment. Since the configuration of the information processing apparatus according to the fifth embodiment is the same as that of the information processing apparatus 10b shown in FIG. 12, the description thereof will be omitted.
  • the information processing apparatus 10b independently calculates the activity score for the behavior pattern of the specified daily behavior of the user U and the behavior pattern of the extraordinary behavior. Further, in the fifth embodiment, the information processing apparatus 10b independently corrects the calculated activity score of the behavior pattern of the daily behavior of the user U and the activity score of the behavior pattern of the extraordinary behavior.
  • since the processes of steps S80 to S84 are the same as the processes of steps S40 to S44 shown in FIG. 9, the description thereof will be omitted.
  • since the processes of steps S85 to S87 are the same as the processes of steps S24 to S26 shown in FIG. 6, the description thereof will be omitted.
  • the control unit 30b calculates the activity score of the behavior pattern of daily behavior (step S88). Specifically, the activity score calculation unit 54 calculates the activity score of the behavior pattern of the daily behavior of the user U based on the pulse wave information acquired in step S81.
  • the control unit 30b corrects the activity score of the behavior pattern of daily behavior (step S89). Specifically, the activity score correction unit 56 uses the correction data 28C based on the position information specified by the behavior pattern identification unit 46 to determine the behavior pattern of daily behavior calculated by the activity score calculation unit 54. Correct the activity score.
  • the control unit 30b presents the user U with the activity score of the behavior pattern of the corrected daily behavior (step S90). Specifically, the output control unit 50 controls at least one of the display unit 24A and the voice output unit 24B, and presents the corrected activity score to the user U.
  • the control unit 30b stores the behavior pattern of daily activities and the corrected activity score in the storage unit 28 (step S91). Specifically, the memory control unit 48 records the behavior pattern of the daily behavior identified in step S86 and the corrected activity score in a predetermined format.
  • if No is determined in step S85, the process proceeds to step S92. Since the processes of steps S92 to S94 are the same as the processes of steps S28 to S30 shown in FIG. 6, the description thereof will be omitted.
  • the control unit 30b calculates the activity score of the behavior pattern of the extraordinary behavior (step S95). Specifically, the activity score calculation unit 54 calculates the activity score of the behavior pattern of the user U's extraordinary behavior based on the pulse wave information acquired in step S81.
  • the control unit 30b corrects the activity score of the behavior pattern of extraordinary behavior (step S96). Specifically, the activity score correction unit 56 uses the correction data 28C based on the position information specified by the behavior pattern identification unit 46 to calculate the behavior pattern of extraordinary behavior calculated by the activity score calculation unit 54. Correct the activity score of.
  • the control unit 30b presents the corrected activity score of the behavior pattern of the extraordinary behavior to the user U (step S97). Specifically, the output control unit 50 controls at least one of the display unit 24A and the voice output unit 24B, and presents the corrected activity score to the user U.
  • the control unit 30b stores the behavior pattern of extraordinary behavior and the corrected activity score in the storage unit 28 (step S98). Specifically, the memory control unit 48 records the identified behavior pattern of the extraordinary behavior and the corrected activity score in a predetermined format.
  • since the processes of steps S99 and S100 are the same as the processes of steps S51 and S52 shown in FIG. 9, respectively, the description thereof will be omitted.
  • the information processing apparatus 10b according to the fifth embodiment can independently calculate the activity score for the behavior specified as the behavior pattern of daily behavior and for the behavior specified as the behavior pattern of extraordinary behavior. Thereby, the information processing apparatus 10b according to the fifth embodiment can more appropriately specify the behavior pattern of the user U.
  • the information processing apparatus 10b according to the fifth embodiment corrects the activity score of the behavior pattern of daily behavior and the activity score of the behavior pattern of extraordinary behavior by multiplying each activity score by a correction coefficient according to the country or region. Thereby, the information processing apparatus 10b according to the fifth embodiment can more appropriately correct the activity score according to the country or region.
  • FIG. 17 is a block diagram showing a configuration example of the information processing apparatus according to the sixth embodiment.
  • the information processing apparatus 10c differs from the information processing apparatus 10a shown in FIG. 8 in that the storage unit 28b stores the history data 28D and the control unit 30c includes the history data acquisition unit 58 and the learning unit 60.
  • the information processing apparatus 10c calculates an activity score using a trained model customized for each user.
  • the history data 28D is data related to the history of the activity score.
  • the historical data 28D may include information regarding the ranking of activity scores for each user in a predetermined period.
  • the historical data 28D may include information on behavior patterns whose activity score in a predetermined period is higher than a predetermined value.
  • the predetermined period is, for example, 3 months, but is not limited to this.
  • FIGS. 18A to 18C are diagrams for explaining an example of the history data 28D. As shown in FIGS. 18A to 18C, in the history data 28D, the ranking, the behavior pattern, the behavior score, and the activity score are associated with each other.
  • FIG. 18A is an example of the history data of the user U1.
  • FIG. 18B is an example of the history data of the user U2.
  • FIG. 18C is an example of the history data of the user U3.
  • FIGS. 18A to 18C show, for each of the users U1 to U3, the behavior patterns having high activity scores in a predetermined period, ranked from 1st to 5th.
  • the first place of the user U1 is the behavior pattern MP4 having a behavior score of 10 and an activity score of 99.
  • the second place of the user U1 is the behavior pattern MP3 having a behavior score of 9 and an activity score of 85.
  • the third place of the user U1 is the behavior pattern MP1 having a behavior score of 8 and an activity score of 80.
  • the fourth place of the user U1 is the behavior pattern MP9 having a behavior score of 7 and an activity score of 58.
  • the fifth place of the user U1 is the behavior pattern MP3 having a behavior score of 7 and an activity score of 53.
  • the first place of the user U2 is the behavior pattern MP4 having a behavior score of 8 and an activity score of 90.
  • the second place of the user U2 is the behavior pattern MP3 having a behavior score of 8 and an activity score of 88.
  • the third place of the user U2 is the behavior pattern MP1 having a behavior score of 8 and an activity score of 79.
  • the fourth place of the user U2 is the behavior pattern MP8 having a behavior score of 9 and an activity score of 51.
  • the fifth place of the user U2 is the behavior pattern MP5 having a behavior score of 9 and an activity score of 49.
  • the first place of the user U3 is the behavior pattern MP7 having a behavior score of 10 and an activity score of 89.
  • the second place of the user U3 is the behavior pattern MP2 having a behavior score of 6 and an activity score of 71.
  • the third place of the user U3 is the behavior pattern MP9 having a behavior score of 7 and an activity score of 68.
  • the fourth place of the user U3 is the behavior pattern MP4 having a behavior score of 8 and an activity score of 65.
  • the fifth place of the user U3 is the behavior pattern MP3 having a behavior score of 9 and an activity score of 57.
  • the behavior pattern with a high activity score differs depending on the user, and there are individual differences in the activity score.
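  • a small sketch of how one record of the history data 28D could be represented, with the ranking, behavior pattern, behavior score, and activity score associated with each other; the field names are illustrative, and the sample values are the top three entries of the user U1 from FIG. 18A.

```python
from dataclasses import dataclass

@dataclass
class HistoryRecord:
    ranking: int
    behavior_pattern: str
    behavior_score: int
    activity_score: int

history_u1 = [
    HistoryRecord(1, "MP4", 10, 99),
    HistoryRecord(2, "MP3", 9, 85),
    HistoryRecord(3, "MP1", 8, 80),
]
```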
  • the history data acquisition unit 58 acquires the history data 28D from the storage unit 28b. Specifically, the history data acquisition unit 58 acquires the history data 28D of the user whose activity score is to be calculated.
  • the learning unit 60 generates a trained model for calculating a user's activity score by learning by machine learning based on learning data.
  • the learning unit 60 generates, for example, a trained model for calculating the activity score based on the history data 28D acquired by the history data acquisition unit 58.
  • the learning unit 60 learns the weights of a DNN (Deep Neural Network) as a trained model for calculating the activity score, for example.
  • the learning unit 60 may learn by using a well-known machine learning method such as deep learning.
  • the learning unit 60 may update the trained model every time the learning data is updated, for example.
  • FIG. 19 is a flowchart showing an example of the flow of the learning process according to the sixth embodiment.
  • the control unit 30c acquires learning data (step S110). Specifically, the history data acquisition unit 58 acquires the history data 28D for a predetermined period of the user for which the activity score is to be calculated from the storage unit 28b.
  • the history data 28D acquired by the history data acquisition unit 58 may include at least information regarding a ranking, a behavior pattern, a behavior score, and an activity score.
  • the history data acquisition unit 58 acquires, for example, the history data 28D from the 1st place to the 1000th place in the past 3 months.
  • the control unit 30c executes the learning process (step S111). Specifically, the learning unit 60 uses the history data 28D acquired by the history data acquisition unit 58 to generate, by machine learning, a trained model for calculating the user's activity score. More specifically, the learning unit 60 treats a behavior pattern, a behavior score, an autonomic nerve activity, and an activity score as one data set, and trains with a plurality of (for example, 1000) data sets as teacher data to generate the trained model. The learning unit 60 generates, for example, a trained model for each user whose activity score is to be calculated. That is, in the present embodiment, a trained model customized for each user is generated.
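  • as a hedged sketch of this learning step, the following trains a small regressor mapping (behavior pattern, behavior score, autonomic nerve activity) to the activity score, one model per user. The text only says that the weights of a DNN are learned; scikit-learn's MLPRegressor and the encoding of the behavior pattern as a numeric id are stand-ins chosen for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_user_model(records):
    # records: (pattern_id, behavior_score, autonomic_activity, activity_score)
    # tuples, e.g. the 1000 data sets of one user's history used as teacher data.
    X = np.array([[p, r, an] for p, r, an, _ in records], dtype=float)
    y = np.array([ns for _, _, _, ns in records], dtype=float)
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000)
    model.fit(X, y)
    return model
```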
  • the control unit 30c stores the trained model (step S112). Specifically, the learning unit 60 stores the generated trained model in the storage unit 28b.
  • FIG. 20 is a flowchart showing an example of the processing flow of the information processing apparatus 10c according to the sixth embodiment.
  • since the processes of steps S120 to S127 are the same as the processes of steps S40 to S47 shown in FIG. 9, the description thereof will be omitted.
  • the control unit 30c calculates the activity score based on the trained model according to the user (step S128). Specifically, the activity score calculation unit 54 calculates the user's activity score using a trained model customized according to the user.
  • since the processes of steps S129 to S132 are the same as the processes of steps S49 to S52 shown in FIG. 9, the description thereof will be omitted.
  • the information processing apparatus 10c according to the sixth embodiment calculates the activity scores of the users U1 to U3 by using trained models customized according to the history of the activity score of each user.
  • the information processing apparatus 10c according to the sixth embodiment can more appropriately calculate the activity score according to the user's sensibility.
  • FIG. 21 is a diagram for explaining a configuration example of the information processing system according to the seventh embodiment.
  • the information processing system 1 according to the seventh embodiment includes a plurality of information processing devices 10c and a server device 100.
  • the information processing device 10c and the server device 100 are communicably connected to each other via a network N (for example, the Internet). That is, the information processing system 1 according to the seventh embodiment has a configuration in which the information processing device 10c according to the sixth embodiment and the server device 100 are communicably connected to each other.
  • in the sixth embodiment, the activity score of the user is calculated using the trained model generated by customizing it according to the history of the activity score for each user.
  • in the seventh embodiment, the server device 100 stores the history data of a plurality of users as shared data, and a trained model for calculating a user's activity score is generated based on the history data of a plurality of users whose activity scores and behavior pattern tendencies are similar to each other.
  • FIG. 22 is a block diagram showing a configuration example of the server device according to the seventh embodiment.
  • the server device 100 includes a communication unit 110, a control unit 120, and a storage unit 130.
  • the server device 100 is a so-called cloud server.
  • the communication unit 110 is realized by, for example, a NIC (Network Interface Card) or a communication circuit.
  • the communication unit 110 is connected to the network N wirelessly or by wire, and transmits / receives information to / from the information processing device 10c.
  • the control unit 120 controls the operation of each unit of the server device 100.
  • the control unit 120 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area.
  • the control unit 120 may be realized by an integrated circuit such as an ASIC or FPGA.
  • the control unit 120 may be realized by a combination of hardware and software.
  • the control unit 120 includes an acquisition unit 122, a determination unit 124, a request unit 126, and a provision unit 128.
  • the acquisition unit 122 acquires, for example, historical data regarding the history of the activity score of each user wearing the information processing apparatus 10c via the communication unit 110.
  • the acquisition unit 122 acquires, for example, the history data 28D1 to 28D3 of the users U1 to U3 shown in FIGS. 18A to 18C.
  • the acquisition unit 122 stores the acquired history data as shared data 132 in the storage unit 130.
  • the determination unit 124 determines the tendency of the history data among the shared data 132.
  • the determination unit 124 determines, for example, whether or not there is a user whose tendency of historical data is similar.
  • when there are a plurality of users whose history data tendencies are similar to each other, the request unit 126 asks each of them whether or not to allow other users to use his or her own history data.
  • when the use of the history data is permitted, the providing unit 128 provides the history data to the users whose history data tendencies are similar.
  • the storage unit 130 is a memory that stores various information such as calculation contents and programs of the control unit 120.
  • the storage unit 130 includes, for example, at least one of a main storage device such as a RAM (Random Access Memory) or a ROM and an external storage device such as an HDD.
  • Shared data 132 is stored in the storage unit 130.
  • the shared data 132 may include historical data regarding activity scores of a plurality of users wearing the information processing apparatus 10c.
  • the shared data 132 may include, for example, the history data 28D1 to 28D3 of the users U1 to U3 shown in FIGS. 18A to 18C.
  • FIG. 23 is a flowchart showing an example of the processing flow of the server device according to the seventh embodiment.
  • the control unit 120 refers to the shared data 132 and determines whether or not there are users whose history data are similar (step S140). For example, it is assumed that the shared data 132 includes the history data 28D1 to 28D3 of the users U1 to U3 shown in FIGS. 18A to 18C.
  • for example, the determination unit 124 determines that users have similar history data when the users have the same behavior patterns from the first place to the third place, the behavior score of each of those behavior patterns is 8 or more, and the difference between their activity scores is 10 or less.
  • the determination unit 124 identifies that the first place of the activity score of the user U1 and the user U2 is the action pattern MP4, the second place is the action pattern MP3, and the third place is the action pattern MP1.
  • the determination unit 124 identifies that the behavior score of the behavior pattern MP4 is 10, the behavior score of the behavior pattern MP3 is 9, and the behavior score of the behavior pattern MP1 is 8 for the user U1.
  • the determination unit 124 identifies that the behavior score of the behavior pattern MP4 is 8, the behavior score of the behavior pattern MP3 is 8, and the behavior score of the behavior pattern MP1 is 8 for the user U2.
  • the determination unit 124 identifies that, between the user U1 and the user U2, the difference in the activity score of the behavior pattern MP4 is 9, the difference in the activity score of the behavior pattern MP3 is 3, and the difference in the activity score of the behavior pattern MP1 is 1. In this case, the determination unit 124 determines that the history data of the user U1 and the user U2 are similar to each other.
  • here, the case where two users are selected from the three users U1 to U3 has been described, but this is merely an example, and the number of users in the actual population is not particularly limited.
  • the determination unit 124 may determine that the history data of three or more users are close to each other.
  • the determination unit 124 may determine whether or not the historical data are close to each other by a method other than the method described in this embodiment.
  • the determination unit 124 may determine, for example, whether or not the historical data are close to each other according to a predetermined conditional expression defined mathematically.
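  • a minimal sketch of the similarity rule described above, using the HistoryRecord sketched earlier. The thresholds (same behavior patterns in 1st to 3rd place, every behavior score 8 or more, activity score differences of 10 or less) come from the text; comparing rank by rank is an assumption.

```python
def history_similar(top3_a, top3_b):
    # top3_a, top3_b: lists of HistoryRecord, ranked 1st to 3rd.
    for ra, rb in zip(top3_a, top3_b):
        if ra.behavior_pattern != rb.behavior_pattern:
            return False
        if ra.behavior_score < 8 or rb.behavior_score < 8:
            return False
        if abs(ra.activity_score - rb.activity_score) > 10:
            return False
    return True
```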
  • if it is determined that there is a user with similar history data (step S140; Yes), the process proceeds to step S141. When it is determined that there is no such user (step S140; No), the process of FIG. 23 is terminated.
  • the control unit 120 requests permission to share from the users whose history data are similar to each other (step S141). Specifically, the request unit 126 transmits a notification requesting the user U1 and the user U2 for permission to share the history data via the communication unit 110.
  • the control unit 120 determines whether or not the sharing request is permitted (step S142). Specifically, the request unit 126 determines whether or not there is a response from the user U1 or the user U2 permitting the sharing of the history data in response to the request transmitted in step S141. If it is determined that the sharing request is permitted (step S142; Yes), the process proceeds to step S143. When it is determined that the sharing request is not permitted (step S142; No), the process of FIG. 23 is terminated.
  • if Yes is determined in step S142, the control unit 120 shares the history data (step S143). Specifically, when the user U2 permits sharing of the history data, for example, the providing unit 128 transmits the history data 28D2 of the user U2 to the information processing device 10c worn by the user U1 via the communication unit 110. Then, the process of FIG. 23 is completed.
  • FIG. 24 is a flowchart showing an example of the flow of the learning process according to the seventh embodiment.
  • here, a case where the information processing device 10c worn by the user U1 acquires the history data 28D2 of the user U2 and performs the learning process will be described.
  • the control unit 30c acquires historical data (step S150). Specifically, the history data acquisition unit 58 acquires, from the storage unit 28b, the history data 28D1 for a predetermined period of the user whose activity score is to be calculated.
  • the history data 28D1 acquired by the history data acquisition unit 58 may include at least information regarding a ranking, a behavior pattern, a behavior score, and an activity score.
  • the history data acquisition unit 58 acquires, for example, the history data 28D1 from the first place to the 1000th place in the past three months. That is, the history data acquisition unit 58 acquires 1000 data sets as teacher data.
  • the control unit 30c acquires, from the server device 100, history data that is similar to the history data of the user U1 (step S151). Specifically, the history data acquisition unit 58 acquires the history data 28D2 of the user U2 from the server device 100 via, for example, the communication unit 26. Here, for example, it can be assumed that only a small number of behavior patterns are included in the 1000 data sets of the history data 28D1. For example, when generating a trained model, 6000 to 8000 or more data sets may be required. In the present embodiment, the history data acquisition unit 58 acquires the history data 28D2, which is similar to the history data 28D1, from the server device 100, so that the number of data sets can be supplemented and a more optimal trained model can be generated.
  • the control unit 30c executes the learning process (step S152). Specifically, the learning unit 60 generates a trained model for calculating the user's activity score by machine learning, using the history data 28D1 and the history data 28D2 acquired by the history data acquisition unit 58.
  • the control unit 30c stores the trained model (step S153). Specifically, the learning unit 60 stores the generated learned model in the storage unit 28b. Then, the process of FIG. 24 is terminated.
  • FIG. 25 is a flowchart showing an example of the processing flow of the information processing apparatus 10c according to the seventh embodiment.
  • since the processes of steps S160 to S167 are the same as the processes of steps S120 to S127 shown in FIG. 20, the description thereof will be omitted.
  • the control unit 30c calculates the activity score based on the trained model generated using the shared data (step S168). Specifically, the activity score calculation unit 54 calculates the activity score of the user U1 by using the trained model generated by using the history data 28D1 and the history data 28D2.
  • since the processes of steps S169 to S172 are the same as the processes of steps S129 to S132 shown in FIG. 20, the description thereof will be omitted.
  • the information processing apparatus 10c according to the seventh embodiment generates a trained model using the history data 28D1 of the user U1 and the history data 28D2 of the user U2, which is similar to the history data 28D1.
  • the activity score is calculated using the trained model.
  • the information processing apparatus 10c according to the seventh embodiment can more appropriately calculate the activity score.
  • FIG. 26 is a block diagram showing a configuration example of the information processing apparatus according to the eighth embodiment.
  • the information processing apparatus 10d is different from the information processing apparatus 10 shown in FIG. 2 in that the output unit 24a includes the tactile stimulus output unit 24C.
  • the information processing apparatus 10d presents the change in the activity score in an easy-to-understand manner by relatively presenting the temporal transition of the activity score of the user.
  • the tactile stimulus output unit 24C is a device that outputs a tactile stimulus to the user U.
  • the tactile stimulus output unit 24C outputs a tactile stimulus to the user by a physical operation such as vibration, but the type of the tactile stimulus is not limited to vibration or the like and may be arbitrary.
  • the output control unit 50 controls the output unit 24a to output information showing the temporal transition of the activity score calculated by the activity score calculation unit 54.
  • the output control unit 50 controls the output unit 24a to output information indicating a relative change in the activity score.
  • FIGS. 27A to 27E are diagrams for explaining a method of relatively displaying the temporal transition of the activity score.
  • the output control unit 50 controls, for example, the display unit 24A to display the temporal transition of the activity score as the graph G1.
  • the horizontal axis represents time and the vertical axis represents activity score.
  • the time t0 indicates the time when the calculation of the activity score is started.
  • the output control unit 50 controls, for example, the display unit 24A to display the value of the activity score at the start and the value of the current activity score in parallel.
  • the activity score at the start is 56 and the current activity score is 78.
  • the user can easily grasp the transition of the activity score over time.
  • the output control unit 50 controls, for example, the display unit 24A to display an arc obtained by dividing a circle centered on the lower left corner of the screen into four equal parts.
  • the arc C1 indicates the activity score at the start time
  • the radius r1 indicates the magnitude of the activity score.
  • arc C2 and arc C3 indicate the current activity score. If the current activity score is larger than the starting point, the output control unit 50 causes the display unit 24A to display the arc C2 having a radius r2 larger than the radius r1 together with the arc C1 having a radius r1.
  • if the current activity score is smaller than at the start, the output control unit 50 causes the display unit 24A to display the arc C3 having a radius r3 smaller than the radius r1 together with the arc C1 having the radius r1. As a result, the user can easily grasp the transition of the activity score over time.
  • the output control unit 50 controls, for example, the display unit 24A to display a bar.
  • the bar B1 is displayed in the central portion of the display unit 24A, but the present invention is not limited to this, and the bar B1 may be displayed in the lower left corner or in the lower right corner.
  • the bar B1 indicates the activity score at the start time, and the height h1 indicates the magnitude of the activity score.
  • bars B2 and B3 indicate the current activity score.
  • if the current activity score is larger than at the start, the output control unit 50 causes the display unit 24A to display the bar B2 having a height h2 higher than the height h1 together with the bar B1 having the height h1. If the current activity score is smaller than at the start, the output control unit 50 causes the display unit 24A to display the bar B3 having a height h3 lower than the height h1 together with the bar B1 having the height h1. As a result, the user can easily grasp the transition of the activity score over time.
  • the output control unit 50 controls the display unit 24A, for example, to display the graph G2.
  • Graph G2 can be a bar graph.
  • the horizontal axis represents time and the vertical axis represents activity score.
  • the time t0 indicates the time when the calculation of the activity score is started.
  • the output control unit 50 may control, for example, the voice output unit 24B or the tactile stimulus output unit 24C to output information indicating the temporal transition of the activity score.
  • FIG. 28 is a diagram for explaining a method of outputting information indicating a temporal transition of the activity score by the voice output unit 24B or the tactile stimulus output unit 24C.
  • the horizontal axis represents time and the vertical axis represents activity score.
  • the activity score NS1 at time t0 is the activity score at the time when the calculation of the activity score is started.
  • the activity score NS1 is used as a reference.
  • the output control unit 50 sets, for example, the volume of the sound output from the voice output unit 24B corresponding to the reference activity score NS1.
  • the output control unit 50 sets, for example, the strength of the stimulus (for example, vibration) output from the tactile stimulus output unit 24C corresponding to the reference activity score NS1.
  • the output control unit 50 controls, for example, the voice output unit 24B to output the voice corresponding to the activity score NS1 and the voice corresponding to the activity score NS2 as a set.
  • the output control unit 50 controls the tactile stimulus output unit 24C to output a stimulus corresponding to the activity score NS1 and a stimulus corresponding to the activity score NS2 as a set.
  • the voice corresponding to the activity score NS2 is louder than the voice corresponding to the activity score NS1.
  • the stimulus corresponding to the activity score NS2 is stronger than the stimulus corresponding to the activity score NS1.
  • the loudness of the voice corresponding to the activity score NS2 is preferably changed according to, for example, the ratio of the activity score NS2 to the activity score NS1.
  • the intensity of the stimulus corresponding to the activity score NS2 is preferably changed according to, for example, the ratio of the activity score NS2 to the activity score NS1. Thereby, the user can grasp the relative magnitude of the activity score NS2 with respect to the activity score NS1.
  • the output control unit 50 controls, for example, the voice output unit 24B to output the voice corresponding to the activity score NS1 and the voice corresponding to the activity score NS3 as a set.
  • the output control unit 50 controls the tactile stimulus output unit 24C to output a stimulus corresponding to the activity score NS1 and a stimulus corresponding to the activity score NS3 as a set.
  • the voice corresponding to the activity score NS3 is smaller than the voice corresponding to the activity score NS1.
  • the stimulus corresponding to the activity score NS3 is weaker than the stimulus corresponding to the activity score NS1.
  • the loudness of the voice corresponding to the activity score NS3 is preferably changed according to, for example, the ratio of the activity score NS3 to the activity score NS1.
  • the intensity of the stimulus corresponding to the activity score NS3 is preferably changed according to, for example, the ratio of the activity score NS3 to the activity score NS1. Thereby, the user can grasp the relative magnitude of the activity score NS3 with respect to the activity score NS1.
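  • a minimal sketch of this relative output: the volume or vibration strength for the current activity score is scaled by its ratio to the reference score NS1 set when calculation started. The base output level is an assumed device-specific value.

```python
def output_level(ns_current, ns_reference, base_level=0.5):
    # e.g. NS2 > NS1 yields a louder voice / stronger stimulus than the
    # reference, and NS3 < NS1 a quieter / weaker one.
    return base_level * (ns_current / ns_reference)
```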
  • the information processing apparatus 10d according to the eighth embodiment relatively presents the temporal transition of the user's activity score. As a result, the information processing apparatus 10d according to the eighth embodiment makes it easy to understand the temporal change of the activity score.
  • the present disclosure is not limited by the contents of these embodiments.
  • the above-mentioned components include those that can be easily conceived by those skilled in the art and those that are substantially the same, that is, those within a so-called range of equivalents.
  • the components described above can be combined as appropriate. Further, various omissions, replacements or changes of the components can be made without departing from the gist of the above-described embodiment.
  • the information processing device, information processing method, and program of the present disclosure can be applied to a technique for analyzing user behavior.


Abstract

An information processing device (10) comprising: a behavior state detection sensor (20) that detects behavior state information pertaining to the behavior state of a user; a behavior pattern information generation unit (42) that, on the basis of the behavior state information, generates behavior pattern information in a multidimensional space having at least the parameters of time/date, place, and length of time that the behavior state was detected as axes, and groups the behavior pattern information in each space in which the density of a behavior pattern information group of clustered behavior pattern information surpasses a prescribed density; a behavior score calculation unit (44) that calculates, as a behavior score, information pertaining to the size of a space that includes a behavior pattern information group; and a behavior pattern identification unit (46) that identifies a behavior pattern information group that exists in a space for which the behavior score value is a prescribed value or greater as a behavior pattern of the user.

Description

Information processing device, information processing method, and program
 The present disclosure relates to an information processing device, an information processing method, and a program.
 There is known a technique of detecting a user's movement and identifying the user's behavior by using a wearable device worn by the user.
 For example, Patent Document 1 describes a mobile phone device that detects acceleration information of a user and controls an operation mode by using the detected acceleration information. Patent Document 2 describes a motion information measurement system that measures motion information of a desired part of a user and recognizes the motion state of the whole body.
Patent Document 1: Japanese Patent Application Laid-Open No. 2003-46630; Patent Document 2: Japanese Patent Application Laid-Open No. 2004-184351
 Here, it is required to specify the user's behavior pattern based on information regarding the user's behavioral state.
 It is an object of the present disclosure to provide an information processing device, an information processing method, and a program capable of specifying a user's behavior pattern based on information regarding the user's behavioral state.
 An information processing device according to one aspect of the present disclosure includes: a behavior state detection sensor that detects behavior state information regarding a behavior state of a user; a behavior pattern information generation unit that, based on the behavior state information, generates behavior pattern information in a multidimensional space having, as coordinate axes, at least the parameters of the date and time, place, and duration for which the behavior state was detected, and groups the behavior pattern information for each space in which the density of a behavior pattern information group of gathered behavior pattern information exceeds a predetermined density; a behavior score calculation unit that calculates, as a behavior score, information regarding the size of the space containing the behavior pattern information group; and a behavior pattern identification unit that identifies, as the behavior pattern of the user, the behavior pattern information group existing in a space for which the value of the behavior score is a predetermined value or more.
 An information processing device according to one aspect of the present disclosure includes: a behavior state sensor that detects behavior state information regarding a behavior state of a user; a biological sensor that detects biological information of the user; an autonomic nerve activity calculation unit that calculates the autonomic nerve activity of the user based on the biological information; and an output control unit that changes the intensity of output from an output unit according to the intensity of the autonomic nerve activity.
 An information processing device according to one aspect of the present disclosure includes: a behavior state sensor that detects behavior state information regarding a behavior state of a user; a biological sensor that detects biological information of the user; an autonomic nerve activity calculation unit that calculates the autonomic nerve activity of the user based on the biological information; and an autonomic nerve activity correction unit that corrects the autonomic nerve activity based on the country or region in which the behavior pattern of the user is specified.
 An information processing method according to one aspect of the present disclosure includes: detecting behavior state information regarding a behavior state of a user; based on the behavior state information, generating behavior pattern information in a multidimensional space having, as coordinate axes, at least the parameters of the date and time, place, and duration for which the behavior state was detected, and grouping the behavior pattern information for each space in which the density of a behavior pattern information group of gathered behavior pattern information exceeds a predetermined density; calculating, as a behavior score, information regarding the size of the space containing the behavior pattern information group; and identifying, as the behavior pattern of the user, the behavior pattern information group existing in a space for which the value of the behavior score is a predetermined value or more.
 An information processing method according to one aspect of the present disclosure includes: detecting behavior state information regarding a behavior state of a user; detecting biological information of the user; calculating the autonomic nerve activity of the user based on the biological information; and changing the intensity of output from an output unit according to the intensity of the autonomic nerve activity.
 An information processing method according to one aspect of the present disclosure includes: detecting behavior state information regarding a behavior state of a user; detecting biological information of the user; calculating the autonomic nerve activity of the user based on the biological information; and correcting the autonomic nerve activity based on the country or region in which the behavior pattern of the user is specified.
 A program according to one aspect of the present disclosure causes a computer to execute: detecting behavior state information regarding a behavior state of a user; based on the behavior state information, generating behavior pattern information in a multidimensional space having, as coordinate axes, at least the parameters of the date and time, place, and duration for which the behavior state was detected, and grouping the behavior pattern information for each space in which the density of a behavior pattern information group of gathered behavior pattern information exceeds a predetermined density; calculating, as a behavior score, information regarding the size of the space containing the behavior pattern information group; identifying, as the behavior pattern of the user, the behavior pattern information group existing in a space for which the value of the behavior score is a predetermined value or more; and storing the behavior pattern of the user in a storage unit.
 A program according to one aspect of the present disclosure causes a computer to execute: detecting behavior state information regarding a behavior state of a user; detecting biological information of the user; calculating the autonomic nerve activity of the user based on the biological information; and changing the intensity of output from an output unit according to the intensity of the autonomic nerve activity.
 A program according to one aspect of the present disclosure causes a computer to execute: detecting behavior state information regarding a behavior state of a user; detecting biological information of the user; calculating the autonomic nerve activity of the user based on the biological information; and correcting the autonomic nerve activity based on the country or region in which the behavior pattern of the user is specified.
 According to the present disclosure, the behavior pattern of a user can be specified based on information regarding the behavioral state of the user.
FIG. 1 is a schematic diagram schematically showing the information processing apparatus according to the first embodiment.
FIG. 2 is a block diagram showing a configuration example of the information processing apparatus according to the first embodiment.
FIG. 3 is a flowchart showing an example of the processing flow of the information processing apparatus according to the first embodiment.
FIG. 4 is a diagram for explaining the multidimensional space in which behavior pattern information is generated.
FIG. 5 is a diagram for explaining a format for storing a behavior pattern.
FIG. 6 is a flowchart showing an example of the processing flow of the information processing apparatus according to the second embodiment.
FIG. 7 is a diagram for explaining daily behavior and extraordinary behavior.
FIG. 8 is a block diagram showing a configuration example of the information processing apparatus according to the third embodiment.
FIG. 9 is a flowchart showing an example of the processing flow of the information processing apparatus according to the third embodiment.
FIG. 10 is a graph showing an example of a pulse wave.
FIG. 11 is a diagram for explaining a format for storing a behavior pattern.
FIG. 12 is a block diagram showing a configuration example of the information processing apparatus according to the fourth embodiment.
FIG. 13 is a diagram for explaining an example of correction data.
FIG. 14 is a flowchart showing an example of the processing flow of the information processing apparatus according to the fourth embodiment.
FIG. 15 is a diagram for explaining an example of correction data.
FIG. 16 is a flowchart showing an example of the processing flow of the information processing apparatus according to the fifth embodiment.
FIG. 17 is a block diagram showing a configuration example of the information processing apparatus according to the sixth embodiment.
FIG. 18A is a diagram for explaining an example of user history data.
FIG. 18B is a diagram for explaining an example of user history data.
FIG. 18C is a diagram for explaining an example of user history data.
FIG. 19 is a flowchart showing an example of the flow of the learning process according to the sixth embodiment.
FIG. 20 is a flowchart showing an example of the processing flow of the information processing apparatus according to the sixth embodiment.
FIG. 21 is a diagram for explaining a configuration example of the information processing system according to the seventh embodiment.
FIG. 22 is a block diagram showing a configuration example of the server device according to the seventh embodiment.
FIG. 23 is a flowchart showing an example of the processing flow of the server device according to the seventh embodiment.
FIG. 24 is a flowchart showing an example of the flow of the learning process according to the seventh embodiment.
FIG. 25 is a flowchart showing an example of the processing flow of the information processing apparatus according to the seventh embodiment.
FIG. 26 is a block diagram showing a configuration example of the information processing apparatus according to the eighth embodiment.
FIGS. 27A to 27E are diagrams for explaining a method of relatively displaying the temporal transition of the activity score.
FIG. 28 is a diagram for explaining a method of outputting information showing the temporal transition of the activity score.
Hereinafter, embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. The present disclosure is not limited to these embodiments, and where there are a plurality of embodiments, configurations combining the respective embodiments are also included. In the following embodiments, the same parts are denoted by the same reference numerals, and duplicate descriptions are omitted.
[First Embodiment]
FIG. 1 is a schematic diagram of the information processing apparatus according to the first embodiment. As shown in FIG. 1, the information processing apparatus 10 is a so-called wearable device worn on the body of a user U. In the example of this embodiment, the information processing apparatus 10 includes a device 10A worn over the eyes of the user U, a device 10B worn on an ear of the user U, and a device 10C worn on an arm of the user U. The device 10A worn over the eyes includes a display unit 26A, described later, that outputs visual stimuli to the user U (displays images); the device 10B worn on the ear includes an audio output unit 26B, described later, that outputs auditory stimuli (sound) to the user U; and the device 10C worn on the arm includes a tactile stimulus output unit 26C, described later, that outputs tactile stimuli to the user U. However, the configuration in FIG. 1 is only an example; the number of devices and their mounting positions on the user U may be arbitrary. For example, the information processing apparatus 10 is not limited to a wearable device and may be a device carried by the user U, such as a so-called smartphone or tablet terminal.
FIG. 2 is a block diagram showing a configuration example of the information processing apparatus according to the first embodiment. As shown in FIG. 2, the information processing apparatus 10 includes a behavioral state sensor 20, an input unit 22, an output unit 24, a communication unit 26, a storage unit 28, and a control unit 30.

The behavioral state sensor 20 is a sensor that detects behavioral state information about the behavioral state of the user U wearing the information processing apparatus 10. The behavioral state information of the user U may include various kinds of information about the behavior of the user U, and may include at least information about the physical body movements of the user U, the date and time of a behavior, the place of the behavior, and the duration of the behavior.

The behavioral state sensor 20 includes a camera 20A, a microphone 20B, a GNSS receiver 20C, an acceleration sensor 20D, a gyro sensor 20E, an optical sensor 20F, a temperature sensor 20G, and a humidity sensor 20H. However, the behavioral state sensor 20 may include any sensor that detects behavioral state information; for example, it may include at least one of the camera 20A, the microphone 20B, the GNSS receiver 20C, the acceleration sensor 20D, the gyro sensor 20E, the optical sensor 20F, the temperature sensor 20G, and the humidity sensor 20H, and it may include other sensors.
The camera 20A is an imaging device that captures images of the surroundings of the information processing apparatus 10 (the user U) by detecting visible light around the apparatus as behavioral state information. The camera 20A may be a video camera that captures images at a predetermined frame rate. The position and orientation of the camera 20A on the information processing apparatus 10 are arbitrary; for example, the camera 20A may be provided on the device 10A shown in FIG. 1 with its imaging direction aligned with the direction the face of the user U is facing. This allows the camera 20A to image objects ahead of the line of sight of the user U, that is, objects within the field of view of the user U. The number of cameras 20A is also arbitrary and may be one or more. When there are a plurality of cameras 20A, information on the direction each camera 20A is facing is also acquired.

The microphone 20B detects sound (sound wave information) around the information processing apparatus 10 (the user U) as behavioral state information. The position, orientation, and number of microphones 20B provided on the information processing apparatus 10 are arbitrary. When there are a plurality of microphones 20B, information on the direction each microphone 20B is facing is also acquired.

The GNSS receiver 20C is a device that detects position information of the information processing apparatus 10 (the user U) as behavioral state information. The position information here is earth coordinates. In this embodiment, the GNSS receiver 20C is a so-called GNSS (Global Navigation Satellite System) module that receives radio waves from satellites and detects the position information of the information processing apparatus 10 (the user U).

The acceleration sensor 20D is a sensor that detects the acceleration of the information processing apparatus 10 (the user U) as behavioral state information, detecting, for example, gravity, vibration, and impact.

The gyro sensor 20E is a sensor that detects the rotation and orientation of the information processing apparatus 10 (the user U) as behavioral state information, using principles such as the Coriolis force, the Euler force, and the centrifugal force.

The optical sensor 20F is a sensor that detects the intensity of light around the information processing apparatus 10 (the user U) as behavioral state information. The optical sensor 20F can detect the intensity of visible light, infrared rays, and ultraviolet rays.

The temperature sensor 20G is a sensor that detects the temperature around the information processing apparatus 10 (the user U) as behavioral state information.

The humidity sensor 20H is a sensor that detects the humidity around the information processing apparatus 10 (the user U) as behavioral state information.
The input unit 22 is a device that accepts user operations and may be, for example, a touch panel.

The output unit 24 outputs results produced by the information processing apparatus 10. The output unit 24 includes, for example, a display unit 24A that displays images and an audio output unit 24B that outputs sound. In this embodiment, the display unit 24A is, for example, a so-called HMD (Head Mounted Display), and the audio output unit 24B is a speaker.

The communication unit 26 is a module that communicates with external devices and may include, for example, an antenna. The communication method used by the communication unit 26 is wireless communication in this embodiment, but any communication method may be used.

The storage unit 28 is a memory that stores various kinds of information such as the computation results and programs of the control unit 30, and includes at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive).

The storage unit 28 stores a learning model 28A and map data 28B. The learning model 28A is an AI model used to identify the environment in which the user U is located based on environmental information. The map data 28B is data containing position information of real buildings, natural objects, and the like; it can be regarded as data in which earth coordinates are associated with real buildings and natural objects. Processing using the learning model 28A and the map data 28B will be described later. The learning model 28A, the map data 28B, and the program for the control unit 30 stored in the storage unit 28 may be stored on a recording medium readable by the information processing apparatus 10. Further, these need not be stored in the storage unit 28 in advance; the information processing apparatus 10 may acquire them from an external device by communication when they are used.

The control unit 30 controls the operation of each unit of the information processing apparatus 10. The control unit 30 is realized, for example, by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. The control unit 30 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), or by a combination of hardware and software.
The control unit 30 includes a behavioral state information acquisition unit 40, a behavior pattern information generation unit 42, a behavior score calculation unit 44, a behavior pattern specifying unit 46, a storage control unit 48, and an output control unit 50.

The behavioral state information acquisition unit 40 controls the behavioral state sensor 20 to detect the behavioral state information of the user U, and acquires the behavioral state information detected by the behavioral state sensor 20.

The behavior pattern information generation unit 42 generates behavior pattern information based on the behavioral state information acquired by the behavioral state information acquisition unit 40. For example, the behavior pattern information generation unit 42 generates the behavior pattern information in a multidimensional space whose coordinate axes are at least the parameters of the date and time, the place, and the duration at which the behavioral state of the user U was detected.

The behavior score calculation unit 44 calculates a behavior score based on the behavior pattern information generated by the behavior pattern information generation unit 42. For example, the behavior score calculation unit 44 forms a group for each region of the space in which the density of accumulated behavior pattern information exceeds a predetermined density, and calculates the behavior score of each behavior pattern information group based on the region of space containing that group. Specifically, it calculates, for example, information about the size of that region as the behavior score.
The behavior pattern specifying unit 46 identifies the behavior pattern of the user U based on the behavior score calculated by the behavior score calculation unit 44. The behavior pattern specifying unit 46 determines that the behavior corresponding to a behavior pattern information group whose behavior score falls below a predetermined threshold is a behavior pattern of the user U. The behavior pattern specifying unit 46 identifies the type of behavior the user U was performing based on, for example, the image data, audio data, position information, acceleration information, posture information, infrared and ultraviolet intensity information, temperature information, and humidity information acquired by the behavioral state information acquisition unit 40.
The storage control unit 48 controls the storage unit 28 to store data. It stores information about the behavior pattern of the user U identified by the behavior pattern specifying unit 46 in the storage unit 28 in a predetermined format. The predetermined format will be described later.

The output control unit 50 controls the output unit 24 to produce output. For example, the output control unit 50 controls the display unit 24A to display information about the behavior pattern, and controls the audio output unit 24B to output information about the behavior pattern by voice.
[Processing content]
The processing performed by the information processing apparatus 10 according to the first embodiment will be described with reference to FIG. 3. FIG. 3 is a flowchart showing an example of the processing flow of the information processing apparatus 10 according to the first embodiment.
The control unit 30 acquires behavioral state information about the behavior of the user U from the behavioral state sensor 20 (step S10). Specifically, the behavioral state information acquisition unit 40 acquires, from the camera 20A, image data capturing the surroundings of the information processing apparatus 10 (the user U); from the microphone 20B, audio data of sound collected around the apparatus; from the GNSS receiver 20C, the position information of the apparatus; from the acceleration sensor 20D, acceleration information; from the gyro sensor 20E, posture information; from the optical sensor 20F, intensity information of infrared and ultraviolet light around the apparatus; from the temperature sensor 20G, temperature information around the apparatus; and from the humidity sensor 20H, humidity information around the apparatus. The behavioral state information acquisition unit 40 acquires these pieces of information sequentially at predetermined intervals. The individual pieces of behavioral state information may be acquired at the same timing or at different timings, and the predetermined period until the next acquisition may be set arbitrarily and may be the same or different for each piece of information.
The behavior of the user U can involve the physical body movement of the behavior itself together with three elements: the date and time at which the behavior is performed, the place where it is performed, and the length of time for which it is performed at that place. The behavior of the user U can include not only body movements but also activities such as "playing golf", "watching a movie", and "shopping". Even when the body movements acquired by the behavioral state information acquisition unit 40 are the same, the activity being performed may differ if the position information differs.

The control unit 30 generates behavior pattern information for the user U (step S11). Specifically, the behavior pattern information generation unit 42 generates the behavior pattern information of the user U based on the behavioral state information acquired by the behavioral state information acquisition unit 40. The control unit 30 then groups the behavior pattern information (step S12). Specifically, the behavior pattern information generation unit 42 forms a group for each region of space in which the density of accumulated behavior pattern information exceeds a predetermined density. The control unit 30 then calculates a behavior score (step S13). Specifically, the behavior score calculation unit 44 calculates, as the behavior score, the distance from the center to the edge of the region of space containing a behavior pattern information group; in other words, it calculates the size of that region as the behavior score.
FIG. 4 is a diagram for explaining the multidimensional space in which behavior pattern information is generated.

FIG. 4 shows a three-dimensional space whose coordinate axes are date and time, place, and duration. In FIG. 4, the date-and-time axis runs from 0:00 to 24:00, the place is a one-dimensional straight-line distance from the user's home, and the duration is the time from when a behavior is judged to have started until it is judged to have ended; however, the axes are not limited to these. For example, the place may be a name, an address, or the like.
The behavior pattern information generation unit 42 generates behavior pattern information by plotting points P at predetermined time intervals in the three-dimensional space shown in FIG. 4. The behavior pattern information generation unit 42 plots a point P, for example, every minute, although the interval is not limited to this. Suppose, for example, that the user U watched a movie for "two hours" starting "around noon" in the "vicinity of area A2". In this case, the behavior pattern information generation unit 42 plots points P at predetermined intervals, parallel to the date-and-time axis, from the point where "around noon", "vicinity of area A2", and "two hours" intersect up to "around 14:00". That the user U was in a movie theater can be identified by the behavior pattern specifying unit 46 based on the image data, audio data, position information, acceleration information, posture information, infrared and ultraviolet intensity information, temperature information, humidity information, and the like around the user U. When the behavior of the user U is "purchasing groceries", there can usually be multiple candidate grocery stores, such as supermarkets and department-store food floors.
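As a concrete illustration of this plotting step, the following is a minimal Python sketch. The function name, the coordinate encoding (time of day in hours, distance from home in kilometers, duration in minutes), and the one-minute default interval are assumptions chosen for illustration; the patent does not disclose an implementation.

```python
from datetime import datetime, timedelta

def plot_behavior_points(start, duration_min, distance_from_home_km, step_min=1):
    """Generate points P in the (date-and-time, place, duration) space.

    Each point is (time of day in hours, distance from home in km,
    duration in minutes), plotted every `step_min` minutes from the
    start of the behavior to its end, parallel to the date-and-time axis.
    """
    points = []
    t = start
    end = start + timedelta(minutes=duration_min)
    while t <= end:
        time_of_day = t.hour + t.minute / 60.0
        points.append((time_of_day, distance_from_home_km, duration_min))
        t += timedelta(minutes=step_min)
    return points

# Example: a two-hour movie starting around noon, about 3 km from home.
points = plot_behavior_points(datetime(2021, 9, 15, 12, 0), 120, 3.0)
print(len(points), points[0], points[-1])  # 121 points from 12:00 to 14:00
```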
In the three-dimensional space shown in FIG. 4, the behavior pattern information generation unit 42 groups, as the same behavior pattern, each region in which the density of the generated behavior pattern information (the accumulated points P) exceeds a predetermined density. For example, the behavior pattern information generation unit 42 scans the space S in which behavior pattern information can be generated using a unit space US, and counts the number of points P contained in the unit space US at each location in the space S. The behavior pattern information generation unit 42 identifies, for example, the location where the unit space US contains the largest number of points P as the center of the behavior pattern information. It then counts the points P contained in the unit spaces US surrounding that densest unit space, and includes in the same group G every unit space US containing at least about 60% as many points P as the densest unit space. The behavior score calculation unit 44 calculates, for example, the length of a perpendicular drawn from the center of the group G to one of its faces as the behavior score, or the volume of the group G as the behavior score. The behavior pattern information generation unit 42 may identify a plurality of groups G in the three-dimensional space shown in FIG. 4.
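The scanning and grouping just described can be sketched as follows, assuming a fixed rectangular grid of unit spaces and a bounding-box measure of the group's size. This is a simplification: it keeps every sufficiently dense cell rather than only cells contiguous with the densest one, and the cell sizes, the 60% ratio, and the two score variants follow the description above only loosely.

```python
import numpy as np

def group_and_score(points, cell_size=(1.0, 0.5, 10.0), ratio=0.6):
    """Bin points P into unit cells, take the densest cell as the group's
    center, keep every cell holding at least `ratio` of that peak count,
    and score the group by the size of the space it occupies."""
    pts = np.asarray(points, dtype=float)
    cells = np.floor(pts / np.asarray(cell_size)).astype(int)
    uniq, counts = np.unique(cells, axis=0, return_counts=True)
    peak = counts.max()
    member_cells = uniq[counts >= ratio * peak]  # cells forming group G
    # Bounding extent of group G in real units, per axis.
    extent = (member_cells.max(axis=0) - member_cells.min(axis=0) + 1) \
             * np.asarray(cell_size)
    volume_score = float(np.prod(extent))         # volume of G
    half_extent_score = float(extent.max() / 2.0) # center-to-face distance
    return member_cells, volume_score, half_extent_score
```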
Here, the behavior of the user U may also include purchases of items that are bought infrequently, such as "going to a dealer" to "buy a car" or "going to a home improvement store" to "buy a chair". Such behaviors would be plotted at scattered places, durations, and dates and times in the three-dimensional space shown in FIG. 4. In contrast, if, for example, the user U does "exercises" for "about 10 minutes" in a nearby "park" at 6:00 every morning, the points plotted in the three-dimensional space become more densely clustered with each passing day. Even within the same park, the exact exercise spot may vary by several meters to several tens of meters from day to day, and the duration may vary by around ±2 minutes from the 10 minutes. For this reason, the behavior pattern information generation unit 42 plots behavior pattern information with a bias, forming a relatively large dense cluster in the three-dimensional space. In this way, by choosing the relatively simple parameters of time of day, place, and duration as coordinate axes, a dense cluster with a characteristic spread in the three-dimensional space can be treated as a single behavior pattern.

In the example shown in FIG. 4, the behavior pattern of the user U is represented in a three-dimensional space, but the present disclosure is not limited to this. In this embodiment, the behavior pattern of the user U may be generated in any multidimensional space. For example, although the place was taken as the "straight-line distance from home", it may instead be expressed as "latitude" and "longitude"; in that case, the space in which behavior pattern information is generated becomes four-dimensional. Further, although the date-and-time axis represented one day as "0:00 to 24:00", an axis with a scalar value of seven units may be added for the "day of the week", giving a five-dimensional space. In that case, if, for example, the user U commutes from Monday to Friday and has Saturday and Sunday off, the plots of the behavior pattern information groups can change greatly between weekdays and the weekend. Viewed along the "day of the week" axis, it becomes easier to distinguish behavior patterns of daily behavior from those of extraordinary behavior. Behavior patterns of daily and extraordinary behavior are described later.

The duration was taken as the time from the start to the end of a behavior, but the present disclosure is not limited to this. For behaviors that are intermittent, continuing events, such as "swinging a bat 200 times" or "walking (or running) a certain number of steps", a frequency may be used instead. For example, if the user U has a habit of exercising regularly, replacing duration with frequency makes it possible to treat all such behavior patterns in terms of the "exercise" parameter of "movement", increasing the likelihood that the data can be presented as something of interest to the user U.
Returning to FIG. 3, the control unit 30 determines whether the behavior score is less than a threshold (step S14). Specifically, the behavior pattern specifying unit 46 determines whether the behavior score calculated by the behavior score calculation unit 44 in step S13 is less than a predetermined threshold. If it is determined to be less than the threshold (step S14; Yes), the process proceeds to step S15; if not (step S14; No), the process proceeds to step S18.

If the determination in step S14 is Yes, the control unit 30 identifies a behavior pattern (step S15). Specifically, the behavior pattern specifying unit 46 identifies the behavior corresponding to the behavior pattern information group whose behavior score was less than the predetermined threshold as a behavior pattern of the user U.

The control unit 30 identifies the type of behavioral state of the identified behavior pattern (step S16). Specifically, the behavior pattern specifying unit 46 may identify the type of behavioral state the user U was in based on the behavioral state information acquired by the behavioral state information acquisition unit 40.
More specifically, the behavior pattern specifying unit 46 may identify the behavioral state of the user U using, for example, the learning model 28A. The learning model 28A is an AI (Artificial Intelligence) model constructed by learning, as training data, a plurality of data sets, each consisting of a detection result of the behavioral state sensor 20 paired with information indicating the type of behavioral state that the detection result represents. The behavior pattern specifying unit 46 inputs the detection result of the behavioral state sensor 20 into the trained learning model 28A, obtains information indicating the type of behavioral state represented by that detection result, and thereby identifies the type of behavioral state of the user U. Using the trained learning model 28A, the behavior pattern specifying unit 46 identifies, for example, that the user U is playing golf, shopping, or at a movie theater.
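As a loose illustration of this classification step only: the patent does not disclose the architecture of the learning model 28A, so the feature layout and the scikit-learn classifier below are assumptions.

```python
from sklearn.ensemble import RandomForestClassifier

# Each training sample pairs a sensor feature vector with a behavior label,
# mirroring the data sets described for learning model 28A.
X_train = [
    # [speed_mps, ambient_noise_dB, temperature_C]  (assumed features)
    [0.2, 45.0, 22.0],
    [1.5, 70.0, 28.0],
]
y_train = ["watching_movie", "playing_golf"]

model = RandomForestClassifier().fit(X_train, y_train)
behavior_type = model.predict([[0.3, 50.0, 23.0]])[0]
print(behavior_type)
```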
The control unit 30 stores the behavior pattern in the storage unit 28 (step S17). Specifically, the storage control unit 48 records the behavior pattern identified in step S16 in a predetermined format.
FIG. 5 is a diagram for explaining a format for storing behavior patterns. In this embodiment, behavior patterns can be associated in advance with places from which, for example, roughly 16 to 64 types of behavior can be predicted. For example, a "golf course" predicts "playing golf", a "movie theater" predicts "watching a movie", a "shopping center" predicts "shopping", and a "park" predicts "doing exercises". Since the position information of the user U can be acquired from the GNSS receiver 20C of the behavioral state sensor 20, the places visited by the user U can be registered over a period of several weeks. This allows candidates for behavior patterns related to the lifestyle of the user U to be numbered in order, as sketched below.
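A minimal sketch of such a table and of the numbering of observed candidates might look as follows; the entries and names are invented for illustration, since the actual 16 to 64 associations are not listed.

```python
# Illustrative only: a predefined place-to-predicted-action table of the
# kind described above.
PLACE_TO_ACTION = {
    "golf_course": "playing golf",
    "movie_theater": "watching a movie",
    "shopping_center": "shopping",
    "park": "doing exercises",
}

# Candidate behavior patterns observed over several weeks are numbered in order.
visited_places = ["park", "movie_theater", "park", "golf_course"]
numbered = {place: i for i, place in enumerate(dict.fromkeys(visited_places))}
print(numbered)  # {'park': 0, 'movie_theater': 1, 'golf_course': 2}
```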
As shown in FIG. 5, the storage format F1 can include areas D1, D2, D3, D4, and D5.

Area D1 stores the number assigned to the identified behavior pattern and consists of, for example, 3 bytes. Area D2 stores the number of dimensions of the space in which the behavior pattern information group was plotted and consists of, for example, 1 byte; in that case, the space can have up to 255 dimensions. Area D3 stores the behavior score R of the behavior pattern information group determined to be a behavior pattern of the user U. The behavior score R carries the fluctuation of the behavior-pattern judgment error: the smaller the value of R, the higher the reliability of the behavior pattern. Area D4 is a reserve area. Area D5, the final bit of the last byte of the reserve area, stores an identifier indicating whether the behavior pattern is daily or extraordinary: 0 is written for a behavior pattern of daily behavior, described later, and 1 for a behavior pattern of extraordinary behavior. The reserve area can be used to add incidental information when each behavior pattern occurs; it has, for example, an area of 6 bytes or more, and numerical information corresponding to each of N dimensions (N being an arbitrary integer) may be written in it.
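A minimal Python sketch of packing a record in this format is shown below. The 3-byte width of D1 and the 1-byte width of D2 follow the description; encoding the score R as a 4-byte float and using exactly 6 reserve bytes for D4 are assumptions, since the patent leaves those widths open.

```python
import struct

def pack_behavior_record(pattern_no, n_dims, score, extraordinary, reserve=bytes(6)):
    """Sketch of storage format F1 under the assumptions stated above."""
    d1 = pattern_no.to_bytes(3, "big")  # D1: numbered behavior pattern
    d2 = n_dims.to_bytes(1, "big")      # D2: dimensionality (up to 255)
    d3 = struct.pack(">f", score)       # D3: behavior score R (assumed float32)
    d4 = bytearray(reserve)             # D4: reserve area
    # D5: final bit of the last reserve byte, 0 = daily, 1 = extraordinary.
    if extraordinary:
        d4[-1] |= 0x01
    else:
        d4[-1] &= 0xFE
    return d1 + d2 + d3 + bytes(d4)

record = pack_behavior_record(pattern_no=7, n_dims=3, score=1.8, extraordinary=False)
print(record.hex())
```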
Returning to FIG. 3, the control unit 30 determines whether there is another group (step S18). Specifically, the behavior score calculation unit 44 determines whether there remains a grouped behavior pattern information group for which a behavior score should be calculated. If it is determined that there is another group (step S18; Yes), the process returns to step S13; if not (step S18; No), the process proceeds to step S19.

The control unit 30 determines whether to end the processing (step S19). Specifically, the control unit 30 determines that the processing is to be ended when it receives an operation to end the processing or an operation to turn off the power. If it is determined not to end the processing (step S19; No), the process returns to step S10. If it is determined to end the processing (step S19; Yes), the processing of FIG. 3 ends.

As described above, the information processing apparatus 10 according to the first embodiment detects the behavioral state of the user U and generates behavior pattern information groups corresponding to that behavioral state in a multidimensional space. The information processing apparatus 10 according to the first embodiment can thereby identify the behavior patterns of the user U based on the behavior pattern information groups.
[Second Embodiment]
Next, a second embodiment will be described. FIG. 6 is a flowchart showing an example of the processing flow of the information processing apparatus according to the second embodiment. The configuration of the information processing apparatus according to the second embodiment is the same as that of the information processing apparatus 10 shown in FIG. 2, so its description is omitted.
In the second embodiment, the information processing apparatus 10 determines whether an identified behavior pattern of the user U is an ordinary, daily behavior or an extraordinary, non-daily behavior.

FIG. 7 is a diagram for explaining daily behavior and extraordinary behavior.

In FIG. 7, a space SA represents, for example, a range grouped as the same behavior pattern. A space SB has, for example, the same center as the space SA and a volume of about 60% of that of the space SA.
In the second embodiment, the behavior pattern information contained in the space SB corresponds to behavior patterns whose behavior score is less than a first threshold; such behavior patterns are determined to be behavior patterns of daily behavior. In the example shown in FIG. 7, the behavior pattern corresponding to a point P1 in the space SB is determined to be a daily behavior pattern.

Behavior patterns contained in the region between the spaces SA and SB have behavior scores that are at least the first threshold and less than a second threshold; these are determined to be behavior patterns of extraordinary behavior. In the example shown in FIG. 7, the behavior pattern corresponding to a point P2 between the spaces SA and SB is determined to be an extraordinary behavior pattern.

Behavior patterns contained in the space outside the space SA have behavior scores of at least the second threshold; these are excluded and are not counted as behavior patterns of the user. A sketch of this two-threshold rule follows.
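The two-threshold rule of the second embodiment reduces to a few lines of Python; the threshold values themselves are not specified in the text and are placeholders here.

```python
def classify_behavior(score, first_threshold, second_threshold):
    """Scores inside SB are daily, scores between SB and SA are
    extraordinary, and scores outside SA are discarded."""
    if score < first_threshold:
        return "daily"          # inside space SB (e.g., point P1)
    if score < second_threshold:
        return "extraordinary"  # between SB and SA (e.g., point P2)
    return None                 # outside SA: not treated as a behavior pattern

print(classify_behavior(0.4, 1.0, 2.0))  # daily
print(classify_behavior(1.5, 1.0, 2.0))  # extraordinary
```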
Returning to FIG. 6, the processes of steps S20 to S23 are the same as those of steps S10 to S13 shown in FIG. 3, so their descriptions are omitted.

The control unit 30 determines whether the behavior score is less than the first threshold (step S24). Specifically, the behavior pattern specifying unit 46 determines whether the behavior score calculated by the behavior score calculation unit 44 in step S23 is less than the predetermined first threshold. If so (step S24; Yes), the process proceeds to step S25; if not (step S24; No), the process proceeds to step S28.

If the determination in step S24 is Yes, the control unit 30 identifies a behavior pattern of daily behavior (step S25). Specifically, the behavior pattern specifying unit 46 identifies the behavior corresponding to the behavior pattern information group whose behavior score was less than the predetermined first threshold as a daily behavior pattern of the user U.

The control unit 30 identifies the type of behavioral state of the identified daily behavior pattern (step S26). Specifically, the behavior pattern specifying unit 46 may identify the type of behavioral state of the daily behavior performed by the user U based on the behavioral state information acquired by the behavioral state information acquisition unit 40.

The control unit 30 stores the daily behavior pattern in the storage unit 28 (step S27). Specifically, the storage control unit 48 records the daily behavior pattern identified in step S25 in a predetermined format.
If the determination in step S24 is No, the control unit 30 determines whether the behavior score is at least the first threshold and less than the second threshold (step S28). Specifically, the behavior pattern specifying unit 46 determines whether the behavior score calculated by the behavior score calculation unit 44 in step S23 is at least the predetermined first threshold and less than the second threshold. If so (step S28; Yes), the process proceeds to step S29; if not (step S28; No), the process proceeds to step S32.

If the determination in step S28 is Yes, the control unit 30 identifies a behavior pattern of extraordinary behavior (step S29). Specifically, the behavior pattern specifying unit 46 determines that the behavior corresponding to the behavior pattern information group whose behavior score was at least the predetermined first threshold and less than the second threshold is an extraordinary behavior pattern.

The control unit 30 identifies the type of behavioral state of the identified extraordinary behavior pattern (step S30). Specifically, the behavior pattern specifying unit 46 may identify the type of behavioral state of the extraordinary behavior performed by the user U based on the behavioral state information acquired by the behavioral state information acquisition unit 40.

The control unit 30 stores the extraordinary behavior pattern in the storage unit 28 (step S31). Specifically, the storage control unit 48 records the extraordinary behavior pattern identified in step S29 in a predetermined format.

The processes of steps S32 and S33 are the same as those of steps S18 and S19 shown in FIG. 3, respectively, so their descriptions are omitted.
As described above, the information processing apparatus 10 according to the second embodiment determines, based on the behavior score, whether a behavior pattern is daily behavior or extraordinary behavior. The information processing apparatus 10 according to the second embodiment can thereby determine whether a given behavior is a daily routine or a new behavior.

Specifically, the distinction between daily and extraordinary behavior in the second embodiment can be used, for example, to judge whether a behavior pattern occurring during weekday commuting hours is a routine behavior or one performed intentionally, that is, whether it reflects any interest. For example, when the user commutes between the same stations at a fixed time every day, walking toward the station at that same time is a routine rather than an active behavior taken out of interest, and the data of that behavior pattern can be ignored in statistical calculations.
[Third Embodiment]
Next, an information processing apparatus according to a third embodiment will be described with reference to FIG. 8. FIG. 8 is a block diagram showing a configuration example of the information processing apparatus according to the third embodiment.
As shown in FIG. 8, an information processing apparatus 10a differs from the information processing apparatus 10 shown in FIG. 2 in that it includes a biosensor 32 and in that its control unit 30a includes a biometric information acquisition unit 52 and an activity score calculation unit 54.

The behavior of the user U is accompanied not only by physical movement but also by biometric information such as the degree of excitement. Therefore, when identifying the behavior pattern of the user U, it is preferable to take into account the psychological state of the user U during the behavior. The information processing apparatus 10a calculates an activity score indicating the degree to which the user U is enjoying a behavior.

The biosensor 32 is a sensor that detects biometric information of the user U and may be provided at any position where it can do so. The biometric information here is preferably not invariant information such as a fingerprint but information whose value changes according to the state of the user U; more specifically, it is preferably information about the autonomic nervous system of the user U, that is, information whose value changes regardless of the user's intention. Concretely, the biosensor 32 includes a pulse wave sensor 32A and detects the pulse wave of the user U as biometric information. The biosensor 32 may also include an electroencephalogram sensor that detects the brain waves of the user U.

The pulse wave sensor 32A is a sensor that detects the pulse wave of the user U, and may be, for example, a transmissive photoelectric sensor including a light-emitting unit and a light-receiving unit. In that case, the pulse wave sensor 32A may be configured so that, for example, the light-emitting unit and the light-receiving unit face each other across a fingertip of the user U; the light-receiving unit receives the light transmitted through the fingertip, and the pulse waveform is measured by exploiting the fact that blood flow increases as the pulse-wave pressure increases. However, the pulse wave sensor 32A is not limited to this and may use any method capable of detecting a pulse wave.
The biometric information acquisition unit 52 controls the biosensor 32 to detect biometric information and acquires the biometric information detected by the biosensor 32.

The activity score calculation unit 54 calculates an autonomic nervous activity level based on the biometric information acquired by the biometric information acquisition unit 52; the calculation method is described later. The activity score calculation unit 54 then calculates the activity score based on the behavior score calculated by the behavior score calculation unit 44, the behavior pattern identified by the behavior pattern specifying unit 46, and the autonomic nervous activity level.
[Processing content]
The processing performed by the information processing apparatus 10a according to the third embodiment will be described with reference to FIG. 9. FIG. 9 is a flowchart showing an example of the processing flow of the information processing apparatus 10a according to the third embodiment.
The process of step S40 is the same as that of step S10 shown in FIG. 3, so its description is omitted.

The control unit 30a acquires the biometric information of the user U (step S41). Specifically, the biometric information acquisition unit 52 controls the pulse wave sensor 32A of the biosensor 32 to acquire the pulse wave information of the user U. In this embodiment, as described later, the pulse wave information of the user U is used to calculate the autonomic nervous activity level, which serves as an index of the degrees of psychological stress, relaxation, interest, and concentration of the user U.

The processes of steps S42 to S47 are the same as those of steps S12 to S17 shown in FIG. 3, respectively, so their descriptions are omitted.

The control unit 30a calculates the activity score (step S48). Specifically, the activity score calculation unit 54 calculates the activity score of the user U based on the pulse wave information acquired in step S41.
The pulse wave will be described with reference to FIG. 10. FIG. 10 is a graph showing an example of a pulse wave. As shown in FIG. 10, the pulse wave is a waveform in which a peak called the R wave WR appears at regular intervals. The pulse beat arises from the spontaneous firing of pacemaker cells in the sinus node of the heart, and its rhythm is strongly influenced by both the sympathetic and parasympathetic nervous systems: the sympathetic nerves promote cardiac activity, the parasympathetic nerves suppress it, and the two normally act in opposition. At rest or in a near-resting state, the parasympathetic system is dominant. The pulse rate typically increases when sympathetic excitation causes adrenaline to be secreted and decreases when parasympathetic excitation causes acetylcholine to be secreted. It is therefore considered useful for testing autonomic nervous function to examine the fluctuation of the R-R interval in an electrocardiogram, as described in Fujimoto et al., "Normal reference values and standard prediction formulas for autonomic nervous function tests using fluctuations in the electrocardiogram R-R interval" (The Autonomic Nervous System, Vol. 30, No. 2, 1987, pp. 167-173), and Hayano et al., "Heart rate variability and autonomic nervous function" (Biophysics, 28-4, pp. 32-36, 1988). As shown in FIG. 10, the R-R interval is the interval between successive R waves WR in a time series. Heart rate variability is measured by treating the R wave, the apex of the QRS complex of the signal waveform, as one beat. The fluctuation of the R-wave interval of the electrocardiogram, that is, the fluctuation of the time interval of the R-R interval shown in FIG. 10, is used as an autonomic nervous index, and the validity of doing so has been reported by many medical institutions. The fluctuation of the R-R interval becomes larger at rest and smaller under stress.

The fluctuation of the R-R interval contains several characteristic components. One is a low-frequency component appearing near 0.1 Hz, which derives from the modulation of sympathetic nervous activity accompanying the feedback regulation of vascular blood pressure. The other is a modulation synchronized with respiration, a high-frequency component reflecting respiratory sinus arrhythmia. The high-frequency component reflects the direct influence of the respiratory center on preganglionic vagal neurons and the baroreflex responses to lung stretch receptors and respiration-induced blood-pressure changes, and is regarded as an index of the parasympathetic activity that mainly affects the heart. In other words, of the waveform components obtained by measuring the fluctuation between the R waves of the pulse wave, the power spectrum of the low-frequency component indicates the activity of the sympathetic nervous system, and that of the high-frequency component indicates the activity of the parasympathetic nervous system.
 入力される脈波の揺らぎは、R-R間隔値の微分値で求められる。この場合、R-R間隔の微分値が時間的に等間隔のデータでない場合、活性度スコア算出部54は、三次元スプライン補間などを用いて等間隔な時系列データに変換する。活性度スコア算出部54は、R-R間隔の微分値を、高速フーリエ変換などで直交変換を行う。これにより、活性度スコア算出部54は、脈波のR-R間隔値の微分値の高周波成分のパワースペクトルと、低周波成分のパワースペクトルを算出する。活性度スコア算出部54は、高周波成分のパワースペクトルの総和をRRHFとして算出する。活性度スコア算出部54は、低周波成分のパワースペクトルの総和のRRLFとして算出する。活性度スコア算出部54は、式(1)を用いて、自律神経活性度を算出する。活性度スコア算出部54は、自律神経活性度算出部と呼ばれることもある。 The fluctuation of the input pulse wave is obtained by the differential value of the RR interval value. In this case, if the differential value of the RR interval is not the data at equal intervals in time, the activity score calculation unit 54 converts the data into time series data at equal intervals by using three-dimensional spline interpolation or the like. The activity score calculation unit 54 performs orthogonal transformation of the differential value of the RR interval by fast Fourier transform or the like. As a result, the activity score calculation unit 54 calculates the power spectrum of the high frequency component of the differential value of the RR interval value of the pulse wave and the power spectrum of the low frequency component. The activity score calculation unit 54 calculates the sum of the power spectra of the high frequency components as RRHF. The activity score calculation unit 54 calculates it as RRLF, which is the sum of the power spectra of the low frequency components. The activity score calculation unit 54 calculates the autonomic nerve activity using the formula (1). The activity score calculation unit 54 may be referred to as an autonomic nerve activity calculation unit.
 Formula (1): [the equation is presented as an image in the original (JPOXMLDOC01-appb-M000001) and is not reproduced here; it gives the autonomic nervous activity level AN as a function of RRHF and RRLF with the constants C1 and C2]
 In formula (1), AN is the autonomic nervous activity level, RRHF is the sum of the power spectrum of the high-frequency component, and RRLF is the sum of the power spectrum of the low-frequency component. C1 and C2 are fixed values defined in order to suppress divergence of the solution for AN.
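 As a concrete illustration of the processing described above, the following sketch computes RRHF and RRLF from detected R-peak times. It assumes NumPy and SciPy, a resampling rate of 4 Hz, and the conventional LF (0.04 to 0.15 Hz) and HF (0.15 to 0.40 Hz) band edges; none of these values are specified in this description. The final combination into AN follows formula (1), which is given only as an image, so it is left as a stub.

# Minimal sketch of the R-R spectral analysis described above.
# The band edges and resampling rate are assumptions, not values from this description.
import numpy as np
from scipy.interpolate import CubicSpline

def rr_power_sums(r_peak_times, fs=4.0, lf=(0.04, 0.15), hf=(0.15, 0.40)):
    r_peak_times = np.asarray(r_peak_times, dtype=float)
    rr = np.diff(r_peak_times)        # R-R interval values [s]
    drr = np.diff(rr)                 # differential of the R-R interval values
    t = r_peak_times[2:]              # time stamps of the differential samples
    # Convert to equidistant time-series data by cubic spline interpolation
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    drr_eq = CubicSpline(t, drr)(grid)
    # Orthogonal transform (fast Fourier transform) -> power spectrum
    power = np.abs(np.fft.rfft(drr_eq)) ** 2
    freq = np.fft.rfftfreq(len(drr_eq), d=1.0 / fs)
    rrlf = power[(freq >= lf[0]) & (freq < lf[1])].sum()  # RRLF: sympathetic side
    rrhf = power[(freq >= hf[0]) & (freq < hf[1])].sum()  # RRHF: parasympathetic side
    return rrhf, rrlf

# AN itself would then be computed from (RRHF, RRLF) with the constants C1 and C2
# according to formula (1), whose exact form is given only as an image here.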
 The activity score calculation unit 54 calculates the activity score using formula (2).
 Formula (2): [the equation is presented as an image in the original (JPOXMLDOC01-appb-M000002) and is not reproduced here; it gives the activity score NS as a function of the behavior pattern AP, the behavior score R, and the autonomic nervous activity level AN]
 In formula (2), NS is the activity score, AP is the behavior pattern, R is the behavior score, and AN is the autonomic nervous activity level. That is, the activity score calculation unit 54 may calculate the activity score using a function that takes the behavior pattern, the behavior score, and the autonomic nervous activity level as parameters.
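 Since formula (2) itself is given only as an image, the following sketch merely illustrates the stated structure NS = f(AP, R, AN). The per-pattern coefficient table AP_COEFF and the multiplicative combination are hypothetical stand-ins, not the formula of this description.

# Purely illustrative stand-in for formula (2): NS = f(AP, R, AN).
# AP_COEFF and the multiplicative combination below are hypothetical.
AP_COEFF = {"MP1": 1.0, "MP2": 0.8, "MP3": 1.2}

def activity_score(ap, r, an):
    # Combine behavior pattern AP, behavior score R, and autonomic activity AN
    return AP_COEFF.get(ap, 1.0) * r * an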
 Further, the activity score calculation unit 54 may calculate the activity score using, for example, a learning model. The learning model is an AI model constructed by treating a behavior pattern, a behavior score, an autonomic nervous activity level, and an activity score as one data set and learning from a plurality of such data sets as teacher data. In this case, the activity score calculation unit 54 inputs the behavior pattern, the behavior score, and the autonomic nervous activity level into the trained learning model, acquires information indicating the activity score, and thereby calculates the activity score.
 The control unit 30a presents the activity score to the user U (step S49). Specifically, the output control unit 50 controls at least one of the display unit 24A and the audio output unit 24B to present the activity score to the user U.
 The control unit 30a stores the behavior pattern and the activity score in the storage unit 28 (step S50). Specifically, the storage control unit 48 records the behavior pattern identified in step S46 and the activity score in a predetermined format.
 FIG. 11 is a diagram for explaining a format for storing a behavior pattern. As shown in FIG. 11, the storage format F2 may include an area D1a, an area D2a, an area D3a, an area D4a, an area D5a, and an area D6a.
 In the area D1a, a number assigned to the identified behavior pattern is stored; the area D1a consists of, for example, 3 bytes. In the area D2a, the number of dimensions of the space in which the behavior pattern information is plotted as a group is stored; the area D2a consists of, for example, 1 byte. In the area D3a, the behavior score R of the behavior pattern information group determined to be the behavior pattern of the user U is stored. In the area D4a, the autonomic nervous activity level is stored; the area D4a consists of, for example, 2 bytes. In the area D5a, the activity score is stored; the area D5a consists of, for example, 2 bytes. The area D6a is a reserved area.
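 One record of the storage format F2 could be packed as follows; this is a sketch that assumes big-endian unsigned fields, a 1-byte width for the area D3a, and a 1-byte reserved area D6a, none of which are fixed by the description above.

# Sketch of packing one record of the storage format F2.
# The widths of D3a (behavior score) and D6a (reserve) are assumptions.
def pack_record(pattern_no, dims, behavior_score, autonomic_activity, activity_score):
    rec = pattern_no.to_bytes(3, "big")           # D1a: behavior pattern number (3 bytes)
    rec += dims.to_bytes(1, "big")                # D2a: number of dimensions (1 byte)
    rec += behavior_score.to_bytes(1, "big")      # D3a: behavior score R (width assumed)
    rec += autonomic_activity.to_bytes(2, "big")  # D4a: autonomic activity level (2 bytes)
    rec += activity_score.to_bytes(2, "big")      # D5a: activity score (2 bytes)
    rec += b"\x00"                                # D6a: reserved area (width assumed)
    return rec

assert len(pack_record(4, 3, 8, 512, 90)) == 10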
 The processes of steps S51 and S52 are the same as the processes of steps S18 and S19 shown in FIG. 3, respectively, and their description is therefore omitted.
 As described above, the information processing device 10a according to the third embodiment can calculate an activity score indicating the degree to which the user U enjoyed the behavior identified as the behavior pattern of the user U while performing that behavior. As a result, the information processing device 10a according to the third embodiment can identify the behavior pattern of the user U more appropriately.
 [Fourth Embodiment]
 Next, the information processing device according to the fourth embodiment will be described with reference to FIG. 12. FIG. 12 is a block diagram showing a configuration example of the information processing device according to the fourth embodiment.
 As shown in FIG. 12, the information processing device 10b differs from the information processing device 10a shown in FIG. 8 in that the storage unit 28a stores correction data 28C and the control unit 30b includes an activity score correction unit 56.
 When the country or region changes, even the same behavior pattern may be perceived differently owing to differences in sensibility and the like. The information processing device 10b therefore corrects the activity score according to the country or region.
 The correction data 28C are data used when the activity score correction unit 56 corrects the activity score. The correction data 28C are, for example, data in which behavior patterns are associated with correction coefficients by which the activity score is multiplied according to the country, region, or the like.
 FIG. 13 is a diagram for explaining an example of the correction data. FIG. 13 shows a behavior pattern MP1, a behavior pattern MP2, and a behavior pattern MP3 as behavior patterns, and an area A1, an area A2, and an area A3 as countries or regions. In FIG. 13, the behavior patterns are shown conceptually, such as the behavior pattern MP1, but in practice they are shown concretely, such as "playing golf". Likewise, the countries or regions are shown conceptually, such as the area A1, but in practice a concrete country name such as Japan or a concrete region name such as Tokyo is shown.
 As shown in FIG. 13, in the area A1, the activity score of the behavior pattern MP1 is multiplied by 0.5, that of the behavior pattern MP2 by 0.2, and that of the behavior pattern MP3 by 0.1. In the area A2, the activity score of the behavior pattern MP1 is multiplied by 0.2, that of the behavior pattern MP2 by 0.6, and that of the behavior pattern MP3 by 0.9. In the area A3, the activity score of the behavior pattern MP1 is multiplied by 0.3, that of the behavior pattern MP2 by 0.7, and that of the behavior pattern MP3 by 0.5. As shown in FIG. 13, even for the same behavior pattern, the correction coefficient by which the activity score is multiplied can vary from one country or region to another. The correction coefficients are generated, for example, by conducting questionnaire surveys for each country or region on behavior patterns that can be assumed in advance. Although FIG. 13 classifies the correction coefficients by area, such as the areas A1 to A3, the correction coefficients may be classified under predetermined conditions other than area; for example, depending on the type of target behavior pattern, they may be classified by age, gender, or the like.
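 As a concrete illustration, the table of FIG. 13 can be held as a nested mapping and applied by a single multiplication; the coefficients below are the values quoted above, while the data structure itself is only one possible realization.

# Correction data 28C of FIG. 13 as a nested mapping: area -> pattern -> coefficient.
CORRECTION_28C = {
    "A1": {"MP1": 0.5, "MP2": 0.2, "MP3": 0.1},
    "A2": {"MP1": 0.2, "MP2": 0.6, "MP3": 0.9},
    "A3": {"MP1": 0.3, "MP2": 0.7, "MP3": 0.5},
}

def correct_activity_score(score, area, pattern):
    # Multiply the activity score by the coefficient for (area, behavior pattern)
    return score * CORRECTION_28C[area][pattern]

# e.g. an activity score of 80 for the pattern MP2 observed in the area A2 becomes 48.0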
 The activity score correction unit 56 corrects the activity score calculated by the activity score calculation unit 54. Specifically, the activity score correction unit 56 corrects the activity score using the correction data 28C, based on the country or region in which the behavior pattern was identified.
 [Processing Content]
 The processing content of the information processing device 10b according to the fourth embodiment will be described with reference to FIG. 14. FIG. 14 is a flowchart showing an example of the processing flow of the information processing device 10b according to the fourth embodiment.
 The processes of steps S60 to S68 are the same as the processes of steps S40 to S48 shown in FIG. 9, respectively, and their description is therefore omitted.
 The control unit 30b corrects the activity score (step S69). Specifically, the activity score correction unit 56 corrects the activity score calculated by the activity score calculation unit 54, using the correction data 28C, based on the position information identified by the behavior pattern identification unit 46.
 The control unit 30b presents the corrected activity score to the user U (step S70). Specifically, the output control unit 50 controls at least one of the display unit 24A and the audio output unit 24B to present the corrected activity score to the user U.
 The control unit 30b stores the behavior pattern and the corrected activity score in the storage unit 28 (step S71). Specifically, the storage control unit 48 records the behavior pattern identified in step S66 and the corrected activity score in a predetermined format.
 The processes of steps S72 and S73 are the same as the processes of steps S51 and S52 shown in FIG. 9, respectively, and their description is therefore omitted.
 As described above, the information processing device 10b according to the fourth embodiment corrects the activity score by multiplying it by a correction coefficient according to the country or region. As a result, the information processing device 10b according to the fourth embodiment can correct the activity score more appropriately according to the country or region.
 [Modification of the Fourth Embodiment]
 Next, a modification of the fourth embodiment will be described. In the fourth embodiment, as shown in FIG. 13, the activity score is corrected using the correction data 28C in which behavior patterns and areas are associated with each other. However, the correction data need not be associated with behavior patterns; that is, a dedicated correction coefficient may be defined for each area. In this case, the activity score correction unit 56 may correct the autonomic nervous activity level calculated by the activity score calculation unit 54 based on the correction data. In this case, the activity score correction unit 56 may also be referred to as an autonomic nervous activity correction unit.
 FIG. 15 is a diagram for explaining an example of the correction data. As shown in FIG. 15, in the correction data 28Ca, a correction coefficient is defined for each area. In the example shown in FIG. 15, the correction coefficient of the area A1 is 0.5, the correction coefficient of the area A2 is 0.2, and the correction coefficient of the area A3 is 0.3.
 For example, when it is determined that the area in which the autonomic nervous activity level of the user U was calculated is the area A1, the activity score correction unit 56 corrects the autonomic nervous activity level by multiplying the calculated value by 0.5. Thus, in the modification of the fourth embodiment, the autonomic nervous activity level can be calculated more appropriately according to each region.
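 In this modification, the lookup collapses to a single per-area table applied to the autonomic nervous activity level AN rather than to the activity score; a minimal sketch with the coefficients quoted above:

# Correction data 28Ca of FIG. 15: one coefficient per area, applied to AN.
CORRECTION_28CA = {"A1": 0.5, "A2": 0.2, "A3": 0.3}

def correct_autonomic_activity(an, area):
    # Correct the autonomic nervous activity level AN for the given area
    return an * CORRECTION_28CA[area]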
 [Fifth Embodiment]
 Next, the fifth embodiment will be described. FIG. 16 is a flowchart showing an example of the processing flow of the information processing device according to the fifth embodiment. The configuration of the information processing device according to the fifth embodiment is the same as that of the information processing device 10b shown in FIG. 12, and its description is therefore omitted.
 In the fifth embodiment, the information processing device 10b calculates activity scores independently for the identified behavior pattern of daily behavior of the user U and for the identified behavior pattern of extraordinary behavior of the user U. Further, in the fifth embodiment, the information processing device 10b independently corrects the calculated activity score of the behavior pattern of daily behavior of the user U and the calculated activity score of the behavior pattern of extraordinary behavior.
 The processes of steps S80 to S84 are the same as the processes of steps S40 to S44 shown in FIG. 9, respectively, and their description is therefore omitted.
 The processes of steps S85 to S87 are the same as the processes of steps S24 to S26 shown in FIG. 6, respectively, and their description is therefore omitted.
 The control unit 30b calculates the activity score of the behavior pattern of daily behavior (step S88). Specifically, the activity score calculation unit 54 calculates the activity score of the behavior pattern of daily behavior of the user U based on the pulse wave information acquired in step S81.
 The control unit 30b corrects the activity score of the behavior pattern of daily behavior (step S89). Specifically, the activity score correction unit 56 corrects the activity score of the behavior pattern of daily behavior calculated by the activity score calculation unit 54, using the correction data 28C, based on the position information identified by the behavior pattern identification unit 46.
 The control unit 30b presents the corrected activity score of the behavior pattern of daily behavior to the user U (step S90). Specifically, the output control unit 50 controls at least one of the display unit 24A and the audio output unit 24B to present the corrected activity score to the user U.
 The control unit 30b stores the behavior pattern of daily behavior and the corrected activity score in the storage unit 28 (step S91). Specifically, the storage control unit 48 records the behavior pattern of daily behavior identified in step S86 and the corrected activity score in a predetermined format.
 If No is determined in step S85, the process proceeds to step S92. The processes of steps S92 to S94 are the same as the processes of steps S28 to S30 shown in FIG. 6, respectively, and their description is therefore omitted.
 The control unit 30b calculates the activity score of the behavior pattern of extraordinary behavior (step S95). Specifically, the activity score calculation unit 54 calculates the activity score of the behavior pattern of extraordinary behavior of the user U based on the pulse wave information acquired in step S81.
 The control unit 30b corrects the activity score of the behavior pattern of extraordinary behavior (step S96). Specifically, the activity score correction unit 56 corrects the activity score of the behavior pattern of extraordinary behavior calculated by the activity score calculation unit 54, using the correction data 28C, based on the position information identified by the behavior pattern identification unit 46.
 The control unit 30b presents the corrected activity score of the behavior pattern of extraordinary behavior to the user U (step S97). Specifically, the output control unit 50 controls at least one of the display unit 24A and the audio output unit 24B to present the corrected activity score to the user U.
 The control unit 30b stores the behavior pattern of extraordinary behavior and the corrected activity score in the storage unit 28 (step S98). Specifically, the storage control unit 48 records the identified behavior pattern of extraordinary behavior and the corrected activity score in a predetermined format.
 The processes of steps S99 and S100 are the same as the processes of steps S51 and S52 shown in FIG. 9, respectively, and their description is therefore omitted.
 As described above, the information processing device 10b according to the fifth embodiment can independently calculate the activity score for the behavior identified as the behavior pattern of daily behavior and the activity score for the behavior identified as the behavior pattern of extraordinary behavior. As a result, the information processing device 10b according to the fifth embodiment can identify the behavior pattern of the user U more appropriately.
 Further, the information processing device 10b according to the fifth embodiment corrects the activity score of the behavior pattern of daily behavior and the activity score of the behavior pattern of extraordinary behavior by multiplying each activity score by a correction coefficient according to the country or region. As a result, the information processing device 10b according to the fifth embodiment can correct the activity score more appropriately according to the country or region.
 [Sixth Embodiment]
 Next, the information processing device according to the sixth embodiment will be described with reference to FIG. 17. FIG. 17 is a block diagram showing a configuration example of the information processing device according to the sixth embodiment.
 As shown in FIG. 17, the information processing device 10c differs from the information processing device 10a shown in FIG. 8 in that the storage unit 28b stores history data 28D and the control unit 30c includes a history data acquisition unit 58 and a learning unit 60.
 Since hobbies and tastes may differ from user to user, the value of the activity score may also differ from user to user even when the same behavior is performed. The information processing device 10c according to the sixth embodiment therefore calculates the activity score using a trained model customized for each user.
 The history data 28D are data related to the history of the activity score. The history data 28D may include, for each user, information on the ranking of activity scores in a predetermined period. Specifically, the history data 28D may include information on behavior patterns whose activity score in the predetermined period is higher than a predetermined value. The predetermined period is, for example, three months, but is not limited to this.
 FIGS. 18A to 18C are diagrams for explaining an example of the history data 28D. As shown in FIGS. 18A to 18C, in the history data 28D, rankings, behavior patterns, behavior scores, and activity scores are associated with one another. FIG. 18A is an example of the history data of a user U1, FIG. 18B is an example of the history data of a user U2, and FIG. 18C is an example of the history data of a user U3. FIGS. 18A to 18C show the first to fifth behavior patterns with the highest activity scores in a predetermined period for the users U1 to U3, respectively.
 As shown in FIG. 18A, for the user U1, the first place is the behavior pattern MP4 with a behavior score of 10 and an activity score of 99; the second place is the behavior pattern MP3 with a behavior score of 9 and an activity score of 85; the third place is the behavior pattern MP1 with a behavior score of 8 and an activity score of 80; the fourth place is the behavior pattern MP9 with a behavior score of 7 and an activity score of 58; and the fifth place is the behavior pattern MP3 with a behavior score of 7 and an activity score of 53.
 As shown in FIG. 18B, for the user U2, the first place is the behavior pattern MP4 with a behavior score of 8 and an activity score of 90; the second place is the behavior pattern MP3 with a behavior score of 8 and an activity score of 88; the third place is the behavior pattern MP1 with a behavior score of 8 and an activity score of 79; the fourth place is the behavior pattern MP8 with a behavior score of 9 and an activity score of 51; and the fifth place is the behavior pattern MP5 with a behavior score of 9 and an activity score of 49.
 As shown in FIG. 18C, for the user U3, the first place is the behavior pattern MP7 with a behavior score of 10 and an activity score of 89; the second place is the behavior pattern MP2 with a behavior score of 6 and an activity score of 71; the third place is the behavior pattern MP9 with a behavior score of 7 and an activity score of 68; the fourth place is the behavior pattern MP4 with a behavior score of 8 and an activity score of 65; and the fifth place is the behavior pattern MP3 with a behavior score of 9 and an activity score of 57.
 As shown in FIGS. 18A to 18C, which behavior patterns have high activity scores differs from user to user even when the behavior patterns themselves are the same, and there are individual differences in the activity score.
 The history data acquisition unit 58 acquires the history data 28D from the storage unit 28b. Specifically, the history data acquisition unit 58 acquires the history data 28D of the user whose activity score is to be calculated.
 The learning unit 60 generates, by machine learning based on learning data, a trained model for calculating the activity score of a user. In the present embodiment, the learning unit 60 generates the trained model for calculating the activity score based on, for example, the history data 28D acquired by the history data acquisition unit 58. The learning unit 60 learns, for example, the weights of a DNN (Deep Neural Network) as the trained model for calculating the activity score. The learning unit 60 may learn using a well-known machine learning method such as deep learning, and may update the trained model, for example, every time the learning data are updated.
 The learning process according to the sixth embodiment will be described with reference to FIG. 19. FIG. 19 is a flowchart showing an example of the flow of the learning process according to the sixth embodiment.
 The control unit 30c acquires learning data (step S110). Specifically, the history data acquisition unit 58 acquires, from the storage unit 28b, the history data 28D of a predetermined period of the user whose activity score is to be calculated. The history data 28D acquired by the history data acquisition unit 58 may include at least information on rankings, behavior patterns, behavior scores, and activity scores. The history data acquisition unit 58 acquires, for example, the history data 28D of the first to 1000th places over the past three months.
 The control unit 30c executes the learning process (step S111). Specifically, the learning unit 60 learns and generates, by machine learning, a trained model for calculating the activity score of the user, using the history data 28D acquired by the history data acquisition unit 58. More specifically, the learning unit 60 treats a behavior pattern, a behavior score, an autonomic nervous activity level, and an activity score as one data set, and learns from a plurality of (for example, 1000) data sets as teacher data to generate the trained model. The learning unit 60 generates, for example, a trained model for each user whose activity score is to be calculated. That is, in the present embodiment, a trained model customized for each user is generated.
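 As one way of realizing this step, the sketch below trains a small feed-forward regressor on (behavior pattern, behavior score, autonomic nervous activity level) triples against the activity score; scikit-learn's MLPRegressor and the chosen layer sizes stand in for the DNN and are implementation assumptions, not part of this description.

# Sketch of per-user training: each data set is
# (behavior pattern id, behavior score R, autonomic activity AN, activity score NS).
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_user_model(datasets):
    # datasets: list of (pattern_id, behavior_score, autonomic_activity, activity_score)
    data = np.asarray(datasets, dtype=float)
    X, y = data[:, :3], data[:, 3]
    model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
    model.fit(X, y)  # learn the network weights from the teacher data
    return model

# Usage: model = train_user_model(history)     # e.g. the user's top-1000 records
#        ns = model.predict([[4, 8, 0.7]])[0]  # pattern MP4, R=8, AN=0.7 (illustrative)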
 The control unit 30c stores the trained model (step S112). Specifically, the learning unit 60 stores the generated trained model in the storage unit 28b.
 [Processing Content]
 The processing content of the information processing device 10c according to the sixth embodiment will be described with reference to FIG. 20. FIG. 20 is a flowchart showing an example of the processing flow of the information processing device 10c according to the sixth embodiment.
 The processes of steps S120 to S127 are the same as the processes of steps S40 to S47 shown in FIG. 9, respectively, and their description is therefore omitted.
 The control unit 30c calculates the activity score based on the trained model corresponding to the user (step S128). Specifically, the activity score calculation unit 54 calculates the activity score of the user using the trained model customized for that user.
 The processes of steps S129 to S132 are the same as the processes of steps S49 to S52 shown in FIG. 9, respectively, and their description is therefore omitted.
 As described above, the information processing device 10c according to the sixth embodiment calculates the activity scores of the users U1 to U3 using trained models generated and customized according to the activity score history of each of the users U1 to U3. As a result, the information processing device 10c according to the sixth embodiment can calculate the activity score more appropriately in accordance with each user's sensibility.
 [Seventh Embodiment]
 The information processing system according to the seventh embodiment will be described with reference to FIG. 21. FIG. 21 is a diagram for explaining a configuration example of the information processing system according to the seventh embodiment.
 As shown in FIG. 21, the information processing system 1 according to the seventh embodiment includes a plurality of information processing devices 10c and a server device 100. The information processing devices 10c and the server device 100 are communicably connected via a network N (for example, the Internet). That is, the information processing system 1 according to the seventh embodiment has a configuration in which the information processing devices 10c according to the sixth embodiment and the server device 100 are communicably connected.
 In the sixth embodiment described above, the activity score of a user is calculated using a trained model generated and customized according to the activity score history of that user. In the seventh embodiment, the server device 100 stores the history data of a plurality of users as shared data, so that a trained model is generated based on the history data of a plurality of users whose activity scores and behavior pattern tendencies are similar, and the activity score of a user is calculated using that model.
 The configuration of the server device according to the seventh embodiment will be described with reference to FIG. 22. FIG. 22 is a block diagram showing a configuration example of the server device according to the seventh embodiment.
 As shown in FIG. 22, the server device 100 includes a communication unit 110, a control unit 120, and a storage unit 130. The server device 100 is a so-called cloud server.
 The communication unit 110 is realized by, for example, a NIC (Network Interface Card) or a communication circuit. The communication unit 110 is connected to the network N wirelessly or by wire, and transmits and receives information to and from the information processing devices 10c.
 The control unit 120 controls the operation of each unit of the server device 100. The control unit 120 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. The control unit 120 may be realized by an integrated circuit such as an ASIC or an FPGA, or by a combination of hardware and software. The control unit 120 includes an acquisition unit 122, a determination unit 124, a request unit 126, and a provision unit 128.
 The acquisition unit 122 acquires, for example, via the communication unit 110, history data related to the history of the activity score of each user wearing an information processing device 10c. The acquisition unit 122 acquires, for example, the history data 28D1 to 28D3 of the users U1 to U3 shown in FIGS. 18A to 18C. The acquisition unit 122 stores the acquired history data in the storage unit 130 as shared data 132.
 The determination unit 124 determines tendencies of the history data included in the shared data 132. The determination unit 124 determines, for example, whether there are users whose history data have similar tendencies.
 When there are a plurality of users whose history data have similar tendencies, the request unit 126 requests each of those users to permit or deny the use of his or her own history data by the other users.
 When the use of the history data is permitted, the provision unit 128 provides the history data to the users whose history data have similar tendencies.
 The storage unit 130 is a memory that stores various information such as the computation contents and programs of the control unit 120, and includes, for example, at least one of a RAM, a main storage device such as a ROM, and an external storage device such as an HDD.
 The shared data 132 are stored in the storage unit 130. The shared data 132 may include history data related to the activity scores of the plurality of users wearing the information processing devices 10c. The shared data 132 may include, for example, the history data 28D1 to 28D3 of the users U1 to U3 shown in FIGS. 18A to 18C.
 [Processing of the Server Device]
 The processing of the server device according to the seventh embodiment will be described with reference to FIG. 23. FIG. 23 is a flowchart showing an example of the processing flow of the server device according to the seventh embodiment.
 The control unit 120 refers to the shared data 132 and determines whether there are users whose history data are similar (step S140). For example, assume that the shared data 132 include the history data 28D1 to 28D3 of the users U1 to U3 shown in FIGS. 18A to 18C. The determination unit 124 determines, for example, that users whose first- to third-ranked behavior patterns are the same, whose behavior scores for each of those behavior patterns are 8 or more, and whose activity scores differ from each other by 10 or less are users whose history data are similar. In this case, the determination unit 124 identifies that, for both the user U1 and the user U2, the first-ranked behavior pattern by activity score is the behavior pattern MP4, the second is the behavior pattern MP3, and the third is the behavior pattern MP1. The determination unit 124 identifies that, for the user U1, the behavior score of the behavior pattern MP4 is 10, that of the behavior pattern MP3 is 9, and that of the behavior pattern MP1 is 8, and that, for the user U2, the behavior scores of the behavior patterns MP4, MP3, and MP1 are all 8. The determination unit 124 identifies that, between the user U1 and the user U2, the difference in the activity scores of the behavior pattern MP4 is 9, that of the behavior pattern MP3 is 3, and that of the behavior pattern MP1 is 1. In this case, the determination unit 124 determines that the history data of the user U1 and the user U2 are similar.
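 A direct transcription of this similarity test, assuming each user's history is a ranked list of (behavior pattern, behavior score, activity score) tuples, could look as follows.

# Similarity test described above: top-3 patterns identical, every behavior
# score >= 8, and activity-score differences <= 10 at each shared rank.
def history_similar(hist_a, hist_b, top=3):
    # hist_*: ranked lists of (pattern, behavior_score, activity_score) tuples
    for (pa, ra, na), (pb, rb, nb) in zip(hist_a[:top], hist_b[:top]):
        if pa != pb:                      # same behavior pattern at this rank
            return False
        if ra < 8 or rb < 8:              # behavior score of 8 or more
            return False
        if abs(na - nb) > 10:             # activity scores differ by 10 or less
            return False
    return True

u1 = [("MP4", 10, 99), ("MP3", 9, 85), ("MP1", 8, 80)]
u2 = [("MP4", 8, 90), ("MP3", 8, 88), ("MP1", 8, 79)]
assert history_similar(u1, u2)  # matches the U1/U2 example in the text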
 In the present embodiment, the case where the determination unit 124 selects two users from the three users U1 to U3 has been described, but this is an example, and there is no particular limit on the number of users in the actual population. The determination unit 124 may determine that the history data of three or more users are similar. The determination unit 124 may also determine whether history data are similar by a method other than the method described in the present embodiment; for example, it may make the determination according to a predetermined, mathematically defined conditional expression.
 When it is determined that there are users whose history data are similar (step S140; Yes), the process proceeds to step S141. When it is determined that there are no such users (step S140; No), the processing of FIG. 23 ends.
 When Yes is determined in step S140, the control unit 120 requests permission for sharing from the users whose history data are similar (step S141). Specifically, the request unit 126 transmits, via the communication unit 110, a notification requesting permission to share the history data to the user U1 and the user U2.
 The control unit 120 determines whether the sharing request has been permitted (step S142). Specifically, the request unit 126 determines whether a reply permitting the sharing of the history data has been received from the user U1 or the user U2 in response to the request for permission transmitted in step S141. When it is determined that the sharing request has been permitted (step S142; Yes), the process proceeds to step S143. When it is determined that the sharing request has not been permitted (step S142; No), the processing of FIG. 23 ends.
 When Yes is determined in step S142, the control unit 120 causes the history data to be shared (step S143). Specifically, when, for example, the user U2 permits the sharing of the history data, the provision unit 128 transmits the history data 28D2 of the user U2, via the communication unit 110, to the information processing device 10c worn by the user U1. Then, the processing of FIG. 23 ends.
 The learning process according to the seventh embodiment will be described with reference to FIG. 24. FIG. 24 is a flowchart showing an example of the flow of the learning process according to the seventh embodiment. In the following, the process in which the information processing device 10c worn by the user U1 acquires the history data 28D2 of the user U2 and performs the learning process will be described.
 The control unit 30c acquires history data (step S150). Specifically, the history data acquisition unit 58 acquires, from the storage unit 28b, the history data 28D1 of a predetermined period of the user whose activity score is to be calculated. The history data 28D1 acquired by the history data acquisition unit 58 may include at least information on rankings, behavior patterns, behavior scores, and activity scores. The history data acquisition unit 58 acquires, for example, the history data 28D1 of the first to 1000th places over the past three months. That is, the history data acquisition unit 58 acquires 1000 data sets as teacher data.
 The control unit 30c acquires, from the server device 100, history data similar to the history data of the user U1 (step S151). Specifically, the history data acquisition unit 58 acquires the history data 28D2 of the user U2 from the server device 100 via, for example, the communication unit 26. Here, it is conceivable that, for example, some behavior patterns appear only a small number of times in the 1000 data sets of the history data 28D1, while generating a trained model may require, for example, 6000 to 8000 or more data sets. In the present embodiment, the history data acquisition unit 58 acquires the history data 28D2, which are similar to the history data 28D1, from the server device 100, thereby supplementing the number of data sets and making it possible to generate a more optimal trained model.
 The control unit 30c executes the learning process (step S152). Specifically, the learning unit 60 learns and generates, by machine learning, a trained model for calculating the activity score of the user, using the history data 28D1 and the history data 28D2 acquired by the history data acquisition unit 58.
 The control unit 30c stores the trained model (step S153). Specifically, the learning unit 60 stores the generated trained model in the storage unit 28b. Then, the processing of FIG. 24 ends.
 [Processing Content]
 The processing content of the information processing device 10c according to the seventh embodiment will be described with reference to FIG. 25. FIG. 25 is a flowchart showing an example of the processing flow of the information processing device 10c according to the seventh embodiment.
 The processes of steps S160 to S167 are the same as the processes of steps S120 to S127 shown in FIG. 20, respectively, and their description is therefore omitted.
 The control unit 30c calculates the activity score based on the trained model generated using the shared data (step S168). Specifically, the activity score calculation unit 54 calculates the activity score of the user U1 using the trained model generated using the history data 28D1 and the history data 28D2.
 The processes of steps S169 to S172 are the same as the processes of steps S129 to S132 shown in FIG. 20, respectively, and their description is therefore omitted.
 As described above, the information processing device 10c according to the seventh embodiment generates a trained model using the history data 28D1 of the user U1 and the history data 28D2 of the user U2, which are similar to the history data 28D1, and calculates the activity score using that trained model. As a result, the information processing device 10c according to the seventh embodiment can calculate the activity score more appropriately.
 [Eighth Embodiment]
 Next, the information processing device according to the eighth embodiment will be described with reference to FIG. 26. FIG. 26 is a block diagram showing a configuration example of the information processing device according to the eighth embodiment.
 As shown in FIG. 26, the information processing device 10d differs from the information processing device 10 shown in FIG. 2 in that the output unit 24a includes a tactile stimulation output unit 24C.
 Since the biometric information of a user changes from moment to moment, the activity score of the user may also change in accordance with changes in the biometric information. The information processing device 10d presents changes in the activity score in an easy-to-understand manner by relatively presenting the temporal transition of the activity score of the user.
 The tactile stimulation output unit 24C is a device that outputs a tactile stimulus to the user U. For example, the tactile stimulation output unit 24C outputs a tactile stimulus to the user by operating physically, for example by vibrating, but the type of tactile stimulus is not limited to vibration and may be arbitrary.
 The output control unit 50 of the control unit 30d of the information processing device 10d according to the eighth embodiment controls the output unit 24a to output information indicating the temporal transition of the activity score calculated by the activity score calculation unit 54. In other words, the output control unit 50 controls the output unit 24a to output information indicating relative changes in the activity score.
 図27Aから図27Eを用いて、活性度スコアの時間的な推移を相対的に表示する方法について説明する。図27Aから図27Eは、活性度スコアの時間的な推移を相対的に表示する方法を説明するための図である。 Using FIGS. 27A to 27E, a method of relatively displaying the temporal transition of the activity score will be described. 27A to 27E are diagrams for explaining a method of relatively displaying the temporal transition of the activity score.
 図27Aに示すように、出力制御部50は、例えば、表示部24Aを制御して、活性度スコアの時間的な推移をグラフG1として表示させる。図27Aは、横軸が時間、縦軸が活性度スコアを示す。グラフG1において、時刻t0は、活性度スコアの算出を開始した時点を示す。ユーザは、グラフG1を視認することで、活性度スコアの時間的な推移を容易に把握することができる。 As shown in FIG. 27A, the output control unit 50 controls, for example, the display unit 24A to display the temporal transition of the activity score as the graph G1. In FIG. 27A, the horizontal axis represents time and the vertical axis represents activity score. In the graph G1, the time t0 indicates the time when the calculation of the activity score is started. By visually recognizing the graph G1, the user can easily grasp the temporal transition of the activity score.
As illustrated in FIG. 27B, the output control unit 50 controls, for example, the display unit 24A to display the activity score value at the start and the current activity score value side by side. In the example shown in FIG. 27B, the activity score at the start is 56 and the current activity score is 78. This allows the user to easily grasp the temporal transition of the activity score.
As illustrated in FIG. 27C, the output control unit 50 controls, for example, the display unit 24A to display quarter-circle arcs centered on the lower left corner of the screen. In the example shown in FIG. 27C, the arc C1 indicates the activity score at the start, and its radius r1 indicates the magnitude of that score. The arcs C2 and C3 indicate the current activity score. If the current activity score is larger than the score at the start, the output control unit 50 causes the display unit 24A to display the arc C2, whose radius r2 is larger than the radius r1, together with the arc C1. If the current activity score is smaller than the score at the start, the output control unit 50 causes the display unit 24A to display the arc C3, whose radius r3 is smaller than the radius r1, together with the arc C1. This allows the user to easily grasp the temporal transition of the activity score.
As illustrated in FIG. 27D, the output control unit 50 controls, for example, the display unit 24A to display bars. In the example shown in FIG. 27D, the bar B1 is displayed in the central portion of the display unit 24A, but it may instead be displayed in the lower left corner or the lower right corner. The bar B1 indicates the activity score at the start, and its height h1 indicates the magnitude of that score. The bars B2 and B3 indicate the current activity score. If the current activity score is larger than the score at the start, the output control unit 50 causes the display unit 24A to display the bar B2, whose height h2 is greater than the height h1, together with the bar B1. If the current activity score is smaller than the score at the start, the output control unit 50 causes the display unit 24A to display the bar B3, whose height h3 is less than the height h1, together with the bar B1. This allows the user to easily grasp the temporal transition of the activity score.
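The arc display of FIG. 27C and the bar display of FIG. 27D both encode the current score relative to the starting score as a geometric size. A minimal sketch of that mapping follows, assuming the size scales linearly with the score ratio; the publication only requires the current indicator to be larger or smaller than the baseline, so the linear rule, the function name, and the baseline size are illustrative assumptions.

    def relative_size(start_score: float, current_score: float,
                      base_size: float) -> float:
        """Scale a baseline arc radius (r1) or bar height (h1) by the
        ratio of the current activity score to the starting score.
        Linear scaling is an assumption, not stated in the publication."""
        if start_score <= 0:
            raise ValueError("start score must be positive")
        return base_size * (current_score / start_score)

    # Using FIG. 27B's values (start 56, current 78) with a 100 px baseline:
    r1 = 100.0
    r2 = relative_size(56, 78, r1)  # ~139 px: larger, like arc C2 / bar B2
    r3 = relative_size(56, 40, r1)  # ~71 px: smaller, like arc C3 / bar B3
    print(r1, round(r2, 1), round(r3, 1))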
As illustrated in FIG. 27E, the output control unit 50 controls, for example, the display unit 24A to display a graph G2. The graph G2 may be a bar graph. In FIG. 27E, the horizontal axis represents time and the vertical axis represents the activity score. In the graph G2, time t0 indicates the point at which calculation of the activity score was started. By viewing the graph G2, the user can easily grasp the temporal transition of the activity score.
The output control unit 50 may also control, for example, the audio output unit 24B or the tactile stimulus output unit 24C to output information indicating the temporal transition of the activity score.
A method of outputting information indicating the temporal transition of the activity score through the audio output unit 24B or the tactile stimulus output unit 24C will be described with reference to FIG. 28, which is a diagram for explaining this method.
In FIG. 28, the horizontal axis represents time and the vertical axis represents the activity score. The activity score NS1 at time t0 is the activity score at the point when calculation of the activity score was started, and in the example shown in FIG. 28 it serves as the reference. The output control unit 50 sets, for example, the loudness of the sound that the audio output unit 24B outputs to indicate the reference activity score NS1, and likewise sets the strength of the stimulus (for example, vibration) that the tactile stimulus output unit 24C outputs to indicate the reference activity score NS1.
Suppose that at time t1 the activity score has become an activity score NS2, which is larger than the activity score NS1. In this case, the output control unit 50 controls, for example, the audio output unit 24B to output the sound corresponding to the activity score NS1 and the sound corresponding to the activity score NS2 as a set, and likewise controls the tactile stimulus output unit 24C to output the stimulus corresponding to the activity score NS1 and the stimulus corresponding to the activity score NS2 as a set. The sound corresponding to the activity score NS2 is louder than the sound corresponding to the activity score NS1, and the stimulus corresponding to the activity score NS2 is stronger than the stimulus corresponding to the activity score NS1. The loudness of the sound and the strength of the stimulus corresponding to the activity score NS2 are preferably varied according to the ratio of the activity score NS2 to the activity score NS1. This allows the user to grasp the magnitude of the activity score NS2 relative to the activity score NS1.
Suppose instead that at time t2 the activity score has become an activity score NS3, which is smaller than the activity score NS1. In this case, the output control unit 50 controls, for example, the audio output unit 24B to output the sound corresponding to the activity score NS1 and the sound corresponding to the activity score NS3 as a set, and likewise controls the tactile stimulus output unit 24C to output the stimulus corresponding to the activity score NS1 and the stimulus corresponding to the activity score NS3 as a set. The sound corresponding to the activity score NS3 is quieter than the sound corresponding to the activity score NS1, and the stimulus corresponding to the activity score NS3 is weaker than the stimulus corresponding to the activity score NS1. The loudness of the sound and the strength of the stimulus corresponding to the activity score NS3 are preferably varied according to the ratio of the activity score NS3 to the activity score NS1. This allows the user to grasp the magnitude of the activity score NS3 relative to the activity score NS1.
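A minimal sketch of this paired output follows, assuming the loudness of the sound and the strength of the vibration scale linearly with the ratio of the current score to the reference score NS1; the publication fixes neither the scaling law nor any device API, so only the computed intensities are shown and the function name is an illustrative assumption.

    def paired_intensity(reference_score: float, current_score: float,
                         base_intensity: float) -> tuple[float, float]:
        """Return (reference, current) cue intensities for a paired output.

        The current cue is scaled by current/reference, so a score above
        NS1 yields a louder/stronger cue and a score below NS1 a
        quieter/weaker one. Linear scaling is an assumption."""
        if reference_score <= 0:
            raise ValueError("reference score must be positive")
        return base_intensity, base_intensity * (current_score / reference_score)

    # NS1 = 50 at t0; NS2 = 75 at t1; NS3 = 30 at t2.
    print(paired_intensity(50, 75, base_intensity=0.4))  # approx (0.4, 0.6): NS2 cue stronger
    print(paired_intensity(50, 30, base_intensity=0.4))  # approx (0.4, 0.24): NS3 cue weaker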
As described above, the information processing apparatus 10d according to the eighth embodiment presents the temporal transition of the user's activity score in relative terms, which makes temporal changes in the activity score easier to understand.
Although embodiments of the present disclosure have been described above, the present disclosure is not limited by the contents of these embodiments. The components described above include those that a person skilled in the art could easily conceive of, those that are substantially identical, and those within a so-called range of equivalents. The components described above can also be combined as appropriate, and various omissions, substitutions, or modifications of the components can be made without departing from the gist of the embodiments described above.
The information processing device, information processing method, and program of the present disclosure can be applied to techniques for analyzing user behavior.
10, 10a, 10b, 10c, 10d Information processing device
20 Behavioral state sensor
20A Camera
20B Microphone
20C GNSS receiver
20D Acceleration sensor
20E Gyro sensor
20F Optical sensor
20G Temperature sensor
20H Humidity sensor
22 Input unit
24 Output unit
24A Display unit
24B Audio output unit
24C Tactile stimulus output unit
26, 110 Communication unit
28, 130 Storage unit
28A Learning model
28B Map data
28C, 28Ca Correction data
28D History data
30, 30a, 30b, 30c, 30d, 120 Control unit
32 Biosensor
32A Pulse wave sensor
40 Behavioral state information acquisition unit
42 Behavior pattern information generation unit
44 Behavior score calculation unit
46 Behavior pattern specifying unit
48 Storage control unit
50 Output control unit
52 Biometric information acquisition unit
54 Activity score calculation unit
56 Activity score correction unit
58 History data acquisition unit
60 Learning unit
100 Server device
122 Acquisition unit
124 Determination unit
126 Request unit
128 Providing unit

Claims (19)

  1.  An information processing device comprising:
      a behavioral state detection sensor that detects behavioral state information on a behavioral state of a user;
      a behavior pattern information generation unit that, based on the behavioral state information, generates behavior pattern information in a multidimensional space whose coordinate axes are parameters of at least the date and time, place, and time at which the behavioral state was detected, and that groups the behavior pattern information by space in which the density of a behavior pattern information group, in which the behavior pattern information gathers, exceeds a predetermined density;
      a behavior score calculation unit that calculates, as a behavior score, information on the size of the space containing the behavior pattern information group; and
      a behavior pattern specifying unit that specifies, as a behavior pattern of the user, the behavior pattern information group existing in the space whose behavior score value is equal to or greater than a predetermined value.
  2.  The information processing device according to claim 1, wherein the behavior pattern specifying unit determines a behavior pattern whose behavior score value is larger than a first threshold to be extraordinary behavior, and determines a behavior pattern whose behavior score value is smaller than a second threshold to be routine behavior.
  3.  The information processing device according to claim 1 or 2, wherein the behavior pattern information generation unit generates the behavior pattern information by plotting points in the multidimensional space.
  4.  The information processing device according to any one of claims 1 to 3, further comprising:
      a biosensor that detects biometric information on the user; and
      an activity score calculation unit that calculates an autonomic nerve activity level of the user based on the biometric information, and calculates an activity score of the user based on the autonomic nerve activity level, the behavior score, and the behavior pattern.
  5.  The information processing device according to claim 4, further comprising an activity score correction unit that corrects the activity score based on a country or region in which the behavior pattern of the user was specified.
  6.  The information processing device according to claim 4 or 5, wherein the activity score calculation unit calculates a current activity score of the user based on a history of activity scores of the user.
  7.  The information processing device according to any one of claims 4 to 6, further comprising an output control unit that outputs, using an output unit, a temporal transition of the activity score of the user.
  8.  The information processing device according to any one of claims 4 to 7, wherein a current activity score of the user is calculated based on a history of another user whose behavior pattern and activity score approximate those of the user.
  9.  An information processing device comprising:
      a behavioral state sensor that detects behavioral state information on a behavioral state of a user;
      a biosensor that detects biometric information on the user;
      an autonomic nerve activity calculation unit that calculates an autonomic nerve activity level of the user based on the biometric information; and
      an output control unit that changes the intensity of output from an output unit according to the intensity of the autonomic nerve activity level.
  10.  The information processing device according to claim 9, wherein the output control unit changes the intensity of the output from the output unit according to a temporal transition of the intensity of the autonomic nerve activity level of the user.
  11.  The information processing device according to claim 9 or 10, wherein the output unit includes a tactile stimulus output unit, and the output control unit changes the vibration intensity of the tactile stimulus output unit according to the intensity of the autonomic nerve activity level.
  12.  An information processing device comprising:
      a behavioral state sensor that detects behavioral state information on a behavioral state of a user;
      a biosensor that detects biometric information on the user;
      an autonomic nerve activity calculation unit that calculates an autonomic nerve activity level of the user based on the biometric information; and
      an autonomic nerve activity correction unit that corrects the autonomic nerve activity level based on a country or region in which a behavior pattern of the user was specified.
  13.  The information processing device according to claim 12, wherein the autonomic nerve activity correction unit corrects the autonomic nerve activity level using a correction coefficient predetermined according to the country or the region.
  14.  An information processing method comprising:
      detecting behavioral state information on a behavioral state of a user;
      generating, based on the behavioral state information, behavior pattern information in a multidimensional space whose coordinate axes are parameters of at least the date and time, place, and time at which the behavioral state was detected, and grouping the behavior pattern information by space in which the density of a behavior pattern information group, in which the behavior pattern information gathers, exceeds a predetermined density;
      calculating, as a behavior score, information on the size of the space containing the behavior pattern information group; and
      specifying, as a behavior pattern of the user, the behavior pattern information group existing in the space whose behavior score value is equal to or greater than a predetermined value.
  15.  An information processing method comprising:
      detecting behavioral state information on a behavioral state of a user;
      detecting biometric information on the user;
      calculating an autonomic nerve activity level of the user based on the biometric information; and
      changing the intensity of output from an output unit according to the intensity of the autonomic nerve activity level.
  16.  An information processing method comprising:
      detecting behavioral state information on a behavioral state of a user;
      detecting biometric information on the user;
      calculating an autonomic nerve activity level of the user based on the biometric information; and
      correcting the autonomic nerve activity level based on a country or region in which a behavior pattern of the user was specified.
  17.  A program causing a computer to execute:
      detecting behavioral state information on a behavioral state of a user;
      generating, based on the behavioral state information, behavior pattern information in a multidimensional space whose coordinate axes are parameters of at least the date and time, place, and time at which the behavioral state was detected, and grouping the behavior pattern information by space in which the density of a behavior pattern information group, in which the behavior pattern information gathers, exceeds a predetermined density;
      calculating, as a behavior score, information on the size of the space containing the behavior pattern information group; and
      specifying, as a behavior pattern of the user, the behavior pattern information group existing in the space whose behavior score value is equal to or greater than a predetermined value.
  18.  A program causing a computer to execute:
      detecting behavioral state information on a behavioral state of a user;
      detecting biometric information on the user;
      calculating an autonomic nerve activity level of the user based on the biometric information; and
      changing the intensity of output from an output unit according to the intensity of the autonomic nerve activity level.
  19.  A program causing a computer to execute:
      detecting behavioral state information on a behavioral state of a user;
      detecting biometric information on the user;
      calculating an autonomic nerve activity level of the user based on the biometric information; and
      correcting the autonomic nerve activity level based on a country or region in which a behavior pattern of the user was specified.
PCT/JP2021/033904 2020-09-24 2021-09-15 Information processing device, information processing method, and program WO2022065154A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21872284.1A EP4202820A1 (en) 2020-09-24 2021-09-15 Information processing device, information processing method, and program
CN202180058932.XA CN116194914A (en) 2020-09-24 2021-09-15 Information processing device, information processing method, and program
US18/187,816 US20230222884A1 (en) 2020-09-24 2023-03-22 Information processing apparatus, information processing method, and computer-readable storage medium

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2020-160244 2020-09-24
JP2020160161A JP2022053365A (en) 2020-09-24 2020-09-24 Information processing apparatus, information processing method, and program
JP2020-160246 2020-09-24
JP2020160244A JP2022053411A (en) 2020-09-24 2020-09-24 Information processing device, information processing method, and program
JP2020-160161 2020-09-24
JP2020160246A JP2022053413A (en) 2020-09-24 2020-09-24 Information processing device, information processing method, and program
JP2020160245A JP2022053412A (en) 2020-09-24 2020-09-24 Information processing device, information processing method, and program
JP2020-160245 2020-09-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/187,816 Continuation US20230222884A1 (en) 2020-09-24 2023-03-22 Information processing apparatus, information processing method, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2022065154A1

Family

ID=80846538

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/033904 WO2022065154A1 (en) 2020-09-24 2021-09-15 Information processing device, information processing method, and program

Country Status (4)

Country Link
US (1) US20230222884A1 (en)
EP (1) EP4202820A1 (en)
CN (1) CN116194914A (en)
WO (1) WO2022065154A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003046630A (en) 2001-08-02 2003-02-14 Nec Access Technica Ltd Portable telephone equipment, method and program for canceling operating mode therefor
JP2004184351A (en) 2002-12-06 2004-07-02 Toshiba Corp Operation information measuring system and operation information measuring method
JP2010112927A (en) * 2008-11-10 2010-05-20 Sony Corp Tactile action recognition device and tactile action recognition method, information processor, and computer program
JP2011198292A (en) * 2010-03-23 2011-10-06 Nippon Telegr & Teleph Corp <Ntt> Modeling device, method, and program for action prediction, and prediction device, method, and program using the modeling information
JP2019083564A (en) * 2012-10-19 2019-05-30 三星電子株式会社Samsung Electronics Co.,Ltd. Display device, remote control device for controlling display device, method of controlling display device, method of controlling server, and method of controlling remote control device
JP2019093175A (en) * 2019-02-05 2019-06-20 パイオニア株式会社 Drowsiness calculation device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FUJIMOTO ET AL.: "Normal Reference Values and Prediction Equations of Autonomic Nerve Functions Based on Variations in the R-R interval in Electrocardiographs", J. JAPAN DIAB. SOC., vol. 30, no. 2, 1987, pages 167 - 173
HAYANO ET AL.: "Heart Rate Variability and Autonomic Nerve Functions", SEIBUTSU BUTSURI, vol. 28-4, 1988, pages 32 - 36
OGAWA, SAYAKA; FUJIWARA, KOICHI; YAMAKAWA, TOSHITAKA; ABE, ERIKA; KANO, MANABU: "False Increasing Heart Rate Feedback For Improvement of Game Experience", IPSJ SIG TECHNICAL REPORT. EC, 1 September 2017 (2017-09-01), pages 280 - 286, XP009535601 *

Also Published As

Publication number Publication date
EP4202820A1 (en) 2023-06-28
CN116194914A (en) 2023-05-30
US20230222884A1 (en) 2023-07-13

Similar Documents

Publication Publication Date Title
KR102662981B1 (en) Digitally expresses user immersion with directed content based on biometric sensor data
US10885800B2 (en) Human performance optimization and training methods and systems
US20180206775A1 Measuring medication response using wearables for Parkinson's disease
Zhan et al. High frequency remote monitoring of Parkinson's disease via smartphone: Platform overview and medication response detection
US20190365286A1 (en) Passive tracking of dyskinesia/tremor symptoms
CN108348181A (en) Method and system for monitoring and improving attention
EP3079568B1 (en) Device, method and system for counting the number of cycles of a periodic movement of a subject
US11531393B1 (en) Human-computer interface systems and methods
US11904179B2 (en) Virtual reality headset and system for delivering an individualized therapy session
JP2022053365A (en) Information processing apparatus, information processing method, and program
WO2022065154A1 (en) Information processing device, information processing method, and program
US20220409110A1 (en) Inferring cognitive load based on gait
JP2022053412A (en) Information processing device, information processing method, and program
JP2022053413A (en) Information processing device, information processing method, and program
JP2022053411A (en) Information processing device, information processing method, and program
US20230112071A1 (en) Assessing fall risk of mobile device user
US20230123815A1 (en) Stability scoring of individuals utilizing inertial sensor device
US20210074389A1 System and method for collecting, analyzing, and utilizing cognitive, behavioral, neuropsychological, and biometric data from a user's interaction with a smart device with either physically invasive or physically non-invasive means
US20230229372A1 (en) Display device, display method, and computer-readable storage medium
US20230200711A1 (en) Information providing device, information providing method, and computer-readable storage medium
US20220111257A1 (en) System, Method and Computer Program Product Configured for Sensor-Based Enhancement of Physical Rehabilitation
Mitra et al. Automatic Detection of Situational Context Using AI from Minimal Sensor Modality
CN116186552A (en) Providing unlabeled training data for training a computational model
Sinnott A quantitative investigation of natural head movement and its contribution to spatial orientation perception
JP2022027085A (en) Display device, display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21872284

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021872284

Country of ref document: EP

Effective date: 20230323

NENP Non-entry into the national phase

Ref country code: DE