WO2024100860A1 - Presentation device, presentation method, and presentation program - Google Patents


Info

Publication number
WO2024100860A1
WO2024100860A1 (PCT/JP2022/042005)
Authority
WO
WIPO (PCT)
Prior art keywords
synchronization
user
users
presentation device
presentation
Prior art date
Application number
PCT/JP2022/042005
Other languages
French (fr)
Japanese (ja)
Inventor
愛 中根
信哉 志水
藍李 太田
高雄 中村
Original Assignee
Nippon Telegraph and Telephone Corporation (NTT)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corporation (NTT)
Priority: PCT/JP2022/042005
Publication: WO2024100860A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316: Modalities, i.e. specific diagnostic methods
    • A61B 5/369: Electroencephalography [EEG]
    • A61B 5/372: Analysis of electroencephalograms
    • A61B 5/374: Detecting the frequency distribution of signals, e.g. detecting delta, theta, alpha, beta or gamma waves

Definitions

  • the present invention relates to a presentation device, a presentation method, and a presentation program.
  • the present invention has been made in consideration of the above, and aims to provide a presentation device, a presentation method, and a presentation program that can present each user with specific actions that encourage synchronization of the phase of brain waves according to the current synchronization state.
  • the presentation device is characterized by having an acquisition unit that acquires electroencephalogram data of multiple users, a calculation unit that calculates the degree of phase synchronization based on the electroencephalogram data of each user, a decision unit that decides an action that encourages synchronization for each user based on the degree of synchronization calculated by the calculation unit, and an output control unit that causes an output unit to output information indicating the action decided by the decision unit.
  • FIG. 1 is a diagram illustrating an example of a configuration of a presentation device according to the first embodiment.
  • FIG. 2 is a diagram for explaining an example of presentation of the phase entrainment rate of electroencephalogram data.
  • FIG. 3 is a flowchart showing a processing procedure of the presentation method according to the first embodiment.
  • FIG. 4 is a diagram illustrating an example of a configuration of a presentation device according to the second embodiment.
  • FIG. 5 is a diagram illustrating an example of output of behavior information.
  • FIG. 6 is a flowchart showing a processing procedure of the presentation method according to the second embodiment.
  • FIG. 7 is a flowchart illustrating an example of a processing procedure of the behavior decision processing illustrated in FIG. 6.
  • FIG. 8 is a diagram illustrating an example of a computer that realizes the presentation device by executing a program.
  • the electroencephalogram data of multiple users is divided into acquisition sites and predetermined frequency components, and the phase synchronization rate for each of the multiple sites and predetermined frequency components is calculated and presented to the user. This allows the user to specifically recognize the synchronization rate of the electroencephalogram phase. Since the calculated synchronization rate is related to cooperation between users, each user can intentionally control the state of their brain by presenting the synchronization state.
  • [Presentation device] FIG. 1 is a diagram showing an example of the configuration of the presentation device according to the first embodiment.
  • the presentation device 10 is realized by, for example, loading a predetermined program into a computer or the like including a ROM (Read Only Memory), a RAM (Random Access Memory), a CPU (Central Processing Unit), etc., and having the CPU execute the predetermined program.
  • the presentation device 10 also has a communication interface for transmitting and receiving various information to and from other devices connected via a network or the like.
  • the presentation device 10 has brain wave acquisition units 11-1 and 11-2 (acquisition units), a synchronization rate calculation unit 12 (calculation unit), an output control unit 13 (presentation unit), and output units 14-1 and 14-2.
  • the brain wave acquisition units 11-1 and 11-2 acquire brain wave data of users 1 and 2 for whom the synchronization rate is to be calculated, and transmit the acquired brain wave data to the synchronization rate calculation unit 12.
  • the brain wave data is time series data of brain waves.
  • the synchronization rate calculation unit 12 separates the EEG data of each user 1, 2 into acquired parts and frequency components, and calculates the degree of phase synchronization for multiple parts and predetermined frequency components in the EEG data of each user 1, 2.
  • the synchronization rate calculation unit 12 transmits the calculated degrees of synchronization to the output control unit 13.
  • the synchronization rate calculation unit 12 calculates the phase matching rate (synchronization rate) for each of a plurality of predetermined frequency components in the brain wave data of users 1 and 2 as the degree of synchronization. Note that the synchronization rate calculation unit 12 is not limited to the synchronization rate, and may also calculate a level value that indicates in stages the degree of phase matching for each of a plurality of predetermined frequency components in the brain wave data of users 1 and 2.
  • the synchronization rate calculation unit 12 separates the brainwave data sent from the brainwave acquisition units 11-1 and 11-2 into, for example, alpha waves, beta waves, gamma waves, and theta waves. The synchronization rate calculation unit 12 then calculates the instantaneous phase for each frequency and the acquisition site (left, right) of the brainwave data using a Hilbert transform.
  • the synchronization rate calculation unit 12 performs phase synchronization analysis on the instantaneous phases of users 1 and 2 for each acquisition site and for each of the alpha, beta, gamma, and theta bands, calculates the Phase Locking Value, and uses it as the synchronization rate between users 1 and 2.
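  • As a concrete illustration, the band separation, Hilbert-transform phase extraction, and Phase Locking Value computation described above can be sketched in Python. The sampling rate, filter order, and band edges below are illustrative assumptions, not values from the source.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, low_hz, high_hz, fs):
    # Band-limit a raw EEG trace to one band (e.g. alpha: 8-13 Hz) using a
    # 4th-order Butterworth filter applied forward and backward (zero phase).
    b, a = butter(4, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def phase_locking_value(x1, x2):
    # Instantaneous phase via the Hilbert transform, then the PLV: the
    # magnitude of the mean unit phasor of the phase difference.
    # 1.0 = perfectly phase-locked, near 0 = no stable phase relation.
    phi1 = np.angle(hilbert(x1))
    phi2 = np.angle(hilbert(x2))
    return float(np.abs(np.mean(np.exp(1j * (phi1 - phi2)))))

fs = 256                                   # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)
user1 = np.sin(2 * np.pi * 10 * t)         # 10 Hz "alpha" oscillation
user2 = np.sin(2 * np.pi * 10 * t + 0.5)   # same frequency, constant lag
plv = phase_locking_value(bandpass(user1, 8, 13, fs),
                          bandpass(user2, 8, 13, fs))
# a constant phase lag yields a PLV close to 1
```

  As the text notes, a wavelet transform could replace the Hilbert step without changing the PLV formula itself.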
  • the synchronization rate calculation unit 12 may calculate the phase agreement rate of users 1 and 2 for each measurement channel of the brainwave data, and use this agreement rate as the synchronization rate.
  • the synchronization rate calculation unit 12 may use, for example, a wavelet transform instead of the Hilbert transform as the method for calculating the instantaneous phase.
  • the synchronization rate calculation unit 12 may also calculate the degree of synchronization using a method other than the Phase Locking Value. For example, when there are three or more users, the synchronization rate calculation unit 12 treats small variation in phase across users as high synchronization, calculates the variance of the instantaneous phases or the average Phase Locking Value between each pair of users as an index of this, and outputs the calculated value as the degree of synchronization.
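  • For three or more users, the averaged pairwise Phase Locking Value mentioned above might be computed as in the following minimal sketch; the input layout (one row of instantaneous phase per user) is an assumption, not specified in the source.

```python
import numpy as np
from itertools import combinations

def group_synchrony(phases):
    # phases: array of shape (n_users, n_samples) holding each user's
    # instantaneous phase (radians) for one site and frequency band.
    # Returns the mean pairwise Phase Locking Value over all user pairs,
    # so small phase variation across users maps to a value near 1.
    plvs = [
        np.abs(np.mean(np.exp(1j * (phases[i] - phases[j]))))
        for i, j in combinations(range(phases.shape[0]), 2)
    ]
    return float(np.mean(plvs))
```

  Averaging over pairs keeps the index in the same 0-to-1 range as the two-user PLV, so the same presentation logic can be reused.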
  • the output control unit 13 presents the degrees of synchronization to users 1 and 2 by causing the output units 14-1 and 14-2 to output each degree of synchronization calculated by the synchronization rate calculation unit 12 in association with the site and frequency component from which the electroencephalogram data was acquired.
  • the output control unit 13 may further present the average of all synchronization rates calculated by the synchronization rate calculation unit 12 and/or a trend graph of that average.
  • the output control unit 13 may further present the degree of deviation of the average of all synchronization rates calculated by the synchronization rate calculation unit 12 from a predetermined target synchronization rate.
  • the target synchronization rate is, for example, the maximum synchronization rate.
  • the output units 14-1 and 14-2 are, for example, a display that displays and outputs an image, or a terminal device having a display.
  • the output units 14-1 and 14-2 may also be speakers or terminal devices having a speaker.
  • the output units 14-1 and 14-2 may also be output devices that output information that is given to the tactile sense of the users 1 and 2 by force, vibration, movement, temperature, etc.
  • FIG. 2 is a diagram for explaining an example of presentation of the phase entrainment rate of electroencephalogram data.
  • output control unit 13 causes output units 14-1 and 14-2 to display, as the current state, the average of all synchronization rates for users 1 and 2 calculated by synchronization rate calculation unit 12 (see frame W1), table T1 showing the synchronization rates for each location and frequency component where EEG data was acquired, and trend graph G1 of the average of all synchronization rates showing the trend in the synchronization state between users 1 and 2.
  • the average of all synchronization rates is, for example, the average of all synchronization rates of alpha waves, beta waves, gamma waves, and theta waves in the left and right brain.
  • Table T1 shows the synchronization rates of alpha waves, beta waves, gamma waves, and theta waves in the left and right brain.
  • the output control unit 13 may also change the color of frame W2 of the transition graph G1 depending on the degree of deviation of the average of all synchronization rates from a predetermined target synchronization rate (for example, the maximum synchronization rate (100%)).
  • the output control unit 13 displays frame W2 of the transition graph G1 in a bluer color as the average of all synchronization rates calculated by the synchronization rate calculation unit 12 approaches 100%, and in a redder color as it deviates from 100%.
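  • The blue-to-red coloring of frame W2 could be realized with a simple linear blend such as the sketch below; the exact color mapping is not specified in the source and is assumed here.

```python
def frame_color(avg_sync_pct, target_pct=100.0):
    # Pure blue when the average synchronization rate equals the target,
    # shifting linearly to pure red at maximum deviation from it.
    deviation = min(abs(target_pct - avg_sync_pct) / target_pct, 1.0)
    red = round(255 * deviation)
    return (red, 0, 255 - red)  # (R, G, B)
```

  Any monotone mapping from deviation to color would serve the same feedback purpose; a linear blend is simply the most direct choice.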
  • the output control unit 13 may present one or more of the average of all synchronization rates (frame W1), table T1, and transition graph G1, rather than all of them, and the display format of each piece of data is not limited to that of FIG. 2.
  • the output control unit 13 may present, for example, only the highest synchronization rate, rather than the average of all synchronization rates or the synchronization rates of alpha waves, beta waves, gamma waves, and theta waves of the left and right brain.
  • as the transition of the synchronization rate, the output control unit 13 may present only changes, such as an increase or decrease in the synchronization rate, instead of the transition graph G1 of the average of all synchronization rates.
  • the output control unit 13 may present not only the deviation from the ideal maximum synchronization rate (for example, 100%) but also the degree of deviation from an arbitrarily set target synchronization rate.
  • output units 14-1 and 14-2 are provided for users 1 and 2, respectively, but a configuration in which all users 1 and 2 check the presented information using a single output device may also be used.
  • the output control unit 13 may present the synchronization rate for each pair of users.
  • the output control unit 13 may present the synchronization rate of the pair with the highest synchronization rate and/or the synchronization rate of the pair with the lowest synchronization rate.
  • the output control unit 13 may also provide auditory or tactile presentation.
  • for example, the output control unit 13 may present the synchronization state by increasing the volume as the degree of synchronization increases.
  • for example, the output control unit 13 may present the synchronization state by raising the temperature as the degree of synchronization increases.
  • FIG. 3 is a flowchart showing the processing procedure of the presentation method according to the first embodiment.
  • the brain wave acquisition units 11-1 and 11-2 acquire brain wave data of users 1 and 2, who are the subjects of the synchronization rate calculation (step S11), and transmit the acquired brain wave data to the synchronization rate calculation unit 12.
  • the synchronization rate calculation unit 12 separates the received EEG data of users 1 and 2 into acquired body parts and frequency components, and calculates the degree of phase synchronization (e.g., synchronization rate) for multiple body parts and predetermined frequency components in the EEG data of each user (step S12).
  • the output control unit 13 associates each degree of synchronization (e.g., synchronization rate) calculated by the synchronization rate calculation unit 12 with the location and frequency from which the EEG data was acquired, and outputs the result from the output units 14-1 and 14-2 (step S13).
  • the presentation device 10 acquires electroencephalogram data of a plurality of users, separates the electroencephalogram data of each user into acquired parts and frequency components, and calculates the phase synchrony for each part and predetermined frequency component in the electroencephalogram data of each user. The presentation device 10 then associates each calculated synchrony with the acquired part and frequency component of the electroencephalogram data and presents it to the user.
  • the presentation device 10 can separate each user's EEG data into acquisition areas and frequency components, calculate the degree of phase synchronization for multiple areas and predetermined frequency components in each user's EEG data, and present the degree of phase synchronization of the EEG to the user in concrete terms. This allows the user to specifically recognize the degree to which the phases of the EEG are synchronized.
  • the presentation device 10 calculates and presents the phase synchronization of the EEG data for each part of the brain, rather than for the entire head, allowing the user to specifically identify which parts of the brain are currently active.
  • the frequency of brainwave data that becomes active varies depending on the state of activity. Specifically, alpha waves indicate a relaxed state, beta waves indicate a state of concentration, and theta waves indicate a state of light sleep. Furthermore, it is known that the synchronization of theta waves during joint movements enhances the sense of agency over joint movements (Reference 1). Therefore, the presentation device 10 calculates and presents the degree of phase synchronization of the brainwave data for each specified frequency component, allowing the user to specifically recognize in which state the user is synchronized.
  • there is a technique called neurofeedback that allows users to intentionally control the state of their brain by feeding back their own brain activity to them in real time.
  • the presentation device 10 separates each user's EEG data into acquisition areas and frequency components, and calculates and presents the phase synchronization for each of the multiple areas and frequency components in each user's EEG data, so that it is possible to increase the synchronization between users using neurofeedback.
  • the synchronization calculated and presented by the presentation device 10 is related to cooperation between users, and by presenting the state of synchronization between users, each user can intentionally control the state of their brain.
  • the synchronization between users can be improved by the actions of the users. Therefore, in the second embodiment, the synchronization between the multiple users is improved by presenting actions that encourage synchronization according to the synchronization calculated based on the electroencephalogram data of the multiple users.
  • [Presentation device] FIG. 4 is a diagram showing an example of the configuration of the presentation device according to the second embodiment.
  • the presentation device 210 is realized, for example, by loading a predetermined program into a computer or the like including a ROM, a RAM, a CPU, etc., and causing the CPU to execute the predetermined program.
  • the presentation device 210 also has a communication interface for transmitting and receiving various information to and from other devices connected via a network or the like.
  • the presentation device 210 has brain wave acquisition units 11-1, 11-2, 11-3 (acquisition units), a synchronization rate calculation unit 12 (calculation unit), a behavior decision unit 213, an output control unit 214 (presentation unit), and output units 14-1, 14-2, 14-3 that output behavior information to each user 1, 2, 3, respectively.
  • the number of users is not limited to three, as long as there is more than one.
  • the brain wave acquisition units 11-1, 11-2, and 11-3 acquire brain wave data of users 1, 2, and 3, who are the targets of the behavior presentation, and transmit it to the synchronization rate calculation unit 12.
  • the synchronization rate calculation unit 12 calculates the degree of phase synchronization of the EEG data for each user pair and transmits it to the behavior decision unit 213. By performing the same process as in embodiment 1, the synchronization rate calculation unit 12 separates the EEG data of users 1, 2, and 3 into acquisition parts and frequency components for each user pair, and calculates the phase synchronization rate for multiple parts and predetermined frequency components in the EEG data of each user. The synchronization rate calculation unit 12 also calculates the phase synchronization rate of the EEG data at a predetermined time interval (e.g., every second).
  • the synchronization rate calculation unit 12 may calculate a value that collectively represents the degree of synchronization of all users, such as the average or median of the synchronization rates of all pairs of users, rather than for each pair of users.
  • the presentation device 210 may be configured to omit the brain wave acquisition units 11-1, 11-2, 11-3 and the synchronization rate calculation unit 12.
  • the behavior decision unit 213 decides on an action to encourage the user to synchronize based on each synchronization degree calculated by the synchronization rate calculation unit 12.
  • the behavior decision unit 213 decides on a behavior that promotes synchronization based on the synchronization rate calculated by the synchronization rate calculation unit 12 and in accordance with a preset behavior decision algorithm.
  • actions that encourage synchronization are set for each range of synchronization levels. For example, hugging is set as an action that encourages synchronization when the synchronization level is greater than 50%, eye contact and handshakes are set when the synchronization level is greater than 30% but less than 50%, and eye contact alone is set when the synchronization level is less than 30%.
  • the behavior decision unit 213 decides on a behavior that will encourage synchronization depending on the range of the degree of synchronization calculated by the synchronization rate calculation unit 12. For example, the behavior decision unit 213 decides on a hug for a user pair whose synchronization rate is 60%. The behavior decision unit 213 decides on eye contact and a handshake for a user pair whose synchronization rate is 40%. The behavior decision unit 213 may also present a behavior that will further increase synchronization to a user pair whose synchronization rate has reached a target synchronization rate.
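  • The threshold rules in the example above can be written directly as a lookup; the cutoff values simply mirror the illustrative figures in the text and are not prescribed by the source.

```python
def decide_action(sync_rate_pct):
    # Example thresholds from the text: hug above 50%, eye contact plus
    # handshake above 30%, eye contact alone otherwise.
    if sync_rate_pct > 50:
        return "hug"
    if sync_rate_pct > 30:
        return "eye contact and handshake"
    return "eye contact"
```

  A table-driven variant (list of threshold/action pairs) would make the ranges easier to reconfigure per deployment.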
  • the behavior decision unit 213 decides the behavior of the user pair according to the range of the degree of synchronization. The degree of synchronization can be improved by the pair of users making eye contact, engaging in joint gaze, making physical contact, and so on.
  • the behavior decision unit 213 decides on an action for all users to increase the overall synchronization degree. For example, when the synchronization rate is less than 30% for any pair of users, the behavior decision unit 213 decides on an action for all users to breathe in time with a cue. Furthermore, the behavior decision unit 213 may decide on output processing such as periodic sound output, blinking lights, or playing music, to create an environment that makes it easier to synchronize breathing.
  • the behavior decision unit 213 may also arbitrarily determine a first user for whom behavior is to be determined, randomly determine a second user targeted by this first user, and determine the behavior of the first user toward the second user. Alternatively, the behavior decision unit 213 may arbitrarily set a pair of users with the highest synchronization rate (first pair) and a user (first user) not included in the first pair, determine the user of the first pair who has the highest synchronization rate with the first user as the second user, and determine only the behavior of the first user toward this second user. The behavior decision unit 213 supports the behavior of the first user so that the first user can synchronize with the second user.
  • the behavior decision unit 213 may also arbitrarily decide on a pair of a first user and a second user, and decide on an action that will encourage the users of this pair to take an action that encourages synchronization with each other, thereby supporting the actions of the first user and the second user.
  • the behavior decision unit 213 is not limited to an algorithm that extracts pairs of users and decides on a behavior; for example, it may employ an algorithm that determines and presents behaviors to all users, calculates the degree of synchronization for each pair of users, and decides on a behavior for each pair.
  • the behavior decision unit 213 may also decide, for each pair of users, on the behavior to present to the users using a model that has previously learned the relationship between synchrony and behaviors that increase synchrony.
  • the output control unit 214 presents users 1, 2, and 3 with behavior that encourages synchronization by causing the output units 14-1, 14-2, and 14-3 to output behavior information indicating the behavior decided by the behavior decision unit 213.
  • the output control unit 214 determines the output method for the behavior information to be presented to each user, and causes the output units 14-1, 14-2, and 14-3 to output the behavior information using the determined output method.
  • FIG. 5 is a diagram illustrating an example of output of behavioral information.
  • the output control unit 214 displays and outputs a menu M21 showing the behavior "Make eye contact with Mr. A and shake hands" (see frame W22) together with the average synchronization rate (see frame W21) on a display that is recognizable by the user to whom the behavior is presented.
  • output units 14-1, 14-2, and 14-3 are provided for users 1, 2, and 3, respectively.
  • the output units 14-1, 14-2, and 14-3 may support the execution of actions that encourage synchronization by outputting auditory information in addition to visual information.
  • FIG. 6 is a flowchart showing the processing procedure of the presentation method according to the second embodiment.
  • the brain wave acquisition units 11-1, 11-2, and 11-3 acquire brain wave data of users 1, 2, and 3 for which the synchronization rate is to be calculated (step S211), and transmit the acquired brain wave data to the synchronization rate calculation unit 12.
  • the synchronization rate calculation unit 12 divides the EEG data of users 1, 2, and 3 into acquisition areas and frequency components, and calculates the phase synchronization rate for each of the multiple areas and frequency components in the EEG data of each user (step S212).
  • the behavior decision unit 213 performs a behavior decision process to decide on a behavior that will encourage synchronization based on the calculations made by the synchronization rate calculation unit 12 (step S213).
  • the output control unit 214 causes the output units 14-1, 14-2, and 14-3 to output behavior information indicating the behavior decided in step S213 (step S214).
  • FIG. 7 is a flowchart showing an example of the processing procedure of the behavior decision process shown in Fig. 6.
  • Fig. 7 illustrates an example of a behavior decision algorithm that extracts pairs of users (synchronized pairs) whose average synchronization rate for the last minute before the current time is 50% or more, and determines the behavior of other pairs of users to have one-to-one interactions with users included in the synchronized pairs.
  • the behavior decision unit 213 counts the synchronization rates for each second in the last minute before the current time for all pairs based on the synchronization rates calculated by the synchronization rate calculation unit 12, and calculates the average synchronization rate (step S221).
  • the behavior decision unit 213 determines whether there are any synchronized pairs whose average synchronization rate is 50% or more based on the calculation results of step S221 (step S222).
  • If no synchronized pair exists (step S222: No), the behavior decision unit 213 decides on an action for all users to breathe in time with a cue so as to increase the overall synchronization rate (step S223).
  • If a synchronized pair exists (step S222: Yes), the behavior decision unit 213 initializes the user number u to 1 (step S224) and determines whether user u is included in a synchronized pair (step S225).
  • If user u is included in a synchronized pair (step S225: Yes), user u is already sufficiently synchronized with the other user of that pair, so the behavior decision unit 213 determines that there is no behavior to output for user u (step S227).
  • If user u is not included in a synchronized pair (step S225: No), the behavior decision unit 213 decides that the behavior of user u is to make eye contact and shake hands with the user who has the highest synchronization rate with user u among the users included in the synchronized pairs (step S226).
  • the behavior decision unit 213 compares u with the maximum user number u_max to determine whether u < u_max is satisfied (step S229).
  • If u < u_max is satisfied (step S229: Yes), the behavior decision unit 213 returns to step S225. If u < u_max is not satisfied (step S229: No), the behavior decision unit 213 ends the behavior decision process.
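  • Putting the FIG. 7 flow together, the behavior decision process might be sketched as below. The pair-keyed dictionary and action strings are illustrative assumptions; rates are on a 0-1 scale with the 50% threshold from the text.

```python
def behavior_decision(avg_sync):
    # Sketch of the FIG. 7 behavior decision process.
    # avg_sync: dict mapping a user pair (u, v) to its average
    # synchronization rate over the last minute (0.0-1.0).
    # Returns an action per user, or None when no behavior is output.
    users = sorted({u for pair in avg_sync for u in pair})
    # Users belonging to at least one synchronized pair (rate >= 50%)
    synced = {u for pair, rate in avg_sync.items() if rate >= 0.5 for u in pair}
    if not synced:  # step S223: raise overall synchrony with paced breathing
        return {u: "breathe in time with the cue" for u in users}
    actions = {}
    for u in users:  # steps S225-S229: loop over user numbers
        if u in synced:
            actions[u] = None  # step S227: already sufficiently synchronized
        else:
            # step S226: target the synchronized user with the highest
            # synchronization rate with user u
            target = max(
                (v for v in synced),
                key=lambda v: avg_sync.get((u, v), avg_sync.get((v, u), 0.0)),
            )
            actions[u] = f"make eye contact and shake hands with user {target}"
    return actions
```

  The loop over user numbers in FIG. 7 becomes a plain iteration here; the termination test at step S229 is implicit in the `for` loop.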
  • the presentation device 210 acquires electroencephalogram data of a plurality of users, calculates a phase synchronization degree based on the electroencephalogram data of each user, and determines an action to encourage synchronization for the user based on the calculated synchronization degree, and outputs information indicating the determined action.
  • the presentation device 210 can improve the synchronization of the phase of the electroencephalogram data by presenting each user with specific actions that encourage synchronization of the phase of the electroencephalogram data according to the current synchronization state.
  • the presentation device 210 presents actions to improve the synchronization state for each user according to the synchronization state of the phase of the EEG data between the users, so that each user can perform an action that is appropriate for them. Furthermore, when the synchronization state of the phase of the EEG data is less than a predetermined degree, the presentation device 210 determines an action of breathing in time with a cue for all users, thereby improving the overall synchronization state.
  • the presentation device 210 targets brainwave synchronization among multiple people, and when there is a pair (such as the aforementioned synchronized pair) whose brainwave data phase synchronization is higher than a predetermined degree, it increases the overall synchronization rate by presenting other users with actions that will increase synchronization with the synchronized pair.
  • each process performed in the presentation device 10, 210 may be realized, in whole or in part, by a CPU and a program analyzed and executed by the CPU. Furthermore, each process performed in the presentation device 10, 210 may be realized as hardware using wired logic.
  • [Program] FIG. 8 is a diagram showing an example of a computer that realizes the presentation devices 10 and 210 by executing a program.
  • the computer 1000 has, for example, a memory 1010 and a CPU 1020.
  • the computer 1000 also has a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. These components are connected by a bus 1080.
  • the memory 1010 includes a ROM 1011 and a RAM 1012.
  • the ROM 1011 stores a boot program such as a BIOS (Basic Input Output System).
  • the hard disk drive interface 1030 is connected to a hard disk drive 1090.
  • the disk drive interface 1040 is connected to a disk drive 1100.
  • a removable storage medium such as a magnetic disk or optical disk is inserted into the disk drive 1100.
  • the serial port interface 1050 is connected to a mouse 1110 and a keyboard 1120, for example.
  • the video adapter 1060 is connected to a display 1130, for example.
  • the hard disk drive 1090 stores, for example, an OS (Operating System) 1091, application programs 1092, program modules 1093, and program data 1094. That is, the programs that define each process of the presentation device 10 are implemented as program modules 1093 in which code executable by the computer 1000 is written.
  • the program modules 1093 are stored, for example, in the hard disk drive 1090.
  • a program module 1093 for executing processes similar to the functional configuration of the presentation device 10 is stored in the hard disk drive 1090.
  • the hard disk drive 1090 may be replaced by an SSD (Solid State Drive).
  • the setting data used in the processing of the above-mentioned embodiment is stored as program data 1094, for example, in memory 1010 or hard disk drive 1090.
  • the CPU 1020 reads the program module 1093 or program data 1094 stored in memory 1010 or hard disk drive 1090 into RAM 1012 as necessary and executes it.
  • the program module 1093 and program data 1094 may not necessarily be stored in the hard disk drive 1090, but may be stored in a removable storage medium, for example, and read by the CPU 1020 via the disk drive 1100 or the like.
  • the program module 1093 and program data 1094 may be stored in another computer connected via a network (such as a LAN (Local Area Network), WAN (Wide Area Network)).
  • the program module 1093 and program data 1094 may then be read by the CPU 1020 from the other computer via the network interface 1070.

Abstract

A presentation device (210) includes: brain wave acquisition units (11-1, 11-2, 11-3) that acquire brain wave data of a plurality of users, respectively; a synchronization rate calculation unit (12) that calculates a synchronization rate of a phase on the basis of the brain wave data of each of the users; an action determination unit (213) that determines an action encouraging synchronization with respect to each of the users on the basis of the synchronization rate calculated by the synchronization rate calculation unit (12); and an output control unit (214) that causes output units (14-1, 14-2, 14-3) to output information indicating an action determined by the action determination unit (213).

Description

Presentation device, presentation method, and presentation program
The present invention relates to a presentation device, a presentation method, and a presentation program.
With the aim of synchronizing the mental states of multiple people, a technique has been proposed that measures users' brain waves and presents stimuli to each user so as to synchronize their brain waves such that the intensity of a specific frequency component is maximized for all users simultaneously.
Japanese Patent No. 5317277
It is known that a user's actions can improve the phase synchronization rate of brain waves. However, conventional technology has not been able to promote the improvement of synchronization of brain wave data through actions.
The present invention has been made in view of the above, and aims to provide a presentation device, a presentation method, and a presentation program that can present each user with specific actions that encourage synchronization of the phase of brain waves according to the current synchronization state.
In order to solve the above-described problems and achieve the object, the presentation device according to the present invention includes: an acquisition unit that acquires brain wave data of a plurality of users; a calculation unit that calculates a degree of phase synchronization based on each user's brain wave data; a determination unit that determines, for each user, an action that encourages synchronization, based on the degree of synchronization calculated by the calculation unit; and an output control unit that causes an output unit to output information indicating the action determined by the determination unit.
According to the present invention, it is possible to present each user with specific actions that encourage synchronization of the phase of their brain waves, according to the current synchronization state.
FIG. 1 is a diagram showing an example of the configuration of the presentation device according to the first embodiment.
FIG. 2 is a diagram explaining an example of presentation of the phase synchronization rate of brain wave data.
FIG. 3 is a flowchart showing the processing procedure of the presentation method according to the first embodiment.
FIG. 4 is a diagram showing an example of the configuration of the presentation device according to the second embodiment.
FIG. 5 is a diagram explaining an example of output of action information.
FIG. 6 is a flowchart showing the processing procedure of the presentation method according to the second embodiment.
FIG. 7 is a flowchart showing an example of the processing procedure of the action determination processing shown in FIG. 6.
FIG. 8 is a diagram showing an example of a computer that realizes the presentation device by executing a program.
Below, embodiments of the presentation device, presentation method, and presentation program according to the present application are described in detail with reference to the drawings. The present invention is not limited to the embodiments described below.
[First embodiment]
In the first embodiment, the brain wave data of a plurality of users is separated by acquisition site and predetermined frequency component, and the phase synchronization rate for each of a plurality of sites and predetermined frequency components is calculated and presented to the users. This allows the users to concretely recognize the degree of phase synchronization of their brain waves. Since the calculated synchronization rate relates to cooperation between users, presenting the synchronization state enables each user to intentionally control the state of their brain.
[Presentation device]
FIG. 1 is a diagram showing an example of the configuration of the presentation device according to the first embodiment. The presentation device 10 is realized, for example, by loading a predetermined program into a computer or the like including a ROM (Read Only Memory), a RAM (Random Access Memory), and a CPU (Central Processing Unit), and having the CPU execute the program. The presentation device 10 also has a communication interface for transmitting and receiving various information to and from other devices connected via a network or the like.
The presentation device 10 includes brain wave acquisition units 11-1 and 11-2 (acquisition units), a synchronization rate calculation unit 12 (calculation unit), an output control unit 13 (presentation unit), and output units 14-1 and 14-2.
The brain wave acquisition units 11-1 and 11-2 acquire brain wave data of users 1 and 2, for whom the synchronization rate is to be calculated, and transmit the acquired brain wave data to the synchronization rate calculation unit 12. The brain wave data is, for example, time-series data of brain waves.
The synchronization rate calculation unit 12 separates the brain wave data of users 1 and 2 by acquisition site and frequency component, and calculates the degree of phase synchronization for each of a plurality of sites and predetermined frequency components in each user's brain wave data. The synchronization rate calculation unit 12 transmits the calculated degree of synchronization to the output control unit 13.
As the degree of synchronization, the synchronization rate calculation unit 12 calculates the phase matching rate (synchronization rate) for each of a plurality of predetermined frequency components in the brain wave data of users 1 and 2. Note that the synchronization rate calculation unit 12 is not limited to the synchronization rate, and may instead calculate a level value that indicates, in stages, the degree of phase matching for each of the predetermined frequency components in the brain wave data of users 1 and 2.
The synchronization rate calculation unit 12 separates the brain wave data transmitted from the brain wave acquisition units 11-1 and 11-2 into, for example, alpha, beta, gamma, and theta waves. The synchronization rate calculation unit 12 then calculates the instantaneous phase for each frequency band and each acquisition site (left, right) of the brain wave data using a Hilbert transform.
Next, for each acquisition site and for each of the alpha, beta, gamma, and theta bands, the synchronization rate calculation unit 12 performs phase synchronization analysis on the instantaneous phases between users 1 and 2 to compute a Phase Locking Value, thereby calculating the synchronization rate between users 1 and 2.
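The band separation, Hilbert-transform phase extraction, and Phase Locking Value computation described above can be sketched as follows. This is a minimal illustration using NumPy/SciPy; the band edges, filter order, and sampling rate are illustrative assumptions, not values specified in this description:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Example band definitions in Hz (illustrative; the description does
# not fix exact band edges).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_phase(x, lo, hi, fs):
    """Band-pass filter one channel of EEG and return its instantaneous
    phase, obtained from the analytic signal via the Hilbert transform."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x)))

def plv(x1, x2, lo, hi, fs):
    """Phase Locking Value between two users' traces in one band:
    |mean of exp(i * phase difference)|, a value in [0, 1]."""
    dphi = band_phase(x1, lo, hi, fs) - band_phase(x2, lo, hi, fs)
    return float(np.abs(np.mean(np.exp(1j * dphi))))
```

In a per-site, per-band loop over `BANDS`, `plv` yields the table of synchronization rates (e.g., left/right × alpha/beta/gamma/theta) that the output control unit presents.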
Although the case where the synchronization rate calculation unit 12 separates the data into alpha, beta, gamma, and theta waves has been described as an example, the separation is not limited to these frequency components. The acquisition sites of the brain wave data need not be left/right or a left-right difference; they may be positions such as the front, middle, or back of the head, or regions such as the left sensorimotor cortex or the prefrontal cortex. The synchronization rate calculation unit 12 may also calculate the phase matching rate between users 1 and 2 for each measurement channel of the brain wave data and use this matching rate as the synchronization rate. Furthermore, as a method for calculating the instantaneous phase, the synchronization rate calculation unit 12 may use, for example, a wavelet transform instead of the Hilbert transform.
The synchronization rate calculation unit 12 may also calculate the synchronization rate by a method other than the Phase Locking Value. For example, when there are three or more users, the synchronization rate calculation unit 12 may regard the synchronization rate as smallness of variation and, as an index of this, calculate the variance of the instantaneous phases or the average of the Phase Locking Values between the users, and output the calculated value as the synchronization rate.
The output control unit 13 presents the degrees of synchronization to users 1 and 2 by causing the output units 14-1 and 14-2 to output each degree of synchronization calculated by the synchronization rate calculation unit 12 in association with the acquisition site and frequency component of the brain wave data. The output control unit 13 may further present the average of all the degrees of synchronization calculated by the synchronization rate calculation unit 12 and/or a trend graph of the average of all the synchronization rates. The output control unit 13 may also present the degree of deviation of the average of all the degrees of synchronization from a predetermined target degree of synchronization. The target degree of synchronization is, for example, the maximum degree of synchronization.
The output units 14-1 and 14-2 are, for example, displays that display and output images, or terminal devices having displays. The output units 14-1 and 14-2 may also be speakers or terminal devices having speakers. Alternatively, the output units 14-1 and 14-2 may be output devices that convey information to the tactile sense of users 1 and 2 through force, vibration, movement, temperature, or the like.
[Presentation example]
FIG. 2 is a diagram explaining an example of presentation of the phase synchronization rate of brain wave data.
As shown in menu screen M1 in FIG. 2, the output control unit 13 causes the output units 14-1 and 14-2 to display and output, as the current state, the average of all synchronization rates of users 1 and 2 calculated by the synchronization rate calculation unit 12 (see frame W1), a table T1 showing the synchronization rate for each acquisition site and frequency component of the brain wave data, and a trend graph G1 of the average of all synchronization rates showing the transition of the synchronization state between users 1 and 2.
The average of all synchronization rates is, for example, the average over the alpha-, beta-, gamma-, and theta-wave synchronization rates of the left and right brain. Table T1 shows the synchronization rates of the alpha, beta, gamma, and theta waves of the left and right brain. The output control unit 13 may also change the color of frame W2 of the trend graph G1 according to the degree of deviation of the average of all synchronization rates from a predetermined target synchronization rate (for example, the maximum synchronization rate of 100%). For example, the output control unit 13 displays the color of frame W2 of the trend graph G1 more blue as the average of all synchronization rates calculated by the synchronization rate calculation unit 12 approaches 100%, and more red as it deviates from 100%.
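The blue-to-red coloring of frame W2 could, for example, be realized by linearly interpolating the frame color according to the deviation from the target rate. The linear mapping below is only one possible sketch; the description does not fix a specific color function:

```python
def frame_color(avg_sync, target=100.0):
    """Map the average synchronization rate (in %) to an (R, G, B)
    color: blue at the target rate, shading linearly toward red as the
    deviation from the target grows (illustrative mapping)."""
    deviation = min(max((target - avg_sync) / target, 0.0), 1.0)
    red = int(255 * deviation)
    blue = int(255 * (1.0 - deviation))
    return (red, 0, blue)
```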
Note that the output control unit 13 may present any one or more of the average of all synchronization rates (frame W1), the table T1, and the trend graph G1, rather than all of them, and the display format of each piece of data is not limited to that shown in FIG. 2. The output control unit 13 may also present, for example, only the highest synchronization rate, instead of the average of all synchronization rates or the alpha-, beta-, gamma-, and theta-wave synchronization rates of the left and right brain. As the transition of the synchronization rate, the output control unit 13 may present only changes, such as an improvement or a decline in the synchronization rate, instead of the trend graph G1 of the average of all synchronization rates. In addition to presenting the deviation from the ideal maximum synchronization rate (for example, 100%), the output control unit 13 may present the degree of deviation from an arbitrarily set target synchronization rate.
Although FIG. 1 illustrates an example in which output units 14-1 and 14-2 are provided for users 1 and 2, respectively, all users 1 and 2 may check the presented information through a single output device.
Furthermore, if there are three or more users, the output control unit 13 may present the synchronization rate for each pair of users. Alternatively, the output control unit 13 may present the synchronization rate of the pair with the highest synchronization rate and/or that of the pair with the lowest synchronization rate.
In addition to visual presentation via a display or the like, the output control unit 13 may perform auditory or tactile presentation. For example, the output control unit 13 may present the synchronization state by increasing the volume as the synchronization state rises, or by raising the temperature as the synchronization state rises.
[Presentation processing]
Next, the processing procedure of the presentation method executed by the presentation device 10 will be described. FIG. 3 is a flowchart showing the processing procedure of the presentation method according to the first embodiment.
As shown in FIG. 3, in the presentation device 10, the brain wave acquisition units 11-1 and 11-2 acquire brain wave data of users 1 and 2, for whom the synchronization rate is to be calculated (step S11), and transmit the acquired brain wave data to the synchronization rate calculation unit 12.
The synchronization rate calculation unit 12 separates the received brain wave data of users 1 and 2 by acquisition site and frequency component, and calculates the degree of phase synchronization (for example, the synchronization rate) for each of a plurality of sites and predetermined frequency components in each user's brain wave data (step S12).
The output control unit 13 causes the output units 14-1 and 14-2 to output each degree of synchronization (for example, the synchronization rate) calculated by the synchronization rate calculation unit 12, in association with the acquisition site and frequency of the brain wave data (step S13).
[Effects of the first embodiment]
The presentation device 10 according to the first embodiment acquires brain wave data of a plurality of users, separates each user's brain wave data by acquisition site and frequency component, and calculates the degree of phase synchronization for each of a plurality of sites and predetermined frequency components in each user's brain wave data. The presentation device 10 then presents each calculated degree of synchronization to the users in association with the acquisition site and frequency component of the brain wave data.
In this way, the presentation device 10 separates each user's brain wave data by acquisition site and frequency component, calculates the degree of phase synchronization for each of a plurality of sites and predetermined frequency components in each user's brain wave data, and can concretely present the degree of phase synchronization of the brain waves to the users. This allows the users to concretely recognize to what extent the phases of their brain waves are synchronized.
The parts of the brain that become active differ depending on the activity the user is performing. By calculating and presenting the degree of phase synchronization of the brain wave data for each site rather than for the whole head, the presentation device 10 allows the user to concretely recognize which parts of the brain are currently active.
The frequency of the brain wave data that becomes active also differs depending on the state of activity. Specifically, alpha waves are considered to indicate a relaxed state, beta waves a state of concentration, and theta waves a light sleep state. Furthermore, it is known that synchronization of theta waves during joint movement enhances the sense of joint agency (Reference 1). Therefore, by calculating and presenting the degree of phase synchronization of the brain wave data for each predetermined frequency component, the presentation device 10 allows the users to concretely recognize in which state they are synchronized.
There is a technique called neurofeedback, in which a user's own brain activity is fed back to the user in real time so that the user can intentionally control the state of their brain.
Because the presentation device 10 separates each user's brain wave data by acquisition site and frequency component and calculates and presents the degree of phase synchronization for each of a plurality of sites and frequency components, it can increase the degree of synchronization between users through neurofeedback. That is, the degree of synchronization calculated and presented by the presentation device 10 relates to cooperation between users, and presenting the synchronization state between users enables each user to intentionally control the state of their brain.
[Second embodiment]
The degree of synchronization between users can be improved through the users' actions. Therefore, in the second embodiment, actions that encourage synchronization are presented according to the degree of synchronization calculated based on the brain wave data of a plurality of users, thereby improving the degree of synchronization among the users.
[Presentation device]
FIG. 4 is a diagram showing an example of the configuration of the presentation device according to the second embodiment. The presentation device 210 is realized, for example, by loading a predetermined program into a computer or the like including a ROM, a RAM, a CPU, and the like, and having the CPU execute the program. The presentation device 210 also has a communication interface for transmitting and receiving various information to and from other devices connected via a network or the like.
The presentation device 210 includes brain wave acquisition units 11-1, 11-2, and 11-3 (acquisition units), a synchronization rate calculation unit 12 (calculation unit), an action determination unit 213, an output control unit 214 (presentation unit), and output units 14-1, 14-2, and 14-3 that output action information to users 1, 2, and 3, respectively. Note that the number of users is not limited to three, as long as there is more than one.
The brain wave acquisition units 11-1, 11-2, and 11-3 acquire brain wave data of users 1, 2, and 3, to whom actions are to be presented, and transmit the data to the synchronization rate calculation unit 12.
The synchronization rate calculation unit 12 calculates the degree of phase synchronization of the brain wave data for each pair of users and transmits it to the action determination unit 213. By performing the same processing as in the first embodiment, the synchronization rate calculation unit 12 separates the brain wave data of users 1, 2, and 3 by acquisition site and frequency component for each pair of users, and calculates the phase synchronization rate for each of a plurality of sites and predetermined frequency components in each user's brain wave data. The synchronization rate calculation unit 12 also calculates the phase synchronization rate of the brain wave data at predetermined time intervals (for example, every second).
Note that the synchronization rate calculation unit 12 may calculate a value that collectively expresses the degree of synchronization of all users, such as the average or median of the synchronization rates over all pairs of users, instead of a value for each pair. When information indicating the users' degree of synchronization is input to the action determination unit 213 from elsewhere, the presentation device 210 may be configured without the brain wave acquisition units 11-1, 11-2, and 11-3 and the synchronization rate calculation unit 12.
The action determination unit 213 determines, for each user, an action that encourages synchronization, based on the degrees of synchronization calculated by the synchronization rate calculation unit 12.
For example, the action determination unit 213 determines the action that encourages synchronization according to a preset action determination algorithm, based on the synchronization rate calculated by the synchronization rate calculation unit 12.
Here, actions that encourage synchronization are set for each range of the degree of synchronization. For example, a hug is set as the action when the synchronization rate is greater than 50%; eye contact and a handshake are set when the synchronization rate is greater than 30% and 50% or less; and only eye contact is set when it is less than 30%.
The action determination unit 213 determines the action that encourages synchronization according to the range to which the degree of synchronization calculated by the synchronization rate calculation unit 12 belongs. For example, the action determination unit 213 determines a hug for a pair of users whose synchronization rate is 60%, and eye contact and a handshake for a pair whose synchronization rate is 40%. The action determination unit 213 may also present an action that further increases synchronization to a pair of users whose synchronization rate has reached the target synchronization rate.
In this way, when there is a pair of users whose degree of synchronization calculated by the synchronization rate calculation unit 12 is equal to or greater than a predetermined degree, the action determination unit 213 determines the action for that pair according to the range of the degree of synchronization. The degree of synchronization can be improved by the pair of users performing actions such as making eye contact, engaging in joint attention, and physical contact.
When the degree of synchronization calculated by the synchronization rate calculation unit 12 is less than the predetermined degree, the action determination unit 213 determines, for all users, an action that increases the overall degree of synchronization. For example, when the synchronization rate is less than 30% for every pair of users, the action determination unit 213 determines, for all users, the action of breathing in time with a cue. Furthermore, the action determination unit 213 may determine output processing such as periodic sound output, blinking lights, or playing music, to create an environment in which it is easy to synchronize breathing.
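The threshold-based determination described above, with a group-wide fallback when every pair is below the low threshold, can be sketched as follows. The action names and the 50%/30% thresholds mirror the examples in the text; everything else is an illustrative assumption:

```python
def decide_actions(pair_sync):
    """pair_sync: dict mapping a (user_a, user_b) tuple to that pair's
    synchronization rate in %. Returns per-pair action lists, or a
    single group-wide action when every pair is below the low
    threshold (thresholds follow the examples in the description)."""
    if all(s < 30 for s in pair_sync.values()):
        # Overall synchrony is low: act on all users at once.
        return {"all": ["breathe together on a cue"]}
    actions = {}
    for pair, s in pair_sync.items():
        if s > 50:
            actions[pair] = ["hug"]
        elif s > 30:
            actions[pair] = ["make eye contact", "shake hands"]
        else:
            actions[pair] = ["make eye contact"]
    return actions
```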
The action determination unit 213 may also arbitrarily determine a first user for whom an action is to be determined, randomly determine a second user to be targeted by the first user, and determine the action of the first user toward the second user. Alternatively, the action determination unit 213 may arbitrarily set the pair of users with the highest synchronization rate (first pair) and a user not included in the first pair (first user), determine, as the second user, the member of the first pair who has the highest synchronization rate with the first user, and determine only the action of the first user toward the second user. The action determination unit 213 supports the first user's actions so that the first user can synchronize with the second user.
The action determination unit 213 may also arbitrarily determine a pair consisting of a first user and a second user, and determine actions so that the users of the pair mutually take actions that encourage synchronization, thereby supporting the actions of the first and second users.
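One of the selection strategies above (choosing a first user outside the most-synchronized pair and, as the second user, the member of that pair with whom the first user synchronizes best) might be sketched as follows; the function and variable names are hypothetical, and pair keys are assumed to be sorted tuples:

```python
def pick_support_pair(pair_plv, users):
    """pair_plv: dict mapping sorted (user_a, user_b) tuples to PLV.
    Returns (first_user, second_user): a user outside the
    most-synchronized pair, and the member of that pair with whom the
    first user has the highest PLV (illustrative sketch)."""
    best_pair = max(pair_plv, key=pair_plv.get)
    outsiders = [u for u in users if u not in best_pair]
    first = outsiders[0]  # "arbitrarily" chosen: here, the first outsider
    second = max(best_pair,
                 key=lambda u: pair_plv[tuple(sorted((first, u)))])
    return first, second
```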
The action determination unit 213 is not limited to an action determination algorithm that extracts pairs of users and determines actions; for example, it may employ an algorithm that, after determining and presenting actions to all users, calculates the degree of synchronization for each pair of users and determines an action for each pair.
The action determination unit 213 may also determine, for each pair of users, the action to present to the users using a model that has learned in advance the relationship between the degree of synchronization and actions that increase it.
 出力制御部214は、行動決定部213によって決定された行動を示す行動情報を、出力部14-1,14-2,14-3から出力させることで、同調を促す行動を、ユーザ1,2,3に提示する。出力制御部214は、各ユーザへ提示する行動情報の出力方法を決定し、決定した出力方法を用いて、出力部14-1,14-2,14-3に行動情報を出力させる。 The output control unit 214 presents users 1, 2, and 3 with behavior that encourages synchronization by causing the output units 14-1, 14-2, and 14-3 to output behavior information indicating the behavior determined by the behavior decision unit 213. The output control unit 214 determines the output method for the behavior information to be presented to each user, and causes the output units 14-1, 14-2, and 14-3 to output the behavior information using the determined output method.
 図5は、行動情報の出力例を説明する図である。図5に例示するように、出力制御部214は、行動提示対象のユーザが認識可能であるディスプレイに、平均同調率(枠W21参照)とともに、「Aさんと視線を合わせて握手しましょう」という行動(枠W22参照)を示したメニューM21を表示出力させる。 FIG. 5 is a diagram illustrating an example of output of behavioral information. As illustrated in FIG. 5, the output control unit 214 displays and outputs a menu M21 showing the behavior "Make eye contact with Mr. A and shake hands" (see frame W22) together with the average synchronization rate (see frame W21) on a display that is recognizable by the user to whom the behavior is presented.
 なお、図4では、ユーザ1,2,3ごとに出力部14-1,14-2,14-3を設けた例を説明したが、全ユーザ1,2,3が、例えば一つの出力デバイスにより提示情報を確認する構成であってもよい。出力部14-1,14-2,14-3は、視覚情報のほか、聴覚情報を出力することで、同調を促す行動の実行を支援してもよい。 In FIG. 4, an example is described in which output units 14-1, 14-2, and 14-3 are provided for users 1, 2, and 3, respectively. However, a configuration in which all users 1, 2, and 3 check the presented information using, for example, one output device may also be used. The output units 14-1, 14-2, and 14-3 may support the execution of actions that encourage synchronization by outputting auditory information in addition to visual information.
[提示処理]
 次に、実施の形態2の提示装置210が実行する提示方法の処理手順について説明する。図6は、実施の形態2に係る提示方法の処理手順を示すフローチャートである。
[Presentation Processing]
Next, a description will be given of the processing procedure of the presentation method executed by the presentation device 210 according to the second embodiment. Fig. 6 is a flowchart showing the processing procedure of the presentation method according to the second embodiment.
 図6に示すように、提示装置210では、脳波取得部11-1,11-2,11-3が、同調率の算出対象であるユーザ1,2,3の脳波データを取得し(ステップS211)、取得した脳波データを同調率算出部12に送信する。 As shown in FIG. 6, in the presentation device 210, the brain wave acquisition units 11-1, 11-2, and 11-3 acquire brain wave data of users 1, 2, and 3 for which the synchronization rate is to be calculated (step S211), and transmit the acquired brain wave data to the synchronization rate calculation unit 12.
 同調率算出部12は、ユーザのペアごとに、ユーザ1,2,3の脳波データを取得部位及び周波数成分に分けて、各ユーザの脳波データにおける複数の部位及び周波数成分ごとの位相の同調率を算出する(ステップS212)。 For each pair of users, the synchronization rate calculation unit 12 divides the EEG data of users 1, 2, and 3 into acquisition areas and frequency components, and calculates the phase synchronization rate for each of the multiple areas and frequency components in the EEG data of each user (step S212).
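As an illustration only (not part of the patent disclosure), the per-pair phase synchronization described above could be computed as a phase-locking value (PLV) over instantaneous phases, separated by acquisition region and frequency band. The instantaneous phases would typically come from band-pass filtering followed by a Hilbert transform; the function names and data layout below are assumptions:

```python
import cmath

def phase_locking_value(phases_a, phases_b):
    """Phase locking value (PLV) between two instantaneous-phase series (radians).

    Returns 1.0 when the phase difference is perfectly constant (full
    synchronization) and values near 0.0 when there is no consistent
    phase relation.
    """
    assert len(phases_a) == len(phases_b) and phases_a
    acc = sum(cmath.exp(1j * (pa - pb)) for pa, pb in zip(phases_a, phases_b))
    return abs(acc) / len(phases_a)

def pairwise_sync_rates(user_phases):
    """PLV for every pair of users, per (region, band).

    user_phases: dict mapping user id -> dict mapping a (region, band)
        tuple, e.g. ("left", "alpha"), to a list of instantaneous phases.
    Returns a dict mapping (user_i, user_j) -> {(region, band): PLV}.
    """
    users = sorted(user_phases)
    rates = {}
    for i, ui in enumerate(users):
        for uj in users[i + 1:]:
            shared = set(user_phases[ui]) & set(user_phases[uj])
            rates[(ui, uj)] = {
                key: phase_locking_value(user_phases[ui][key],
                                         user_phases[uj][key])
                for key in shared
            }
    return rates
```

A constant phase offset between two users still yields a PLV of 1.0, which is the usual notion of phase synchronization rather than phase equality.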
 行動決定部213は、同調率算出部12によって算出された同調率を基に、同調を促す行動を決定する行動決定処理を行う(ステップS213)。出力制御部214は、ステップS213において決定された行動を示す行動情報を、出力部14-1,14-2,14-3から出力させる(ステップS214)。 The behavior decision unit 213 performs a behavior decision process to decide on a behavior that will encourage synchronization, based on the synchronization rates calculated by the synchronization rate calculation unit 12 (step S213). The output control unit 214 causes the output units 14-1, 14-2, and 14-3 to output behavior information indicating the behavior decided in step S213 (step S214).
[行動決定処理]
 図7は、図6に示す行動決定処理の処理手順の一例を示すフローチャートである。図7では、現時刻の直前1分間の同調率の平均が50%以上のユーザのペア(同調ペア)を抽出し、他のペアのユーザに、同調ペアに含まれるユーザとの1対1のやり取りを行うように行動を決定する行動決定アルゴリズムを一例として説明する。
[Action Decision Processing]
Fig. 7 is a flowchart showing an example of the processing procedure of the behavior decision process shown in Fig. 6. Fig. 7 illustrates, as an example, a behavior decision algorithm that extracts pairs of users (synchronized pairs) whose average synchronization rate over the minute immediately before the current time is 50% or more, and decides, for users not included in a synchronized pair, an action of one-to-one interaction with a user included in a synchronized pair.
 図7に示すように、行動決定部213は、同調率算出部12によって算出された同調率を基に、全てのペアについて、現時刻の直前1分間の各秒の同調率を集計し、同調率の平均を算出する(ステップS221)。 As shown in FIG. 7, the behavior decision unit 213 counts the synchronization rates for each second in the last minute before the current time for all pairs based on the synchronization rates calculated by the synchronization rate calculation unit 12, and calculates the average synchronization rate (step S221).
 行動決定部213は、ステップS221の算出結果より、同調率の平均が50%以上の同調ペアはあるか否かを判定する(ステップS222)。 The behavior decision unit 213 determines whether there are any synchronized pairs whose average synchronization rate is 50% or more based on the calculation results of step S221 (step S222).
 同調ペアがない場合(ステップS222:No)、行動決定部213は、全体の同調率を高めるように、全ユーザに対して、合図に合わせて呼吸を行う行動を決定する(ステップS223)。 If there is no synchronized pair (step S222: No), the behavior decision unit 213 decides on an action for all users to breathe in time with the cue so as to increase the overall synchronization rate (step S223).
 同調ペアがある場合(ステップS222:Yes)、行動決定部213は、ユーザ番号であるuを1に初期化し(ステップS224)、ユーザuが同調ペアに含まれるか否かを判定する(ステップS225)。 If a synchronized pair exists (step S222: Yes), the behavior decision unit 213 initializes the user number u to 1 (step S224) and determines whether the user u is included in the synchronized pair (step S225).
 ユーザuが同調ペアに含まれる場合(ステップS225:Yes)、このユーザuは、同調ペアの他方のユーザと十分同調しているため、行動決定部213は、このユーザuに対する行動出力はないと決定する(ステップS227)。 If user u is included in a synchronized pair (step S225: Yes), this user u is sufficiently synchronized with the other user of the synchronized pair, so the behavior decision unit 213 determines that there is no behavior output for this user u (step S227).
 ユーザuが同調ペアに含まれない場合(ステップS225:No)、行動決定部213は、ユーザuの行動として、同調ペアに含まれるユーザのうち、ユーザuと最も同調率が高いユーザとの視線合わせ及び握手を決定する(ステップS226)。 If user u is not included in the synchronized pair (step S225: No), the behavior decision unit 213 decides that the behavior of user u is to make eye contact and shake hands with the user who has the highest synchronization rate with user u among the users included in the synchronized pair (step S226).
 ステップS226またはステップS227の処理後、行動決定部213は、u=u+1とする(ステップS228)。行動決定部213は、uの数値と、uの最大値u_maxとを比較し、u≦u_maxであるか否かを判定する(ステップS229)。 After the process of step S226 or step S227, the behavior decision unit 213 sets u = u + 1 (step S228). The behavior decision unit 213 then compares the value of u with the maximum value u_max of u to determine whether u ≤ u_max is satisfied (step S229).
 u≦u_maxである場合(ステップS229:Yes)、行動決定部213は、ステップS225に戻る。u≦u_maxでない場合(ステップS229:No)、行動決定部213は、行動決定処理を終了する。 If u ≤ u_max is satisfied (step S229: Yes), the behavior decision unit 213 returns to step S225. If u ≤ u_max is not satisfied (step S229: No), the behavior decision unit 213 ends the behavior decision process.
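The flowchart of Fig. 7 can be condensed into the following sketch. This is an illustration only, not the patent's implementation; the action strings, the `frozenset` pair keys, and the tie-breaking in `max` are assumptions introduced here:

```python
def decide_actions(pair_sync_history, users, threshold=0.5):
    """Sketch of the behavior decision process of Fig. 7.

    pair_sync_history: dict mapping frozenset({u, v}) -> list of per-second
        synchronization rates (0.0-1.0) over the minute before the current time.
    Returns a dict mapping user -> action string, or None for "no output".
    """
    # Step S221: average synchronization rate per pair over the last minute.
    averages = {pair: sum(r) / len(r) for pair, r in pair_sync_history.items()}
    # Step S222: pairs at or above the threshold are "synchronized pairs".
    synced_pairs = [p for p, avg in averages.items() if avg >= threshold]
    if not synced_pairs:
        # Step S223: no synchronized pair -> everyone breathes to a shared cue.
        return {u: "breathe in time with the cue" for u in users}
    synced_users = set().union(*synced_pairs)
    actions = {}
    for u in users:  # Steps S224-S229: loop over user numbers u = 1 .. u_max.
        if u in synced_users:
            actions[u] = None  # Step S227: already synchronized, no output.
        else:
            # Step S226: the synced user with the highest rate relative to u.
            partner = max(synced_users,
                          key=lambda v: averages.get(frozenset({u, v}), 0.0))
            actions[u] = f"make eye contact and shake hands with user {partner}"
    return actions
```

The `u = u + 1` loop of steps S228-S229 corresponds to the `for u in users` iteration above.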
[実施の形態2の効果]
 実施の形態2に係る提示装置210は、複数のユーザの脳波データを取得し、各ユーザの脳波データを基に位相の同調度を算出する。そして、提示装置210は、算出した同調度を基に、ユーザに対して、同調を促す行動を決定し、決定した行動を示す情報を出力する。
[Effects of the second embodiment]
The presentation device 210 according to the second embodiment acquires electroencephalogram data of a plurality of users and calculates a phase synchronization degree based on the electroencephalogram data of each user. Then, based on the calculated synchronization degree, the presentation device 210 determines an action to encourage synchronization for each user and outputs information indicating the determined action.
 このように、提示装置210は、現在の同調状態に応じて、脳波データの位相の同調を促す具体的な行動を各ユーザに提示することで、脳波データの位相の同調の向上を実現することができる。 In this way, the presentation device 210 can improve the synchronization of the phase of the electroencephalogram data by presenting each user with specific actions that encourage synchronization of the phase of the electroencephalogram data according to the current synchronization state.
 そして、提示装置210は、ユーザ間の脳波データの位相の同調状態に応じて、ユーザごとに、同調状態を向上させる行動を提示するため、各ユーザにそれぞれ適した行動を実行させることができる。また、提示装置210は、脳波データの位相の同調度が所定度合い未満である場合、全ユーザに対して、合図に合わせて呼吸を行う行動を決定することで、全体の同調状態を向上させることができる。 The presentation device 210 presents, to each user, an action that improves the synchronization state according to the phase synchronization state of the EEG data between the users, so that each user can perform an action appropriate for them. Furthermore, when the degree of phase synchronization of the EEG data is less than a predetermined degree, the presentation device 210 determines, for all users, an action of breathing in time with a cue, thereby improving the overall synchronization state.
 また、提示装置210は、複数人での脳波同調を対象とし、脳波データの位相の同調度が所定度合いよりも高いペア(例えば、前述の同調ペア)がある場合、それ以外のユーザに対して、同調ペアとの同調を高めるような行動を提示することで全体の同調率を高める。 In addition, the presentation device 210 targets brainwave synchronization among multiple people, and when there is a pair (such as the aforementioned synchronized pair) whose brainwave data phase synchronization is higher than a predetermined degree, it increases the overall synchronization rate by presenting other users with actions that will increase synchronization with the synchronized pair.
[実施形態のシステム構成について]
 上記に示した提示装置10,210の各構成要素は機能概念的なものであり、必ずしも物理的に図示のように構成されていることを要しない。すなわち、提示装置10,210の機能の分散および統合の具体的形態は図示のものに限られず、その全部または一部を、各種の負荷や使用状況などに応じて、任意の単位で機能的または物理的に分散または統合して構成することができる。
[System configuration of the embodiment]
The components of the presentation devices 10 and 210 shown above are conceptual and functional, and do not necessarily have to be physically configured as shown in the drawings. In other words, the specific form of distribution and integration of the functions of the presentation devices 10 and 210 is not limited to that shown in the drawings, and all or part of them can be functionally or physically distributed or integrated in any unit depending on various loads, usage conditions, etc.
 また、提示装置10,210においておこなわれる各処理は、全部または任意の一部が、CPUおよびCPUにより解析実行されるプログラムにて実現されてもよい。また、提示装置10,210においておこなわれる各処理は、ワイヤードロジックによるハードウェアとして実現されてもよい。 Furthermore, each process performed in the presentation device 10, 210 may be realized, in whole or in part, by a CPU and a program analyzed and executed by the CPU. Furthermore, each process performed in the presentation device 10, 210 may be realized as hardware using wired logic.
 また、実施の形態において説明した各処理のうち、自動的におこなわれるものとして説明した処理の全部または一部を手動的に行うこともできる。もしくは、手動的におこなわれるものとして説明した処理の全部または一部を公知の方法で自動的に行うこともできる。この他、上述および図示の処理手順、制御手順、具体的名称、各種のデータやパラメータを含む情報については、特記する場合を除いて適宜変更することができる。 Furthermore, among the processes described in the embodiments, all or part of the processes described as being performed automatically can be performed manually. Alternatively, all or part of the processes described as being performed manually can be performed automatically using known methods. In addition, the information including the processing procedures, control procedures, specific names, various data, and parameters described above and illustrated can be modified as appropriate unless otherwise specified.
[プログラム]
 図8は、プログラムが実行されることにより、提示装置10,210が実現されるコンピュータの一例を示す図である。コンピュータ1000は、例えば、メモリ1010、CPU1020を有する。また、コンピュータ1000は、ハードディスクドライブインタフェース1030、ディスクドライブインタフェース1040、シリアルポートインタフェース1050、ビデオアダプタ1060、ネットワークインタフェース1070を有する。これらの各部は、バス1080によって接続される。
[Program]
Fig. 8 is a diagram showing an example of a computer that realizes the presentation devices 10 and 210 by executing a program. The computer 1000 has, for example, a memory 1010 and a CPU 1020. The computer 1000 also has a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. These components are connected by a bus 1080.
 メモリ1010は、ROM1011およびRAM1012を含む。ROM1011は、例えば、BIOS(Basic Input Output System)等のブートプログラムを記憶する。ハードディスクドライブインタフェース1030は、ハードディスクドライブ1090に接続される。ディスクドライブインタフェース1040は、ディスクドライブ1100に接続される。例えば磁気ディスクや光ディスク等の着脱可能な記憶媒体が、ディスクドライブ1100に挿入される。シリアルポートインタフェース1050は、例えばマウス1110、キーボード1120に接続される。ビデオアダプタ1060は、例えばディスプレイ1130に接続される。 The memory 1010 includes a ROM 1011 and a RAM 1012. The ROM 1011 stores a boot program such as a BIOS (Basic Input Output System). The hard disk drive interface 1030 is connected to a hard disk drive 1090. The disk drive interface 1040 is connected to a disk drive 1100. A removable storage medium such as a magnetic disk or optical disk is inserted into the disk drive 1100. The serial port interface 1050 is connected to a mouse 1110 and a keyboard 1120, for example. The video adapter 1060 is connected to a display 1130, for example.
 ハードディスクドライブ1090は、例えば、OS(Operating System)1091、アプリケーションプログラム1092、プログラムモジュール1093、プログラムデータ1094を記憶する。すなわち、提示装置10の各処理を規定するプログラムは、コンピュータ1000により実行可能なコードが記述されたプログラムモジュール1093として実装される。プログラムモジュール1093は、例えばハードディスクドライブ1090に記憶される。例えば、提示装置10における機能構成と同様の処理を実行するためのプログラムモジュール1093が、ハードディスクドライブ1090に記憶される。なお、ハードディスクドライブ1090は、SSD(Solid State Drive)により代替されてもよい。 The hard disk drive 1090 stores, for example, an OS (Operating System) 1091, application programs 1092, program modules 1093, and program data 1094. That is, the programs that define each process of the presentation device 10 are implemented as program modules 1093 in which code executable by the computer 1000 is written. The program modules 1093 are stored, for example, in the hard disk drive 1090. For example, a program module 1093 for executing processes similar to the functional configuration of the presentation device 10 is stored in the hard disk drive 1090. The hard disk drive 1090 may be replaced by an SSD (Solid State Drive).
 また、上述した実施の形態の処理で用いられる設定データは、プログラムデータ1094として、例えばメモリ1010やハードディスクドライブ1090に記憶される。そして、CPU1020が、メモリ1010やハードディスクドライブ1090に記憶されたプログラムモジュール1093やプログラムデータ1094を必要に応じてRAM1012に読み出して実行する。 Furthermore, the setting data used in the processing of the above-mentioned embodiment is stored as program data 1094, for example, in memory 1010 or hard disk drive 1090. Then, the CPU 1020 reads the program module 1093 or program data 1094 stored in memory 1010 or hard disk drive 1090 into RAM 1012 as necessary and executes it.
 なお、プログラムモジュール1093やプログラムデータ1094は、ハードディスクドライブ1090に記憶される場合に限らず、例えば着脱可能な記憶媒体に記憶され、ディスクドライブ1100等を介してCPU1020によって読み出されてもよい。あるいは、プログラムモジュール1093およびプログラムデータ1094は、ネットワーク(LAN(Local Area Network)、WAN(Wide Area Network)等)を介して接続された他のコンピュータに記憶されてもよい。そして、プログラムモジュール1093およびプログラムデータ1094は、他のコンピュータから、ネットワークインタフェース1070を介してCPU1020によって読み出されてもよい。 The program module 1093 and program data 1094 may not necessarily be stored in the hard disk drive 1090, but may be stored in a removable storage medium, for example, and read by the CPU 1020 via the disk drive 1100 or the like. Alternatively, the program module 1093 and program data 1094 may be stored in another computer connected via a network (such as a LAN (Local Area Network), WAN (Wide Area Network)). The program module 1093 and program data 1094 may then be read by the CPU 1020 from the other computer via the network interface 1070.
 以上、本発明者によってなされた発明を適用した実施の形態について説明したが、本実施の形態による本発明の開示の一部をなす記述および図面により本発明は限定されることはない。すなわち、本実施の形態に基づいて当業者等によりなされる他の実施の形態、実施例および運用技術等はすべて本発明の範疇に含まれる。 The above describes an embodiment of the invention made by the inventor, but the present invention is not limited to the description and drawings that form part of the disclosure of the present invention according to this embodiment. In other words, all other embodiments, examples, and operational techniques made by those skilled in the art based on this embodiment are included in the scope of the present invention.
 10,210 提示装置
 11-1,11-2,11-3 脳波取得部
 12 同調率算出部
 13,214 出力制御部
 14-1,14-2,14-3 出力部
 213 行動決定部
10, 210 Presentation device
11-1, 11-2, 11-3 Brain wave acquisition unit
12 Synchronization rate calculation unit
13, 214 Output control unit
14-1, 14-2, 14-3 Output unit
213 Action decision unit

Claims (7)

  1.  複数のユーザの脳波データを取得する取得部と、
     各ユーザの脳波データを基に位相の同調度を算出する算出部と、
     前記算出部によって算出された同調度を基に、各ユーザに対して、同調を促す行動を決定する決定部と、
     前記決定部によって決定された行動を示す情報を出力部から出力させる出力制御部と、
     を有することを特徴とする提示装置。
    An acquisition unit that acquires electroencephalogram data of a plurality of users;
    A calculation unit that calculates a phase synchronization degree based on the electroencephalogram data of each user;
    A determination unit that determines an action for encouraging synchronization for each user based on the synchronization degree calculated by the calculation unit;
    an output control unit that causes an output unit to output information indicating the action determined by the determination unit;
    A presentation device comprising:
  2.  前記算出部は、各ユーザの脳波データを取得部位及び所定の周波数成分に分けて、複数の部位及び所定の周波数成分ごとの位相の同調度を算出することを特徴とする請求項1に記載の提示装置。 The presentation device according to claim 1, characterized in that the calculation unit divides the electroencephalogram data of each user into acquisition areas and predetermined frequency components, and calculates the degree of phase synchronization for each of the multiple areas and predetermined frequency components.
  3.  前記算出部は、各ユーザの脳波の取得部位として左脳及び右脳に分けて、α波、β波、γ波、θ波の同調度を算出することを特徴とする請求項2に記載の提示装置。 The presentation device according to claim 2, characterized in that the calculation unit calculates the synchronization of alpha waves, beta waves, gamma waves, and theta waves by dividing the brainwave acquisition area of each user into the left and right brain.
  4.  前記同調を促す行動は、前記同調度の範囲ごとにそれぞれ設定されており、
     前記決定部は、前記算出部によって算出された同調度が属する範囲に応じて、前記同調を促す行動を前記ユーザごとに決定することを特徴とする請求項1に記載の提示装置。
    The behaviors that encourage synchronization are set for each range of the synchronization degree,
    The presentation device according to claim 1 , wherein the determination unit determines, for each user, the behavior that encourages the synchronization depending on a range to which the synchronization degree calculated by the calculation unit belongs.
  5.  前記決定部は、前記算出部によって算出された前記同調度が所定度合い未満である場合、全ユーザに対して、合図に合わせて呼吸を行う行動を決定することを特徴とする請求項1に記載の提示装置。 The presentation device according to claim 1, characterized in that the determination unit determines, for all users, an action to breathe in accordance with a cue when the synchronization degree calculated by the calculation unit is less than a predetermined degree.
  6.  提示装置が実行する提示方法であって、
     複数のユーザの脳波データを取得する工程と、
     各ユーザの脳波データを基に位相の同調度を算出する工程と、
     前記算出する工程において算出された同調度を基に、各ユーザに対して、同調を促す行動を決定する工程と、
     前記決定する工程によって決定された行動を示す情報を出力部から出力させる工程と、
     を含んだことを特徴とする提示方法。
    A presentation method executed by a presentation device, comprising:
    acquiring electroencephalogram data of a plurality of users;
    calculating a phase synchronization degree based on the electroencephalogram data of each user;
    determining an action for encouraging synchronization for each user based on the degree of synchronization calculated in the calculating step;
    outputting information indicating the action determined by the determining step from an output unit;
    A presentation method comprising:
  7.  コンピュータを、請求項1~5のいずれか一つに記載の提示装置として機能させるための提示プログラム。 A presentation program for causing a computer to function as a presentation device according to any one of claims 1 to 5.
PCT/JP2022/042005 2022-11-10 2022-11-10 Presentation device, presentation method, and presentation program WO2024100860A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/042005 WO2024100860A1 (en) 2022-11-10 2022-11-10 Presentation device, presentation method, and presentation program

Publications (1)

Publication Number Publication Date
WO2024100860A1 true WO2024100860A1 (en) 2024-05-16

Family

ID=91032154

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH114892A (en) * 1997-06-13 1999-01-12 Seiko Epson Corp Organism synchronization detecting device
JP2011519654A (en) * 2008-05-09 2011-07-14 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Synchronizing heart rate parameters of multiple users
WO2022209296A1 (en) * 2021-03-31 2022-10-06 国立研究開発法人情報通信研究機構 Empathy measurement method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22965184

Country of ref document: EP

Kind code of ref document: A1