CN117836742A - Sensory transmission system, sensory transmission method, and sensory transmission program


Info

Publication number
CN117836742A
CN117836742A (application number CN202280057052.5A)
Authority
CN
China
Prior art keywords
subject
information
sensory information
sensory
brain activation
Legal status
Pending
Application number
CN202280057052.5A
Other languages
Chinese (zh)
Inventor
菅原隆幸
岛仓孝满
大段翔平
Current Assignee
JVCKenwood Corp
Original Assignee
JVCKenwood Corp
Application filed by JVCKenwood Corp
Priority claimed from PCT/JP2022/035933 (WO2023048295A1)
Publication of CN117836742A


Landscapes

  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The sensory transmission system includes a first device, an estimation device, and a second device. The first device detects brain activation information of a first subject while the first subject is perceiving. The estimation device estimates reference sensory information, which is the sensory information evoked by the perception, based on the detected brain activation information of the first subject, and, based on the estimated reference sensory information, estimates corresponding sensory information, which is sensory information of a second subject different from the first subject that corresponds to the reference sensory information. The second device applies a stimulus to the second subject so that the estimated corresponding sensory information is evoked in the second subject.

Description

Sensory transmission system, sensory transmission method, and sensory transmission program
Technical Field
The present disclosure relates to a sensory transmission system, a sensory transmission method, and a sensory transmission program.
Background
In recent years, techniques for noninvasively measuring brain activation information, such as functional magnetic resonance imaging and near-infrared spectroscopy, have been developed, and brain-computer interfaces, which connect the brain to the outside world, have been put into practical use. As one example of such techniques, the following configuration has been disclosed: brain activation information of a sleeping subject is detected to determine the subject's sleep state, and when the subject is determined to be in rapid eye movement sleep, a dream-inducing stimulus is applied to the subject (see, for example, Patent Document 1).
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Laid-Open No. 2003-332251.
Disclosure of Invention
Problems to be solved by the invention
The correspondence between a subject's brain activity state and the measured brain activation information varies with the individuality of the subject, the subject's surrounding environment at the time of measurement, and so on. For example, in the technique described in Patent Document 1, the dream that is seen when a dream-inducing stimulus is applied differs from subject to subject and with the timing of the stimulus, and the content of the dream itself is difficult to control. Accordingly, when a brain activity state is determined or controlled based on a subject's brain activation information, a technique is needed that accounts for factors such as individual differences and differences in the subjects' surrounding environments.
The present disclosure has been made in view of the above circumstances, and an object thereof is to provide a sensory transmission system and a sensory transmission method capable of appropriately transmitting a sensation between subjects even when there are individual differences between the subjects or differences in the subjects' surrounding environments.
Means for solving the problems
A sensory transmission system according to the present disclosure includes: a first device that detects brain activation information of a first subject while the first subject is perceiving; an estimation device that estimates reference sensory information based on the detected brain activation information of the first subject, the reference sensory information being the sensory information evoked by the perception, and that estimates, based on the estimated reference sensory information, corresponding sensory information for a second subject different from the first subject, the corresponding sensory information being sensory information corresponding to the reference sensory information; and a second device that applies a stimulus to the second subject so that the estimated corresponding sensory information is evoked in the second subject.
A sensory transmission method according to the present disclosure includes: detecting brain activation information of a first subject while the first subject is perceiving; estimating reference sensory information, which is the sensory information evoked by the perception, based on the detected brain activation information of the first subject, and estimating, based on the estimated reference sensory information, corresponding sensory information for a second subject different from the first subject, the corresponding sensory information being sensory information corresponding to the reference sensory information; and applying a stimulus to the second subject so that the estimated corresponding sensory information is evoked in the second subject.
A sensory transmission program according to the present disclosure causes a computer to execute: detecting brain activation information of a first subject while the first subject is perceiving; estimating reference sensory information, which is the sensory information evoked by the perception, based on the detected brain activation information of the first subject, and estimating, based on the estimated reference sensory information, corresponding sensory information for a second subject different from the first subject, the corresponding sensory information being sensory information corresponding to the reference sensory information; and applying a stimulus to the second subject so that the estimated corresponding sensory information is evoked in the second subject.
Effects of the invention
According to the present disclosure, a sensation can be appropriately transmitted between subjects even when there are individual differences between the subjects or differences in the subjects' surrounding environments.
Drawings
Fig. 1 is a schematic diagram showing an example of a sensory transmission system according to the first embodiment;
Fig. 2 is a functional block diagram showing an example of the sensory transmission system according to the first embodiment;
Fig. 3 is a diagram showing an example of a neural network;
Fig. 4 is a diagram schematically showing an example of the operation of the sensory transmission system according to the first embodiment;
Fig. 5 is a flowchart showing an example of the operation of the sensory transmission system according to the first embodiment;
Fig. 6 is a diagram schematically showing an example of the operation of a sensory transmission system according to a second embodiment;
Fig. 7 is a diagram schematically showing an example of the operation of a sensory transmission system according to a third embodiment;
Fig. 8 is a functional block diagram showing an example of a sensory transmission system according to a fourth embodiment;
Fig. 9 is a diagram schematically showing an example of the operation of the sensory transmission system according to the fourth embodiment;
Fig. 10 is a flowchart showing an example of the operation of the sensory transmission system according to the fourth embodiment;
Fig. 11 is a functional block diagram showing an example of a sensory transmission system according to a fifth embodiment;
Fig. 12 is a diagram schematically showing an example of the operation of the sensory transmission system according to the fifth embodiment;
Fig. 13 is a flowchart showing an example of the operation of the sensory transmission system according to the fifth embodiment.
Detailed Description
Embodiments of the present disclosure will be described below with reference to the drawings. The present invention is not limited to these embodiments. The constituent elements in the following embodiments include elements that a person skilled in the art could easily replace with others, and elements that are substantially the same.
First embodiment
Fig. 1 is a schematic diagram showing an example of a sensory transmission system 100 according to the first embodiment, and Fig. 2 is a functional block diagram showing an example of the sensory transmission system 100. As shown in Figs. 1 and 2, the sensory transmission system 100 includes a first device 10, an estimation device 20, and a second device 30.
The first device 10 detects brain activation information of a first subject R1 while the first subject R1 is perceiving. The first device 10 includes a detection unit 11, a communication unit 12, a processing unit 13, a stimulus applying unit 14, and a storage unit 15.
The detection unit 11 detects brain activation information. Examples of the brain activation information include the oxyhemoglobin concentration, deoxyhemoglobin concentration, and total hemoglobin concentration in the subject's cerebral blood flow. As the detection unit 11, for example, a measurement device based on fMRI (functional Magnetic Resonance Imaging), fNIRS (functional Near-Infrared Spectroscopy), or the like, a measurement device using invasive electrodes, or a measurement device in which a micromachine is placed in a cerebral blood vessel and measurement is performed with the micromachine can be used. The detection unit 11 is not limited to these devices, and other types of devices may be used. For example, when the brain of the first subject R1 is divided into a three-dimensional matrix of voxels a few millimeters or smaller on a side, the brain activation information can be expressed as the magnitude of activity of each voxel.
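For illustration only, the following sketch (not part of the patent) shows one way the brain activation information described above could be held in software: a three-dimensional array with one activity magnitude per voxel. The grid size, value range, and NumPy representation are assumptions chosen for the example.

```python
# Illustrative sketch: brain activation information as a 3-D voxel grid.
# The grid shape and the random values are assumptions, not measured data.
import numpy as np

VOXEL_GRID = (64, 64, 48)  # hypothetical matrix dividing the brain into voxels

def make_activation_map(rng: np.random.Generator) -> np.ndarray:
    """Return a dummy activation map holding one activity magnitude per voxel."""
    return rng.normal(loc=0.0, scale=1.0, size=VOXEL_GRID).astype(np.float32)

if __name__ == "__main__":
    activation = make_activation_map(np.random.default_rng(0))
    print(activation.shape)         # (64, 64, 48)
    print(float(activation.max()))  # magnitude of the most active voxel
```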
The communication unit 12 is an interface for wired or wireless communication. The communication unit 12 transmits the brain activation information detected by the detection unit 11 to the estimation device 20. The communication unit 12 includes an interface for communicating with external devices over a so-called wireless LAN conforming to the IEEE 802.11 standard. The communication unit 12 may also communicate with external devices under the control of the processing unit 22 or the like. The communication method is not limited to a wireless LAN and may include, for example, infrared communication, Bluetooth (registered trademark), Wireless USB, or other wireless communication methods. A wired connection such as a USB cable, HDMI (registered trademark), IEEE 1394, or Ethernet may also be used.
The processing unit 13, the stimulus applying unit 14, and the storage unit 15 will be described later; they need not be provided in the first embodiment.
The estimation device 20 includes a communication unit 21, a processing unit 22, and a storage unit 23. The communication unit 21 is an interface for wired or wireless communication. The communication unit 21 receives, for example, the brain activation information transmitted from the first device 10, and transmits, for example, the corresponding sensory information estimated by the processing unit 22 (described later) to the second device 30. The communication unit 21 may have the same configuration as the communication unit 12 described above.
The processing unit 22 includes a processing device such as a CPU (Central Processing Unit) and memory devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory). The processing unit 22 performs various kinds of processing, including the estimation processing described below.
The storage unit 23 stores various kinds of information. The storage unit 23 includes storage such as a hard disk drive or a solid-state drive. An external storage medium such as a removable disk may also be used as the storage unit 23.
The processing unit 22 estimates sensory information (reference sensory information) evoked in the first subject R1 by the perception, based on the detected brain activation information of the first subject R1. The sensory information may relate to at least one of the five senses of sight, hearing, touch, taste, and smell, or to the sense of balance or other somatic senses. Specifically, when the sensory information relates to vision, it is image data perceived by the first subject R1; however, it is not limited to the image data itself and may be image data obtained by sampling or filtering that image data. When the sensory information relates to vision, it may also be information on the light entering the eyeball of the first subject R1. In that case, the information on the light may be acquired, for example, with a contact lens incorporating a photosensor, or with an artificial retina; specifically, information on the light from a CCD sensor provided in the artificial retina may be used. When the sensory information relates to hearing, it may be sound signal data perceived by the first subject R1. When the sensory information relates to taste, it may be data indicating indices of a plurality of chemical substances that reproduce the taste perceived by the first subject R1. When the sensory information relates to touch, it may be data indicating, on an image in which the entire body surface of the first subject R1 is developed onto a plane, where and to what degree stimulation occurred. These kinds of sensory information are examples and are not limiting. For the first subject R1, the correspondence between brain activation information and the sensory information evoked at that time can be obtained by experiments performed in advance. For example, brain activation information detected from the first subject R1 and the sensory information corresponding to that brain activation information can be paired as a learning data set, and a first learning model can be generated by machine learning on such learning data sets. The first learning model can be stored in the storage unit 23 in advance, for example.
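As an illustration of the paired learning described above, the sketch below trains a small regression model on hypothetical pairs of brain activation maps and sensory information (here, flattened images). The data shapes, network architecture, and training settings are assumptions made for the sketch, and PyTorch is used only as one publicly available framework; this is not the patent's implementation of the first learning model.

```python
# Minimal sketch of the "first learning model" idea: learn a mapping from a
# subject's voxel activation map to the sensory information recorded with it.
# All shapes and hyperparameters below are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical learning data sets: (brain activation, sensory information) pairs.
activations = torch.randn(256, 1, 32, 32, 24)  # (samples, channel, x, y, z)
sensory_images = torch.randn(256, 64 * 64)     # flattened 64x64 images

model = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool3d(4), nn.Flatten(),     # -> 8 * 4 * 4 * 4 features
    nn.Linear(8 * 4 * 4 * 4, 64 * 64),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

loader = DataLoader(TensorDataset(activations, sensory_images),
                    batch_size=32, shuffle=True)
for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)  # reconstruct sensory info from activation
        loss.backward()
        optimizer.step()
```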
Based on the estimated reference sensory information, the processing unit 22 estimates corresponding sensory information for a second subject R2 different from the first subject R1, the corresponding sensory information being sensory information that corresponds to the reference sensory information. Specific examples of the corresponding sensory information correspond to the specific examples of sensory information given above. The second subject R2 is the subject to whom the sensation of the first subject R1 is to be transmitted. The relationship between the reference sensory information and the corresponding sensory information can be established by experiments performed in advance. For example, sensory information that corresponds between the first subject R1 and the second subject R2 can be paired as a learning data set, and a second learning model can be generated by machine learning on such learning data sets. The second learning model can be stored in the storage unit 23 in advance, for example.
The second device 30 applies a stimulus to the second subject R2 so that the estimated corresponding sensory information is evoked. The second device 30 includes a detection unit 31, a communication unit 32, a processing unit 33, a stimulus applying unit 34, and a storage unit 35.
The detection unit 31 will be described later; it need not be provided in the first embodiment.
The communication unit 32 is an interface for performing wired communication or wireless communication. The communication unit 32 receives the corresponding sensory information transmitted from the estimation device 20. The communication unit 32 transmits the brain activation information detected by the detection unit 31 to the estimation device 20. The communication unit 32 may have the same configuration as the communication unit 12 described above.
The stimulus applying unit 34 applies a stimulus to the second subject R2 by irradiating a target site of the brain of the second subject R2 with an electromagnetic wave signal to activate that site. In this case, the brain of the second subject R2 is divided into a three-dimensional matrix of voxels a few millimeters or smaller on a side, for example, and electromagnetic waves are irradiated onto individual voxels. The stimulus applying unit 34 can irradiate the electromagnetic waves based on stimulus image information indicating which voxels of the three-dimensional matrix are to be irradiated and at what intensity. The voxels of the three-dimensional matrix of the stimulus image information may correspond in size and position to the voxels of the three-dimensional matrix of the brain activation information, for example. Which sensory information is evoked when a given voxel in the brain of the second subject R2 is irradiated with electromagnetic waves of a given intensity can be determined by experiments performed in advance. For example, stimulus image information for the second subject R2 and the sensory information evoked in the second subject R2 when electromagnetic waves are irradiated based on that stimulus image information can be paired as a learning data set, and a third learning model can be generated by machine learning on such learning data sets. The third learning model can be stored in advance in the storage unit 35 of the second device 30, for example. The processing unit 33 can calculate the stimulus image information corresponding to the corresponding sensory information received by the communication unit 32, based on that corresponding sensory information and the third learning model.
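The sketch below illustrates the stimulus image information described above as data: one electromagnetic-wave intensity per voxel of the same three-dimensional matrix used for the brain activation information, from which the voxels to irradiate can be enumerated. The grid size and the threshold are assumptions made for the example.

```python
# Illustrative sketch: stimulus image information as per-voxel intensities.
import numpy as np

VOXEL_GRID = (64, 64, 48)  # hypothetical voxel matrix covering the brain

def stimulus_targets(stimulus_image: np.ndarray, threshold: float = 0.5):
    """Yield (voxel index, intensity) pairs that would be irradiated."""
    assert stimulus_image.shape == VOXEL_GRID
    for idx in zip(*np.nonzero(stimulus_image > threshold)):
        yield idx, float(stimulus_image[idx])

if __name__ == "__main__":
    image = np.zeros(VOXEL_GRID, dtype=np.float32)
    image[10, 20, 5] = 0.9  # example: irradiate this voxel at intensity 0.9
    print(list(stimulus_targets(image)))
```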
The first, second, and third learning models described above can be generated using, for example, a neural network (a convolutional neural network) typified by VGG16. Fig. 3 is a diagram showing an example of a neural network. As shown in the upper part of Fig. 3, the neural network NW has 13 convolutional layers S1, 5 pooling layers S2, and 3 fully connected layers S3. The neural network processes the input information successively in the convolutional layers S1 and pooling layers S2, and the processing results are combined and output by the fully connected layers S3.
When the first, second, and third learning models are generated, as shown in the middle part of Fig. 3, learning data sets composed of mutually corresponding pieces of information (I1, I2) are input to the neural network NW, and the correlation within the learning data sets is learned by machine learning such as deep learning. That is, a learning model is generated as the neural network NW optimized by this learning. For example, the learning solves the problem of obtaining the other piece of information when one piece of information in a learning data set is input.
When inference is performed using the first, second, or third learning model, as shown in the lower part of Fig. 3, one piece of information I1 of the two pieces that constitute a learning data set is input to the neural network NW. Based on the learned correlation of the information constituting the learning data sets, the learning model outputs the other piece of information I2 corresponding to the input information I1. Although this embodiment describes an example in which the learning models are generated using a convolutional neural network typified by VGG16, the present invention is not limited to this, and other types of neural networks may be used to generate the learning models.
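As a side note, the layer counts cited above for VGG16 (13 convolutional layers, 5 pooling layers, and 3 fully connected layers) can be checked against the publicly available torchvision implementation. The patent does not rely on this particular library; the snippet is only an illustration of the network family named above.

```python
# Count the layer types of a stock VGG16 network (untrained weights).
from torch import nn
from torchvision.models import vgg16

net = vgg16(weights=None)
convs = [m for m in net.modules() if isinstance(m, nn.Conv2d)]
pools = [m for m in net.modules() if isinstance(m, nn.MaxPool2d)]
fcs = [m for m in net.modules() if isinstance(m, nn.Linear)]
print(len(convs), len(pools), len(fcs))  # 13 5 3
```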
Next, a sensory transmission method using the sensory transmission system 100 configured as described above will be described. Fig. 4 is a diagram schematically showing an example of the operation of the sensory transmission system 100. As shown in Fig. 4, the first subject R1 perceives something and thereby evokes reference sensory information. Hereinafter, a case in which the first subject R1 visually perceives a cat's face and evokes it as visual information will be described as an example.
As shown in the upper part of Fig. 4, the detection unit 11 detects the brain activation information 42 of the first subject R1, who perceives the cat's face and evokes the sensory information 41. The communication unit 12 transmits the brain activation information 42 detected by the detection unit 11 to the estimation device 20.
In the estimation device 20, the communication unit 21 receives the brain activation information 42 transmitted from the first device 10. The processing unit 22 estimates the reference sensory information 43 based on the received brain activation information 42. In this case, the processing unit 22 inputs the brain activation information 42 of the first subject R1 into the first learning model. Based on the learned correlation between brain activation information 42 and reference sensory information 43, the first learning model outputs the reference sensory information 43 corresponding to the input brain activation information 42. The processing unit 22 acquires the output reference sensory information 43 as the estimation result.
Based on the estimated reference sensory information 43, the processing unit 22 estimates the corresponding sensory information 44 of the second subject R2 that corresponds to the reference sensory information 43. In this case, the processing unit 22 inputs the acquired reference sensory information 43 into the second learning model. Based on the learned correlation between reference sensory information 43 and corresponding sensory information 44, the second learning model outputs the corresponding sensory information 44 corresponding to the input reference sensory information 43. The processing unit 22 acquires the output corresponding sensory information 44 as the estimation result. The communication unit 21 transmits the acquired corresponding sensory information 44 to the second device 30.
In the second device 30, the communication unit 32 receives the corresponding sensory information 44 transmitted from the estimation device 20. The processing unit 33 inputs the received corresponding sensory information 44 into the third learning model stored in the storage unit 35. Based on the learned correlation between corresponding sensory information 44 and stimulus image information, the third learning model outputs the stimulus image information 45 corresponding to the input corresponding sensory information 44. The stimulus applying unit 34 applies a stimulus to the brain of the second subject R2 by irradiating it with electromagnetic waves based on the output stimulus image information 45. As a result, the second subject R2, to whom the stimulus is applied by the stimulus applying unit 34, evokes the corresponding sensory information 46 corresponding to the stimulus image information 45. That is, the second subject R2 evokes visual information of the cat's face as the corresponding sensory information 46.
On the other hand, because of differences in individuality and the like between the first subject R1 and the second subject R2, the correspondence between brain activity state and brain activation information differs between them. Therefore, as shown in the lower part of Fig. 4 (indicated by the dash-dot line), if the brain activation information 42 of the first subject R1 were transmitted directly to the second device 30 and the second device 30 applied a stimulus to the brain of the second subject R2 to reproduce that brain activation information 42, the second subject R2 would very likely evoke, as the sensory information 47, visual information different from the cat's face. In that case, the sensory information of the first subject R1 is not properly transmitted to the second subject R2.
In contrast, in the sensory transmission system 100 according to the present embodiment, the estimation device 20 estimates the corresponding sensory information 44 of the second subject R2, so the visual information of the cat's face is appropriately transmitted from the first subject R1 to the second subject R2.
Fig. 5 is a flowchart showing an example of the operation of the sensory transmission system 100. As shown in Fig. 5, in the sensory transmission system 100, the first device 10 detects the brain activation information of the first subject R1 while the first subject R1 is perceiving (step S101). Next, the estimation device 20 estimates the reference sensory information evoked in the first subject R1 by the perception, based on the brain activation information of the first subject R1 (step S102). Next, based on the estimated reference sensory information, the estimation device 20 estimates the corresponding sensory information, for the second subject R2 different from the first subject R1, that corresponds to the reference sensory information (step S103). Then, the second device 30 applies a stimulus to the second subject R2 so that the estimated corresponding sensory information is evoked (step S104).
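The following sketch mirrors the flow of Fig. 5 in plain Python, with the trained first, second, and third learning models represented as interchangeable callables. The function name, array shapes, and identity placeholders are assumptions introduced only to keep the sketch self-contained and runnable.

```python
# Pipeline sketch corresponding to steps S101-S104 of Fig. 5.
from typing import Callable
import numpy as np

Model = Callable[[np.ndarray], np.ndarray]

def transfer_sensation(activation_r1: np.ndarray,   # detected in step S101
                       first_model: Model,          # activation -> reference sensory info (S102)
                       second_model: Model,         # reference -> corresponding sensory info (S103)
                       third_model: Model) -> np.ndarray:
    """Return the stimulus image information handed to the stimulus applying unit (S104)."""
    reference = first_model(activation_r1)
    corresponding = second_model(reference)
    return third_model(corresponding)

if __name__ == "__main__":
    identity: Model = lambda x: x  # placeholder models for illustration only
    stimulus_image = transfer_sensation(np.zeros((64, 64, 48)),
                                        identity, identity, identity)
    print(stimulus_image.shape)  # (64, 64, 48)
```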
As described above, the sensory transmission system 100 according to the present embodiment includes: the first device 10 that detects brain activation information of the first subject R1 while the first subject R1 is perceiving; the estimation device 20 that estimates reference sensory information, which is the sensory information evoked by the perception, based on the detected brain activation information of the first subject R1, and that estimates, based on the estimated reference sensory information, corresponding sensory information, which is sensory information corresponding to the reference sensory information, for the second subject R2 different from the first subject R1; and the second device 30 that applies a stimulus to the second subject R2 so that the estimated corresponding sensory information is evoked.
The sensory transmission method according to the present embodiment includes: detecting brain activation information of the first subject R1 while the first subject R1 is perceiving; estimating reference sensory information, which is the sensory information evoked in the first subject R1 by the perception, based on the detected brain activation information of the first subject R1, and estimating, based on the estimated reference sensory information, corresponding sensory information, which corresponds to the reference sensory information, for the second subject R2 different from the first subject R1; and applying a stimulus to the second subject R2 so that the estimated corresponding sensory information is evoked.
The sensory transmission program according to the present embodiment causes a computer to execute: detecting brain activation information of the first subject R1 while the first subject R1 is perceiving; estimating reference sensory information, which is the sensory information evoked in the first subject R1 by the perception, based on the detected brain activation information of the first subject R1, and estimating, based on the estimated reference sensory information, corresponding sensory information, which corresponds to the reference sensory information, for the second subject R2 different from the first subject R1; and applying a stimulus to the second subject R2 so that the estimated corresponding sensory information is evoked.
With this configuration, the second subject R2 is not made to evoke the reference sensory information of the first subject R1 directly; instead, the corresponding sensory information for the second subject R2 is estimated based on the reference sensory information, and a stimulus is applied to the second subject R2 so that the estimated corresponding sensory information is evoked. Therefore, even when brain activity differs between the first subject R1 and the second subject R2 when they evoke sensory information, sensory information can be appropriately transmitted from the first subject R1 to the second subject R2.
In the sensory transmission system 100 according to the present embodiment, the reference sensory information is the sensory information evoked in the first subject R1, whose brain activation information is detected. With this configuration, the sensory information of the first subject R1 and the sensory information of the second subject R2 are directly related to each other, so sensory information can be appropriately transmitted.
In the sensory transmission system 100 according to the present embodiment, the stimulus includes irradiating a target site of the brain of the second subject R2 with an electromagnetic wave signal to activate that site. With this configuration, the brain of the second subject R2 is activated directly, so the second subject R2 can evoke the sensory information more directly.
Second embodiment
Next, a second embodiment will be described. In the first embodiment, the case where the sensory transmission system 100 transmits a sensation in one direction, from the first subject R1 to the second subject R2, was described as an example. In the second embodiment, the sensory transmission system 100 also transmits a sensation from the second subject R2 to the first subject R1. That is, the sensory transmission system 100 is configured to be able to transmit sensations in both directions between the first subject R1 and the second subject R2.
The overall configuration of the sensory transmission system 100 is the same as in the first embodiment. The configuration of the sensory transmission system 100 will be described below with reference to Figs. 1 and 2, starting from the second device 30 side.
The second device 30 includes the detection unit 31, the communication unit 32, the processing unit 33, the stimulus applying unit 34, and the storage unit 35. The processing unit 33, the stimulus applying unit 34, and the storage unit 35 are the same as in the first embodiment. The detection unit 31 detects brain activation information of the second subject R2 in the same manner as the detection unit 11 in the first embodiment. The communication unit 32 transmits the brain activation information detected by the detection unit 31 to the estimation device 20.
The estimation device 20 includes the communication unit 21, the processing unit 22, and the storage unit 23, as in the first embodiment. The communication unit 21 can perform wired or wireless communication. In the present embodiment, the communication unit 21 receives, for example, the brain activation information transmitted from the second device 30, and transmits, for example, the corresponding sensory information estimated by the processing unit 22 (described later) to the first device 10.
The processing unit 22 estimates sensory information (reference sensory information) evoked in the second subject R2 by the perception, based on the detected brain activation information of the second subject R2. For the second subject R2, the correspondence between brain activation information and the sensory information evoked at that time can be obtained by experiments performed in advance. For example, brain activation information detected from the second subject R2 and the sensory information corresponding to that brain activation information can be paired as a learning data set, and a fourth learning model can be generated by machine learning on such learning data sets. The fourth learning model can be stored in the storage unit 23 in advance, for example.
Based on the estimated reference sensory information, the processing unit 22 estimates corresponding sensory information for the first subject R1, the corresponding sensory information being sensory information that corresponds to the reference sensory information. In this case, the processing unit 22 can perform the estimation based on the second learning model stored in the storage unit 23.
The first device 10 includes the detection unit 11, the communication unit 12, the processing unit 13, the stimulus applying unit 14, and the storage unit 15. The detection unit 11 and the communication unit 12 have the same configuration as in the first embodiment. In addition, the communication unit 12 receives the corresponding sensory information transmitted from the estimation device 20.
The processing unit 13 estimates stimulus image information corresponding to the corresponding sensory information received by the communication unit 12. The stimulus image information indicates the content of the stimulus that the stimulus applying unit 14 applies to the first subject R1.
The stimulus applying unit 14 applies a stimulus to the first subject R1 by irradiating a target site of the brain of the first subject R1 with an electromagnetic wave signal to activate that site. In this case, as with the stimulus applying unit 34 in the first embodiment, the brain of the first subject R1 is divided into a three-dimensional matrix of voxels a few millimeters or smaller on a side, and electromagnetic waves are irradiated onto individual voxels. The stimulus applying unit 14 can irradiate the electromagnetic waves based on stimulus image information indicating which voxels of the three-dimensional matrix are to be irradiated and at what intensity. Which sensory information is evoked when a given voxel in the brain of the first subject R1 is irradiated with electromagnetic waves of a given intensity can be determined by experiments performed in advance. For example, stimulus image information for the first subject R1 and the sensory information evoked in the first subject R1 when electromagnetic waves are irradiated based on that stimulus image information can be paired as a learning data set, and a fifth learning model can be generated by machine learning on such learning data sets. The fifth learning model can be stored in advance in the storage unit 15 of the first device 10, for example.
Like the first to third learning models, the fourth and fifth learning models described above are generated using, for example, a neural network typified by VGG16. When the fourth and fifth learning models are generated, the learning data sets are input to the neural network, and the correlation within the learning data sets is learned by machine learning such as deep learning. That is, a learning model is generated as the neural network optimized by this learning. The network is not limited to a convolutional neural network typified by VGG16, and other types of neural networks may be used to generate the learning models.
Next, a sensory transmission method using the sensory transmission system 100 configured as described above will be described. The sensory transmission method for transmitting a sensation from the first subject R1 to the second subject R2 is the same as in the first embodiment. Here, the case where a sensation is transmitted from the second subject R2 to the first subject R1 will be described.
Fig. 6 is a diagram schematically showing an example of the operation of the sensory transmission system 100. As shown in Fig. 6, the second subject R2 perceives something and thereby evokes reference sensory information. Hereinafter, a case in which the second subject R2 visually perceives a cat's face and evokes it as visual information will be described as an example. In the following example, the cat's face is the same as in the first embodiment.
As shown in the upper part of Fig. 6, the detection unit 31 detects the brain activation information 52 of the second subject R2, who perceives the cat's face and evokes the sensory information 51. The communication unit 32 transmits the brain activation information 52 detected by the detection unit 31 to the estimation device 20.
In the estimation device 20, the communication unit 21 receives the brain activation information 52 transmitted from the second device 30. The processing unit 22 estimates the reference sensory information 53 based on the received brain activation information 52. In this case, the processing unit 22 inputs the brain activation information 52 of the second subject R2 into the fourth learning model. Based on the learned correlation between brain activation information 52 and reference sensory information 53, the fourth learning model outputs the reference sensory information 53 corresponding to the input brain activation information 52. The processing unit 22 acquires the output reference sensory information 53 as the estimation result.
Based on the estimated reference sensory information 53, the processing unit 22 estimates the corresponding sensory information 54 of the first subject R1 that corresponds to the reference sensory information 53. In this case, the processing unit 22 inputs the acquired reference sensory information 53 into the second learning model. Based on the learned correlation between reference sensory information 53 and corresponding sensory information 54, the second learning model outputs the corresponding sensory information 54 corresponding to the input reference sensory information 53. The processing unit 22 acquires the output corresponding sensory information 54 as the estimation result. The communication unit 21 transmits the acquired corresponding sensory information 54 to the first device 10.
As shown in the lower part of Fig. 6, in the first device 10, the communication unit 12 receives the corresponding sensory information 54 transmitted from the estimation device 20. The processing unit 13 inputs the received corresponding sensory information 54 into the fifth learning model stored in the storage unit 15. Based on the learned correlation between corresponding sensory information 54 and stimulus image information 55, the fifth learning model outputs the stimulus image information 55 corresponding to the input corresponding sensory information 54. The stimulus applying unit 14 applies a stimulus to the brain of the first subject R1 by irradiating it with electromagnetic waves based on the output stimulus image information 55. As a result, the first subject R1, to whom the stimulus is applied by the stimulus applying unit 14, evokes the corresponding sensory information 56 corresponding to the stimulus image information 55. That is, the first subject R1 evokes visual information of the cat's face as the corresponding sensory information 56. The visual information of the cat's face is thus transmitted from the second subject R2 to the first subject R1.
As described above, in the sensory transmission system 100 according to the present embodiment, the reference sensory information is the sensory information evoked in the second subject R2, whose brain activation information is detected. With this configuration, the sensory information of the second subject R2 and the sensory information of the first subject R1 are directly related to each other, so sensory information can be appropriately transmitted between subjects whose brain activity differs.
Third embodiment
Next, a third embodiment will be described. In the first and second embodiments, the reference sensory information is the sensory information evoked in the subject whose brain activation information is detected. In the third embodiment, by contrast, a case will be described in which standard sensory information, extracted from sensory information that corresponds among a plurality of subjects, is used as the reference sensory information. The overall configuration of the sensory transmission system 100 is the same as in the first embodiment.
When the estimation device 20 estimates the corresponding sensory information from the reference sensory information, the standard sensory information is used as the reference sensory information. For example, when a plurality of subjects perform the same perception, such as observing the same image, the standard sensory information can be an average of the brain activation information detected from those subjects. The standard sensory information is therefore average sensory information in which individual differences due to acquired memories and the like are small.
The standard sensory information can be extracted from the learning results obtained when learning the sensory information that corresponds among a plurality of subjects. For example, a sixth learning model can be generated by machine learning with the brain activation information of a specific subject (for example, the first subject R1 or the second subject R2) and the standard sensory information (an average of the brain activation information of a plurality of subjects) paired as a learning data set. When the sixth learning model is generated, the standard sensory information may be extracted from the plurality of pieces of sensory information included in the learning data sets, and the extracted standard sensory information may be associated with each piece of brain activation information included in the learning data sets. As a result, a sixth learning model is generated in which the correspondence between the individual brain activation information of a plurality of subjects and the standard sensory information has been machine-learned for different types of sensory information, such as visual sensory information when a certain object is viewed, auditory sensory information when a certain sound is heard, and tactile sensory information when a certain object is touched. The sixth learning model can be stored in the storage unit 23, for example.
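As a simple illustration of extracting standard sensory information as described above, the sketch below averages per-subject recordings made while the subjects perform the same perception (for example, observing the same image). The array shape and the use of a plain mean are assumptions for the example.

```python
# Illustrative sketch: standard sensory information as a cross-subject average.
import numpy as np

def standard_sensory_information(per_subject: list[np.ndarray]) -> np.ndarray:
    """Average the recordings of all subjects for the same perception."""
    stacked = np.stack(per_subject, axis=0)  # shape: (subjects, ...)
    return stacked.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    recordings = [rng.normal(size=(64, 64)) for _ in range(5)]  # five subjects
    print(standard_sensory_information(recordings).shape)       # (64, 64)
```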
Next, a sensory transmission method using the sensory transmission system 100 configured as described above will be described. Fig. 7 is a diagram schematically showing an example of the operation of the sensory transmission system 100. For example, when the sensory information 61 perceived by the first subject R1 is transmitted to the second subject R2, the brain activation information 62 of the first subject R1 is acquired by the first device 10 and transmitted to the estimation device 20.
In the estimation device 20, the processing unit 22 inputs the brain activation information 62 transmitted from the first device 10 and identification information of the second subject R2, who is the transmission destination of the sensory information, into the sixth learning model stored in the storage unit 23. The sixth learning model calculates the standard sensory information 63 corresponding to the brain activation information 62 and outputs, as the corresponding sensory information 64, the sensory information of the second subject R2 that corresponds to the standard sensory information 63. The communication unit 21 transmits the output corresponding sensory information 64 to the second device 30.
The second device 30 receives the corresponding sensory information 64 transmitted from the estimation device 20 and acquires the stimulus image information 65 based on the received corresponding sensory information 64, as in the first embodiment. The stimulus applying unit 34 applies a stimulus to the brain of the second subject R2 by irradiating it with electromagnetic waves based on the stimulus image information 65. The second subject R2, to whom the stimulus is applied by the stimulus applying unit 34, evokes the corresponding sensory information 66 corresponding to the stimulus image information 65.
As described above, in the sensory transmission system 100 according to the present embodiment, the reference sensory information is the standard sensory information 63 extracted from sensory information that corresponds among a plurality of subjects. With this configuration, because the standard sensory information 63 extracted from the corresponding sensory information of a plurality of subjects is used as the reference sensory information, sensory information can be appropriately transmitted among a plurality of subjects.
Fourth embodiment
Fig. 8 is a diagram showing an example of a sensory transmission system 200 according to the fourth embodiment. In the sensory transmission system 100 according to the above embodiments, sensory information is transmitted between different subjects. In the sensory transmission system 200 according to the fourth embodiment, by contrast, a case where sensory information is transmitted to the same subject at a different point in time will be described as an example.
As shown in Fig. 8, the sensory transmission system 200 includes a detection stimulation device (detection device, stimulation device) 110 and an estimation device 120. The detection stimulation device 110 has, for example, the same configuration as the first device 10 described in the above embodiments and includes the detection unit 11, the communication unit 12, the processing unit 13, the stimulus applying unit 14, and the storage unit 15. The detection unit 11 detects brain activation information. The communication unit 12 performs wired or wireless communication and transmits the brain activation information detected by the detection unit 11 to the estimation device 120. The processing unit 13 calculates stimulus image information based on the corresponding sensory information received by the communication unit 12. The stimulus applying unit 14 applies a stimulus to the target subject R4 by irradiating a target site of the brain of the target subject R4 with an electromagnetic wave signal, based on the calculated stimulus image information, to activate that site. The storage unit 15 stores various kinds of information.
The estimation device 120 includes the communication unit 21, the processing unit 22, and the storage unit 23. The communication unit 21 can perform wired or wireless communication. In the present embodiment, the communication unit 21 receives, for example, the brain activation information transmitted from the detection stimulation device 110, and transmits, for example, the corresponding sensory information estimated by the processing unit 22 (described later) to the detection stimulation device 110.
The processing unit 22 estimates the reference sensory information evoked by the perception, based on the brain activation information of the target subject R4 detected at a first time point. The first time point may be a time point when the target subject R4 is young; for example, the target subject R4 may be under three years old.
The reference sensory information may be, for example, the sensory information evoked in the target subject R4 by the perception at the first time point. In this case, the processing unit 22 can estimate the reference sensory information based on a seventh learning model similar to the first learning model described above. The seventh learning model can be stored in the storage unit 23, for example.
The reference sensory information may instead be, for example, standard sensory information extracted from sensory information that corresponds among a plurality of subjects. The standard sensory information may be, for example, information extracted from the sensory information evoked in a plurality of subjects whose brain growth state corresponds to that of the target subject R4. Examples of such a plurality of subjects include subjects of a corresponding age, subjects from corresponding growth environments (latitude, cultural environment, language used, and the like), and subjects having the same or a corresponding occupation. The processing unit 22 can estimate the standard sensory information from the brain activation information of the target subject R4 at the first time point using, for example, an eighth learning model. The eighth learning model can be stored in the storage unit 23, for example.
Based on the estimated reference sensory information, the processing unit 22 estimates the corresponding sensory information of the target subject R4 at a second time point, later than the first time point, that corresponds to the reference sensory information. The degree of brain growth of the target subject R4 may differ substantially between the first time point and the second time point. For example, sensory information that corresponds between the target subject R4 at the first time point and the target subject R4 at the second time point can be paired as a learning data set, and a ninth learning model can be generated by machine learning on such learning data sets. The ninth learning model can be stored in the storage unit 23 in advance, for example.
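The sketch below illustrates the idea behind the ninth learning model described above: a learned mapping from sensory information of the target subject at the first time point to the corresponding sensory information of the same subject at the second time point. The vector size, network, and random training data are assumptions introduced only so the sketch runs.

```python
# Minimal sketch of a "ninth learning model": time point 1 -> time point 2.
import torch
from torch import nn

sensory_t1 = torch.randn(128, 4096)  # hypothetical paired learning data sets
sensory_t2 = torch.randn(128, 4096)

ninth_model = nn.Sequential(nn.Linear(4096, 1024), nn.ReLU(), nn.Linear(1024, 4096))
optimizer = torch.optim.Adam(ninth_model.parameters(), lr=1e-3)
for _ in range(10):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(ninth_model(sensory_t1), sensory_t2)
    loss.backward()
    optimizer.step()

# At the second time point, stored reference sensory information would be
# mapped to corresponding sensory information for the same subject:
with torch.no_grad():
    corresponding = ninth_model(sensory_t1[:1])
print(corresponding.shape)  # torch.Size([1, 4096])
```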
Next, a sensory transmission method using the sensory transmission system 200 configured as described above will be described. Fig. 9 is a diagram schematically showing an example of the operation of the sensory transmission system 200. As shown in the upper part of Fig. 9, the detection unit 11 of the detection stimulation device 110 detects the brain activation information 72 of the target subject R4, who perceives a cat's face and evokes the sensory information 71. The communication unit 12 transmits the brain activation information 72 detected by the detection unit 11 to the estimation device 120. In the estimation device 120, the communication unit 21 receives the brain activation information 72 transmitted from the detection stimulation device 110. The processing unit 22 inputs the received brain activation information into the seventh learning model or the eighth learning model, which outputs the reference sensory information 73 corresponding to the input brain activation information 72. The processing unit 22 acquires the output reference sensory information 73 as the estimation result, and the storage unit 23 stores the acquired reference sensory information 73.
As shown in the lower part of Fig. 9, at the second time point, later than the first time point, when, for example, the target subject R4 wishes to re-experience the perception of the cat's face, the target subject R4 is positioned so that the stimulus applying unit 14 of the detection stimulation device 110 can apply a stimulus. The processing unit 22 estimates the corresponding sensory information 74 of the target subject R4 that corresponds to the reference sensory information 73, based on the reference sensory information 73 stored in the storage unit 23. In this case, the processing unit 22 inputs the acquired reference sensory information 73 into the ninth learning model, which outputs the corresponding sensory information 74 corresponding to the input reference sensory information 73. The processing unit 22 acquires the output corresponding sensory information 74 as the estimation result. The communication unit 21 transmits the acquired corresponding sensory information 74 to the detection stimulation device 110.
In the detection stimulation device 110, the communication unit 12 receives the corresponding sensory information 74 transmitted from the estimation device 120. The processing unit 13 inputs the received corresponding sensory information 74 into the third learning model stored in the storage unit 15, which outputs the stimulus image information 75 corresponding to the input corresponding sensory information 74. The stimulus applying unit 14 applies a stimulus to the brain of the target subject R4 by irradiating it with electromagnetic waves based on the output stimulus image information 75. As a result, the target subject R4, to whom the stimulus is applied by the stimulus applying unit 14, evokes the corresponding sensory information 76 corresponding to the stimulus image information 75. That is, at the second time point, the target subject R4 evokes, as the corresponding sensory information 76, the visual information from viewing the cat's face at the first time point. The visual information of the cat's face is thus transmitted from the target subject R4 at the first time point to the target subject R4 at the second time point, and the target subject R4 can re-experience viewing the cat's face.
Fig. 10 is a flowchart showing an example of the operation of the sensory transmission system 200. As shown in Fig. 10, in the sensory transmission system 200, the detection stimulation device 110 detects the brain activation information of the target subject R4 while the target subject R4 is perceiving at the first time point (step S201). Next, the estimation device 120 estimates the reference sensory information at the first time point based on the brain activation information of the target subject R4 (step S202). Next, based on the estimated reference sensory information, the estimation device 120 estimates the corresponding sensory information of the target subject R4 at the second time point that corresponds to the reference sensory information (step S203). Then, the detection stimulation device 110 applies a stimulus to the target subject R4 at the second time point so that the estimated corresponding sensory information is evoked (step S204).
As described above, the sensory transmission system 200 according to the present embodiment includes: a detection device (detection stimulation device 110) that detects brain activation information of the target subject R4 when the target subject R4 senses; an estimating device 120 that estimates reference sensory information, which is sensory information that is a perception idea, based on the brain activation information of the subject R4 detected at the first time point, and that estimates corresponding sensory information corresponding to the reference sensory information of the subject R4 at a second time point at which time has elapsed compared to the first time point, based on the estimated reference sensory information; and a stimulation device (detection stimulation device 110) that applies stimulation to the subject examinee R4 at a second time point so that the estimated corresponding sensory information is associated.
The sensory transmission method according to the present embodiment includes: detecting brain activation information of the target subject R4 when the target subject R4 perceives; estimating reference sensory information, which is sensory information associated with the perception, based on the brain activation information of the target subject R4 detected at the first time point, and estimating, based on the estimated reference sensory information, corresponding sensory information corresponding to the reference sensory information for the target subject R4 at a second time point later than the first time point; and applying a stimulus to the target subject R4 at the second time point so that the target subject R4 associates the estimated corresponding sensory information.
According to this configuration, instead of causing the target subject R4 at the second time point to directly associate the reference sensory information of the target subject R4 at the first time point, corresponding sensory information suited to the target subject R4 at the second time point is estimated based on the reference sensory information, and a stimulus is applied to the target subject R4 so that the estimated corresponding sensory information is associated. Therefore, even when there is a difference in brain activity between the target subject R4 at the first time point and the target subject R4 at the second time point when the sensory information is associated, the sensory information can be appropriately transmitted.
In the sensory transmission system 200 according to the present embodiment, the reference sensory information is the sensory information associated with the perception by the target subject R4 at the first time point. In this configuration, the sensory information associated by the target subject R4 at the first time point is directly linked to the sensory information associated at the second time point, whereby the sensory information can be appropriately transmitted.

In the sensory transmission system 200 according to the present embodiment, the reference sensory information is standard sensory information extracted based on sensory information corresponding to a plurality of subjects. In this configuration, since standard sensory information extracted based on sensory information corresponding to a plurality of subjects is used as the reference sensory information, the sensory information can be efficiently transferred between the target subject R4 at different points in time.

In the sensory transmission system 200 according to the present embodiment, the standard sensory information is extracted based on sensory information associated by a plurality of subjects whose brain growth states correspond to that of the target subject R4. With this configuration, the sensory information can be efficiently transmitted between the target subject R4 at different points in time.
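One conceivable way to realize such standard sensory information is to average sensory-information vectors within groups of subjects sharing a brain growth state. The sketch below assumes this averaging rule, a vector encoding, and illustrative group labels; none of these are specified by the disclosure.

```python
import numpy as np
from collections import defaultdict

def extract_standard_info(samples):
    """samples: list of (growth_state_label, sensory_vector) pairs.
    Returns one standard sensory-information vector per growth state,
    here simply the per-group mean (an assumed extraction rule)."""
    groups = defaultdict(list)
    for growth_state, vec in samples:
        groups[growth_state].append(vec)
    return {state: np.mean(vecs, axis=0) for state, vecs in groups.items()}

samples = [("adult", np.array([1.0, 0.0])), ("adult", np.array([0.8, 0.2])),
           ("child", np.array([0.1, 0.9]))]
standard = extract_standard_info(samples)
print(standard["adult"])  # reference used for target subjects with an "adult" growth state
```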
Fifth embodiment
Fig. 11 is a diagram showing an example of a sensory transmission system 300 according to the fifth embodiment. As shown in fig. 11, the sensory transmission system 300 according to the fifth embodiment includes a sensation estimation device 220 and a stimulation device 210.
The sensation estimation device 220 includes a communication unit 221, a processing unit 222, a storage unit 223, and an input unit 224. The communication unit 221 performs wired or wireless communication with the stimulation device 210. The storage unit 223 stores reference sensory information, which is sensory information associated with a perception. The storage unit 223 has a storage device such as a hard disk drive or a solid state drive. An external storage medium such as a removable disk may also be used as the storage unit 223. The reference sensory information may be sensory information associated by the subject R5 or by a person different from the subject R5, or may be standard sensory information extracted based on sensory information corresponding to a plurality of subjects.
The processing unit 222 estimates corresponding sensory information, which is sensory information corresponding to the reference sensory information for the subject R5, based on the reference sensory information stored in the storage unit 223. For example, the tenth learning model can be generated by using pairs of the reference sensory information and the corresponding sensory information of the subject R5 as a learning data set and performing machine learning on that data set. The tenth learning model can be stored in the storage unit 223 in advance, for example. When reference sensory information is designated via the input unit 224 described later, the processing unit 222 estimates the corresponding sensory information corresponding to the designated reference sensory information.
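As an illustration of such paired training, the sketch below fits a linear (ridge-regression) map from reference sensory vectors to subject-specific sensory vectors. The vector encoding, the linear model, and the regularization are assumptions standing in for whatever machine learning the disclosure actually uses.

```python
import numpy as np

def fit_tenth_model(reference_vecs: np.ndarray, subject_vecs: np.ndarray, lam: float = 1e-2):
    """Fit a linear map W so that W @ reference ~ subject-specific sensory vector.
    reference_vecs: (n_samples, ref_dim); subject_vecs: (n_samples, subj_dim).
    Ridge regression is an assumed stand-in for the tenth learning model."""
    X, Y = reference_vecs, subject_vecs
    ref_dim = X.shape[1]
    return Y.T @ X @ np.linalg.inv(X.T @ X + lam * np.eye(ref_dim))

def estimate_corresponding(W: np.ndarray, reference_vec: np.ndarray) -> np.ndarray:
    """Apply the fitted map to one piece of reference sensory information."""
    return W @ reference_vec

# Paired learning data set: (reference sensory info, sensory info of subject R5).
rng = np.random.default_rng(1)
refs = rng.normal(size=(200, 16))
subj = refs @ rng.normal(size=(16, 16)) + 0.01 * rng.normal(size=(200, 16))
W10 = fit_tenth_model(refs, subj)
print(estimate_corresponding(W10, refs[0]).shape)
```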
The input unit 224 accepts a predetermined input operation for inputting information. As the input unit 224, an input device such as a keyboard or a touch panel is used, for example. A button, a lever, a dial, a switch, or another input device may also be used in addition to, or instead of, these. Via the input unit 224, the subject R5 can, for example, select the sensory information that the subject R5 wants to experience from among the plurality of pieces of reference sensory information stored in the storage unit 223.
The stimulation device 210 includes a communication unit 212, a processing unit 213, a stimulation applying unit 214, and a storage unit 215. The communication unit 212 can perform wired communication or wireless communication. The communication unit 212 receives the corresponding sensation information transmitted from the sensation estimation apparatus 220.
The processing unit 213 calculates stimulus image information corresponding to the received corresponding sensory information, based on the corresponding sensory information received by the communication unit 212 and the eleventh learning model, in which the correspondence relation between corresponding sensory information and stimulus image information has been learned. The eleventh learning model can be, for example, a learning model similar to the third learning model of the first embodiment. The stimulus applying unit 214 applies a stimulus to the subject R5 by irradiating a target portion of the brain of the subject R5 with an electromagnetic wave signal to activate the target portion.
Next, a sensory transmission method using the sensory transmission system 300 configured as described above will be described. Fig. 12 is a diagram schematically showing an example of the operation of the sensory transmission system 300 according to the present embodiment. The subject R5 is prepared so that a stimulus can be applied from the stimulus applying unit 214 of the stimulation device 210. When the subject R5 selects the reference sensory information 81 via the input unit 224, the processing unit 222 of the sensation estimation device 220 estimates the corresponding sensory information 82 for the subject R5 based on the selected reference sensory information 81. For example, the processing unit 222 inputs the selected reference sensory information 81 to the tenth learning model. Corresponding sensory information 82 corresponding to the input reference sensory information 81 is output from the tenth learning model. The processing unit 222 acquires the output corresponding sensory information 82 as an estimation result. The communication unit 221 transmits the acquired corresponding sensory information 82 to the stimulation device 210.
In the stimulation device 210, the communication unit 212 receives the corresponding sensory information 82 transmitted from the sensation estimation device 220. The processing unit 213 inputs the received corresponding sensory information 82 to the eleventh learning model stored in the storage unit 215. Stimulus image information 83 corresponding to the input corresponding sensory information 82 is output from the eleventh learning model. The stimulus applying unit 214 applies a stimulus to the brain of the subject R5 by irradiating the brain with electromagnetic waves based on the output stimulus image information 83. As a result, the subject R5 to which the stimulus is applied by the stimulus applying unit 214 associates the corresponding sensory information 84 corresponding to the stimulus image information 83. That is, the subject R5 associates the visual information of the cat's face as the corresponding sensory information 84. In this way, the visual information of the cat's face is transmitted from the sensation estimation device 220 to the subject R5.
Fig. 13 is a flowchart showing an example of the operation of the sensory transmission system 300. As shown in fig. 13, when the subject R5 selects the reference sensory information 81 via the input unit 224, the processing unit 222 of the sensation estimation device 220 acquires the selected reference sensory information 81 (step S301) and estimates the corresponding sensory information 82 for the subject R5 based on the acquired reference sensory information 81 (step S302). Then, the stimulation device 210 applies a stimulus to the subject R5 so that the subject R5 associates the estimated corresponding sensory information 82 (step S303).
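For illustration, steps S301 to S303 can be sketched end to end as follows. The stored reference information, the linear stand-ins for the tenth and eleventh learning models, and all names are assumptions made for this sketch, not the disclosed implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
DIM = 16

# Assumed stand-ins for stored reference sensory information and trained models.
reference_store = {"cat_face": rng.normal(size=DIM)}   # storage unit 223
W_tenth = rng.normal(size=(DIM, DIM))                   # reference -> corresponding
W_eleventh = rng.normal(size=(DIM, DIM))                # corresponding -> stimulus image

def step_s301_select(label: str) -> np.ndarray:
    """Input unit 224: the subject selects which reference sensory information to experience."""
    return reference_store[label]

def step_s302_estimate(reference_81: np.ndarray) -> np.ndarray:
    """Processing unit 222: estimate corresponding sensory information 82 (tenth model)."""
    return W_tenth @ reference_81

def step_s303_stimulate(corresponding_82: np.ndarray) -> np.ndarray:
    """Processing unit 213: derive stimulus image information 83 (eleventh model)
    and hand it to the stimulus applying unit 214."""
    return W_eleventh @ corresponding_82

stimulus_83 = step_s303_stimulate(step_s302_estimate(step_s301_select("cat_face")))
print(stimulus_83.shape)
```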
As described above, the sensation estimation device 220 according to the present embodiment includes: a storage unit 223 that stores reference sensory information, which is sensory information associated with a perception; and a processing unit 222 that estimates, based on the reference sensory information stored in the storage unit 223, corresponding sensory information, which is sensory information corresponding to the reference sensory information for the subject R5.

Further, the sensation estimation method according to the present embodiment includes: acquiring reference sensory information, which is sensory information associated with a perception, from the storage unit 223 that stores the reference sensory information; and estimating, based on the acquired reference sensory information, corresponding sensory information, which is sensory information corresponding to the reference sensory information for the subject R5.

According to this configuration, since the corresponding sensory information suited to the subject R5 is estimated based on the reference sensory information held by the sensation estimation device 220, appropriate sensory information can be estimated for each subject R5.
In the sensation estimation device 220 according to the present embodiment, the reference sensory information is sensory information associated with a perception by a person different from the subject R5. According to this configuration, appropriate sensory information can be estimated between subjects whose brain activity differs.

In the sensation estimation device 220 according to the present embodiment, the reference sensory information is standard sensory information extracted based on sensory information corresponding to a plurality of subjects. According to this configuration, sensory information can be appropriately transmitted among a plurality of subjects.

The sensory transmission system 300 according to the present embodiment includes: the sensation estimation device 220 described above; and the stimulation device 210 that applies a stimulus to the subject R5 so that the subject R5 associates the corresponding sensory information estimated by the sensation estimation device 220. According to this configuration, appropriate sensory information can be estimated for each subject R5 based on the reference sensory information stored in the storage unit 223 and transmitted to the subject R5.
The technical scope of the present disclosure is not limited to the above-described embodiments, and appropriate modifications may be made without departing from the spirit of the present disclosure. For example, in the above embodiments, visual sensory information was described as an example, but the present disclosure is not limited to this; other sensory information may be used as long as the brain activation information associated with that sensory information can be detected. Further, the sensory information may be, for example, detailed sensory information obtained by linking the sensory information of the five senses.
When generating a learning model, the content presented to the subject may be an instruction such as "raise your right hand" so that the response of the subject's motor area is learned. Alternatively, the content presented to the subject may be audiovisual content such as a movie so that the response of the subject's whole brain is learned.
Further, IDs may be assigned to the elements of the standard sensory information obtained when the learning model is generated, such that contents with a higher correlation are assigned closer IDs. For example, when the contents "dog", "cat", and "paper" are present, "dog" and "cat" may be assigned close IDs, while "dog" and "paper" may be assigned distant IDs.
When there is no corresponding sensory information for certain standard sensory information, the content whose ID, set as described above, is closest may be used as the corresponding sensory information, or the corresponding sensory information may simply not be transmitted.
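The ID assignment and nearest-ID fallback just described can be illustrated as follows. The embedding vectors, the principal-direction ordering heuristic, and the fallback rule are all assumptions made for this sketch, not the rule prescribed by the disclosure.

```python
import numpy as np

def assign_ids(contents: dict) -> dict:
    """Assign integer IDs so that contents judged to be closely correlated
    (here via assumed embedding vectors) receive close IDs. Ordering the
    contents along their first principal direction is an illustrative heuristic."""
    names = list(contents)
    X = np.stack([contents[n] for n in names])
    X = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    order = np.argsort(X @ vt[0])            # similar contents become neighbours
    return {names[i]: rank for rank, i in enumerate(order)}

def nearest_content(ids: dict, missing: str, target_id: int):
    """If no corresponding sensory information exists for `missing`, return the
    content with the numerically closest ID, or None to skip transmission."""
    candidates = {n: i for n, i in ids.items() if n != missing}
    return min(candidates, key=lambda n: abs(candidates[n] - target_id)) if candidates else None

contents = {"dog": np.array([0.9, 0.1]), "cat": np.array([0.8, 0.2]), "paper": np.array([0.0, 1.0])}
ids = assign_ids(contents)
print(ids)                                      # "dog" and "cat" receive adjacent IDs
print(nearest_content(ids, "dog", ids["dog"]))  # falls back to "cat"
```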
The corresponding sensory information derived from certain standard sensory information may also be replaced with sensory information of other modalities before being presented. For example, based on the standard sensory information corresponding to the visual information of an orange, sensory information corresponding to the olfactory, gustatory, and tactile information of the orange may be transmitted to the transmission destination instead.
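A minimal sketch of such cross-modal substitution, assuming a lookup table keyed by content and source modality (the table contents and vector encoding are illustrative assumptions):

```python
import numpy as np

# Assumed cross-modal substitution table: the visual standard sensory information
# for "orange" is replaced by sensory information of other modalities before transmission.
rng = np.random.default_rng(3)
substitution_table = {
    ("orange", "visual"): {
        "olfactory": rng.normal(size=8),
        "gustatory": rng.normal(size=8),
        "tactile":   rng.normal(size=8),
    },
}

def substitute(content: str, source_modality: str, target_modalities):
    """Return the sensory information of the requested target modalities
    in place of the original modality, if a substitute is registered."""
    table = substitution_table.get((content, source_modality), {})
    return {m: table[m] for m in target_modalities if m in table}

print(list(substitute("orange", "visual", ["olfactory", "tactile"])))
```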
In addition, when sensory information is transmitted from the first subject R1 to the second subject R2, instruction information such as "transmit the sensory information to the second subject R2" may be omitted, and instead communication control information corresponding to a packet header, such as "sensory information of the first subject R1" or "sensory information requested by the first subject R1 for the second subject R2", may be added.
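If such packet-header-style control information is used, one plausible message layout is sketched below; the field names and types are assumptions made for illustration, since the disclosure does not define a concrete format.

```python
from dataclasses import dataclass, field
import time

@dataclass
class SensoryPacket:
    """Assumed packet layout: communication control information travels in a
    header alongside the payload, e.g. 'sensory information of the first
    subject R1' or 'sensory information requested by R1 for R2'."""
    source_subject: str            # e.g. "R1"
    destination_subject: str       # e.g. "R2"
    kind: str                      # e.g. "sensed" or "requested"
    timestamp: float = field(default_factory=time.time)
    payload: bytes = b""           # serialized corresponding sensory information

packet = SensoryPacket(source_subject="R1", destination_subject="R2",
                       kind="requested", payload=b"...")
print(packet.kind, packet.destination_subject)
```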
In addition, when extracting the standard sensory information, online learning may be performed so that its content is continually updated. The estimated sensory information may also be stored in a storage unit in advance and transmitted to the transmission destination after a predetermined period of time has elapsed.
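A minimal sketch of both ideas, assuming the standard sensory information is maintained as a running mean over vector samples and that delayed transmission is realized with a time-stamped queue; both are assumptions, not the disclosed implementation.

```python
import heapq
import numpy as np

class StandardSensoryInfo:
    """Running-mean update of standard sensory information (an assumed form of
    the online learning mentioned above)."""
    def __init__(self, dim: int):
        self.mean = np.zeros(dim)
        self.count = 0

    def update(self, sample: np.ndarray) -> None:
        self.count += 1
        self.mean += (sample - self.mean) / self.count

class DelayedDelivery:
    """Store estimated sensory information and release it only after a set delay."""
    def __init__(self):
        self._queue = []

    def put(self, release_time: float, info: np.ndarray) -> None:
        heapq.heappush(self._queue, (release_time, info.tolist()))

    def pop_ready(self, now: float):
        out = []
        while self._queue and self._queue[0][0] <= now:
            out.append(np.array(heapq.heappop(self._queue)[1]))
        return out

std = StandardSensoryInfo(dim=4)
std.update(np.ones(4))
buffer = DelayedDelivery()
buffer.put(release_time=10.0, info=np.ones(4))
print(std.mean, buffer.pop_ready(now=12.0))
```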
The sensory transmission system, sensory transmission method, and sensory transmission program according to the present disclosure can be used in a processing device such as a computer.
NW … neural network, R1 … first subject, R2 … second subject, R4 … target subject, R5 … subject, S1 … convolution layer, S2 … pooling layer, S3 … fully connected layer, 10 … first device, 11, 31 … detection unit, 12, 21, 32, 212, 221 … communication unit, 13, 22, 33, 213, 222 … processing unit, 14, 34, 214 … stimulus applying unit, 15, 23, 25, 35, 215, 223 … storage unit, 20, 120 … estimation device, 30 … second device, 41, 47, 51, 61, 71 … sensory information, 42, 52, 62, 72 … brain activation information, 43, 53, 73, 81 … reference sensory information, 44, 46, 54, 56, 64, 66, 74, 76, 82, 84 … corresponding sensory information, 45, 55, 65, 75, 83 … stimulus image information, 63 … standard sensory information, 200, 300 … sensory transmission system, 110 … detection stimulation device, 210 … stimulation device, 220 … sensation estimation device, 224 … input unit.

Claims (13)

1. A sensation transmission system is provided with:
a first device that detects brain activation information of a first subject when the first subject is perceiving;
an estimating device that estimates reference sensory information based on the detected brain activation information of the first subject, and that estimates corresponding sensory information regarding a second subject different from the first subject based on the estimated reference sensory information, the reference sensory information being sensory information reminiscent of the perception, the corresponding sensory information being sensory information corresponding to the reference sensory information; and
a second device that applies a stimulus to the second subject so that the second subject is reminiscent of the estimated corresponding sensory information.
2. The sensory transmission system according to claim 1, wherein,
the second device detects brain activation information of the second subject when stimulus is applied to the second subject,
the estimating means estimates the reference sensory information based on the detected brain activation information of the second subject, estimates the corresponding sensory information about the first subject based on the estimated reference sensory information,
the first device applies a stimulus to the first subject such that the first subject is reminiscent of the estimated corresponding sensory information.
3. The sensory transmission system according to claim 1 or 2, wherein,
the reference sensory information is the sensory information associated with the perception by the one of the first subject and the second subject whose brain activation information is detected.
4. The sensory transmission system according to claim 1 or 2, wherein,
the reference sensory information is standard sensory information extracted based on the sensory information corresponding between a plurality of subjects.
5. The sensory transmission system according to claim 1, wherein,
the first device includes a detection device that detects brain activation information of a subject when the subject perceives,
the estimating means estimates reference sensory information, which is sensory information reminiscent of the perception, based on the in-brain activation information of the subject detected at a first time point, and estimates corresponding sensory information about the subject at a second time point at which time has elapsed compared to the first time point, which is information corresponding to the reference sensory information, based on the estimated reference sensory information,
The second means comprises stimulation means for applying stimulation to the subject at the second point in time such that the subject is reminiscent of the estimated corresponding sensory information.
6. The sensory transmission system according to claim 5, wherein,
the reference sensory information is the sensory information associated with the perception by the subject at the first time point.
7. The sensory transmission system according to claim 5, wherein,
the reference sensory information is standard sensory information extracted based on the sensory information corresponding between a plurality of subjects.
8. The sensory transmission system according to claim 7, wherein,
the standard sensory information is extracted based on the sensory information associated with the perception by a plurality of subjects whose brain growth states correspond to that of the subject.
9. The sensory transmission system according to claim 1, wherein,
the estimation device has:
a storage unit that stores the reference sensation information; and
and a processing section that estimates the corresponding sensory information based on the reference sensory information stored in the storage section.
10. The sensation delivery system of claim 9 wherein,
The reference sensory information is sensory information associated with the perception by a person different from the first subject.
11. The sensation delivery system of claim 9 wherein,
the reference sensory information is standard sensory information extracted based on the sensory information corresponding between a plurality of subjects.
12. A method of sensory transmission, comprising:
detecting brain activation information of a first subject when the first subject perceives;
estimating reference sensory information, which is sensory information associated with the sense, based on the detected brain activation information of the first subject, and estimating corresponding sensory information, which is sensory information corresponding to the reference sensory information, regarding a second subject different from the first subject, based on the estimated reference sensory information; and
applying a stimulus to the second subject such that the second subject is reminiscent of the estimated corresponding sensory information.
13. A sensation transfer program that causes a computer to execute:
detecting brain activation information of a first subject when the first subject perceives;
Estimating reference sensory information, which is sensory information associated with the sense, based on the detected brain activation information of the first subject, and estimating corresponding sensory information, which is sensory information corresponding to the reference sensory information, regarding a second subject different from the first subject, based on the estimated reference sensory information; and
applying a stimulus to the second subject such that the second subject is reminiscent of the estimated corresponding sensory information.
CN202280057052.5A 2021-09-27 2022-09-27 Sensory transmission system, sensory transmission method, and sensory transmission program Pending CN117836742A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2021156506A JP2023047539A (en) 2021-09-27 2021-09-27 Sensory estimation device, sensory transmission system and sensory estimation method
JP2021-156717 2021-09-27
JP2021-156506 2021-09-27
JP2021-157247 2021-09-27
PCT/JP2022/035933 WO2023048295A1 (en) 2021-09-27 2022-09-27 Sensation transmission system, sensation transmission method, and sensation transmission program

Publications (1)

Publication Number Publication Date
CN117836742A true CN117836742A (en) 2024-04-05

Family

ID=85779197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280057052.5A Pending CN117836742A (en) 2021-09-27 2022-09-27 Sensory transmission system, sensory transmission method, and sensory transmission program

Country Status (2)

Country Link
JP (1) JP2023047539A (en)
CN (1) CN117836742A (en)

Also Published As

Publication number Publication date
JP2023047539A (en) 2023-04-06

Similar Documents

Publication Publication Date Title
KR102014176B1 (en) Brain training simulation system based on behavior modeling
JP5320543B2 (en) Brain function enhancement support device and brain function enhancement support method
CN104955385B (en) The stimulus to the sense organ of the accuracy of increase sleep sublevel
JP2009508553A (en) System and method for determining human emotion by analyzing eyeball properties
Novak et al. Predicting targets of human reaching motions using different sensing technologies
JP2002238908A (en) Method for measuring aura of human body and measuring system
JP2009531077A (en) Apparatus and method for real time control of effectors
Faria et al. Cerebral palsy eeg signals classification: Facial expressions and thoughts for driving an intelligent wheelchair
CA3046789A1 (en) Platform for identification of biomarkers using navigation tasks and treatments using navigation tasks
Lamti et al. When mental fatigue maybe characterized by Event Related Potential (P300) during virtual wheelchair navigation
KR102048551B1 (en) System and Method for Virtual reality rehabilitation training using Smart device
KR102282101B1 (en) Method and system for determining state of object using functional near-infrared spectroscopy
JP5870465B2 (en) Brain function training apparatus and brain function training program
US20200214613A1 (en) Apparatus, method and computer program for identifying an obsessive compulsive disorder event
Longo et al. Using brain-computer interface to control an avatar in a virtual reality environment
CN117836742A (en) Sensory transmission system, sensory transmission method, and sensory transmission program
EP4155880A1 (en) A method performed by an information processing device, a program product and a system for processing context related operational and human mental brain activity data
CN109069787A (en) Householder method, auxiliary system and program
WO2023048295A1 (en) Sensation transmission system, sensation transmission method, and sensation transmission program
EP4000520A1 (en) Method and system for sensor signals dependent dialog generation during a medical imaging process
JP2023047666A (en) Sensory transmission system and sensory transmission method
KR101736452B1 (en) Apparatus for analyzing stimulus reaction
JP5681917B2 (en) Brain function enhancement support device and brain function enhancement support method
JP2023048016A (en) Sensation transmission system and method for transmitting sensation
JP7435965B2 (en) Information processing device, information processing method, learning model generation method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination