NL1042207B1 - An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof


Info

Publication number: NL1042207B1
Application number: NL1042207A
Authority: NL (Netherlands)
Prior art keywords: emotion, signals, control signal, human, interaction
Other languages: Dutch (nl)
Other versions: NL1042207A
Inventors: Dr. Erwin Rinaldo Meinders, Mariska Johanna Reitsma
Original assignee: Mentech Innovation B.V.
Application NL1042207A filed by Mentech Innovation B.V.; application granted; published as NL1042207A and NL1042207B1

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

An emotion content control system for converting emotion content signals from a human or animal subject into emotion content control signals for feedback interaction, said system comprising at least: - a first apparatus arranged for sensing from at least a first human or animal subject first subject emotion content signals of at least a first and a second type, processing and converting said first and second type subject emotion content signals into a first emotion content control signal and transmitting said converted first emotion content control signal; - a subject interaction device arranged for receiving said first emotion content control signal being transmitted by said first apparatus and for performing feedback interaction with said first human or animal subject based on said first emotion content control signal being received.

Description

SURROUND EMOTION
An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof
Inventors: Erwin Rinaldo MEINDERS, Mariska Johanna REITSMA
Applicant: Mentech Innovation B.V., Tonnekeshei 14, 5508 VB Veldhoven, Netherlands
FIELD
[001] The invention relates to an emotion content control system for converting emotion content signals from a human or animal subject into emotion content control signals for feedback interaction, said system comprising at least a first apparatus arranged for sensing from at least a first human or animal subject first subject emotion content signals of at least a first and a second type, processing and converting said first and second type subject emotion content signals into a first emotion content control signal and transmitting said converted first emotion content control signal; and a subject interaction device arranged for receiving said first emotion content control signal being transmitted by said first apparatus and for performing feedback interaction with said first human or animal subject based on said first emotion content control signal being received.
[002] The invention also relates to a system comprising at least a further apparatus arranged for sensing from a further human or animal subject further subject emotion content signals of at least a first and a second type, processing and converting said further first and second type subject emotion content signals into a further emotion content control signal and transmitting said converted further emotion content control signal to said subject interaction device for performing feedback interaction with said further human or animal subject based on said further emotion content control signal being received.
[003] The invention also relates to a system wherein said first apparatus is arranged for sensing from at least a further human or animal subject further emotion content signals of at least a first and a second type, and for processing and converting said first and second type emotion content signals of said first and said further human or animal subject into a combined emotion content control signal and transmitting said converted combined emotion content control signal to said subject interaction device for performing feedback interaction with said first and said further human or animal subject based on said converted combined emotion content control signal being received.
[004] The invention also relates to a system wherein said first and/or further apparatus comprises: sensing means for sensing said emotion content signals of said at least first and second type; processing means for processing said emotion content signals of said at least first and second type into said emotion content control signal using cross-correlations based on parametric representations and frequency analysis of said emotion content signals of said at least first and second type; and transmission means for transmitting said emotion content control signal.
[005] The invention also relates to a system wherein said first and/or further apparatus comprises storage means for storing said emotion content signals of said at least first and second type and said emotion content control signal.
[006] The invention also relates to a system wherein said processing means of said first and/or further apparatus are arranged for extracting reference content information from said emotion content signals of said at least first and second type being stored and/or being received, said reference content information being specific to the human and/or animal subject; said reference content information signal being constructed from time and frequency analysis of said emotion content signal of at least a first and second type, or from content analysis of said emotion content signal of at least a first and second type; said reference content information signal being constructed via a self-learning algorithm of patterns, via pattern recognition; and said reference content information being stored in the storage means.
[007] The invention also relates to a system wherein the said processing means of said first and/or further apparatus are capable of generating an emotion content control signal based on a predefined category of event selection.
[008] The invention also relates to a system according to one or more of the preceding claims, wherein said subject interaction device is arranged for receiving feedback signals being input by said first and/or further human or animal subject.
[009] The invention also relates to a system further comprising at least one auxiliary device operable based on said feedback interaction being performed by said subject interaction device.
[010] The invention also relates to a system wherein said auxiliary device comprises at least one of the list comprising: a sound emitting module; a light emitting module; a display screen; a tactile actuator; a vibration actuator; a heat emitting actuator.
[011] The invention also relates to a system wherein said types of emotion content signals comprise at least two of the following signals: heart beat; blood pressure; facial expression; body surface temperature; body surface conductivity; neurotransmitter or hormone profile information; vocal speech characteristics; body gestures or posture.
[012] The invention also relates to a method for enhancing the interaction with the human or animal subject, the method comprising the steps of: A. Exposing the human or animal subject to an interactive environmental setting; B. Sensing and receiving the emotion content signals of a first and second type of said human or animal subject, said emotion content signals being generated by said subject due to said exposure to the interactive environmental setting; C. Processing the emotion content signals of said at least first and second type into said emotion content control signal using cross-correlations based on parametric representations and frequency analysis of said emotion content signals of said at least first and second type; D. Storing said emotion content signals of said at least first and second type and said emotion content control signal; E. Transmitting the emotion content control signal to a subject interaction device arranged for receiving said first emotion content control signal; F. Providing, via said subject interaction device, feedback interaction with said first human or animal subject in said interactive environmental setting.
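By way of illustration only, steps A-F can be pictured as a single sense-process-store-transmit loop. The following Python sketch is not part of the patent; all class and function names (EmotionSample, EmotionPipeline) and the weighting constants are invented, and a weighted sum stands in for the cross-correlation processing of step C:

```python
# Hypothetical sketch of the method of paragraph [012]; names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EmotionSample:
    heartbeat_bpm: float        # first-type emotion content signal
    skin_conductance_us: float  # second-type emotion content signal

@dataclass
class EmotionPipeline:
    history: List[EmotionSample] = field(default_factory=list)  # step D: storage

    def sense(self) -> EmotionSample:
        # Step B: in a real system this would read the body sensors.
        return EmotionSample(heartbeat_bpm=78.0, skin_conductance_us=4.2)

    def process(self, sample: EmotionSample) -> float:
        # Step C (simplified): combine both signal types into one control value.
        # The patent prescribes cross-correlation; a weighted sum stands in here.
        return 0.6 * sample.heartbeat_bpm / 100 + 0.4 * sample.skin_conductance_us / 10

    def run_once(self) -> float:
        sample = self.sense()           # step B
        control = self.process(sample)  # step C
        self.history.append(sample)     # step D
        return control                  # step E: transmit to the interaction device

pipeline = EmotionPipeline()
control_signal = pipeline.run_once()
print(f"emotion content control signal: {control_signal:.2f}")  # step F acts on this
```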
[013] The invention also relates to a computer implemented method that is executed in the emotion content storage apparatus or subject interaction device and consists of executable code compatible with said emotion content storage apparatus enabling the carrying out of a method according to the invention.
[014] The invention also relates to a computer program stored on a non-volatile record carrier, said computer program containing instruction codes, which instruction codes, when executed, carry out a method according to the invention.
BACKGROUND
[015] In recent years, emotion detection has received increasing attention. Emotion states include states of pleasure (for instance happiness), displeasure (for instance sadness), low arousal (for instance quietness) and high arousal (for instance surprise). Social media make use of icons to express emotions. Emotion is expressed by facial, vocal and postural expressions. Emotion can be determined from physiological reactions (activation or arousal, for instance increases in heart rate), changes in activity of the autonomic nervous system (ANS), blood pressure responses, skin responses, pupillary responses, brain waves, and heart responses. Examples include IBM's emotion mouse (Ark, Dryer, & Lu 1999) and a variety of wearable sensors designed by the Affective Computing Group at MIT (e.g. Picard 2000).
[016] In recent years, the availability of measurement devices to measure physiological parameters of users has been growing. Examples include heart beat sensors, respiratory sensors, skin conductance sensors, blood pressure sensors, temperature sensors, oxygen sensors, accelerometer sensors, motion sensors and GPS sensors. These sensors are more and more integrated into the human vicinity (for instance in smart watches, clothing or shoes) or are embedded in the body (for instance underneath the human skin, or inside the body). The quality of the content of the signals has also increased.
[017] Further information on content analysis is generally available to the person skilled in the art; see for example the following articles:
- 'Activity-aware Mental Stress Detection Using Physiological Sensors' by Sun F.T., Kuo C., Cheng H.T., Buthpitiya S., Collins P., Griss M. (Carnegie Mellon University and Nokia Research Center), in: Gris M., Yang G. (eds) Mobile Computing, Applications, and Services. MobiCASE 2010. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 76. Springer, Berlin, Heidelberg.
- 'Towards mental stress detection using wearable physiological sensors' by Wijsman J., Grundlehner B., Liu H., Hermens H., Penders J., in: Conf Proc IEEE Eng Med Biol Soc. 2011;2011:1798-801. doi: 10.1109/IEMBS.2011.6090512.
- 'Stress Detection Using Wearable Physiological Sensors' by Sandulescu V., Andrews S., Ellis D., Bellotto N., Mozos O.M. (2015), in: Ferrandez Vicente J., Alvarez-Sanchez J., de la Paz López F., Toledo-Moreo F., Adeli H. (eds) Artificial Computation in Biology and Medicine. IWINAC 2015. Lecture Notes in Computer Science, vol 9107. Springer, Cham.
- 'Stress Recognition Using Wearable Sensors and Mobile Phones' by Akane Sano, Rosalind W. Picard, in: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII '13), pages 671-676.
[018] In addition to the increased quantity of emotion content and the increased distribution flexibility of these signals, vocal/sound, facial images and postural expressions are sources that provide information about the emotion level of a user. Pattern recognition can be applied to derive emotion levels from these parameters.
[019] In addition to these sources of emotion content, increased attention is given to deriving emotion content from neurotransmitters or hormones. Information about the presence and the temporal concentration of these hormones in a human or animal body provides insight into the emotional status. Pattern recognition of the concentration profiles can be applied to derive emotion levels from these parameters. The hormone dopamine is a feel-good hormone created in the brain. The hormone oxytocin strengthens the bond between persons; mother's milk contains a large amount of this hormone. Endorphins help the body cope with stress and pain and act as a kind of pain relief. Serotonin is the happiness hormone, generated in the gut and the brain.
[020] More information about the working principles of neurotransmitters and hormones is given in:
- www.medicalnewstoday.com/kc/serotonin-facts-232248
- en.wikipedia.org/wiki/Serotonin
- www.sciencedirect.com/science/article/pii/S0166432814004768
- www.newhealthadvisor.com/Serotonin-and-Dopamine.html
- www.life-enhancement.com/magazine/article/178-5-htp-enhance-your-mood-your-sleep-and-a-lot-more
- eocinstitute.org/meditation/dhea_gaba_cortisol_hgh_melatonin_serotonin_endorphins
[021] Many means nowadays exist to augment the perception of the senses. 3D imaging via virtual reality glasses is for instance used to augment the experience of watching movies. Surround sound is used to augment the sensation of audio. 4D cinemas use all kinds of tricks to augment the sensation of a performance, via movement, water droplets, etc.
[022] The care industry responds to these developments by introducing care games to augment the sensory performance of vulnerable people, such as mentally disabled persons and elderly with dementia. Care games can for instance augment the sensation of feeling, or the interaction between an image and movement of the body. For example, a care game used to stimulate the activity of mentally disabled people or elderly with dementia can be equipped with emotion detection to increase participation in the game. Excitement can be stimulated by increasing the complexity of the offered game features; boredom can be avoided by offering different features or levels.
[023] It has also been suggested that improved user experience may be achieved by providing emotion content signals. Quasi-live performances of great singers are common practice: the experience is augmented by showing live recordings, including live voice/singing, live dance/performance and other visual live elements. Augmentation of the performance with the live emotion of the remote or passed-away performer will augment the experience of the public. How great would it be to listen to a recorded live performance of Elvis's 'How Great Thou Art', with the sensation of feeling his emotions as well, via recorded emotion data from past live performances? Or to experience the sensation of scoring a goal during a world championship soccer game.
[024] It has also been suggested that improved gaming experience may be achieved by providing emotion content signals. If the emotion of a game player is determined and simultaneously provided as input signal to control the course of a game, the gaming experience will be influenced. For example, if a gamer wants to relax, he or she can program the game in such a way that excitement, captured by the emotion content signal, is mitigated by changing the degree of difficulty, the pace of the game, the appearance of the game, the environmental setting, the appearance of the characters and personalities, etc. The game may also be programmed to enhance the emotional status via the emotion control signal.
[025] It has also been suggested that improved training performance may be achieved by providing emotion content signals. If the emotion of a person, a horse or a dog during training is determined and simultaneously provided as input signal to steer the training program, the results of the program may be enhanced. For instance, the emotional status of a dressage horse may be used to influence the training program. If the trainer notices stress build-up, he can decide to practice an exercise the horse already knows, to reduce stress and give the horse confidence. In case the trainer detects happiness or positive emotions, he can decide to increase the degree of difficulty, or practice a difficult element of the dressage program. The same applies to dogs and other animals. Also for sportsmen, the emotional status may be used to steer the training program, based on positive and negative attributes. The same holds for soldiers.
[026] It has also been suggested that improved education and learning experience may be achieved by providing emotion content signals. If the emotion of a student during education or a learning experience is determined and simultaneously provided as input signal to steer the educational program (E-learning or school class), the efficiency of the educational effort is increased. If the student experiences stress, the teacher may decide to introduce stress-relaxation exercises; if the student experiences happiness and a positive vibe or flow, the teacher may decide to increase the degree of difficulty. The emotion content signal may also be used to change the subject of the learning program or the degree of difficulty of the exercises.
[027] It has also been suggested that improved mission experience may be achieved by providing emotion content signals. If the emotion of a soldier or peace worker during operations and missions is determined and simultaneously provided as input signal to determine the deployment of a soldier or peace worker, the efficiency of the operation is increased. If the soldier experiences stress, the officer in charge may decide to redefine the soldier's deployment in the mission or operation.
[028] It has also been suggested that improved sports experience may be achieved by providing emotion content signals. If the emotion of a sportsman during exercise, training or a real game is determined and simultaneously provided as input signal to steer the sports achievement, the efficiency of the sports achievement is increased. If the sportsman experiences stress, the coach may decide to introduce stress-relaxation exercises; if the sportsman experiences happiness and a positive vibe or flow, the coach may decide to increase the degree of difficulty. The emotion content signal may also be used to change the subject of the sports program or the degree of difficulty of the exercises.
[029] It has also been suggested that artificial intelligence of a robotic apparatus may be achieved by providing emotion content signals to it. The robot or autonomous robotic apparatus can be provided with emotion content control signals to make the robotic apparatus for instance autonomous, interactive with the environment, responsive to emotional situations, sensitive to environmental influences, etc.
[030] United States patent US 9,256,825 B2 discloses an emotion script generating method, which is based on receiving means, generating means, adjusting means and providing means.
[031] However, the system of US 9,256,825 B2 tends to have a number of associated disadvantages, including the following. First, the system does not provide means of combining emotion content signals of the same human or animal subject to create emotion content control signals. The system also does not provide means of combining emotion content signals of different human or animal subjects to create emotion content control signals. Furthermore, the system does not make use of smart correlation algorithms to extract the emotion content from the combined emotion content signals to generate the emotion content control signal. Also, the emotion signals are not labelled and coupled to an event or reference. And last but not least, the system does not provide means of storing emotions from a remote or remembered person.
SUMMARY
[032] Hence, an emotion content control system for converting emotion content signals from a human or animal subject into emotion content control signals for feedback interaction, said system comprising at least a first apparatus arranged for sensing from at least a first human or animal subject first subject emotion content signals of at least a first and a second type, processing and converting said first and second type subject emotion content signals into a first emotion content control signal and transmitting said converted first emotion content control signal; and a subject interaction device arranged in receiving said first emotion content control signal being transmitted by said first apparatus and in performing feedback interaction with said first human or animal subject based on said first emotion content control signal being received, is advantageous.
[033] Accordingly, the invention preferably seeks to mitigate, alleviate or eliminate one or more of the above mentioned disadvantages singly or in any combination.
[034] According to a first aspect of the invention, the emotion content of the emotion content signal is determined by correlation functions (auto- and cross-correlation) based on parametric representations and frequency analysis of one-dimensional (voice, heartbeat, blood pressure), two-dimensional (2D sound and speech, facial expression data) and three-dimensional (neurotransmitter data, body temperature) emotion content signals. These data reduction schemes derive in-depth emotion signals from the temporal and geometrical physiological measurements, biometrical measurements and body observations, such as the standard deviation, the skewness and the kurtosis of the signal and data. In particular, the higher-order data reduction reveals information about repeating data patterns. In addition, the cross-correlation between the emotion content signals determines more accurate emotion signal content. The invention provides means for processing emotion content signals of at least a first and second type into said emotion content control signal using cross-correlations based on parametric representations and frequency analysis of these emotion content signals of at least the first and second type. An example is the heartbeat variability, and the correlation between heartbeat level and skin conductance, and between heartbeat variation and skin conductance. Heartbeat variation might be experienced due to several events, like stress, physical activity such as sports, or excitement. The correlation with the second type of data, skin conductance, gives a better estimation of the emotional state of the subject. Another example is the facial expression, and the correlation between facial expression and the heartbeat level and heartbeat variation. Another example is the vocal signal, and the correlation between sound information and the heartbeat level and heartbeat variations. Another example is the concentration of the hormones dopamine and serotonin, and the correlation between the hormone concentration and facial expressions. An important aspect of this analysis is the subtraction of the temporal reference content information, which is derived from the user data via self-learning algorithms or from meta-data. The meta-data can be retrieved from the emotion content signal or from stored emotion meta-data.
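As a rough, non-authoritative illustration of this data-reduction step, the sketch below computes the standard deviation, skewness and kurtosis of a synthetic heartbeat trace and its cross-correlation with a synthetic skin-conductance trace. It assumes NumPy and SciPy are available; the signal shapes and sampling rate are invented:

```python
# Illustrative only: synthetic signals stand in for real sensor data.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)                          # 60 s at 10 Hz
heartbeat = 70 + 10 * np.sin(0.2 * t) + rng.normal(0, 2, t.size)
skin_cond = 4 + 0.5 * np.sin(0.2 * t - 0.8) + rng.normal(0, 0.1, t.size)

# Parametric representation: spread and higher-order moments of each signal.
print("std:", heartbeat.std(), "skew:", skew(heartbeat), "kurtosis:", kurtosis(heartbeat))

# Cross-correlation between the two signal types; the lag of the peak
# indicates how skin conductance follows heartbeat changes.
hb = heartbeat - heartbeat.mean()
sc = skin_cond - skin_cond.mean()
xcorr = np.correlate(hb, sc, mode="full") / (hb.std() * sc.std() * t.size)
peak_lag = (np.argmax(xcorr) - (t.size - 1)) / 10.0  # in seconds
print(f"peak cross-correlation {xcorr.max():.2f} at lag {peak_lag:.1f} s")
```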
[035] A second aspect of the invention is that emotions from current or past times are stored as emotion content signals on a personal emotion box. These emotions are characterized via tags or labels, such as type of event, time stamp, and type of emotion. These emotions are also characterized by meta-data descriptions. For example, an event of happiness or celebration can be characterized by specific emotion content signals and thereby labeled as such. In case these emotion content signals are captured, emotion content signals related to these specific events can be retrieved. Another example is a moment of sadness, which relates to specific characterizations of speech, facial expression, heartbeat, skin conductance and specific neurotransmitter concentrations. The measured parameters of such an event can be labeled and stored as a sadness moment, related to a specific human or animal subject.
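A minimal sketch of how such tagged emotion moments could be recorded, assuming a simple record format; the field names are illustrative inventions, not taken from the patent:

```python
# Hypothetical record format for the personal emotion box of paragraph [035].
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EmotionRecord:
    subject_id: str          # which human or animal subject
    timestamp: datetime      # time stamp label
    event_type: str          # e.g. "celebration", "sadness"
    emotion_tags: list       # labels such as ["happiness", "high arousal"]
    signals: dict            # raw or pre-processed emotion content signals

record = EmotionRecord(
    subject_id="subject-1",
    timestamp=datetime(2017, 12, 25, 18, 30),
    event_type="celebration",
    emotion_tags=["happiness"],
    signals={"heartbeat_bpm": [82, 85, 88], "skin_conductance_us": [5.1, 5.4]},
)

# Retrieval by label, as described: find all stored celebration moments.
matches = [r for r in [record] if r.event_type == "celebration"]
print(matches[0].emotion_tags)
```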
[036] A third aspect of the invention is the augmentation of user interaction by using the generated emotion control signals for feedback interaction with a subject interaction device. The control signals are generated from emotion content signals stemming from one or more human or animal subjects. An example is a care game for people with a mental disability, such as elderly with dementia. In this game, the human subject is exposed to video content. Depending on the mental status of the human subject, and the response of the human subject to the exposed video images, the content of the video images is changed to create augmented interaction. If the human subject is happy, and experiences excitement by watching the video content, the system can use the emotion signals of the human subject expressing happiness to generate emotion control signals to influence the video content, for instance by changing the scene, the persons/personages in the video content, the colors, the dynamics of the activities inside the video content, etc. In this case, the happiness state of the human subject can be maintained or increased. Another example is a care game based on light. Depending on the emotional status of the human subject, the light color or light intensity in the room is changed to interact with the human subject. For instance, if the human subject experiences high stress levels, the emotion signals of the human subject are combined to generate the emotion control signal to change the color and intensity of the light in the room such that the stress level of the human subject is lowered.
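The light-based feedback loop of this example can be sketched as a plain mapping from an estimated stress level to lighting parameters; the thresholds and color temperatures below are invented for illustration:

```python
# Hypothetical mapping from a stress estimate to room lighting; values illustrative.
def light_settings(stress_level: float) -> dict:
    """Map a normalized stress level (0..1) to color temperature and brightness."""
    if stress_level > 0.7:
        # High stress: warm, dim light intended to lower arousal.
        return {"color_temp_k": 2200, "brightness_pct": 30}
    if stress_level > 0.4:
        return {"color_temp_k": 3000, "brightness_pct": 55}
    # Calm subject: neutral, daylight-like setting.
    return {"color_temp_k": 4500, "brightness_pct": 80}

print(light_settings(0.85))  # -> {'color_temp_k': 2200, 'brightness_pct': 30}
```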
[037] A fourth aspect relates to operating at least one auxiliary device based on the feedback interaction being performed by said subject interaction device. Such an auxiliary device can comprise a sound emitting module, a light emitting module, a display screen, a tactile actuator, a vibration actuator, or a heat emitting actuator. An example is the interaction between two human subjects communicating remotely via telephone and video connection. The emotion signals of both human subjects are combined to generate emotion control signals that for instance operate heat and vibration generating devices. The level of vibration and the temperature of the device are sensed by the human subjects and reflect the level of love experienced by the human subjects.
[038] The emotion content signal may specifically comprise meta-data which indicates the current reference status. For example, the meta-data may indicate that the emotion status of a person follows a certain trajectory, for instance a periodic change in happiness during the day (low in the morning, higher in the afternoon, low in the evening) or during the year (low in winter, high in spring and summer), or a change in emotion status during holidays. For instance, the time-averaged reference level of happiness of a human subject is 10 during summer, autumn and spring, and 5 during winter. Measured happiness levels during the year are then relative to these reference levels. For instance, a measured level of happiness of 10 in winter is indicative of a happy person, whereas a measured level of happiness of 10 in summer is indicative of a normal day. The meta-data may also directly indicate reference characteristics or objects; the data may for example indicate specific events like Xmas or periods in life. For instance, the happiness level of a human subject is on average 5 during winter, except at Xmas, where a happiness level of 10 is more common for the human subject.
[039] The analysis of the emotion content signal may allow for a fully automated extraction of the reference content information without requiring any additional information to be included. For example, possible seasonal variation in the emotion content signal may become apparent from analysis of the time-dependent emotion content signal.
[040] The reference content information is adapted to the current and temporal emotion content signal via a self-learning algorithm. This self-learning ensures up-to-date reference content information. For example, the self-learning algorithm identifies repeating periods of low happiness from the time-dependent emotion content signals. This information can be used to predict the reference level in time.
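One plausible stand-in for the self-learning algorithm, which the patent leaves unspecified, is an exponential moving average that tracks the reference level over time; the sketch below, including the smoothing factor, is an assumption:

```python
# Illustrative self-updating reference level; the smoothing factor is an assumption.
class ReferenceTracker:
    def __init__(self, alpha: float = 0.05):
        self.alpha = alpha       # how fast the reference adapts
        self.reference = None    # current reference content information

    def update(self, measured: float) -> float:
        """Fold a new measurement into the reference and return the deviation."""
        if self.reference is None:
            self.reference = measured
        else:
            self.reference += self.alpha * (measured - self.reference)
        return measured - self.reference  # emotion relative to the learned baseline

tracker = ReferenceTracker()
for happiness in [5, 5, 6, 5, 9]:        # a sudden high reading stands out
    deviation = tracker.update(happiness)
print(f"reference {tracker.reference:.2f}, last deviation {deviation:+.2f}")
```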
[041] According to a different feature of the invention, the means adapted to extract is operable to extract real time reference content information from the emotion content signal, and the means adapted to generate the emotion control data is operable to generate an emotion control signal in response to the real time reference content information.
[042] According to a different feature of the invention, the emotion content signal may be used to control a subject interaction device, such as a lighting device, a computer device, a display apparatus, an audio apparatus, or a sense device (e.g. a massage chair), to interact with the human or animal subject. The emotion content control signal of a human subject can be used to augment the interaction with the subject interaction device. Also, user interface information may be provided to the subject interaction device for increased interaction. For example, by using the emotion content control signal of a human subject, the color and level of light intensity (brightness) of a light source can be changed to change the emotional status of a sad person into a status of happiness, by mimicking for instance a rising sun. A similar effect may occur in case of controlling the content of a display apparatus, audio apparatus, sense device, or lighting device.
[043] According to a different feature of the invention, the emotion content signal may be used to generate an emotion content control signal to control an interactive game. The emotion control signal is for instance used to pre-set or change the level of the game, the speed, the intensity, the difficulty degree or level, etc. It may be a gradual or a sudden change. In case of a sad person, the game may be programmed by the emotion control signal to let the person experience a winning feeling, via easier assignments or faster character building. In case of a happy person, the game may be programmed by the emotion control signal to give the person more challenges, via difficulty levels, fewer bonus points, or slower character building.
[044] According to a different feature of the invention, the emotion content signal may be used to generate an emotion content control signal to control a learning or training device. The emotions invoked by using the learning device (E-learning, training, etc.) are detected, processed and used via an emotion content control signal to adapt the learning or training device accordingly. The device can be programmed to flatten emotion levels, strengthen emotion levels, etc.
[045] According to a different feature of the invention, the emotion content signal may be used to generate an emotion content control signal to control an actuator to enhance body or physiological parameters. Heartbeat acceleration mimics excitement; an imposed reduction of heartbeat might lead to relaxation and release of stress. Also, the emotion content control signal may be used to expose a human subject to heat or vibration.
[046] These and other aspects, features and advantages of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
[047] An embodiment of the invention will be described, by way of example only, with reference to the drawings, in which:
FIG. 1 illustrates an emotion content signal storage apparatus and ways in accordance with an embodiment of the invention to generate an emotion content control signal;
FIG. 2 illustrates the working principle of data analysis of emotion content signals, thereby providing an in-depth analysis of the emotion content;
FIG. 3 illustrates the working principle of controlling a subject interaction device adapted to use the emotion content control signal, thereby providing an enhanced or interactive function of said apparatus;
FIG. 4 illustrates the working principle of controlling a subject interaction device adapted to receive user interface data and to use the emotion content control signal, thereby providing an enhanced or interactive function of said apparatus;
FIG. 5 illustrates the working principle of controlling a subject interaction device adapted to use the emotion content control signal, thereby providing an enhanced or interactive function of said apparatus for a second user;
FIG. 6 illustrates the working principle of controlling a subject interaction device adapted to use the emotion content control signal, thereby providing an enhanced or interactive function of said apparatus for a second user;
FIG. 7 illustrates the working principle of controlling a subject interaction device adapted to use the emotion content control signals of more than one user, thereby providing an enhanced or interactive function of said apparatus to more than one user; and
FIG. 8 illustrates the working principle of controlling a subject interaction device adapted to use the emotion content control signals of more than one user, thereby providing an enhanced or interactive function of said apparatus to more than one user.
DETAILED DESCRIPTION OF THE DRAWINGS
[048] The following description focuses on an embodiment of the invention applicable to an emotion content system particularly suited for a professional consumer environment but it will be appreciated that the invention is not limited to this application. For brevity, the term content signal has in the description been used to include both single signal sequences and multiple signal sequences.
[049] FIG. 1 illustrates an emotion content storage apparatus (100) in accordance with an embodiment of the invention.
[050] The emotion content storage apparatus (100) comprises a receiver (102), a processor (103) coupled to the receiver (102), a storage device (104) coupled to the processor (103), and a transmitter (105) coupled to the processor (103). The processor (103) is operable to process multiple emotion content signals and to generate the emotion control signal (106). In the preferred embodiment, the receiver (102), the processor (103) and the transmitter (105) are embedded in conventional smart watches or other wearable devices.
[051] The emotion content storage apparatus (100) comprises a receiver (102) which receives the emotion content signal (101) from an external source (the human or animal subject). The receiver (102) comprises all necessary functionality required for receiving the emotion content signal and for extracting or converting it into a suitable format. For example, for a heartbeat sensor signal the receiver (102) comprises all required functionality for amplifying, filtering, demodulating and decoding the received signal to generate a base band emotion content signal.
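The chain formed by the receiver (102), processor (103), storage device (104) and transmitter (105) can be pictured as follows. This is a hypothetical rendering whose class names merely mirror the reference numbers; the averaging in the processor is a placeholder for the correlation analysis described below:

```python
# Hypothetical composition of apparatus (100); class names mirror the reference numbers.
class Receiver:                        # (102)
    def receive(self, raw: dict) -> dict:
        # Amplification/filtering/demodulation is abstracted to a pass-through here.
        return raw

class Processor:                       # (103)
    def to_control_signal(self, signals: dict) -> float:
        # Stand-in for the correlation analysis of paragraph [052].
        return sum(signals.values()) / len(signals)

class StorageDevice:                   # (104)
    def __init__(self):
        self.records = []
    def store(self, item):
        self.records.append(item)

class Transmitter:                     # (105)
    def transmit(self, control_signal: float) -> float:
        return control_signal          # (106) emotion content control signal

class EmotionContentStorageApparatus:  # (100)
    def __init__(self):
        self.receiver, self.processor = Receiver(), Processor()
        self.storage, self.transmitter = StorageDevice(), Transmitter()

    def handle(self, raw: dict) -> float:
        signals = self.receiver.receive(raw)       # (101)
        self.storage.store(signals)
        return self.transmitter.transmit(self.processor.to_control_signal(signals))

apparatus = EmotionContentStorageApparatus()
print(apparatus.handle({"heartbeat": 0.7, "skin_conductance": 0.5}))
```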
[052] The emotion content signal (101) consists of more than one emotion content signal (101-1), (101-2), ... (101-n), at least of a first and a second type, being body signals, physiological signals, vocal signals, facial expression signals, or pre-processed emotion content signals. The emotion content signals (101-1), (101-2), ... (101-n) typically come from the human or animal subject. Auto-correlation analysis determines the variability, the skewness and the kurtosis of the individual signals embedded in the emotion content signal (101); cross-correlation between the signals (101-1), (101-2), ... (101-n) determines more accurate emotion signal content. An example is the heartbeat variability, and the correlation between heartbeat level and skin conductance, and between heartbeat variation and skin conductance. Another example is the facial expression, and the correlation between facial expression and the heartbeat level and heartbeat variation. Another example is the vocal signal, and the correlation between sound information and the heartbeat level and heartbeat variations. Another example is the concentration of the hormones dopamine and serotonin, and the correlation between the hormone concentration and facial expressions.
[053] The processor (103) receives the processed emotion content signal (101). The processor (103) determines the emotional content via correlations (auto-correlation for the individual signals and cross-correlations between the signals) and smart self-learning algorithms. It extracts the reference content information from the emotion content signal via pattern recognition of the emotion content signal or via meta-data embedded in the emotion content signal. The meta-data can also be retrieved from the storage device (104). The meta-data comprises information which is indicative of the emotion history of the user. In the preferred embodiment, the reference content information relates to signal sections which relate to events that have characteristic emotion profiles. For example, the meta-data file can contain heartbeat signals that relate for instance to watching a scary movie, or to a specific season, like winter or summer.
[054] In the preferred embodiment, the meta-data relates to long-term emotion characteristics of the emotion content signal. For example, meta-data may indicate that the person concerned is currently in a positive emotional status, experiences happiness, and is in a positive vibe or flow. Alternatively, the meta-data may indicate that the person concerned is currently in a transitional phase, for instance caused by the seasonal changes (from summer to autumn), which occurs every year. This meta-data is stored on the storage device (104).
[055] The generated emotion content control signal (106) is transmitted via the transmitter (105). The emotion content control signal (106) is fed into a subject interaction device (200) to augment the interaction with the human or animal subject.
[056] An example of a data analysis of emotion content signals (101-1), (101-2), ... (101-n) is given in FIG. 2. Shown are two emotion content signals, (101-1) and (101-2), and a schematic representation of the cross-correlation between these two signals. The upper signal (101-1) is the time course of the heartbeat signal of a user during physical and emotional exercise. The second signal (101-2) is a measure of the physical exercise of the person. The third sketch (110) is a graphical representation of the cross-correlation between the heartbeat (101-1) and the physical exercise (101-2), indicating periods of physical stress and mental stress.
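The FIG. 2 principle, relating heartbeat to physical activity in order to separate physical from mental stress, can be sketched as follows on synthetic data; the thresholds of the decision rule are illustrative assumptions:

```python
# Illustrative separation of physical vs. mental stress, after the FIG. 2 principle.
import numpy as np

rng = np.random.default_rng(1)
heartbeat = np.concatenate([rng.normal(70, 3, 50),     # rest
                            rng.normal(120, 5, 50),    # exercise
                            rng.normal(100, 4, 50)])   # elevated HR, no exercise
activity = np.concatenate([rng.normal(0.1, 0.05, 50),
                           rng.normal(0.9, 0.05, 50),
                           rng.normal(0.1, 0.05, 50)])

for name, sl in [("rest", slice(0, 50)), ("exercise", slice(50, 100)),
                 ("no-exercise", slice(100, 150))]:
    hb, act = heartbeat[sl].mean(), activity[sl].mean()
    # High heartbeat explained by activity -> physical stress;
    # high heartbeat without activity -> candidate mental stress.
    if hb > 90 and act > 0.5:
        label = "physical stress"
    elif hb > 90:
        label = "mental stress"
    else:
        label = "calm"
    print(f"{name}: mean HR {hb:.0f}, activity {act:.2f} -> {label}")
```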
[057] To further illustrate the preferred embodiment, FIG. 3 illustrates an arrangement comprising an emotion signal storage apparatus (100) and a subject interaction device (200) for enhanced interaction with said subject interaction device.
[058] The emotion content control signal (106) is fed into the subject interaction device (200). The subject interaction device comprises a receiver (202), a core function (203), a transmitter (204) and creates an output signal (206) used to influence the emotion of the respective user from which the emotion content signal (101) originated.
[059] The core function (203) is the essential function of the subject interaction device (200) and may comprise a storage device with recorded images, recorded video or audio content, text documents, E-book content, cartoons, game content, lighting content, meta-data, reference data, position data, sense data. It may also comprise a processor to process the received or recorded content, in relation to the emotion content control signal (106). It may also comprise a data content generating function, like the generation of game features, interactive video games, data scripts, interactive drawings and cartoons, etc. It may also use stored reference data or meta-data to create the output signal (206).
[060] In a possible embodiment, the subject interaction device (200) comprises a data projector which is operable to display the emotion content signals (101) or the emotion content control signals (106) via the output signal (206). The data projector can be used to display the emotional status of a user via the output signal (206). The output signal (206) can also serve as input to control an auxiliary apparatus.
[061] In another possible embodiment, the subject interaction device (200) comprises a display projector which is operable to display visual images. The emotion content control signal can be used to augment the emotion of a person by changing the visual content signal (206) of the display apparatus (200).
[062] In another possible embodiment, the subject interaction device (200) comprises an audio apparatus which is operable to reproduce vocal and sound content. The emotion content control signal (106) can be used to augment the emotion of a person by changing the audio content signal (206) of the audio apparatus (200).
[063] In another possible embodiment, the subject interaction device (200) comprises a lighting apparatus which is operable to emit light. The emotion content control signal (106) can be used to augment the emotion of a person by changing the light content signal (206) of the lighting apparatus (200).
[064] In another possible embodiment, the subject interaction device (200) comprises an electronic nose apparatus which is operable on data content. The emotion content control signal (106) can be used to operate the artificial electronic nose apparatus, by steering the sensing of specific gases or molecules, by controlling the smell content signal (206) of the electronic nose apparatus (200).
[065] In another possible embodiment, the subject interaction device (200) comprises a robotic apparatus which is operable on data content. The emotion content control signal (106) can be used to operate the robotic apparatus: to steer the movement of the robotic apparatus, to steer actuators, to control intelligent operation of the robot, to create vocal signals or sound (for instance talking), to control an electronic nose for molecule or gas detection, or to control touch, vision, temperature and radiation sensors.
[066] In another possible embodiment, the subject interaction device (200) comprises a micro-electromechanical system apparatus which is operable on data content. The emotion content control signal (106) can be used to operate the micro-electromechanical system apparatus, to steer actuators, to control a microprocessor, or to control sensors, by controlling the output signal (206) of the micro-electromechanical system apparatus (200). In a possible embodiment, the actuating system can be placed inside the body (internal) or attached to the body (external) to control the heartbeat of a person or to control the dosing of a hormone or neurotransmitter.
[067] In another possible embodiment, the subject interaction device (200) comprises a global positioning system apparatus which is operable on data content. The emotion content control signal (106) can be used to operate the global positioning system device, to steer direction and motion via a data signal (206) of the global positioning system apparatus (200).
[068] FIG. 4 illustrates an arrangement comprising an emotion signal storage apparatus (100) and a subject interaction device (300) for increased interaction with the subject interaction device.
[069] The emotion content control signal (106) is fed into a subject interaction device (300). In addition, a user interface content signal (301) is fed into the subject interaction device. The subject interaction device comprises a receiver (302), a core function (303) and a transmitter (304), and creates an output signal (306), which is used for increased interaction with the human or animal subject from which the emotion content signal (101) originated.
[070] The core function (303) is the essential function of the subject interaction device (300) and may comprise a storage device with recorded images, recorded video or audio content, text documents, E-book content, cartoons, game content, lighting content, meta-data, reference data, position data, sense data. It may also comprise a processor to process the received or recorded content, in relation to the emotion content control signal (106). It may also comprise a data content generating function, like the generation of game features, interactive video games, data scripts, interactive drawings and cartoons, etc. It may also use the user interface signal (301) in addition to the emotion content control signal (106) to create the output signal (306). It may also use stored reference data or meta-data to create the output signal (306).
[071] In a possible embodiment, the subject interaction device (300) comprises a gaming apparatus. The emotion content control signal (106) can be used to influence the emotion of a person by changing the game content signal (306) of the gaming apparatus (300). The gaming apparatus also receives a user interface signal (301), which is used to operate the gaming apparatus.
[072] In a possible embodiment, the subject interaction device (300) comprises an advertisement apparatus. The emotion content control signal (106) can be used to influence the emotion of a person by changing the advertisement content signal (306) of the advertisement apparatus (300). The advertisement apparatus also receives a user interface signal (301), which is used to operate the advertisement apparatus.
[073] FIG. 5 illustrates an arrangement comprising an emotion signal storage apparatus (100) and a subject interaction device (300) to increase the interaction of a second human or animal subject with the subject interaction device.
[074] The emotion content control signal (106) is fed into the subject interaction device (300). In addition, a user interface content signal (301) is fed into the subject interaction device. The subject interaction device comprises a receiver (302), a core function (303) and a transmitter (304), and creates an output signal (306), which is used to influence the interaction of the second human or animal subject with the subject interaction device.
[075] FIG. 6 illustrates an arrangement comprising an emotion signal storage apparatus (100) and a subject interaction device (300) to increase the interaction of a second human or animal subject with the subject interaction device.
[076] The emotion content control signal (106) is fed into the subject interaction device (300). In addition, a user interface content signal (301) from a different user is fed into the second apparatus. The subject interaction device comprises a receiver (302), a core function (303) and a transmitter (304), and creates an output signal (306), which is used to influence the interaction of the second human or animal subject with the subject interaction device.
[077] FIG. 7 illustrates an arrangement comprising two emotion signal storage apparatuses and a subject interaction device (300) to influence the interaction of at least two human or animal subjects with the subject interaction device. One emotion signal storage apparatus (100) corresponds to a first human or animal subject; one emotion signal storage apparatus (100') corresponds to a further human or animal subject. The emotion content control signal (106) of the first emotion signal storage apparatus (100) and the emotion content control signal (106') of the further emotion content storage apparatus (100') are fed into the subject interaction device (300). In addition, a user interface content signal (301) from the first human or animal subject and a user interface content signal (301') from the further human or animal subject are fed into the subject interaction device (300). The subject interaction device comprises a receiver (302), a core function (303) and a transmitter (304), and creates an output signal (306), which is used to influence the interaction of at least two human or animal subjects with the subject interaction device.
[078] FIG. 8 illustrates an arrangement comprising one emotion signal storage apparatus (100) and a subject interaction device (300) to influence the interaction of at least two human or animal subjects with the subject interaction device. The emotion signal storage apparatus (100) corresponds to the first and further human and/or animal subjects. The emotion content control signal (106) of the emotion signal storage apparatus (100) is fed into the subject interaction device (300). In addition, a user interface content signal (301) from the first human or animal subject and a user interface content signal (301') from a further human or animal subject are fed into the subject interaction device (300). The subject interaction device comprises a receiver (302), a core function (303) and a transmitter (304), and creates an output signal (306), which is used to influence the interaction of at least two human or animal subjects with the subject interaction device.
[079] In a preferred embodiment, the emotion content signals (101) of the first human or animal subject and the emotion content signals (101') of the further human or animal subject are combined to generate the emotion content control signal (106).
[080] The emotion content control signals (106) may be derived from the different signals within the emotion content signal (101), for example by a suitable repetition, selectivity or emotion content. Alternatively or additionally, existing pre-stored emotion content control signals may be used. For example, the emotion content storage apparatus may comprise a large number of pre-stored emotion content signals corresponding to different possible events and reference information characteristics.
[081] In one embodiment, meta-data may thus be extracted from the emotion content signal and used to select a suitable pre-stored emotion content signal. This signal may have characteristics amended to correspond to e.g. the victory of a soccer game.
[082] Thus in some embodiments, the determination of content may be used to determine estimates of the reference content information for a given emotion content signal. For example, if it is determined that the emotion content signal relates to a football match, an emotion content control signal comprising e.g. the scoring of a goal or the sensation of a victory may be generated.
[083] In the preferred embodiment, the emotion content storage apparatus is operable to modify the processing parameters and algorithms used for extracting the reference content information and for generating the emotion content control signal depending on the category (e.g. genre) of the emotion content signal. For example, the content that may be enhanced by emotion content signals includes data related to soccer games or concerts.
[084] The emotion content storage apparatus may comprise an input for changing the dynamics of the emotion content control signal. The emotion content storage apparatus may also comprise a switch to choose the emotion content control signal being based on meta-data or signal processing/analysis. The emotion content storage apparatus may also comprise means for determining a user profile reflecting preferred emotion content and dynamics of the emotion content control signals (e.g. rate of switching, emotion preference, etc.). For example, if a user has experienced certain emotional events four times, a different emotion content control signal may be desirable at the fifth event. In that case, the user profile may be stored and used to control the settings of the emotion content control signal.
[085] The emotion content storage apparatus may provide the emotion content control signal selectively. For example, the emotion content control signal may be provided only when predefined events occur. As a specific example, an emotion content storage apparatus may be provided as a consumer emotion content storage apparatus which contains a number of features and control means, including for example the following (a hypothetical settings sketch follows the list):
- Control input for changing the intensity of the emotion experience.
- Control input for selecting an emotion genre.
- Control input for selecting a content item category.
- Control input for changing the dynamics of the emotion content control signal.
- Control input for controlling an emotion contrast.
- Control input or automatic means for selecting and/or storing a user profile.
- Means for entering a self-learning mode (e.g. measuring or determining characteristics of the operations of the emotion content storage apparatus, such as the number of emotions; for example, succeeding happiness events can be emphasized in time).
- Polarization control means: e.g. for controlling that enhancement occurs only for predefined events.
- A source selector for selecting source information for the emotion content control signal, such as e.g. which information from the emotion content signal to use (heartbeat, skin conductance, respiratory data, etc.).
- A purpose selector for selecting e.g. a purpose of the emotion experience, thereby allowing the emotion content control signal to be selected to most suitably achieve this purpose.
- A mood selector.
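Taken together, this feature list amounts to a settings object for the apparatus. A hypothetical rendering, with every field name invented for illustration:

```python
# Hypothetical consumer settings for the emotion content storage apparatus.
from dataclasses import dataclass

@dataclass
class EmotionBoxSettings:
    intensity: float = 0.5          # intensity of the emotion experience (0..1)
    emotion_genre: str = "happiness"
    content_category: str = "concert"
    dynamics_rate: float = 1.0      # rate of switching of the control signal
    contrast: float = 0.5           # emotion contrast
    user_profile: str = "default"
    self_learning: bool = True
    polarization_events: tuple = ("goal", "victory")  # enhance only these events
    source_signals: tuple = ("heartbeat", "skin_conductance")
    purpose: str = "relaxation"
    mood: str = "calm"

settings = EmotionBoxSettings(emotion_genre="excitement", purpose="stimulation")
print(settings)
```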
[086] The invention can be implemented in any suitable form including hardware, software, firmware or any combination of these. However, preferably, the invention is implemented at least partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of an embodiment of the invention may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the invention may be implemented in a single unit or may be physically and functionally distributed between different units and processors.
[087] Although the present invention has been described in connection with the preferred embodiment, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the accompanying claims. In the claims, the term comprising does not exclude the presence of other elements or steps. Furthermore, although individually listed, a plurality of means, elements or method steps may be implemented by e.g. a single unit or processor. Additionally, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. In addition, singular references do not exclude a plurality. Thus references to "a", "an", "first", "second" etc. do not preclude a plurality.
NUMBER LISTING
(100) = emotion content storage apparatus
(101) = emotion content signal, comprised of more than one emotion content signals
(101-1) = emotion content signal
(101-2) = emotion content signal
(101-n) = emotion content signal
(102) = receiver
(103) = processor
(104) = storage device
(105) = transmitter
(106) = emotion content control signal
(110) = graphical representation of the cross-correlation
(200) = subject interaction device
(201) =
(202) = receiver
(203) = core function, being a storage device, processors, data generator
(204) = transmitter
(206) = output signal
(300) = subject interaction device (second apparatus)
(301) = user interface content signal
(302) = receiver
(303) = core function, being a storage device, processors, data generator
(304) = transmitter
(306) = output signal

Claims (17)

1. An emotion control system for converting emotion signals from humans or animals into emotion control signals for feedback interaction, the system comprising at least:
- a first device arranged for sensing emotion signals of at least a first and a second type from a first human or animal, for processing and converting said emotion signals of the first and second type into a first emotion control signal, and for outputting said first emotion control signal;
- an interaction device arranged for receiving a first emotion control signal output by said first device and for performing the feedback interaction with a first human or animal on the basis of the received first emotion control signal,
characterized in that the first and/or further device comprises:
- receiving means for receiving said emotion signals of the at least first and second type;
- processing means for processing the emotion signals of the at least first and second type into said emotion control signal via cross-correlations based on parametric representations and frequency analysis of the emotion signals of the at least first and second type; and
- transfer means for outputting the emotion control signal.

2. System according to claim 1, further comprising:
- at least a second device arranged for sensing emotion signals of at least a first and a second type from at least a second human or animal, for processing and converting the further emotion signals of the first and second type into a second emotion control signal, and for transmitting the generated second emotion control signal to an interaction device arranged for performing feedback interaction with another human or animal on the basis of the content of the received second emotion control signal.

3. System according to claim 1, wherein:
- said first device is arranged for sensing emotion signals of at least a first and a second type from at least two or more humans and/or animals, for processing and converting the emotion signals of the first and second type from said first and said at least second human or animal into a combined emotion control signal, and for transmitting said generated combined emotion control signal to the interaction device for performing the feedback interaction with the first and the at least second human or animal on the basis of said generated and received combined emotion control signal.

4. System according to claim 1, wherein the first and/or further device comprises:
- storage means for storing the emotion signals of the at least first and second type, and the emotion control signal.

5. System according to claim 1 or 4, wherein:
- said processing means of the first and/or further device extracts reference information from the received emotion signals of the at least first and second type of a first or at least second human or animal; or
- said processing means of the first and/or further device extracts reference information from the emotion signals and emotion control signals stored in the storage means; and
- said reference information is generated by means of time and frequency analysis of said emotion signals of the at least first and second type; and
- said reference information is generated via a self-learning and/or pattern recognition algorithm; and
- said reference information is stored in the storage means.

6. System according to claim 5, wherein said processing means of the first and/or further devices generates the emotion control signal on the basis of a predefined and selected event.

7. System according to one or more of the preceding claims, wherein the interaction device is arranged for receiving feedback signals from the first and/or further human or animal.

8. System according to claim 7, wherein the system further comprises:
- at least one auxiliary device which is controlled by the feedback interaction of the interaction device.

9. System according to claim 8, wherein the auxiliary device comprises at least one of the following list:
- a sound-generating module;
- a light-emitting module;
- a display screen;
- a data-projecting module;
- a tactile actuator;
- a vibration actuator;
- a heat-radiating actuator.

10. System according to one or more of the preceding claims, wherein said types of emotion signals comprise at least two of the following signals:
- heart rate;
- blood pressure;
- facial expression;
- body surface temperature;
- body surface conductivity;
- neurotransmitter or hormone profile information;
- vocal speech characteristics;
- body gestures or posture.

11. Device for use in a system according to one or more of the preceding claims, comprising:
- receiving means for receiving said emotion signals of the at least first and second type;
- processing means for processing the emotion signals of the at least first and second type into said emotion control signal via cross-correlations based on parametric representations and frequency analysis of the emotion signals of the at least first and second type; and
- transfer means for outputting the emotion control signal.

12. User interaction system for use in a system according to one or more of the preceding claims.

13. Method for enhancing the interaction with a first and/or further human or animal, the method comprising the steps of:
A. exposing a first and/or further human or animal to an interactive environment;
B. receiving the emotion signals of a first and second type from said first and/or further human or animal, said emotion signals being generated by said first and further human or animal through exposure to said interactive environment;
C. processing the emotion signals of the at least first and second type into single and/or composite emotion control signals via cross-correlations based on parametric representations and frequency analysis of the emotion signals of the at least first and second type;
D. storing said emotion signals of the at least first and second type and said single and/or composite emotion control signals;
E. transferring said single and/or composite emotion control signals to an interaction device equipped with means for receiving the single and/or composite emotion control signals;
F. providing feedback interaction through said interaction device to a first and/or further human or animal exposed to the interactive environment.

14. Method according to claim 13, further comprising the step of repeating steps B-C-D-E-F.

15. Method according to claim 13, further comprising the steps of:
- storing the emotion signals of the at least first and second type and said emotion control signal;
- step F of providing feedback with the further steps of:
• F1 providing tactile feedback;
• F2 providing acoustic feedback;
• F3 providing visual feedback;
• F4 providing light-emitting feedback;
• F5 providing heat feedback;
• F6 providing mechanical feedback.

16. Computer-implemented method executed in the storage means of the first or further devices or in the storage means of the interaction device, consisting of executable code compatible with the control programs of said storage means, according to the method described in claim 13.

17. Computer program stored on a non-volatile record carrier, the computer program containing instruction codes which, when executed, carry out the computer-implemented method according to claim 16.
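Claims 1, 11 and 13 all turn on deriving an emotion control signal from two signal types via cross-correlations based on parametric representations and frequency analysis. Purely as an illustrative sketch of that kind of pipeline, and not the claimed implementation, the fragment below combines two sampled signals (say, heart rate and skin conductance) into a single control value; the function name, the 0.5 Hz cut-off and the normalisation are all hypothetical choices:

import numpy as np

def emotion_control_signal(sig_a, sig_b, fs):
    """Illustrative sketch only: combine two emotion signal types into a
    single control value via frequency analysis and cross-correlation."""
    # Parametric representation: zero-mean, unit-variance normalisation
    a = (sig_a - np.mean(sig_a)) / np.std(sig_a)
    b = (sig_b - np.mean(sig_b)) / np.std(sig_b)

    # Frequency analysis: keep only the band below 0.5 Hz (an assumed
    # cut-off), where slow autonomic arousal responses typically sit
    def lowpass(x):
        spectrum = np.fft.rfft(x)
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        spectrum[freqs > 0.5] = 0.0
        return np.fft.irfft(spectrum, n=len(x))

    a_lp, b_lp = lowpass(a), lowpass(b)

    # Cross-correlation of the filtered signals; the peak magnitude acts
    # as a simple combined emotion score clipped to [0, 1]
    xcorr = np.correlate(a_lp, b_lp, mode="full") / len(a_lp)
    return float(np.clip(np.max(np.abs(xcorr)), 0.0, 1.0))

A strong correlation between the two low-frequency signals would then be read as a pronounced combined emotional response, while uncorrelated signals yield a value near zero.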
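Similarly, the method of claim 13 with the repetition of claim 14 amounts to a sense-process-store-feedback loop. A minimal sketch, assuming hypothetical sense, process, store and interaction-device interfaces that are not part of the patent text:

import time

def run_feedback_loop(sense, process, store, interaction_device, cycles=10):
    """Hypothetical sketch of the claim-13 method, repeating steps
    B-C-D-E-F as in claim 14. All four callables are assumed interfaces."""
    for _ in range(cycles):
        # Step B: receive emotion signals of a first and a second type
        signal_a, signal_b = sense()
        # Step C: derive a single or composite emotion control signal
        control = process(signal_a, signal_b)
        # Step D: store the raw signals and the derived control signal
        store(signal_a, signal_b, control)
        # Steps E-F: transfer the control signal and provide feedback
        interaction_device.feedback(control)
        time.sleep(1.0)  # pacing between cycles (arbitrary choice)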
NL1042207A 2017-01-01 2017-01-01 An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof NL1042207B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NL1042207A NL1042207B1 (en) 2017-01-01 2017-01-01 An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NL1042207A NL1042207B1 (en) 2017-01-01 2017-01-01 An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof

Publications (2)

Publication Number Publication Date
NL1042207A NL1042207A (en) 2018-07-06
NL1042207B1 true NL1042207B1 (en) 2018-07-23

Family

ID=62816257

Family Applications (1)

Application Number Title Priority Date Filing Date
NL1042207A NL1042207B1 (en) 2017-01-01 2017-01-01 An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof

Country Status (1)

Country Link
NL (1) NL1042207B1 (en)

Also Published As

Publication number Publication date
NL1042207A (en) 2018-07-06


Legal Events

Date Code Title Description
MM Lapsed because of non-payment of the annual fee Effective date: 20200201