KR101753834B1 - A Method for Emotional classification using vibraimage technology - Google Patents


Info

Publication number
KR101753834B1
Authority
KR
South Korea
Prior art keywords
parameter
value
image
extracting
vibration
Prior art date
Application number
KR1020150049955A
Other languages
Korean (ko)
Other versions
KR20150117614A (en)
Inventor
최진관
황성택
Original Assignee
주식회사 바이브라시스템
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 바이브라시스템
Publication of KR20150117614A
Application granted
Publication of KR101753834B1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0037 Performing a preliminary scan, e.g. a prescan for identifying a region of interest
    • A61B5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/48 Other medical applications
    • A61B5/4884 Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Psychiatry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Artificial Intelligence (AREA)
  • Hospice & Palliative Care (AREA)
  • Physiology (AREA)
  • Developmental Disabilities (AREA)
  • Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Social Psychology (AREA)
  • Educational Technology (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Methods and apparatus for the reliable and accurate measurement of the psychophysiological parameters of a subject are described. The method includes acquiring a moving image of the subject's body, measuring vibration parameters of the subject from the moving image, generating a bio-signal image based on the vibration parameters, and generating the subject's psychophysiological response parameters from the bio-signal image.


Description

[0001] The present invention relates to a method for classifying emotions using vibration image technology.

The present invention relates to a method for acquiring a physiological signal from a moving image of a subject's living body and classifying the subject's emotional state using the acquired physiological signal.

There have been many studies aimed at obtaining psychophysiological information about the human body, and a number of specific contact methods, devices, and systems are known to date. They use well-known physiological parameters of the human body to assess human emotional and psychological states and to make medical diagnoses.

In conventional methods, analyzing the state of the human body generally takes several hours, requires sensors to be firmly attached to the subject's body, and requires the participation of skilled testing personnel. There are therefore many practical limitations to the wide use of such systems for psychophysiological diagnosis of the human body.

These systems provide sufficient information from the viewpoint of analyzing the state of the human body, but there are restrictions on the use of the sensor in the human body. Moreover, the integrated information obtained from the local site is not easy to analyze.

Another problem with contact methods is that, because contact sensors are used, it is technically impossible to conduct the test without the subject's knowledge. Subjects of contact psychophysiological tests are always aware of this fact, which adds to the difficulty of analyzing the test results, because those who are trying to conceal information feel stress and anxiety throughout the test.

US 7301465 B, JP 7-108848 A, JP 2010-186276 A, JP 4702100 B, JP 4458146 B, JP 2011-128966 A, JP 7-296299 A, JP 2002-370559 A, JP 2007-293587 A, JP 2007-257043 A, JP 4123077 B, WO 03-049967 A, JP 4743137 B, JP 4259585 B, JP 10-74595 A, JP 2002-46499 A, US 2011-37595 A

The present invention provides a method for classifying the emotional state of a subject by measuring the subject's psychophysiological parameters with high reliability on the basis of moving images.

The bio-signal acquisition method according to the present invention comprises:

Capturing a subject and acquiring a plurality of continuous image information;

Analyzing the image information to extract a vibration parameter of the subject;

Generating a psychophysiological parameter based on the vibration parameter;

Extracting a physiological signal of the subject from the parameter; and

Classifying the emotions of the subject from the physiological signal.

According to the present invention, the emotional state of a subject is classified using a camera that can reliably and accurately measure the psychophysiological parameters of the subject.

Figure 1 shows the two-dimensional emotion model of James Russell.
Figure 2 is a diagram explaining the emotion classification process according to the present invention.
Figure 3 is a schematic diagram of the emotion classification algorithm according to the present invention.
Figure 4 shows the living body energy (aura) radiated around the subject's image formed by the amplitude component of the vibration image.
Figure 5 shows the bio-energy emitted around the actual image of the human body.
Figures 6 and 7 show bio-signal image radiation according to the state of the subject: Figure 6 shows a stable state, and Figure 7 an unstable, stressed state.
Figure 8 is a distribution graph of the frequency component (bio-signal image) of a human vibration image in a stable state.
Figure 9 is a distribution graph of the frequency component (bio-signal image) of a human vibration image in a stressed state.
Figure 10 is a radial graph (chart) showing the emotional state of the subject according to the method of the present invention.

Hereinafter, an emotion classification method using a vibration image according to the present invention will be described with reference to the accompanying drawings.

Various theorists have held that all human beings have inherent basic emotions. Some research defines emotions along more than one dimension. In 1897, emotions were explained along three dimensions: "pleasure vs. displeasure", "arousal vs. calm", and "tension vs. relaxation". In 1980, emotions were categorized in a two-dimensional circular space in a model developed by James Russell. As shown in FIG. 1, in Russell's model emotional terms are placed in a two-dimensional circular space with arousal and valence dimensions: arousal appears on the vertical axis and valence (pleasant/unpleasant) on the horizontal axis.

The vibration image represents head movement activity characterized by the information of the vestibulo-emotional reflex (VER). The vibration image system is an image-processing system that detects human emotions, mental states, and deception. Vibration image technology derives variables such as 1. aggression, 2. stress, 3. tension/anxiety, 4. suspicion, 5. balance, 6. charm, 7. energy, 8. self-confidence, 9. inhibition, and 10. neuroticism. The vibration image transformation provides computed head-movement variables related to the functional state of the body, measured in real time. Each variable is expressed through an amplitude and a frequency component, calculated from the following equations.

A(x, y) = (1/N) · Σ_{i=1..N} |U_i(x, y) − U_{i+1}(x, y)|     (1)

where A is the amplitude, N is the number of frames, U_i(x, y) is the signal amplitude at point (x, y) in the i-th frame, and U_{i+1}(x, y) is the signal amplitude at point (x, y) in the (i+1)-th frame. The frequency component is calculated from equation (2).

Figure 112015034526730-pat00004
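Equation (1) above can be sketched directly in NumPy. This is a minimal sketch: the function name and the (frames, height, width) array layout are our own choices, not part of the patent.

```python
import numpy as np

def amplitude_component(frames):
    """Amplitude component of the vibration image per equation (1):
    A(x, y) = (1/N) * sum_i |U_i(x, y) - U_{i+1}(x, y)|,
    where U_i(x, y) is the pixel intensity at (x, y) in frame i."""
    frames = np.asarray(frames, dtype=float)   # shape (N+1, H, W)
    n = frames.shape[0] - 1                    # N inter-frame differences
    return np.abs(np.diff(frames, axis=0)).sum(axis=0) / n
```

For a static scene the amplitude map is zero everywhere; only points that move between frames contribute.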

In the present invention, a signal component corresponding to the EEG signal is extracted from these parameters, and the physiological and psychological state of the subject is evaluated through the extracted signal component.

Before describing the emotion classification method of the present invention, the relation between the vibration of body parts and psychophysiological parameters will be described, together with the vibration parameters used in the present invention.

In particle physics there is no clear boundary between the wave and particle properties of matter, and photon energy (ε) is known to be linked to the photon's frequency (ν) through Planck's constant (ε = hν). Our hypothesis is that the energy emitted from each part of an organism is proportional to the vibration frequency of that part in space. Consequently, in order to record the energy coming from an organism, it is necessary to record the vibrations occurring in its various parts (in space or between sites). This is possible using a non-contact television system with sufficient resolution and fast processing power. Furthermore, the frequency component of the bio-signal image (that is, the frequency of the vibration, i.e., position change, occurring at each site) carries the most information about the bioenergy or psychophysiological characteristics of the observed organism. The analysis of the obtained bio-signal image can be performed mathematically by a program processing the digital bio-signal image and/or its components. For creating and analyzing mathematical processing algorithms, it is convenient to create a bio-signal image suited to visual analysis, such as a pseudocolor image on a monitor screen.

In other words, the frequency component of the obtained bio-signal image should make it possible to identify the psychophysiological state of the human body and the level of the emotional state continuously and clearly, and to distinguish changes in the body's state when various stimuli are applied. According to our findings, using an image showing the bio-energy field of the human body, represented as an aura around the body, makes it possible to evaluate the psychophysiological state more quickly and accurately than other methods.

The term aura here refers to an integrated representation of the human psychophysiological state. This aura appears around the human body and has a specific relationship with the body's energy components. Images of the human aura provide a great deal of information when studying the psychophysiological parameters of the human body, and the following factors were considered in this study. The human emotional state can literally change every second; a typical person cannot stay in one emotional state for a long time.

Every thought, action, or reaction to a situation leads to a momentary change in the emotional state (and in each bio-signal image). Therefore, it is important to find an optimal balance between the amount of information in the acquired bio-signal image (among other things, the resolution of the camera) and the processing speed of the system.

By mapping the maximum vibration frequency of the object and modulating the aura size with the average frequency or amplitude of the positional changes taking place in each body zone, changes in the subject's state can be recorded at a glance and instantly. Fractal fluctuation of the brain is known to play a key role in learning, memory, and various tasks. Experiments have shown that the most intensely vibrating part of the human body is the head; in most cases the aura (the frequency component of the vibration image) around the human head is much larger than the aura around the body, and may exist only around the head. Changes occurring in the human body are expressed as breaks in the aura or as asymmetries in its color and shape, which can be seen clearly in the obtained bio-signal image.

Tying the elements of the bio-signal image topologically to the elements of the actual image has a drawback. Experimental results show that the human emotional state, which carries the most information, is transmitted at the maximum vibration frequency, and that averaging with the mean frequency level or the background level of adjacent points can conceal the true changes occurring in the visual bio-signal image.

Therefore, the components of the bio-signal image are shown to be no more effective than the frequency components of the vibration image represented as an aura located near the actual image. When the elements of the bio-signal image are topologically tied to the elements of the real image, the elements with the maximum vibration frequency are not visible against the overall background when color-frequency adjustment is performed on the image. In order to analyze the bio-signal image mathematically in its various forms, it is necessary to visually control the obtained bio-signal image in advance. The proposed aura-type, frequency-component representation of the bio-signal image matches the physical concept of bio-energy emission and allows the produced image to be visually controlled and analyzed.

Unlike the frequency component, the amplitude component is more effective when used in a topological relationship. Above all, the quality of the bio-signal image obtained using the amplitude component, which is topologically tied to the vibration points, can be evaluated, and accurate parameters for adjusting the system can be determined.

First, the measurement of the vibration image parameters will be described in detail.

To obtain information on the aggression level of an organism, a frequency-distribution histogram is constructed by measuring the head vibration image parameters of the organism.

The aggression level (Ag) is calculated by equation (3).

Figure 112015034526730-pat00005

Fm - maximum frequency of the frequency-distribution density in the histogram

Fi - count of the i-th frequency of the frequency-distribution density acquired over N frame times in the histogram

Fin - vibration-image processing frequency

n - the number of inter-frame differences exceeding the threshold in N frames
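Equations (3) through (5) all operate on the frequency-distribution histogram described by these definitions. The following is a minimal sketch of how such a histogram could be built from the inter-frame differences; the change threshold, bin count, and processing frequency Fin = 25 Hz are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def vibration_frequency_map(frames, threshold=1.0, f_processing=25.0):
    """Per-point vibration frequency: the rate (bounded by the processing
    frequency Fin = f_processing) at which the inter-frame difference at
    a point exceeds `threshold` over the N processed frames."""
    frames = np.asarray(frames, dtype=float)
    changed = np.abs(np.diff(frames, axis=0)) > threshold
    return changed.mean(axis=0) * f_processing

def frequency_histogram(freq_map, bins=10, f_processing=25.0):
    """Frequency-distribution histogram (the counts Fi used by equations
    (3)-(5)): numbers of image points falling in each frequency band."""
    counts, edges = np.histogram(np.ravel(freq_map), bins=bins,
                                 range=(0.0, f_processing))
    return counts, edges
```

A point that changes on every frame lands in the highest band; static points land in the lowest.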

The head vibration image parameters are likewise measured to obtain information on the stress level of the organism. The stress level (St) is calculated by equation (4).

Figure 112015034526730-pat00006

A_i^L - total amplitude of the i-th column of the vibration image for the left part of the object

A_i^R - total amplitude of the i-th column of the vibration image for the right part of the object

max(A_i^L, A_i^R) - maximum of A_i^L and A_i^R

F_i^L - maximum frequency of the i-th column of the vibration image for the left part of the object

F_i^R - maximum frequency of the i-th column of the vibration image for the right part of the object

max(F_i^L, F_i^R) - maximum of F_i^L and F_i^R

n - the number of columns associated with the object
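Since the image of equation (4) is not reproduced, the following is only a hedged sketch of the stress measure suggested by the definitions above: the normalized left/right asymmetry of the per-column amplitude and maximum-frequency profiles, averaged over the columns. The equal weighting of the amplitude and frequency terms is our assumption.

```python
import numpy as np

def stress_level(amp_cols, freq_cols):
    """Hedged sketch of a stress measure St in [0, 1]: left/right
    asymmetry of the per-column total amplitude (A_i) and maximum
    frequency (F_i) profiles of the head vibration image."""
    def asymmetry(v):
        v = np.asarray(v, dtype=float)
        left = v[: len(v) // 2]           # columns of the left part
        right = v[::-1][: len(v) // 2]    # mirrored columns of the right
        m = np.maximum(np.abs(left), np.abs(right))
        m[m == 0] = 1.0                   # avoid division by zero
        return np.abs(left - right) / m   # per-column asymmetry in [0, 1]
    return float(np.mean((asymmetry(amp_cols) + asymmetry(freq_cols)) / 2))
```

A perfectly symmetric profile yields zero stress; maximal one-sided movement drives the measure toward one.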

The head vibration image parameters are measured to obtain information on the anxiety level of the organism. The anxiety level (Tn) is calculated by equation (5).

Figure 112015034526730-pat00017

Pi(f) - power spectrum of the vibration-image frequency distribution

fmax - maximum frequency of the vibration-image frequency spectrum
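The anxiety measure can be sketched from its description later in the text: high anxiety raises the high-frequency spectral density of the movement, so Tn can be taken as the share of the power spectrum Pi(f) lying in the upper part of the band up to fmax. Taking the split at half the band is our assumption, since the equation image is not reproduced.

```python
import numpy as np

def anxiety_level(vibration_signal):
    """Hedged sketch of the anxiety level Tn in [0, 1]: the share of
    the power spectrum P_i(f) of the vibration signal lying in the
    upper half of the band [0, fmax]."""
    x = np.asarray(vibration_signal, dtype=float)
    p = np.abs(np.fft.rfft(x - x.mean())) ** 2   # power spectrum P_i(f)
    total = p.sum()
    return float(p[len(p) // 2 :].sum() / total) if total else 0.0
```

Slow head movement concentrates power in low bins and yields Tn near zero; fast movement drives Tn toward one.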

To obtain information on the level of compatibility between one organism and another, a histogram of the vibration-frequency distribution is constructed for each individual, a common frequency distribution is obtained, the general law of distribution and the common distribution area are determined, and the difference between the general law of distribution and the frequency histogram is found. The compatibility level (C) is calculated as follows.

Figure 112015034526730-pat00018

K - generalized correlation coefficient of the acquired frequency histogram

Figure 112015034526730-pat00019

y' - general distribution density

Figure 112015034526730-pat00020
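A hedged sketch of the compatibility measure as described: combine the individual frequency histograms, fit the general (normal) distribution density y', and take the normalized correlation coefficient K between the histogram and the density. The moment-based fit and the clamping into [0, 1] are our assumptions.

```python
import numpy as np

def compatibility_level(hist_a, hist_b):
    """Hedged sketch of the compatibility level C in [0, 1]: correlation
    K between the combined vibration-frequency histogram of the two
    subjects and a normal density y' fitted to its mean and variance."""
    h = np.asarray(hist_a, dtype=float) + np.asarray(hist_b, dtype=float)
    h = h / h.sum()                              # combined histogram
    x = np.arange(len(h))
    mu = (x * h).sum()
    sigma = max(np.sqrt(((x - mu) ** 2 * h).sum()), 1e-9)
    y = np.exp(-0.5 * ((x - mu) / sigma) ** 2)   # general distribution y'
    k = np.corrcoef(h, y / y.sum())[0, 1]
    return float(max(0.0, k))                    # clamp into [0, 1]
```

Two bell-shaped histograms give a measure near one; a combined histogram far from the normal law gives a low measure.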

When determining verbal or nonverbal deception, the vibration image parameters of the head are measured to obtain information on the degree of integrated change in the psychophysiological state.

The integrated level of change (L) of the psychophysiological state used in deception detection is calculated by the following formula.

Figure 112015034526730-pat00021

Pi - the parameters that change beyond the set threshold

Pc - the vibration image parameters measured when determining the deception level

K - significance correlation coefficient of the measured Pi

n - the number of parameters measured

m - the number of parameters changed
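A hedged sketch of the integrated change level L from these definitions: count the m parameters whose change against a reference period exceeds a threshold, weight them by their significance coefficients K, and normalize over the n monitored parameters. The relative-change criterion and the 0.2 threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np

def deception_level(reference, measured, significance, threshold=0.2):
    """Hedged sketch of the integrated change level L in [0, 1] used in
    deception detection: significance-weighted share of monitored
    vibration-image parameters whose relative change versus the
    reference period exceeds `threshold`."""
    ref = np.asarray(reference, dtype=float)
    cur = np.asarray(measured, dtype=float)
    k = np.asarray(significance, dtype=float)
    rel_change = np.abs(cur - ref) / np.where(ref == 0, 1.0, np.abs(ref))
    changed = rel_change > threshold      # the m changed parameters
    return float((k * changed).sum() / k.sum())
```

No changed parameters yields L = 0; if every monitored parameter changes, L = 1.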

As is well known, cybernetics and information theory examine the applicability of operational and technical means to living organisms and systems. The modern concept of cognitive biology is related mainly to the concepts and definitions of signal information and transfer theory, and makes it possible to describe psychophysiological information with the mathematical parameters established in information theory. The authors' long study and observation of human head micro-movements with the help of statistical parameters used in information theory showed that there is a statistically reliable dependency between the human psychophysiological state and the statistical parameters of head micro-motion information.

The present inventors have also been able to present their own interpretation of this phenomenon, the vestibulo-emotional reflex. First, we define the interrelationship between the psychophysiological state and energy regulation (metabolism). Every typical emotional state can be characterized by a correlation between specific energy expenditure and the individually necessary physiological and emotional energy. Physiological energy is expended on physiological processes, and emotional energy is formed as a result of conscious or unconscious processes. For example, an aggressive state, even if nominally the same aggressive state, will differ between people, and natural adjustment factors such as age, sex, and education level must be considered. From a physiological point of view, however, these differences should not fundamentally alter the relative energy release and its location in the body's organs. All of this results in visible emotional symptoms, such as redness of the face, frequent sighing, rapid heartbeat, and certain fine movements. The main reason an emotional state is expressed externally is the additional release of energy in the organism, which changes the correlation between physiological and emotional energy. It should be emphasized that the authors considered the biochemical energy of natural physical processes as understood at the modern level of technological development: the progression of physiological processes and the triggering and interruption of their mutual relationship with human thought and movement.

The main task of the balance (vestibular) system is, above all, to maintain mechanical equilibrium. However, it has been proved that the mechanical equilibrium of a semi-closed system is possible only when the mechanical, chemical, energy, and other subsystems forming the subject are themselves in equilibrium. If any one of these systems is unbalanced, the equilibrium of the adjacent systems is destroyed, which means that destruction of mechanical balance results in destruction of energy balance.

The human head, in its vertical semi-balanced state, can be seen as a highly sensitive indicator of all energy processes occurring within the organism. From a biomechanical point of view, maintaining the vertical balance of the head, whose center of gravity lies well above its support, requires tremendous ongoing effort and contraction of the neck and head muscles. Moreover, this movement is realized reflexively under precise operating conditions. Every meaningful phenomenon (emotion) in the organism leads to a continuous change in the physiological process. This is similar to other physiological changes traditionally used in psychophysiological analysis, such as GSR (galvanic skin response), arterial pressure, and heart rate. In addition, the head motion parameters change with energy expenditure and energy location. The spatial trajectory of head movement is very complicated because the head's shape is close to a sphere, and the motion trajectory of each point can differ significantly under the action of hundreds of neck muscle movements. Reliable quantitative differentiation of head movement parameters is possible through statistical analysis of the informational motion parameters. That is, the emotional state can be measured and confirmed by measuring energy and the corresponding responses. The laws of mechanics apply consistently, and the body always reacts to maintain equilibrium; changes of intracorporeal energy, which naturally differ across a wide variety of people, result in consistent and corresponding changes in the head motion parameters.

The overall emotion classification according to the proposed information and statistical parameters of head movement makes it possible to identify all emotional states. At present there is no unified holistic approach to emotional state measurement, so the method can be used as a first measurement alongside other psychophysiological methods or for independent experimental comparison. Modern psychology uses mainly qualitative criteria in the assessment of emotional states, which makes quantitative measurement fundamentally impossible and objective evaluation of the human condition difficult. The proposed method, by contrast, allows all emotional states to be measured. Since head movement parameter changes are functionally related to changes in energy exchange, it is natural that head movement parameters characterize the overall psychophysiological state of a human.

Compared with emotional state assessment through head movement, the accuracy of emotional state aggregation formulas based on existing evaluation criteria is low. At the current level of technology there is no overall standard for emotional state assessment. The proposed method is characterized by the possibility of an integrated approach to all emotional measurements, whereas previous methods were each used for particular emotional state assessments. Adopting the proposed concept for measuring emotional states allows psychology to be included among the precision sciences and enables uniform emotional measurement.

The signal about the subject's head movement is acquired through image comparison by the camera. In the time dimension of the spatially and temporally distributed information statistical parameters, the motion speed of the head is measured as the average frequency of marker motion, determined over 10-second units and bounded by the maximum frame rate of the television camera. These characteristics can reflect human emotional anxiety and characterize anxiety levels.

Since the vibration image simultaneously represents the spatial and temporal distribution of the target's motion energy, the number of points having the same vibration frequency during the given time is counted to obtain the frequency histogram. The histogram thus excludes information about the spatial distribution of the vibration frequency. This apparent loss of spatial information actually increases the motion information because, from the viewpoint of physiological energy, it is not very important where on the head the fine movement is performed. The configuration of the frequency histogram is determined as follows.

Unlike the well-known and contradictory existing approaches for determining the level of aggression, a new formula is proposed that takes into account two main factors: the mean vibration frequency, which best characterizes the spread of vibration (the fine movement of the human head), and its mean square deviation. An aggressive person has a high frequency of fine head movement and a wide spread in the movement of various points of the head. The resulting correlation coefficient expresses aggression as a number from 0 to 1.

Figure 112015034526730-pat00022

Fm - maximum frequency of the frequency-distribution density in the histogram

Fi - count of the i-th frequency of the frequency-distribution density acquired over 50 frame times in the histogram

Fin - vibration-image processing frequency

n - the number of inter-frame differences exceeding the threshold in 50 frames

These equations determine the level of aggression for everyone; naturally, a low aggression level yields a value close to zero, while for people in a highly aggressive state the figure is close to 1. The threshold used by the security version of the vibration imaging system to detect aggression and identify potentially dangerous persons is 0.75.
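Since the image of equation (3) is not reproduced, the following sketch only combines the two factors the text names, the mean vibration frequency and its spread (mean square deviation), each normalized by the processing frequency; the way they are combined and clipped to [0, 1] is our own assumption.

```python
import numpy as np

def aggression_level(freq_map, f_processing=25.0):
    """Hedged sketch of an aggression score Ag in [0, 1], combining the
    two factors named in the text: the mean vibration frequency of the
    head points and its spread (standard deviation)."""
    f = np.ravel(np.asarray(freq_map, dtype=float))
    mean_part = f.mean() / f_processing    # high rate of fine movement
    spread_part = f.std() / f_processing   # wide spread across head points
    return float(np.clip(mean_part + spread_part, 0.0, 1.0))
```

Under this sketch, a calm subject with slow, uniform fine movement scores well below the 0.75 alarm threshold, while fast and widely spread movement scores above it.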

Next, statistically meaningful vibration image information parameters that determine the stress level after vibration image acquisition are found. Above all, the vibration symmetry parameter is determined for the amplitude and frequency vibration images.

Unlike the well-known and contradictory existing approaches for determining the level of stress, a new formula is proposed that considers the symmetry of motion amplitude and frequency for the individual columns scanned over the human head. Amplitude and frequency vibration images are processed for 20 seconds; a person exhibiting maximum symmetry of vibration and fine movement shows low levels of stress and anxiety.

Figure 112015034526730-pat00023

A_i^L - total amplitude of the i-th column of the vibration image for the left part of the object

A_i^R - total amplitude of the i-th column of the vibration image for the right part of the object

max(A_i^L, A_i^R) - maximum of A_i^L and A_i^R

F_i^L - maximum frequency of the i-th column of the vibration image for the left part of the object

F_i^R - maximum frequency of the i-th column of the vibration image for the right part of the object

max(F_i^L, F_i^R) - maximum of F_i^L and F_i^R

n - the number of columns the object occupies

Similar to the information statistical parameters presented previously, the proposed formula allows the stress level (St) to be measured from 0 to 1; the minimum stress level corresponds to the minimum measure, and for a person under high stress the level is close to 1.

Next, statistically meaningful vibration image information parameters that determine the anxiety level after vibration image acquisition are found. This relates primarily to the configuration of the fast-signal frequency spectrum of the amplitude and frequency vibration images.

Unlike the well-known and contradictory existing approaches for determining the level of anxiety, a new formula is proposed that takes into account the fact that high anxiety increases the high-frequency spectral density of the movement rather than the low-frequency spectral density.

Figure 112015034526730-pat00034

Tn - anxiety level

Pi(f) - power spectrum of the vibration-image frequency distribution

fmax - maximum frequency of the vibration-image frequency spectrum

The proposed formula, similar to the previously presented information statistical parameters, allows anxiety levels to be measured from 0 to 1. The minimum level of anxiety corresponds to the minimum measure, and a high level of anxiety yields a value close to one. The fast-signal frequency spectrum of the vibration image is displayed for control by the operator or system user.

Another example is finding statistically meaningful information parameters of the vibration image that determine the level of compatibility between people after vibration image acquisition. Above all, a vibration frequency histogram is constructed for each individual.

Unlike the well-known and contradictory existing approaches for determining the level of compatibility, a new formula is presented that takes into account the compatibility (harmonization) potential, characterized by the closeness of the match between the combined vibration frequency histogram of both subjects and the normal distribution law.

Figure 112015034526730-pat00035

K - normalized correlation coefficient of the acquired histogram

y' - normal distribution density

Figure 112015034526730-pat00036

Like the previously presented parameters, the proposed formula measures the compatibility (coherence) probability on a scale from 0 to 1: the minimum measure corresponds to the minimum compatibility probability, and a high mutual compatibility measure yields a value close to 1.
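The compatibility idea — closeness of each subject's vibration-frequency histogram to a normal density, via a normalized correlation coefficient K — can be sketched as follows. The exact combination rule in the formula image is not shown, so taking the product of the two per-subject coefficients is an assumption:

```python
import numpy as np

def compatibility_level(hist_a, hist_b):
    """Hedged sketch: compatibility as closeness of both subjects'
    vibration-frequency histograms to a fitted normal density, each
    scored by a correlation coefficient clamped to [0, 1]."""
    def corr_with_normal(hist):
        h = np.asarray(hist, dtype=float)
        h = h / h.sum()                      # normalize to a probability mass
        x = np.arange(len(h))
        mu = (x * h).sum()                   # histogram mean
        sigma = np.sqrt(((x - mu) ** 2 * h).sum()) or 1.0
        y = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
        y /= y.sum()                         # fitted normal density over bins
        r = np.corrcoef(h, y)[0, 1]
        return max(0.0, r)                   # clamp so K stays in [0, 1]
    return corr_with_normal(hist_a) * corr_with_normal(hist_b)

# Two symmetric, near-normal histograms should score high.
val = compatibility_level([1, 2, 4, 2, 1], [1, 3, 5, 3, 1])
```

Because each factor is clamped to [0, 1], the combined measure also lies in [0, 1], as the text requires.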

The next task is to find statistically significant vibration-image information parameters that determine a person's level of deception from the acquired vibration image. Above all, this involves acquiring the temporal dependencies of the maximum number of vibration-image parameters that have minimal correlation with one another.

A new formula is presented that differs from the well-known psychophysiological approaches used in polygraph lie detection. In this formula, deception is determined by the change in the measured vibration-image parameters relative to a reference period. The proposed formula covers both verbal and nonverbal deception. For verbal deception, the decision in the time dimension uses the latency before the respondent begins to answer; nonverbal deception is analyzed by comparing the parameters during one time period against another.

The integrated level of change (L) of the psychophysiological state used in deception decisions is calculated by the following formula:

Figure 112015034526730-pat00037

Pm - parameters whose change exceeds the set threshold

Pc - vibration-image parameters that change when determining the deception level

K - significance (correlation) coefficient of the measured parameter Pi

n - number of measured parameters (which may differ from the number of visual parameters)

m - number of changed parameters

Like the previously presented parameters, the proposed formula measures the deception level on a scale from 0 to 1: the minimum measure corresponds to the minimum level of deception, and the highest level of deception yields a value close to 1.
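A minimal sketch of the integrated change level L, under stated assumptions: the formula image is not reproduced, so here L is the significance-weighted fraction of parameters whose change between a baseline period and the answer period exceeds its threshold (the Pm set), normalized so it stays in [0, 1]:

```python
def deception_level(baseline, response, thresholds, weights):
    """Hypothetical sketch of the integrated change level L: count the
    vibration-image parameters whose change between the baseline
    period and the answer period exceeds its threshold (Pm), weight
    each by a significance coefficient K, and normalize by the total
    weight so L stays in [0, 1]."""
    changed = [
        w for b, r, t, w in zip(baseline, response, thresholds, weights)
        if abs(r - b) > t      # this parameter belongs to the Pm set
    ]
    total_w = sum(weights) or 1.0
    return sum(changed) / total_w

# Three parameters, two of which change beyond their thresholds.
L = deception_level(
    baseline=[0.2, 0.5, 0.3], response=[0.6, 0.5, 0.9],
    thresholds=[0.1, 0.1, 0.1], weights=[1.0, 1.0, 1.0])
```

With equal weights this reduces to m/n, the fraction of changed parameters, which matches the 0-to-1 normalization the text describes.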

However, the present invention is not limited to the above examples of measuring human emotional and psychophysiological states. For reference, human state characteristics are classified into more than 200 types under various classification systems. Above all, the present invention makes it possible to describe all human states through head micro-movement parameters and/or head vibration-image parameters. In psychology, the traditional concept of emotion is still considered too vague to be converted into reliable statistical parameters of the reflexive micro-movement of the human head. Based on the proposed approach, however, the human state can be determined in the same way as for a technical information system, using information parameters to characterize it. For example, it is possible to determine a person's information and thermodynamic entropy according to the formulas below.

The information entropy calculation is based on a histogram of the head's micro-movement frequency distribution and follows the formula below.

Figure 112015034526730-pat00038

The thermodynamic entropy calculation is likewise based on a histogram of the head's micro-movement frequency distribution, and the thermodynamic entropy (S) is given by the following formula.

Figure 112015034526730-pat00039
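The two entropy computations above can be sketched from the same micro-movement frequency histogram. The formula images are not reproduced here, so the normalizations are assumptions: Shannon entropy with log base 2 for the information measure, and a Boltzmann-style natural-log form for the thermodynamic analogue S:

```python
import math

def entropies(freq_hist):
    """Sketch of both entropy measures from the head micro-movement
    frequency histogram. Information entropy uses log2; the
    thermodynamic analogue S uses the natural log (both choices are
    assumptions, since the patent's formula images are not shown)."""
    total = sum(freq_hist)
    probs = [c / total for c in freq_hist if c > 0]
    h_info = -sum(p * math.log2(p) for p in probs)   # bits
    s_thermo = -sum(p * math.log(p) for p in probs)  # nats
    return h_info, s_thermo

# A uniform distribution over 4 bins maximizes both entropies.
h, s = entropies([4, 4, 4, 4])
```

A sharply peaked histogram (one dominant micro-movement frequency) drives both values toward 0, while a flat histogram maximizes them — the property that makes entropy usable as a state parameter.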

These individual information-statistical parameters are applied to characterize particular emotional states of a person more precisely. For example, experiments have shown a strong correlation between information entropy and the level of deception, and a strong association between thermodynamic entropy and the level of human anxiety.

Based on the information and thermodynamic parameters, human behavioral, energetic, and charismatic aspects could be characterized and determined more fully. For example, based on the frequency histogram recorded with the VibraImage 7.1 system, human energy (E) could be obtained from the difference between the mean square deviation and the maximum of the frequency histogram.

Figure 112015034526730-pat00040

M - aura color size in the current frame of the subject

Fps - frames per second

σ - average aura color size of the subject
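A loose sketch of the energy measure from M, Fps, and σ. The formula image is not reproduced, so the normalized form below — the relative deviation of the current aura size from its average, clamped to [0, 1] — is an assumption based only on the statement that E derives from the difference between the two quantities:

```python
def energy(M, sigma, fps=15.0):
    """Hypothetical sketch of human energy E. The text states E is
    based on the difference between the average aura color size
    (sigma) and the aura color size in the current frame (M); this
    relative, clamped form is an assumption, with fps kept as a
    parameter for completeness although this sketch does not use it."""
    if M == 0:
        return 0.0
    e = abs(M - sigma) / M  # relative deviation of aura size
    return min(1.0, e)      # clamp to [0, 1] like the other measures

e_val = energy(M=10.0, sigma=4.0)
```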

Quantitative analysis of the reflexive micro-movement of the head makes it possible to measure the human psychophysiological state more objectively and scientifically, and can solve many problems in medicine, psychology, psychiatry, and daily life. Quantitative evaluation of the mental state of airport passengers according to their levels of aggression, stress, anxiety, and potential danger, together with independent experiments with the developed system, showed that the present invention gives positive results (more than 90%). This confirms the practical feasibility of the present invention.

Hereinafter, the experimental method of the present invention will be described. Fig. 2 illustrates an experimental process.

In this experiment, emotions were measured in children, whose emotional responses are relatively easy to elicit. The responses to the stimuli presented by the experimenter were analyzed as patterns through the amount of parameter change. Stimuli were selected from the four quadrants defined by the pleasure-displeasure axis and the arousal-relaxation axis. The data before and after each stimulus (neutral, pleasure-arousal, pleasure-relaxation, displeasure-arousal, displeasure-relaxation) were compared for relevance.

The five children comprised 1 girl and 4 boys, with a mean age of 5.6 years (standard deviation 0.8). The subjects had no cardiovascular or neurological abnormalities and were placed in an independent space. There was no obstruction between the camera and the subject, and measurement was performed in a space with a white background. For better results, the collected images were masked except for the subject.

As a research tool, the vibration-image technique following the algorithm described above located points on the children's heads in real time in order to detect emotional changes. The basic specifications of the video camera were a resolution of 640x480 pixels, a frame rate of 15.0 frames/second, and a dynamic range of 80 dB (sensor), with a distance of 1.5 m to the subject.

In the actual experiment, the subject sat on a chair and rested for 10 minutes, and a preliminary camera measurement was taken for 3 minutes. Emotional stimulation then induced a pattern of psychophysiological responses. Five different stimuli (neutral, pleasure-arousal, pleasure-relaxation, displeasure-arousal, displeasure-relaxation) were presented for about 20 minutes. The post-stimulus condition was recorded for 3 minutes.

The measured images were analyzed by the vibration-image processing software applying the algorithm described above, and the average values of the ten parameters were obtained as the analysis result. After collecting data from the five children, the parameters were divided into common variables and effective variables to distinguish the differences among the four stimuli relative to the neutral condition. The extracted effective variables were then interpreted on the basis of Russell's emotion model.
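The common/effective variable split described above can be sketched as follows. The threshold and the data layout are assumptions; the idea is that a parameter shifting under every stimulus relative to the neutral baseline is "common," while one shifting only under some stimuli is "effective" for distinguishing them:

```python
def split_variables(neutral, stimuli, threshold=0.1):
    """Sketch of the common/effective variable split: `neutral` maps
    parameter names to baseline values; `stimuli` is a list of dicts
    with the same keys, one per stimulus condition. The shift
    threshold of 0.1 is an assumption for illustration."""
    common, effective = [], []
    for name, base in neutral.items():
        shifts = [abs(s[name] - base) > threshold for s in stimuli]
        if all(shifts):
            common.append(name)     # changes under every stimulus
        elif any(shifts):
            effective.append(name)  # changes under only some stimuli
    return common, effective

# P6 (stress) shifts under both stimuli; P7 shifts under only one.
common, effective = split_variables(
    {"P6": 0.5, "P7": 0.3},
    [{"P6": 0.9, "P7": 0.31}, {"P6": 0.8, "P7": 0.7}])
```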

As a result of the analysis, five effective variables were found among the ten variables for each stimulus. Variables extracted in common from the pleasure-arousal and pleasure-relaxation stimuli, relative to the neutral stimulus, can be used as common variables of the pleasure emotion, while variables extracted in common from the pleasure-arousal and displeasure-arousal stimuli can be regarded as effective variables of arousal. These operations are repeated to find the parameters related to each emotion axis. The common variables of pleasure-arousal and displeasure-arousal were aggression and stress, which means that tension/anxiety and nervousness can be used to distinguish arousal from relaxation. Table 1 below shows the pattern changes under the different stimuli. Using these variables, an algorithm that infers emotion in reverse can be created, as shown in Fig. 3.

Figure 112015034526730-pat00041

The development of technology is shifting from machine-centered to human-centered for a better life. Application techniques using various bio-signals are being developed to establish a human-centered environment. Technologies are being developed that allow a computer or the surrounding environment itself to understand the human condition so that people can live more comfortably, such as eye-movement tracking systems using EOG and EMG, or electric wheelchair control systems using EEG.

The purpose of the present invention was to create an algorithm that can infer emotion from variables extracted using vibration-image technology. By using vibration-image technology, an image-based technique grounded in psychophysiological mechanisms, instead of contact-type bio-signal measurement technologies that users may find objectionable, a biometric technology is obtained that minimizes user resistance and can be used anywhere.

Fig. 4 shows that an aura, representing bio-energy, is displayed around an image of a human body formed from the amplitude components of the vibration image.

As described above, the internal bio-signal image is a color representation of the magnitude of the positional change of each body part, so the magnitude of the positional change of each part of the subject (1) can be visualized. The external bio-signal image appears around the internal bio-signal image and represents the average maximum vibration frequency by color modulation.

FIG. 5 shows a bio-signal image, representing bio-energy, emitted around an actual image of a human body. In Fig. 5B, the internal bio-signal image is not displayed, and only the external bio-signal image appears around the actual image.

Figs. 6 and 7 show bio-signal images in a stable state and an unstable state, respectively. FIG. 6 shows the bio-signal image of a subject in a stable or calm state, and FIG. 7 shows a stressed state.

Referring to FIG. 6, the bio-signal image is largely symmetrical in shape and color, and its color lies in the middle of the selected color scale (predominantly green). The bio-signal image thus shows that the subject is in a stable state.

On the other hand, referring to FIG. 7, the aura in the bio-signal image contains many red components, indicating that the subject is in an unstable state. When a person is stimulated, for example by being exposed to a violent scene on a screen, the subject becomes stressed or aggressive and the color of the bio-signal image turns red.

FIG. 8A is a distribution graph of the frequency components (bio-signal image) of a human-body vibration image in a stable state, and FIG. 8B is the corresponding distribution graph in a stressed state.

The graph shown in FIG. 9 shows a typical frequency distribution for a person in a normal working state. According to the research results, most people show a frequency distribution similar to a single-mode (unimodal) distribution in the normal state. The subject's state changes as shown in FIG. 7 when he or she experiences a negative influence, such as viewing violent scenes on a screen. In states of fear, stress, or aggression, the mean (median) value M of the frequency distribution shifts upward; in a stable, relaxed state, it shifts downward. The frequency axis X can be expressed not only in relative units but also in actual units of frequency or time (Hz or sec). The spacing between the displayed values is determined by the actual parameters of the camera's frame processing and the software settings (the number of images accumulated and the processing time).
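The M-shift reading described above can be sketched numerically: compare the medians of two vibration-frequency distributions, where a positive shift suggests stress- or aggression-like states and a negative shift a relaxed state. The histogram layout and interpretation thresholds are assumptions:

```python
import statistics

def distribution_shift(baseline_hist, current_hist):
    """Sketch of the median (M) shift between two vibration-frequency
    histograms, given as per-bin counts. A positive return value means
    the distribution moved toward higher frequencies."""
    def median_bin(hist):
        # Expand histogram counts into bin indices and take the median.
        values = [i for i, c in enumerate(hist) for _ in range(c)]
        return statistics.median(values)
    return median_bin(current_hist) - median_bin(baseline_hist)

# The same unimodal shape shifted up by one bin: median moves by +1.
shift = distribution_shift([0, 3, 4, 3, 0, 0], [0, 0, 3, 4, 3, 0])
```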

10 is a radial graph (chart) showing the emotional state of the subject according to the method of the present invention.

Hereinafter, the variables used for emotion inference in the present invention are described. The following table lists the variables used in the evaluation of the ten emotional states and the corresponding formulas.

Emotional classification | Variable name | Formula / Description
Aggression | P7 | Equation 3, Equation 8
Stress | P6 | Equation 4, Equation 9
Tension/anxiety | F5X | 1/5 of the calculation cycle of F5
Suspicion | P19 |
Balance | P16 | Equation 6, Equation 11
Charm | P17 | Equation 17
Energy | P8 | Equation 15
Self-regulation | P18 | Equation 18
Inhibition | F6 | Calculation cycle of F1
Neuroticism | F9 | 10 times the F6 standard deviation

Aggression (P7) is obtained from Equation 3 or 8 described above. Stress (P6) is expressed by Equation 4 or 9, Balance (P16) by Equation 6 or 11, Charm (P17) by Equation 17, Energy (P8) by Equation 15, and Self-regulation (P18) by Equation 18. Each of these specific emotions is evaluated by the increase or decrease of its value as shown in FIG. 3; the evaluation of every variable is likewise determined by the increase or decrease of the value obtained from the corresponding formula, and the final emotional state of the subject is evaluated by mapping the results onto the chart of Fig. 10.

In the above Table 2, F6 represents a calculation period of the amplitude (F1) of the pixel brightness difference between two frames, where F1 is expressed by the following formula.

Figure 112015034526730-pat00042

where Ca is the number of pixels in the specified range, I is the brightness value of a pixel, and i is the pixel index.
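The inter-frame amplitude F1 can be sketched from the symbol list above. The formula image is not reproduced, so the normalization by the pixel count Ca (i.e., the mean absolute brightness difference between two consecutive frames) is our reading and an assumption:

```python
import numpy as np

def f1_amplitude(frame_a, frame_b):
    """Sketch of F1, the inter-frame vibration amplitude: the mean
    absolute pixel-brightness difference between two consecutive
    frames, normalized by the pixel count Ca."""
    a = np.asarray(frame_a, dtype=float)
    b = np.asarray(frame_b, dtype=float)
    ca = a.size                         # number of pixels in the range
    return np.abs(a - b).sum() / ca

# Two tiny 2x2 "frames": per-pixel differences 2, 2, 0, 4 -> mean 2.0.
amp = f1_amplitude([[10, 20], [30, 40]], [[12, 18], [30, 44]])
```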

F5 is the amplitude given by the sum of the high-frequency and low-frequency regions of the brightness-difference amplitude for the F1 value, and F9 is ten times the standard deviation of F6.

On the other hand, P17 represents the average degree of change in the brightness difference with respect to the previous frame, and is expressed by the following equation.

Figure 112015034526730-pat00043

W: Target frame size

li: Frame size in the left column of the target frame

ri: Frame size of right column of target frame

C: Color of the target frame

In Table 2 above, P8 is the energy obtained by the emotion-derivation scheme and is the ratio between the magnitude of the amplitude and the magnitude of change of the brightness difference; it is expressed by the energy equation presented above.

P18 is an average of the sum of P16 (Equation 6) and P17 (Equation 17), and is expressed by the following equation.

Figure 112015034526730-pat00044

Referring to FIG. 3, a real-time moving image is acquired from the subject by the methods described above (S1), ten parameters are extracted from the vibration image obtained by processing the moving image, and the increase or decrease of each value is determined (S2). In steps S3a to S3d, it is determined whether the values of the parameter groups at each step, such as (S3b) aggression + suspicion + charm + energy, (S3c) aggression + balance + energy + confidence, and (S3d) aggression + suspicion + balance + charm + confidence + inhibition, are increasing. If the values of all parameters belonging to a step increase ("true"), the process proceeds to the corresponding step S4a to S4d. The increase or decrease of the ten parameters is determined in steps S2a to S2d, respectively.

In steps S4a to S4d, it is determined whether all the parameters included in each step have increased or decreased. According to the "true" or "false" result, the state is classified as pleasure-arousal (S5a), pleasure-relaxation (S5b), displeasure-arousal (S5c), or displeasure-relaxation (S5d).

That is, in step S4a, if the values of tension/anxiety and nervousness have increased, a pleasure-relaxation state (S5b) is determined; otherwise, a pleasure-arousal state (S5a). In step S4b, if the values of balance and nervousness have increased, a displeasure-relaxation state (S5d) is determined; otherwise, a displeasure-arousal state (S5c). In step S4c, if the values of tension/anxiety and nervousness have increased, a displeasure-arousal state (S5c) is determined; otherwise, a pleasure-arousal state (S5a). In step S4d, if the values of stress and nervousness have increased, a pleasure-relaxation state (S5b) is determined; otherwise, a displeasure-relaxation state (S5d).
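The quadrant branching of steps S4a to S5d can be condensed into a sketch like the following. The exact ten-parameter gating of steps S3a to S3d is omitted for brevity, and the flattened branch order below is an assumption; the original flowchart evaluates each branch group separately:

```python
def infer_quadrant(increased):
    """Condensed sketch of the S4/S5 decision branches: `increased`
    maps parameter names to booleans indicating whether each value
    rose. Returns one quadrant of Russell's emotion model."""
    if increased.get("tension_anxiety") and increased.get("nervousness"):
        return "pleasure-relaxation"      # S5b (per S4a)
    if increased.get("balance") and increased.get("nervousness"):
        return "displeasure-relaxation"   # S5d (per S4b)
    if increased.get("aggression") and increased.get("stress"):
        return "displeasure-arousal"      # S5c
    return "pleasure-arousal"             # S5a (default branch)

# Rising aggression and stress fall into the displeasure-arousal quadrant.
state = infer_quadrant({"aggression": True, "stress": True})
```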

Through the experiments with the present invention, an empirical inference algorithm was designed that determines the human psychophysiological state in a non-contact manner. Effective variables for emotion inference were found from verified vibration-image parameters, based on the differences among the stimuli of the two-dimensional emotion model and their accompanying characteristics. Through the present invention, effective variables for each emotional state were found, and the algorithm was designed on this basis. By finding parameters that show the same pattern under the same stimulus, it was confirmed that the vibration-image technique can contribute to inferring and classifying emotional states.

The accuracy of the algorithm may be limited by the small number of subjects and by the lack of a reference point for the numerical expression of each variable. Nevertheless, the results show the possibility of a new measurement method for analyzing emotion, with an algorithm that can infer basic human emotions using only images. The emotion-inference method using such images is expected to contribute to the ubiquitous-computing and healthcare fields as an auxiliary tool for controlling public service facilities or personal emotions.

Various exemplary embodiments of the present invention have been described and shown in the accompanying drawings. However, it should be understood that these embodiments represent only some of the various possible embodiments, since various other modifications could occur to those of ordinary skill in the art.

Claims (11)

Acquiring a moving image from a subject using a camera;
Extracting a vibration image of the subject from the moving image;
Extracting a value of a parameter for a charm from the vibration image;
Determining whether the value of the parameter is increased or decreased; And
And determining an emotional state using the increase or decrease of the parameter,
Wherein the charm has a value P17 of a parameter defined by Equation (1) below.
<Formula 1>
Figure 112017045406090-pat00075

W: Target frame size
li: Frame size in the left column of the target frame
ri: Frame size of right column of target frame
C: Color of the target frame
delete
delete
delete
The method according to claim 1,
Extracting a value of a parameter for energy from the vibration image in extracting the value of the parameter,
Wherein the energy (Energy, P8) has a parameter value (E) defined by the following equation.
Figure 112017045406090-pat00076

M : The color size of the aura in the current frame of the object
Fps : frames per second
σ : Average color size of the target aura
delete
The method according to claim 1,
Further comprising the step of extracting a value of a parameter for balance from the vibration image in extracting the value of the parameter,
Wherein the balance has a parameter value (C) defined by the following equation, and the self-regulation is an average of the parameter value of the balance and the parameter value (P17) of the charm, in the emotion classification method using vibration-image technology.
Figure 112017045406090-pat00060

K - acquired frequency histogram generalized correlation coefficient
Figure 112017045406090-pat00077

y'-general distribution density
delete
The method according to claim 1,
Further comprising extracting a parameter for inhibition from the vibration image in extracting the value of the parameter,
Wherein the parameter value (F6) for the inhibition is a calculation period of F1 defined by the following expression.
Figure 112017045406090-pat00063

Ca - the number of pixels in the specified range
I is the brightness value of the pixel
i is the number of pixels
10. The method of claim 9,
Further comprising extracting a parameter for neuroticism from the vibration image in extracting the value of the parameter,
Wherein the parameter value of the nerve impulse is ten times the standard deviation of the value of the suppression (F6).
The method according to claim 1,
Further comprising the step of extracting a parameter for the tension / anxiety from the vibration image in the step of extracting the value of the parameter,
Wherein the parameter value (F5X) of the tension/anxiety is 1/5 of the calculation period of F5, and F5 is the amplitude given by the sum of the high-frequency and low-frequency regions of the brightness-difference amplitude for the F1 value defined by the following equation, in the emotion classification method using vibration-image technology.
Figure 112017045406090-pat00074

Ca - the number of pixels in the specified range
I is the brightness value of the pixel
i is the number of pixels
KR1020150049955A 2014-04-10 2015-04-08 A Method for Emotional classification using vibraimage technology KR101753834B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140043203 2014-04-10
KR20140043203 2014-04-10

Publications (2)

Publication Number Publication Date
KR20150117614A KR20150117614A (en) 2015-10-20
KR101753834B1 true KR101753834B1 (en) 2017-07-05

Family

ID=54399911

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150049955A KR101753834B1 (en) 2014-04-10 2015-04-08 A Method for Emotional classification using vibraimage technology

Country Status (1)

Country Link
KR (1) KR101753834B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101989964B1 (en) * 2017-07-14 2019-06-17 상명대학교산학협력단 Method and System for detecting public emotion
KR102198294B1 (en) * 2018-08-28 2021-01-05 (주)마인드아이 Method and System of Brain-Fatigue Evaluation by using Noncontact Vision System
CN113837128A (en) * 2021-09-28 2021-12-24 北京易华录信息技术股份有限公司 Emotion recognition method, system and storage medium

Also Published As

Publication number Publication date
KR20150117614A (en) 2015-10-20

Similar Documents

Publication Publication Date Title
KR101739058B1 (en) Apparatus and method for Psycho-physiological Detection of Deception (Lie Detection) by video
Giannakakis et al. Stress and anxiety detection using facial cues from videos
Bulagang et al. A review of recent approaches for emotion classification using electrocardiography and electrodermography signals
Petrantonakis et al. Emotion recognition from brain signals using hybrid adaptive filtering and higher order crossings analysis
Lewis et al. A novel method for extracting respiration rate and relative tidal volume from infrared thermography
US8571629B2 (en) Detection of deception and truth-telling using fMRI of the brain
Zhao et al. Real-time assessment of the cross-task mental workload using physiological measures during anomaly detection
JP6503327B2 (en) Physiological condition determination apparatus and physiological condition determination method
KR101536348B1 (en) method and apparatus for detecting drowsiness by physiological signal by using video
US20160029965A1 (en) Artifact as a feature in neuro diagnostics
Kumar et al. SmartEye: Developing a novel eye tracking system for quantitative assessment of oculomotor abnormalities
CN111182835A (en) Judgment of comfort and discomfort
KR101500888B1 (en) Method for obtaining information about the psychophysiological state of a living being
KR101753834B1 (en) A Method for Emotional classification using vibraimage technology
Zou et al. Evaluating the effectiveness of biometric sensors and their signal features for classifying human experience in virtual environments
JP5870465B2 (en) Brain function training apparatus and brain function training program
KR20220046734A (en) method and apparatus for detecting potential dangerous living being or abnormal state by using video image
JP5681917B2 (en) Brain function enhancement support device and brain function enhancement support method
CN108451496B (en) Method and system for detecting information of brain-heart connectivity
Hessler et al. A survey on extracting physiological measurements from thermal images
CN108451528A (en) Change the method and system for inferring electroencephalogram frequency spectrum based on pupil
Andreeßen Towards real-world applicability of neuroadaptive technologies: investigating subject-independence, task-independence and versatility of passive brain-computer interfaces
Lee et al. Research to verify and utilize psychophysiological detection of deception based on vibraimage technology
Hee et al. Valid parameters extracted from frequency changes of head movement on stress state using vibraimage technology
WO2014168354A1 (en) Moving-image-based physiological signal detection method, and device using same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E90F Notification of reason for final refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)