WO2017007179A1 - Method for expressing social presence of virtual avatar by using facial temperature change according to heartbeats, and system employing same - Google Patents

Method for expressing social presence of virtual avatar by using facial temperature change according to heartbeats, and system employing same

Info

Publication number
WO2017007179A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
face
avatar
temperature
virtual
Application number
PCT/KR2016/007097
Other languages
French (fr)
Korean (ko)
Inventor
Mincheol Whang
Myoung Ju Won
Sangin Park
Original Assignee
Sangmyung University Seoul Industry-Academy Cooperation Foundation
Priority claimed from KR1020160082673A (external priority; see KR101848478B1)
Application filed by Sangmyung University Seoul Industry-Academy Cooperation Foundation
Publication of WO2017007179A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis


Abstract

Disclosed is a method for realistically expressing a human avatar by using a facial temperature change according to heartbeats. The method comprises the steps of: detecting ECG data of an actual user in real time; detecting, from the ECG data, a facial temperature change according to heartbeats; and changing the face of the user's avatar in response to the facial temperature change.

Description

Method for expressing the social presence of a virtual avatar using facial temperature change according to heartbeat, and system employing the same
The present invention proposes a method for producing a human avatar or virtual avatar with enhanced social presence by using facial temperature changes that follow the heartbeat.
Avatar modeling techniques developed to date have concentrated on rendering outward appearance realistically from an anatomical point of view; developing methods and systems that express presence through deeper, internal cues remains an open task.
An infrared thermal imaging camera detects infrared-wavelength energy radiated from the surface of a subject and renders it in different colors according to the intensity of the radiant heat, allowing the surface temperature to be monitored. Facial surface temperature varies with the components of the face, such as the eyes and mouth, and may also vary with skin thickness. In addition, because blood vessels are warmer than other skin tissue, facial surface temperature may vary with the distribution of blood vessels. For these reasons, research on the relationship between facial temperature change and emotional state is ongoing. Building on that prior work, the present inventors used the electrocardiogram (ECG), an autonomic nervous system measure, to confirm the correlation between the RRI and facial temperature change, and derived a regression model that can infer facial temperature change from autonomic nervous system activity. Based on the derived regression model, the facial temperature change corresponding to the user's emotional state was inferred and applied to a virtual-avatar system.
The present invention proposes a method for producing a virtual avatar with enhanced social presence using facial temperature changes according to the heartbeat, and a system applying the same.
According to the present invention, a method for realistically expressing a virtual avatar using facial temperature changes according to the heartbeat comprises:
photographing the face of a real user in real time;
detecting, from the facial images, a facial temperature change according to heartbeat changes; and
changing the face of a virtual avatar correlated with the user in response to the facial temperature change.
According to one embodiment of the present invention, heartbeat changes may be extracted from physiological information obtained from the user, and the facial temperature change of the virtual avatar correlated with the user may be synchronized to those heartbeat changes.
According to one embodiment of the present invention, the heartbeat changes may be detected from ECG data obtained from the user.
In another embodiment of the present invention, the user's facial temperature change may be inferred from the ECG data through regression analysis, and the facial temperature change of the virtual avatar may be synchronized to the inferred facial temperature change.
According to a specific embodiment of the present invention, RRIs (R-peak to R-peak intervals) may be detected from the ECG data, and the user's facial temperature change may be detected through regression analysis of the RRI.
According to a specific embodiment of the present invention, the user's facial temperature change may be expressed by the equation below.
Figure PCTKR2016007097-appb-I000001
A system for realistic virtual-avatar expression according to the present invention comprises: a detection system that extracts physiological information from the user; an analysis system that detects the user's facial temperature change from the physiological information; and a display system that presents a virtual avatar correlated with the user, the avatar having a face model that changes in correspondence with the user's facial temperature change supplied by the analysis system.
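As a concrete illustration of this three-part structure, the following minimal Python sketch wires the detection, analysis, and display roles into one update cycle. It is a hypothetical skeleton under stated assumptions: the class names, the read_ecg_window and render_face hooks, and the regression coefficients are illustrative placeholders, not the patented implementation.

```python
import numpy as np

class DetectionSystem:
    """Acquires physiological information from the user (here, raw ECG samples)."""
    def read_ecg_window(self, seconds: float, fs: int = 500) -> np.ndarray:
        # Placeholder: a real system would read from an ECG amplifier / DAQ.
        return np.zeros(int(seconds * fs))

class AnalysisSystem:
    """Infers the facial temperature change from the ECG via linear regression."""
    def __init__(self, slope_b: float, constant_a: float):
        # Per-region coefficients B and A (assumed values, not the patent's).
        self.b, self.a = slope_b, constant_a

    def temperature_change(self, rri_ms: float) -> float:
        return self.b * rri_ms + self.a  # simple linear model

class DisplaySystem:
    """Updates the avatar face model in response to the inferred change."""
    def render_face(self, delta_temp: float) -> None:
        print(f"avatar facial-temperature offset: {delta_temp:+.4f} deg C")

# One update cycle (all numbers are illustrative).
detection, analysis, display = DetectionSystem(), AnalysisSystem(-0.001, 0.9), DisplaySystem()
ecg = detection.read_ecg_window(seconds=2.0)
rri_ms = 850.0  # would come from R-peak detection on `ecg` (see the analysis method below)
display.render_face(analysis.temperature_change(rri_ms))
```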
The method proposed in the present invention can serve as a new expressive element that 3D modeling engineers can apply when designing characters. In a virtual environment, elements such as the facial expressions and gaze of a human avatar or virtual avatar directly affect the user's sense of presence, yet avatars are generally modeled only around visual form and muscle movement. The present invention shows that, beyond the outward features of the avatar's face itself, the facial temperature change driven by the user's internal responses is an important element for expressing a human avatar effectively, and it is of great value as foundational research for designing avatars in virtual environments.
FIG. 1 illustrates stimuli according to the present invention.
FIG. 2 illustrates a method of presenting stimuli according to the present invention.
FIG. 3 shows the nine ROI regions in which facial temperature was measured.
FIGS. 4A, 4B, and 4C show the correlation analysis between the facial temperature response and the ECG response, by region.
FIG. 5 shows the verification results of the cardiac-response-based facial temperature inference model (forehead).
FIG. 6 shows the verification results of the cardiac-response-based facial temperature inference model (left eye).
FIG. 7 shows the verification results of the cardiac-response-based facial temperature inference model (nose).
FIG. 8 shows the verification results of the cardiac-response-based facial temperature inference model (whole face).
FIG. 9 shows the virtual-realism evaluation results for the cardiac-response-based facial temperature expression element.
Hereinafter, with reference to the accompanying drawings, embodiments are described of the method for realistically expressing a virtual avatar using facial temperature changes according to the heartbeat, and of a system applying the same.
In the present invention, the electrocardiogram (ECG), an autonomic nervous system measure, was used to confirm the correlation between the RRI and facial temperature change, and a regression model was derived that can infer the facial temperature response from autonomic nervous system activity. Based on the derived regression model, the temperature change response of at least one of the forehead, eye, nose, and face regions, or of the whole face, was inferred according to the user's emotional state and applied to a virtual-avatar system. Here the virtual avatar may be represented as a human; in other embodiments an animal or a virtual creature, for example an alien or an anthropomorphized object, may be represented as the avatar.
That is, the present invention applies the facial temperature change that follows a real user's heartbeat changes, in real time, as an expressive element of virtual avatars implemented in various forms, thereby reflecting the user's emotional state and improving realistic interaction and immersion.
1. Subjects
Participants were 26 university students and members of the general public (13 male; mean age 23.03 ± 3.27 years) with no visual-function abnormalities and uncorrected visual acuity of 0.8 or better. After the experimental procedure had been fully explained, those who voluntarily agreed to participate were selected. Subjects were encouraged to rest sufficiently before the experiment to minimize pupil-size reduction caused by parasympathetic activation due to drowsiness and fatigue, and were asked to refrain from exercise, smoking, caffeine, and other factors that could affect sympathetic and parasympathetic activity.
2. Experimental environment and method
The emotion-inducing stimuli used in this study were audiovisual clips, illustrated in FIG. 1, covering arousal, relaxation, and neutral conditions. The suitability of the selected stimuli was confirmed by a goodness-of-fit test (Relaxation: χ² [2, N = 26] = 30.769, p = .000; Arousal: χ² [2, N = 26] = 35.615, p = .000).
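For reference, a goodness-of-fit check of this kind can be reproduced with scipy.stats.chisquare; the observed counts below are invented for illustration, since the text reports only the resulting χ² statistics for 26 subjects across three categories.

```python
from scipy.stats import chisquare

# Hypothetical counts of 26 subjects choosing which clip best induced relaxation;
# under the null hypothesis the choices would be uniform across the 3 clips.
observed = [22, 3, 1]          # invented counts, N = 26
chi2, p = chisquare(observed)  # expected frequencies default to uniform
print(f"chi2(2, N=26) = {chi2:.3f}, p = {p:.3f}")
```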
The experimental space was a soundproof room 3 m wide and 4 m long, with externally generated noise below 30 dB, and the temperature was maintained at 24 ± 1 °C. All subjects were asked to reach physical rest through a 10-minute acclimation period in a seated position, and then to gaze, in a comfortable posture, at an LED monitor screen 70 cm away. As shown in FIG. 2, the stimuli were presented in random order so that subjects could not predict them, eliminating order effects.
3. Analysis method
Facial temperature was measured with a CX-320U infrared thermography camera (COX Co., Ltd). The image resolution was 320 × 240, sampled at 60 frames per second. Nine regions of interest (ROIs) were designated in the measured images to obtain temperature data. As shown in FIG. 3, the nine facial regions were the forehead, the glabella, the left eye, the right eye, the nose, the mouth, the left cheek, the right cheek, and the whole face. The temperature data recorded the maximum, minimum, and mean temperature and the deviation for each frame of the current image. From these data, the difference between the temperature of the current image frame and that of the next image frame was computed, yielding the frame-by-frame facial temperature change of the image (thermal-to-thermal difference).
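A minimal sketch of the frame-by-frame (thermal-to-thermal) difference computation is shown below, assuming each thermal frame arrives as a NumPy array of per-pixel temperatures; the ROI bounding boxes are hypothetical stand-ins for the regions of FIG. 3.

```python
import numpy as np

# Hypothetical ROI boxes (row_start, row_end, col_start, col_end) on a
# 240 x 320 thermal frame; real coordinates would follow FIG. 3.
ROIS = {"forehead": (10, 50, 100, 220), "nose": (120, 160, 140, 180)}

def roi_stats(frame: np.ndarray, roi: tuple) -> dict:
    """Max, min, mean temperature and deviation within one ROI of one frame."""
    r0, r1, c0, c1 = roi
    patch = frame[r0:r1, c0:c1]
    return {"max": patch.max(), "min": patch.min(),
            "mean": patch.mean(), "std": patch.std()}

def thermal_to_thermal_diff(frames: list) -> dict:
    """Mean-temperature difference between consecutive frames, per ROI."""
    diffs = {name: [] for name in ROIS}
    for prev, curr in zip(frames, frames[1:]):
        for name, roi in ROIS.items():
            diffs[name].append(roi_stats(curr, roi)["mean"]
                               - roi_stats(prev, roi)["mean"])
    return diffs
```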
The electrocardiogram (ECG) was acquired using the Lead-I configuration. The signal was amplified with an MP 100 & ECG 100C amplifier (Biopac System Inc., USA) and digitized using an NI-DAQ Pad 9205 (National Instrument, USA), sampling at 500 Hz. R-peaks were detected in the measured ECG signal using a QRS detection algorithm, and RRIs (R-peak to R-peak intervals) were computed as the differences between adjacent R-peaks.
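The QRS detection algorithm is not specified in the text; the sketch below substitutes a simple amplitude-and-spacing peak search via scipy.signal.find_peaks as a stand-in (a full detector such as Pan-Tompkins would be used in practice), then computes the RRI series exactly as described, as differences between adjacent R-peaks.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 500  # ECG sampling rate in Hz, as stated in the text

def detect_rri(ecg: np.ndarray, fs: int = FS) -> np.ndarray:
    """Return RRIs in milliseconds from a raw Lead-I ECG trace.

    Simplified stand-in for a full QRS detector: R-peaks are taken as
    prominent maxima separated by at least a 250 ms refractory distance.
    """
    peaks, _ = find_peaks(ecg,
                          height=np.percentile(ecg, 95),
                          distance=int(0.25 * fs))
    return np.diff(peaks) * 1000.0 / fs
```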
4. Experimental results
The RRI and the mean facial temperature change were used to analyze the correlation between the facial temperature response and the ECG response. Based on the correlation analysis, the forehead, left eye, nose, and whole-face regions, which showed high correlations, were selected as the final effective factors.
Simple linear regression analysis was used to develop and validate a model that predicts facial temperature change from the RRI variable of the ECG response. From the distribution of the independent variable (RRI) and the dependent variable (facial temperature), the optimal equation was obtained by the least-squares method, which finds the line with the smallest error relative to the observed values of the dependent variable. The slope B is the regression coefficient, and A is the constant. R-squared (the coefficient of determination, R²) is the proportion of the total variation of the dependent variable explained by the regression model. A statistical significance test of the regression model gave p = 0.000, confirming that the obtained model is significant.
Equation 1
Figure PCTKR2016007097-appb-M000001
Figure PCTKR2016007097-appb-I000002
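The equation image above is not reproduced in this text. From the surrounding definitions (independent variable RRI, dependent variable facial temperature, slope B as the regression coefficient, A as the constant), Equation 1 presumably takes the standard simple-linear-regression form; the following is a reconstruction of that form only, not of the published coefficients:

$$\Delta T_{\mathrm{face}} = B \cdot \mathrm{RRI} + A$$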
Equation 2
Figure PCTKR2016007097-appb-M000002
The facial temperature response inference model estimated from these results, using the ECG response, is given as Equation (2); the detailed patterns are shown in FIGS. 4A, 4B, and 4C.
According to one embodiment of the present invention, these inference results were mapped to the HIS (hue, intensity, saturation) values of the virtual avatar, and the intensity was applied.
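A minimal sketch of mapping the inferred temperature change onto the avatar's face color follows. The text states only that the inference result was mapped to the avatar's HIS values with the intensity applied; the scaling gain and the base face color below are illustrative assumptions, and HIS is approximated with HSV via the standard-library colorsys module.

```python
import colorsys

BASE_HUE, BASE_SAT = 0.05, 0.45  # assumed skin-tone hue and saturation
BASE_INTENSITY = 0.80            # assumed baseline intensity of the face model

def face_rgb(delta_temp_c: float, gain: float = 0.5) -> tuple:
    """Map an inferred facial temperature change (deg C) to an RGB face tint."""
    intensity = min(1.0, max(0.0, BASE_INTENSITY + gain * delta_temp_c))
    # HSV used as a stand-in for HIS: only the intensity channel is modulated.
    return colorsys.hsv_to_rgb(BASE_HUE, BASE_SAT, intensity)

# Example: a +0.2 deg C inferred warming slightly brightens the face region.
print(face_rgb(+0.2))
```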
To verify the inference model developed through the simple regression analysis, it was tested on 10 subjects using the derived regression equation. The verification confirmed that the deviation between the actually measured facial temperature and the predicted facial temperature was on the order of 0.00001.
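A sketch of the least-squares fit and of the R² computation described above, using NumPy; the RRI and temperature series here are synthetic stand-ins, since the measured data are not reproduced in the text.

```python
import numpy as np

def fit_simple_regression(rri: np.ndarray, face_temp: np.ndarray):
    """Least-squares fit face_temp ~ B*rri + A; returns (B, A, R-squared)."""
    b, a = np.polyfit(rri, face_temp, deg=1)   # slope B, constant A
    pred = b * rri + a
    ss_res = np.sum((face_temp - pred) ** 2)
    ss_tot = np.sum((face_temp - face_temp.mean()) ** 2)
    return b, a, 1.0 - ss_res / ss_tot

# Synthetic stand-in data (invented relationship, for illustration only).
rng = np.random.default_rng(0)
rri = rng.uniform(600.0, 1000.0, size=200)               # RRI in ms
temp = -0.0004 * rri + 0.9 + rng.normal(0.0, 0.01, 200)  # delta-temperature
b, a, r2 = fit_simple_regression(rri, temp)
print(f"B = {b:.5f}, A = {a:.3f}, R^2 = {r2:.3f}")
```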
5. Presence evaluation
Subjects watched three types of human avatar (avatar A, avatar B, and avatar C) in a comfortable posture for 25 seconds each, presented at a distance of 60 cm on a 27-inch LED monitor (LG, resolution 1920 × 1080).
Here, avatar A is a human-avatar stimulus with no realism expression element applied; avatar B is a human-avatar stimulus with an arbitrary facial temperature change applied to the face region; and avatar C is a human-avatar stimulus with the cardiac-response-based facial temperature change expression element confirmed in the experiment of the present invention applied.
Based on the virtual-realism subjective evaluation model developed in a previous study, subjects reported the presence they felt on a 5-point scale after each stimulus was presented. The subjective evaluation items consisted of three factors: visual presence (VP; 3 items), visual immersion (VIm; 7 items), and visual interaction (VIn; 4 items); reverse-scored items (4 items) were included to secure the reliability of the subjective evaluation data.
Here, visual presence means the degree to which the virtual environment given to the user is perceived; visual immersion means the degree to which the virtual environment given to the user is rendered hyper-realistically; and visual interaction means the degree to which the user can interact with the form or content of the mediating environment through the virtual environment.
Table 1
Figure PCTKR2016007097-appb-T000001
Evaluation of the virtual realism of the cardiac-response-based facial temperature expression element confirmed that presence was higher when the facial temperature change inferred from the cardiac response, based on the regression model derived in this study, was applied to the human avatar (p < 0.001).
Table 2 and FIG. 9 below show the virtual-realism evaluation results for the cardiac-response-based facial temperature expression element.
Table 2
Figure PCTKR2016007097-appb-T000002
Exemplary embodiments have been described above and illustrated in the accompanying drawings to aid understanding of the present invention. It should be understood, however, that these embodiments merely illustrate the invention and do not limit it, and that the invention is not confined to what is shown and described, since various other modifications may occur to those skilled in the art.

Claims (7)

  1. A method for expressing the social presence of a virtual avatar, comprising synchronizing the facial temperature change of a virtual avatar correlated with a user to heartbeat changes obtained from physiological information acquired from the user.
  2. The method of claim 1, wherein the heartbeat changes are detected from ECG data obtained from the user.
  3. The method of claim 2, wherein a change in the user's facial temperature is inferred from the ECG data through regression analysis, and the facial temperature change of the virtual avatar is synchronized to the inferred change in the user's facial temperature.
  4. The method of claim 3, wherein RRIs (R-peak to R-peak intervals) are detected from the ECG data and the user's facial temperature change is detected through regression analysis of the RRI.
  5. The method of claim 4, wherein, on the face of the virtual avatar, the temperature of at least one of the forehead, the eye, the nose, and the face is expressed by the equation below.
    Figure PCTKR2016007097-appb-I000003
  6. A system for expressing the social presence of a virtual avatar, performing the method of any one of claims 1 to 5, the system comprising:
    a detection system that extracts physiological information from the user;
    an analysis system that detects the user's facial temperature change from the physiological information; and
    a display system that presents a virtual avatar correlated with the user, the virtual avatar having a face model that changes in correspondence with the user's facial temperature change from the analysis system.
  7. The system of claim 6, wherein the detection system detects ECG data.
PCT/KR2016/007097 2015-07-03 2016-07-01 Method for expressing social presence of virtual avatar by using facial temperature change according to heartbeats, and system employing same WO2017007179A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20150095045 2015-07-03
KR10-2015-0095045 2015-07-03
KR1020160082673A KR101848478B1 (en) 2015-07-03 2016-06-30 Method and system for social realistic expression of virtual-avatar
KR10-2016-0082673 2016-06-30

Publications (1)

Publication Number Publication Date
WO2017007179A1 true WO2017007179A1 (en) 2017-01-12

Family

ID=57685827

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/007097 WO2017007179A1 (en) 2015-07-03 2016-07-01 Method for expressing social presence of virtual avatar by using facial temperature change according to heartbeats, and system employing same

Country Status (1)

Country Link
WO (1) WO2017007179A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060032409A * 2004-10-12 2006-04-17 Samsung Electronics Co., Ltd. Method and apparatus of generating avata for representing state of health
KR20060037697A * 2004-10-28 2006-05-03 SK Telecom Co., Ltd. System for displaying health information and method thereof
KR20100019067A * 2008-08-08 2010-02-18 Softforum Co., Ltd. System for providing internet service using three-dimensional avatar and method thereof
KR20110121394A * 2010-04-30 2011-11-07 LG Electronics Inc. Apparatus for displaying image and method for operating the same
KR20130080442A * 2010-06-09 2013-07-12 Microsoft Corporation Real-time animation of facial expressions

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019023400A1 (en) * 2017-07-28 2019-01-31 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10796469B2 (en) 2017-07-28 2020-10-06 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10810780B2 (en) 2017-07-28 2020-10-20 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10818061B2 (en) 2017-07-28 2020-10-27 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10937219B2 (en) 2017-07-28 2021-03-02 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity

Similar Documents

Publication Publication Date Title
Mandryk et al. Physiological measures for game evaluation
Velana et al. The senseemotion database: A multimodal database for the development and systematic validation of an automatic pain-and emotion-recognition system
Sioni et al. Stress detection using physiological sensors
Slater et al. Analysis of physiological responses to a social situation in an immersive virtual environment
Dommer et al. Between-brain coherence during joint n-back task performance: a two-person functional near-infrared spectroscopy study
Kim et al. Characteristic changes in the physiological components of cybersickness
Kim et al. The application of biosignal feedback for reducing cybersickness from exposure to a virtual environment
Gruebler et al. Design of a wearable device for reading positive expressions from facial EMG signals
Tourangeau et al. The role of facial response in the experience of emotion.
Guterstam et al. The invisible hand illusion: multisensory integration leads to the embodiment of a discrete volume of empty space
Travis Autonomic and EEG patterns distinguish transcending from other experiences during Transcendental Meditation practice
Nakayama et al. Decrease in nasal temperature of rhesus monkeys (Macaca mulatta) in negative emotional state
Peasley-Miklus et al. Alexithymia predicts arousal-based processing deficits and discordance between emotion response systems during emotional imagery.
Wilhelm et al. Attend or defend? Sex differences in behavioral, autonomic, and respiratory response patterns to emotion–eliciting films
Masood et al. Modeling mental stress using a deep learning framework
Bigliassi et al. Cerebral effects of music during isometric exercise: An fMRI study
CN108135491B (en) Physiological state determination device and physiological state determination method
Bucchioni et al. Empathy or ownership? Evidence from corticospinal excitability modulation during pain observation
Shaffer et al. A guide to cleaner electrodermal activity measurements
Keshavarz et al. Detecting and predicting visually induced motion sickness with physiological measures in combination with machine learning techniques
KR101736402B1 (en) Method for enhancing the Social engagement of virtual Avatars through Pupillary Responses based on Heart Rate
WO2017007179A1 (en) Method for expressing social presence of virtual avatar by using facial temperature change according to heartbeats, and system employing same
Welch et al. An affect-sensitive social interaction paradigm utilizing virtual reality environments for autism intervention
Huang et al. Measurement of sensory deficiency in fine touch after stroke during textile fabric stimulation by electroencephalography (EEG)
Groenegress et al. The physiological mirror—a system for unconscious control of a virtual environment through physiological activity

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16821585

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16821585

Country of ref document: EP

Kind code of ref document: A1