KR101973409B1 - Terminal device - Google Patents


Info

Publication number
KR101973409B1
Authority
KR
South Korea
Prior art keywords
color
emotion
human body
information
image information
Prior art date
Application number
KR1020130074937A
Other languages
Korean (ko)
Other versions
KR20150001539A (en)
Inventor
신명진 (Shin Myung-jin)
Original Assignee
삼성전기주식회사 (Samsung Electro-Mechanics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electro-Mechanics Co., Ltd. (삼성전기주식회사)
Priority to KR1020130074937A
Publication of KR20150001539A
Application granted
Publication of KR101973409B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

A terminal device is disclosed. The terminal device according to an embodiment of the present invention includes an image information acquisition unit for obtaining image information; a controller configured to compare the image information with a reference image of the human body, recognize the human body, analyze the color of the human body, determine a user's emotion, and generate emotion information on the determined emotion; and an output unit for outputting the emotion information.

Description

Terminal device {TERMINAL DEVICE}

The present invention relates to a terminal device.

With the development of mobile communication technology, video call services that allow face-to-face conversation have become popular, moving beyond services limited to voice calls. In particular, as communication networks supporting video telephony and video conferencing systems have been established, a user may use a video call service with a counterpart through a mobile communication terminal whether stationary or mobile.

Recently, with the spread of video call services, research on providing various additional services to users by analyzing users' face images has been actively conducted. In particular, techniques for converting a user's face image into another image have been proposed.

Korean Laid-Open Patent Publication 10-2009-0125905

The present invention provides a terminal device that determines the emotional state of a user by analyzing the color of the human body from image information.

According to an aspect of the present invention, there is provided a terminal device including: an image information acquisition unit for obtaining image information; a controller configured to compare the image information with a reference image of the human body, recognize the human body, analyze the color of the human body, determine a user's emotion, and generate emotion information on the determined emotion; and an output unit configured to output the emotion information.

The controller may include: a human body recognizer configured to recognize the human body from the image information; a color analyzer configured to analyze the color of the human body from the image information; and an emotion determination unit configured to determine the emotional state of the user according to the analyzed color of the human body.

The color analyzer may divide the image information into a plurality of zones and detect a numerical color value for each zone.

The color analyzer may generate a color table of the image information by using the detected numerical value of the color.

The color analyzer may detect a color difference value for each part of the human body by comparing the color table of the image information with the color table of the human body reference image.

The color analyzer may amplify the detected difference by applying a preset multiple.

The emotion determination unit may determine the emotion of the user by comparing the difference value detected for each part of the human body with the emotion state reference information.

The emotional state reference information may include a color difference reference value for each part of the human body reflecting the emotional state of the user.

The output unit may include a display device for displaying the emotion information.

The output unit may output the emotion information in a video or audio format using the display device.

The output unit may include a communication device for transmitting the emotion information to the outside.

The apparatus may further include a storage configured to store the human body reference image.

According to an exemplary embodiment of the present invention, a terminal device for determining an emotional state of a user by analyzing a color of a human body from image information may be provided.

FIG. 1 is a view showing the configuration of a terminal device according to an embodiment of the present invention.
FIG. 2 is a view showing the configuration of a control unit according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating image information divided by a controller according to an exemplary embodiment of the present invention.
FIG. 4 is a view showing a color table generated by the control unit according to an embodiment of the present invention.
FIG. 5 is a view showing numerical values for each color analyzed by the control unit according to an embodiment of the present invention.

As the invention allows for various changes and numerous embodiments, particular embodiments will be illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the present invention to the specific embodiments; the invention should be understood to include all transformations, equivalents, and substitutes falling within its spirit and scope. In the following description, if a detailed description of related known technology is determined to possibly obscure the gist of the present invention, that detailed description will be omitted.

Terms such as first and second may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.

The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, the terms "comprise" or "have" are intended to indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and should not be understood as excluding the possibility of the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Hereinafter, an embodiment of a terminal device according to the present invention will be described in detail with reference to the accompanying drawings. In the following description with reference to the accompanying drawings, the same or corresponding components are given the same reference numerals, and duplicate descriptions thereof will be omitted.

FIG. 1 is a diagram illustrating a configuration of a terminal device according to an embodiment of the present invention. FIG. 2 is a diagram illustrating a configuration of a control unit according to an embodiment of the present invention. FIG. 3 is a diagram illustrating image information divided by a controller according to an exemplary embodiment of the present invention. FIG. 4 is a diagram illustrating a color table generated by a control unit according to an embodiment of the present invention. FIG. 5 is a diagram illustrating numerical values for each color analyzed by the control unit according to an embodiment of the present invention.

Referring to FIG. 1, a terminal device according to an embodiment of the present invention includes an image information acquisition unit 110, a control unit 120, a storage unit 130, and an output unit 140.

The image information acquisition unit 110 may acquire image information. Here, the image information acquisition unit 110 may include a photographing apparatus and use it to acquire image information of the user. Alternatively, the image information acquisition unit 110 may obtain image information of a user provided from an external device. In this case, the user may be a user of the terminal device or a user of an external device that transmits image information to the terminal device.

The controller 120 may determine the emotion of the user using the image information. In addition, the controller 120 may generate emotion information on the determined emotion. The controller 120 may include a human body recognizer 122, a color analyzer 124, and an emotion determiner 126.

The human body recognizer 122 may recognize the human body from the image information. Here, the human body recognizer 122 may recognize the user's human body by comparing the image information with the human body reference image stored in the storage 130. For example, the human body recognizer 122 may analyze the image information and recognize the human body by comparing the image patterns of regions where the user's face is represented, such as the eyes, nose, mouth, forehead, cheeks, and ears, with the human body reference image.
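The patent does not specify a recognition algorithm. As a minimal illustrative sketch (all function names are hypothetical, not from the patent), comparing image patterns against a stored reference patch could be done with a simple sum-of-squared-differences template search:

```python
import numpy as np

def find_region(image, patch):
    """Locate the best match for a reference patch (e.g. an eye or cheek
    template) in a grayscale image by sliding it over every position and
    scoring each window by sum of squared differences; lowest score wins."""
    ih, iw = image.shape
    ph, pw = patch.shape
    best_score, best_pos = float("inf"), (0, 0)
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            window = image[y:y + ph, x:x + pw].astype(float)
            score = float(((window - patch) ** 2).sum())
            if score < best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

A real implementation would more likely use normalized cross-correlation or a trained face detector; this sketch only illustrates the pattern-comparison idea described above.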

The color analyzer 124 may analyze the color of the human body from the image information. Here, the color analyzer 124 may analyze the color for each part of the human body recognized by the human body recognizer 122.

In detail, referring to FIG. 3, the color analyzer 124 may divide the image information into a plurality of zones. In addition, referring to FIG. 4, the color analyzer 124 may generate a color table by classifying the colors of each zone. Here, referring to FIG. 5, the color analyzer 124 may detect numerical color values by analyzing the red, green, and blue color information included in the image information. For example, the color analyzer 124 may analyze the color of the cheek to detect a numerical value of red.
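As an illustrative sketch of the zone division and color table described above (the patent does not prescribe an implementation; the function name and zone counts are assumptions), per-zone mean red, green, and blue values might be collected as follows:

```python
import numpy as np

def build_color_table(image, rows, cols):
    """Divide an H x W x 3 RGB image into rows x cols zones and record
    the mean red, green, and blue value of each zone as a color table."""
    h, w, _ = image.shape
    table = np.zeros((rows, cols, 3))
    for r in range(rows):
        for c in range(cols):
            # Integer zone boundaries; edge zones absorb any remainder.
            zone = image[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            table[r, c] = zone.reshape(-1, 3).mean(axis=0)
    return table
```

The red value of a cheek zone, for instance, would then be `table[r, c, 0]` for the zone covering the cheek.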

In addition, the color analyzer 124 may compare a color table for the image information (hereinafter, referred to as a first color table) with a color table for the human body reference image stored in the storage 130 (hereinafter, referred to as a second color table) to detect a color difference value for each part of the human body. For example, the color analyzer 124 may extract the numerical value of the color of the cheek from each of the first color table and the second color table and detect the color difference value by comparing the extracted values. Here, the color difference value may reflect a change in the user's emotion. In particular, since the human body reference image represents the user's normal state and the image information reflects the current emotional state, comparing the two captures the change of the human body from its normal state as an emotional change.

In addition, the color analyzer 124 may amplify the detected difference value by applying a preset multiple. Here, the color analyzer 124 may amplify the detected difference value so that the color change of the human body can be recognized more easily.
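A minimal sketch of the comparison and amplification steps, assuming the color tables are arrays of numerical color values (the function name and default gain are hypothetical):

```python
import numpy as np

def amplified_difference(first_table, second_table, gain=4.0):
    """Subtract the reference (second) color table from the live (first)
    color table and amplify the per-zone difference by a preset multiple,
    making small color shifts of the human body easier to detect."""
    return gain * (np.asarray(first_table, dtype=float)
                   - np.asarray(second_table, dtype=float))
```

With tables from the earlier zone division, the result is a per-zone, per-channel difference array whose magnitudes have been scaled by `gain`.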

The emotion determination unit 126 may determine the user's emotion by comparing the difference value detected for each part of the human body with the emotional state reference information. The emotional state reference information may include, for each part of the human body, a reference value of the color difference used to determine the emotional state from the color difference value. For example, the emotional state reference information may include a color difference reference value of the ear reflecting an angry state (hereinafter, referred to as a first color difference reference value) or a color difference reference value of the cheek reflecting an embarrassed state (hereinafter, referred to as a second color difference reference value). Accordingly, the emotion determination unit 126 may determine that the user is angry when the color difference value of the ear exceeds the first color difference reference value. Alternatively, the emotion determination unit 126 may determine that the user is embarrassed when the color difference value of the cheek exceeds the second color difference reference value.
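The threshold comparison against the emotional state reference information might be sketched as follows, with all part names, threshold values, and emotion labels hypothetical examples rather than values from the patent:

```python
def determine_emotion(diff_by_part, reference):
    """Compare per-part color difference values against per-part reference
    thresholds. reference maps a body part to a (threshold, emotion) pair;
    the first part whose difference exceeds its threshold decides the
    emotion, otherwise the state is reported as neutral."""
    for part, (threshold, emotion) in reference.items():
        if diff_by_part.get(part, 0.0) > threshold:
            return emotion
    return "neutral"
```

For example, a large amplified red shift on the ear would exceed the ear threshold and yield the angry state, mirroring the first/second color difference reference values described above.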

In addition, the emotion determination unit 126 may generate emotion information on the determined emotion.

The storage unit 130 may store the human body reference image, the second color table, or the emotional state reference information. In addition, the storage unit 130 may provide the human body reference image, the second color table, or the emotional state reference information at the request of the controller 120.

The output unit 140 may output emotion information. The output unit 140 may include a display device for displaying emotion information. The output unit 140 including the display device may output emotion information in a video or audio format.

Alternatively, the output unit 140 may include a communication device for transmitting the emotion information to the outside. The output unit 140 including the communication device may transmit the emotion information to an external device to convey the user's emotional state.

The terminal device according to an embodiment of the present invention may determine the emotional state of the user by analyzing the color of the human body from the image information. In addition, the terminal device according to an embodiment of the present invention may generate emotion information reflecting the determined emotion and output the emotion information. In this way, the terminal device according to an embodiment of the present invention can display the emotion information or transmit it to the outside to convey the emotional state of the user.

Although an embodiment of the present invention has been described above, those of ordinary skill in the art may variously modify and change the present invention by adding, changing, or deleting components without departing from the spirit of the present invention described in the claims, and such modifications and changes are also included within the scope of the present invention.

110: image information acquisition unit
120: control unit
122: human body recognition unit
124: color analysis unit
126: emotion determination unit
130: storage unit
140: output unit

Claims (12)

An image information obtaining unit obtaining image information;
A controller for comparing the image information with a reference image of a human body, recognizing a human body, analyzing a color of the human body to determine a user's emotion, and generating emotion information on the determined emotion; And
An output unit for outputting the emotion information; Including,
The controller compares a color table based on the image information with a color table based on the human body reference image to obtain a color difference value, and generates the emotion information by comparing the color difference value with emotion state reference information.
Terminal device.
The method of claim 1,
The control unit,
A human body recognition unit recognizing the human body from the image information;
A color analyzer analyzing the color of the human body from the image information; And
An emotion determination unit that determines an emotional state of the user according to the analyzed color of the human body;
Terminal device comprising a.
The method of claim 2,
And the color analyzer divides the image information into a plurality of zones and detects a numerical value of color for each zone.
The method of claim 3,
And the color analyzer generates the color table for the image information by using the detected numerical value of the color.
The method of claim 4,
And the color analyzer compares the color table of the image information with the color table of the human body reference image and detects a difference value of color for each part of the human body.
The method of claim 5,
The color analyzer amplifies the detected difference value by applying a preset multiple.
The method of claim 5,
And the emotion determining unit determines the user's emotion by comparing the difference value detected for each part of the human body with the emotion state reference information.
The method of claim 7,
The emotional state reference information includes a color difference reference value for each part of the human body reflecting the emotional state of the user.
The method of claim 1,
And the output unit includes a display device for displaying the emotion information.
The method of claim 9,
And the output unit outputs the emotion information in a video or audio format using the display device.
The method of claim 1,
And the output unit comprises a communication device for transmitting the emotion information to the outside.
The method of claim 1,
The terminal device further comprising a storage unit for storing the human body reference image.

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130074937A KR101973409B1 (en) 2013-06-27 2013-06-27 Terminal device


Publications (2)

Publication Number Publication Date
KR20150001539A 2015-01-06
KR101973409B1 2019-09-02

Family

ID=52475276

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130074937A KR101973409B1 (en) 2013-06-27 2013-06-27 Terminal device

Country Status (1)

Country Link
KR (1) KR101973409B1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008294724A (en) * 2007-05-24 2008-12-04 Panasonic Corp Image receiver

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100958595B1 (en) 2008-06-03 2010-05-18 이현주 apparatus and method for complexing image in visual communication terminal


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Emotion recognition of a user using 2D face images on a mobile robot, Korean Institute of Intelligent Systems*
Analysis of color changes in specific facial regions according to increased blood pressure, Korean Institute of Communication Sciences*

Also Published As

Publication number Publication date
KR20150001539A (en) 2015-01-06

Similar Documents

Publication Publication Date Title
JP7225631B2 (en) Image processing device, camera device, and image processing method
US11527242B2 (en) Lip-language identification method and apparatus, and augmented reality (AR) device and storage medium which identifies an object based on an azimuth angle associated with the AR field of view
US10084988B2 (en) Facial gesture recognition and video analysis tool
US10083710B2 (en) Voice control system, voice control method, and computer readable medium
US11069368B2 (en) Glasses with closed captioning, voice recognition, volume of speech detection, and translation capabilities
US20140129207A1 (en) Augmented Reality Language Translation
US20180077095A1 (en) Augmentation of Communications with Emotional Data
EP2925005A1 (en) Display apparatus and user interaction method thereof
KR20100062207A (en) Method and apparatus for providing animation effect on video telephony call
KR102317021B1 (en) Display apparatus and image correction method thereof
CN110418095B (en) Virtual scene processing method and device, electronic equipment and storage medium
CN107370981A (en) The information cuing method and device of personnel participating in the meeting in a kind of video conference
KR102193029B1 (en) Display apparatus and method for performing videotelephony using the same
KR102263154B1 (en) Smart mirror system and realization method for training facial sensibility expression
US20160277707A1 (en) Message transmission system, message transmission method, and program for wearable terminal
NO341316B1 (en) Method and system for associating an external device to a video conferencing session.
KR20210078863A (en) Server, method and computer program for providing avatar service
AU2013222959B2 (en) Method and apparatus for processing information of image including a face
KR20200076170A (en) Assistance system and method for a person who is visually impaired using smart glasses
TW201521415A (en) Communication device and incoming call management method thereof
KR20110035651A (en) Apparatus and method for mediating video phone call
JP4845183B2 (en) Remote dialogue method and apparatus
KR101973409B1 (en) Terminal device
WO2018186698A3 (en) Method, system, and non-transitory computer-readable recording medium for providing multi-point communication service
KR20140093459A (en) Method for automatic speech translation

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant