KR20150001539A - Terminal device - Google Patents

Terminal device

Info

Publication number
KR20150001539A
Authority
KR
South Korea
Prior art keywords
color
emotion
human body
information
image information
Prior art date
Application number
KR1020130074937A
Other languages
Korean (ko)
Other versions
KR101973409B1 (en)
Inventor
신명진
Original Assignee
삼성전기주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전기주식회사 filed Critical 삼성전기주식회사
Priority to KR1020130074937A priority Critical patent/KR101973409B1/en
Publication of KR20150001539A publication Critical patent/KR20150001539A/en
Application granted granted Critical
Publication of KR101973409B1 publication Critical patent/KR101973409B1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/14: Systems for two-way working

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is a terminal device. The terminal device according to an embodiment of the present invention comprises: an image information acquisition unit which acquires image information; a control unit which determines a user's emotion by comparing the image information with a human reference image to recognize a human body and analyze the color of the human body, and generates emotion information about the determined emotion; and an output unit which outputs the emotion information.

Description

TERMINAL DEVICE

The present invention relates to a terminal device.

With the development of mobile communication technology, video call services that let users talk face to face, rather than being limited to voice-only calls, have become popular. In particular, now that communication networks supporting video telephony and video conferencing systems are established, a user can hold a video call with the other party through a mobile communication terminal, whether stationary or on the move.

In recent years, studies have been actively conducted on providing various supplementary services to a user by analyzing the user's facial image while a video call service is in use. In particular, techniques for converting the user's face image into another image have been proposed.

Korean Patent Publication No. 10-2009-0125905

The present invention provides a terminal device for analyzing a color of a human body from image information and determining an emotional state of a user.

According to an aspect of the present invention, there is provided a terminal device including: an image information acquisition unit for acquiring image information; a control unit for recognizing a human body by comparing the image information with a human reference image, analyzing a color of the human body to determine an emotion of the user, and generating emotion information about the determined emotion; and an output unit for outputting the emotion information.

The control unit may include: a human body recognition unit for recognizing the human body from the image information; a color analysis unit for analyzing the color of the human body from the image information; and an emotion determination unit for determining the emotional state of the user according to the analyzed color of the human body.

The color analyzer may divide the image information into a plurality of regions, and may detect a numerical value of a color in each region.

The color analyzer may generate a color table for the image information using the detected numerical value of the color.

The color analyzer may compare a color table for the image information with a color table for the human reference image to detect a color difference value for each part of the human body.

The color analyzer may amplify the detected difference value by applying a preset multiple.

The emotion determination unit may determine the emotion of the user by comparing the difference value detected for each part of the human body with the emotion state reference information.

The emotion state reference information may include a color difference reference value for each part of the human body reflecting the emotion state of the user.

The output unit may include a display device for displaying the emotion information.

The output unit may output the emotion information in video or audio format using the display device.

The output unit may include a communication device for externally transmitting the emotion information.

The terminal device may further include a storage unit for storing the human reference image.

According to an embodiment of the present invention, a terminal device for analyzing a color of a human body from image information and determining an emotional state of a user can be provided.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram illustrating the configuration of a terminal device according to an embodiment of the present invention.
FIG. 2 is a block diagram of a control unit according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating the division of image information in a control unit according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating a color table generated by a control unit according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating numerical values of colors analyzed by a control unit according to an embodiment of the present invention.

The present invention is capable of various modifications and may be embodied in various forms, and specific embodiments are illustrated in the drawings and described in detail below. It should be understood, however, that the invention is not limited to these specific embodiments and includes all modifications, equivalents, and alternatives falling within its spirit and scope. Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this application, terms such as "comprises" or "having" are intended to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of a terminal device according to the present invention will be described in detail with reference to the accompanying drawings. In the following description, the same or corresponding components are denoted by the same reference numerals, and redundant description thereof is omitted.

FIG. 1 is a diagram illustrating the configuration of a terminal device according to an embodiment of the present invention. FIG. 2 is a block diagram of a control unit according to an embodiment of the present invention. FIG. 3 is a diagram illustrating the division of image information in a control unit according to an embodiment of the present invention. FIG. 4 is a diagram illustrating a color table generated by a control unit according to an embodiment of the present invention. FIG. 5 is a diagram illustrating numerical values of colors analyzed by a control unit according to an embodiment of the present invention.

Referring to FIG. 1, a terminal device according to an embodiment of the present invention includes an image information obtaining unit 110, a control unit 120, a storage unit 130, and an output unit 140.

The image information acquisition unit 110 may acquire image information. Here, the image information acquisition unit 110 may include a photographing apparatus and use it to acquire image information of the user. Alternatively, the image information acquisition unit 110 may acquire image information of a user provided from an external device. In this case, the user may be the user of the terminal device itself or the user of an external device that transmits image information to the terminal device.

The control unit 120 may determine the user's emotion using the image information and generate emotion information for the determined emotion. Here, the control unit 120 may include a human body recognition unit 122, a color analysis unit 124, and an emotion determination unit 126.

The human body recognition unit 122 may recognize the human body from the image information. Here, the human body recognition unit 122 may recognize the user's body by comparing the image information with the human reference image stored in the storage unit 130. For example, the human body recognition unit 122 may analyze the image information and compare the image patterns of regions where the user's face may appear, such as the eyes, nose, mouth, forehead, cheeks, and ears, with the human reference image to recognize the human body.
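As a rough illustration, the pattern comparison described above can be sketched as a simple image-similarity test. This is a minimal stand-in, assuming grayscale arrays and a mean-absolute-difference criterion; the patent does not specify the actual matching algorithm, and the threshold value here is an assumption.

```python
import numpy as np

def recognize_face(image: np.ndarray, reference: np.ndarray,
                   threshold: float = 30.0) -> bool:
    """Decide whether `image` matches the stored human reference image.

    Both inputs are HxW grayscale arrays. The patent only states that
    image patterns (eyes, nose, mouth, ...) are compared with a human
    reference image; a mean-absolute-difference test is one simple
    stand-in for that comparison.
    """
    if image.shape != reference.shape:
        raise ValueError("image and reference must have the same shape")
    # Average per-pixel intensity difference between image and reference.
    mad = np.mean(np.abs(image.astype(float) - reference.astype(float)))
    return bool(mad < threshold)
```

A real implementation would localize face parts first (e.g. with a feature detector) rather than compare whole frames.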

The color analyzer 124 may analyze the color of the human body from the image information. Here, the color analyzer 124 may analyze the colors of the human body recognized by the human body recognizer 122.

More specifically, referring to FIG. 3, the color analysis unit 124 may divide the image information into a plurality of regions. In addition, referring to FIG. 4, the color analysis unit 124 may generate a color table by separating colors according to region. Referring to FIG. 5, the color analysis unit 124 may analyze the red, green, and blue color information included in the image information and quantize each color as an 8-bit or 10-bit numerical value to detect the numerical value of the color. For example, the color analysis unit 124 may detect the numerical value of red by analyzing the color of the cheek.
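The region division and per-region color detection can be sketched as follows. The grid size and the use of the per-region mean as the detected numerical value are assumptions; the patent only states that a numerical value (8-bit or 10-bit) is detected per region.

```python
import numpy as np

def color_table(image: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Divide an HxWx3 RGB image into rows x cols regions and detect a
    numerical value (0-255 for 8-bit input) for each color channel in
    each region. The per-region mean is used as the detected value here,
    since the patent does not specify the exact statistic.

    Returns a (rows, cols, 3) color table.
    """
    h, w, _ = image.shape
    table = np.zeros((rows, cols, 3))
    for r in range(rows):
        for c in range(cols):
            # Slice out one grid cell of the image.
            region = image[r * h // rows:(r + 1) * h // rows,
                           c * w // cols:(c + 1) * w // cols]
            # Average R, G, B over all pixels in the cell.
            table[r, c] = region.mean(axis=(0, 1))
    return table
```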

The color analysis unit 124 may compare a color table for the image information (hereinafter, a first color table) with a color table for the human reference image stored in the storage unit 130 (hereinafter, a second color table) to detect a color difference value for each part of the human body. For example, the color analysis unit 124 may extract the numerical value of the cheek color from each of the first and second color tables and compare the extracted values to detect a color difference value. Here, the color difference value may reflect a change in the user's emotion. In particular, because the human reference image represents the user's neutral state while the image information reflects the current emotional state, the color difference value can capture how the state of the human body has changed from the neutral state due to the change in emotion.

Also, the color analysis unit 124 may amplify the detected difference value by applying a preset multiple, so that the color change of the human body can be recognized more easily.
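The table comparison and amplification steps can be sketched together. The gain value of 4 is an assumed example; the patent only says a preset multiple is applied.

```python
import numpy as np

def amplified_color_diff(current_table, reference_table,
                         gain: float = 4.0) -> np.ndarray:
    """Detect the color difference value for each region by subtracting
    the reference (neutral-state) color table from the current one, then
    amplify the difference by a preset multiple so that small skin-color
    changes become easier to recognize.
    """
    # Signed per-region, per-channel difference from the neutral state.
    diff = (np.asarray(current_table, dtype=float)
            - np.asarray(reference_table, dtype=float))
    return gain * diff
```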

The emotion determination unit 126 may determine the emotion of the user by comparing the difference value detected for each part of the human body with emotion state reference information. Here, the emotion state reference information may include a color difference reference value for each part of the human body used to determine the emotional state. For example, the emotion state reference information may include a color difference reference value for the ear reflecting an angry state (hereinafter, a first color difference reference value) or a color difference reference value for the cheek reflecting an ashamed state (hereinafter, a second color difference reference value). Accordingly, the emotion determination unit 126 may determine that the user is angry when the difference value of the ear color exceeds the first color difference reference value. Alternatively, the emotion determination unit 126 may determine that the user is ashamed when the color difference value of the cheek exceeds the second color difference reference value.
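The threshold comparison above can be sketched as a small lookup. The part names, threshold values, and emotion labels below are illustrative assumptions, not values given in the patent.

```python
def judge_emotion(diff_by_part: dict, reference: dict) -> str:
    """Compare the (amplified) color difference detected for each body
    part with emotion-state reference values, as the emotion
    determination unit does.

    `diff_by_part` maps a body part to its detected red-channel
    difference; `reference` maps a body part to a (threshold, emotion)
    pair, e.g. {"ear": (22.0, "angry"), "cheek": (18.0, "ashamed")}.
    """
    for part, (threshold, emotion) in reference.items():
        # An exceeded reference value indicates the associated emotion.
        if diff_by_part.get(part, 0.0) > threshold:
            return emotion
    return "neutral"
```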

In addition, the emotion determination unit 126 may generate emotion information for the emotion that has been determined.

The storage unit 130 may store the human reference image, the second color table, and the emotion state reference information. The storage unit 130 may also provide the human reference image, the second color table, or the emotion state reference information at the request of the control unit 120.

The output unit 140 can output emotion information. Here, the output unit 140 may include a display device for displaying emotion information. The output unit 140 including the display device can output emotion information in video or audio format.

Alternatively, the output unit 140 may include a communication device for externally transmitting the emotion information. The output unit 140 including the communication device can convey the user's emotional state by transmitting the emotion information externally.

The terminal device according to an embodiment of the present invention can determine the emotional state of the user by analyzing the color of the human body from image information. It can also generate emotion information reflecting the determined emotion and output that information. Accordingly, the terminal device can display the emotion information or transmit it externally to convey the user's emotional state.

It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention as set forth in the appended claims, and such modifications and variations also fall within the scope of the present invention.

110: image information acquisition unit
120: control unit
122: human body recognition unit
124: color analysis unit
126: emotion determination unit
130: storage unit
140: output unit

Claims (12)

1. A terminal device comprising:
an image information acquisition unit for acquiring image information;
a control unit for recognizing a human body by comparing the image information with a human reference image, analyzing a color of the human body to determine an emotion of a user, and generating emotion information about the determined emotion; and
an output unit for outputting the emotion information.
2. The terminal device according to claim 1, wherein the control unit comprises:
a human body recognition unit for recognizing the human body from the image information;
a color analysis unit for analyzing the color of the human body from the image information; and
an emotion determination unit for determining an emotional state of the user according to the analyzed color of the human body.
3. The terminal device according to claim 2, wherein the color analysis unit divides the image information into a plurality of regions and detects a numerical value of a color for each region.
4. The terminal device according to claim 3, wherein the color analysis unit generates a color table for the image information using the detected numerical values of the color.
5. The terminal device according to claim 4, wherein the color analysis unit compares the color table for the image information with a color table for the human reference image to detect a color difference value for each part of the human body.
6. The terminal device according to claim 5, wherein the color analysis unit amplifies the detected difference value by applying a preset multiple.
7. The terminal device according to claim 2, wherein the emotion determination unit determines the emotion of the user by comparing the difference value detected for each part of the human body with emotion state reference information.
8. The terminal device according to claim 7, wherein the emotion state reference information includes a color difference reference value for each part of the human body reflecting the emotional state of the user.
9. The terminal device according to claim 1, wherein the output unit comprises a display device for displaying the emotion information.
10. The terminal device according to claim 9, wherein the output unit outputs the emotion information in a video or audio format using the display device.
11. The terminal device according to claim 1, wherein the output unit comprises a communication device for externally transmitting the emotion information.
12. The terminal device according to claim 1, further comprising a storage unit for storing the human reference image.
KR1020130074937A 2013-06-27 2013-06-27 Terminal device KR101973409B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130074937A KR101973409B1 (en) 2013-06-27 2013-06-27 Terminal device


Publications (2)

Publication Number Publication Date
KR20150001539A true KR20150001539A (en) 2015-01-06
KR101973409B1 (en) 2019-09-02

Family

ID=52475276

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130074937A KR101973409B1 (en) 2013-06-27 2013-06-27 Terminal device

Country Status (1)

Country Link
KR (1) KR101973409B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008294724A (en) * 2007-05-24 2008-12-04 Panasonic Corp Image receiver
KR20090125905A (en) 2008-06-03 2009-12-08 이현주 Apparatus and method for complexing image in visual communication terminal


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Emotion Recognition of a User Using 2D Face Images on a Mobile Robot, Korean Institute of Intelligent Systems *
Analysis of Color Changes in Specific Facial Regions According to Increased Blood Pressure, Korean Institute of Communication Sciences *

Also Published As

Publication number Publication date
KR101973409B1 (en) 2019-09-02

Similar Documents

Publication Publication Date Title
US11114091B2 (en) Method and system for processing audio communications over a network
KR101533065B1 (en) Method and apparatus for providing animation effect on video telephony call
US11527242B2 (en) Lip-language identification method and apparatus, and augmented reality (AR) device and storage medium which identifies an object based on an azimuth angle associated with the AR field of view
US8957943B2 (en) Gaze direction adjustment for video calls and meetings
RU2578210C1 (en) Method and device for correcting skin colour
KR102317021B1 (en) Display apparatus and image correction method thereof
US20140129207A1 (en) Augmented Reality Language Translation
KR101501183B1 (en) Two Mode AGC for Single and Multiple Speakers
US10373648B2 (en) Apparatus and method for editing content
EP2925005A1 (en) Display apparatus and user interaction method thereof
US20110274311A1 (en) Sign language recognition system and method
KR102193029B1 (en) Display apparatus and method for performing videotelephony using the same
WO2017166598A1 (en) Sound channel adjusting method and apparatus for headphones, electronic device, and storage medium
NO341316B1 (en) Method and system for associating an external device to a video conferencing session.
KR102263154B1 (en) Smart mirror system and realization method for training facial sensibility expression
US20150181161A1 (en) Information Processing Method And Information Processing Apparatus
TW201521415A (en) Communication device and incoming call management method thereof
AU2013222959B2 (en) Method and apparatus for processing information of image including a face
US20190273901A1 (en) Selectively applying color to an image
KR20100041061A (en) Video telephony method magnifying the speaker's face and terminal using thereof
US10796106B2 (en) Apparatus and method for selecting speaker by using smart glasses
KR101973409B1 (en) Terminal device
JP2014149571A (en) Content search device
US10798337B2 (en) Communication device, communication system, and non-transitory computer readable medium storing program
CN111182256A (en) Information processing method and server

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant