WO2014119900A1 - Smart device having user interface based on human emotions or inclinations, and user interface method - Google Patents

Smart device having user interface based on human emotions or inclinations, and user interface method Download PDF

Info

Publication number
WO2014119900A1
WO2014119900A1 PCT/KR2014/000792 KR2014000792W
Authority
WO
WIPO (PCT)
Prior art keywords
user
smart device
user interface
information
inclinations
Prior art date
Application number
PCT/KR2014/000792
Other languages
French (fr)
Korean (ko)
Inventor
Lee Jong Sik (이종식)
Original Assignee
Lee Jong Sik
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020130012809A external-priority patent/KR20140096935A/en
Application filed by Lee Jong Sik filed Critical Lee Jong Sik
Priority to CN201480006469.4A priority Critical patent/CN104969563A/en
Publication of WO2014119900A1 publication Critical patent/WO2014119900A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42201Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user

Definitions

  • The present invention relates to a user interface device and a user interface method and, more particularly, to a smart device and a user interface method having a user interface based on human emotions or dispositions.
  • A smart TV is a multi-functional TV that combines an Internet connection with a TV, so that users can install various applications and use functions such as web surfing, VOD viewing, social networking services (SNS), and games.
  • The biggest feature of a smart TV is that the user and the TV can exchange information with each other. This is the biggest difference from existing TVs, which only transmit information unilaterally.
  • Among the technologies proposed for exchanging information between the user and the TV, voice recognition is representative; whereas users conventionally relied on a remote controller to control the TV, smart TVs gradually acquired user-friendly interfaces such as voice recognition.
  • The present invention solves the above problems of the prior art; its purpose is to provide a smart device and a user interface method that can provide content suitable for the user by reflecting the user's disposition or the current emotional state of the user of the smart device.
  • The present invention acquires the user's face image information through a camera or the like, generates information on the user's emotion or propensity from the acquired image, and outputs the appropriate screen or sound according to the generated information.
  • The present invention has the effect of providing the user with content suited to the user's disposition or current emotional state.
  • FIG. 1 is a block diagram showing an embodiment of a smart device capable of recognizing a user's emotions/dispositions according to the present invention.
  • FIG. 2 is a flowchart illustrating an interface method based on the user's emotion, using a smart device according to the present invention.
  • FIG. 3 is a flowchart illustrating an interface method based on the user's disposition, using a smart device according to the present invention.
  • According to an aspect of the present invention, a smart device includes: a camera unit that acquires face image information of a user; a recognition unit that generates information on at least one of the user's emotion or propensity using an image acquired through the camera unit; a control unit that controls the camera unit and the recognition unit and selects which screen or sound to output based on the information transmitted from the recognition unit; and an output unit that outputs the screen or sound according to the selection of the control unit.
  • According to another aspect, a user interface method of a smart device includes: recognizing at least one of the user's emotional state or the user's disposition using face image information acquired through a camera unit provided in the smart device; and performing any one of content provision, channel change, application execution, or advertisement provision corresponding to the recognized result.
  • FIG. 1 is a block diagram showing an embodiment of a smart device capable of recognizing a user's emotions/dispositions according to the present invention.
  • a smart device 10 capable of recognizing a user's emotion / inclination includes a camera unit 11, a recognition unit 12, a control unit 13, and an output unit 14.
  • In the present invention, the smart device 10 is an electronic device capable of running content or application programs under the user's control, and includes a smart TV, a smart phone, a smart advertisement display device, a notebook computer, and a robot capable of play/education functions.
  • the camera unit 11 included in the smart device 10 according to the present invention acquires face image information of the current user in real time.
  • the camera unit 11 may be mounted on the front surface of the smart device 10 so that the face image information of the user who uses the smart device 10 may be easily obtained.
  • the face image information of the user acquired through the camera unit 11 is transmitted to the recognition unit 12.
  • The recognition unit 12 analyzes the user's face image acquired through the camera unit 11 to derive the user's current emotion or disposition. For example, the emotions people generally have can be divided into five categories: expressionless, joy, sadness, anger, and surprise. In addition, a person's disposition can be classified as introverted, extroverted, or a mixture of the two.
  • The recognition unit 12 analyzes the user's face image information transmitted through the camera unit 11 to determine the user's current emotional state. That is, the recognition unit 12 determines, from the user's facial expression transmitted through the camera unit 11, whether the user is currently expressionless, happy, sad, angry, or surprised.
  • Meanwhile, the recognition unit 12 determines whether the user is an introverted person, an extroverted person, or a person with a mixture of the two dispositions, based on how the user's face changes from the expressionless state to the happy, sad, angry, or surprised states.
  • The user's emotion/disposition information, determined in real time by the recognition unit 12, is transmitted to the control unit 13.
  • Based on the information transmitted from the recognition unit, the control unit 13 selects the content to present to the user, the application program to run, or the advertisement to show.
  • For example, when the user's current state is recognized as happy, the control unit 13 may control a comedy program or entertainment content to be output through the output unit 14.
  • When the user's current state is recognized as angry or surprised, the control unit 13 may control the output unit 14 to output calming music, a drama, or an advertisement that can settle the mind.
  • When the user's current state is recognized as sad, the control unit 13 may control the output unit 14 to output a touching movie, a game application, or an advertisement.
  • When the user is recognized as introverted, the control unit 13 may control the output unit 14 to output content such as touching movies or music and/or advertisements suited to the user's disposition (for example, an advertisement for a classical performance).
  • When the user is recognized as extroverted, the control unit 13 may control the output unit 14 to output content such as sports and/or advertisements suited to the user's disposition (for example, a sports advertisement).
  • the output unit 14 outputs a screen and / or sound under the control of the controller 13.
  • FIG. 2 is a flowchart illustrating an interface method according to an emotion of a user using a smart device according to the present invention.
  • When the user appears in front of the smart device 10 in order to use it, the smart device recognizes the user's appearance (S21) and displays a question asking the user whether to turn the smart device 10 on (S22). For example, it may display a question such as "Would you like to turn on the TV?" When the user commands the smart device 10 to turn on, using voice or a remote controller, the smart device 10 starts operating (S23).
  • Once the smart device 10 is operating, the recognition unit 12 analyzes the user's face image information transmitted through the camera unit 11 to determine the user's current emotional state (S24). According to the user's emotional state, the smart device 10 provides a list of content judged to be helpful to the user, or provides a channel list or application list (S25). At this time, together with the list or separately, it may also provide an advertisement for a service or product judged to be helpful to the user's emotional state.
  • Alternatively, the smart device 10 may, instead of providing a list to the user, directly select the needed content, channel, or application and output it to the user.
  • When the user selects one of the recommended content/channels/applications (S26), the smart device 10 executes the recommended content, sets the channel, or runs the application (S27).
  • FIG. 3 is a flowchart illustrating an interface method according to a user's disposition using a smart device according to the present invention.
  • While the smart device 10 operates and presents content to the user (S31), the smart device 10 acquires information on the user's changes in facial expression in response to the presented content (S32).
  • The smart device 10 identifies the user's disposition on the basis of the information on the user's facial expression changes (S33). For example, it may determine that the user's disposition is extroverted, introverted, or a mixture of the two.
  • The user disposition information determined in this way is stored in the memory of the smart device 10 (S34).
  • The smart device 10 recommends content/channels/applications suitable for the user by using the stored disposition information and/or the user's current emotional state information acquired in real time (S35).
  • At this time, it may provide an advertisement for a service or product corresponding to the user's disposition and/or emotional state information.
  • When the user selects one of the recommended content/channels/applications (S36), the smart device outputs the selected content, sets the channel, or runs the application (S37).
  • The present invention can be used for interfaces of electronic devices.

Abstract

The invention relates to a smart device having a user interface based on human emotions or dispositions, and to a user interface method. It has the effect of providing content to the user, changing channels, running application programs, and providing advertisements in light of the user's emotion or disposition.

Description

Smart device having a user interface based on human emotions or dispositions, and user interface method
The present invention relates to a user interface device and a user interface method and, more particularly, to a smart device and a user interface method having a user interface based on human emotions or dispositions.
Since its first appearance in the late 19th century, television has continuously evolved in display technology and design, and has been firmly established as the most popular information delivery device since the late 20th century. Conventional televisions, however, had the drawback that viewers had to passively accept the one-way information transmitted by broadcasting stations.
Smart TVs emerged to solve this problem of one-way communication. A smart TV is a multi-functional TV that combines an Internet connection with a TV, so that users can install various applications and use functions such as web surfing, VOD viewing, social networking services (SNS), and games.
The biggest feature of a smart TV is that the user and the TV can exchange information with each other. This is the biggest difference from existing TVs, which only transmit information unilaterally.
With the rise of smart TVs, technologies enabling the user and the TV to exchange information have been proposed; voice recognition is representative among them. That is, whereas users conventionally relied on a remote controller to control the TV (for example, to change channels), TVs gradually acquired user-friendly interfaces such as voice recognition.
However, technologies such as voice recognition alone cannot provide the content a user needs according to the user's disposition or emotion.
The present invention solves the above problems of the prior art; its purpose is to provide a smart device and a user interface method that can provide content suitable for the user by reflecting the user's disposition or the current emotional state of the user of the smart device.
The present invention acquires the user's face image information through a camera or the like, generates information on the user's emotion or propensity from the acquired image, and outputs the appropriate screen or sound according to the generated information.
The present invention has the effect of providing the user with content suited to the user's disposition or current emotional state.
FIG. 1 is a block diagram showing an embodiment of a smart device capable of recognizing a user's emotions/dispositions according to the present invention.
FIG. 2 is a flowchart illustrating an interface method based on the user's emotion, using a smart device according to the present invention.
FIG. 3 is a flowchart illustrating an interface method based on the user's disposition, using a smart device according to the present invention.
According to an aspect of the present invention, a smart device includes: a camera unit that acquires face image information of a user; a recognition unit that generates information on at least one of the user's emotion or propensity using an image acquired through the camera unit; a control unit that controls the camera unit and the recognition unit and selects which screen or sound to output based on the information transmitted from the recognition unit; and an output unit that outputs the screen or sound according to the selection of the control unit.
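The four-unit architecture described above (camera, recognition, control, output) can be wired together as in the following minimal Python sketch. This is not the patent's implementation; every class, method, and return value is an illustrative stand-in for the real subsystems.

```python
from dataclasses import dataclass

# Hypothetical sketch of the four units: camera unit -> recognition unit
# -> control unit -> output unit, mirroring the block diagram of FIG. 1.

@dataclass
class Frame:
    """A stand-in for a face image captured by the camera unit."""
    pixels: bytes

class CameraUnit:
    def capture(self) -> Frame:
        # A real implementation would grab a frame from a front-facing camera.
        return Frame(pixels=b"\x00" * 16)

class RecognitionUnit:
    def analyze(self, frame: Frame) -> dict:
        # A real implementation would run facial-expression recognition here;
        # we return a fixed result purely for illustration.
        return {"emotion": "joy", "disposition": "extroverted"}

class ControlUnit:
    def select_output(self, info: dict) -> str:
        # Choose which screen/sound to output based on recognition info.
        if info["emotion"] == "joy":
            return "comedy_program"
        return "calming_music"

class OutputUnit:
    def render(self, selection: str) -> str:
        return f"now playing: {selection}"

# Wire the units together in the order the recognition data flows.
camera, recog, ctrl, out = CameraUnit(), RecognitionUnit(), ControlUnit(), OutputUnit()
result = out.render(ctrl.select_output(recog.analyze(camera.capture())))
print(result)  # now playing: comedy_program
```

The point of the sketch is only the data flow: image information moves from the camera unit to the recognition unit, the derived emotion/disposition information moves to the control unit, and the control unit's selection drives the output unit.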
According to another aspect of the present invention, a user interface method of a smart device includes: recognizing at least one of the user's emotional state or the user's disposition using face image information acquired through a camera unit provided in the smart device; and performing any one of content provision, channel change, application execution, or advertisement provision corresponding to the recognized result.
FIG. 1 is a block diagram showing an embodiment of a smart device capable of recognizing a user's emotions/dispositions according to the present invention.
Referring to FIG. 1, a smart device 10 capable of recognizing a user's emotion/disposition includes a camera unit 11, a recognition unit 12, a control unit 13, and an output unit 14.
In the present invention, the smart device 10 is an electronic device capable of running content or application programs under the user's control, and includes a smart TV, a smart phone, a smart advertisement display device, a notebook computer, and a robot capable of play/education functions.
The camera unit 11 provided in the smart device 10 according to the present invention acquires the current user's face image information in real time. The camera unit 11 is preferably mounted on the front of the smart device 10 so that it can easily acquire the face image information of the user using the smart device 10.
The user's face image information acquired through the camera unit 11 is transmitted to the recognition unit 12. The recognition unit 12 analyzes the user's face image acquired through the camera unit 11 to derive the user's current emotion or disposition. For example, the emotions people generally have can be divided into five categories: expressionless, joy, sadness, anger, and surprise. In addition, a person's disposition can be classified as introverted, extroverted, or a mixture of the two.
The recognition unit 12 analyzes the user's face image information transmitted through the camera unit 11 to determine the user's current emotional state. That is, the recognition unit 12 determines, from the user's facial expression transmitted through the camera unit 11, whether the user is currently expressionless, happy, sad, angry, or surprised.
Meanwhile, the recognition unit 12 determines whether the user is an introverted person, an extroverted person, or a person with a mixture of the two dispositions, based on how the user's face changes from the expressionless state to the happy, sad, angry, or surprised states.
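The disposition inference just described (watching how far the face departs from the expressionless baseline) could be sketched as follows. The patent does not specify any algorithm or thresholds; the scoring scheme and the cut-off values below are invented purely for illustration.

```python
# The five emotion categories named in the text.
EMOTIONS = ["expressionless", "joy", "sadness", "anger", "surprise"]

def classify_disposition(expression_changes: list[float]) -> str:
    """Infer disposition from per-frame deviation scores relative to the
    neutral (expressionless) face: 0.0 = no change, 1.0 = maximal change.
    The thresholds are hypothetical, not from the patent."""
    if not expression_changes:
        return "unknown"
    avg = sum(expression_changes) / len(expression_changes)
    # Invented rule: strong average reactions -> extroverted,
    # muted reactions -> introverted, in between -> mixed.
    if avg > 0.6:
        return "extroverted"
    if avg < 0.3:
        return "introverted"
    return "mixed"

print(classify_disposition([0.8, 0.7, 0.9]))  # strong reactions -> extroverted
print(classify_disposition([0.1, 0.2, 0.1]))  # muted reactions -> introverted
print(classify_disposition([0.4, 0.5]))       # in between -> mixed
```

Any real recognition unit would of course derive these deviation scores from an actual facial-expression model rather than receive them directly.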
The user's emotion/disposition information, determined in real time by the recognition unit 12, is transmitted to the control unit 13. Based on the information transmitted from the recognition unit, the control unit 13 selects the content to present to the user, the application program to run, or the advertisement to show.
For example, when the user's current state is recognized as happy, the control unit 13 may control a comedy program or entertainment content to be output through the output unit 14. When the user's current state is recognized as angry or surprised, the control unit 13 may control the output unit 14 to output calming music, a drama, or an advertisement that can settle the mind. When the user's current state is recognized as sad, the control unit 13 may control the output unit 14 to output a touching movie, a game application, or an advertisement.
In addition, when the user is recognized as introverted, the control unit 13 may control the output unit 14 to output content such as touching movies or music and/or advertisements suited to the user's disposition (for example, an advertisement for a classical performance). When the user is recognized as extroverted, the control unit 13 may control the output unit 14 to output content such as sports and/or advertisements suited to the user's disposition (for example, a sports advertisement).
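The control unit's selection rules above amount to a lookup from emotion/disposition to candidate content. The mapping below follows the examples given in the description, but the table structure and function are an illustrative sketch, not the patent's actual logic.

```python
from typing import Optional

# Candidate content per emotional state, following the description's examples.
CONTENT_BY_EMOTION = {
    "joy": ["comedy program", "entertainment content"],
    "anger": ["calming music", "drama", "advertisement"],
    "surprise": ["calming music", "drama", "advertisement"],
    "sadness": ["touching movie", "game application", "advertisement"],
}

# Candidate content per disposition, also following the description.
CONTENT_BY_DISPOSITION = {
    "introverted": ["touching movie", "music", "classical-performance ad"],
    "extroverted": ["sports content", "sports ad"],
}

def recommend(emotion: Optional[str] = None,
              disposition: Optional[str] = None) -> list:
    """Combine candidates from both tables; fall back to a default list."""
    picks = []
    picks += CONTENT_BY_EMOTION.get(emotion, [])
    picks += CONTENT_BY_DISPOSITION.get(disposition, [])
    return picks or ["default channel list"]

print(recommend(emotion="joy"))
print(recommend(disposition="extroverted"))
print(recommend())
```

A deployed device could replace these static tables with learned preferences, but the example keeps to exactly the pairings the text enumerates.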
The output unit 14 outputs a screen and/or sound under the control of the control unit 13.
FIG. 2 is a flowchart illustrating an interface method based on the user's emotion, using a smart device according to the present invention.
When the user appears in front of the smart device 10 in order to use it, the smart device recognizes the user's appearance (S21) and displays a question asking the user whether to turn the smart device 10 on (S22). For example, it may display a question such as "Would you like to turn on the TV?" When the user commands the smart device 10 to turn on, using voice or a remote controller, the smart device 10 starts operating (S23).
Once the smart device 10 is operating, the recognition unit 12 analyzes the user's face image information transmitted through the camera unit 11 to determine the user's current emotional state (S24). According to the user's emotional state, the smart device 10 provides a list of content judged to be helpful to the user, or provides a channel list or application list (S25). At this time, together with the list or separately, it may also provide an advertisement for a service or product judged to be helpful to the user's emotional state.
Alternatively, the smart device 10 may, instead of providing a list to the user, directly select the needed content, channel, or application and output it to the user.
When the user selects one of the content/channels/applications recommended by the smart device 10 (S26), the smart device 10 executes the recommended content, sets the channel, or runs the application (S27).
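The S21-S27 flow of FIG. 2 can be sketched as one function that walks the steps in order. All function parameters and the recommendation table are hypothetical stand-ins for the device's sensors, display, and recognition unit.

```python
def run_emotion_flow(user_present: bool, power_on_reply: str,
                     detected_emotion: str, user_choice: int) -> str:
    """Illustrative walk through S21-S27; not the patent's implementation."""
    if not user_present:                        # S21: recognize user appearance
        return "idle"
    # S22: display "Would you like to turn on the TV?"
    if power_on_reply != "yes":
        return "off"
    # S23: device starts operating; S24: recognize the current emotion
    recommendations = {                         # S25: build a recommended list
        "sadness": ["touching movie", "game app"],
        "joy": ["comedy program"],
    }.get(detected_emotion, ["channel list"])
    # S26: the user picks an item from the recommended list
    selection = recommendations[user_choice % len(recommendations)]
    return f"playing {selection}"               # S27: execute the selection

print(run_emotion_flow(True, "yes", "sadness", 0))  # playing touching movie
```

Each comment labels the step it corresponds to in the flowchart, so the early returns mirror the flow terminating when no user appears or the user declines to power on.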
FIG. 3 is a flowchart illustrating an interface method based on the user's disposition, using a smart device according to the present invention.
While the smart device 10 operates and presents content to the user (S31), the smart device 10 acquires information on the user's changes in facial expression in response to the presented content (S32).
The smart device 10 identifies the user's disposition on the basis of the information on the user's facial expression changes (S33). For example, it may determine that the user's disposition is extroverted, introverted, or a mixture of the two.
The user disposition information determined in this way is stored in the memory of the smart device 10 (S34). The smart device 10 recommends content/channels/applications suitable for the user by using the stored disposition information and/or the user's current emotional state information acquired in real time (S35). At this time, it may provide an advertisement for a service or product corresponding to the user's disposition and/or emotional state information.
When the user selects one of the recommended content/channels/applications (S36), the smart device outputs the selected content, sets the channel, or runs the application (S37).
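The distinctive part of the FIG. 3 flow is that the inferred disposition is persisted (S34) and later combined with the real-time emotion when recommending (S35). A minimal sketch of that storage-and-reuse pattern, with invented class names and recommendation rules:

```python
class DeviceMemory:
    """Stand-in for the smart device's memory in step S34."""
    def __init__(self):
        self._store = {}

    def save(self, user_id: str, disposition: str) -> None:
        self._store[user_id] = disposition      # persist inferred disposition

    def load(self, user_id: str) -> str:
        return self._store.get(user_id, "unknown")

def recommend_for(memory: DeviceMemory, user_id: str, emotion: str) -> str:
    """S35: combine stored disposition with the real-time emotional state.
    The rules here are hypothetical examples, not the patent's logic."""
    disposition = memory.load(user_id)
    if disposition == "extroverted":
        return "sports channel"
    if disposition == "introverted" and emotion == "sadness":
        return "touching movie"
    return "general channel list"

mem = DeviceMemory()
mem.save("user-1", "introverted")               # disposition inferred in S33
print(recommend_for(mem, "user-1", "sadness"))  # touching movie
print(recommend_for(mem, "user-2", "joy"))      # general channel list
```

Keeping disposition in persistent memory while emotion stays real-time matches the split the flowcharts draw between FIG. 3 (slow-changing disposition) and FIG. 2 (momentary emotion).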
The present invention can be used for interfaces of electronic devices.

Claims (5)

  1. A smart device, comprising:
    a camera unit that acquires face image information of a user;
    a recognition unit that generates information on at least one of the user's emotion or propensity using the image acquired through the camera unit;
    a control unit that controls the camera unit and the recognition unit and selects which screen or sound to output based on the information transmitted from the recognition unit; and
    an output unit that outputs a screen or sound according to the selection of the control unit,
    the smart device thereby having a user interface based on human emotions or dispositions.
  2. The smart device of claim 1, wherein the screen or sound results from execution of any one of content, a channel, or an application program that can be provided through the smart device.
  3. The smart device of claim 2, wherein the information on the user's emotion is any one of expressionless, joy, sadness, anger, and surprise.
  4. The smart device of claim 2, wherein the information on the user's disposition is any one of an extroverted disposition, an introverted disposition, and a mixed disposition.
  5. A user interface method of a smart device, comprising:
    recognizing at least one of the user's emotional state or the user's disposition using face image information of the user acquired through a camera unit provided in the smart device; and
    performing any one of content provision, channel change, application execution, or advertisement provision corresponding to the recognized result.
PCT/KR2014/000792 2013-01-29 2014-01-28 Smart device having user interface based on human emotions or inclinations, and user interface method WO2014119900A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201480006469.4A CN104969563A (en) 2013-01-29 2014-01-28 Smart device having user interface based on human emotions or inclinations, and user interface method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20130009568 2013-01-29
KR10-2013-0009568 2013-01-29
KR10-2013-0012809 2013-02-05
KR1020130012809A KR20140096935A (en) 2013-01-29 2013-02-05 Smart device having a user interface based on human emotion or tendency and user interface method

Publications (1)

Publication Number Publication Date
WO2014119900A1 true WO2014119900A1 (en) 2014-08-07

Family

ID=51262557

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/000792 WO2014119900A1 (en) 2013-01-29 2014-01-28 Smart device having user interface based on human emotions or inclinations, and user interface method

Country Status (1)

Country Link
WO (1) WO2014119900A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003060956A (en) * 2001-08-20 2003-02-28 Fuji Photo Film Co Ltd Digital camera
KR20050025552A * 2004-01-17 2005-03-14 Healthpia Co., Ltd. Digital cellular phone
KR20060099182A * 2005-03-10 2006-09-19 LG Electronics Inc. Digital TV having a program encouragement function and method of controlling the same
KR20110100061A * 2010-03-03 2011-09-09 LG Electronics Inc. Apparatus for displaying image and method for operating the same

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106610990A (en) * 2015-10-22 2017-05-03 北京国双科技有限公司 Emotional tendency analysis method and apparatus
CN106610990B (en) * 2015-10-22 2020-12-29 北京国双科技有限公司 Method and device for analyzing emotional tendency
CN110908340A (en) * 2018-09-14 2020-03-24 珠海格力电器股份有限公司 Smart home control method and device
WO2020135334A1 (en) * 2018-12-28 2020-07-02 深圳Tcl数字技术有限公司 Television application theme switching method, television, readable storage medium, and device

Similar Documents

Publication Publication Date Title
CN111372109B (en) Intelligent television and information interaction method
CN111787375B (en) Display device and information display method
WO2021031809A1 (en) Interface display method and display device
US11924513B2 (en) Display apparatus and method for display user interface
CN111510788B (en) Display method and display device for double-screen double-system screen switching animation
CN112383802B (en) Focus switching method, projection display device and system
Saleme et al. Coping with the challenges of delivering multiple sensorial media
US20170289633A1 (en) Information processing device
CN109302631B (en) Video interface display method and device
WO2014119900A1 (en) Smart device having user interface based on human emotions or inclinations, and user interface method
CN111464840B (en) Display device and method for adjusting screen brightness of display device
WO2020248697A1 (en) Display device and video communication data processing method
US20220291752A1 (en) Distributed Application Platform Projected on a Secondary Display for Entertainment, Gaming and Learning with Intelligent Gesture Interactions and Complex Input Composition for Control
KR20150013394A (en) Smart device having a user interface based on human emotion or tendency and user interface method
KR20140096935A (en) Smart device having a user interface based on human emotion or tendency and user interface method
WO2020248682A1 (en) Display device and virtual scene generation method
CN113012647B (en) Display device and backlight light source control method
CN112073803B (en) Sound reproduction method and display device
CN113015023A (en) Method and device for controlling video in HTML5 webpage
CN112927653A (en) Display device and backlight brightness control method
CN112883144A (en) Information interaction method
CN112786036B (en) Display device and content display method
CN112752159B (en) Interaction method and related device
CN113630633B (en) Display device and interaction control method
von Bruhn Hinné et al. Using motion-sensing remote controls with older adults

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14745374

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 12/11/2015)

122 Ep: pct application non-entry in european phase

Ref document number: 14745374

Country of ref document: EP

Kind code of ref document: A1