KR20160139481A - User terminal apparatus and control method thereof - Google Patents

User terminal apparatus and control method thereof

Info

Publication number
KR20160139481A
Authority
KR
South Korea
Prior art keywords
display device
user terminal
input unit
touch screen
gui
Prior art date
Application number
KR1020150074277A
Other languages
Korean (ko)
Inventor
고나영
방준호
진-크리스토프 나우어
이관민
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사
Priority to KR1020150074277A
Publication of KR20160139481A

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/04842Selection of a displayed object
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42208Display device provided on the remote control
    • H04N21/42209Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224Touch pad or touch panel provided on the remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry
    • H04N5/4403User interfaces for controlling a television receiver or set top box [STB] through a remote control device, e.g. graphical user interfaces [GUI]; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry
    • H04N5/4403User interfaces for controlling a television receiver or set top box [STB] through a remote control device, e.g. graphical user interfaces [GUI]; Remote control devices therefor
    • H04N2005/4405Hardware details of remote control devices
    • H04N2005/4408Display
    • H04N2005/441Display for the display of non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry
    • H04N5/4403User interfaces for controlling a television receiver or set top box [STB] through a remote control device, e.g. graphical user interfaces [GUI]; Remote control devices therefor
    • H04N2005/4405Hardware details of remote control devices
    • H04N2005/443Touch pad or touch panel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q2209/00Arrangements in telecontrol or telemetry systems
    • H04Q2209/40Arrangements in telecontrol or telemetry systems using a wireless architecture

Abstract

A user terminal device is disclosed. The user terminal device for controlling a display device includes a communication unit that communicates with the display device, a first input unit that is provided on one side of the remote control device and receives a user command for controlling basic functions of the display device, a second input unit that is provided on the other side of the remote control device and displays a UI (User Interface) through a touch screen, and a processor that provides information corresponding to a context of the user terminal device through the touch screen.

Description

USER TERMINAL APPARATUS AND CONTROL METHOD THEREOF

The present invention relates to a user terminal and a control method thereof, and more particularly, to a user terminal having a remote control function and a control method thereof.

Various types of display devices are being developed due to the development of electronic technology. In particular, display devices such as TVs, PCs, laptop computers, tablet PCs, mobile phones, MP3 players and the like are widely used in most households.

In recent years, efforts have been made to develop display devices in new forms in order to meet the needs of users who want newer and more varied functions.

As part of this effort, remote control devices including a UI for quickly accessing the various contents provided by a display device have been developed and are used in various fields.

However, this type of user terminal device has not been sufficient to meet the varied needs of users who want quick access to a vast range of contents, including web-based contents and social contents.

SUMMARY OF THE INVENTION: The present invention has been made in view of the above-mentioned needs, and an object of the present invention is to provide a user terminal device that provides a basic control UI on one side and additional information suitable for the situation on the other side, and a control method thereof.

According to another aspect of the present invention, there is provided a user terminal device for controlling a display device, the device comprising: a communication unit for performing communication with the display device; a first input unit provided on one side of the remote control device for receiving a user command for controlling a basic function of the display device; a second input unit provided on the other side of the remote control device for displaying a UI (User Interface) through a touch screen; and a processor for providing information corresponding to a context of the user terminal device through the touch screen.

Here, the first input unit may include a touch screen including a basic UI for controlling a basic function of the display device.

In addition, the first input unit may include a PUI (Physical User Interface) including at least one physical button for controlling a basic function of the display device.

In addition, the context of the user terminal device may include at least one of a situation in which a specific menu is selected, a situation in which a specific signal is received from the display device, and a situation in which the user terminal device is flipped.

In addition, the processor may control activation of at least one of the first input unit and the second input unit based on a context in which the user terminal device is flipped.

In addition, the processor may provide information corresponding to the context of the display device through the touch screen when a signal corresponding to the context of the display device is received.

The processor may display additional information on the content through the touch screen when the display device is displaying content, or may display a UI for inputting a character on the touch screen when the display device is in a situation for receiving a character.

In addition, the processor may provide a UI screen including at least one GUI for instantly reproducing at least one content according to a predetermined event, and GUIs of a predetermined type may be sequentially arranged on the UI screen based on the usage frequency of the at least one content.

If the user operation for selecting the GUI is a short press input, the processor may directly reproduce the content corresponding to the selected GUI, and if the user operation for selecting the GUI is a long press input, the processor may provide a menu related to the content corresponding to the selected GUI.

The processor may scroll the UI screen in a predetermined direction according to a preset touch interaction, and may additionally display, on the touch screen, at least one GUI that is not displayed on the UI screen according to a predetermined event.

According to another aspect of the present invention, there is provided a method of controlling a user terminal device for controlling a display device, the user terminal device including a communication unit for performing communication with the display device, a first input unit provided on one side of the remote control device for receiving a user command for controlling a basic function of the display device, and a second input unit provided on the other side of the remote control device for displaying a UI through a touch screen, the method including determining a context of the user terminal device and providing information corresponding to the context through the touch screen.

Here, the first input unit may include a touch screen including a basic UI for controlling a basic function of the display device.

In addition, the first input unit may include a PUI including at least one physical button for controlling a basic function of the display device.

In addition, the context of the user terminal may include at least one of a situation in which a specific menu is selected, a situation in which a specific signal is received in the display device, and a situation in which the user terminal is flipped.

The method may further include controlling an activation state of at least one of the first input unit and the second input unit based on a context in which the user terminal device is flipped.

In addition, the step of providing through the touch screen may provide information corresponding to the context of the display device when a signal corresponding to the context of the display device is received.

The step of providing through the touch screen may include displaying additional information on the content through the touch screen when the display device is displaying content, or displaying a UI for inputting a character when the display device is in a situation for receiving a character.

In addition, the step of providing through the touch screen may provide a UI screen including at least one GUI for instantly reproducing at least one content according to a predetermined event, and GUIs of a predetermined type may be sequentially arranged on the UI screen based on the usage frequency of the at least one content.

The method may further include immediately reproducing the content corresponding to the selected GUI if the user operation for selecting the GUI is a short press input, and providing a menu related to the content corresponding to the selected GUI if the user operation for selecting the GUI is a long press input.

The method may further include scrolling the UI screen in a predetermined direction according to a predetermined touch interaction and additionally displaying, on the touch screen, at least one GUI not displayed on the UI screen according to a predetermined event.

As described above, according to the present invention, the basic control UI is provided on one side and the additional information suitable for the situation is provided on the other side, so that the user can quickly access desired contents, thereby improving convenience for the user.

FIG. 1 is a view for explaining an embodiment of a user terminal device according to an embodiment of the present invention.
FIGS. 2A and 2B are block diagrams showing a configuration of a user terminal device for controlling a display device according to various embodiments of the present invention.
FIGS. 3A to 3F are views for explaining the structure of a user terminal device according to an embodiment of the present invention.
FIGS. 4A and 4B illustrate a structure of a first input unit of a user terminal device according to an exemplary embodiment of the present invention.
FIGS. 5A and 5B illustrate operations of a user terminal device according to an exemplary embodiment of the present invention.
FIGS. 6A and 6B illustrate operations of a user terminal device according to an exemplary embodiment of the present invention.
FIGS. 7A to 7C are diagrams for explaining operations of a user terminal device according to an embodiment of the present invention.
FIG. 8 is a diagram for explaining the operation of a user terminal device according to an embodiment of the present invention.
FIGS. 9A to 9C are diagrams for explaining operations of a user terminal device according to an embodiment of the present invention.
FIG. 10 is a flowchart illustrating a method of controlling a user terminal device according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a view for explaining a display system according to an embodiment of the present invention.

Referring to FIG. 1, a display system according to an embodiment of the present invention includes a user terminal device 100 and a display device 200.

The user terminal device 100 can be implemented in various forms such as a mobile phone, a PMP, a PDA, and a notebook computer.

Specifically, the user terminal device 100 may be implemented as a touch-based portable terminal capable of displaying a UI screen and controlling the displayed UI screen through touch interaction. In this case, the user terminal device 100 may be implemented with a touch screen. Accordingly, the user terminal device 100 has a built-in touch screen and can be implemented to execute a program using a finger or a pen (for example, a stylus pen). Also, the user terminal device 100 provides a UI (User Interface) screen for controlling the display device 200 on the touch screen, and transmits a signal corresponding to the user touch operation input through the UI screen to the display device 200. To this end, the user terminal device 100 may be implemented to include a touch sensor for receiving various types of user commands or an optical joystick (OJ) sensor applying optical technology.

In some cases, the user terminal device 100 may be implemented in various forms, such as detecting movement of the user terminal device 100 and transmitting a signal corresponding to the motion, or recognizing a voice and transmitting a signal corresponding to the recognized voice. To this end, it may be implemented to further include a motion sensor, a microphone, a physical button (for example, a tact switch), and the like.

The display device 200 may be implemented as a digital TV as shown in FIG. 1, but the present invention is not limited thereto. For example, the display device 200 may be implemented as various types of devices with display capabilities, such as a PC (personal computer), a navigation device, a kiosk, and a digital information display. In some cases, the display device 200 may be implemented as an apparatus that does not have a display function, as long as it is controllable by the user terminal device 100.

Meanwhile, the user terminal device 100 according to the present invention may include a user interface on both sides, so as to provide a basic control UI for the display device 200 on one side and additional information suitable for the situation on the other side. Hereinafter, various embodiments of the present invention will be described in detail with reference to the drawings.

FIGS. 2A and 2B are block diagrams showing a configuration of a user terminal device 100 for controlling a display device 200 according to various embodiments of the present invention.

Referring to FIG. 2A, a user terminal 100 includes a communication unit 110, a first input unit 120, a second input unit 130, and a processor 140.

The communication unit 110 performs communication with the display device 200 (see FIG. 1).

Here, the communication unit 110 can communicate with the display device 200 or an external server (not shown) through various communication methods such as BT (Bluetooth), WI-FI (Wireless Fidelity), Zigbee, IR (Infrared), Serial Interface, and USB (Universal Serial Bus).

Specifically, when a preset event occurs, the communication unit 110 can perform communication with the display device 200 according to a predetermined communication method and enter an interlocked state. Here, interlocking may mean any state in which communication between the user terminal device 100 and the display device 200 becomes ready, such as initialization of communication, formation of a network, and device pairing. For example, the device identification information of the user terminal device 100 is provided to the display device 200, and a pairing procedure between the two devices can be performed accordingly. As another example, when a predetermined event is generated in the user terminal device 100, a peripheral device may be searched for through DLNA (Digital Living Network Alliance) technology, and the device may be paired with the found device to enter the interlocked state.

Here, the preset event may be generated in at least one of the user terminal device 100 and the display device 200. For example, a user command for selecting the display device 200 as the controlled device in the user terminal device 100 may be input, or the power of the display device 200 may be turned on.
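The interlocking flow described above (preset event, discovery, pairing) can be summarized in a short sketch. The following Kotlin fragment is an illustrative approximation only; the CommunicationChannel and DeviceInfo types, and the choice to pair with the first discovered display, are assumptions and not part of the disclosed apparatus.

```kotlin
// Minimal sketch of the "interlock on preset event" flow described above.
// CommunicationChannel and DeviceInfo are hypothetical stand-ins, not APIs named in the patent.
data class DeviceInfo(val id: String, val name: String)

interface CommunicationChannel {
    fun discoverDisplays(): List<DeviceInfo>          // e.g. a DLNA-style peripheral search
    fun pair(device: DeviceInfo, ownId: String): Boolean
}

class CommunicationUnit(private val channel: CommunicationChannel, private val ownId: String) {
    var pairedDisplay: DeviceInfo? = null
        private set

    // Called when a preset event occurs (display selected as the controlled device,
    // display power turned on, etc.).
    fun onPresetEvent() {
        if (pairedDisplay != null) return             // already interlocked
        val display = channel.discoverDisplays().firstOrNull() ?: return
        if (channel.pair(display, ownId)) {           // exchange identification info
            pairedDisplay = display                   // now in the "interlocked" state
        }
    }
}
```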

The first input unit 120 is provided on one side of the remote control apparatus 100 and receives a user command for controlling the basic functions of the display apparatus 200.

The first input unit 120 may be implemented as a touch screen including a basic UI for controlling the basic functions of the display device 200, or as a PUI (Physical User Interface) including at least one physical button for controlling the basic functions of the display device 200.

Here, the basic UI for controlling the basic functions of the display device 200 may include at least one of a channel up/down button, a volume control button, and an info (information) button for providing predetermined information.

According to one embodiment, the corresponding UI may be implemented in a PUI form including at least one physical button among a channel up/down button, a volume adjustment button, and an info button. Alternatively, the corresponding UI may be implemented in a GUI form including at least one of a channel up/down button, a volume adjustment button, and an info button. In this case, the first input unit 120 may be implemented as a touch screen that displays the corresponding GUIs and receives the user's touch input for the corresponding GUIs.

In the above-described embodiment, the basic UI for controlling the basic functions of the display device 200 includes a channel up/down button, a volume control button, and an info button. However, this is merely an example, and the present invention is not limited thereto; any button related to a basic function of the display device 200 may be included.

The second input unit 130 is provided on the other side of the remote control device 100 and displays a UI (User Interface) through a touch screen.

Specifically, the second input unit 130 may provide, on the touch screen, a menu screen for selecting various functions available in the display device 200, a UI screen for selecting various modes, and the like. Here, the UI screen may include various content playback screens for channels as well as images, moving pictures, texts, music, and the like, an application execution screen including various contents, a web browser screen, a GUI (Graphic User Interface) screen, and the like.

The touch screen provided in the first input unit 120 and the second input unit 130 may be implemented as a liquid crystal display panel (LCD), an organic light emitting diode (OLED) display, or the like, but is not limited thereto. In some cases, the touch screen provided in the first input unit 120 and the second input unit 130 may be implemented as a flexible display, a transparent display, or the like.

The processor 140 controls the overall operation of the user terminal device 100.

The processor 140 provides information corresponding to the context of the user terminal device 100 through the touch screen provided in the second input unit 130.

Here, the context of the user terminal device 100 may be a situation in which a specific menu or a specific button is selected in the user terminal device 100, a situation in which a specific signal (for example, a signal from the display device 200) is received, or a situation in which the user terminal device 100 is flipped. Specifically, in a situation where a specific menu or a specific button is selected, the processor 140 may display, on the touch screen provided in the second input unit 130, a detailed description related to the content being reproduced on the display device 200 when the info button is selected in the user terminal device 100, or may output on the second input unit 130 a QWERTY keyboard corresponding to the QWERTY keyboard output on the display device 200 when a corresponding menu is selected.

When a specific signal is received from outside the user terminal device 100, the processor 140 can output a corresponding UI on the touch screen provided in the second input unit 130 based on status information on the context of the display device 200 received through the communication unit 110. Here, the context of the display device 200 refers to a situation in which control is required, and may mean various states and situations such as a function provided by the display device 200, a type of content provided, and a screen provided.

Specifically, when the processor 140 receives status information indicating that the display device 200 is in at least one of a broadcast viewing mode for viewing a real-time broadcast channel, a content playback mode for playing VOD content, a menu providing mode, a game mode, and a web mode, the processor 140 can provide a UI corresponding to that mode on the touch screen of the second input unit 130.

In addition, even within a specific mode, when the processor 140 receives status information indicating that a detailed function provided in that mode is being executed, the processor 140 can provide a UI corresponding to the detailed function on the touch screen. For example, when a signal indicating that the display device 200 is in a volume adjustment state or in a state requiring volume adjustment is received in the broadcast viewing mode, the processor 140 can provide a corresponding UI on the touch screen provided in the second input unit 130. For example, when status information indicating that the display device 200 is in the mute state is received, the processor 140 may provide a UI for volume adjustment on the touch screen of the second input unit 130.
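As an illustration of how such status information might be mapped to a UI on the second input unit, consider the following Kotlin sketch. The DisplayStatus values and the TouchScreen interface are hypothetical stand-ins, and the UI names in the mapping are invented placeholders; only the idea of choosing a UI per received status comes from the description.

```kotlin
// Sketch: map status information received from the display device to a UI
// shown on the second input unit's touch screen. All names here are assumptions.
enum class DisplayStatus { BROADCAST_VIEWING, VOD_PLAYBACK, MENU, GAME, WEB, VOLUME_ADJUST, MUTED }

interface TouchScreen {
    fun show(uiName: String)
}

class SecondInputController(private val touchScreen: TouchScreen) {
    fun onStatusReceived(status: DisplayStatus) {
        val ui = when (status) {
            DisplayStatus.BROADCAST_VIEWING -> "broadcast viewing UI"
            DisplayStatus.VOD_PLAYBACK      -> "content playback UI"
            DisplayStatus.MENU              -> "menu navigation UI"
            DisplayStatus.GAME              -> "game mode UI"
            DisplayStatus.WEB               -> "web mode UI"
            DisplayStatus.VOLUME_ADJUST,
            DisplayStatus.MUTED             -> "volume adjustment UI"
        }
        touchScreen.show(ui)   // provide the UI corresponding to the received status
    }
}
```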

On the other hand, the processor 140 may receive information on the UI screen corresponding to the state of the display device 200, control information corresponding to the UI information, and the like from an external server (not shown) and provide them. For example, when an SNS screen is provided by the user terminal device 100 according to a user command, the corresponding information may be received from an external server (not shown). In this case, the external server (not shown) may be connected to the Internet or the like via a network to update information related to the user terminal device 100 and the display device 200. For example, device driver information, control information, UI information, and the like can be updated.

The processor 140 may control the activation state of at least one of the first input unit 120 and the second input unit 130 based on a context in which the user terminal device 100 is flipped.

For example, when it is determined that the device is flipped so that the first input unit 120 enters the user's field of view, the processor 140 can turn on the touch screen provided in the first input unit 120 and turn off the touch screen provided in the second input unit 130. On the contrary, when it is determined that the device is flipped so that the second input unit 130 enters the user's field of view, the touch screen provided in the second input unit 130 can be turned on and the touch screen provided in the first input unit 120 can be turned off.

Specifically, the processor 140 recognizes that a flip operation has been performed when it is determined that the grip state is changed so as to grip the surface on which the second input unit 130 is provided while the surface provided with the first input unit 120 is being gripped, or when it is determined that the grip state is changed so as to grip the surface on which the first input unit 120 is provided while the surface provided with the second input unit 130 is being gripped. This grip operation can be recognized through various sensors.

For example, the processor 140 may recognize that a grip operation occurs when a user touch is detected through a touch sensor provided on at least one of both sides and front and rear surfaces of the user terminal device 100.

On the other hand, the processor 140 can recognize that there is a flip operation when at least one of the rotation and the tilt is detected through at least one of the gyro sensor and the acceleration sensor provided in the user terminal device 100.

In addition, the processor 140 recognizes the direction in which the user is located through the camera sensor and recognizes that there is a flip operation.
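A minimal sketch of the flip handling, assuming a hypothetical Screen interface and a simple tilt sign test in place of the gyro/acceleration, grip, or camera-based recognition described above:

```kotlin
// Sketch: whichever input unit is judged to face the user is turned on, the other off.
// The Screen interface and the zero tilt threshold are illustrative assumptions.
interface Screen {
    fun turnOn()
    fun turnOff()
}

class FlipController(private val firstInputScreen: Screen, private val secondInputScreen: Screen) {
    // zTilt > 0 is taken here to mean the first input unit faces the user
    // (e.g. derived from a gyro/acceleration reading or a grip/touch sensor).
    fun onOrientationChanged(zTilt: Float) {
        if (zTilt > 0f) {
            firstInputScreen.turnOn()
            secondInputScreen.turnOff()
        } else {
            secondInputScreen.turnOn()
            firstInputScreen.turnOff()
        }
    }
}
```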

As described above, different UIs are provided through the first input unit 120 and the second input unit 130, thereby improving the user's convenience. For example, when the user is leaning back on a couch or bed in a comfortable position (lean back), the user mainly watches the content being played on the display device 200 for a long time. In this case, since the user only needs to operate the display device 200 in a simple manner, the user can easily control the display device 200 through the first input unit 120 as described above. In addition, when the user wants to control a detailed function of the display device 200, the user terminal device 100 can simply be flipped so that the display device 200 can be conveniently controlled through the various UIs provided on the touch screen of the second input unit 130. That is, it is possible to provide a suitable UI according to the status of the display device 200, the status of the user terminal device 100, the control state of the user, and the like.

FIG. 2B is a block diagram illustrating a detailed configuration of a user terminal device according to another embodiment of the present invention. Referring to FIG. 2B, the user terminal device 100' includes a communication unit 110, a first input unit 120, a second input unit 130, a processor 140, a storage unit 150, and a sensing unit 160. Among the components shown in FIG. 2B, those that overlap with the components shown in FIG. 2A will not be described in detail.

The processor 140 generally controls the operation of the user terminal device 100' using various programs stored in the storage unit 150.

Specifically, the processor 140 includes a RAM 141, a ROM 142, a main CPU 143, a graphics processing unit 144, first to n-th interfaces 145-1 to 145-n, and a bus 146.

The RAM 141, the ROM 142, the main CPU 143, the graphics processing unit 144, the first to n interfaces 145-1 to 145-n, etc. may be connected to each other via the bus 146.

The first to n interfaces 145-1 to 145-n are connected to the various components described above. One of the interfaces may be a network interface connected to an external device via a network.

The main CPU 143 accesses the storage unit 150 and performs booting using the O/S stored in the storage unit 150. Then, the main CPU 143 performs various operations using various programs, contents, data, and the like stored in the storage unit 150.

The ROM 142 stores a command set for booting the system and the like. When a turn-on command is input and power is supplied, the main CPU 143 copies the O/S stored in the storage unit 150 to the RAM 141 according to the instructions stored in the ROM 142, executes the O/S, and boots the system. When booting is completed, the main CPU 143 copies the various application programs stored in the storage unit 150 to the RAM 141, executes the application programs copied to the RAM 141, and performs various operations.

The graphic processing unit 144 generates a screen including various objects such as an icon, an image, and a text using an operation unit (not shown) and a rendering unit (not shown). The operation unit (not shown) calculates an attribute value such as a coordinate value, a shape, a size, and a color to be displayed by each object according to the layout of the screen based on the received control command. The rendering unit (not shown) creates screens of various layouts including the objects based on the attribute values calculated by the operation unit (not shown). The screen generated in the rendering unit (not shown) is displayed in the display area of the first input unit 120 and the second input unit 130.

Meanwhile, the operation of the processor 140 may be performed by a program stored in the storage unit 150.

The storage unit 150 may store various data such as an O / S software module for driving the user terminal device 100 'and various multimedia contents.

In particular, the storage unit 150 may store data for constructing various UI screens provided in the display areas of the first input unit 120 and the second input unit 130 according to an embodiment of the present invention.

In addition, the storage unit 150 may store data for generating a control signal corresponding to a user command input through various UI screens.

The sensing unit 160 includes a touch sensor, a geomagnetic sensor, a gyro sensor, an acceleration sensor, a proximity sensor, a grip sensor, and the like. In addition to the touch described above, the sensing unit 160 can sense various operations such as rotation, tilt, pressure, approach, grip, and the like. For example, the grip sensor may be disposed on the rear surface, the rim, and the grip portion separately from the touch sensor provided on the touch screen of the user terminal device 100 'to sense the user's grip. The grip sensor may be implemented as a pressure sensor in addition to the touch sensor.

In addition, the user terminal device 100' may further include an audio processing unit (not shown) for processing audio data, a video processing unit (not shown) for processing video data, a speaker (not shown) for outputting various notification sounds or voice messages as well as various audio data, and a microphone (not shown) for receiving the user's voice or other sounds and converting them into audio data.

FIGS. 3A to 3F are views for explaining a structure of the user terminal device 100 according to an embodiment of the present invention.

FIGS. 3A and 3B are views for explaining the structure of the first input unit 310 of the user terminal device 100 according to an embodiment of the present invention.

According to one embodiment, the first input unit 310 may include a physical user interface (PUI) including at least one physical button for controlling the basic functions of the display device 200. For example, the first input unit 310 may include only physical buttons as shown in FIG. 3A. Referring to FIG. 3A, the buttons may be implemented as channel up/down buttons 311 and 312 and an info button 313, but are not limited thereto.

According to another embodiment, the first input unit 310 may include a touch screen 314 that includes a basic UI for controlling the basic functions of the display device 200.

FIGS. 4A and 4B are views illustrating a case where the user terminal device 100 according to an embodiment of the present invention is provided with a touch screen in the first input unit.

Referring to FIG. 4A, a basic UI for controlling the basic functions of the display device 200 may be provided on the touch screen 411 provided in the first input unit 410. For example, the basic UI may include a GUI 411-1 for turning the display device 200 on and off, channel up/down GUIs 411-2 and 411-3, and volume control GUIs 411-4 and 411-5, but the present invention is not limited thereto.

According to another embodiment, as shown in FIG. 4B, the first input unit 410 may additionally include physical buttons 412-414 as well as a touch screen 415 providing a basic UI. For example, physical buttons such as a button for turning the display device 200 on/off, a channel up/down button, and a volume control button may be additionally provided.

FIGS. 3C and 3D are views for explaining the structure of the second input unit 320 of the user terminal device 100 according to an embodiment of the present invention.

According to one embodiment, the second input unit 320 may include only the touch screen 315 as shown in FIG. 3C; according to another embodiment, it may include at least one physical button 316 together with a touch screen 318, as shown in FIG. 3D.

FIGS. 3E and 3F are views for explaining a structure of one side and an upper side of the user terminal device 100 according to an embodiment of the present invention.

According to one embodiment, at least one button may be provided on one side of the user terminal device 100 as shown in FIG. 3E. For example, three buttons including a volume control button and a mute button may be provided, but the present invention is not limited thereto.

According to one embodiment, as shown in FIG. 3F, at least one button, for example, an on-off button 322 may be provided on the upper side of the user terminal device 100, but the present invention is not limited thereto.

FIGS. 5A and 5B illustrate operations of the user terminal device 100 according to an embodiment of the present invention.

According to an embodiment of the present invention, when the info button 513 included in the first input unit 510 is pressed as shown in FIGS. 5A and 5B, information on the content being reproduced on the display device 200 may be output to the touch screen 523 provided in the second input unit 520. For example, when the info button 513 provided on the first input unit 510 is pressed, the user terminal device 100 requests detailed information about the content currently displayed on the display device 200, and when the detailed information corresponding to the content is received according to the request, a UI screen based on the received detailed information can be displayed. As another example, when the info button 513 provided on the first input unit 510 is pressed, the user terminal device 100 requests identification information about the content currently displayed on the display device 200, and when the identification information is received, it may receive the detailed information of the content corresponding to the received identification information from an external server (not shown) and provide it.

FIGS. 6A and 6B illustrate operations of the user terminal device 100 according to an embodiment of the present invention.

According to an embodiment of the present invention, the processor 140 may provide a UI screen including at least one GUI for directly reproducing at least one content according to a preset event. Here, the preset event may be an event in which the user terminal device 100 is rotated or accelerated, an event in which a predetermined area of the touch screen provided in the user terminal device 100 is touched, or an event in which a preset button provided in the user terminal device 100 is pressed, but the present invention is not limited thereto.

For example, as shown in FIG. 6A, in response to an event in which a preset button 612 provided in the user terminal device 100 is pressed, a UI screen as shown in FIG. 6B can be provided on the touch screen 613.

Specifically, the processor 140 may provide a UI screen in which GUIs of predetermined types are sequentially arranged based on the stored content usage history. Here, the content usage history may include the usage frequency of at least one content, and the processor 140 may provide a UI screen in which GUIs of predetermined types corresponding to the respective contents are sequentially arranged based on the usage frequency of the contents.

For example, as shown in FIG. 6B, a UI screen in which circular GUIs are sequentially arranged in an order determined based on the frequency of use of contents can be output on the touch screen 613 provided in the second input unit 610. In this case, the processor 140 can arrange the contents sequentially from the most frequently used to the least frequently used. The shape of the GUI is shown as circular, but it is not limited thereto and may be a polygon such as a triangle or a rectangle, or an ellipse.

As in the above-described embodiment, GUIs for frequently used contents can be provided on the UI screen based on the user's frequency of content use, allowing the user to search for contents conveniently.
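The ordering rule itself (most frequently used content first) can be expressed compactly. The following Kotlin sketch uses hypothetical ContentItem and UsageRecord types; only the descending-frequency sort reflects the description.

```kotlin
// Sketch: arrange content GUIs by usage frequency, most used first (cf. FIG. 6B).
data class ContentItem(val id: String, val title: String)
data class UsageRecord(val content: ContentItem, val useCount: Int)

// Returns the contents in the order their GUIs should appear on the UI screen.
fun arrangeByFrequency(history: List<UsageRecord>): List<ContentItem> =
    history.sortedByDescending { it.useCount }.map { it.content }

fun main() {
    val history = listOf(
        UsageRecord(ContentItem("1", "Sports"), 12),
        UsageRecord(ContentItem("2", "News"), 30),
        UsageRecord(ContentItem("3", "Movies"), 7),
    )
    println(arrangeByFrequency(history).map { it.title })  // [News, Sports, Movies]
}
```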

FIGS. 7A to 7C are diagrams for explaining operations of the user terminal device 100 according to an embodiment of the present invention.

According to one embodiment of the present invention, the processor 140 may perform different functions according to different user operations on the GUI provided on the UI screen shown in FIG. 6A.

Specifically, when the user operation for selecting a specific GUI is a short press input, the content corresponding to the selected GUI is immediately reproduced, and when the user operation for selecting a specific GUI is a long press input, a menu related to the content corresponding to the selected GUI can be provided.

For example, when the user touches (short-presses) the GUI 711 corresponding to sports-related content on the touch screen 712 provided in the second input unit 710 as shown in FIG. 7A, the user terminal device 100 may transmit a content execution request signal and a detailed information request signal corresponding to the touched content to the display device 200. In this case, the display device 200 reproduces and outputs the content according to the content execution request signal, and simultaneously transmits a signal corresponding to the detailed information related to the touched content to the user terminal device 100. The user terminal device 100 then outputs the content detail information corresponding to the received signal to the touch screen 712 provided in the second input unit 710.

As another example, when the GUI 711 corresponding to the content is briefly touched (short press), the user terminal device 100 transmits the content execution request signal to the display device 200 and requests identification information about the content currently being displayed. Then, when the identification information about the content is received according to the request, detailed information of the content corresponding to the received identification information may be received from an external server (not shown) and provided.

Referring to FIG. 7B, when a specific GUI 713 is pressed and held (long press) on the touch screen 712' provided in the second input unit 710', GUIs 714-716 providing various options are displayed around the pressed GUI 713. For example, GUIs such as an option view GUI 714, a content playback GUI 715, and a content information providing GUI 716 may be provided as shown, but the present invention is not limited thereto.
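The short-press/long-press branching of FIGS. 7A and 7B might be dispatched as in the sketch below. The RemoteSession interface and the 500 ms threshold are illustrative assumptions, not values taken from the patent.

```kotlin
// Sketch: short press plays the content immediately, long press shows a related menu.
interface RemoteSession {
    fun requestPlayback(contentId: String)      // ask the display device to play the content
    fun requestDetailInfo(contentId: String)    // ask for detailed info to show on the touch screen
    fun showOptionMenu(contentId: String)       // show option/playback/info GUIs around the pressed GUI
}

class GuiPressHandler(private val session: RemoteSession, private val longPressMs: Long = 500) {
    fun onGuiReleased(contentId: String, pressDurationMs: Long) {
        if (pressDurationMs < longPressMs) {    // short press: reproduce immediately
            session.requestPlayback(contentId)
            session.requestDetailInfo(contentId)
        } else {                                // long press: provide the related menu
            session.showOptionMenu(contentId)
        }
    }
}
```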

FIG. 7C shows the operation of the GUIs when the user drags the touch screen 712'' provided in the second input unit 710''.

According to one embodiment of the present invention, the processor 140 may scroll and display the UI screen in a predetermined direction according to a preset touch interaction, and may additionally display on the touch screen at least one GUI that is not displayed on the UI screen according to a preset event.

Specifically, as shown in FIG. 7C, when the user drags the touch screen 712'' upward, all of the GUIs displayed on the touch screen 712'' are pushed upward. When the GUIs are all pushed up and a specific area 720 of the screen that remains as an empty area is touched or dragged, GUIs 721-725 corresponding to contents with a lower usage frequency are displayed on one side of the screen. The displayed GUIs 721-725 then move to the upper side of the screen and can be displayed adjacent to the existing GUIs 717-719. In this way, contents with a lower usage frequency can be additionally displayed on the screen in turn according to a specific user operation and provided to the user. According to another embodiment, new GUIs corresponding to the lower-ranked contents may be provided merely by the event of moving the GUIs displayed on the screen upward, even if there is no specific event on the area of the screen left empty after the GUIs are pushed up.
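A rough sketch of revealing the lower-ranked GUIs on an upward drag, under the assumption of a simple page-based model; the page size and paging behaviour are not specified in the description.

```kotlin
// Sketch: extend the visible, frequency-ordered GUI list when the user drags past the end.
class ScrollableGuiList(private val allContentsByFrequency: List<String>, private val pageSize: Int = 5) {
    private var visibleCount = minOf(pageSize, allContentsByFrequency.size)

    val visible: List<String>
        get() = allContentsByFrequency.take(visibleCount)

    // Called when the user drags the screen upward past the last visible GUI.
    fun onDragUpPastEnd() {
        visibleCount = minOf(visibleCount + pageSize, allContentsByFrequency.size)
    }
}
```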

Meanwhile, in the above-described embodiment, when the positions of the GUIs are shifted according to various events, or when a GUI comes into contact with the edge of the touch screen, the processor 140 may provide an animation effect in which the GUI bounces back and then slowly comes to rest.

FIG. 8 is a diagram for explaining the operation of the user terminal 100 according to an embodiment of the present invention.

According to one embodiment of the present invention, the processor 140 may control the activation state of at least one of the first input unit 810 and the second input unit 820 based on the context in which the user terminal device 100 is flipped.

Referring to FIG. 8, when content is being reproduced on the display device 200 and the user terminal device 100 is flipped from a situation in which the face provided with the first input unit 810 faces the user so that the face provided with the second input unit 820 faces the user, the touch screen 821 provided in the second input unit 820 of the user terminal device 100 can be activated.

In this case, the user terminal device 100 requests detailed information about the content currently being displayed on the display device 200, and when the detailed information corresponding to the content is received in response to the request, the detailed information can be displayed on the activated touch screen 821.

Conversely, when the user terminal device 100 is flipped from a situation in which the face provided with the second input unit 820 faces the user so that the face provided with the first input unit 810 faces the user, the first input unit 810 may be activated.

Here, the activation of the first input unit 810 or the second input unit 820 preferably indicates a state in which the touch screen provided on the corresponding input unit is turned on, but the present invention is not limited thereto.

FIGS. 9A to 9C are diagrams for explaining the operation of the user terminal device 100 according to an embodiment of the present invention.

According to an embodiment of the present invention, when a signal corresponding to the context of the display device 200 is received, the processor 140 may provide information corresponding to the context of the display device 200 through the touch screen. Here, the context of the display device 200 may include not only whether the display device 200 is turned on or off but also states related to the various functions of the display device 200.

As shown in FIG. 9A, in a situation where the display device 200 displays a QWERTY keyboard 921 for inputting characters, the user terminal device 100 can receive, from the display device 200, a signal corresponding to that situation. In this case, the user terminal device 100 can display a QWERTY keyboard 912 on the touch screen 911 provided in the second input unit 910, as shown. Meanwhile, when the display device 200 transmits the signal corresponding to the situation to the user terminal device 100, or when a request signal for the QWERTY keyboard 921 displayed on the screen is transmitted from the user terminal device 100, the QWERTY keyboard 921 displayed on the screen of the display device 200 may disappear from the screen.

Similarly, as shown in FIG. 9B, in a situation where the display device 200 is displaying a menu screen 923, the user terminal device 100 can receive, from the display device 200, a signal corresponding to that situation. In this case, the user terminal device 100 can display a menu screen 913 on the touch screen 911' provided in the second input unit 910', as shown. Likewise, when the display device 200 transmits the signal corresponding to the situation to the user terminal device 100, or when a request signal for the menu screen 923 displayed on the screen is transmitted from the user terminal device 100, the menu screen 923 displayed on the screen of the display device 200 may disappear from the screen.

As shown in FIG. 9C, in a situation where a UI screen allowing menu navigation is displayed on the display device 200, the user terminal device 100 can receive a signal corresponding to that situation from the display device 200. In this case, the user terminal device 100 may provide a navigation GUI 914 for menu navigation on the touch screen. Here, the navigation GUI may be a four-direction menu button as shown, but it is not limited thereto and may be implemented in various forms. The user can then manipulate, through the navigation GUI 914 provided in the user terminal device 100, the position of a highlight GUI for content selection provided on the screen of the display device 200.
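
The reactions to the display-device contexts of FIGS. 9A to 9C can be sketched as a simple dispatch on the received signal; the DisplayContext names and the stubbed UI actions below are assumptions made for the example.

```kotlin
// Dispatch on a context signal pushed by the display device: a character-input
// context brings up a QWERTY keyboard, a menu context mirrors the menu, and a
// navigation context shows a four-direction pad on the terminal's touch screen.

sealed class DisplayContext {
    object CharacterInput : DisplayContext()   // FIG. 9A
    object MenuDisplayed : DisplayContext()    // FIG. 9B
    object MenuNavigation : DisplayContext()   // FIG. 9C
}

fun onDisplayContextSignal(context: DisplayContext) {
    when (context) {
        DisplayContext.CharacterInput -> println("Show a QWERTY keyboard UI on the touch screen")
        DisplayContext.MenuDisplayed  -> println("Mirror the menu screen on the touch screen")
        DisplayContext.MenuNavigation -> println("Show a four-direction navigation GUI")
    }
}

fun main() {
    onDisplayContextSignal(DisplayContext.CharacterInput)
    onDisplayContextSignal(DisplayContext.MenuNavigation)
}
```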

FIG. 10 is a flowchart illustrating a method of controlling the user terminal device 100 according to an embodiment of the present invention.

First, when the context of the user terminal device 100 is determined (S1010), information corresponding to the context of the user terminal device 100 may be provided through the touch screen (S1020). In this case, the user terminal device 100 includes a first input unit 120 for receiving a user command for controlling a basic function of the display device 200 and a second input unit 130 for displaying a UI (User Interface) through the touch screen.

Here, the first input unit 120 may include a touch screen including a basic UI for controlling the basic functions of the display device 200.

Also, the first input unit 120 may include a PUI (Physical User Interface) including at least one physical button for controlling the basic function of the display device 200.

In addition, the context of the user terminal 100 may include at least one of a status in which a specific menu is selected, a status in which a specific signal is received in the display device 200, and a status in which the user terminal 100 is flipped.
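
Steps S1010 and S1020 can likewise be sketched as "determine the context, then provide the matching information"; the enum values below follow the contexts enumerated above, and the returned strings merely stand in for the actual UIs.

```kotlin
// Sketch of steps S1010-S1020: determine which context the terminal is in,
// then provide the matching information through the touch screen.

enum class TerminalContext { MENU_SELECTED, SPECIFIC_SIGNAL_RECEIVED, TERMINAL_FLIPPED }

fun provideInformation(context: TerminalContext): String = when (context) {  // S1020
    TerminalContext.MENU_SELECTED            -> "UI related to the selected menu"
    TerminalContext.SPECIFIC_SIGNAL_RECEIVED -> "UI matching the display device's state"
    TerminalContext.TERMINAL_FLIPPED         -> "UI on the touch screen now facing the user"
}

fun main() {
    val context = TerminalContext.TERMINAL_FLIPPED   // S1010: context determined
    println(provideInformation(context))             // S1020: information provided
}
```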

The control method may further include controlling an activation state of at least one of the first input unit 120 and the second input unit 130 based on a context in which the user terminal device 100 is flipped.

In addition, when a signal corresponding to the context of the display device 200 is received in step S1020, the information corresponding to the context of the display device 200 may be provided.

Also, in step S1020, when the display device 200 is displaying content, additional information on the content may be displayed through the touch screen, and when the display device 200 is in a situation for receiving a character, a UI for inputting the character may be displayed.

In addition, in step S1020, a UI screen including at least one GUI for instantly reproducing at least one content may be provided according to a preset event, and the UI screen may be configured such that GUIs of a predetermined type are sequentially arranged based on the frequency of use of the at least one content.

The control method may further include immediately reproducing the content corresponding to the selected GUI when the user operation for selecting the GUI is a short press input, and providing a menu related to the content corresponding to the selected GUI when the user operation for selecting the GUI is a long press input.

In addition, the control method may further include scrolling the UI screen in a preset direction according to a preset touch interaction, and additionally displaying, on the touch screen, at least one GUI that is not displayed on the UI screen according to a preset event.

Meanwhile, the control methods of the user terminal device 100 according to the various embodiments of the present invention described above may be implemented as computer-executable program code, stored in various non-transitory computer readable media, and provided to respective servers or devices so as to be executed by the processor 140.

For example, a non-transitory computer readable medium may be provided in which a program for performing the steps of determining the context of the user terminal device 100 and providing the information corresponding to the context of the user terminal device 100 through the touch screen is stored.

A non-transitory readable medium is not a medium that stores data for a short period of time, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and is readable by an apparatus. Specifically, the various applications or programs described above may be stored in and provided on a non-transitory readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a ROM.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments; it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

100: user terminal device 200: display device
110: communication unit 120: first input unit
130: second input unit 140: processor
150: storage unit 160: sensing unit

Claims (20)

  1. A user terminal device for controlling a display device, the user terminal device comprising:
    a communication unit configured to perform communication with the display device;
    a first input unit provided on one side of the remote control device and configured to receive a user command for controlling a basic function of the display device;
    a second input unit provided on the other side of the remote control device and configured to display a UI (User Interface) through a touch screen; and
    a processor configured to provide information corresponding to a context of the user terminal device through the touch screen.
  2. The user terminal device according to claim 1,
    wherein the first input unit comprises a touch screen including a basic UI for controlling a basic function of the display device.
  3. The user terminal device according to claim 1,
    wherein the first input unit comprises a PUI (Physical User Interface) including at least one physical button for controlling a basic function of the display device.
  4. The user terminal device according to claim 1,
    wherein the context of the user terminal device comprises at least one of a state in which a specific menu is selected, a state in which a specific signal is received in the display device, and a state in which the user terminal device is flipped.
  5. The user terminal device according to claim 1,
    wherein the processor controls an activation state of at least one of the first input unit and the second input unit based on a context in which the user terminal device is flipped.
  6. The user terminal device according to claim 1,
    wherein, when a signal corresponding to a context of the display device is received, the processor provides information corresponding to the context of the display device through the touch screen.
  7. The user terminal device according to claim 6,
    wherein the processor displays additional information on content through the touch screen when the display device is displaying the content, or displays a UI for inputting a character on the touch screen when the display device is in a situation for receiving the character.
  8. The user terminal device according to claim 1,
    wherein the processor provides a UI screen including at least one GUI for instantly reproducing at least one content according to a preset event, and
    wherein the UI screen is configured such that GUIs of a predetermined type are sequentially arranged based on a frequency of use of the at least one content.
  9. The user terminal device according to claim 8,
    wherein the processor immediately reproduces content corresponding to a selected GUI when a user operation for selecting the GUI is a short press input, and provides a menu related to the content corresponding to the selected GUI when the user operation for selecting the GUI is a long press input.
  10. The user terminal device according to claim 8,
    wherein the processor scrolls the UI screen in a preset direction according to a preset touch interaction and additionally displays, on the touch screen, at least one GUI that is not displayed on the UI screen according to a preset event.
  11. A method of controlling a user terminal device for controlling a display device, the user terminal device including a first input unit provided on one side of the remote control device and receiving a user command for controlling a basic function of the display device, and a second input unit provided on the other side of the remote control device and displaying a UI (User Interface) through a touch screen, the method comprising:
    determining a context of the user terminal device; and
    providing information corresponding to the context of the user terminal device through the touch screen.
  12. The method of claim 11,
    wherein the first input unit comprises a touch screen including a basic UI for controlling a basic function of the display device.
  13. The method of claim 11,
    wherein the first input unit comprises a PUI including at least one physical button for controlling a basic function of the display device.
  14. The method of claim 11,
    wherein the context of the user terminal device comprises at least one of a state in which a specific menu is selected, a state in which a specific signal is received in the display device, and a state in which the user terminal device is flipped.
  15. The method of claim 11,
    further comprising controlling an activation state of at least one of the first input unit and the second input unit based on a context in which the user terminal device is flipped.
  16. The method of claim 11,
    wherein the providing through the touch screen comprises providing information corresponding to a context of the display device when a signal corresponding to the context of the display device is received.
  17. The method of claim 16,
    wherein the providing through the touch screen comprises displaying additional information on content through the touch screen when the display device is displaying the content, or displaying a UI for inputting a character when the display device is in a situation for receiving the character.
  18. The method of claim 11,
    wherein the providing through the touch screen comprises providing a UI screen including at least one GUI for instantly reproducing at least one content according to a preset event, and
    wherein the UI screen is configured such that GUIs of a predetermined type are sequentially arranged based on a frequency of use of the at least one content.
  19. The method of claim 18,
    further comprising immediately reproducing content corresponding to a selected GUI when a user operation for selecting the GUI is a short press input, and providing a menu related to the content corresponding to the selected GUI when the user operation for selecting the GUI is a long press input.
  20. The method of claim 18,
    further comprising scrolling the UI screen in a preset direction according to a preset touch interaction, and additionally displaying, on the touch screen, at least one GUI that is not displayed on the UI screen according to a preset event.

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR1020150074277A KR20160139481A (en) 2015-05-27 2015-05-27 User terminal apparatus and control method thereof
US15/096,585 US20160349946A1 (en) 2015-05-27 2016-04-12 User terminal apparatus and control method thereof
PCT/KR2016/003969 WO2016190545A1 (en) 2015-05-27 2016-04-15 User terminal apparatus and control method thereof
CN201680024602.8A CN107533424A (en) 2015-05-27 2016-04-15 Subscriber terminal equipment and its control method
EP16168482.4A EP3098702A1 (en) 2015-05-27 2016-05-05 User terminal apparatus and control method thereof

Publications (1)

Publication Number Publication Date
KR20160139481A (en) 2016-12-07

Family

ID=55953015

Also Published As

Publication number Publication date
CN107533424A (en) 2018-01-02
WO2016190545A1 (en) 2016-12-01
EP3098702A1 (en) 2016-11-30
US20160349946A1 (en) 2016-12-01
