KR20170017413A - Terminal and operating method thereof - Google Patents

Terminal and operating method thereof Download PDF

Info

Publication number
KR20170017413A
Authority
KR
South Korea
Prior art keywords
widget
page
terminal
input
widget page
Prior art date
Application number
KR1020150111270A
Other languages
Korean (ko)
Inventor
백범현 (Baek Beom-hyun)
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020150111270A priority Critical patent/KR20170017413A/en
Publication of KR20170017413A publication Critical patent/KR20170017413A/en

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment of the present invention, an operating method of a terminal comprises: displaying a widget page item for providing at least one widget page; receiving a first input for the displayed widget page item; and responding to the received first input to display a first widget page corresponding to the first input among the at least one widget page. The first widget page includes at least one widget related to the status information of the terminal.

Description

TERMINAL AND OPERATING METHOD THEREOF

An embodiment according to the concept of the present invention relates to a terminal and an operating method thereof, and more particularly, to a terminal, and an operating method thereof, that provide a user interface (UI) and a user experience (UX) through which a user can conveniently use widgets.

A widget displayed on the display of a terminal such as a smartphone or tablet PC can shorten the several steps required to execute a specific function of an application into a single step, and can increase user convenience by providing an interface through which the information provided by an application, or the information desired by the user, can be checked directly without executing the application itself.

In order for the user to use a widget, the widget must first be added to a page displayed on the display. However, since a widget is generally larger than an application icon and the screen size of the terminal's display is limited, the number of widgets that can be displayed on a page is limited. Therefore, users commonly add only frequently used widgets to their pages.

In this case, when the user wants to use a widget that is used infrequently, or one that has not been added to a page, the user faces the inconvenience of first having to add that widget to a page.

Disclosure of Invention: Technical Problem. The present invention provides a terminal, and an operating method thereof, that allow a user to conveniently use a widget that is not included in any page or that has a low execution count.

A method of operating a terminal according to an embodiment of the present invention includes: displaying a widget page item for providing at least one widget page; receiving a first input for the displayed widget page item; and, in response to the received first input, displaying a first widget page corresponding to the first input among the at least one widget page, wherein the first widget page includes at least one widget associated with the status information of the terminal.
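As a rough illustration of the claimed flow (the class and field names here are hypothetical stand-ins, not taken from the patent), the sequence "display a widget page item, receive a first input, display the corresponding widget page" can be modeled as:

```python
# Hypothetical model of the claimed operating method; all names are
# illustrative assumptions, not the patent's own terminology.
from dataclasses import dataclass, field


@dataclass
class Widget:
    name: str
    execution_count: int = 0


@dataclass
class WidgetPage:
    # Widgets related to the terminal's status information.
    widgets: list


@dataclass
class Terminal:
    widget_pages: list = field(default_factory=list)
    displayed: object = None

    def display_widget_page_item(self):
        # Step 1: show the item (e.g., an icon or UI window) that
        # provides access to the widget pages.
        self.displayed = "widget_page_item"

    def on_first_input(self, page_index: int):
        # Steps 2-3: in response to the first input, display the
        # widget page corresponding to that input.
        self.displayed = self.widget_pages[page_index]


terminal = Terminal(widget_pages=[WidgetPage(widgets=[Widget("weather")])])
terminal.display_widget_page_item()
terminal.on_first_input(0)
assert isinstance(terminal.displayed, WidgetPage)
```

This is only a sketch of the control flow; on a real device the "first input" would arrive through the user input unit and the page would be rendered on the display unit.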

According to an embodiment, the widget page item may be a UI window displayed between the first page and the second page while the first page displayed among the plurality of pages is switched to the second page.

According to another embodiment, the widget page item is an icon for providing the at least one widget page, and the icon may be displayed when the at least one widget page is created.

The status information may include content information of an application displayed on a display unit of the terminal, and the step of displaying the first widget page may include displaying a first widget page that includes at least one widget related to the content information.

According to an embodiment, the status information is the execution count of each of a plurality of widgets, and the first widget page may include at least one widget, among the plurality of widgets, whose execution count is lower than a reference count.

According to another embodiment, the status information is information on which of a plurality of widgets are included in the plurality of pages, and the first widget page may include at least one widget, among the plurality of widgets, that is not included in the plurality of pages.
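The two embodiments above both filter widgets by status information: a widget qualifies for the first widget page if its execution count is below a reference count, or if it is not placed on any page. A minimal sketch, assuming a simple dictionary of execution counts and an invented reference count of 5 (the patent does not specify a value):

```python
# Hypothetical selection logic for the first widget page; the data
# shapes and the reference count are illustrative assumptions.
def select_widgets(all_widgets, pages, reference_count=5):
    """Pick widgets that are rarely executed or not on any page."""
    on_pages = {w for page in pages for w in page}
    rarely_used = [w for w, count in all_widgets.items()
                   if count < reference_count]
    not_placed = [w for w in all_widgets if w not in on_pages]
    # Either criterion qualifies a widget for the first widget page.
    return sorted(set(rarely_used) | set(not_placed))


widgets = {"weather": 10, "calendar": 2, "music": 7}
pages = [["weather", "music"]]
print(select_widgets(widgets, pages))  # prints ['calendar']
```

Here "calendar" is selected both because its count (2) is below the reference count and because it appears on no page; "weather" and "music" are excluded on both criteria.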

The first widget page may be automatically generated based on the status information before receiving the first input.

Displaying the first widget page may include generating the first widget page based on the status information in response to the received first input, and displaying the generated first widget page.

The method may further include: receiving a request to create a second widget page that includes a widget designated by the user; generating the second widget page in response to the received request; receiving a second input for the displayed widget page item; and, in response to the received second input, displaying the second widget page.

The method may further include: receiving a third input for an application icon displayed on a display unit of the terminal; and, in response to the received third input, displaying a third widget page that includes at least one widget provided by the application corresponding to the application icon.

A terminal according to an exemplary embodiment of the present invention includes a display unit, a user input unit, and a controller. The controller displays, on the display unit, a widget page item for providing at least one widget page, receives through the user input unit a first input for the displayed widget page item, and displays on the display unit a first widget page corresponding to the received first input. The controller may generate the first widget page so that it includes at least one widget related to the status information of the terminal.

The terminal according to an embodiment of the present invention separately provides a widget page composed only of widgets that are executed infrequently or that are not included in any page, thereby shortening the steps the user must take when the user wants to use such a widget.

In addition, the terminal according to the embodiment of the present invention can automatically generate a widget page related to information acquired using the intelligent agent and provide the widget page to the user, thereby improving the convenience of use.

FIG. 1 is a schematic block diagram of a terminal according to an embodiment of the present invention.
FIG. 2 is a schematic block diagram of an intelligent agent running in a terminal according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating an operation of a terminal according to an embodiment of the present invention.
FIGS. 4A through 4D illustrate widget page items according to an embodiment of the present invention.
FIGS. 5A and 5B are exemplary views of widget page items according to another embodiment of the present invention.
FIG. 6 is an exemplary view of a widget page item according to another embodiment of the present invention.
FIGS. 7A and 7B are diagrams for explaining a widget page creation operation according to an embodiment of the present invention.
FIGS. 8A and 8B are diagrams for explaining a widget page creation operation according to another embodiment of the present invention.
FIGS. 9A through 9C are exemplary diagrams illustrating an operation for displaying a widget page in response to an input for the widget page item shown in FIG. 4B.
FIGS. 10A through 10C are illustrations for explaining an embodiment of an operation for displaying a widget page in response to an input for the widget page item shown in FIGS. 5A and 5B.
FIGS. 11A through 11C are illustrations for explaining another embodiment of an operation for displaying a widget page in response to an input for the widget page item shown in FIGS. 5A and 5B.
FIGS. 12A and 12B are diagrams for explaining a widget page display operation according to another embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "part" used for components in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles. In the following description, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to aid understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by the accompanying drawings, and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements present.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are used to specify the presence of the stated features, numbers, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device (e.g., a smartwatch, smart glasses, a head mounted display (HMD), etc.).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and digital signage, except for cases applicable only to mobile terminals.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings attached hereto.

1 is a schematic block diagram of a terminal according to an embodiment of the present invention.

Referring to FIG. 1, a terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the terminal 100 and a wireless communication system, between the terminal 100 and another terminal 100, between the terminal 100 and another device, or between the terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and also a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, the broadcast-related information may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H (Digital Video Broadcast-Handheld), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may also be configured to be suitable for other broadcasting systems in addition to the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and the like can be used as short-range communication technologies.

The location information module 115 is a module for obtaining the position (or current position) of the terminal 100; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the terminal 100 utilizes the GPS module, it can acquire its position using signals transmitted from GPS satellites. As another example, when the terminal 100 utilizes the Wi-Fi module, it can acquire its position based on information about the wireless access point (AP) that transmits wireless signals to, or receives wireless signals from, the Wi-Fi module. If necessary, the location information module 115 may use any of the other modules of the wireless communication unit 110, in substitution or in addition, to obtain data regarding the position of the terminal 100. The location information module 115 is used to obtain the position (or current position) of the terminal 100 and is not limited to a module that directly calculates or acquires that position.
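The substitution between position sources described above can be sketched as a simple provider fallback. The function below is a hypothetical illustration (the provider arguments are invented stand-ins, not a real platform API): prefer a GPS fix when one is available, otherwise fall back to a position derived from nearby Wi-Fi access points.

```python
# Illustrative sketch of position acquisition with provider fallback;
# the arguments are hypothetical stand-ins for real location sources.
def get_position(gps_fix=None, wifi_ap_fix=None):
    """Return (source, position) from the first available provider.

    Prefers a GPS fix; falls back to a position derived from Wi-Fi
    access-point information, mirroring the substitution above.
    """
    if gps_fix is not None:
        return ("gps", gps_fix)
    if wifi_ap_fix is not None:
        return ("wifi", wifi_ap_fix)
    return ("none", None)


print(get_position(wifi_ap_fix=(37.56, 126.97)))  # ('wifi', (37.56, 126.97))
```

The same pattern extends naturally to further fallbacks (e.g., cell-tower data via the mobile communication module), since any module of the wireless communication unit 110 may serve as a position source.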

Referring to FIG. 1, the A/V (audio/video) input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the usage environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving an external sound signal.

The input unit 120 may include the camera 121 or an image input unit for inputting an image signal, the microphone 122 or an audio input unit for inputting an audio signal, and a user input unit (e.g., a touch key, a mechanical key, etc.) for receiving information from a user. The voice data or image data collected by the input unit 120 may be analyzed and processed according to a user's control command.

The user input unit 130 may refer to the user input unit described above.

The sensing unit 140 may include at least one sensor for sensing at least one of information within the terminal, surrounding environment information of the terminal 100, and user information. For example, the sensing unit 140 may include at least one of a proximity sensor, an illumination sensor, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the terminal 100 disclosed in this specification can combine and utilize information sensed by at least two of these sensors.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses, and includes at least one of a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154. The display unit 151 may form a mutual layer structure with a touch sensor, or may be formed integrally with it, to realize a touch screen. Such a touch screen may function as the user input unit 130 providing an input interface between the terminal 100 and the user, and may also provide an output interface between the terminal 100 and the user.

The display unit 151 displays (outputs) information to be processed by the terminal 100. For example, when the terminal 100 is in the call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the terminal 100. For example, in the terminal 100, a plurality of display portions may be spaced apart from one another or may be disposed integrally with each other, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, a 'touch sensor') form a mutual layer structure (hereinafter, a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect not only a position and an area to be touched but also a pressure and a capacitance at the time of touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. In this way, the control unit 180 can know which area of the display unit 151 has been touched.

In addition, the controller 180 can determine the type of the user's touch input based on the area, pressure, and capacitance at the time of touch. Accordingly, the control unit 180 can distinguish a finger touch, a nail touch, a knuckle touch, and a multi-touch using a plurality of fingers.
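The paragraph above describes classifying a touch from the sensed area, pressure, and number of contact points. A minimal sketch of such a classifier follows; all thresholds, the specific categories, and the use of a finger count are invented for illustration only, since the patent does not specify how the distinction is made.

```python
# Hypothetical touch-type classifier; every threshold and rule here is
# an illustrative assumption, not taken from the patent.
def classify_touch(area_mm2, pressure, finger_count):
    """Classify a touch from sensed area, pressure, and contact count."""
    if finger_count > 1:
        return "multi-touch"
    if area_mm2 < 4:        # a nail contacts with a very small area
        return "nail touch"
    if pressure > 0.8:      # a hard press suggests a knuckle
        return "knuckle touch"
    return "finger touch"


assert classify_touch(20, 0.3, 1) == "finger touch"
assert classify_touch(3, 0.3, 1) == "nail touch"
assert classify_touch(20, 0.9, 2) == "multi-touch"
```

In practice such rules would be tuned per device, and capacitance (mentioned above as a further input) could be added as another feature in the same way.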

A proximity sensor may be disposed in an inner region of the terminal surrounded by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface, or an object nearby, without mechanical contact, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan than a contact-type sensor, and its utility is also high.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of a pointer by the change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may itself be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen without contact so that it is recognized as being positioned on the touch screen is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position at which a proximity touch is made with the pointer on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110, or stored in the memory 160, in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the terminal 100. Examples of events that occur in the terminal 100 include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than the video signal or the audio signal, for example, vibration. The video signal or the audio signal may be output through the display unit 151 or the audio output module 152 so that they may be classified as a part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of a haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be combined and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various other tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, the spraying or suction force of air through an injection or suction port, a brush against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an endothermic or exothermic element.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact, but also to allow the user to feel a tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 154 may be provided depending on the configuration of the terminal 100.

In addition, the haptic module 154 may include a vibrating element capable of generating vibration. For example, the haptic module 154 may include one or more vibration motors, and the vibration motors may take various forms, such as bar type, coin type, and the like.

The haptic module 154 may be provided at various positions according to the shape of the terminal 100.

In addition, the memory 160 stores data supporting the various functions of the terminal 100. The memory 160 may store a plurality of application programs (applications) run on the terminal 100, as well as data and commands for the operation of the terminal 100. At least some of these applications may be downloaded from an external server via wireless communication. Also, at least some of these applications may be present on the terminal 100 from the time of shipment, for the basic functions of the terminal 100 (e.g., receiving and placing calls, and receiving and sending messages). An application program may be stored in the memory 160, installed on the terminal 100, and run by the control unit 180 to perform an operation (or function) of the terminal 100.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a magnetic disk, and an optical disk. The terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a passage to all external devices connected to the terminal 100. The interface unit 170 receives data or power from an external device and delivers it to each component inside the terminal 100, or transmits data inside the terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip storing various kinds of information for authenticating the usage rights of the terminal 100, and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device having an identification module (hereinafter, an 'identification device') may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the corresponding port.

The interface unit 170 may serve as a path through which power from an external cradle is supplied to the terminal 100 when the terminal 100 is connected to the cradle, or as a path through which various command signals input by the user from the cradle are transferred to the terminal 100. The various command signals or the power input from the cradle may operate as a signal for recognizing that the terminal 100 is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the terminal 100, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented in the control unit 180 or may be implemented separately from the control unit 180.

The control unit 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, the embodiments described herein may be implemented by the control unit 180 itself.

According to a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that each perform at least one function or operation. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the control unit 180.

The terminal 100 may be a portable terminal or a stationary terminal. Accordingly, the terminal 100 may be carried by the user directly, or may be mounted in a predetermined area.

FIG. 2 is a schematic block diagram of an intelligent agent running in a terminal according to an embodiment of the present invention.

Referring to FIGS. 1 and 2, the controller 180 may load and run an intelligent agent (or an intelligent agent application) 182 from the memory 160. The intelligent agent 182 may be executed by user input, or automatically by the control unit 180 when the terminal 100 boots. The intelligent agent 182 may be, but is not limited to, a background application running in the background.

The intelligent agent 182 may refer to an application that performs an autonomous process that performs tasks on behalf of a user for a particular purpose.

The intelligent agent 182 may monitor, periodically or in real time, information input through the input units 120 and 130 or the sensing unit 140 of the terminal 100, information received from the outside through the wireless communication unit 110 or the interface unit 170, and/or content information of an application executed by the control unit 180. The intelligent agent 182 can actively perform a task by extracting, from the monitoring result, the information necessary for achieving a specific purpose and autonomously processing the extracted information.
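The monitoring-and-extraction behavior described above can be sketched as follows. This is only an illustrative outline; the class, source names, and relevance test are all hypothetical and do not correspond to the disclosed implementation.

```python
# Illustrative sketch: an agent periodically polls several state
# sources and keeps only the entries relevant to a specific purpose.

class IntelligentAgent:
    def __init__(self, sources):
        # sources: mapping of source name -> zero-argument callable
        # returning that source's latest state information
        self.sources = sources
        self.state = {}

    def monitor(self):
        # Poll every registered source (wireless unit, sensors, apps, ...)
        for name, read in self.sources.items():
            self.state[name] = read()
        return self.state

    def extract(self, is_relevant):
        # Keep only the state entries needed for the specific purpose
        return {k: v for k, v in self.state.items() if is_relevant(k, v)}


agent = IntelligentAgent({
    "current_app": lambda: "internet",
    "location": lambda: "home",
    "battery": lambda: 80,
})
agent.monitor()
relevant = agent.extract(lambda k, v: k in ("current_app", "location"))
```

In this sketch, `monitor()` corresponds to the periodic or real-time acquisition of state information, and `extract()` corresponds to selecting the information needed for the specific purpose.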

The control unit 180 can provide the function of the terminal 100 intended by the user, or the operation of the terminal 100, via the running intelligent agent 182. This will be described later.

Hereinafter, the expression that the terminal 100 performs a function through the intelligent agent 182, or that the intelligent agent 182 performs a function, means that the controller 180 processes the intelligent agent application program to perform the function. According to an embodiment, the intelligent agent application program may be stored in the memory 160, or may be loaded into the memory 160 from an external memory, and driven by the controller 180.

FIG. 3 is a flowchart illustrating an operation of a terminal according to an embodiment of the present invention.

Referring to FIGS. 1 and 3, the controller 180 may display a widget page item on the display unit 151 to provide a widget page separated from a plurality of pages (S100).

Each of the plurality of pages may include an application icon for application execution and a widget that displays a function or information provided by the corresponding application. Any one of the plurality of pages may be a home screen.

The home screen may refer to a page displayed in response to a home button input of the terminal 100. If the page switching request is received while the home screen is being displayed, the controller 180 may switch the home screen to another page and display the page.

The widget page is distinguished from the plurality of pages and may include only widgets. That is, the widget page may not include an application icon. The operation of generating the widget page and the structure of the widget page will be described later with reference to FIGS. 7A to 8B.
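The distinction drawn above between an ordinary page and a widget page can be illustrated with a small data-structure sketch. The class and field names here are hypothetical and serve only to make the distinction concrete.

```python
# Illustrative sketch: an ordinary page may mix application icons and
# widgets, whereas a widget page contains widgets only (no app icons).

from dataclasses import dataclass, field

@dataclass
class Page:
    widgets: list = field(default_factory=list)
    app_icons: list = field(default_factory=list)

@dataclass
class WidgetPage:
    widgets: list = field(default_factory=list)
    # Deliberately no app_icons field: a widget page may not
    # include any application icon.

home = Page(widgets=["W1", "W2"], app_icons=["AI1", "AI2", "AI3", "AI4"])
widget_page = WidgetPage(widgets=["W7"])
```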

Embodiments of widget page items displayed on the display unit 151 will be described with reference to FIGS. 4A to 6.

Figures 4A through 4D illustrate widget page items according to an embodiment of the present invention.

FIG. 4A shows an embodiment of a first page 230 among a plurality of pages. The first page 230 may be a home screen.

The configuration and form of the first page 230 shown in FIG. 4A are merely for convenience of explanation and may be varied according to the embodiment, and each of the plurality of pages may be configured within a range sufficiently predictable from the configuration and form shown.

Referring to FIG. 4A, the controller 180 may display an icon area 210, a status bar area 220, and a page area (showing the first page 230 in FIG. 4A) on the display unit 151.

The icon area 210 may include (or display) at least one application icon. In FIG. 4A, the icon area 210 includes a call icon, a message icon, an internet icon, and an application list icon (a menu screen icon or an application drawer icon), but the type, number, and/or arrangement of the application icons included in the icon area 210 can be freely modified.

The icon area 210 may operate independently of the page area. For example, the icon area 210 may remain fixed even when the controller 180 switches the first page 230 of the page area to another page in response to a page change request. The controller 180 may also switch an application icon displayed in the icon area 210 to another application icon in response to a switching request for the icon area 210. The icon area 210 may be disposed at the lower end of the display unit 151, but the position of the icon area 210 is not limited thereto.

The status bar area 220 may display the status of the terminal 100, such as the current time, battery status, and notifications, or information obtained through an application. The status bar area 220 may be disposed at the top of the display unit 151, but the position of the status bar area 220 is not limited thereto. When a specific input (e.g., a top-to-bottom touch drag input) to the status bar area 220 is received, the controller 180 may display, on the display unit 151, a status window showing in detail the status and/or the information displayed in the status bar area 220.

The first page 230 may include at least one widget and/or application icon (W1, W2, AI1 to AI4). In FIG. 4A, it is assumed for convenience of explanation that the first page 230 includes a first widget W1, a second widget W2, and four application icons AI1 to AI4. The first widget W1 and the second widget W2 may display a function provided by the corresponding application and/or information obtained from the corresponding application. In response to an input selecting either the first widget W1 or the second widget W2, the control unit 180 may execute the application corresponding to the selected widget or perform a function provided by that application.

The sizes of the first and second widgets W1 and W2 may vary, and the positions of the first and second widgets W1 and W2 may be freely changed.

The control unit 180 may receive an input for any one of the application icons AI1 to AI4 included in the first page 230 and may execute the application corresponding to that application icon.

The control unit 180 may control the display unit 151 to switch the first page 230 to the second page and display the second page in response to a page change request. For example, the user can input a page change request by performing a right-to-left touch drag operation on the touch screen. In response to the page change request, the controller 180 may switch the first page 230 to the second page by displaying, on the display unit 151, a page switching effect in which the first page 230 disappears to the left and the second page appears from the right. The input and/or page switching effect by which the control unit 180 switches the first page 230 to the second page and displays it on the display unit 151 can be variously modified according to the embodiment.

FIG. 4B illustrates a widget page item according to an embodiment of the present invention.

The widget page item 300 shown in FIGS. 4B to 4D may be in the form of a widget page display UI window.

Referring to FIGS. 4A and 4B, when the controller 180 switches the first page 230 to the second page 240 in response to a page change request, the controller 180 may display the widget page item 300 between the first page 230 and the second page 240 displayed on the display unit 151.

The widget page item 300 may include at least one cell indicating the number of pre-stored widget pages. For example, when the widget page item 300 includes two cells, the number of pre-stored widget pages may be two.

According to an embodiment, when the widget page item 300 includes two cells, one cell (e.g., the left cell) may refer to a widget page generated by the user, and the other cell (e.g., the right cell) may refer to a widget page automatically generated by the intelligent agent 182.

The form of the widget page item 300 shown in FIG. 4B may be variously modified according to the embodiment. For example, the widget page item 300 may include vertically arranged cells, and the widget page item 300 itself may have a different appearance.

Referring to FIG. 4C, the widget page item 300 may be displayed in different colors depending on the category of the widget page, the number of times of use, and the like.

For example, suppose the widget page item 300 includes two cells, the left cell corresponds to a widget page generated by the user, and the right cell corresponds to a widget page automatically generated by the intelligent agent 182. In this case, the controller 180 may distinguish the types of the widget pages by displaying different colors in the respective cells, may change the displayed color according to whether the corresponding widget page exists, or may vary the displayed color according to the number of times the corresponding widget page is used.

Referring to FIG. 4D, when the intelligent agent 182 automatically recommends any one of the pre-stored widget pages based on the status information of the terminal 100, or when the user is to be notified about a widget page, the control unit 180 may control the display unit 151 to display a notification effect on the widget page item 300. In FIG. 4D, an effect of emphasizing the widget page item 300 (for example, a neon effect) is shown as an example of the notification effect, but the type of the notification effect is not limited thereto.

5A and 5B are exemplary views of widget page items according to another embodiment of the present invention.

Referring to FIG. 5A, the widget page item 301 may be implemented in icon form. The widget page item 301 can be displayed on the page shown on the display unit 151 and can be freely moved within the display unit 151 by a touch drag input. Displaying the widget page item 301 on the page means that, when a widget and/or application icon contained in the page overlaps the widget page item 301, the widget page item 301 is displayed and the overlapped portion of the widget and/or application icon may not be displayed.

The widget page item 301 may be displayed when at least one widget page is generated, but the present invention is not limited thereto. The widget page item 301 may be displayed on the display unit 151 at all times.

Referring to FIG. 5B, the widget page item 301 may be displayed on the content screen 250 even when the content screen 250 of a specific application is displayed on the display unit 151. FIG. 5B illustrates, as an example, the widget page item 301 displayed on a web site screen 250 provided by an Internet application. Accordingly, when the controller 180 receives an input for the widget page item 301 while a content screen of a specific application is being displayed, the controller 180 can directly switch the content screen to a widget page and display the widget page.

Referring to FIGS. 5A and 5B, the controller 180 may display the state of the widget page item 301 shown on the display unit 151 differently according to the type of the widget page to be displayed. This will be described in more detail with reference to FIGS. 10A and 10B.

The form of the widget page item 301 shown in FIGS. 5A and 5B is merely an example for convenience of explanation, and therefore the form of the widget page item 301 is not limited thereto.

FIG. 6 is an exemplary view of a widget page item according to another embodiment of the present invention. Referring to FIG. 6, a widget page item 303 may be included in the status window 260 and displayed. Except for the widget page item 303, the remaining components of the status window 260 are well known in the art, and a detailed description of them is omitted.

The widget page item 303 may include at least one widget page icon 311 and 312 for entering each corresponding widget page. When an input selecting one of the widget page icons 311 and 312 is received, the controller 180 may display the widget page corresponding to the selected widget page icon. Depending on the embodiment, at least one widget selected by the user's designation and/or automatically by the intelligent agent may be displayed directly instead of the widget page item 303.

The form and structure of the widget page item 303 shown in FIG. 6 are for convenience of explanation, and can be variously modified according to the embodiment.

Referring back to FIG. 3.

The control unit 180 may receive an input for the displayed widget page item (S200). The input may be received via the user input unit 130 or the A / V input unit 120.

The control unit 180 may display the widget page in response to the received input (S300).

First, embodiments of the operation of generating a widget page will be described, and then, embodiments of steps S200 and S300 will be described.

FIGS. 7A and 7B are diagrams for explaining a widget page creation operation according to an embodiment of the present invention. FIG. 7A shows an embodiment in which a user creates a first widget page using a widget page creation screen, and FIG. 7B shows an embodiment of the first widget page generated by the user.

Referring to FIG. 7A, a widget page creation screen 270 displayed on the display unit 151 may include a widget list area 272 and a widget page area (e.g., a first widget page 311).

The widget list area 272 may include a plurality of widgets W7 to W11. In response to an input (for example, a touch drag input) moving any one widget (e.g., W7) of the plurality of widgets W7 to W11 from the widget list area 272 to the first widget page 311, the controller 180 may add the widget W7 to the first widget page 311. When the first widget page 311 is generated, the control unit 180 may store the generated first widget page 311 in the memory 160.

According to the embodiment, the controller 180 may arrange the plurality of widgets included in the widget list area 272 in order of execution count. The execution count may mean the execution count of each of the plurality of widgets, or the execution count of the application corresponding to each of the plurality of widgets.
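The ordering rule above can be sketched in a few lines. The function and field names are illustrative only; the disclosure does not specify a data representation.

```python
# Illustrative sketch: order the widget list area by execution count.
# Each widget carries its own execution count here; the count of its
# corresponding application could be substituted.

def arrange_by_execution_count(widgets):
    # Most frequently executed widgets come first.
    return sorted(widgets, key=lambda w: w["exec_count"], reverse=True)

widget_list = [
    {"name": "W7", "exec_count": 3},
    {"name": "W8", "exec_count": 10},
    {"name": "W9", "exec_count": 7},
]
ordered = arrange_by_execution_count(widget_list)
```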

According to another embodiment, the plurality of widgets included in the widget list area 272 may be widgets that are not included in a plurality of pages.

Accordingly, the user can add to the first widget page 311 a widget that can be useful in a specific situation, from among widgets that are not frequently used or that are not included in the plurality of pages. In that specific situation, the controller 180 may display the first widget page 311 in response to a request to display it, and the user can immediately use the widgets included in the displayed first widget page 311.

FIGS. 8A and 8B are diagrams for explaining a widget page creation operation according to another embodiment of the present invention. FIG. 8A is a flowchart illustrating a widget page generation operation using an intelligent agent according to an exemplary embodiment of the present invention, and FIG. 8B illustrates an exemplary embodiment of a second widget page generated by the intelligent agent.

Referring to FIGS. 8A and 8B, the intelligent agent 182 executed by the controller 180 may acquire status information of the terminal 100 (S400). For example, the intelligent agent 182 may acquire, in real time or periodically, data received through the wireless communication unit 110 of the terminal 100, information input through the A/V input unit 120, the user input unit 130, the sensing unit 140, and/or the interface unit 170, information processed by the controller 180, and the like.

For example, the status information may include the current time, location information of the terminal 100, current application information, the execution count of each of a plurality of widgets, information on the widgets contained in the plurality of pages, and/or content information of the application displayed on the display unit 151.

The intelligent agent 182 may automatically generate a widget page based on the acquired status information (S420). The intelligent agent 182 may extract at least one widget W12 to W15 from among the plurality of widgets based on the acquired status information and automatically generate the second widget page 312 including the extracted widgets. The at least one widget W12 to W15 included in the second widget page 312 may be automatically arranged according to a predetermined arrangement criterion (e.g., widget size order, execution count order, or name order).

The control unit 180 may store the generated second widget page 312 in the memory 160. Since the second widget page 312 generated by the intelligent agent 182 may vary according to the status information at the time of generation, the control unit 180 may also automatically delete the second widget page 312 after a reference time elapses.

According to an embodiment, each of the at least one widget W12 to W15 included in the second widget page 312 may be an infrequently used widget. An infrequently used widget may mean a widget whose execution count (or whose application's execution count) is lower than a reference count. For example, the intelligent agent 182 may extract, from among the plurality of widgets, at least one widget W12 to W15 that is related to the status information and whose execution count is lower than the reference count, and automatically generate the second widget page 312 including the extracted widgets.

According to another embodiment, the intelligent agent 182 may automatically generate the second widget page 312 including at least one widget W12 to W15 that is related to the status information and is not included in the plurality of pages.
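The two selection rules described in the preceding embodiments can be sketched together. The names, tags, and reference count here are hypothetical, chosen only to make the filtering concrete.

```python
# Illustrative sketch of the two selection rules: among widgets related
# to the current status information, pick those whose execution count
# is below a reference count, or those not already placed on any page.

REFERENCE_COUNT = 5  # assumed reference count

def select_widgets(widgets, status_tags, pages):
    related = [w for w in widgets if w["tag"] in status_tags]
    rarely_used = [w for w in related if w["exec_count"] < REFERENCE_COUNT]
    on_pages = {name for page in pages for name in page}
    not_on_pages = [w for w in related if w["name"] not in on_pages]
    # Either list may drive the automatically generated widget page.
    return rarely_used, not_on_pages

widgets = [
    {"name": "W12", "tag": "travel", "exec_count": 2},
    {"name": "W13", "tag": "travel", "exec_count": 9},
    {"name": "W14", "tag": "music", "exec_count": 1},
]
rarely, absent = select_widgets(widgets, {"travel"}, [["W13"]])
```

Here `rarely` corresponds to the infrequently-used-widget embodiment and `absent` to the not-on-any-page embodiment.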

FIGS. 9A through 9C are exemplary diagrams illustrating an operation of displaying a widget page in response to an input to the widget page item shown in FIG. 4B.

Referring to FIG. 9A, while switching pages in response to a page change input, the controller 180 may display the widget page item 300 and receive a first input for the displayed widget page item 300. The control unit 180 may display the first widget page 311 shown in FIG. 7B on the display unit 151 in response to the received first input. In FIG. 9A, a bottom-to-top touch drag input is shown as an example of the first input, but the first input is not limited thereto.

Referring to FIG. 9B, the controller 180 may receive a second input for the displayed widget page item 300. The control unit 180 may display the second widget page 312 shown in FIG. 8B on the display unit 151 in response to the received second input. FIG. 9B shows a top-to-bottom touch drag input as an example of the second input, but the second input is not limited thereto.

The second widget page 312 may be automatically generated by the intelligent agent 182 before the second input is received, or may be generated by the intelligent agent 182 in response to the second input.

For example, in a case where the intelligent agent 182 generates the second widget page based on a specific time (e.g., a commute time), the intelligent agent 182 may determine whether the current time corresponds to that specific time and, if so, automatically generate the second widget page.

An embodiment in which the intelligent agent 182 generates the second widget page in response to the second input will be described with reference to FIG. 9C and FIGS. 11A to 11C.

FIG. 9C is a flowchart of an embodiment of an operation in which the intelligent agent generates a widget page in response to a user input. The recommended widget page of FIG. 9C may refer to a widget page automatically generated by the intelligent agent 182.

Referring to FIG. 9C, when the user attempts to enter a recommended widget page including widgets related to 'travel' while browsing content related to 'travel' using the Internet application, the control unit 180 may receive a request to move to the home screen (S500). The control unit 180 may display the home screen on the display unit 151 in response to the received request.

The control unit 180 may receive an input (e.g., a second input) for displaying the recommended widget page (S520).

In response to the received input, the intelligent agent 182 may obtain, as status information, information on the content (the content related to 'travel') viewed before moving to the home screen, and may generate a recommended widget page including at least one widget related to the acquired status information (S540). The control unit 180 may display the generated recommended widget page on the display unit 151 (S560).

FIGS. 10A through 10C are illustrations for explaining an embodiment of an operation for displaying a widget page in response to an input for the widget page item shown in FIGS. 5A and 5B.

Referring to FIGS. 10A and 10B, the control unit 180 may receive an input for the displayed widget page item 301. Based on the state of the widget page item 301 at the time the input is received, the control unit 180 may display, on the display unit 151, either the first widget page 311 shown in FIG. 7B or the second widget page 312 shown in FIG. 8B.

For example, when an input to the widget page item 301 is received while the widget page item 301 is half-filled with color as shown in FIG. 10A, the control unit 180 may display the first widget page 311 on the display unit 151.

On the other hand, when an input to the widget page item 301 is received while the widget page item 301 is completely filled with color as shown in FIG. 10B, the control unit 180 may display the second widget page 312 on the display unit 151.

In other words, according to the embodiment shown in FIGS. 10A and 10B, the state of the widget page item 301 is distinguished according to the degree of color filling of the widget page item 301, and based on the distinguished state, either the first widget page 311 or the second widget page 312 may be displayed.
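The dispatch just described can be sketched compactly. The fill-ratio threshold and the page labels are assumptions made for illustration; the disclosure defines only the half-filled and completely filled states.

```python
# Illustrative sketch: the fill state of the widget page item selects
# which stored widget page is displayed in response to the input.

def select_widget_page(fill_ratio):
    # Completely filled item -> agent-generated second widget page;
    # otherwise (e.g., half-filled) -> user-generated first widget page.
    return "second_widget_page" if fill_ratio >= 1.0 else "first_widget_page"

shown_half = select_widget_page(0.5)  # input while item is half-filled
shown_full = select_widget_page(1.0)  # input while item is fully filled
```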

FIG. 10C shows an embodiment of displaying the first widget page 311 or the second widget page 312 in a manner different from the embodiment shown in FIGS. 10A and 10B. Referring to FIG. 10C, when an input to the widget page item 301 is received, the control unit 180 can display widget page UIs 301A and 301B on the display unit 151 outside the widget page item 301.

Assuming that the first widget page UI 301A corresponds to the first widget page 311 and the second widget page UI 301B corresponds to the second widget page 312, the control unit 180 may display the first widget page 311 in response to an input to the first widget page UI 301A, and may display the second widget page 312 in response to an input to the second widget page UI 301B. The number of widget page UIs can vary depending on the type or number of widget pages.

FIGS. 11A through 11C are illustrations for explaining another embodiment of an operation for displaying a widget page in response to an input for the widget page item shown in FIGS. 5A and 5B.

Referring to FIGS. 11A to 11C, when the user searches for a movie and intends to book a specific movie, the user may move the widget page item 301 to the position where the specific movie is displayed, and then perform an input on the widget page item 301.

In response to the input to the moved widget page item 301, the control unit 180 may use the intelligent agent 182 to acquire, as status information, the content information (e.g., 'movie') at the position to which the widget page item 301 was moved. As shown in FIG. 11A, the control unit 180 may display, on the display unit 151, a widget page (e.g., the first widget page 311) containing a widget related to 'movie' (for example, the movie booking widget W4) from among the widget pages generated by the user.

Referring to FIG. 11B, according to an embodiment, the user may perform an input selecting the widget page item 301 to check additional information related to the movie booking, such as schedule and point information. In this case, in response to the input to the widget page item 301, the intelligent agent 182 may automatically generate a third widget page 313 containing widgets related to the movie booking (e.g., the movie booking widget W4, the calendar widget W9, and the point management widget W13) and display the generated third widget page 313 on the display unit 151.

According to another embodiment, when the input to the widget page item 301 moved in FIG. 11A is received, the intelligent agent 182 may automatically generate the third widget page 313 and display it on the display unit 151, without going through the embodiment shown in FIG. 11B.

12A and 12B are diagrams for explaining a widget page entry operation according to another embodiment of the present invention.

Referring to FIGS. 12A and 12B, the control unit 180 may receive a specific input for the application icon AI2 displayed on the display unit 151 and, in response to the received specific input, display on the display unit 151 a fourth widget page 314 composed only of the widgets W16 to W19 provided by the application corresponding to the application icon AI2.

FIGS. 12A and 12B show a spread touch (or zoom-in touch) input for the application icon AI2 as an example of the specific input, but the specific input is not limited thereto. In response to the spread touch, the control unit 180 may display the fourth widget page 314 composed only of the widgets W16 to W19 provided by the application corresponding to the application icon AI2. The spread touch may mean a touch input in which the user touches the display unit 151 with two fingers brought together and then performs a touch drag operation spreading the two fingers apart.

According to the embodiment, even when the specific input is input to any one of the widgets included in the displayed page, the controller 180 may display a widget page composed only of the widgets provided by the application corresponding to that widget. For example, when the specific input is input to the first widget W1 of the first page 230 shown in FIG. 4A, the control unit 180 may display a widget page composed only of the widgets provided by the application corresponding to the first widget W1.

Referring to FIG. 12B, the widgets W16 to W19 included in the fourth widget page 314 may be automatically arranged by the controller 180 or the intelligent agent 182. For example, the widgets W16 to W19 may be arranged in order of size, or may be arranged in order of function. According to an embodiment, the fourth widget page 314 may include a plurality of pages when the widgets provided by the application corresponding to the application icon AI2 cannot all be displayed on one page.
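The arrangement-and-pagination behavior above can be sketched as follows. The per-page capacity and the size ordering are assumptions for illustration; the disclosure leaves the arrangement criterion and page capacity open.

```python
# Illustrative sketch: arrange an application's widgets (largest first)
# and split them across several pages when they do not all fit on one.

PAGE_CAPACITY = 4  # assumed number of widgets per page

def paginate_widgets(widgets, capacity=PAGE_CAPACITY):
    # Sort by size in descending order, then slice into fixed-size pages.
    ordered = sorted(widgets, key=lambda w: w["size"], reverse=True)
    return [ordered[i:i + capacity] for i in range(0, len(ordered), capacity)]

widgets = [{"name": f"W{16 + i}", "size": 6 - i} for i in range(6)]
pages = paginate_widgets(widgets)
```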

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the code may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the control unit 180 of the terminal. Accordingly, the above description should not be construed in a limiting sense in any respect and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

Claims (13)

A method of operating a terminal,
Displaying a widget page item for providing at least one widget page;
Receiving a first input for a displayed widget page item; And
Responsive to a received first input, displaying a first widget page corresponding to the first input of the at least one widget page,
Wherein the first widget page comprises:
At least one widget associated with the status information of the terminal
A method of operating a terminal.
The method according to claim 1,
The widget page item includes:
A UI window displayed between the first page and the second page while the first page displayed among the plurality of pages is switched to the second page
A method of operating a terminal.
The method according to claim 1,
The widget page item includes:
An icon for providing the at least one widget page,
The icon is displayed when the at least one widget page is created
A method of operating a terminal.
The method of claim 3,
Wherein the status information is content information of an application displayed on a display unit of the terminal,
Wherein the displaying the first widget page comprises:
Displaying the first widget page including at least one widget associated with the content information
A method of operating a terminal.
The method according to claim 1,
Wherein the status information is the number of widget executions of each of the plurality of widgets,
Wherein the first widget page comprises:
At least one widget, among the plurality of widgets, whose number of executions is lower than a reference number
A method of operating a terminal.
The method according to claim 1,
Wherein the status information is information on a widget included in the plurality of pages among a plurality of widgets,
Wherein the first widget page comprises:
At least one widget, among the plurality of widgets, that is not included in the plurality of pages
A method of operating a terminal.
The method of claim 1,
Wherein the first widget page is automatically generated based on the status information before the first input is received
A method of operating a terminal.
8. The method of claim 1, wherein displaying the first widget page comprises:
generating the first widget page based on the status information, in response to the received first input; and
displaying the generated first widget page.
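Claims 7 and 8 differ only in *when* the page is built: ahead of time versus on demand. A sketch of the claim-8 (lazy) variant, with hypothetical provider and builder callables standing in for the terminal's internals:

```python
class WidgetPageItem:
    """Hypothetical widget page item that builds its page lazily (claim 8)."""

    def __init__(self, status_provider, page_builder):
        self._status_provider = status_provider
        self._page_builder = page_builder

    def on_first_input(self):
        # Generate the page from the *current* status only when the first
        # input arrives, then display (here: return) it.
        status = self._status_provider()
        return self._page_builder(status)

item = WidgetPageItem(
    lambda: {"tags": {"time"}},                                # current status
    lambda s: [t + "_widget" for t in sorted(s["tags"])],      # page builder
)
page = item.on_first_input()
```

Building on demand keeps the page consistent with the latest status information, at the cost of work at input time; claim 7's pre-generation trades the other way.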
9. The method of claim 1, further comprising:
receiving a request to create a second widget page including a widget designated by a user;
generating the second widget page in response to the received request;
receiving a second input for the displayed widget page item; and
in response to the received second input, displaying the second widget page.
10. The method of claim 1, further comprising:
receiving a third input for an application icon displayed on a display unit of the terminal; and
in response to the received third input, displaying a third widget page including at least one widget provided by an application corresponding to the application icon.
11. A terminal comprising:
a display unit;
a user input unit; and
a controller configured to display, on the display unit, a widget page item for providing at least one widget page, to receive, via the user input unit, a first input for the displayed widget page item, and to display, on the display unit, a first widget page corresponding to the received first input,
wherein the controller generates the first widget page to include at least one widget related to status information of the terminal.
12. The terminal of claim 11, wherein the status information is a number of executions of each of a plurality of widgets, and wherein the first widget page includes, among the plurality of widgets, at least one widget whose number of executions is lower than a reference number.
13. The terminal of claim 11, wherein the controller creates a second widget page including a widget designated by the user and, in response to a second input for the widget page item, displays the second widget page on the display unit.
KR1020150111270A 2015-08-06 2015-08-06 Terminal and operating method thereof KR20170017413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150111270A KR20170017413A (en) 2015-08-06 2015-08-06 Terminal and operating method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150111270A KR20170017413A (en) 2015-08-06 2015-08-06 Terminal and operating method thereof

Publications (1)

Publication Number Publication Date
KR20170017413A true KR20170017413A (en) 2017-02-15

Family

ID=58112223

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150111270A KR20170017413A (en) 2015-08-06 2015-08-06 Terminal and operating method thereof

Country Status (1)

Country Link
KR (1) KR20170017413A (en)


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
E902 Notification of reason for refusal