KR20140130854A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same

Info

Publication number
KR20140130854A
Authority
KR
South Korea
Prior art keywords
application
input
memo
execution
mobile terminal
Application number
KR1020130049342A
Other languages
Korean (ko)
Inventor
정헌재
권윤미
이한나
최재호
정혜미
최지안
Original Assignee
LG Electronics Inc.
Application filed by LG Electronics Inc.
Priority to KR1020130049342A
Publication of KR20140130854A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 2201/00: Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M 2201/42: Graphical user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a mobile terminal and a control method thereof that allow the terminal to be used with greater convenience to the user. According to at least one embodiment of the present invention, the application running at the time a memo is input is linked with the input memo content, providing a browsing environment that improves user convenience when the stored memo content is later viewed.


Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a mobile terminal and a control method thereof that allow the terminal to be used with greater convenience to the user.

Terminals can be divided into mobile/portable terminals and stationary terminals depending on whether they can be moved. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

Such terminals have various functions; for example, they take the form of multimedia devices with complex functions such as capturing photos or videos, playing music or video files, gaming, and receiving broadcasts.

To support and enhance the functionality of such terminals, improvements to the structural and/or software aspects of the terminal may be considered.

Recently, as the smartphone market has rapidly expanded, the number of smartphone users has also surged, and thanks to their various functions smartphones have come a step closer to the daily lives of their users.

Among the functions a smartphone can perform, the one closest to everyday life is the memo function. The amount of events and information occurring in everyday life keeps growing, and becomes enormous once information collected over the network is included. Accordingly, user interfaces that allow a large amount of information to be stored and surveyed at a glance have been researched.

The present invention provides a mobile terminal and a control method thereof capable of easily linking the application running at the time a memo is input with the memo content.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and that other embodiments are possible without departing from the spirit and scope of the invention as defined by the appended claims.

According to an aspect of the present invention, there is provided a mobile terminal including a display unit, a user input unit, and a control unit that outputs a memo layer so as to overlap the execution screen of an application and, upon receiving a predetermined input, outputs an execution object of the application on the memo layer.

In this case, upon receiving a storage input for the memo layer, the control unit may store the execution object of the application in association with the memo layer.

In addition, upon receiving a call-up input for the stored memo layer, the control unit may output the stored memo layer together with the execution object of the application stored in correspondence with it.

The control unit may execute the application upon receiving an input selecting the output execution object of the application.

Furthermore, the control unit may store execution state information of the application upon receiving an input leaving the execution screen of the application while the memo layer is superimposed on that execution screen.

In this case, upon receiving an input selecting the output execution object of the application, the control unit may execute the application so as to restore, based on the stored execution state information, the execution state the application had at the time the input leaving its execution screen was received.

When the application is a photo gallery application, the execution state information may be identification information of the photo content being viewed at the time the input leaving the execution screen was received, and upon receiving an input selecting the execution object of the application, the control unit may output the photo content corresponding to the stored identification information on the photo gallery execution screen.

Alternatively, when the application is a call application, the execution state information may be the phone number information entered at the time the input leaving the execution screen was received, and upon receiving an input selecting the output execution object, the control unit may execute the call application with the stored phone number already entered.

The execution state information may include at least one of information on a function executed by the application, text information output by the application, content identification information, and menu object identification information.
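As a purely illustrative sketch (not part of the patent disclosure; all class, field, and function names are hypothetical), the kinds of execution state information listed above could be modeled as a small record that an application fills in when the user leaves its screen and reads back when its execution object is later selected:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExecutionState:
    """Snapshot of an application's state, per the categories in the claim."""
    app_id: str
    function_info: Optional[str] = None   # e.g. "viewing", "composing"
    text_info: Optional[str] = None       # text the application was outputting
    content_id: Optional[str] = None      # e.g. id of the photo being viewed
    menu_object_id: Optional[str] = None  # id of the selected menu object

class PhotoGalleryApp:
    def __init__(self):
        self.current_photo: Optional[str] = None

    def capture_state(self) -> ExecutionState:
        # Called when an input leaving the execution screen is received.
        return ExecutionState(app_id="gallery", function_info="viewing",
                              content_id=self.current_photo)

    def restore(self, state: ExecutionState) -> None:
        # Called when the execution object on the memo layer is selected.
        self.current_photo = state.content_id

gallery = PhotoGalleryApp()
gallery.current_photo = "IMG_0042"
saved = gallery.capture_state()   # user leaves the screen; state is stored

gallery.current_photo = None      # the application later loses its state
gallery.restore(saved)            # selecting the execution object restores it
print(gallery.current_photo)      # IMG_0042
```

The same record shape would cover the call-application case by storing the entered phone number in, for example, the text information field.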

Further, the control unit may receive an input selecting a partial area of the execution screen of the application and, upon receiving an input selecting the output execution object of the application, may further output a captured image of the selected partial area.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, including outputting, through a display unit, a memo layer so as to overlap the execution screen of an application, and outputting an execution object of the application on the memo layer upon receiving a predetermined input.
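The claimed control flow above can be sketched, under the same caveat that this is a hypothetical illustration rather than the patent's implementation, as a small controller that overlays a memo layer, places an execution object on it on a predetermined input, stores the pair in association, and re-outputs both on a call-up input:

```python
class MemoLayer:
    def __init__(self, app_name):
        self.strokes = []              # handwritten memo content
        self.execution_object = None   # icon linking back to the application
        self.source_app = app_name

class MemoController:
    def __init__(self):
        self.stored_layers = []

    def open_memo_layer(self, running_app: str) -> MemoLayer:
        # The layer is overlapped on the running application's screen.
        return MemoLayer(running_app)

    def on_predetermined_input(self, layer: MemoLayer) -> None:
        # Output an execution object of the application on the memo layer.
        layer.execution_object = layer.source_app

    def on_storage_input(self, layer: MemoLayer) -> None:
        # Store the execution object in association with the memo layer.
        self.stored_layers.append(layer)

    def on_call_input(self, index: int) -> MemoLayer:
        # Re-output the stored layer together with its execution object.
        return self.stored_layers[index]

ctrl = MemoController()
layer = ctrl.open_memo_layer("web_browser")
layer.strokes.append("buy milk")
ctrl.on_predetermined_input(layer)
ctrl.on_storage_input(layer)
recalled = ctrl.on_call_input(0)
print(recalled.strokes, recalled.execution_object)  # ['buy milk'] web_browser
```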

The effects of the mobile terminal and the control method according to the present invention are as follows.

According to at least one embodiment of the present invention, there is an advantage that a memo input on the execution screen of an application can be browsed together with that execution screen.

Further, according to at least one embodiment of the present invention, when browsing stored memo content, the state of the application that was running at the time the memo content was input can be browsed together with it.

The effects obtainable by the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a diagram showing an example of a state diagram of an application memo mode by a general method.
FIG. 4 is a diagram showing an example of a state diagram of a quick memo mode by a general method.
FIG. 5 is a diagram illustrating an example of a method of linking an application with memo content according to an embodiment of the present invention.
FIGS. 6 to 8 illustrate a method of linking memo contents by object according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating an example of a memo management manager capable of managing memo content on which a stamp operation has been performed according to an embodiment of the present invention.
FIG. 10 illustrates an example of a method of managing and copying memo content in the memo manager according to an embodiment of the present invention.
FIG. 11 illustrates an example of memo content linked with another application when memo content is stored on a screen receiving a call signal according to an embodiment of the present invention.
FIG. 12 shows another example of memo content linked with another application when memo content is stored on a screen receiving a call signal according to an embodiment of the present invention.
FIG. 13 illustrates an example of memo content linked with another application when memo content is stored on a screen displaying a received message according to an embodiment of the present invention.
FIG. 14 is a diagram showing an example of storing an execution object of an application in correspondence with a memo layer according to an embodiment of the present invention.
FIG. 15 is a diagram showing another example of storing an execution object of an application in a memo layer according to an embodiment of the present invention.
FIG. 16 shows two examples of execution screens of an application that can be output when an application execution object is selected according to an embodiment of the present invention.
FIG. 17 is a diagram showing another example of storing an execution object of an application in correspondence with a memo layer according to an embodiment of the present invention.
FIG. 18 is a diagram showing another example of storing an execution object of an application in a memo layer according to an embodiment of the present invention.
FIG. 19 is a diagram illustrating an example of a user interface for assisting the input of a memo according to an embodiment of the present invention.
FIG. 20 illustrates an example of a method of saving the execution state of an application when memo content is stored in a web browser application according to an embodiment of the present invention.
FIG. 21 is a diagram illustrating an example of a method of easily managing a plurality of memo layers according to an embodiment of the present invention.
FIG. 22 is a diagram illustrating an example of a method of controlling a memo layer icon according to an embodiment of the present invention.
FIG. 23 is a diagram illustrating another example of a method of controlling a memo layer icon according to an embodiment of the present invention.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be understood by those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV or a desktop computer, except for configurations applicable only to a mobile terminal.

1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules 111 may be provided in the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information means information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms. For example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may also be adapted to broadcast systems other than the digital broadcast systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network such as, but not limited to, Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), or Wideband CDMA (WCDMA). The wireless signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module 113 is a module for wireless Internet access, and may be built into or externally attached to the mobile terminal 100. Wireless Internet technologies that may be used include, but are not limited to, WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), GSM, CDMA, WCDMA, and LTE (Long Term Evolution).

From the viewpoint that wireless Internet access by WiBro, HSDPA, GSM, CDMA, WCDMA, LTE, and the like is performed through a mobile communication network, the wireless Internet module 113 that performs wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 112.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The position information module 115 is a module for obtaining the position of the mobile terminal, a representative example of which is the Global Positioning System (GPS) module. According to current technology, the position information module 115 calculates distance information and accurate time information from three or more satellites and then applies triangulation to the calculated information, so that three-dimensional current position information according to latitude, longitude, and altitude can be accurately calculated. At present, a method of calculating position and time information using three satellites and correcting the error of the calculated position and time information using one more satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position in real time.
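The triangulation step described above can be illustrated with a simplified two-dimensional sketch (the real GPS computation is three-dimensional and must also solve for the receiver clock error; this toy example, with hypothetical names, only shows how known positions plus measured distances pin down a location):

```python
import math

def trilaterate(p1, p2, p3, d1, d2, d3):
    """2-D trilateration: find the point at distances d1, d2, d3
    from the three known reference points p1, p2, p3."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise removes the quadratic
    # terms and leaves a 2x2 linear system in (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1   # nonzero when the references are not collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# A receiver actually at (3, 4), with distances measured to three references:
refs = [(0, 0), (10, 0), (0, 10)]
dists = [math.dist((3, 4), r) for r in refs]
print(trilaterate(*refs, *dists))  # approximately (3.0, 4.0)
```

In the real system, a fourth satellite supplies the extra equation used to estimate and correct the receiver's clock error, which is the error-correction role the paragraph above mentions.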

Referring to FIG. 1, the A/V (audio/video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video call mode or the photographing mode. The processed image frames may be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the usage environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise-removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal, and the acceleration/deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 supplies power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141.

The output unit 150 is for generating output related to sight, hearing, or touch, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, it displays a user interface (UI) or graphic user interface (GUI) associated with the call. When the mobile terminal 100 is in the video call mode or the photographing mode, it displays the captured and/or received images, or the UI and GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be of a transparent or light-transmissive type so that the outside can be seen through them. Such a display may be called a transparent display, a typical example of which is the TOLED (transparent OLED). The rear structure of the display unit 151 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another, or may be disposed integrally with one another, and may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 can be used as an input device as well as an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151, or a change in capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area but also the pressure at the time of the touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
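The signal path just described (touch sensor, then touch controller, then controller 180) can be sketched as follows; this is an illustrative model only, with hypothetical names and screen regions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class TouchSignal:
    """Raw output of the touch sensor: position, area, and pressure."""
    x: int
    y: int
    area: int        # contact area in pixels
    pressure: float  # normalized 0.0 .. 1.0

class TouchController:
    def process(self, signal: TouchSignal) -> dict:
        # Packages the raw signal as data for the main controller.
        return {"pos": (signal.x, signal.y),
                "area": signal.area, "pressure": signal.pressure}

class Controller180:
    # Named display regions stand in for on-screen widgets.
    REGIONS = {"status_bar": (0, 0, 480, 40), "content": (0, 40, 480, 800)}

    def on_touch(self, data: dict) -> str:
        # Resolves which area of the display unit was touched.
        x, y = data["pos"]
        for name, (left, top, right, bottom) in self.REGIONS.items():
            if left <= x < right and top <= y < bottom:
                return name
        return "outside"

touch = TouchSignal(x=120, y=300, area=14, pressure=0.6)
data = TouchController().process(touch)
print(Controller180().on_touch(data))  # content
```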

The proximity sensor 141 may be disposed in an inner region of the mobile terminal covered by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object in the vicinity, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan than a contact-type sensor, and its utility is also higher.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of the pointer by the change in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen without contact so that the pointer is recognized as positioned on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch of the pointer on the touch screen is the position at which the pointer corresponds vertically to the touch screen during the proximity touch.
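The distinction drawn above can be summarized in a small sketch (hypothetical function and threshold, purely for illustration): a pointer at height zero registers a contact touch, a pointer hovering within the sensor's range registers a proximity touch, and in both cases the reported position is the vertical projection onto the screen.

```python
def classify_touch(x, y, height_mm, sense_range_mm=10.0):
    """Classify a pointer event by its height above the touch screen."""
    if height_mm == 0:
        kind = "contact touch"
    elif height_mm <= sense_range_mm:
        kind = "proximity touch"
    else:
        return None  # pointer is out of the proximity sensor's range
    # The reported position is the vertical projection onto the screen.
    return {"kind": kind, "pos": (x, y)}

print(classify_touch(100, 200, 0))     # {'kind': 'contact touch', 'pos': (100, 200)}
print(classify_touch(100, 200, 4.5))   # {'kind': 'proximity touch', 'pos': (100, 200)}
print(classify_touch(100, 200, 25.0))  # None
```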

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, by vibration. Since the video or audio signal can also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be regarded as a kind of alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 are controllable. For example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet port or suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store programs for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a phonebook, messages, audio, still images, and moving images). The memory 160 may also store the frequency of use of each item of data (e.g., the frequency of use of each phone number, each message, and each piece of multimedia).

In addition, the memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip storing various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter, an "identification device") may be manufactured in a smart card format. Therefore, the identification device can be connected to the mobile terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user on the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal; for example, it performs related control and processing for voice calls, data communication, video calls, and the like. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the control unit 180 or may be implemented separately from the control unit 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units designed to perform the functions described herein. In some cases, the embodiments described herein may be implemented by the control unit 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the control unit 180.

FIG. 2A is a front perspective view of an example of a mobile terminal or a portable terminal according to the present invention.

The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto and can be applied to various structures, such as slide, folder, swing, and swivel types, in which two or more bodies are coupled so as to be movable relative to each other.

The body includes a case (casing, housing, cover, and the like) that forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection-molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

A display unit 151, an audio output unit 152, a camera 121, user input units 130 (131 and 132), a microphone 122, an interface unit 170, and the like are disposed in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The audio output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface unit 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the mobile terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion.

The content input by the first or second operation unit 131 or 132 may be set in various ways. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the audio output unit 152 or switching the display unit 151 into a touch recognition mode.

FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may additionally be mounted on the rear surface of the terminal body, that is, on the rear case 102. The camera 121' may have a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A), and may have the same pixel count as the camera 121 or a different one.

For example, the camera 121 preferably has a low pixel count, since it captures an image of the user's face to be transmitted to the other party during a video call or the like, while the camera 121' preferably has a high pixel count, since it typically photographs a general subject that is not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed with the camera 121'. The mirror 124 allows the user to view his or her own face or the like when taking a self-portrait using the camera 121'.

An audio output unit 152' may be additionally disposed on the rear surface of the terminal body. The audio output unit 152' may implement a stereo function together with the audio output unit 152 (see FIG. 2A) and may be used to implement a speakerphone mode during a call.

In addition to an antenna for calls, a broadcast signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be retractable from the terminal body.

A power supply unit 190 for supplying power to the mobile terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing touch. The touch pad 135 may be of a light-transmitting type, like the display unit 151. In this case, if the display unit 151 is configured to output visual information on both sides (i.e., in the directions of both the front and rear of the mobile terminal 100), the visual information can also be recognized through the touch pad 135. The information output on both sides may be entirely controlled by the touch pad 135.

Meanwhile, a display dedicated to the touch pad 135 may be separately installed, so that a touch screen may be disposed in the rear case 102 as well.

The touch pad 135 operates in correlation with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to and behind the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

The mobile terminal 100 may support various memo modes or functions in order to satisfy various demands of the user.

For example, the mobile terminal 100 may store a memo composed simply of text, and may also support multimedia memos that store a photograph, a moving image, URL (uniform resource locator) information of a web page, a voice recording, or the like. Furthermore, in the embodiments of the present invention described below, it is proposed that application information at the time of inputting a memo be stored together with the memo.

The memo function may operate in an application memo mode, in which a memo is stored by executing a memo application, and a quick memo mode, in which a memo can be taken in any state of the mobile terminal 100. The difference between the two modes will be described below with reference to FIG. 3 and FIG. 4.

FIG. 3 is a diagram showing an example of a state diagram of the application memo mode according to a general method.

FIG. 3(a) is a diagram showing an example of a state diagram in which the memo application of the mobile terminal 100 is executed. Referring to FIG. 3(a), when the memo application is executed, the control unit 180 outputs at least one notebook icon 301-1 to 301-4 to the display unit 151. Here, a notebook is a set of at least one memo sheet, that is, a tool for collectively managing a plurality of memo sheets. A memo sheet refers to a virtual blank sheet provided to the user for taking notes, or a sheet on which a note has been written.

Referring to the state diagram of FIG. 3(a), four notebook icons 301-1 to 301-4 are output, but the user may wish to create a new notebook in addition to those being output. To this end, the control unit 180 may output an icon 302 for creating a new notebook (hereinafter referred to as the new notebook icon) on the display unit 151. In response to an input of the user touching the new notebook icon 302, the controller 180 may output a new notebook icon 301-5 as shown in FIG. 3(b).

When the user wishes to add, edit, or delete contents of a new notebook or an existing notebook, the user can touch the notebook's icon 301 to enter a screen for editing that notebook. For example, when the user touches the new notebook icon 301-5 as shown in FIG. 3(b), the controller 180 can provide a screen for editing the new notebook in response to the input (FIG. 3(c)).

The state diagram in FIG. 3(c) is a general notebook editing screen, and explanations of icons and functions that are not necessary for the description of the present invention will be omitted.

The notebook editing screen of FIG. 3(c) outputs one of the memo sheets included in the notebook. In this case, the output memo sheet may be the memo sheet located at the beginning of the notebook or the most recently edited memo sheet. The indicator 304 may include the number of memo sheets included in the currently edited notebook and the page number of the memo sheet currently being edited. For example, as indicated by reference numeral 304, "4/7" means that the total number of memo sheets included in the notebook is 7 and the page number of the currently edited memo sheet is 4.

To allow movement between memo sheets, the control unit 180 may output navigation icons 303-1 and 303-2. Accordingly, the user can touch the left navigation icon 303-1 to move to the memo sheet preceding the currently edited memo sheet, and the right navigation icon 303-2 to move to the memo sheet following it.
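The sheet navigation and the "4/7"-style indicator of reference numeral 304 can be sketched as a small model. The class and method names below are illustrative, not taken from the disclosure:

```python
class MemoSheetPager:
    """Hypothetical model of the memo-sheet navigation (icons 303-1 and
    303-2) and the page indicator 304 described above."""

    def __init__(self, total_sheets, current=1):
        self.total = total_sheets
        self.current = current  # 1-based page number of the edited sheet

    def indicator(self):
        # e.g. "4/7": sheet 4 is edited out of 7 sheets in the notebook
        return f"{self.current}/{self.total}"

    def prev(self):
        # left navigation icon 303-1: move to the preceding memo sheet
        if self.current > 1:
            self.current -= 1
        return self.current

    def next(self):
        # right navigation icon 303-2: move to the following memo sheet
        if self.current < self.total:
            self.current += 1
        return self.current


pager = MemoSheetPager(total_sheets=7, current=4)
print(pager.indicator())  # "4/7", as in the example for reference numeral 304
pager.prev()
print(pager.indicator())  # "3/7"
```

The pager simply clamps at the first and last sheet; the disclosure does not specify wrap-around behavior, so none is assumed here.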

There are many ways to enter or add content to a memo sheet. For example, when taking a note on the memo sheet, the user may input handwriting using a pointer such as a finger or a touch pen (see reference numeral 306). Hereinafter, an object input in this handwriting format is referred to as a handwritten memo object 306. As another example, when taking a note on a memo sheet, the user can input the note in a text format using a virtual keypad (or a keyboard that can communicate with the mobile terminal 100) (see reference numeral 307). Hereinafter, content input in the text format is referred to as a typed memo object 307.

In connection with the memo function described above, the memo sheet may include photographs, moving pictures, or voice recordings. When the user touches the content attachment icon 305 included in the memo sheet, the user can receive a list of contents that can be included in the memo sheet. If one item in the list is selected, the user can attach that content to the memo sheet. A detailed description of content attachment will be omitted for clarity of the specification.

As described above with reference to FIG. 3, in general, when taking a memo using a memo application (the application memo mode), memo sheets can be managed efficiently through the notebook management method.

In addition to the application memo mode, the mobile terminal 100 can support a quick memo mode. Hereinafter, the quick memo mode will be described with reference to FIG. 4. The quick memo mode refers to a mode in which, when a memo sheet is created, no virtual blank memo sheet is generated; instead, an image captured from the screen of the mobile terminal 100 is used as the memo sheet. Alternatively, the quick memo mode may refer to a mode in which, when a memo sheet is created, a transparent or semi-transparent virtual blank memo sheet (hereinafter referred to as a memo layer) is generated, so that a memo can be input over a running application or over the home screen (standby screen state).

FIG. 4 is a diagram showing an example of a state diagram of the quick memo mode according to a general method. FIG. 4(a) is a view showing the home screen state of the mobile terminal 100. According to an embodiment of the present invention, the user can detach a pen provided in the mobile terminal 100 to enter the quick memo mode. That is, the controller 180 senses that the pen has been detached and enters the quick memo mode in response. Alternatively, the user can press a quick memo button to enter the quick memo mode. In this case, the quick memo button may be provided as hardware on the mobile terminal 100, or may be provided in the form of an icon on the screen of the mobile terminal 100 and pressed through the touch screen. As yet another alternative, the quick memo mode may be entered when a predetermined touch gesture is performed on the touch screen.

FIG. 4(b) is a diagram showing a state in which the terminal has entered the quick memo mode. Referring to FIG. 4(b), the control unit 180 continues to output the home screen of FIG. 4(a), and outputs a transparent blank memo layer 402 over the home screen together with a memo menu 401 for inputting a memo.

Each of the menu icons included in the memo menu 401 is generally the same as or similar to the icons provided in a typical memo input environment, and explanations of icons not related to the embodiments of the present invention will be omitted. In the detailed description below, icons associated with embodiments of the present invention will be described together with the respective embodiments.

Furthermore, in the quick memo mode, the control unit 180 can output the memo layer 402 not only on the home screen but also on any menu or any application execution screen of the mobile terminal, at the moment a user command for entering the quick memo mode is input.

The user can create a memo (for example, a handwritten memo object) on the transparent memo layer 402 as shown in FIG. 4(b) (FIG. 4(c)) and store the memo layer including the memo.

When a storage input 1000-1 for the memo layer is detected in the quick memo mode, the control unit 180 according to an embodiment of the present invention can store the memo layer. The control unit 180 can then also output a notebook icon 301-6 for the stored memo layer on the execution screen of the memo application (see FIG. 4(d)).

Meanwhile, in the quick memo mode according to an embodiment of the present invention, the control unit 180 may capture the screen as it is when storing the memo and store the captured image. That is, at the moment of saving, the execution screen of the application is output as the background, the memo layer 402 is output so as to overlap the execution screen, and the memo content input to the memo layer 402 is output as well; a screenshot capturing this entire output screen, including the memo content, can be stored as a single memo.
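The "store a screenshot as a single memo" behavior amounts to flattening the memo layer onto the application's screen image: wherever the transparent memo layer has ink, the ink is kept; elsewhere the application screen shows through. A minimal sketch, with character grids standing in for pixel buffers and all names illustrative:

```python
def flatten_screenshot(app_screen, memo_layer, transparent=" "):
    """Compose the memo layer over the application's execution screen,
    producing the single flattened image stored as the memo."""
    rows = []
    for app_row, memo_row in zip(app_screen, memo_layer):
        rows.append("".join(
            m if m != transparent else a   # memo ink wins; else app pixel
            for a, m in zip(app_row, memo_row)))
    return rows


app = ["ALARM 07:00", "ALARM 09:30"]
memo = ["           ",   # no ink on the first row
        "  meds!    "]   # handwritten note over the second alarm
print(flatten_screenshot(app, memo))  # second row becomes "ALmeds!9:30"
```

Unlike the per-layer storage described later, this flattened form discards the separation between memo and application: the result is one image, which matches the screenshot-style storage described in this paragraph.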

When the controller 180 detects an input 1000-2 selecting the output notebook icon 301-6, as described with reference to FIG. 3, the controller 180 can output the stored memo layer to the user. In the example of FIG. 4(d), upon receiving an input touching the notebook icon 301-6, the controller 180 can output a memo browsing screen identical to the screen of FIG. 4(c).

The application memo mode and the quick memo mode have been described with reference to FIGS. 3 and 4. Although the following description of embodiments of the present invention may refer to only one of the two modes, it is apparent that the embodiments can be applied to both modes.

Hereinafter, embodiments related to a control method that can be implemented in the mobile terminal configured as above will be described with reference to the accompanying drawings.

FIG. 5 is a diagram illustrating an example of a method of linking an application with memo contents according to an embodiment of the present invention.

According to an embodiment of the present invention, interworking between memo contents and an application can be defined in two forms.

In the first form, when the control unit 180 outputs the execution screen of an application, the memo contents linked to that execution screen are output together. This first form will be described in detail with reference to FIGS. 5 to 13.

In the second form, when the control unit 180 reads the memo contents (or executes the memo application), the application linked with the memo contents is automatically executed, or a means for immediately executing it can be provided. The second form will be described in detail with reference to FIGS. 14 to 18.

Referring to FIG. 5(a), the control unit 180 outputs a state diagram of an alarm application through the display unit 151; the case forming the appearance of the mobile terminal 100 is omitted from the drawing for convenience and simplicity of explanation.

FIG. 5(b) is a diagram showing a state in which the user has entered the quick memo mode while the alarm application is being executed. In the example of FIG. 5(b), the user detaches the pen from the mobile terminal 100 to enter the quick memo mode. Referring to FIG. 5(b), in the quick memo mode, the control unit 180 can output the memo layer 402 and the memo menu 401 over the execution screen of the application.

According to an embodiment of the present invention, the control unit 180 provides a stamp icon 501 in the memo menu 401 in order to link the application with the memo contents. The stamp icon 501 is an icon provided for storing the memo contents on the execution screen of the application. That is, when an input touching the stamp icon 501 is detected while memo content has been input to the memo layer 402, the control unit 180 stores the memo content in correspondence with the execution screen of the application. In the example of FIG. 5, the control unit 180 outputs the memo layer 402 overlapping the execution screen of the application; when an input selecting the stamp icon 501 is detected, the memo contents input to the memo layer 402 are transferred to and stored on the execution screen of the overlapped application.

The embodiment described above with reference to FIG. 5 corresponds to the first form of linking memo contents with an application. The fact that the memo contents are transferred to and stored on the execution screen of the application means that whenever the application alone is executed, the memo contents stored with the execution screen are output as well. Therefore, when executing the application, the control unit 180 outputs the execution screen of the application and, at the same time, outputs the memo contents stored for that execution screen at the positions they occupied when they were stored (FIG. 5(d)). The user can thus write a memo directly on the execution screen of an application and view that memo every time the application is executed.
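The stamp behavior described above, where memo contents are persisted against an application and re-output whenever that application runs, can be sketched as a simple per-application store. All names and the data shape are hypothetical, used only to show the store-and-replay idea:

```python
# Hypothetical per-application memo store for the stamp operation:
# stamping saves the memo layer's contents against the application, and
# launching the application re-outputs them at their stored positions.
stamped_memos = {}  # application id -> list of ((x, y) position, memo text)


def stamp(app_id, memo_contents):
    """Stamp icon 501: transfer the memo layer's contents onto the
    execution screen of the given application."""
    stamped_memos.setdefault(app_id, []).extend(memo_contents)


def launch(app_id):
    """Return what is output when the application runs: its own screen
    plus any memo contents previously stamped onto it."""
    return {"screen": app_id, "memos": stamped_memos.get(app_id, [])}


stamp("alarm", [((10, 40), "7:00 wake up"), ((10, 80), "morning medication")])
print(launch("alarm")["memos"])     # both stamped memos are output again
print(launch("calendar")["memos"])  # nothing stamped on this application
```

In the disclosure the stored positions are screen coordinates on the execution screen; here a bare tuple stands in for them.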

The operation of the stamp icon 501 described with reference to FIGS. 5(a) to 5(c) will be referred to as the stamp operation in the detailed description below.

FIG. 5(d) is a diagram showing a state in which an execution screen of an application is output together with the memo contents stored in correspondence with that execution screen, according to an embodiment of the present invention. In FIG. 5(c), an input touching the stamp icon 501 is received, and the control unit 180 stores the memo contents in association with the execution screen of the application. Thereafter, when the alarm application is executed, the control unit 180 outputs the corresponding memo contents 501-1 and 501-2.

On the other hand, upon receiving the storage input 1000-1 in the quick memo mode shown in FIG. 5(c), the control unit 180 can store, as a single memo, a screenshot capturing the execution screen of the application together with the memo layer 402, as described above.

Further, in one embodiment of the present invention, it is proposed to link memo contents on the execution screen of an application on a per-object basis. This embodiment will be described with reference to FIGS. 6 to 8.

FIGS. 6 to 8 illustrate a method of linking memo contents by object according to an embodiment of the present invention. FIG. 6 is a conceptual diagram for explaining the operation of the application and the operation of the memo layer 402 when the stamp operation is performed.

When the stamp operation is performed as described above, the memo contents can be stored in correspondence with the execution screen of the application. When the memo contents are stored in this way, the controller 180 in an embodiment of the present invention uses a layer structure as shown in FIG. 6 to manage the execution screen of the application (the application layer) and the memo layer 402 separately. That is, the memo layer 402 stores and manages the memo contents, while the application layer manages the output of the application itself. The control unit 180 may thus manage the memo contents and the application objects independently through separate layers, so that the memo layer can later be edited or deleted. Hereinafter, with reference to FIG. 7, a specific example of linking memo contents on an object-by-object basis on the execution screen of an application will be described.
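The two-layer management of FIG. 6 can be sketched as two independent objects, an application layer and a memo layer, composed only at output time, so the memo layer can later be edited or deleted without touching the application. Class names are illustrative:

```python
class ApplicationLayer:
    """Manages the output of the application itself."""
    def __init__(self, objects):
        self.objects = objects          # the application's own output objects


class MemoLayer:
    """Stores and manages the memo contents, independently of the app."""
    def __init__(self):
        self.contents = []

    def add(self, memo):
        self.contents.append(memo)


def compose(app_layer, memo_layer):
    # The layers remain independent; they are only stacked at output
    # time, with the memo layer drawn on top of the application layer.
    return app_layer.objects + memo_layer.contents


app = ApplicationLayer(["alarm 7:00", "alarm 9:30"])
memo = MemoLayer()
memo.add("morning medication")
print(compose(app, memo))
memo.contents.clear()        # the memo layer can be deleted later...
print(compose(app, memo))    # ...without affecting the application layer
```

The point of the separation is the last two lines: clearing the memo layer leaves the application layer's output untouched, which is what makes later editing or deleting of memos possible.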

FIG. 7(a) shows an execution screen of an alarm application in which two alarms are set. The two set alarms are output as a first alarm object 701-1 and a second alarm object 701-2, respectively. In particular, a memo 702-1 reading "morning medication" is stored for the second alarm object 701-2 by the stamp operation according to the embodiment of the present invention described above. This memo 702-1 was created by the user in association with the second alarm object 701-2. Therefore, when the position of the second alarm object 701-2 is changed, or the object is deleted, according to a setting of the application itself, a mode change, or the like, the memo associated with the second alarm object 701-2 should also be changed or deleted accordingly. This is because the memo was created by the user in association with that alarm.

Therefore, in one embodiment of the present invention, it is proposed that a created memo be linked with an object output by the application itself, and that when the position of the linked object is changed or the object is deleted, the same change be applied to the memo contents.

FIG. 7(b) shows an example in which the position of the second alarm object 701-2 is changed. Referring to FIG. 7(b), as a third alarm object 701-3 is added, the positions of the first alarm object 701-1 and the second alarm object 701-2 are each moved down by one slot. Accordingly, the control unit 180 changes the position of the second alarm object 701-2 and, at the same time, changes the position of the memo content linked with the second alarm object 701-2.
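The repositioning behavior can be sketched as applying the linked object's displacement to its memo. The field names and coordinates below are hypothetical:

```python
def move_object(obj, memos_by_object, dy):
    """When an application object moves, apply the same displacement
    to any memo content linked with it, as in FIG. 7(b)."""
    obj["y"] += dy
    for memo in memos_by_object.get(obj["id"], []):
        memo["y"] += dy   # the linked memo follows the object


alarm2 = {"id": "alarm-2", "y": 100}
links = {"alarm-2": [{"text": "morning medication", "y": 110}]}

# A newly added alarm pushes the object down one slot (assume 60 px):
move_object(alarm2, links, dy=60)
print(alarm2["y"], links["alarm-2"][0]["y"])  # 160 170
```

Keeping the memo's offset relative to its object constant (here, +10) is what preserves the visual association after the layout shifts.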

Meanwhile, the linking of individual memo contents with an object can be determined based on the position where the memo contents are input. For example, the memo 702-1 ("morning medication") shown in FIG. 7 is output at the position where the second alarm object 701-2 is output. Accordingly, the control unit 180 can link the two based on the position of the memo contents 702-1 and the position of the second alarm object 701-2.

FIG. 8 shows an example in which, when an object is deleted by a user's operation, the memo contents linked with that object are also deleted, according to an embodiment of the present invention.

In the output screen of the application shown in FIG. 8(a), alarm objects 1 to 3 (701-1 to 701-3) are output, and memo contents are stored in correspondence with the output screen of the application.

The memo contents 702-2 cover an area 801 that spans the alarm objects 1 and 3 (701-1 and 701-3). In such a case, it may be ambiguous which alarm object the memo contents should be linked with. In one embodiment of the present invention, it is proposed that a partial area 802 of the area 801 occupied by the memo contents 702-2 be defined as a reference, and that the memo contents be linked with the alarm object 701-3 corresponding to the position occupied by this reference area 802.

Accordingly, the control unit 180 links the memo contents 702-2 with the third alarm object 701-3, which is output at the position corresponding to the position occupied by the partial area 802 of the memo contents 702-2.
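The reference-area rule can be sketched as an overlap test: among the application's objects, link the memo with the one whose area the reference sub-area 802 overlaps the most. Rectangles are given as (left, top, right, bottom); all concrete values are illustrative:

```python
def overlap_area(a, b):
    """Intersection area of two (left, top, right, bottom) rectangles."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0, w) * max(0, h)


def link_memo(reference_area, objects):
    """Link the memo with the object whose area the reference sub-area
    (802) overlaps the most, as proposed for FIG. 8(a)."""
    best = max(objects, key=lambda o: overlap_area(reference_area, o["rect"]))
    return best["id"] if overlap_area(reference_area, best["rect"]) > 0 else None


alarms = [
    {"id": "alarm-1", "rect": (0, 0, 100, 40)},
    {"id": "alarm-3", "rect": (0, 80, 100, 120)},
]
# The memo's reference sub-area falls inside alarm-3's row:
print(link_memo((10, 90, 60, 115), alarms))  # "alarm-3"
```

With this linking in place, deleting the object can cascade to the memo, as FIG. 8(b) describes: the controller looks up the memo by the linked object's id and removes both.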

FIG. 8(b) shows a state in which the third alarm object 701-3 has been deleted by the user's operation. Referring to FIG. 8(b), when deleting the third alarm object 701-3, the controller 180 may also delete the memo contents 702-2 linked with the third alarm object 701-3.

Meanwhile, when the stamp operation according to an embodiment of the present invention is performed, the memo contents are output together on the execution screen of the application, as described above. However, a user interface for collectively managing the memos for which a stamp operation has been performed may also be required. Referring to FIG. 9, a memo management manager for collectively managing such memos will be described.

FIG. 9 is a diagram illustrating an example of a memo management manager capable of managing memo contents for which a stamp operation has been performed, according to an embodiment of the present invention.

In the state diagram of FIG. 9(a), the control unit 180 outputs a memo screenshot comprising the execution screen of an application on which the stamp operation was performed together with the memo contents (see area 901 in the state diagram of FIG. 9(a)). This memo screenshot itself is identical to the state diagram that is output when the application is run. In the lower portion of area 901, the other memo screenshots managed by the memo management manager are superimposed on one another like stacked sheets of paper (see area 902).

When the controller 180 detects a predetermined touch input on the memo screenshot being output as shown in FIG. 9(a) (for example, a double touch 1000-4 within area 901), the controller 180 can switch the screen to the execution screen of the corresponding application.

Likewise, when the controller 180 detects a predetermined touch input on the memo screenshot of FIG. 9(a) (for example, an upward flick 1000-3 within area 901), it can switch from the memo screenshot being output to the next memo screenshot. The next memo screenshot may be one of the screenshots superimposed in area 902. In such a transition, the next memo screenshot can be output with an animation effect as if a sheet of paper were flipped over from the stack in area 902 (see FIGS. 9(b) and (c)).
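The manager's touch handling can be sketched as a small dispatcher: a double touch in area 901 launches the stamped application, and an upward flick advances to the next stacked screenshot. Gesture names and the data shape are hypothetical:

```python
def handle_gesture(manager, gesture):
    """Sketch of the memo management manager's inputs: a double touch
    (1000-4) opens the stamped application; an upward flick (1000-3)
    advances to the next stacked memo screenshot."""
    if gesture == "double_touch":
        return ("launch", manager["shots"][manager["index"]]["app"])
    if gesture == "flick_up":
        manager["index"] = (manager["index"] + 1) % len(manager["shots"])
        return ("show", manager["index"])
    return ("ignore", None)


mgr = {"index": 0, "shots": [{"app": "alarm"}, {"app": "calendar"}]}
print(handle_gesture(mgr, "flick_up"))      # next screenshot slides into view
print(handle_gesture(mgr, "double_touch"))  # opens the calendar application
```

Wrap-around at the end of the stack is an assumption here; the disclosure only states that the next screenshot comes from the stack in area 902.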

Hereinafter, a method for managing and copying memo contents in the memo management manager will be described with reference to FIG. 10.

FIG. 10 illustrates an example of a method for managing and copying memo contents in the memo management manager according to an embodiment of the present invention.

In the state diagram shown in FIG. 10(a), the memo management manager outputs a memo screenshot. The user may wish to copy the memo contents 702-3 ("card value redeemed") to another memo or another application. Accordingly, the control unit 180 in an embodiment of the present invention provides an icon 1001 for copying the memo contents.

When an input selecting the memo contents copy icon 1001 is detected while the memo screenshot is being viewed in the memo management manager, the control unit 180 outputs a memo layer 402 including the memo contents 702-3 (see FIG. 10(b)).

In the state of FIG. 10(b), the control unit 180 outputs the home screen with the memo layer 402 superimposed on it. Upon receiving an input for executing another application (the calendar application in FIG. 10), the control unit 180 keeps outputting the memo layer 402 and switches only the background from the home screen to the execution screen of the calendar application (see FIG. 10(c)). In this case, the memo contents 702-3 stored in the memo layer 402 are, of course, output as well. The handwriting input mode may be deactivated at this point, and the controller 180 can receive additional notes when the handwriting input mode is activated again.

In the state of FIG. 10(c) or (d), the control unit 180 can store the memo contents 702-3 in correspondence with the execution screen of the calendar application.

Furthermore, in an embodiment of the present invention, the size or position of the memo contents 702-3 themselves can be adjusted while the memo contents 702-3 are being edited. As shown in FIG. 10(c), when a predetermined gesture on the memo contents 702-3 is input, the controller 180 can move the position of the memo contents 702-3 in response to the input. An example of the predetermined gesture is an input of touching one area of the memo contents 702-3 and then dragging it to a desired position while maintaining the touch.

When a predetermined gesture on the memo contents 702-3 is input as shown in FIG. 10(d), the controller 180 can adjust the size of the memo contents 702-3 in response to the input. An example of the predetermined gesture is an input of touching one end of the memo contents 702-3 and then dragging it until the memo contents 702-3 reach the desired size while maintaining the touch.

Hereinafter, specific examples in which memo contents and the execution screen of an application are linked according to embodiments of the present invention will be described with reference to FIG. 11 and the subsequent figures.

FIG. 11 illustrates an example of memo contents linked with another application when the memo contents are stored on a screen receiving a call signal, according to an embodiment of the present invention.

FIG. 11(a) shows a state in which the memo contents 702-4 have been input through the quick memo mode while the call signal reception screen is being output, and the stamp icon 501 is selected. When an input 1000-3 selecting the stamp icon 501 is detected, the control unit 180 can store the memo contents 702-4 in correspondence with the call signal reception screen. Furthermore, when storing the memo contents 702-4, the control unit 180 identifies and stores the telephone number shown on the call signal reception screen. Thereafter, upon receiving a call signal from the identified telephone number, the stored memo contents 702-4 can be output together with the reception screen (see FIG. 11(b)). This is because a memo stored on the call signal reception screen is a memo about the originator of the call signal.

Meanwhile, in an embodiment of the present invention, it is proposed that this linking function of the memo contents 702-4 be further expanded.

Referring to FIG. 11 (c), the control unit 180 outputs an execution screen of the message application, in which the list of transmitted/received messages is sorted according to a certain criterion (for example, transmission/reception time). In an embodiment of the present invention, the memo contents 702-4 stored in correspondence with the telephone number identified on the call signal reception screen as in FIG. 11 (a) are output together with the message list of the message application (see FIG. 11 (c)).

That is, when the memo contents 702-4 are stored together with the telephone number 02-2033-XXXX identified in FIG. 11 (a), the controller 180 can display the stored memo contents 702-4 together with the entry for that telephone number when displaying the message list in the message application.
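The phone-number interlock described above can be sketched as a store keyed by the identified number. This is an illustrative Python sketch, not code from the patent; all names (MemoStore, memos_for) and the digits-only normalization are assumptions.

```python
class MemoStore:
    """Stores memo contents keyed by the telephone number identified on
    the call signal reception screen, so they can be re-output later when
    a call or message involves the same number."""

    def __init__(self):
        self._by_number = {}

    def store(self, phone_number, memo):
        # Identify and store the number together with the memo contents
        self._by_number.setdefault(self._normalize(phone_number), []).append(memo)

    def memos_for(self, phone_number):
        # Look up memos when rendering an incoming-call screen or message list
        return self._by_number.get(self._normalize(phone_number), [])

    @staticmethod
    def _normalize(number):
        # Compare digits-only so "02-2033-0000" matches "022033 0000"
        return "".join(ch for ch in number if ch.isalnum())
```

A plausible flow: `store()` is called when the stamp icon input is detected on the call reception screen, and `memos_for()` when the message list or a later incoming-call screen is drawn.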

FIG. 12 shows another example of memo contents linked with another application when the memo contents are stored on a call signal reception screen, according to an embodiment of the present invention.

FIG. 12 (a) shows a state in which the memo contents 702-5 are input through the quick memo mode while the call signal reception screen is being output and the stamp icon 501 is selected. When the input 1000-3 selecting the stamp icon 501 is detected, the control unit 180 can store the memo contents 702-5 in correspondence with the call signal reception screen. When storing them, the controller 180 identifies and stores the telephone number of the call signal reception screen in the same manner as described above with reference to FIG. 11. Furthermore, the controller 180 in an embodiment of the present invention recognizes the text of the stored memo contents 702-5, and it is proposed to interoperate with the calendar application when the recognized text includes a date. For example, the memo contents 702-5 stored in FIG. 12 include the date text "January 30". When the memo contents 702-5 are stored, the control unit 180 recognizes the date text and interlocks it with the corresponding date of the calendar application (see FIG. 12 (c)).

Accordingly, the control unit 180 executes the calendar application and can output the corresponding memo contents 702-5 on January 30, the date recognized from the memo contents 702-5, among the dates of the calendar application. In this case, the control unit 180 can omit the portion of the memo contents 702-5 indicating January 30 and output only the remaining memo contents. Furthermore, the control unit 180 may display the identified telephone number (or, if the incoming telephone number is stored in the contacts, the name 1201 stored in the contacts) together in the calendar application.
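The date-recognition step above can be sketched as follows. This is an assumed illustration: the English month-name pattern and the default year are choices made for the example, not details given in the patent.

```python
import re
from datetime import date

# Month names mapped to month numbers (illustrative assumption)
_MONTHS = {m: i + 1 for i, m in enumerate(
    ["January", "February", "March", "April", "May", "June",
     "July", "August", "September", "October", "November", "December"])}
_DATE_RE = re.compile(r"\b(" + "|".join(_MONTHS) + r")\s+(\d{1,2})\b")

def interlock_with_calendar(memo_text, year=2013):
    """Recognize date text such as 'January 30' in the memo contents.

    Returns (recognized_date, remaining_memo): the calendar date to attach
    the memo to, and the memo with the date text omitted, since the
    controller outputs only the remaining memo contents."""
    m = _DATE_RE.search(memo_text)
    if m is None:
        return None, memo_text
    d = date(year, _MONTHS[m.group(1)], int(m.group(2)))
    remaining = (memo_text[:m.start()] + memo_text[m.end():]).strip()
    return d, remaining
```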

FIG. 13 illustrates an example of memo contents linked with another application when the memo contents are stored on a screen displaying a received message, according to an embodiment of the present invention.

FIG. 13 (a) shows a state in which the memo contents 702-6 are input through the quick memo mode while the message reception screen is being output and the stamp icon 501 is selected. When the input 1000-3 selecting the stamp icon 501 is detected, the control unit 180 can store the memo contents 702-6 in correspondence with the message reception screen. When storing them, the control unit 180 identifies and stores the originating telephone number of the message reception screen in the same manner as described above with reference to FIGS. 11 and 12. Thereafter, the stored memo contents 702-6 can be output together only when a message is transmitted to or received from the identified telephone number (see FIG. 13 (b)).

If a pop-up notification window 1301 is output when a message is received in the message application, the control unit 180 according to an exemplary embodiment of the present invention can display the linked memo contents 702-6 together when outputting the pop-up notification window 1301 (see FIG. 13 (c)).

In the foregoing, embodiments of the first form according to an embodiment of the present invention have been described, in which an execution screen of an application is output together with the memo contents linked to that execution screen.

Hereinafter, examples of a second form according to an embodiment of the present invention will be described, in which an execution object is provided that automatically executes the application linked to the memo contents when the memo contents are browsed (or when the memo application is executed).

FIG. 14 is a diagram showing an example of storing an execution object of an application in correspondence with a memo layer according to an embodiment of the present invention.

FIG. 14 (a) shows an execution screen of a gallery application through which photographs or images stored in the mobile terminal 100 can be browsed. The memo layer 402 and the memo contents 702-7 included in the memo layer are output overlapping the execution screen of the gallery application of FIG. 14 (a).

As shown in FIG. 14 (a), when receiving an input that leaves the application execution screen while the memo layer 402 is output, the control unit 180 outputs an execution object of the application on the memo layer.

For example, when receiving an input for entering the home screen state (a home screen button input) in the state of FIG. 14 (a), the control unit 180 outputs the gallery application execution object 1401 on the memo layer (see FIG. 14 (b)). Meanwhile, the input that leaves the execution screen of the application is not limited to the input entering the home screen state described above. A switching input to another application running in the background through the multitasking function (execution in the background means a state in which the execution screen is not output through the display unit but the application is operating behind it) may also be an input that leaves the execution screen of the application according to an embodiment.

Upon receiving an input selecting the output execution object 1401, the control unit 180 can execute the application. The appearance of the output execution object 1401 matches that of the application's execution icon, so that the user can recognize the application from the form of the object alone. In addition, the output execution object 1401 can provide a touch item 1402 for deleting the object. When receiving an input selecting the touch item 1402, the control unit 180 can delete the execution object 1401 of the application from the memo layer 402.

The control unit 180 can store the execution object 1401 of the application in association with the memo layer 402 upon receiving the storage input 1000-1 of the memo layer 402.

Upon receiving the input 1000-3 selecting the stamp icon 501 for the memo layer 402, the control unit 180 can store the memo contents 702-7 and the execution object 1401 of the application in association with the home screen. Accordingly, when the home screen is output thereafter, the control unit 180 outputs the memo contents 702-7 and the application execution object 1401 together (see FIG. 14 (c)).

It goes without saying that the memo screen shot stored by the stamp operation in FIG. 14 can also be browsed through the memo management manager described above with reference to FIG. 9.

Meanwhile, when the user selects the execution object 1401 of the application while browsing the stored memo contents, the control unit 180 can execute the corresponding application.

Furthermore, the control unit according to an embodiment of the present invention proposes not only to execute the corresponding application, but to execute it in the application state at the time the application execution object 1401 was generated. That is, the control unit 180 can store the execution state information of the application at the time of generating the application execution object 1401 in association with the execution object 1401. Here, the application state at the time of creation may include the entry state of a menu within the application, the kind of content being browsed, and the output state of memo contents stored on the application execution screen. In this case, the execution state information of the application may include at least one of function information, text information output by the application, content identification information, menu object identification information, and execution object identification information.

The text information indicates any text being output on the application screen. The content identification information identifies the content being output; a representative example is the identification information of a picture in a gallery application. The menu object identification information identifies at least one menu being output while the application is running.

In the example shown in FIG. 14, at the time the application execution object 1401 is created, a specific picture is being browsed through the gallery application. The control unit 180 can store the execution state information of the gallery application at that time in association with the execution object 1401. The execution state information in the gallery application may include the identification information of the photo content being output. Thereafter, when an input selecting the application execution object 1401 is received, the control unit 180 executes the gallery application using the stored execution state information and outputs the screen of the execution state at the time the application execution object 1401 was generated (i.e., outputs the same photo content that was being output).
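The save-and-restore cycle just described can be sketched as follows. This is a minimal illustration under assumed names (ExecutionObject, capture_execution_object, content_id); the patent does not specify a data format for the execution state information.

```python
class ExecutionObject:
    """Execution object stored on the memo layer, carrying the app name
    (so the user can recognize the app) and its saved execution state."""
    def __init__(self, app_name, state):
        self.app_name = app_name
        self.state = dict(state)          # e.g. {"content_id": "IMG_0042"}

class GalleryApp:
    def __init__(self):
        self.current_content = None

    def view(self, content_id):
        self.current_content = content_id

    def capture_execution_object(self):
        # Created when an input leaves the execution screen
        return ExecutionObject("gallery", {"content_id": self.current_content})

    def restore(self, obj):
        # Selecting the execution object re-executes the app in its saved state
        self.view(obj.state["content_id"])

gallery = GalleryApp()
gallery.view("IMG_0042")
obj = gallery.capture_execution_object()

gallery.view("IMG_0099")   # the user browses elsewhere
gallery.restore(obj)       # selecting the object brings back the saved photo
```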

FIG. 15 is a diagram showing another example of storing an execution object of an application in a memo layer according to an embodiment of the present invention. In FIG. 15, a selection frame 1501 for selecting only a part of the application execution screen is provided.

FIG. 15 (a) shows a state in which the memo layer 402 is output together with the execution screen of the gallery application. Upon receiving the input 1000-5 selecting the select icon 1503 while the memo layer 402 is output, the control unit 180 displays a selection frame 1501, an OK button 1504, and a cancel button 1505 (see FIG. 15 (b)). Through the selection frame 1501, the user can adjust the position and size of the area to be selected, and can confirm or cancel the selection through the OK button 1504 and the cancel button 1505. When the OK button 1504 is selected, the control unit 180 can output an area indication frame 1502 on the partial area designated by the selection frame 1501. The area indication frame 1502 is a frame identifying the area selected by the selection frame 1501.

When the memo layer 402 is output as shown in FIG. 15 (c) and an input that leaves the application execution screen is received while the area indication frame 1502 is displayed, the control unit 180 can output the application execution object 1401. In the embodiment shown in FIG. 15, the application execution state information may further include identification information of the selected partial area.
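Recording the selected partial area in the execution state can be sketched with a simple rectangle, assuming an (x, y, w, h) region representation, which the patent does not prescribe.

```python
def clamp_region(region, screen_w, screen_h):
    """Keep a selection frame (x, y, w, h) inside the screen bounds,
    as the user adjusts its position and size."""
    x, y, w, h = region
    x = max(0, min(x, screen_w))
    y = max(0, min(y, screen_h))
    w = min(w, screen_w - x)
    h = min(h, screen_h - y)
    return (x, y, w, h)

def crop(pixels, region):
    """Return only the selected partial area of a 2-D pixel grid, as a
    pop-up window showing just that area would display it."""
    x, y, w, h = region
    return [row[x:x + w] for row in pixels[y:y + h]]
```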

Meanwhile, the control unit 180 in the embodiment described with reference to FIG. 15 can, when an input selecting the application execution object 1401 is received, output the screen such that the selected partial area is identified. This will be described with reference to FIG. 16.

FIG. 16 shows two examples of execution screens of an application that can be output when an application execution object is selected, according to an embodiment of the present invention. Both examples concern selecting an application execution object that was generated after a partial area of the application execution screen had been selected as in FIG. 15.

FIG. 16 (a) is a first example of an execution screen of an application that can be output when the application execution object is selected. When the application execution object is selected, the control unit 180 can output the execution screen of the application, outputting the corresponding photo content together with the area indication frame 1502. The area indication frame 1502 serves to display the partial area selected by the user so that it can be identified.

FIG. 16 (b) is a second example of an execution screen of an application that can be output when the application execution object is selected. When the application execution object is selected, the control unit 180 may output a pop-up window 1601 that outputs only the selected partial area of the photo content. The control unit 180 may further provide, in one area of the pop-up window 1601, an icon 1602 with which the user can remove the pop-up window 1601.

When receiving a predetermined input on the pop-up window 1601, the control unit 180 can output the corresponding photo content using the entire display unit. That is, upon the predetermined input, the control unit 180 outputs the execution screen of the application as in FIG. 16 (a) and outputs the corresponding photo content. Likewise, on the output screen of the photo content, the control unit 180 outputs the area indication frame 1502 together so that the partial area selected from the photo content is identified.

FIG. 17 is a diagram showing another example of storing an execution object of an application in a memo layer according to an embodiment of the present invention.

FIG. 17 (a) shows a screen for viewing an e-mail in an e-mail application, over which a memo layer 402 is superimposed and output (quick memo mode). In the specific situation illustrated in FIG. 17, while viewing the e-mail the user finds that his or her English name is written incorrectly in the e-mail and wants to contact the e-mail sender in order to correct it. Thus, as shown in FIG. 17 (a), the user writes the company name "Red Cap" and a memo 702-8 for correcting the English name on the memo layer 402.

In the state of FIG. 17 (a), when the control unit 180 receives an input that leaves the e-mail application (here, an input for executing a telephone dial input application), the control unit 180 outputs the execution object 1701-1 of the e-mail application on the memo layer 402 in response.

In response to the input for executing the telephone dial input application, the control unit 180 can output the memo layer 402 together with the execution screen of the telephone dial input application, as shown in FIG. 17 (b). In this case, the control unit 180 can output the execution object 1701-1 of the e-mail application together with the memo layer 402, as described above.

The user can enter the telephone number of the company called "Red Cap" and then make an input that leaves the telephone dial input application (an input to switch to the home screen). Accordingly, the control unit 180 can output the execution object 1701-2 of the telephone dial input application on the memo layer 402 in response to the input to switch to the home screen. When the user leaves the telephone dial input application, the control unit 180 can store the execution state information of the telephone dial input application. The execution state information of the telephone dial input application includes the text information of the entered telephone number (02-2001-XXXX). This text information can be used to restore the number-entry state automatically, without additional user input, when the telephone dial input application is executed again.

Upon receiving an input selecting the execution object 1701-2 of the telephone dial input application generated in this way, the control unit 180 executes the telephone dial input application. At this time, based on the text information of the telephone number, the control unit 180 can execute the telephone dial input application with the telephone number already entered on its execution screen. The user can thus make the desired telephone connection using the entered telephone number without re-entering it.
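The dial-application behaviour can be sketched as follows. The class and method names, and the example number, are illustrative assumptions; the elided digits of the number in the figure are not reproduced here.

```python
class DialApp:
    """Sketch of a telephone dial input application whose entered number
    is kept as execution state text information."""

    def __init__(self):
        self.entered = ""

    def key(self, digits):
        self.entered += digits

    def leave(self):
        # Leaving the execution screen stores the state (text information)
        return {"app": "dial", "number_text": self.entered}

    @classmethod
    def from_state(cls, state):
        # Re-execution via the execution object restores the number-entry
        # state, so the user need not re-enter the number
        app = cls()
        app.entered = state["number_text"]
        return app

dial = DialApp()
dial.key("0212345678")        # arbitrary example number
state = dial.leave()
restored = DialApp.from_state(state)
```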

FIG. 18 is a diagram showing another example of storing an execution object of an application in a memo layer according to an embodiment of the present invention.

The procedure for switching from the e-mail application to the telephone dial input application in FIGS. 18 (a) and 18 (b) is the same as in FIG. 17. The difference from the example in FIG. 17 is that partial areas 1801-1 and 1801-2 are selected.

As shown in FIG. 18 (c), when the input 1000-8 selecting the execution object 1701-1 of the e-mail application is received, the control unit 180 can output a pop-up window 1802-1 that outputs only the selected partial area 1801-1 (see FIG. 18 (d)).

Likewise, as shown in FIG. 18 (c), when the input 1000-9 selecting the execution object 1701-2 of the telephone dial input application is received, the control unit 180 can output a pop-up window 1802-2 that outputs only the selected partial area 1801-2 (see FIG. 18 (d)).

When the control unit 180 detects a predetermined input on the pop-up window 1802-2 while the pop-up window 1802-2 is output, the control unit 180 can remove the pop-up window 1802-2 in response to the input and output the execution screen of the telephone dial input application. The predetermined input may be a touch input 1000-9 on one area of the pop-up window 1802-2. As described above, the control unit 180 can output, on the execution screen of the telephone dial input application, the execution state at the time the application was left (the state in which the telephone number was entered).

Hereinafter, a user interface for facilitating entry of a memo will be described.

FIG. 19 is a diagram illustrating an example of a user interface for assisting input of a memo according to an embodiment of the present invention.

FIG. 19 (a) shows a state in which the memo layer 402 is being output after entering the quick memo mode while the home screen is output, and the user is inputting memo contents into the memo layer 402. Since the size of the memo layer 402 is limited, the empty space of the memo layer 402 shrinks as memo contents are input. Accordingly, an embodiment of the present invention proposes a method of securing free space by adjusting the size of previously input memo contents when the empty space available for input becomes small.

If the empty space of the memo layer 402 falls below a first predetermined rate (for example, 20%) as the input of memo contents continues as in FIG. 19 (a), the control unit 180 can output an area limitation warning 1901 on the memo layer 402. This area limitation warning 1901 can inform the user that the empty space of the memo layer 402 is insufficient, and can also indicate that the automatic adjustment function of the memo contents according to an embodiment of the present invention will be executed. If the input of memo contents continues after the area limitation warning 1901 is output and the empty space of the memo layer 402 falls below a second predetermined rate (for example, 10%), the control unit 180 automatically adjusts the size of the memo contents input so far. For example, when the empty space of the memo layer 402 falls below the second predetermined rate (for example, 10%), the controller 180 may reduce the size of the input memo contents to 70% (see FIG. 19 (b)). When memo contents continue to be input after the adjustment, the controller 180 may repeat the adjustment process, as shown in FIG. 19 (c).
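The two-threshold policy above (warn below 20% free space, shrink to 70% below 10% free space) can be sketched as a single decision step. The function name and the used-ratio representation are assumptions for the example.

```python
def memo_layer_step(used_ratio, scale, warn=0.80, shrink=0.90, factor=0.70):
    """One policy step for the memo layer.

    used_ratio: fraction of the layer already covered by memo contents
    (free space below 20% means used_ratio >= 0.80, below 10% means >= 0.90).
    Returns (action, new_used_ratio, new_scale)."""
    if used_ratio >= shrink:
        # Free space below the second rate: scale existing contents to 70%,
        # which frees space proportionally
        return "shrink", used_ratio * factor, scale * factor
    if used_ratio >= warn:
        # Free space below the first rate: output the area limitation warning
        return "warn", used_ratio, scale
    return "ok", used_ratio, scale
```

Repeating the step as input continues reproduces the repeated adjustment of FIG. 19 (c).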

Meanwhile, in the case of a web browser application, the execution state of the application at the time the memo contents were stored may differ later, because the content corresponding to a specific web address changes depending on what is stored on the external server. Accordingly, in an embodiment of the present invention, it is proposed that, for the web browser application, information about the web page itself be stored in addition when the execution state information is stored. A specific example will be described with reference to FIG. 20.

FIG. 20 illustrates an example of a method for saving an execution state of an application when memo contents are stored in a web browser application according to an embodiment of the present invention.

FIG. 20 (a) shows an execution state diagram of the memo management manager described with reference to FIG. 9. The memo screen shot being output includes the output screen 2001 of the web browser application and the memo contents 702-10. As described above with reference to FIG. 9, when the predetermined input 1000-10 on the area of the memo screen shot is sensed, the control unit 180 can output the execution screen of the corresponding application together with the memo contents 702-10.

However, in this case, the execution screen of the application can change as the content stored on the external server changes. Accordingly, in an embodiment of the present invention, the execution state information of the application further includes web page information, and the control unit 180 may output the web page using the stored web page information when executing the application. In this case, the control unit 180 may further output a notification 2002 indicating that the stored page is being used.
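Storing the page itself alongside the URL can be sketched as follows. The field names and the content-digest comparison used to decide whether to show the "stored page" notification are assumptions made for this example.

```python
import hashlib

def save_state(url, page_html):
    """Execution state for a web browser application: the URL plus a
    snapshot of the page content, since the server-side content may change."""
    return {"url": url,
            "snapshot": page_html,
            "digest": hashlib.sha256(page_html.encode()).hexdigest()}

def reopen(state, fetch_live):
    """Render from the stored snapshot; if the live page differs, also
    return a notification that the stored page is being used."""
    live = fetch_live(state["url"])
    changed = hashlib.sha256(live.encode()).hexdigest() != state["digest"]
    notice = "stored page is used" if changed else None
    return state["snapshot"], notice
```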

Hereinafter, a method of easily managing a plurality of memo layers according to an embodiment of the present invention will be described.

FIG. 21 is a diagram illustrating an example of a method of easily managing a plurality of memo layers according to an embodiment of the present invention.

FIG. 21 (a) shows a memo layer icon 2101-1 into which the memo layer 402 is converted when set to a certain size or smaller. The control unit 180 changes the memo layer 402 into the memo layer icon 2101-1 when the memo layer 402 becomes smaller than a predetermined size through a resize input sensed on the memo layer 402. The resize input may be a pinch-in input, in which touches are sensed at two points on the memo layer 402 and the distance between the two points is decreased while the touches are maintained. The memo layer icon 2101-1 can output the memo contents included in the memo layer 402 in proportion to its size.
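The pinch-to-icon conversion can be sketched with a scale-by-pinch-ratio rule. The threshold and minimum size are illustrative assumptions; the patent only says "smaller than a predetermined size".

```python
ICON_THRESHOLD = 120  # px; below this the layer becomes an icon (assumed)

def apply_pinch(layer_size, start_dist, end_dist, min_size=40):
    """Scale the memo layer by the pinch ratio (distance between the two
    touch points); report whether it converts to a memo layer icon."""
    ratio = end_dist / start_dist
    new_size = max(min_size, int(layer_size * ratio))
    return new_size, new_size < ICON_THRESHOLD
```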

In the state of the memo layer icon 2101-1, the user cannot input a memo but can easily move, edit, or copy the memo layer icon 2101-1. As shown in FIGS. 21 (b) and (c), the controller 180 can arrange a plurality of memo layer icons 2101-1 to 2101-4 on one screen of the home screen.

When a predetermined input is detected on the memo layer icons 2101-1 to 2101-4, the controller 180 may perform a copy or paste operation on the memo layer icons 2101-1 to 2101-4.

FIG. 22 is a diagram illustrating an example of a method of controlling a memo layer icon according to an embodiment of the present invention.

In FIG. 22, the memo layer icon 2101-5 is output. When the control unit 180 detects a predetermined input on the memo layer icon 2101-5, the control unit 180 can perform a revert function or an input repetition function on the editing operations of the memo layer 402, in response to the input. The revert function cancels the most recent editing operation; the input repetition function restores an operation canceled by the revert function to its original state.

In this case, the predetermined input may include an input that rotates clockwise by a predetermined angle or more, or an input that rotates counterclockwise by a predetermined angle or more, while touching the memo layer icon 2101-5.
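Mapping the rotation gesture onto an undo/redo history can be sketched as follows. The angle threshold and the direction convention (counterclockwise reverts, clockwise repeats) are assumptions; the patent specifies only "a predetermined angle or more" in either direction.

```python
ANGLE_THRESHOLD = 45  # degrees of rotation needed to trigger (assumed)

class EditHistory:
    """Edit history for a memo layer, driven by rotation gestures."""

    def __init__(self):
        self.done, self.undone = [], []

    def edit(self, op):
        self.done.append(op)
        self.undone.clear()

    def on_rotate(self, angle):
        """Negative angle = counterclockwise (revert function);
        positive angle = clockwise (input repetition function)."""
        if angle <= -ANGLE_THRESHOLD and self.done:
            self.undone.append(self.done.pop())     # cancel latest edit
        elif angle >= ANGLE_THRESHOLD and self.undone:
            self.done.append(self.undone.pop())     # restore canceled edit
```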

FIG. 23 is a diagram illustrating another example of a method of controlling a memo layer icon according to an embodiment of the present invention.

The embodiment of FIG. 23 proposes that, when the setting of the memo layer icon is changed by a predetermined input, the user's memo input be recognized and a predetermined function of the mobile terminal be executed.

FIG. 23 illustrates an example in which the memo layer icon is changed into the command recognition icon 2301 when the length of the memo layer icon is increased by a predetermined length or more, according to an embodiment of the present invention.

The control unit 180 performs a text recognition function on the user's memo input 702-11 on the command recognition icon 2301 and can execute a predetermined function corresponding to the recognized text. In the example shown in FIG. 23, text for executing the camera is input, and when the control unit 180 recognizes the text, the camera function of the mobile terminal 100 can be executed.
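The recognized-text-to-function step can be sketched as a lookup over a command table. The table contents and function names are illustrative assumptions, not commands defined in the patent.

```python
# Hypothetical command table mapping recognized words to terminal functions
COMMANDS = {
    "camera": "launch_camera",
    "call": "open_dialer",
}

def run_command(recognized_text):
    """Return the terminal function for the first known command word in
    the recognized memo text, or None if no command is recognized."""
    for word in recognized_text.lower().split():
        if word in COMMANDS:
            return COMMANDS[word]
    return None
```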

It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

The present invention described above can be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the control unit 180 of the terminal.

Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
120: A / V input / output unit 130: user input unit
140: sensing unit 150: output unit
160: memory 170: interface section
180: control unit 190: power supply unit

Claims (20)

A display unit;
A user input section; And
A control unit configured to superimpose a memo layer on an execution screen of an application,
And to output an execution object of the application on the memo layer upon receiving an input that leaves the execution screen of the application in a state in which the memo layer is output,
Mobile terminal.
The apparatus of claim 1,
And storing the execution object of the application in association with the memo layer upon receiving the storage input of the memo layer,
Mobile terminal.
3. The apparatus of claim 2,
And outputting the stored memo layer and outputting an execution object of the stored application in correspondence with the memo layer when receiving the memo layer call input,
Mobile terminal.
4. The apparatus according to claim 1 or 3,
Wherein the control unit executes the application upon receiving an input for selecting the execution object of the output application,
Mobile terminal.
The apparatus of claim 1,
Wherein the control unit stores execution state information of the application upon receiving an input that leaves the execution screen of the application in a state in which the memo layer is superimposed on the execution screen of the application,
Mobile terminal.
6. The apparatus of claim 5,
Wherein the control unit, upon receiving an input for selecting the execution object of the output application, executes the application in the execution state of the application at the time the input that left the execution screen of the application was received, based on the stored execution state information of the application,
Mobile terminal.
The method according to claim 6,
The application is a photo gallery application,
Wherein the execution state information is identification information of the photo content being viewed at the time of receiving an input that is out of the execution screen of the application,
Wherein the control unit is configured to output the photo content corresponding to the identification information of the photo content on the photo gallery application execution screen upon receiving the input for selecting the execution object of the output application,
Mobile terminal.
The method according to claim 6,
Wherein the application is a call sending application,
Wherein the execution state information is telephone number information input at the time of receiving an input from an execution screen of the application,
Wherein the control unit executes the call application in a state in which the stored phone number information is input upon receiving an input for selecting an execution object of the output application,
Mobile terminal.
The apparatus of claim 5,
Wherein the execution state information includes at least one of function information that the application is executing, text information output by the application, content identification information, and menu object identification information,
Mobile terminal.
The apparatus of claim 1,
Wherein the control unit further receives an input for selecting a partial area of the execution screen of the application,
And outputs a capture screen of the selected partial area upon receiving an input for selecting the execution object of the output application,
Mobile terminal.
Outputting a memo layer over the execution screen of the application through the display unit; And
And outputting an execution object of the application on the memo layer when receiving an input that leaves the execution screen of the application in a state in which the memo layer is output,
A method of controlling a mobile terminal.
12. The method of claim 11,
Further comprising storing the execution object of the application in correspondence with the memo layer upon receiving the storage input of the memo layer.
A method of controlling a mobile terminal.
13. The method of claim 12,
Further comprising the step of outputting the stored memo layer and outputting the execution object of the stored application corresponding to the stored memo layer upon receiving the call input of the stored memo layer,
A method of controlling a mobile terminal.
14. The method according to claim 11 or 13,
Further comprising executing the application upon receiving an input for selecting an execution object of the output application,
A method of controlling a mobile terminal.
12. The method of claim 11,
Further comprising the step of storing execution state information of the application upon receipt of an input from an execution screen of the application in a state in which the memo layer is superimposed on the execution screen of the application,
A method of controlling a mobile terminal.
16. The method of claim 15,
Further comprising executing the application, upon receiving an input for selecting the execution object of the output application, in the execution state of the application at the time the input that left the execution screen of the application was received, based on the stored execution state information of the application,
A method of controlling a mobile terminal.
17. The method of claim 16,
The application is a photo gallery application,
Wherein the execution state information is identification information of the photo content being viewed at the time of receiving an input that is out of the execution screen of the application,
The step of executing the application comprises:
And outputting photo content corresponding to identification information of the photo content on the photo gallery application execution screen upon receiving an input for selecting the execution object of the output application,
A method of controlling a mobile terminal.
18. The method of claim 16,
Wherein the application is an outgoing-call application,
Wherein the execution state information is telephone number information entered when the input leaving the execution screen of the application is received,
The step of executing the application comprises:
Executing the call application with the stored telephone number information already entered upon receiving an input for selecting the output execution object of the application,
A method of controlling a mobile terminal.
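Claims 15 to 18 describe snapshotting an application's execution state when an input leaves its screen, then relaunching the application in that state later. A minimal sketch of that save/restore cycle, using the two examples the claims give (a gallery restoring the viewed photo, a dialer restoring the typed digits), could look like the following; the `App`, `leave_screen`, and `relaunch` names are assumptions for illustration only.

```python
# Sketch of claims 15-18: when an input leaves the application's execution
# screen (with the memo layer overlaid), the terminal stores execution state
# information; selecting the saved execution object later relaunches the
# application in that stored state. All names are illustrative assumptions.

class App:
    def __init__(self, name):
        self.name = name
        self.state = {}

    def leave_screen(self):
        """Claim 15: snapshot the execution state on leaving the screen."""
        return {"app": self.name, "state": dict(self.state)}


def relaunch(snapshot):
    """Claim 16: execute the app in the stored execution state."""
    app = App(snapshot["app"])
    app.state = dict(snapshot["state"])
    return app


# Claim 17: the photo gallery stores the id of the photo being viewed.
gallery = App("gallery")
gallery.state["photo_id"] = "IMG_0042"
restored = relaunch(gallery.leave_screen())   # reopens on the same photo

# Claim 18: the call app stores the partially entered phone number.
dialer = App("dialer")
dialer.state["number"] = "010-1234"
restored2 = relaunch(dialer.leave_screen())   # reopens with digits pre-filled
```

Note that the snapshot copies the state dictionary, so later edits to the running app do not mutate the saved execution state.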
19. The method of claim 15,
Wherein the execution state information includes at least one of: information on a function being executed by the application, text content information output by the application, content identification information, and menu object identification information,
A method of controlling a mobile terminal.
20. The method of claim 11,
Further comprising receiving an input for selecting a partial area of the execution screen of the application,
And outputting a capture of the selected partial area upon receiving an input for selecting the output execution object of the application,
A method of controlling a mobile terminal.
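The last claim above captures only a user-selected partial area of the execution screen rather than the whole screen. Modeling the screen as a row-major pixel grid, the crop can be sketched as below; the function name and the `(left, top, right, bottom)` region format are illustrative assumptions, not part of the claims.

```python
# Sketch of claim 20: the user selects a partial area of the application's
# execution screen, and selecting the saved execution object later outputs
# a capture of just that area. The screen is modeled as a 2D pixel grid;
# the region is an assumed (left, top, right, bottom) tuple.

def capture_region(screen, region):
    """Crop the (left, top, right, bottom) box from a row-major pixel grid."""
    left, top, right, bottom = region
    return [row[left:right] for row in screen[top:bottom]]


screen = [[(x, y) for x in range(8)] for y in range(6)]  # 8x6 "pixels"
crop = capture_region(screen, (2, 1, 5, 4))              # 3x3 selected area
```

The slice-based crop keeps the capture independent of the original screen buffer, matching the claim's idea of outputting the captured area as its own content.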
KR1020130049342A 2013-05-02 2013-05-02 Mobile terminal and method for controlling the same KR20140130854A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130049342A KR20140130854A (en) 2013-05-02 2013-05-02 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130049342A KR20140130854A (en) 2013-05-02 2013-05-02 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20140130854A true KR20140130854A (en) 2014-11-12

Family

ID=52452486

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130049342A KR20140130854A (en) 2013-05-02 2013-05-02 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20140130854A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020180167A1 (en) * 2019-03-07 2020-09-10 삼성전자 주식회사 Electronic device and method for controlling application thereof

Similar Documents

Publication Publication Date Title
KR101860341B1 (en) Mobile terminal and control method for the same
KR101568351B1 (en) Mobile Terminal With Touch Screen And Method Of Processing Data Using Same
KR101609162B1 (en) Mobile Terminal With Touch Screen And Method Of Processing Data Using Same
KR101592296B1 (en) Mobile terminal and method for selection and activation object thereof
KR102049855B1 (en) Mobile terminal and controlling method thereof
KR101788046B1 (en) Mobile terminal and method for controlling the same
KR101830653B1 (en) Mobile device and control method for the same
KR101853057B1 (en) Mobile Terminal And Method Of Controlling The Same
KR101701839B1 (en) Mobile terminal and method for controlling the same
US20110122077A1 (en) Method for displaying data in mobile terminal having touch screen and mobile terminal thereof
KR101871718B1 (en) Mobile terminal and method for controlling the same
KR20140145894A (en) Mobile terminal and control method for the mobile terminal
KR20140134864A (en) Mobile terminal and method for controlling thereof
KR101592298B1 (en) Mobile terminal and user interface of mobile terminal
KR101692729B1 (en) Mobile terminal, and method for producing and obtaining message about outside object
KR20150009018A (en) Mobile terminal and control method for the mobile terminal
KR101980702B1 (en) Mobile terminal and method for controlling thereof
KR20110064289A (en) Method for transmitting and receiving data and mobile terminal thereof
KR20120001516A (en) Method for editing image contents in mobile terminal and mobile terminal using the same
KR101520686B1 (en) Mobile communication terminal and method of controlling display information using the same
KR101709504B1 (en) Mobile Terminal and Method for managing list thereof
KR20140130854A (en) Mobile terminal and method for controlling the same
KR101649637B1 (en) Mobile terminal and method for controlling the same
KR101840198B1 (en) Mobile terminal and controlling method thereof, and recording medium thereof
KR101695813B1 (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination