KR20140133130A - Mobile terminal and control method thereof - Google Patents


Info

Publication number
KR20140133130A
Authority
KR
South Korea
Prior art keywords
contents
clipboard
category
mobile terminal
categories
Prior art date
Application number
KR20130052769A
Other languages
Korean (ko)
Other versions
KR102026639B1 (en)
Inventor
이서연
이주우
김용
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020130052769A
Publication of KR20140133130A
Application granted
Publication of KR102026639B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2201/00 Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/42 Graphical user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a mobile terminal that outputs a clipboard, and a control method thereof. The mobile terminal includes a display unit that displays, in at least a part of its area, a clipboard divided into a plurality of sections; a memory that stores the contents included in the clipboard; and a controller that sorts the contents according to a setting criterion and outputs them to the divided sections in the sorted order. The setting criterion is set differently according to the user's selection, so the contents are output in a different sort order depending on the setting criterion.

Description

MOBILE TERMINAL AND CONTROL METHOD THEREOF

The present invention relates to a mobile terminal, and more particularly, to a mobile terminal for outputting a clipboard and a control method thereof.

Terminals can be divided into mobile/portable terminals and stationary terminals depending on their mobility. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

As the functions of such terminals are diversified, they are implemented in the form of multimedia players with complex functions such as capturing photos and videos, playing music and video files, gaming, and receiving broadcasts. Further, to support and enhance these functions, improvements to both the structural and software parts of the terminal may be considered.

With these improvements, terminals have evolved to perform copy-and-paste or cut-and-paste functions. A terminal can temporarily store copied or cut data on a clipboard and output that clipboard. The clipboard can hold a plurality of contents, and the user can conveniently use the paste function through it. However, the contents included in the clipboard are inevitably output in the order in which they were stored.
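As a concrete sketch of the conventional behaviour the invention improves on (all class and method names here are illustrative, not from the patent): a plain clipboard simply appends each copied or cut item, so its contents can only ever be listed in storage order.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a conventional clipboard: copied or cut data is
// appended as it arrives, so contents can only be output in the order
// they were stored -- there is no user-selectable sort criterion.
public class SimpleClipboard {
    private final List<String> entries = new ArrayList<>();

    public void copy(String content) {
        entries.add(content);          // copied content is appended
    }

    public String cut(String content) {
        entries.add(content);          // cut content is stored the same way
        return content;
    }

    // Storage order is the only order available.
    public List<String> contents() {
        return new ArrayList<>(entries);
    }
}
```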

One embodiment of the present invention provides a mobile terminal capable of changing the sort order of the contents included in a clipboard when outputting the clipboard, and a control method thereof.

In addition, an embodiment of the present invention provides a mobile terminal and a control method thereof that can edit at least a part of the contents included in the clipboard directly on the clipboard.

In addition, an embodiment of the present invention provides a mobile terminal capable of providing an execution history when the paste function is executed for content included in the clipboard, and a control method thereof.

A mobile terminal according to an embodiment of the present invention includes a display unit that displays, in at least a part of its area, a clipboard divided into a plurality of sections; a memory that stores the contents included in the clipboard; and a controller that sorts the contents according to a setting criterion and outputs them to the sections in the sorted order. The setting criterion is set differently according to the user's selection, and the contents are output in a different sort order depending on the setting criterion.

In one embodiment, the controller classifies the contents into a plurality of categories according to the types of data they contain, and uses at least one of the categories as the setting criterion.
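A hypothetical sketch of this classify-then-sort step (category names and the classification rules are illustrative, not from the patent): each content is tagged with a category derived from its data type, and a stable sort puts the user-selected category first while preserving the stored order within each group.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative sketch: classify contents by data type, then sort so the
// user-selected category (the setting criterion) is output first.
public class CategorySort {
    enum Category { TEXT, IMAGE, URL }

    record Content(String value, Category category) {}

    // Toy classification rules -- a real terminal would inspect MIME types.
    static Category classify(String value) {
        if (value.startsWith("http://") || value.startsWith("https://")) return Category.URL;
        if (value.endsWith(".png") || value.endsWith(".jpg")) return Category.IMAGE;
        return Category.TEXT;
    }

    // Stable sort: preferred category first, stored order kept within groups.
    static List<Content> sortByCriterion(List<Content> contents, Category preferred) {
        List<Content> sorted = new ArrayList<>(contents);
        sorted.sort(Comparator.comparingInt((Content c) -> c.category() == preferred ? 0 : 1));
        return sorted;
    }
}
```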

In one embodiment, the display unit outputs a cursor at a position where at least one of the contents is to be pasted, and the at least one category is a category, among the categories, that includes data that can be pasted into the window in which the cursor is located.
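One way to picture this (the window kinds and acceptance sets are assumptions for illustration): the window holding the cursor advertises which kinds of data it accepts, e.g. a text field takes text and URLs but not images, and contents the window cannot accept are filtered out of the prioritized view.

```java
import java.util.List;
import java.util.Set;

// Sketch: keep only the clipboard contents that can actually be pasted
// into the window where the cursor sits. Kinds are illustrative.
public class PasteTarget {
    enum Kind { TEXT, IMAGE, URL }

    record Item(String value, Kind kind) {}

    static List<Item> pasteable(List<Item> items, Set<Kind> acceptedByWindow) {
        return items.stream().filter(i -> acceptedByWindow.contains(i.kind())).toList();
    }
}
```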

In one embodiment, the display unit outputs, together with the clipboard, a category menu configured to select at least one of the categories, and the at least one category is the category selected by the user from the menu.

In one embodiment, the controller outputs the contents grouped by category.

In one embodiment, the apparatus further includes a user input unit that receives an editing command for any one of the contents. In response to the editing command, the controller executes an editing mode for that content and selects at least a part of it.

In one embodiment, when the selected portion is dragged to another one of the contents, the controller adds the selected portion to that other content.
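The drag-to-merge behaviour above can be sketched roughly as follows (indices, the character-range selection, and simple string concatenation are all illustrative assumptions):

```java
import java.util.List;

// Sketch of drag-to-merge: a portion selected in one clipboard entry is
// appended to the entry it is dropped on.
public class MergeSelection {
    // from: index of the entry the selection came from
    // [start, end): character range of the selection
    // to:   index of the entry the selection is dropped on
    static void merge(List<String> contents, int from, int start, int end, int to) {
        String selected = contents.get(from).substring(start, end);
        contents.set(to, contents.get(to) + selected);
    }
}
```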

In one embodiment, the controller edits the selected portion based on a user input on the clipboard, or pastes the selected portion at the position where the cursor is output.

In one embodiment, when any one of the contents includes an image, the controller outputs filtered images to which effect filters for the image have been applied.

In one embodiment, the controller replaces the image with the filtered image selected from among the filtered images.
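A minimal sketch of this generate-candidates-then-replace flow (the filter names and the `Image` record are illustrative, not the patent's data model):

```java
import java.util.List;

// Sketch: for an image entry, generate candidate versions with named
// effect filters; the version the user selects replaces the original
// in place on the clipboard.
public class ImageFilters {
    record Image(String name, String filter) {}

    static List<Image> candidates(Image img, List<String> filters) {
        return filters.stream().map(f -> new Image(img.name(), f)).toList();
    }

    static void replace(List<Image> clipboard, int index, Image choice) {
        clipboard.set(index, choice);
    }
}
```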

In one embodiment, upon receiving a detailed-view command for any one of the contents, the controller outputs the execution history of the paste function for that content.

In one embodiment, the execution history includes at least one of the date on which the paste function was executed, information about the application, and a shortcut link.
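The history record described above can be sketched like this (field names and the per-content map are assumptions; the patent only specifies that date, application information, and a shortcut link are kept):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the paste execution history: every paste appends a record
// holding the date, the target application, and a shortcut link; the
// detailed-view command would output this list for one content.
public class PasteHistory {
    record Record(String date, String app, String shortcutLink) {}

    private final Map<String, List<Record>> records = new HashMap<>();

    void recordPaste(String content, Record rec) {
        records.computeIfAbsent(content, k -> new ArrayList<>()).add(rec);
    }

    List<Record> historyOf(String content) {
        return records.getOrDefault(content, List.of());
    }
}
```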

In one embodiment, the control unit executes the application when a touch input to the execution history is sensed.

In one embodiment, the controller changes the layout of the sections based on a drag applied to at least one of the sections. The controller may also change the layout according to the number of contents; in this case, the relative size ratio of the sections can be maintained.
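The ratio-preserving rescale can be illustrated with a small numeric sketch (a one-dimensional row of section widths is an assumed simplification of the actual layout):

```java
import java.util.List;

// Sketch: when the clipboard area is resized, each section keeps its
// relative share of the total extent (here, widths in pixels).
public class SectionLayout {
    static List<Integer> rescale(List<Integer> widths, int newTotal) {
        int total = widths.stream().mapToInt(Integer::intValue).sum();
        // Each width is scaled by newTotal/total, so the ratios are kept.
        return widths.stream().map(w -> w * newTotal / total).toList();
    }
}
```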

In addition, another embodiment of the present invention relates to a method of controlling a mobile terminal. The method includes receiving an output command for a clipboard divided into a plurality of sections; sorting the contents included in the clipboard according to a setting criterion in response to the output command; and outputting the contents to the sections in the sorted order. The setting criterion is set differently according to the user's selection, and the contents are output in a different sort order depending on the setting criterion.

In one embodiment, sorting the contents according to a setting criterion includes classifying the contents into a plurality of categories according to the types of data they contain, and arranging the contents using at least one of the categories as the setting criterion.

In one embodiment, the at least one category is a category, among the categories, that includes data that can be pasted into the window in which the cursor is located.

In one embodiment, the method further includes outputting a category menu configured to select at least one of the categories, and the at least one category is the category selected from the menu.

In one embodiment, the method further includes receiving an editing command for one of the contents, and editing at least a part of that content on the clipboard based on the editing command.

In one embodiment, the method further includes receiving a detailed-view command for any one of the contents and outputting the execution history of the paste function for that content, the execution history including at least one of the date on which the paste function was executed, information about the application, and a shortcut link.

In one embodiment, the method further comprises executing the application when a touch input to the execution history is sensed.

According to the present invention, in outputting the clipboard, the contents included in the clipboard are sorted according to a setting criterion that depends on the user's selection, and are output in the sorted order. In particular, since the contents corresponding to the category selected by the user are output first, the user can find the content to be used on the clipboard more quickly.

Further, according to the present invention, since at least a part of the contents included in the clipboard can be edited, the user can use the clipboard dynamically. Because contents can be merged on the clipboard, or at least partially modified or deleted, there is no need to paste content and then re-edit it.

In addition, since an execution history is provided for each content included in the clipboard, the user can see into which applications the copied or cut content stored on the clipboard has been pasted.

Thus, the convenience of the user using the clipboard can be increased.

FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention;
FIGS. 2A and 2B are perspective views showing the appearance of a mobile terminal according to the present invention;
FIGS. 3 and 4 are views for explaining a method of outputting a clipboard in a mobile terminal according to an embodiment of the present invention;
FIG. 5 is a view for explaining a method of controlling the clipboard in a mobile terminal according to an embodiment of the present invention;
FIG. 6 is a view for explaining the paste function for contents included in a clipboard in a mobile terminal according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention;
FIGS. 8, 9 and 10 are conceptual diagrams for explaining a mobile terminal according to the control method of FIG. 7;
FIG. 11 is a view for explaining a method of changing the layout of the clipboard in a mobile terminal according to an embodiment of the present invention;
FIGS. 12A, 12B, 12C and 13 are views for explaining a method of editing at least a part of the contents output on a clipboard in a mobile terminal according to an embodiment of the present invention;
FIG. 14 is a view for explaining a method of outputting the paste execution history of contents included in a clipboard in a mobile terminal according to an embodiment of the present invention;
FIG. 15 is a view for explaining a method of outputting a clipboard by the copy or cut function in a mobile terminal according to an embodiment of the present invention;
FIG. 16 is a view for explaining a method of outputting the contents included in a clipboard in a mobile terminal according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings; like reference numerals designate identical or similar elements, and redundant descriptions are omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments, detailed descriptions of related known technology are omitted when they could obscure the gist of the embodiments disclosed herein. The attached drawings are intended only to aid understanding of the embodiments disclosed in this specification and should not be construed as limiting the technical idea disclosed herein.

The mobile terminals described in this specification include mobile phones, smartphones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, and ultrabooks. However, those skilled in the art will readily appreciate that the configurations according to the embodiments described herein can also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to a mobile terminal.

FIG. 1 is a block diagram illustrating a mobile terminal according to one embodiment disclosed herein.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a mobile terminal having more or fewer components may be implemented instead.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, as an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The mobile communication module 112 is configured to implement a video call mode and a voice call mode. The video call mode refers to a call made while viewing the other party's video, and the voice call mode refers to a call made without viewing it. To implement the video and voice call modes, the mobile communication module 112 is configured to transmit and receive at least one of voice and images.

The wireless Internet module 113 is a module for wireless Internet access and may be built into or externally attached to the mobile terminal 100. Wireless Internet technologies such as WLAN (Wireless LAN), WiFi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) can be used.

The short-range communication module 114 is a module for short-range communication. Short-range communication technologies such as Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Near Field Communication (NFC) can be used.

The location information module 115 is a module for obtaining the position of the mobile terminal; representative examples include the Global Positioning System (GPS) module and the Wireless Fidelity (WiFi) module.

Referring to FIG. 1, the A/V (audio/video) input unit 120 is for inputting an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still or moving images, obtained by its image sensor in the video call mode or the shooting mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to an external device through the wireless communication unit 110. The user's position information and the like can also be calculated from the image frames obtained by the camera 121. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise-removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data according to control commands applied by the user to control the operation of the mobile terminal 100. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as its open/closed state, position, user contact, orientation, and acceleration/deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is opened or closed. The sensing unit 140 may also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like.

The output unit 150 generates output related to the senses of sight, hearing, and touch, and may include a display unit 151, an audio output module 153, an alarm unit 154, a haptic module 155, and the like.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, it displays a user interface (UI) or graphic user interface (GUI) associated with the call. When the mobile terminal 100 is in the video call mode or the shooting mode, the display unit 151 displays the captured and/or received images, or the UI and GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

Some of these displays may be made transparent or light-transmissive so that the outside is visible through them. These can be called transparent displays, and a typical example is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be light-transmissive. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

Two or more display units 151 may exist depending on the implementation of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display units may be arranged on one surface, spaced apart or formed integrally, or may be arranged on different surfaces.

Also, the display unit 151 may be configured as a stereoscopic display unit 152 that displays stereoscopic images.

Here, a stereoscopic image means a three-dimensional (3D) stereoscopic image, which is an image that makes the viewer feel the progressive depth and realism of an object on a monitor or screen as if it were in real space. 3D stereoscopic images are implemented using binocular disparity, the disparity caused by the separated positions of the two eyes. When the two eyes see different 2D images and those images are transmitted to the brain through the retinas and fused, the viewer can feel the depth and realism of the stereoscopic image.

The stereoscopic display unit 152 may employ a three-dimensional display scheme such as a stereoscopic scheme (glasses type), an autostereoscopic scheme (glasses-free type), or a projection scheme (holographic type). Stereoscopic schemes widely used in home television receivers include the Wheatstone stereoscopic scheme.

Examples of the autostereoscopic method include a parallax barrier method, a lenticular method, an integral imaging method, and a switchable lens method. The projection method includes a reflection type holographic method and a transmission type holographic method.

In general, a 3D stereoscopic image consists of a left image (for the left eye) and a right image (for the right eye). Depending on how the left and right images are combined into a 3D stereoscopic image, the methods include: a top-down method in which the left and right images are arranged vertically in one frame; a left-to-right (side-by-side) method in which they are arranged horizontally; a checkerboard method in which pieces of the left and right images are arranged in tile form; an interlaced method in which the left and right images are alternately arranged by columns or rows; and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed over time.

In addition, a 3D thumbnail image can be generated by creating a left-image thumbnail and a right-image thumbnail from the left and right images of the original image frame, respectively, and combining them. In general, a thumbnail means a reduced image or a reduced still image. The left-image and right-image thumbnails thus generated are displayed with a horizontal offset corresponding to the disparity between the left and right images, producing a sense of stereoscopic depth.

The left and right images necessary for realizing the three-dimensional stereoscopic image can be displayed on the stereoscopic display unit 152 by a stereoscopic processing unit (not shown). The stereoscopic processing unit receives a 3D image and extracts a left image and a right image from the 3D image, or receives a 2D image and converts it into a left image and a right image.

Meanwhile, when the display unit 151 and a sensor that senses a touch operation (hereinafter, a 'touch sensor') form a mutual layer structure (hereinafter, a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151, or a change in capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor can be configured to detect not only the position and area touched by a touch object but also the pressure at the time of the touch. Here, the touch object is an object that applies a touch to the touch sensor, such as a finger, a touch pen, a stylus pen, or a pointer.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. In this way, the controller 180 can determine which area of the display unit 151 has been touched.
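As a rough sketch of what the controller can do with the touch controller's data (the geometry, a single row of clipboard sections, is an illustrative assumption): it maps the reported coordinate to the display region, here a clipboard section, under it.

```java
import java.util.List;

// Sketch: map a touch x coordinate, reported by the touch controller,
// to the clipboard section under it (sections laid out left to right).
public class TouchRegion {
    record Section(int left, int right) {}

    static int touchedSection(List<Section> sections, int x) {
        for (int i = 0; i < sections.size(); i++) {
            Section s = sections.get(i);
            if (x >= s.left() && x < s.right()) return i;
        }
        return -1;  // touch landed outside every section
    }
}
```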

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor 141 may be provided as an example of the sensing unit 140. The proximity sensor 141 is a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing nearby, using electromagnetic force or infrared rays. The proximity sensor 141 has a longer lifespan and higher utility than a contact sensor.

Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of a conductive object (hereinafter, a pointer) through the change in the electric field caused by its approach. In this case, the touch screen (touch sensor) may itself be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contact so that it is recognized as positioned over the touch screen is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position of a proximity touch of the pointer on the touch screen is the position where the pointer is perpendicular to the touch screen during the proximity touch.

The proximity sensor 141 detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

In the case where the stereoscopic display unit 152 and a touch sensor have a mutual layer structure (hereinafter referred to as a "stereoscopic touch screen"), or where the stereoscopic display unit 152 is combined with a three-dimensional sensor that detects a touch operation, the stereoscopic display unit 152 may also be used as a three-dimensional input device.

The sensing unit 140 may include a proximity sensor 141, a three-dimensional touch sensing unit 142, an ultrasonic sensing unit 143, and a camera sensing unit 144 as an example of the three-dimensional sensor.

The proximity sensor 141 measures, without mechanical contact and using the force of an electromagnetic field or infrared rays, the distance between the detection surface and the sensing object applying the touch (for example, a user's finger or a stylus pen). The terminal recognizes which part of the stereoscopic image has been touched using this distance. In particular, when the touch screen is of the capacitive type, the proximity of the sensing object is detected by a change of the electric field according to its proximity, and a touch on the three-dimensional image is recognized using that degree of proximity.

The stereoscopic touch sensing unit 142 senses the strength or duration of a touch applied to the touch screen. For example, the stereoscopic touch sensing unit 142 senses the pressure applying a touch; when the pressing force is strong, the touch is recognized as a touch on an object located farther from the touch screen, toward the inside of the terminal.

The ultrasonic sensing unit 143 is configured to recognize the position information of the sensing target using ultrasonic waves.

The ultrasound sensing unit 143 may include, for example, an optical sensor and a plurality of ultrasonic sensors. The optical sensor is configured to sense light, and the ultrasonic sensors are configured to sense ultrasonic waves. Since light is much faster than ultrasonic waves, the time it takes for light to reach the optical sensor is much shorter than the time it takes for the ultrasonic waves to reach the ultrasonic sensors. Therefore, the position of the wave-generating source can be calculated using the time difference between the arrival of the ultrasonic waves and the arrival of the light, which serves as a reference signal.
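As an illustrative, non-limiting sketch (not part of the disclosed embodiment), the time-difference relationship above can be expressed in a few lines; the speed-of-sound constant and single-sensor geometry are assumptions for illustration only.

```python
# Illustrative only: light arrives almost instantly, so the gap between the
# optical detection and the ultrasonic detection approximates the
# ultrasound's travel time from the wave source to the sensor.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def distance_from_time_gap(t_light: float, t_ultrasound: float) -> float:
    """Distance (m) from the wave source to a sensor, given arrival times (s)."""
    travel_time = t_ultrasound - t_light  # ~ ultrasound travel time
    return SPEED_OF_SOUND * travel_time

# A source 0.5 m away: the ultrasound lags the light by about 0.5 / 343 s.
d = distance_from_time_gap(0.0, 0.5 / 343.0)
print(round(d, 3))  # → 0.5
```

With several ultrasonic sensors, each such distance constrains the source to a sphere, so the two-dimensional or three-dimensional position can then be obtained by intersecting those constraints.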

The camera sensing unit 144 includes at least one of a camera 121, a photo sensor, and a laser sensor.

For example, the camera 121 and the laser sensor are combined with each other to sense a touch of a sensing target with respect to a three-dimensional stereoscopic image. When the distance information detected by the laser sensor is added to the two-dimensional image photographed by the camera, three-dimensional information can be obtained.

As another example, a photosensor may be stacked on a display element. The photosensor is configured to scan the movement of an object proximate to the touch screen. More specifically, the photosensor mounts photodiodes and transistors (TRs) in rows and columns, and scans the object placed on the photosensor using an electrical signal that varies according to the amount of light applied to the photodiodes. That is, the photosensor calculates the coordinates of the sensing object according to the change in the amount of light, thereby acquiring the position information of the sensing object.

The audio output module 153 can output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call-signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 153 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call-signal reception sound, a message reception sound, etc.). The sound output module 153 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 154 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events generated in the mobile terminal 100 include call-signal reception, message reception, key-signal input, touch input, and the like. The alarm unit 154 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, by vibration. Since the video signal or the audio signal may be output through the display unit 151 or the sound output module 153, the display unit 151 and the sound output module 153 may be classified as a part of the alarm unit 154.

The haptic module 155 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 155 is vibration. The intensity and pattern of the vibration generated by the haptic module 155 can be controlled by the user's selection or by the settings of the control unit. For example, the haptic module 155 may output different vibrations in combination or sequentially.

In addition to vibration, the haptic module 155 can generate various tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet port or suction port, a brush against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sensation of cold or warmth using an element capable of absorbing or emitting heat.

The haptic module 155 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sense of the finger or arm. At least two haptic modules 155 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include a storage medium of at least one type among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and transfers it to each component in the mobile terminal 100, or transmits data in the mobile terminal 100 to the external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various information for authenticating the usage right of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart-card format. Therefore, the identification device can be connected to the terminal 100 through the interface unit 170.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may be a path through which power from the cradle is supplied to the mobile terminal 100, or a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal 100, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented in the control unit 180, or may be implemented separately from the control unit 180.

 In addition, the control unit 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

In addition, if the state of the mobile terminal meets a set condition, the controller 180 can execute a lock state for restricting input of a control command of the user to the applications. Also, the controller 180 may control the lock screen displayed in the locked state based on the touch input sensed through the display unit 151 in the locked state.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein.

The software code may be implemented as a software application written in a suitable programming language. The software code is stored in the memory 160 and can be executed by the control unit 180.

FIG. 2A is a perspective view of a mobile terminal 100 according to an embodiment of the present invention.

The disclosed mobile terminal 100 includes a bar-shaped terminal body. However, the present invention is not limited thereto, and may be applied to various structures in which two or more bodies are movably coupled, such as a watch type, a clip type, a spectacle type, a folder type, a flip type, a slide type, a swing type, and a swivel type.

The body includes a case (frame, housing, cover, etc.) forming its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102, and a battery cover 103 covering the battery 191 may be detachably attached to the rear case 102.

The cases may be formed by injection molding of synthetic resin, or may be formed of metal such as stainless steel (STS), aluminum (Al), titanium (Ti), or the like.

A first sound output module 153a, a first camera 121a, a first operation unit 131, and the like are disposed on the front surface of the terminal body, and a microphone 122, an interface unit 170, a second operation unit 132, and the like are disposed on the side surface of the terminal body.

The display unit 151 is configured to display (output) information processed by the mobile terminal 100. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

The display unit 151 may include touch sensing means for receiving a control command by a touch method. When a touch is performed on any one portion of the display unit 151, the touch sensing means senses the touch, and the content corresponding to the touched position is input. The content input by the touch method may be a letter or a number, an instruction in various modes, a designatable menu item, and the like.

The touch sensing means may be formed of a translucent material so that the visual information output from the display unit 151 can be seen, and may include a structure for enhancing the visibility of the touch screen in a bright place. Referring to FIG. 2A, the display unit 151 occupies most of the front surface of the front case 101.

A first sound output module 153a and a first camera 121a are disposed in an area adjacent to one of both ends of the display unit 151, and the first operation unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The second operation unit 132 (see FIG. 2B), the interface unit 170, and the like may be disposed on the side of the terminal body.

The first sound output module 153a may be implemented in the form of a receiver for delivering a call sound to a user's ear or a loud speaker for outputting various alarm sounds or multimedia playback sounds.

The sound generated from the first sound output module 153a may be emitted along an assembly gap between structures. In this case, the hole independently formed for sound output is not externally visible or is hidden, so the appearance of the mobile terminal 100 can be simplified. However, the present invention is not limited thereto, and a hole for emitting the sound may be formed in the window.

The first camera 121a processes an image frame such as a still image or moving image obtained by the image sensor in the video communication mode or the photographing mode. The processed image frame can be displayed on the display unit 151.

The user input unit 130 is operated to receive a command for controlling the operation of the mobile terminal 100, and may include first and second operation units 131 and 132. The first and second operation units 131 and 132 may be collectively referred to as a manipulating portion, and may employ any tactile manner in which the user operates them with a tactile feeling, such as touch, push, scroll, and the like.

In the figure, the first operation unit 131 is a touch key, but the present invention is not limited thereto. For example, the first operation unit 131 may be a mechanical key, or a combination of a touch key and a mechanical key.

The contents input by the first and/or second operation units 131 and 132 may be variously set. For example, the first operation unit 131 receives commands such as menu, home key, cancel, and search, and the second operation unit 132 receives commands such as adjusting the volume of the sound output from the first sound output module 153a or switching to the touch recognition mode of the display unit 151.

The microphone 122 is configured to receive a user's voice, other sounds, and the like. The microphone 122 may be provided at a plurality of locations to receive stereophonic sound.

The interface unit 170 is a path through which the mobile terminal 100 exchanges data with an external device. For example, the interface unit 170 may be a connection terminal for wired or wireless connection to an earphone, a port for short-range communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, or the like), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 170 may be implemented as a socket for receiving an external card, such as a SIM (Subscriber Identification Module), a UIM (User Identity Module), or a memory card for storing information.

FIG. 2B is a rear perspective view of the mobile terminal 100 shown in FIG. 2A.

Referring to FIG. 2B, a second camera 121b may be additionally mounted on the rear surface of the terminal body, that is, the rear case 102. The second camera 121b may have a photographing direction substantially opposite to that of the first camera 121a (see FIG. 2A), and may be a camera having a pixel count different from that of the first camera 121a.

For example, it is preferable that the first camera 121a have a low pixel count so that the user's face can easily be captured and transmitted to the other party during a video call or the like, and that the second camera 121b have a high pixel count since it commonly photographs a general subject that is not transmitted immediately. The first and second cameras 121a and 121b may be installed in the terminal body so as to be rotatable or to pop up.

A flash 123 and a mirror 124 are further disposed adjacent to the second camera 121b. The flash 123 shines light toward the subject when the subject is photographed by the second camera 121b. The mirror 124 enables the user to see his or her own face or the like when photographing himself or herself (self-photographing) using the second camera 121b.

A second sound output module 153b may be further disposed on the rear surface of the terminal body. The second sound output module 153b may implement the stereo function together with the first sound output module 153a (see FIG. 2A) and may be used for implementing the speakerphone mode in a call.

An antenna (not shown) for receiving a broadcast signal may be additionally disposed on the side of the terminal body in addition to an antenna for a call or the like. An antenna constituting a part of the broadcast receiving module 111 (see FIG. 1) can be installed to be able to be drawn out from the terminal body.

The terminal body is provided with a power supply unit 190 (see FIG. 1) for supplying power to the mobile terminal 100. The power supply unit 190 may include a battery 191 built into the terminal body or detachable from the outside of the terminal body. This figure illustrates that the battery cover 103 is coupled to the rear case 102 so as to cover the battery 191, restricting the detachment of the battery 191 and protecting the battery 191 from external impact and foreign matter.

The control unit 180 of the mobile terminal 100 according to an embodiment of the present invention stores, in a clipboard of the memory 160, content generated by a function such as copy, cut, or screen capture, and manages the clipboard. A clipboard is a memory area reserved for temporarily storing data when copying or pasting data from one program to another, or a window for outputting such data.

Hereinafter, a clipboard of the mobile terminal 100 according to an embodiment of the present invention will be described in detail with reference to FIG. 3 to FIG.

FIGS. 3 and 4 are views for explaining a method of outputting a clipboard in a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 3, the display unit 151 of the mobile terminal 100 according to an exemplary embodiment of the present invention outputs a virtual keypad 240, an input window 210 for outputting data input by the virtual keypad 240, and a cursor 220 indicating the input position of the data.

At this time, a control menu for performing at least one of paste, cut, copy, and clipboard may be output through a pop-up window 230 by a user's operation. For example, if the touch input to the cursor continues for a predetermined time, the control unit 180 may output the pop-up window 230 at a position adjacent to the cursor.

The control unit 180 can output at least one of the clipboard 250 and a clipboard window open/close icon 260 in one area of the display unit 151.

The clipboard is divided into a plurality of sections, and the contents included in the clipboard can be output to the sections. At this time, the clipboard 250 may be output differently depending on whether the state of the terminal is portrait (see FIG. 3B) or landscape (see FIG. 3C). That is, the layout of the clipboard can be changed according to the state of the terminal. As the layout is changed, the number and size of sections output to one area of the display unit 151 can be changed.

FIG. 4 is a diagram illustrating a method of outputting a clipboard in a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 4, the display unit 151 may output a clipboard window open/close icon 260. The clipboard 250 may be output, or the clipboard 250 may be made to disappear, by touching the open/close icon 260.

When an output command for the clipboard is detected, the controller 180 may output at least a part of the clipboard 250 from one end of the display unit 151 in a pushing manner. At this time, the clipboard 250 may be output instead of the virtual keypad in the area where the virtual keypad is output.

FIG. 5 is a view for explaining a method of controlling a clipboard in a mobile terminal according to an embodiment of the present invention.

Not all of the contents included in the clipboard can be output, due to the physical limitations of the display unit 151. In this case, a predetermined number of contents can be output preferentially. The remaining contents that have not been output may then be output by scrolling, flicking, or the like.

FIG. 6 is a diagram for explaining a paste function for contents included in a clipboard in a mobile terminal according to an embodiment of the present invention.

When a touch input to at least one of the contents included in the clipboard is sensed, the controller 180 may input the content for which the touch was detected at the position where the cursor 220 is output. That is, the paste function is executed for the content for which the touch input was detected.

Although not shown in the drawing, the controller 180 may output the cursor 220 at the position where a touch input is sensed on the input window 210. That is, the position of the cursor 220 can be changed based on the touch input.

A mobile terminal that changes the order of contents contained in the clipboard described above with reference to FIGS. 3 to 6, and a control method thereof, will be described below with reference to FIGS. 7 to 10.

FIG. 7 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 7, a method for controlling a mobile terminal according to an exemplary embodiment of the present invention includes receiving an output command for a clipboard divided into a plurality of sections (S110).

An output command for the clipboard can be input, for example, by the pop-up window 230 described above with reference to FIG. 3, or, for another example, by the clipboard window open/close icon 260 described above with reference to FIG. 4. In yet another example, the virtual keyboard includes an icon formed to output the clipboard, and an output command for the clipboard can be input by a touch input to the icon.

Next, in response to the output command, a step (S120) of sorting the contents included in the clipboard according to a setting criterion, and a step (S130) of outputting the sorted contents to the divided sections, may be performed. The setting criterion is set differently according to the user's selection, and the contents are output in different sort orders depending on the setting criterion.

Generally, a clipboard outputs contents in the reverse of the order in which they were stored. However, in outputting the clipboard, the mobile terminal according to an embodiment of the present invention may change the sort order of the contents based on the user's selection.

The controller 180 classifies the contents into a plurality of categories according to the type of data included in the contents, and uses at least one of the categories as the setting criterion for sorting the contents.

In an embodiment, the category of a content may be determined depending on what data the content contains. For example, if a content contained in the clipboard is "http://www.lg.com", its category is classified as "URL (uniform resource locator)"; if a content is "02-555-1111", its category is classified as "number". A category may be classified as at least one of text, image, number, and URL depending on the type of data included in the content.
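As an illustrative, non-limiting sketch of this classification step, a content could be assigned one of the categories named above; the regular-expression patterns below are assumptions for illustration, not the patent's actual implementation.

```python
import re

# Illustrative patterns only; a real implementation could use any rule set.
URL_RE = re.compile(r"^(https?://|www\.)\S+$")
NUMBER_RE = re.compile(r"^[\d\-\s()+]+$")  # digits plus phone punctuation

def classify(content: str, is_image: bool = False) -> str:
    """Assign a clipboard content to one of: image, URL, number, text."""
    if is_image:
        return "image"
    if URL_RE.match(content):
        return "URL"
    if NUMBER_RE.match(content):
        return "number"
    return "text"

print(classify("http://www.lg.com"))  # → URL
print(classify("02-555-1111"))        # → number
print(classify("hello clipboard"))    # → text
```

The two examples from the text ("http://www.lg.com" as URL, "02-555-1111" as number) fall out of these rules directly.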

The control unit 180 may arrange the contents using at least one category selected by the user as the setting criterion. For example, when the URL category is selected by the user, the contents classified into the URL category among the contents included in the clipboard can be preferentially output to the clipboard. Alternatively, only the contents classified into the URL category may be output to the clipboard.
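The preferential and exclusive output modes just described might be sketched as follows; the (category, content) tuple representation of clipboard items is an assumption for illustration.

```python
# Illustrative only: sort so that items in the user-selected category come
# first, keeping the usual newest-first order within each group.

def sort_clipboard(items, selected_category, exclusive=False):
    """items: list of (category, content) tuples, newest last as stored."""
    if exclusive:  # output only the selected category
        items = [it for it in items if it[0] == selected_category]
    newest_first = list(reversed(items))  # the default clipboard order
    # stable sort: selected category floats to the front, order otherwise kept
    return sorted(newest_first, key=lambda it: it[0] != selected_category)

clip = [("text", "memo"), ("URL", "http://www.lg.com"), ("number", "02-555-1111")]
print(sort_clipboard(clip, "URL"))
# → [('URL', 'http://www.lg.com'), ('number', '02-555-1111'), ('text', 'memo')]
```

Because Python's sort is stable, items outside the selected category keep their reverse-stored order, matching the behavior described above.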

Hereinafter, a mobile terminal in which the setting criterion is set differently according to the user's selection will be described with reference to FIGS. 8, 9, and 10.

Referring to FIG. 8, the mobile terminal 100 according to the present invention may set the setting criterion differently according to the position of the cursor 220. The cursor 220 is output at the position for pasting at least one of the contents included in the clipboard, blinks to indicate that data can be input, and moves as data is input. In addition, the position of the cursor 220 can be changed by the user's input.

At this time, the controller 180 analyzes the attribute of the input window where the cursor 220 is located and identifies the type of data that can be input to the input window. The contents classified into a category that can be input in the input window may be preferentially output to the clipboard, or only the contents classified into that category may be output to the clipboard.

FIG. 8A shows a mobile terminal 100 in which there is no restriction on the input window where the cursor 220 is located; the contents included in the clipboard are output in the reverse of the order in which they were stored.

Referring to FIG. 8B, when the cursor 220 is displayed on the address bar of a browser, since the data to be input to the input window is a URL, the contents classified into the URL category may be preferentially displayed on the clipboard, or only the contents classified into the URL category may be output.

Referring to FIG. 8C, when the cursor 220 is displayed on a window for inputting a telephone number, since the data to be input is a number, the contents classified into the number category may be preferentially displayed on the clipboard, or only the contents classified into the number category may be output.

Referring to FIG. 8D, when data related to an image cannot be input to the window where the cursor 220 is output, the contents classified into the image category may be output on the clipboard in the lowest priority order, or may not be output on the clipboard at all. That is, only contents other than those classified into the image category are output to the clipboard.
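The cursor-dependent behaviors of FIGS. 8A to 8D might be sketched as a single ordering function; the window-type names and the mapping to preferred categories below are assumptions for illustration only.

```python
# Illustrative only: reorder clipboard items by the attribute of the input
# window that holds the cursor, per the FIG. 8 scenarios.

PREFERRED = {"address_bar": "URL", "phone_field": "number"}  # assumed names

def order_for_window(items, window_type):
    """items: list of (category, content). Put contents matching the window's
    expected category first; for windows that cannot accept images, push image
    contents to the end (as in FIG. 8D)."""
    want = PREFERRED.get(window_type)
    if want is not None:
        return sorted(items, key=lambda it: it[0] != want)
    return sorted(items, key=lambda it: it[0] == "image")  # images last

clip = [("image", "photo.png"), ("URL", "http://www.lg.com"), ("number", "02-555-1111")]
print(order_for_window(clip, "phone_field"))
# → [('number', '02-555-1111'), ('image', 'photo.png'), ('URL', 'http://www.lg.com')]
```

An exclusive variant (outputting only the matching category, as the text also allows) would simply filter the list instead of sorting it.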

As described above with reference to FIG. 8, the setting criterion is changed not only according to the attribute of the input window where the cursor 220 is located; it may also be changed by the user directly selecting at least one of the categories.

Referring to FIGS. 9 and 10, the mobile terminal 100 according to the present invention can receive, from the user, a selection of at least one category to be used as the setting criterion among preset categories.

For example, the controller 180 may output, together with the clipboard, a category menu 270 configured for selecting at least one of the categories. When the number menu is selected from the category menu 270, the controller 180 may preferentially output the contents included in the number category to the clipboard 250.

The category of a content can be classified in various ways, for example according to the date the content was added to the clipboard, the source of the content, and the like. Accordingly, the category menu 270 may also include various categories. For example, referring to FIG. 9, categories classified according to the type of data may be output in the category menu 270, or, as shown in FIG. 10, categories classified according to the date stored in the clipboard may be output.

Although not shown in the figure, categories classified according to the source of the content may also be output to the category menu 270. When the content stored in the clipboard is a copy, the location where the original was stored becomes the source of the content. For example, if first content was cut in a word processor, the source of the first content is the word processor; if second content was copied in a browser, the source of the second content may be the browser.
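Building the date- and source-based groupings for the category menu might be sketched as below; the record fields (`stored_date`, `source_app`) and sample values are assumptions for illustration.

```python
from collections import defaultdict

# Illustrative only: group clipboard records by an arbitrary attribute so
# the category menu 270 can list date-based or source-based categories.

def group_by(items, field):
    """items: list of dicts; returns {field value: [contents in stored order]}."""
    groups = defaultdict(list)
    for it in items:
        groups[it[field]].append(it["content"])
    return dict(groups)

clip = [
    {"content": "http://www.lg.com", "stored_date": "2013-05-09", "source_app": "browser"},
    {"content": "not all",           "stored_date": "2013-05-09", "source_app": "word processor"},
    {"content": "02-555-1111",       "stored_date": "2013-05-08", "source_app": "dialer"},
]
print(group_by(clip, "source_app"))
```

The same helper serves both FIG. 10's date categories (`group_by(clip, "stored_date")`) and the source categories described in this paragraph.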

This category menu 270 may be preset by the manufacturer at the time of factory release, and categories may be added, deleted, or modified by the user.

As described above, according to the present invention, in outputting the clipboard, the contents included in the clipboard can be sorted according to a setting criterion that depends on the user's selection, and output in the sorted order. This enables the user to find the desired content on the clipboard more quickly and to use the clipboard conveniently.

FIG. 11 is a diagram for explaining a method of changing the layout of the clipboard in a mobile terminal according to an embodiment of the present invention.

The control unit 180 can change the layout of the clipboard output to the display unit 151 based on at least one of the size of the region where the clipboard is output and the number of contents included in the clipboard. For example, if the number of contents contained in the clipboard is two, the size of each section may be larger than when it is three.
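A minimal, non-limiting sketch of such count-dependent sizing follows; the equal-share rule and the cap on visible sections are assumptions for illustration, not the patent's layout algorithm.

```python
# Illustrative only: each visible section shares the clipboard width equally,
# so fewer contents yield larger sections.

def section_width(clipboard_width: int, n_contents: int, max_visible: int = 4) -> int:
    """Width (px) of one section; at most max_visible sections are shown."""
    visible = max(1, min(n_contents, max_visible))
    return clipboard_width // visible

print(section_width(480, 2))  # → 240  (two contents: wider sections)
print(section_width(480, 3))  # → 160  (three contents: narrower sections)
```

The same rule can be applied to both width and height when the terminal switches between portrait and landscape layouts, as described with reference to FIG. 3.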

When one end of any one of the sections is dragged to a position on the clipboard, the controller 180 can change the size of that section. At this time, only the size of that section may be changed, the relative size ratio of the sections may be maintained, or the layout of all the sections may be changed.

Thus, the user can adjust the layout of the clipboard in a style suitable for him / herself, so that user convenience can be increased.

FIGS. 12A, 12B, 12C, and 13 are views for explaining a method of editing at least a part of the contents output on a clipboard in a mobile terminal according to an embodiment of the present invention.

The mobile terminal according to an embodiment of the present invention can receive an editing command for any one of the contents included in the clipboard. That is, the user may delete at least a portion of a content contained in the clipboard, change it to different content, or add new content to it.

To this end, the control unit 180 may execute the edit mode in response to an edit command for any one of the contents, and may select at least a portion of that content based on the user's input. The control unit 180 can then edit the selected at least a portion. Since the entire content is also "at least a portion" of the content, the entire content may be edited as well.

Referring to FIG. 12A, a paste function can be executed for at least a portion of a content included in the clipboard. That is, the selected at least a portion may be input at the position where the cursor 220 is output.

In addition, referring to FIG. 12B, the selected at least a portion may be combined with other content included in the clipboard 250. Since the entire content is also at least a portion, the entire content may be combined with other content.

For example, FIG. 12B shows a mobile terminal 100 in which first content 254 composed of text and second content 256 composed of an image are output to the clipboard 250, and the edit mode has been executed for the first content 254. If at least a portion of the first content 254, "not all," is selected by the user's input and dragged into the second content 256, the control unit 180 combines the selected portion ("not all") with the second content 256. Thus, the second content becomes content composed of text and an image.
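The merge in FIG. 12B might be sketched as follows; representing a content as a dictionary with `image` and `text` parts is an assumption for illustration only.

```python
# Illustrative only: combine a text span selected in edit mode with another
# clipboard content, leaving the target content itself unmodified.

def merge(selected_text, target):
    """Return a copy of target with selected_text appended to its text part."""
    merged = dict(target)  # do not mutate the clipboard item in place
    merged["text"] = (merged.get("text", "") + selected_text).strip()
    return merged

second = {"image": "photo.png"}          # image-only content 256
combined = merge("not all", second)      # drag "not all" onto it
print(combined)  # → {'image': 'photo.png', 'text': 'not all'}
```

After the merge, the result holds both a text part and an image part, matching the combined content described above.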

FIG. 13 is a diagram for explaining a control method used when the content for which the edit mode is executed includes an image.

Referring to FIG. 13, when an edit command for a content including an image is input, the mobile terminal 100 according to the embodiment of the present invention can output, to the display unit 151, filtered images to which effect filters for the image have been applied.

An effect filter is a mechanism for adjusting the propagation of light entering a lens, producing various effects by refracting, dispersing, selectively blocking, or passing the light. The controller 180 can generate, as the filtered images, copies of the image to which such effect filters have been applied. For example, a monochrome image, produced by applying a monochrome effect filter to an original color image, can be generated as a filtered image.

If any one of the filtered images is selected, the control unit 180 can replace the image included in the clipboard 250 with the selected filtered image. That is, the image can be edited into the selected filtered image.
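The monochrome example can be sketched as follows, with pixels represented as (R, G, B) tuples. The BT.601 luminance weights are our assumption; the patent does not specify how the monochrome filter is computed.

```python
# Sketch: generate a monochrome filtered copy of a color image, then
# replace the clipboard image with whichever filtered image is selected.

def monochrome(pixels):
    """Return a grayscale copy using BT.601 luma weights (assumed)."""
    out = []
    for r, g, b in pixels:
        y = round(0.299 * r + 0.587 * g + 0.114 * b)
        out.append((y, y, y))
    return out

def replace_image(clipboard, filtered_images, selected_index):
    """Replace the clipboard's image with the selected filtered one."""
    clipboard["image"] = filtered_images[selected_index]
    return clipboard
```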

Thus, according to the present invention, since at least a part of the contents included in the clipboard can be edited, the user can use the clipboard dynamically. Because contents on the clipboard can be merged, and at least a portion of a content can be modified or deleted, there is no need to paste a content and then re-edit it; contents can be modified on the clipboard itself in a simple way.

FIG. 14 is a view for explaining a method of outputting a paste execution history of contents included in a clipboard in a mobile terminal according to an embodiment of the present invention.

The mobile terminal according to the embodiment of the present invention can store an execution history each time a control function is executed on a content included in the clipboard. The control function means at least one of copy, cut, screen capture, and paste. The execution history may include the time at which the control function was executed, the source of the content, information about the application in which the control function was executed, and the like.

When a detailed-view command for one of the contents included in the clipboard is received, the execution history for that content can be output. For example, referring to FIG. 14, when a detailed-view command for one of the contents is received while the clipboard 250 is being output, information indicating where the original was copied from and where it was pasted is output to the display unit 151.

In addition, the execution history may include a shortcut link to the application in which the control function was executed. The control unit 180 can execute that application using the shortcut link.

For example, referring to FIG. 14, a first execution history showing that a female image (one of the contents included in the clipboard) was transmitted as a message to TOMMY on January 24, 2013, and a second execution history showing that the same image was transmitted as a message to JANE, can be confirmed. If a touch input on the first execution history is detected, the controller 180 can execute the message application using the shortcut link included in the first execution history.
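A record shape for such execution-history entries might look as follows; the field names and the shortcut-link format are illustrative assumptions, not the patent's.

```python
# Hypothetical execution-history entry: the date, the control function,
# the application, and a shortcut link used to relaunch that
# application when the entry is touched.
from dataclasses import dataclass

@dataclass
class HistoryEntry:
    date: str            # when the control function was executed
    action: str          # "copy", "cut", "capture" or "paste"
    app: str             # application in which it was executed
    shortcut_link: str   # link used to re-launch the application

def on_history_touched(entry, launch):
    """Touching an entry launches its application via the shortcut."""
    return launch(entry.shortcut_link)
```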

Thus, according to the present invention, since the mobile terminal provides the execution history of a content included in the clipboard, the user can check whether a content stored in the clipboard by copying or cutting has been pasted into an application. In addition, the source of each content included in the clipboard can be confirmed. Accordingly, the convenience of the user using the clipboard can be increased.

FIG. 15 is a diagram for explaining a method of outputting a clipboard when a copy or cut function is executed in a mobile terminal according to an embodiment of the present invention.

When the copy or cut function is executed, the control unit 180 can output the clipboard 250 to one area of the display unit 151 and add the copied or cut data to the clipboard.

For example, referring to FIG. 15, an image and the text "BREAKFAST WITH SLICED LEMON." may be copied by user input. The control unit 180 outputs the clipboard 250 to one area of the display unit 151 and outputs the copied data to one of the sections of the clipboard 250. After a preset time, the output clipboard can be made to disappear. This allows the user to confirm that the data has been added to the clipboard.
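The show-then-auto-dismiss behavior can be sketched as below. The timeout is simulated with a tick counter; the value of `PRESET_TICKS` is an assumption, as the patent says only "a preset time".

```python
# Simplified sketch: executing copy appends the data to a section,
# shows the clipboard, and hides it again after a preset time.

class ClipboardView:
    PRESET_TICKS = 3  # assumed timeout, not specified by the patent

    def __init__(self):
        self.visible = False
        self.sections = []
        self.ticks_left = 0

    def on_copy(self, data):
        self.sections.append(data)   # add copied data to a section
        self.visible = True          # show the clipboard
        self.ticks_left = self.PRESET_TICKS

    def tick(self):
        # Called periodically; hides the clipboard when time runs out.
        if self.visible:
            self.ticks_left -= 1
            if self.ticks_left <= 0:
                self.visible = False
```

Note that only the view disappears; the copied data remains stored in its section.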

FIG. 16 is a diagram for explaining a method of outputting the contents included in a clipboard in a mobile terminal according to an embodiment of the present invention.

Contents stored in the clipboard can be classified into text, image, and a mixture of the two (text + image). Likewise, the attributes of the input window into which clipboard contents are pasted can be divided, according to the type of data that can be input, into text (text only), image (image only), and a mixture of the two (text + image).

At this time, the controller 180 can output the contents included in the clipboard differently according to the attribute of the input window in which the cursor is located. In particular, depending on the attributes of the input window, contents that can be pasted and contents that cannot may be output differently.

Referring to FIG. 16, a method of outputting copied items according to the field type of an input window is shown. For example, if the cursor is placed in a text-only input window, an image contained in a content may be displayed more transparently than the text, or an object indicating that the image cannot be input may be displayed over the image.
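One way to realize this check is sketched below, under the assumption that a content is pastable only when the input window's field type accepts every kind of data the content carries; contents as dictionaries and the field-type strings are our illustrative choices.

```python
# Classify each clipboard content, then decide whether it can be
# pasted into an input window of a given field type.

def classify(content):
    """Return 'text', 'image', or 'text+image' for a content dict."""
    has_text = bool(content.get("text"))
    has_image = bool(content.get("image"))
    if has_text and has_image:
        return "text+image"
    return "text" if has_text else "image"

def is_pastable(content, field_type):
    """field_type is 'text', 'image', or 'text+image' (mixed input)."""
    if field_type == "text+image":
        return True  # a mixed input window accepts any content
    return classify(content) == field_type
```

A renderer could then dim or overlay the non-pastable contents rather than hide them, matching the transparent display described above.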

Although not shown in the drawing, the mobile terminal according to the embodiment of the present invention can output the contents by category. For example, a different color may be designated for each category, and the border of a content may be output in the color of its category. As another example, a different object may be designated for each category, and the object of the corresponding category may be output at a position on the content.
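The per-category border coloring could be as simple as a lookup table; the actual colors are not specified by the patent, so these values are placeholders.

```python
# Illustrative category-to-color mapping for content borders.
CATEGORY_COLORS = {"text": "blue", "image": "green", "text+image": "orange"}

def border_color(category):
    """Color used to draw the border of a content of this category."""
    return CATEGORY_COLORS.get(category, "gray")  # gray as a fallback
```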

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (22)

1. A mobile terminal comprising:
A display unit displaying a clipboard, the clipboard being divided into a plurality of sections in at least a partial area;
A memory for storing contents included in the clipboard; And
A control unit for sorting the contents according to a setting criterion and outputting the sorted contents in the sorted order,
Wherein the setting criterion is set differently according to a user's selection, and the contents are output in a different sort order according to the setting criterion.
2. The mobile terminal of claim 1,
Wherein the control unit classifies the contents into a plurality of categories according to the type of data included in the contents, and uses at least one of the categories as the setting criterion.
3. The mobile terminal of claim 2,
Wherein the display unit outputs a cursor at a position for pasting at least one of the contents,
And wherein the at least one category is at least one category, among the categories, that includes data that can be pasted into the window in which the cursor is located.
4. The mobile terminal of claim 2,
Wherein the display unit outputs, together with the clipboard, a category menu configured to select at least one of the categories,
And wherein the at least one category is at least one category selected from the categories by the user.
5. The mobile terminal of claim 2,
Wherein the controller divides the contents by category and outputs the classified contents.
6. The mobile terminal of claim 1,
Further comprising a user input unit for receiving an edit command for any one of the contents,
Wherein the control unit executes an edit mode for the any one content in response to the edit command, and selects at least a portion of the any one content based on a user input.
7. The mobile terminal of claim 6,
Wherein the control unit adds the selected at least a portion to another one of the contents when the selected at least a portion is dragged onto the other content.
8. The mobile terminal of claim 6,
Wherein the control unit edits the selected at least a portion on the clipboard based on a user input, or pastes the selected portion at a position where a cursor is output.
9. The mobile terminal of claim 6,
Wherein the control unit outputs filtered images to which effect filters are applied when the any one content includes an image.
10. The mobile terminal of claim 9,
Wherein the control unit replaces the image with a filtered image selected from among the filtered images.
11. The mobile terminal of claim 1,
Wherein the controller outputs an execution history of the paste function for the any one content when a detailed-view command for any one of the contents is received.
12. The mobile terminal of claim 11,
Wherein the execution history includes at least one of a date on which the paste function was executed, information on the application, and a shortcut link.
13. The mobile terminal of claim 12,
Wherein the control unit executes the application when a touch input on the execution history is sensed.
14. The mobile terminal of claim 1,
Wherein the control unit changes a layout of the sections based on a drag on at least one of the sections.
15. The mobile terminal of claim 14,
Wherein the relative size ratio of the sections is maintained.
16. A control method for a mobile terminal, the method comprising:
Receiving an output command for a clipboard divided into a plurality of sections;
Sorting contents included in the clipboard according to a setting criterion in response to the output command; And
Outputting the sorted contents, in the sorted order, to the divided sections,
Wherein the setting criterion is set differently according to a user's selection, and the contents are output in a different sort order according to the setting criterion.
17. The method of claim 16,
Wherein the step of sorting the contents according to a setting criterion comprises:
Classifying the contents into a plurality of categories according to the type of data included in the contents; And
Sorting the contents using at least one of the categories as the setting criterion.
18. The method of claim 17,
Wherein the at least one category is at least one category, among the categories, that includes data that can be pasted into the window in which the cursor is located.
19. The method of claim 17,
Further comprising outputting a category menu configured to select at least one of the categories,
Wherein the at least one category is at least one category selected from the categories by the user.
20. The method of claim 16,
Further comprising: receiving an edit command for any one of the contents; And
Editing at least a portion of the any one content in response to the edit command.
21. The method of claim 16,
Further comprising: receiving a detailed-view command for any one of the contents; And
Outputting an execution history of the paste function for the any one content,
Wherein the execution history includes at least one of a date on which the paste function was executed, information on the application, and a shortcut link.
22. The method of claim 21,
Further comprising executing the application when a touch input on the execution history is detected.
KR1020130052769A 2013-05-09 2013-05-09 Mobile terminal and control method thereof KR102026639B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130052769A KR102026639B1 (en) 2013-05-09 2013-05-09 Mobile terminal and control method thereof


Publications (2)

Publication Number Publication Date
KR20140133130A true KR20140133130A (en) 2014-11-19
KR102026639B1 KR102026639B1 (en) 2019-09-30

Family

ID=52453866

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130052769A KR102026639B1 (en) 2013-05-09 2013-05-09 Mobile terminal and control method thereof

Country Status (1)

Country Link
KR (1) KR102026639B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018092972A1 (en) * 2016-11-21 2018-05-24 엘지전자 주식회사 Mobile terminal and control method therefor
KR20190054295A (en) * 2017-11-13 2019-05-22 삼성전자주식회사 Display apparauts and control method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010045591A (en) * 1999-11-05 2001-06-05 윤종용 A computer system having multi-clipboard
KR20030004598A (en) * 2001-07-05 2003-01-15 안병곤 Method for providing expansion clipboard
KR20040077530A (en) * 2003-02-28 2004-09-04 마이크로소프트 코포레이션 Method and system for enhancing paste functionality of a computer software application
KR20040079465A (en) * 2003-03-07 2004-09-16 에스케이텔레텍주식회사 A mobile phone with clipboard function
KR20120055876A (en) * 2010-11-24 2012-06-01 엘지전자 주식회사 Mobile terminal and method for controlling multiple clipboard
JP2013016177A (en) * 2011-07-05 2013-01-24 Nhn Corp Document link system and method for aligning document stored in clip board of cloud foundation and displaying the same along with linkable service


Also Published As

Publication number Publication date
KR102026639B1 (en) 2019-09-30


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant