KR20160046593A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same Download PDF

Info

Publication number
KR20160046593A
Authority
KR
South Korea
Prior art keywords
floating
icon
menu
mobile terminal
screen
Prior art date
Application number
KR1020140142719A
Other languages
Korean (ko)
Inventor
이강
노병권
신재동
한충신
김수연
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020140142719A
Publication of KR20160046593A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C 17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G08C 2201/00 Transmission systems of control signals via wireless link
    • G08C 2201/30 User interface
    • G08C 2201/90 Additional features
    • G08C 2201/93 Remote control using other portable devices, e.g. mobile phone, PDA, laptop

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a mobile terminal and a control method thereof. The mobile terminal includes a touch screen and a controller that, when a preset user input is received, displays on the top layer of the touch screen a floating icon through which data sharing setting information including shared object data and a shared address is received, and transmits the shared object data to the shared address according to that information. Accordingly, a plurality of menus required for data sharing are provided in a floating manner without switching the screen currently being displayed, so that data can be shared with a plurality of parties through a simple input, without a separate menu selection or screen switching process.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a mobile terminal that provides a plurality of menus required for data sharing in a floating manner, without switching the screen being displayed during data sharing, so that data can be shared with a plurality of parties through a simple input and without a separate menu selection or screen switching process, and to a control method thereof.

A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.

The functions of mobile terminals are becoming increasingly diversified. Examples include data and voice communication, still image and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally provide an electronic game function or a multimedia player function. In particular, recent mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, or television programs.

As these functions become more diversified, such a terminal is implemented in the form of a multimedia device having complex functions such as capturing still images or video, playing music or video files, gaming, and receiving broadcasts.

In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and/or software aspects of the terminal.

The present invention is directed to solving the above-mentioned problems and other problems. An object of the present invention is to provide a mobile terminal, and a control method thereof, that provide a plurality of menus required for data sharing in a floating manner without switching the currently displayed screen, so that data can be shared with a plurality of parties through a simple input without a separate menu selection or screen switching process.

According to an aspect of the present invention, there is provided a mobile terminal comprising: a touch screen; and a controller configured to display, on a top layer of the touch screen, a floating icon through which data sharing setting information including shared object data and a shared address is received when a preset user input is received, and to transmit the shared object data to the shared address according to the data sharing setting information.

Here, the control unit may generate a storage box for registering the data sharing setting information, and may display the generated storage box on the same layer as the floating icon.

The controller may display the floating menu corresponding to the floating icon on the same layer as the floating icon when the floating icon is selected.

Here, the floating icon may include at least one of a storage box icon, an application selection menu icon, an address book icon, and a preview icon.

The controller may display the floating menu corresponding to the floating icon when the floating icon is selected, and may register the data sharing setting information upon receiving an operation of dragging an item of the floating menu into the storage box.

Also, the controller may delete the corresponding data sharing setting information upon receiving an operation of dragging a menu item registered in the storage box out of the storage box.

When any one of the items of the floating menu is dragged into the storage box, the controller may remove that item from the floating menu display; when an item is dragged out of the storage box, the controller may restore it to its original position in the floating menu and display it.

When the preset user input is received, the controller may set the location information selected by the user as the shared object data, display on the touch screen a floating icon for receiving the shared address with which the location information is to be shared, and transmit the location information to the address input through the floating icon.

According to another aspect of the present invention, there is provided a method for controlling a mobile terminal, the method comprising: displaying, on a top layer of a touch screen, a floating icon through which data sharing setting information including shared object data and a shared address is received when a preset user input is received; registering the data sharing setting information input through the floating icon; and transmitting the shared object data to the shared address according to the data sharing setting information.

The registering of the data sharing setting information input through the floating icon may include generating a storage box in which the data sharing setting information is registered, and displaying the generated storage box on the same layer as the floating icon.

The registering of the data sharing setting information input through the floating icon may include displaying the floating menu corresponding to the floating icon on the same layer as the floating icon when the floating icon is selected.

The registering of the data sharing setting information input through the floating icon may further include: displaying the floating menu corresponding to the floating icon on the same layer as the floating icon when the floating icon is selected; and registering the data sharing setting information upon receiving an operation of dragging an item of the floating menu into the storage box.

Here, if any one of the items of the floating menu is dragged into the storage box, the corresponding item may be removed from the floating menu display.

The method may further include deleting the data sharing setting information by receiving an operation of dragging a menu item registered in the storage box out of the storage box.

The method may further include restoring an item dragged out of the storage box to a corresponding position of the floating menu and displaying the restored item.

Effects of the mobile terminal and the control method according to the present invention will be described as follows.

According to at least one embodiment of the present invention, a plurality of menus necessary for data sharing are provided in a floating manner without switching the currently displayed screen, so that data can be shared with a plurality of parties through a simple input on the current screen, without a separate menu selection or screen switching process.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
FIG. 2 is a control flowchart of a mobile terminal according to the present invention.
FIG. 3 is a conceptual diagram of a floating menu provided when sharing data in a mobile terminal according to the present invention.
FIG. 4 is a diagram for explaining a process of deleting a selected item from a floating menu provided in a mobile terminal according to the present invention.
FIGS. 5 and 6 are views for explaining a process of sharing text using a floating menu provided in a mobile terminal according to the present invention.
FIGS. 7 and 8 are views for explaining a process of sharing location information using a floating menu provided in a mobile terminal according to the present invention.
FIGS. 9 and 10 are views for explaining a process of sharing schedule information using a floating menu provided in a mobile terminal according to the present invention.
FIG. 11 is a view for explaining a process of sharing video capture information using a floating menu provided in a mobile terminal according to the present invention.
FIG. 12 is a diagram for explaining a process of sharing game information using a floating menu provided in a mobile terminal according to the present invention.
FIG. 13 is a diagram for explaining a process of providing an invitation message using a floating menu provided in a mobile terminal according to the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of description, and do not themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art will be omitted when it is determined that such a description would obscure the gist of the embodiments disclosed herein. The accompanying drawings are provided only to facilitate understanding of the embodiments disclosed herein, and the technical idea disclosed herein is not limited by the accompanying drawings but should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.

Terms including ordinals, such as first, second, and the like, may be used to describe various elements, but the elements are not limited by these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are intended to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and digital signage, except where the configurations apply only to mobile terminals.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. The voice data or image data collected by the input unit 120 may be analyzed and processed into a user's control command.

The sensing unit 140 may include at least one sensor for sensing at least one of information within the mobile terminal, surrounding environment information of the mobile terminal, and user information. For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint scan sensor, an ultrasonic sensor, a microphone (see 122), a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the mobile terminal disclosed in this specification may combine and utilize information sensed by at least two of these sensors.

The output unit 150 is for generating an output related to visual, auditory, or tactile senses, and may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154. The display unit 151 may form a mutual layer structure with a touch sensor or may be formed integrally therewith, thereby realizing a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port. In the mobile terminal 100, appropriate control related to a connected external device may be performed in response to the connection of the external device to the interface unit 160.

In addition, the memory 170 stores data supporting various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs (or applications) running on the mobile terminal 100, as well as data and commands for the operation of the mobile terminal 100. At least some of these application programs may be downloaded from an external server via wireless communication. Also, at least some of these application programs may exist on the mobile terminal 100 from the time of shipment for the basic functions of the mobile terminal 100 (e.g., call receiving and placing functions, message receiving and sending functions). The application programs may be stored in the memory 170, installed on the mobile terminal 100, and driven by the control unit 180 to perform the operation (or function) of the mobile terminal.

In addition to the operations related to the application programs, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may process signals, data, information, and the like input or output through the above-described components, or may drive an application program stored in the memory 170, thereby providing or processing information or functions appropriate to the user.

In addition, the control unit 180 may control at least some of the components illustrated in FIG. 1 in order to drive an application program stored in the memory 170. Furthermore, the control unit 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other in order to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with one another to implement the operation, control, or control method of the mobile terminal according to the various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.

Hereinafter, the components listed above will be described in more detail with reference to FIG. 1 before explaining various embodiments implemented through the mobile terminal 100 as described above.

First, referring to the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced)).

The wireless signals may include voice call signals, video call signals, or various types of data according to text/multimedia message transmission and reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Examples of wireless Internet technologies include WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced), and the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 performing such wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 112.

The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies. The short-range communication module 114 may support wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located, through short-range wireless area networks. The short-range wireless area networks may be short-range wireless personal area networks.

Here, the other mobile terminal 100 may be a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with the mobile terminal 100 according to the present invention. The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Furthermore, if the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the control unit 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Therefore, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can answer the call through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.

The location information module 115 is a module for obtaining the position (or current position) of the mobile terminal, and representative examples thereof are a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire its position using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire its position based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. Optionally, the location information module 115 may additionally or alternatively perform any of the other functions of the wireless communication unit 110 to obtain data relating to the location of the mobile terminal. The location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.

Next, the input unit 120 is for inputting image information (or signals), audio information (or signals), data, or information input from a user. For inputting image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Meanwhile, the plurality of cameras 121 provided in the mobile terminal 100 may be arranged in a matrix structure, and through the cameras 121 in the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire left and right images for realizing a stereoscopic image.

The microphone 122 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 100. Meanwhile, the microphone 122 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include mechanical input means (for example, a button, dome switch, jog wheel, or jog switch located on the front, rear, or side of the mobile terminal 100) and touch-type input means. As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or visual key can be displayed on the touch screen in various forms and may consist of, for example, graphics, text, icons, video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a corresponding sensing signal. The control unit 180 may control the driving or operation of the mobile terminal 100 or may perform data processing, function or operation related to the application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects the presence of an object approaching a predetermined detection surface, or the presence of an object in the vicinity of the detection surface, without mechanical contact by using electromagnetic force or infrared rays. The proximity sensor 141 may be disposed in the inner area of the mobile terminal or in proximity to the touch screen, which is covered by the touch screen.

Examples of the proximity sensor 141 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. In the case where the touch screen is electrostatic, the proximity sensor 141 can be configured to detect the proximity of the object with a change of the electric field along the proximity of the object having conductivity. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

On the other hand, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen while being in proximity to it without contacting it is referred to as a "proximity touch," and the act of an object actually contacting the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched. The proximity sensor 141 can detect a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, etc.). Meanwhile, the control unit 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 141, and may further output visual information corresponding to the processed data on the touch screen. Furthermore, the control unit 180 can control the mobile terminal 100 such that different operations or data (or information) are processed depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.

The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, and an ultrasonic type.

For example, the touch sensor may be configured to convert a change in a pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion to an electrical input signal. The touch sensor may be configured to detect a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like where a touch object touching the touch screen is touched on the touch sensor. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.

Thus, when there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the controller 180. Thus, the control unit 180 can know which area of the display unit 151 is touched or the like. Here, the touch controller may be a separate component from the control unit 180, and may be the control unit 180 itself.

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

Meanwhile, the touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of a sensing object using ultrasonic waves. Meanwhile, the control unit 180 can calculate the position of a wave source based on information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave source can be calculated from the time difference between the arrival of the light, used as a reference signal, and the arrival of the ultrasonic wave.
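
As a purely illustrative aside (not part of the original disclosure), the time-difference relationship described above can be expressed as a short calculation; the constant and function names below are assumptions.

```kotlin
// Sketch only: estimating the distance to a wave source from the arrival-time
// difference between the optical reference signal and the ultrasonic wave.
const val SPEED_OF_SOUND_M_PER_S = 343.0  // speed of sound in air at roughly 20 degrees C

fun distanceToWaveSource(lightArrivalNs: Long, ultrasoundArrivalNs: Long): Double {
    // Light reaches the optical sensor almost instantly, so the extra delay of the
    // ultrasonic wave approximates its time of flight from the source.
    val timeOfFlightSeconds = (ultrasoundArrivalNs - lightArrivalNs) / 1_000_000_000.0
    return SPEED_OF_SOUND_M_PER_S * timeOfFlightSeconds
}
```

With several ultrasonic sensors, distances obtained this way can be combined (for example, by trilateration) to estimate the position of the source.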

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of a sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photodiodes and transistors (TRs) in rows and columns, and scans the content placed on the photo sensor using an electrical signal that changes according to the amount of light applied to the photodiodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change in light, and position information of the sensing object can be obtained through this calculation.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running on the mobile terminal 100, or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

Also, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 can generate various tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, the spraying or suction force of air through an injection or suction port, brushing against the skin surface, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 153 can not only transmit a tactile effect through direct contact, but can also be implemented so that the user can feel a tactile effect through the muscle sensation of a finger or arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output from the light output unit 154 is implemented as the mobile terminal emitting light of a single color or a plurality of colors toward the front or rear surface. The signal output may be terminated when the mobile terminal detects that the user has confirmed the event.

The interface unit 160 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, receives power and supplies it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various information for authenticating the usage authority of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter, "identification device") may be manufactured in the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 160.

The interface unit 160 may serve as a path through which power from a cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to an external cradle, or as a path through which various command signals input by the user from the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 170 may store data related to vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 170 over the Internet.

Meanwhile, as described above, the control unit 180 controls the operations related to the application programs and, typically, the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the control unit 180 can execute or release a lock state that restricts the input of a user's control command to applications.

In addition, the control unit 180 performs control and processing related to voice calls, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Furthermore, the control unit 180 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the control unit 180 and supplies the power necessary for the operation of each component. The power supply unit 190 includes a battery, and the battery may be an internal battery configured to be rechargeable and may be detachably coupled to the terminal body for charging and the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of an interface 160 through which an external charger for supplying power for charging the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery wirelessly without using the connection port. In this case, the power supply unit 190 may receive power from an external wireless power transmitter using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

The mobile terminal according to the present invention can share data with a plurality of parties by simple input by providing a plurality of menus required for data sharing in a floating manner without switching the currently displayed screen during data sharing.

FIG. 2 is a control flowchart of a mobile terminal according to the present invention.

Referring to FIG. 2, when a preset user input is received, the control unit (180 in FIG. 1) of the mobile terminal (100 in FIG. 1) displays a floating icon for data sharing on the uppermost layer of the touch screen (S110).

When providing the floating icon, the control unit 180 displays a background screen on the lowest layer of the display unit 151, displays a plurality of menu icons or an application task window on the background screen, and displays the floating icon on the uppermost layer of the screen. Accordingly, since the floating menu for data sharing is displayed on the uppermost layer over any screen, such as an idle screen or an application execution screen, the user can perform the terminal's data sharing function through this user interface by using the floating icon.
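
The following is a minimal sketch, not part of the patent, of how a floating icon on the uppermost layer might be realized on a present-day Android-style platform; the overlay window type and the required "draw over other apps" permission are assumptions about the target platform.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.View
import android.view.WindowManager

// Sketch: attach a view to the topmost window layer so it floats over the idle
// screen or any running application, as described for the floating icon.
// Requires the SYSTEM_ALERT_WINDOW permission on modern Android (an assumption).
fun showFloatingIcon(context: Context, iconView: View) {
    val windowManager = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val params = WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,  // uppermost layer
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,        // do not steal input focus
        PixelFormat.TRANSLUCENT
    ).apply {
        gravity = Gravity.TOP or Gravity.END                  // e.g., upper-right corner
    }
    windowManager.addView(iconView, params)
}
```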

The floating icon includes icons for setting the plurality of functions required for data sharing, for example, a storage box icon for storing the information set for data sharing, an application icon, an address book icon, and a preview icon.

Thereafter, the control unit 180 generates a storage box in which information for data sharing is registered according to the user's selection through the floating icon (S120). Typically, in order to share data, the address of the recipient, the application to be used for sharing, and the data itself must be selected. Thus, the storage box displays the series of information selected by the user for data sharing, such as the selected item, the application (App) used for sharing, and the address information of the receiving side. Such a storage box may be displayed or hidden on the top layer of the screen together with the floating icon.
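
Purely as an illustration of the kind of information the storage box gathers, a possible data structure is sketched below; the type and field names are assumptions and do not appear in the patent.

```kotlin
// Sketch of the data sharing setting information collected in the storage box:
// the shared items, the application to share them with, and the recipient addresses.
data class DataSharingSettings(
    val sharedItems: MutableList<String> = mutableListOf(),       // e.g., selected text, a file, location info
    var sharingApp: String? = null,                               // application chosen from the floating menu
    val recipientAddresses: MutableList<String> = mutableListOf() // phone numbers, e-mails, SNS accounts
) {
    // Sharing can be executed once an item, an application, and at least one address are registered.
    fun isComplete(): Boolean =
        sharedItems.isNotEmpty() && sharingApp != null && recipientAddresses.isNotEmpty()
}
```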

The control unit 180 displays the menu corresponding to the floating icon selected by the user in a floating manner (S130). Accordingly, the user can set the data sharing setting information by selecting an item in the floating menu and dragging it into the storage box.

Thereafter, the items selected in the floating menu are registered in the storage box 400 (S140), and the data is then transmitted according to the data sharing setting information set in the storage box (S150).

Meanwhile, the floating icon, the floating menu, and the storage box displayed in the course of the above-described process can all be displayed on the uppermost layer. Accordingly, the user can easily share the data of the currently displayed screen with other users.

FIG. 3 is a conceptual diagram of a floating menu provided when data is shared according to the control flow of the mobile terminal of FIG. 2.

When a preset user input for item sharing is received while a list of items is displayed on the screen 200, the control unit 180 displays a floating icon 300 for item sharing on the screen 200. The user can select item sharing by performing an operation such as touching the bezel portion of the mobile terminal, and the control unit 180, upon receiving this user input, displays the floating icon 300 on the uppermost layer of the screen 200.

The floating icon 300 may include a storage box icon 300a, an application icon 300b, an address book icon 300c, and a preview icon 300d necessary for data sharing. The number of icons constituting the floating icon 300, the types of icons, and the like can be variously modified and applied.

When the storage box icon 300a is selected from among the floating icons 300, the storage box 400 is displayed on the uppermost layer of the screen 200. In the storage box 400, the data sharing setting information set by the user, for example, the items to be shared, the addresses of the recipients, and the application to be used for sharing, is registered.

In order to register information in the storage box 400, a method of selecting a desired item with a long click and dragging the selected item into the storage box 400 may be applied, and various other selection methods such as touch selection and voice commands can also be applied. When the user long-clicks item 4 (Item 4) in the item list displayed on the screen 200 and drags it into the storage box 400, item 4 is registered and displayed in the storage box 400.
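
A minimal sketch of such long-click-and-drag registration, assuming an Android-style drag-and-drop API, is shown below; the list passed in and all names are assumptions introduced only for illustration.

```kotlin
import android.content.ClipData
import android.view.DragEvent
import android.view.View

// Sketch: long-clicking a list item starts a drag carrying its label, and dropping
// it on the storage box view registers the item for sharing.
fun enableDragToStorageBox(itemView: View, itemLabel: String,
                           storageBoxView: View, registeredItems: MutableList<String>) {
    // A long click on the item starts the drag operation.
    itemView.setOnLongClickListener { view ->
        val dragData = ClipData.newPlainText("sharedItem", itemLabel)
        view.startDragAndDrop(dragData, View.DragShadowBuilder(view), null, 0)
        true
    }
    // Dropping the dragged item on the storage box registers it there.
    storageBoxView.setOnDragListener { _, event ->
        if (event.action == DragEvent.ACTION_DROP) {
            val label = event.clipData.getItemAt(0).text.toString()
            registeredItems.add(label)  // the item now appears in the storage box
        }
        true
    }
}
```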

Thereafter, when the application icon 300b of the floating icon 300 is selected, a floating menu 500 for selecting an application is displayed in one area of the screen 200. Here, the applications are applications that can be used to share the item, and may include programs for connecting to social networks such as Facebook, Twitter, and the like.

As shown in FIG. 3(b), when a predetermined application is selected in the floating menu 500, the corresponding application is registered and displayed in the storage box 400. For example, when application 2 (App2) is selected in the floating menu 500 or dragged into the storage box 400 in a drag-and-drop manner, application 2 is registered and displayed in the storage box 400. That is, it is set to share item 4 (Item 4) using application 2 (App2).

When application 2 (App2) is dragged from the floating menu 500 into the storage box 400, the position of application 2 is vacated, and an animation effect in which the menu items from application 3 (App3) downward move up to fill the vacated place is applied, so that the user can intuitively select the menu.

When the address book icon 300c of the floating icon 300 is selected, a floating menu 500 for selecting an address is displayed in one area of the screen 200. The address book may include phone numbers, e-mail addresses, social network service accounts, and the like for sharing data with the other party, depending on the application selected for sharing the item.

The user can select the users with whom item 4 (Item 4) will be shared in the floating menu 500 for selecting the address book. As shown in FIG. 3(c), the user can select the users to share with by dragging a plurality of addresses, such as address book 2 and address book 5, from the floating menu 500 into the storage box 400.

When the users to share with are selected, address book 2 and address book 5 are registered and displayed in the storage box 400 as shown in FIG. 3(d). As address book 2 and address book 5 are dragged from the floating menu 500 into the storage box 400, their positions are vacated, and an animation effect in which the remaining address book entries move up to fill the vacated places in a "pushing up" manner is applied, so that the user can intuitively select the menu.

When the input set for executing the sharing is performed, such as long-clicking the storage box 400 or dragging the storage box 400 in one direction, the data is shared according to the data sharing setting information registered in the storage box 400. In the case of the storage box 400 shown in FIG. 3(d), item 4 (Item 4) is shared with the users of address book 2 and address book 5 using application 2 (App2).
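
As a rough sketch only, the execution step could iterate over the registered addresses and hand the item to the selected application; the StorageBox type and the sendVia callback below are assumptions introduced for illustration, not part of the patent.

```kotlin
// Sketch: transmit the registered item to every registered address through the selected app.
data class StorageBox(
    val item: String,           // e.g., "Item 4"
    val app: String,            // e.g., "App2"
    val addresses: List<String> // e.g., ["Address Book 2", "Address Book 5"]
)

fun executeSharing(box: StorageBox, sendVia: (app: String, address: String, item: String) -> Unit) {
    for (address in box.addresses) {
        sendVia(box.app, address, box.item)
    }
}

// Usage with a stubbed transport:
fun main() {
    val box = StorageBox("Item 4", "App2", listOf("Address Book 2", "Address Book 5"))
    executeSharing(box) { app, address, item ->
        println("Sharing '$item' with $address via $app")
    }
}
```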

As described above, according to the present invention, when data sharing is selected through a predetermined user input on a screen 200 displaying a predetermined item, a floating icon 300 for item sharing is displayed on the uppermost layer of the screen 200. The floating icon 300 may include a storage box icon 300a, an application icon 300b, an address book icon 300c, and a preview icon 300d necessary for data sharing.

When the storage box icon 300a of the floating icon 300 is selected, the storage box 400 is displayed on the uppermost layer of the screen 200, and data sharing conditions such as the items to be shared, the addresses of the recipients, and the application to be used for sharing can be registered in the storage box 400, confirmed, and transmitted.

When the application icon 300b or the address book icon 300c of the floating icon 300 is selected, the floating menu 500 corresponding to the selected icon is displayed on the uppermost layer of the screen 200. Accordingly, the application to be used for sharing and the addresses of the recipients can be selected in the floating menu 500 and registered in the storage box 400.

When the preview icon 300d of the floating icon 300 is selected, the screen that will result when item 4 (Item 4) is shared with the users of address book 2 and address book 5 using application 2 (App2) can be previewed.

Thereafter, when the input set for executing the sharing is performed, such as long-clicking the storage box 400 or dragging the storage box 400 in one direction, the data can be shared according to the data sharing setting information registered in the storage box 400.

FIG. 4 is a diagram for explaining a process of deleting an item selected in the floating menu 500 provided in the mobile terminal according to the present invention, and illustrates a process of deleting information registered in the storage box 400 from the state shown in FIG. 3(d).

Data sharing conditions such as the item to be shared, the addresses of the recipients, and the application to be used for sharing are registered in the storage box 400, and the registered conditions can be selectively deleted. To selectively delete specific information in the storage box 400, the user can long-click the information to be deleted and drag it out of the storage box 400.

In the storage box 400, item 4 (Item 4) as the item to be shared, application 2 (App2) as the application to be used for sharing, and address book 2 and address book 5 as the recipient addresses are registered. Here, when the user wishes to delete "address book 2," the user can delete that information by long-clicking "address book 2" and dragging it out of the storage box 400.

When deletion of address book 2 is selected, address book 2 is deleted from the storage box 400 and returns to its original place in the floating menu 500, as shown in (f). Here, an animation effect in which the remaining address book entries from address book 3 onward, which had occupied the place of address book 2, move back in a "pushing down" manner is applied, so that the user can intuitively select the menu.

FIGS. 5 and 6 illustrate a process of sharing text using the floating menu 500 provided in the mobile terminal according to the present invention. FIG. 5 illustrates a process of selecting and transmitting the text to be shared, and FIG. 6 illustrates a process of using the shared text in a receiving-side mobile terminal.

Referring to FIG. 5(a), when a user selects a partial area S of the text to be shared on the screen 210, the selected text S is chosen as the data to be shared. Here, the text may include not only general document text but also information such as a telephone number or a URL.

The control unit 180 receives an input of long-clicking the area S of the text and displays a floating icon 300 for item sharing on the screen 210, as shown in (b). In addition, the control unit 180 may display the selected area so as to be distinguished from the other text areas, and may display the floating icon 300 on the selected text area.

If the user drags the text area to the address book icon 300c while keeping it long-clicked, a floating menu 500b for selecting an address is displayed on the uppermost layer of the screen 210 in one area of the screen 210, as shown in FIG. 5(c).

The user can select the users with whom the selected text will be shared by dragging the address book icon 300c to the floating menu 500b while keeping it clicked. Accordingly, the controller 180 may transmit the text selected by the user to the address book entries selected in the floating menu 500b and share the selected text with the other users.
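Reduced to its essentials, the text-sharing flow is: capture the selected span, pick recipients in the floating menu, and fan the text out to them. The framework-agnostic sketch below uses assumed names and example values only:

```kotlin
// Hypothetical sketch of the text-sharing flow: a long-clicked text selection is
// dragged onto the address book icon, recipients are picked from the floating
// menu, and the selected text is sent to each chosen address.
data class TextSelection(val fullText: String, val start: Int, val end: Int) {
    val selected: String get() = fullText.substring(start, end)
}

fun interface TextSender {
    fun send(text: String, recipient: String)
}

fun shareSelectedText(
    selection: TextSelection,
    chosenRecipients: List<String>,   // picked in the floating menu 500b
    sender: TextSender
) {
    for (recipient in chosenRecipients) {
        sender.send(selection.selected, recipient)
    }
}

fun main() {
    // Example values only: a dummy phone number embedded in a longer text.
    val selection = TextSelection("Call me at 010-1234-5678 tomorrow", start = 11, end = 24)
    shareSelectedText(selection, listOf("Address Book 2", "Address Book 5")) { text, to ->
        println("Sharing \"$text\" with $to")
    }
}
```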

As described above, according to the embodiment of the present invention, the user can display a floating icon 300 for sharing on the screen 200 merely by selecting and long-clicking a partial area S of the text, and can complete the process of selecting the other party without switching screens or entering a menu.

FIG. 6 is a diagram for explaining the process of using the shared text in the recipient mobile terminal.

When the text is shared through the series of processes of FIG. 5, the receiving-side mobile terminal displays the shared-information receiving icon S1. The receiving icon S1 is displayed on the uppermost layer of the screen H of the receiving-side mobile terminal, and thumbnail information of the received information can be displayed.

As shown in (a), the receiving icon S1 may be displayed on the uppermost layer of the home screen H of the mobile terminal. Here, the receiving icon S1 is provided in the form of a floating icon 300, is always displayed on the uppermost layer, and can be dragged by the user to change its position.

As shown in (b), the user can touch the receiving icon S1, which is provided in a floating manner, and move it to the input window 220 of an application that can edit text. For example, the receiving icon S1 can be touched and moved to the screen on which a memo application is running.

As shown in (c), if the receiving icon S1 is dropped on the screen 220 on which the input window is displayed, the received text can be automatically copied and inserted into the input window.
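Dropping a floating icon onto an editable field and inserting its payload maps naturally onto a platform drag-and-drop mechanism; the listener below is a hedged Android-style sketch of that idea (the helper name is an assumption), not the implementation claimed here:

```kotlin
import android.view.DragEvent
import android.view.View
import android.widget.EditText

// Hypothetical sketch: if the floating receive icon starts a drag whose ClipData
// carries the shared text, an EditText can accept the drop and insert the text
// at the current cursor position automatically.
fun attachSharedTextDropTarget(input: EditText) {
    input.setOnDragListener { _: View, event: DragEvent ->
        when (event.action) {
            DragEvent.ACTION_DRAG_STARTED -> true // accept the drag
            DragEvent.ACTION_DROP -> {
                val droppedText = event.clipData?.getItemAt(0)?.text
                if (droppedText != null) {
                    val cursor = input.selectionStart.coerceAtLeast(0)
                    input.text.insert(cursor, droppedText) // paste into the input window
                }
                true
            }
            else -> true
        }
    }
}
```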

FIGS. 7 and 8 are diagrams for explaining a process of sharing location information using the floating menu 500 provided in the mobile terminal according to the present invention. FIG. 7 illustrates the process of transmitting location information, and FIG. 8 illustrates the process of using the shared location information in the receiving-side mobile terminal.

Referring to FIG. 7(a), when a user clicks a point to be shared on the screen 230 on which a map g is displayed, the map g including the point, that is, the location information, is selected as the data to be shared.

Upon receiving the input selecting location information sharing, the control unit 180 displays a floating icon 300 for sharing on the screen 230, as shown in (b).

When the user drags the selected point to the address book icon 300c while keeping it clicked, a floating menu 500b for selecting an address is displayed on the uppermost layer of the screen 230 in one area of the screen 230.

The user can select the users with whom the selected location information will be shared by dragging the address book icon 300c to the floating menu 500b while keeping it clicked.

The control unit 180 may transmit the location information selected by the user according to the address book selected in the floating menu 500b and share the selected location information with the other user.

As described above, according to the embodiment of the present invention, the user can display the floating icon 300 on the screen 230 on which the map g is displayed merely by long-clicking a point, and can complete the series of processes of displaying the location information and selecting the other party with whom to share it, without switching screens or entering a menu.

FIG. 8 is a diagram for explaining the process of using the shared location information in the receiving-side mobile terminal.

When the location information is shared through the process of FIG. 7, the receiving-side mobile terminal displays the shared-information receiving icon g1. The receiving icon g1 is displayed on the uppermost layer of the screen 240 of the receiving-side mobile terminal, and thumbnail information of the received information can be displayed.

As shown in (a), the receiving icon g1 may be displayed on the uppermost layer of the screen 240 of the receiving-side mobile terminal. The receiving icon g1 is provided in the form of a floating icon so that the user can drag it to change its position. The receiving icon g1 may be displayed as thumbnail information indicating the type of the received information, or may be displayed as characters, images, graphics, or the like that describe the type of information.

As shown in (b), the user can select the receiving icon g1 to use the corresponding information. When the user touches the receiving icon g1, an application for using the shared information, that is, the location information, can be executed. Then, as shown in (c), a menu 242 for location information sharing can be executed. Various menus using the location information, such as a map view and traffic information, may be provided in the menu 242, and a location search menu may also be provided.

When the "location search" menu is selected in the menu 242 for location information sharing, a map g containing location information transmitted by the transmitting user as shown in (d) may be displayed.

FIGS. 9 and 10 illustrate a process of sharing schedule information using the floating menu 500 provided in the mobile terminal according to the present invention. FIG. 9 illustrates the process of transmitting schedule information, and FIG. 10 illustrates the process of using the shared schedule information in the recipient mobile terminal.

Referring to FIG. 9A, when a user selects a desired date to be shared on the screen 250 on which the schedule information is displayed, the schedule information of the corresponding date is selected as the sharing target data.

Upon receiving the input selecting schedule information sharing, the controller 180 displays a floating icon 300 for sharing the schedule information c of the selected date on the screen 250, as shown in (b).

When the user drags the selected date to the address book icon 300c while keeping it clicked, a floating menu 500b for selecting an address is displayed on the uppermost layer of the screen 250 in one area of the screen 250.

The user can select the users with whom the schedule information c of the selected date will be shared by dragging the address book icon 300c to the floating menu 500b.

The control unit 180 may transmit the schedule information c of the date selected by the user according to the address book selected in the floating menu 500b and share the schedule information with the counterpart user.

FIG. 10 is a diagram for explaining the process of using the shared schedule information in the receiving-side mobile terminal.

When the schedule information is shared through the series of processes of FIG. 9, the receiving-side mobile terminal displays the shared-information receiving icon c1. The receiving icon c1 is displayed on the uppermost layer of the screen 260 of the receiving-side mobile terminal, and thumbnail information of the received information can be displayed.

As shown in (a), the receiving icon c1 may be provided in the form of a floating icon on the uppermost layer of the screen 260 of the receiving-side mobile terminal. The receiving icon c1 may be displayed as thumbnail information indicating the type of the received information, or may be displayed as characters, images, graphics, or the like that describe the type of information.

As shown in (b), the user can select the receiving icon c1 to use the corresponding information. When the user touches the receiving icon c1, an application for using the received schedule information can be executed.

Then, as shown in (c), the schedule management menu of the receiving side mobile terminal can be executed. Accordingly, the shared schedule information is displayed through the schedule management menu of the receiving-side mobile terminal, so that the transmitting-side user and the receiving-side user can share the schedule information.
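On Android-style terminals, handing the received schedule entry to the schedule management application could, for example, use the standard calendar insert intent. A hedged sketch with illustrative names only:

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.CalendarContract

// Hypothetical sketch: pass the shared schedule entry to the device's schedule
// management application via the standard calendar insert intent.
fun addSharedScheduleEntry(context: Context, title: String, startMillis: Long, endMillis: Long) {
    val intent = Intent(Intent.ACTION_INSERT)
        .setData(CalendarContract.Events.CONTENT_URI)
        .putExtra(CalendarContract.Events.TITLE, title)
        .putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, startMillis)
        .putExtra(CalendarContract.EXTRA_EVENT_END_TIME, endMillis)
    context.startActivity(intent) // the schedule manager shows the shared entry for confirmation
}
```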

FIG. 11 is a view for explaining a process of sharing video capture information using the floating menu 500 provided in the mobile terminal according to the present invention.

Referring to FIG. 11(a), a user can capture an image to be shared while viewing a moving picture. The capturing method may differ depending on the moving picture reproducing application; for example, an image can be captured by long-clicking the screen being reproduced or by selecting a separate capture button. When the user captures a video image to be shared, the image can be selected as the sharing target data.

Upon receiving the input selecting capture image sharing, the control unit 180 displays the floating icon 300 for item sharing on the uppermost layer of the screen 270, as shown in (b).

When the storage icon 300a of the floating icon 300 is selected, the storage box 400 is displayed on the uppermost layer of the screen 270, and the sharing conditions of the captured image, such as the application to be used and the recipient addresses, can be registered in the storage box 400, confirmed, and transmitted.

When the address book icon of the floating icon 300 is selected, a floating menu 500 for selecting an address book is displayed on the screen 270. Accordingly, an address book entry in the floating menu 500 can be dragged and dropped into the storage box 400 to be registered.

As shown in (c), when the capture icon 300d among the floating icons 300 is selected, the captured image I can be confirmed on the screen.

FIG. 12 is a diagram for explaining a process of sharing game information using the floating menu 500 provided in the mobile terminal according to the present invention.

FIG. 12(a) illustrates the screen of the mobile terminal of a user sharing game information. As shown in (a), when the user performs an input for sharing game information on the game execution screen 230, the control unit 180 may display a floating menu 500 for sharing the game information. The control unit 180 may provide a separate menu, icon, or the like for selecting game information sharing, or may receive the game information sharing selection input through various other user input methods, such as touching the bezel portion of the mobile terminal.

Upon receiving the input selecting game information sharing, the control unit 180 displays a floating menu 500 for selecting the users with whom to share the game information. When the user selects an address book entry in the floating menu 500, the controller 180 may transmit the game information according to the address book selected in the floating menu 500b and share the game information with the other user.

FIG. 12(b) illustrates the process of using the game information in the receiving-side mobile terminal. As shown in (b), a shared-information receiving icon V1 is displayed on the mobile terminal that has received the game information.

The receiving icon V1 may be displayed in the floating icon format on the top layer of the screen 290 of the receiving mobile terminal. The receiving icon V1 may be displayed as thumbnail information so as to indicate the type of the received information, or may be displayed as characters, images, graphics, etc. that describe the type of information.

The receiving-side user can share the game information by selecting the receiving icon V1. When the receiving icon V1 is selected, if the corresponding game is installed on the receiving-side mobile terminal, a message such as "[xx game] together?" may be displayed.

On the other hand, when the corresponding game is not installed on the receiving-side mobile terminal, a message such as "Move to [xx game] installation screen?" may be displayed.
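The installed-or-not branch described above can be approximated by checking whether the game's package resolves to a launch intent and, failing that, opening its store listing. This is a hypothetical sketch; the package handling and the points where prompts would appear are assumptions:

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical sketch: when the receive icon is selected, either offer to launch
// the game (if installed) or move to its installation screen (if not).
fun handleGameInvite(context: Context, gamePackage: String) {
    val launchIntent = context.packageManager.getLaunchIntentForPackage(gamePackage)
    if (launchIntent != null) {
        // Game is installed: in the UI, this is where the "play together?" prompt would appear.
        context.startActivity(launchIntent)
    } else {
        // Game is missing: move to the installation screen (store listing).
        val marketIntent = Intent(Intent.ACTION_VIEW, Uri.parse("market://details?id=$gamePackage"))
        context.startActivity(marketIntent)
    }
}
```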

As described above, according to one embodiment of the present invention, when the user touches the bezel portion on the game execution screen 230 or performs an input such as long-clicking the game execution screen 230, a floating menu 500 for sharing can be displayed, and the process of selecting a partner with whom to share can be completed without switching screens or entering a menu.

FIG. 13 is a diagram illustrating a process of providing an invitation message using the floating menu 500 provided in the mobile terminal according to the present invention.

As shown in (a), when the user performs an input for inviting a friend on the game screen 280, the control unit 180 may display a floating menu 500 for inviting friends. The control unit 180 displays the floating menu 500 for selecting the address book entries of the friends to be invited on the uppermost layer of the game screen 280.

The user can select the addresses of the friends to be invited by long-clicking Address Book 1, Address Book 3, and Address Book 5 in the floating menu 500 and dragging them to the storage box 400.

Accordingly, Address Book 1, Address Book 3, and Address Book 5 are registered and displayed in the storage box 400 and are deleted from the floating menu 500. When Address Book 1, Address Book 3, and Address Book 5 are dragged from the floating menu 500 to the storage box 400, their positions in the floating menu are cleared, and an animation effect is applied in which the remaining address books move into the vacated places, so that the user can intuitively use the menu.

Thereafter, transmission of the game invitation message can be selected by dragging the address book entries registered in the storage box 400 onto the game screen 280. In addition, when a predetermined input is performed, such as long-clicking the addresses registered in the storage box 400 or dragging the storage box 400, the game invitation message is transmitted according to the address book registered in the storage box 400.
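The invitation step itself is a fan-out over the addresses registered in the storage box. A small framework-agnostic sketch under assumed names:

```kotlin
// Hypothetical sketch: once friends' address book entries have been dragged into
// the storage box, sending the invitation is a fan-out over the registered list.
fun interface InvitationSender {
    fun send(address: String, message: String)
}

fun sendGameInvitations(
    registeredAddresses: List<String>,  // entries dragged into the storage box
    message: String,
    sender: InvitationSender
): Int {
    registeredAddresses.forEach { sender.send(it, message) }
    return registeredAddresses.size     // could back a transmission result message
}

fun main() {
    val count = sendGameInvitations(
        registeredAddresses = listOf("Address Book 1", "Address Book 3", "Address Book 5"),
        message = "Join my game!"
    ) { to, msg -> println("Inviting $to: $msg") }
    println("Invitations sent: $count")
}
```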

When the user selects transmission of the invitation message, the control unit 180 may transmit the game invitation message according to the address book registered in the storage box 400 and then display a transmission result message 282.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and the like, and the invention may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
120: Input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit

Claims (15)

1. A mobile terminal comprising:
a touch screen; and
a control unit configured to display, on an uppermost layer of the touch screen screen, a floating icon that receives data sharing setting information including sharing target data and a shared address when a preset user input is received, to receive the data sharing setting information input through the floating icon, and to transmit the sharing target data to the shared address according to the data sharing setting information.

2. The mobile terminal of claim 1, wherein the control unit generates a storage box for registering the data sharing setting information and displays the generated storage box on the same layer as the floating icon.

3. The mobile terminal of claim 1, wherein the control unit displays a floating menu corresponding to the floating icon on the same layer as the floating icon when the floating icon is selected.

4. The mobile terminal of claim 3, wherein the floating icon includes an application selection menu icon, an address book icon, and a preview icon.

5. The mobile terminal of claim 2, wherein the control unit displays a floating menu corresponding to the floating icon when the floating icon is selected, and registers the data sharing setting information in response to an operation of dragging an item of the floating menu to the storage box.

6. The mobile terminal of claim 5, wherein the control unit deletes the corresponding data sharing setting information upon receiving an operation of dragging a menu item registered in the storage box out of the storage box.

7. The mobile terminal of claim 6, wherein the control unit deletes the corresponding item from the floating menu when one of the items of the floating menu is dragged into the storage box, and displays an item of the floating menu dragged out of the storage box at its corresponding position in the floating menu.

8. The mobile terminal of claim 1, wherein the control unit displays the floating icon on a touch screen screen on which location information is displayed, and transmits the location information to the address input through the floating icon.

9. A method of controlling a mobile terminal, the method comprising:
displaying a floating icon that receives data sharing setting information including sharing target data and a shared address on an uppermost layer of a touch screen screen when a preset user input is received;
receiving the data sharing setting information input through the floating icon; and
transmitting the sharing target data to the shared address according to the data sharing setting information.

10. The method of claim 9, wherein the receiving of the data sharing setting information input through the floating icon comprises generating a storage box for registering the data sharing setting information and displaying the generated storage box on the same layer as the floating icon.

11. The method of claim 9, wherein the receiving of the data sharing setting information input through the floating icon comprises displaying a floating menu corresponding to the floating icon on the same layer as the floating icon when the floating icon is selected.

12. The method of claim 10, wherein the receiving of the data sharing setting information input through the floating icon comprises:
displaying a floating menu corresponding to the floating icon on the same layer as the floating icon when the floating icon is selected; and
registering the data sharing setting information in response to an operation of dragging an item of the floating menu to the storage box.

13. The method of claim 12, further comprising deleting the corresponding item from the floating menu and updating the displayed floating menu when one of the items of the floating menu is dragged to the storage box.

14. The method of claim 12, further comprising deleting the corresponding data sharing setting information upon receiving an operation of dragging a menu item registered in the storage box out of the storage box.

15. The method of claim 14, further comprising restoring an item dragged out of the storage box to its corresponding position in the floating menu and displaying the restored item.
KR1020140142719A 2014-10-21 2014-10-21 Mobile terminal and method for controlling the same KR20160046593A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140142719A KR20160046593A (en) 2014-10-21 2014-10-21 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140142719A KR20160046593A (en) 2014-10-21 2014-10-21 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20160046593A true KR20160046593A (en) 2016-04-29

Family

ID=55915727

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140142719A KR20160046593A (en) 2014-10-21 2014-10-21 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20160046593A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110703974A (en) * 2019-09-26 2020-01-17 珠海市小源科技有限公司 Message interaction method, device and storage medium
WO2022005050A1 (en) * 2020-07-01 2022-01-06 (주)버즈빌 Service control method and user terminal for providing shared handle-based interface
KR20220003308A (en) * 2020-07-01 2022-01-10 (주)버즈빌 Service control method for providing interface based on shared handle, and user terminal therefor
KR20220071486A (en) * 2020-11-24 2022-05-31 주식회사 엔씨소프트 Method and apparatus for generating a game party

Similar Documents

Publication Publication Date Title
KR20180048142A (en) Mobile terminal and method for controlling the same
KR20180016131A (en) Mobile terminal and method for controlling the same
KR20150130053A (en) Mobile terminal and method for controlling the same
KR20160022147A (en) Mobile terminal, glass type terminal and mutial interworking method using screens thereof
KR20170016165A (en) Mobile terminal and method for controlling the same
KR20160009976A (en) Mobile terminal and method for controlling the same
KR20160026244A (en) Mobile terminal and deleted information managing method thereof
KR20180020452A (en) Terminal and method for controlling the same
KR20160091780A (en) Mobile terminal and method for controlling the same
KR20170001329A (en) Mobile terminal and method for controlling the same
KR20180135698A (en) Mobile terminal and method for controlling the same
KR20180017638A (en) Mobile terminal and method for controlling the same
KR20170115863A (en) Mobile terminal and method for controlling the same
KR20160046593A (en) Mobile terminal and method for controlling the same
KR20170019248A (en) Mobile terminal and method for controlling the same
KR20150145893A (en) Mobile terminal and the control method thereof
KR20150088596A (en) Mobile terminal and emotional message displaying method thereof
KR102225133B1 (en) Mobile terminal and method for controlling the same
KR20170108715A (en) Mobile terminal and method for controlling the same
KR20170055225A (en) Mobile terminal and method for controlling the same
KR20170039994A (en) Mobile terminal and method for controlling the same
KR20150123117A (en) Mobile terminal and method for controlling the same
KR20150094355A (en) Mobile terminal and controlling method thereof
KR20180079051A (en) Mobile terninal and method for controlling the same
KR20170047059A (en) Mobile terminal for displaying information and controlling method thereof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination