WO2015053506A1 - Mobile terminal and control method thereof - Google Patents

Mobile terminal and control method thereof

Info

Publication number
WO2015053506A1
WO2015053506A1 (PCT/KR2014/009207)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
output
mobile terminal
graphic object
region
Prior art date
Application number
PCT/KR2014/009207
Other languages
English (en)
Inventor
Sangwon Kim
Yunsun Choi
Ahreum KIM
Kibong Song
Original Assignee
Lg Electronics Inc.
Priority date
Filing date
Publication date
Priority claimed from KR1020130119985A external-priority patent/KR102068799B1/ko
Priority claimed from KR20130139286A external-priority patent/KR20150041546A/ko
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Priority to CN201480055385.XA priority Critical patent/CN105612487B/zh
Priority to US14/915,838 priority patent/US20160196058A1/en
Priority to EP14852691.6A priority patent/EP3056061B1/fr
Publication of WO2015053506A1 publication Critical patent/WO2015053506A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to a mobile terminal capable of sensing a touch applied thereto and a control method thereof.
  • Terminals may be divided into mobile terminals and stationary terminals according to whether or not they are movable.
  • mobile terminals may be divided into handheld terminals and vehicle mount terminals according to whether or not users can directly carry them around.
  • the mobile terminal may support more complicated functions such as capturing images or video, reproducing music or video files, playing games, receiving broadcast signals, and the like.
  • the mobile terminal may be embodied in the form of a multimedia player or device.
  • a user intuitive interface may include an interface through a touch. Thus, the development of various interfaces through touches may be required.
  • an object of the present invention is to provide a method for controlling screen information output to a display unit by using a multi-touch input.
  • Another object of the present invention is to provide a method for controlling a plurality of divided regions of a display unit in different manners by using a multi-touch input.
  • Another object of the present invention is to provide a method for utilizing a function key associated with an operation of a mobile terminal whose entire front surface is configured as a display unit.
  • a mobile terminal including: a display unit configured to output a graphic object; a sensing unit configured to sense a touch input applied to the display unit; and a controller configured to, when a first touch input applied to the display unit and a second touch input different from the first touch input are sensed in a state in which the first touch input is maintained, output at least one different graphic object related to a graphic object maintained to be output to at least one region of the display unit, in response to the second touch input in a state in which output of the graphic object to the region to which the first touch input has been applied is maintained.
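The control flow claimed above (a maintained first touch that pins a graphic object, plus a second touch that brings out related objects) can be sketched as a small state machine. The following Python sketch is purely illustrative; the patent defines no code, and all class and method names are assumptions.

```python
class TouchController:
    """Illustrative model of the claimed behavior: while a first touch is
    maintained, the object under it stays output, and each second touch
    outputs a different related graphic object to another region."""

    def __init__(self, related_objects):
        self.related = list(related_objects)  # objects related to the pinned one
        self.first_touch_held = False
        self.pinned_object = None             # kept on screen by the first touch
        self.second_region = None             # shown in response to second touches
        self._next = 0

    def first_touch(self, current_object):
        """A first touch pins the currently output graphic object."""
        self.first_touch_held = True
        self.pinned_object = current_object

    def second_touch(self):
        """While the first touch is maintained, each second touch outputs a
        different related graphic object; otherwise nothing happens."""
        if not self.first_touch_held or not self.related:
            return None
        self.second_region = self.related[self._next % len(self.related)]
        self._next += 1
        return self.second_region

    def release_first_touch(self):
        self.first_touch_held = False
```

For example, with `"photo_1"` pinned, repeated second touches would cycle through related photos while `"photo_1"` remains output.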
  • the controller may divide the display unit into a plurality of regions, and the graphic object, which has been output to the region to which the first touch input has been applied, may be output to a first region among the plurality of regions and a graphic object different from the graphic object may be output to a second region among the plurality of regions.
  • a graphic object different from the graphic object output to the second region may be output to the second region in response to the second touch input which has been applied again.
  • the first and second touch inputs may be different types of touch input, and after the display unit is divided into a plurality of regions on the basis of the sensed first and second touch inputs, when the second touch input is applied to the first region in a state in which the first touch input has been applied to the second region, the graphic object output to the second region may be maintained to be output by the first touch input and the graphic object output to the first region may be changed to a different graphic object related to the graphic object output to the second region.
  • the controller may magnify the graphic object of the second region in response to the magnifying of the graphic object output to the first region.
  • the display unit may include a plurality of graphic objects, and the controller may fix an output position of the graphic object to which the first touch input is applied, among the plurality of graphic objects, on the basis of the applied first and second touch inputs, and change an output position of a graphic object, excluding the graphic object whose output position is fixed, on the basis of the applied second touch input.
  • when the output position of the graphic object is changed according to the second touch input, the controller may make at least one of the plurality of graphic objects output to the display unit disappear from the display unit, and output at least one new graphic object, which has not been output to the display unit, in response to the disappearance of the at least one graphic object.
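The pinned-position behavior above (one graphic object held in place while a second touch shifts the rest, with the last item leaving the screen and a new one appearing) might be modeled as follows. This is an illustrative sketch, not the patent's implementation; all names are hypothetical.

```python
def scroll_with_pinned(visible, backlog, pinned_index):
    """Shift every item except visible[pinned_index] by one position.
    The last unpinned item disappears from the display; one backlog item
    (a graphic object not yet output) appears at the other end."""
    pinned = visible[pinned_index]
    others = [x for i, x in enumerate(visible) if i != pinned_index]
    leaving = others.pop()                 # disappears from the display unit
    if backlog:
        others.insert(0, backlog.pop(0))   # new graphic object appears
    result = others[:]
    result.insert(pinned_index, pinned)    # pinned object keeps its slot
    return result, leaving
```

For instance, pinning the second of four icons and scrolling once keeps that icon in place while the others shift and one new icon enters.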
  • a plurality of graphic objects may be provided, and the controller may set at least two graphic objects, among the plurality of graphic objects, as a group according to a user selection, and output the graphic objects included in the set group, among the plurality of graphic objects, to the display unit.
  • the group may be set by a third touch input different from the first and second touch inputs with respect to the plurality of graphic objects.
  • the graphic objects included in the set group may exist in the display unit, and when first and second touch inputs applied to the graphic objects included in the set group are sensed, the controller may maintain output of the graphic object to which the first touch input has been applied among the graphic objects included in the set group, and output a graphic object other than the graphic object to which the first touch input has been applied among the graphic objects set as the group.
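The grouping behavior above (a third, distinct touch sets at least two objects as a group; a first-plus-second touch on a group member keeps that member output and brings out the other members) can be sketched as below. All names are assumptions for illustration only.

```python
class GroupedObjects:
    """Illustrative model: groups of graphic objects set by a 'third touch'."""

    def __init__(self):
        self.groups = []                # each group is a set of object ids

    def set_group(self, *objects):
        """The 'third touch' selection: at least two objects form a group."""
        if len(objects) >= 2:
            self.groups.append(set(objects))

    def first_and_second_touch(self, obj):
        """Keep obj on screen and return the other members of its group."""
        for group in self.groups:
            if obj in group:
                return sorted(group - {obj})
        return []
```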
  • a mobile terminal including: a display unit configured to output a plurality of function keys associated with operations of the mobile terminal; and a controller configured to, when a first touch corresponding to a pre-set scheme is applied to any one of the plurality of function keys, process an operation of the mobile terminal related to the function key to which the first touch has been applied, wherein the controller outputs information related to the any one function key to which the first touch has been applied to a region of the display unit, and when a second touch, different from the first touch, is applied to the output information in a state in which the first touch is maintained, the controller executes a function corresponding to the information to which the second touch has been applied.
  • screen information related to an operation of the mobile terminal may be output to the display unit, and the information related to the any one function key may be output to a region of the screen information related to an operation of the mobile terminal in an overlapping manner.
  • the controller may execute a function corresponding to the information related to the any one function key, and output an executed screen of the function corresponding to the information related to the any one function key to the display unit.
  • a plurality of pieces of information related to the any one function key may be provided, and a list corresponding to the plurality of pieces of information may be output to the display unit. When a second touch is applied to the list, at least one of addition of items constituting the list, deletion of items constituting the list, and merging of items constituting the list may be performed.
  • screen information associated with an operation of the mobile terminal may be output to the display unit, and when a second touch is applied to the screen information, the controller may add the screen information to the list.
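The list-editing operations above (adding the current screen information, deleting items, or merging items via a second touch) might look like the following illustrative helper. The action names and call shape are assumptions, not terms from the patent.

```python
def edit_list(items, action, payload=None, indices=None):
    """Illustrative list editing triggered by a second touch while the
    first touch is held: add, delete, or merge items of the list."""
    items = list(items)
    if action == "add":                     # e.g. add current screen information
        items.append(payload)
    elif action == "delete":
        items = [x for i, x in enumerate(items) if i not in set(indices)]
    elif action == "merge":                 # merge selected items into one entry
        merged = "+".join(items[i] for i in indices)
        items = [x for i, x in enumerate(items) if i not in set(indices)]
        items.append(merged)
    return items
```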
  • information related to the function key to which the first touch has been applied may be Web browser history information.
  • in a state in which the first touch is maintained, when a second touch is applied to the history information, the mobile terminal may access a Web site corresponding to the selected history information, and an executed screen of the Web site may be output to the display unit.
  • in a case in which a Web browser application operates and a Web browser executed screen is output to the display unit, when a first touch is applied to any one key having a bookmark function of storing addresses of Web sites that a user frequently visits, among the plurality of function keys, the information related to the function key to which the first touch has been applied may be bookmark information.
  • a method for controlling a mobile terminal including: outputting a plurality of function keys associated with operations of the mobile terminal to a display unit; applying a pre-set type first touch to any one of the plurality of function keys; outputting information related to the any one function key to which the first touch is applied to a region of the display unit; and in a state in which the first touch is maintained, executing a function corresponding to the output information when a second touch different from the first touch is applied to the output information.
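The claimed control method above (a pre-set first touch on a function key outputs related information such as browser history; a second touch on an item, while the first touch is held, executes the corresponding function) can be sketched as follows. The key names and return values are illustrative assumptions.

```python
class FunctionKeyController:
    """Illustrative model of the function-key control method."""

    def __init__(self, key_info):
        self.key_info = key_info      # function key -> list of related info items
        self.held_key = None
        self.shown_info = []

    def first_touch(self, key):
        """Pre-set first touch: output information related to the touched key."""
        if key in self.key_info:
            self.held_key = key
            self.shown_info = list(self.key_info[key])
        return self.shown_info

    def second_touch(self, item):
        """While the first touch is held, a second touch on an item executes
        the function corresponding to that item."""
        if self.held_key is not None and item in self.shown_info:
            return f"execute:{item}"
        return None

    def release(self):
        """Releasing the first touch dismisses the related information."""
        self.held_key = None
        self.shown_info = []
```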
  • a graphic object related to the graphic object may be output to another region of the display unit.
  • the user may compare the graphic objects output to the two regions.
  • output positions of the other graphic objects may be changed.
  • the user may change output of the other remaining graphic objects, while maintaining output of the desired graphic object.
  • a plurality of function keys associated with operations of the mobile terminal may be output to a region of the display unit, whereby the user may use the plurality of function keys by associating them with screen information output to the display unit.
  • FIG. 1 is a block diagram of a mobile terminal disclosed in the present disclosure.
  • FIG. 2A is a front perspective view of the mobile terminal according to an exemplary embodiment of the present invention.
  • FIG. 2B is a rear perspective view of the mobile terminal according to an exemplary embodiment of the present invention.
  • FIG. 3 is a flow chart illustrating a method for controlling output of a graphic object to a display unit by using a multi-touch in the mobile terminal according to an exemplary embodiment of the present invention.
  • FIG. 4 is a conceptual view illustrating the control method of FIG. 3.
  • FIGS. 5A and 5B are conceptual views illustrating changing of a graphic object by a second touch.
  • FIG. 6 is a conceptual view illustrating changing of a graphic object whose output is fixed to the display unit.
  • FIGS. 7A through 7C are conceptual views illustrating a method for editing graphic objects output to first and second regions.
  • FIGS. 8A and 8B are conceptual views illustrating a method for controlling output of a plurality of graphic objects.
  • FIGS. 9A and 9B are conceptual views illustrating a method for grouping at least a portion of a plurality of graphic objects.
  • FIGS. 10A and 10B are conceptual views illustrating a method for using grouped graphic objects.
  • FIG. 11 is a conceptual view illustrating a method for controlling a display unit on which a camera application is being executed.
  • FIG. 12 is a perspective view of a mobile terminal whose entire front surface is covered with a display unit according to an exemplary embodiment of the present invention.
  • FIG. 13 is a flow chart illustrating a control method using a plurality of function keys in the mobile terminal having the entire front surface covered with the display unit according to an exemplary embodiment of the present invention.
  • FIGS. 14A and 14B are conceptual views illustrating the control method of FIG. 13.
  • FIG. 15 is a conceptual view illustrating a way to determine whether to output information according to whether a first touch is sensed.
  • FIGS. 16A through 16C are conceptual views illustrating a method for controlling information output by a second touch.
  • FIG. 17 is a conceptual view illustrating a method for editing an output memo in the mobile terminal in which a memo application is executed.
  • Mobile terminals described in the present invention may include mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (Personal Digital Assistants), PMPs (Portable Multimedia Player), navigation devices, and the like.
  • FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc.
  • FIG. 1 shows the mobile terminal as having various components, but it should be understood that implementing all of the illustrated components is not a requirement. Greater or fewer components may alternatively be implemented.
  • the wireless communication unit 110 typically includes one or more components allowing radio communication between the mobile terminal 100 and a wireless communication system or a network in which the mobile terminal is located.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives broadcast signals and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel may include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits the same to a terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network and, in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms. For example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and the like.
  • the broadcast receiving module 111 may be configured to receive signals broadcast by using various types of broadcast systems.
  • the broadcast receiving module 111 may receive a digital broadcast by using a digital broadcast system such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the data broadcasting system known as media forward link only (MediaFLO®), integrated services digital broadcast-terrestrial (ISDB-T), etc.
  • the broadcast receiving module 111 may be configured to be suitable for every broadcast system that provides a broadcast signal as well as the above-mentioned digital broadcast systems.
  • Broadcast signals and/or broadcast-associated information received via the broadcast receiving module 111 may be stored in the memory 160.
  • the mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, Node B, etc.), an external terminal and a server.
  • Such radio signals may include a voice call signal, a video call signal or various types of data according to text and/or multimedia message transmission and/or reception.
  • the wireless Internet module 113 supports wireless Internet access for the mobile terminal. This module may be internally or externally coupled to the terminal.
  • the wireless Internet access technique implemented may include a WLAN (Wireless LAN) (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), or the like.
  • the short-range communication module 114 is a module for supporting short range communications.
  • Some examples of short-range communication technology include BluetoothTM, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBeeTM, and the like.
  • the location information module 115 is a module for acquiring a location of the mobile terminal.
  • a typical example of the location information module is a GPS (Global Positioning System).
  • the A/V input unit 120 is configured to receive an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122.
  • the camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
  • the processed image frames may be displayed on a display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 may receive sounds (audible data) in a phone call mode, a recording mode, a voice recognition mode, and the like, and can process such sounds into audio data.
  • the processed audio (voice) data may be converted for output into a format transmittable to a mobile communication base station via the mobile communication module 112 in case of the phone call mode.
  • the microphone 122 may implement various types of noise canceling (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
  • the user input unit 130 may generate input data from commands entered by a user to control various operations of the mobile terminal.
  • the user input unit 130 may include a keypad, a dome switch, a touch pad (e.g., a touch sensitive member that detects changes in resistance, pressure, capacitance, etc. due to being contacted), a jog wheel, a jog switch, and the like.
  • the sensing unit 140 detects a current status of the mobile terminal 100 such as an opened or closed state of the mobile terminal 100, a location of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch inputs), the orientation of the mobile terminal 100, an acceleration or deceleration movement and direction of the mobile terminal 100, etc., and generates commands or signals for controlling the operation of the mobile terminal 100.
  • for example, when the mobile terminal 100 is implemented as a slide type phone, the sensing unit 140 may sense whether the slide phone is opened or closed.
  • the sensing unit 140 can detect whether or not the power supply unit 190 supplies power or whether or not the interface unit 170 is coupled with an external device.
  • the sensing unit 140 may include a proximity sensor 141.
  • the output unit 150 is configured to provide outputs in a visual, audible, and/or tactile manner (e.g., audio signal, video signal, alarm signal, vibration signal, etc.).
  • the output unit 150 may include the display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, and the like.
  • the display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphic User Interface (GUI) associated with a call or other communication (such as text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or image capturing mode, the display unit 151 may display a captured image and/or received image, a UI or GUI.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, or the like.
  • some of these displays may be configured to be transparent or light-transmissive to allow viewing of the exterior therethrough; a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display, or the like.
  • the mobile terminal 100 may include two or more display units according to its particular desired embodiment.
  • a plurality of display units may be separately or integrally disposed on one surface of the mobile terminal, or may be separately disposed on mutually different surfaces.
  • in a case in which the display unit 151 and a sensor for detecting a touch operation (referred to as a 'touch sensor', hereinafter) have a layered structure (referred to as a 'touch screen'), the display unit 151 may function as both an input device and an output device.
  • the touch sensor may have a form of a touch film, a touch sheet, a touch pad, and the like.
  • the touch sensor may be configured to convert pressure applied to a particular portion of the display unit 151 or a change in the capacitance or the like generated at a particular portion of the display unit 151 into an electrical input signal.
  • the touch sensor may be configured to detect the pressure when a touch is applied, as well as the touched position and area.
  • when there is a touch input with respect to the touch sensor, a corresponding signal (or signals) is transmitted to a touch controller.
  • the touch controller processes the signals and transmits corresponding data to the controller 180. Accordingly, the controller 180 may recognize which portion of the display unit 151 has been touched.
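The sensing chain above (touch sensor signal, then touch controller, then data for the controller 180) might be approximated as a simple coordinate-mapping step. The normalized-signal format below is an assumption made purely for illustration.

```python
def touch_controller(raw_signal, screen_width, screen_height):
    """Illustrative touch controller: map a normalized sensor signal
    (0..1 per axis, plus a pressure value) to display coordinates, so the
    main controller can recognize which portion of the display was touched."""
    nx, ny, pressure = raw_signal
    return {
        "x": round(nx * screen_width),
        "y": round(ny * screen_height),
        "pressure": pressure,   # the sensor may also report pressure and area
    }
```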
  • a proximity sensor 141 may be disposed within or near the touch screen.
  • the proximity sensor 141 is a sensor for detecting the presence or absence of an object relative to a certain detection surface or an object that exists nearby by using the force of electromagnetism or infrared rays without a physical contact.
  • the proximity sensor 141 has a considerably longer life span compared with a contact type sensor, and it can be utilized for various purposes.
  • Examples of the proximity sensor 141 may include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror-reflection type photoelectric sensor, an RF oscillation type proximity sensor, a capacitance type proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and the like.
  • when the touch screen is the capacitance type, proximity of the pointer is detected by a change in an electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
  • recognition of the pointer positioned to be close to the touch screen will be called a 'proximity touch', while recognition of actual contacting of the pointer on the touch screen will be called a 'contact touch'.
  • when the pointer is in the state of the proximity touch, it means that the pointer is positioned to correspond vertically to the touch screen.
  • a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, or the like) can be detected, and information corresponding to the detected proximity touch operation and the proximity touch pattern can be output to the touch screen.
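The proximity/contact distinction and the proximity-touch pattern described above can be illustrated with a toy classifier. The 20 mm sensing range and the speed formula are assumptions for illustration, not values from the patent.

```python
def classify_touch(distance_mm):
    """Distance 0 means the pointer contacts the screen ('contact touch');
    within an assumed sensing range it is a 'proximity touch'."""
    if distance_mm <= 0:
        return "contact touch"
    if distance_mm <= 20:              # assumed proximity sensing range (mm)
        return "proximity touch"
    return "none"

def proximity_speed(d1_mm, d2_mm, dt_s):
    """Approach speed (mm/s) between two distance samples, one element of a
    proximity touch pattern (distance, speed, time, position, movement)."""
    return (d1_mm - d2_mm) / dt_s
```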
  • the audio output module 152 may convert and output as sound audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. Also, the audio output module 152 may provide audible outputs related to a particular function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, or other sound generating device.
  • the alarm unit 153 may provide outputs to inform about the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal inputs, a touch input etc. In addition to audio or video outputs, the alarm unit 153 may provide outputs in a different manner to inform about the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibrations. When a call, a message, or some other incoming communication is received, the alarm unit 153 may provide tactile outputs (i.e., vibrations) to inform the user thereof. By providing such tactile outputs, the user can recognize the occurrence of various events even if his mobile phone is in the user's pocket. Outputs informing about the occurrence of an event may be also provided via the display unit 151 or the audio output module 152. The display unit 151 and the audio output module 152 may be classified as a part of the alarm unit 153.
  • the haptic module 154 generates various tactile effects the user may feel.
  • a typical example of the tactile effects generated by the haptic module 154 is vibration.
  • the strength and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be combined to be output or sequentially output.
  • the haptic module 154 may generate various other tactile effects such as an effect by stimulation such as a pin arrangement vertically moving with respect to a contact skin, a spray force or suction force of air through a jet orifice or a suction opening, a contact on the skin, a contact of an electrode, electrostatic force, etc., an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat.
  • the haptic module 154 may be implemented to allow the user to feel a tactile effect through a muscle sensation of the user's fingers or arm, as well as transferring the tactile effect through a direct contact. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.
  • the memory 160 may store software programs used for the processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, video, etc.) that are inputted or outputted. In addition, the memory 160 may store data regarding various patterns of vibrations and audio signals outputted when a touch is inputted to the touch screen.
  • the memory 160 may include at least one type of storage medium including a Flash memory, a hard disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may be operated in relation to a web storage device that performs the storage function of the memory 160 over the Internet.
  • the interface unit 170 serves as an interface with every external device connected with the mobile terminal 100.
  • the interface unit 170 may receive data transmitted from an external device, receive power and transfer it to each element of the mobile terminal 100, or transmit internal data of the mobile terminal 100 to an external device.
  • the interface unit 170 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
  • the identification module may be a chip that stores various types of information for authenticating the authority of using the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • the device having the identification module (referred to as 'identifying device', hereinafter) may take the form of a smart card. Accordingly, the identifying device may be connected with the terminal 100 via a port.
  • when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a passage to allow power from the cradle to be supplied therethrough to the mobile terminal 100 or may serve as a passage to allow various command signals inputted by the user from the cradle to be transferred to the mobile terminal therethrough.
  • Various command signals or power inputted from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle.
  • the controller 180 typically controls the general operations of the mobile terminal. For example, the controller 180 performs controlling and processing associated with voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for reproducing multimedia data.
  • the multimedia module 181 may be configured within the controller 180 or may be configured to be separated from the controller 180.
  • the controller 180 may perform a pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively.
  • the power supply unit 190 receives external power or internal power and supplies appropriate power required for operating respective elements and components under the control of the controller 180.
  • the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic units designed to perform the functions described herein. In some cases, such embodiments may be implemented by the controller 180 itself.
  • for a software implementation, the embodiments may be implemented with separate software modules; each software module may perform one or more functions or operations described herein.
  • Software codes can be implemented by a software application written in any suitable programming language.
  • the software codes may be stored in the memory 160 and executed by the controller 180.
  • hereinafter, the mobile terminal according to an embodiment of the present invention described above with reference to FIG. 1, the arrangement of its components, or the structure of the mobile terminal will be described.
  • FIG. 2A is a front perspective view of the mobile terminal according to an embodiment of the present invention
  • FIG. 2B is a rear perspective view of the mobile terminal illustrated in FIG. 2A.
  • the disclosed mobile terminal has a bar type terminal body.
  • the present invention is not limited thereto and may be applicable to various structures such as a slide type mobile terminal, a folder type mobile terminal, a swing type mobile terminal, a swivel type mobile terminal, etc, in which two or more bodies are combined to be relatively movable.
  • the terminal body 100 (referred to as 'body', hereinafter) includes a front surface, a lateral surface, and a rear surface. Also, the body includes both ends formed in a length direction.
  • the body includes a case (or casing, housing, cover, etc.) constituting the external appearance.
  • the case may include a front case 101 and a rear case 102.
  • Various electronic components are installed in the space between the front case 101 and the rear case 102.
  • One or more intermediate cases may be additionally disposed between the front case 101 and the rear case 102.
  • the cases may be formed by injection-molding a synthetic resin or may be made of a metallic material such as stainless steel (STS) or titanium (Ti), etc.
  • the display unit 151, the audio output module 152, the camera 121, the user input unit 130 (131, 132), the microphone 122, the interface unit 170, etc. may be disposed mainly on the front case 101 of the terminal body 100.
  • the display unit 151 occupies most of the main surface of the front case 101.
  • the audio output unit 152 and the camera 121 are disposed at a region adjacent to one of the two end portions of the display unit 151, and the user input unit 131 and the microphone 122 are disposed at a region adjacent to the other end portion.
  • the user input unit 132 and the interface unit 170 may be disposed at the sides of the front case 101 and the rear case 102.
  • the microphone 122 may be disposed on the other end of the body 100.
  • the user input unit 130 is manipulated to receive a command for controlling the operation of the mobile terminal 100 and may include a plurality of manipulation units 131 and 132.
  • the manipulation units 131 and 132 may be generally referred to as a manipulating portion, and various methods and techniques can be employed for the manipulation portion so long as they can be operated by the user in a tactile manner.
  • Content inputted by the first and second manipulation units 131 and 132 can be variably set.
  • the first manipulation unit 131 may receive a command such as starting, ending, scrolling, etc.
  • the second manipulation unit 132 may receive a command such as controlling of the size of a sound outputted from the audio output unit 152 or conversion into a touch recognition mode of the display unit 151.
  • an audio output unit 152' may be additionally disposed on the rear surface of the terminal body.
  • the audio output module 152' may implement stereophonic sound functions in conjunction with the audio output module 152 (See FIG. 2A) and may be also used for implementing a speaker phone mode for call communication.
  • the power supply unit 190 for supplying power to the mobile terminal 100 is mounted on the terminal body.
  • the power supply unit 190 may be installed within the terminal body or may be directly attached to or detached from the exterior of the terminal body.
  • a touch pad 135 for detecting a touch may be additionally mounted on the rear case 102.
  • the touch pad 135 may be configured to be light transmissive like the display unit 151.
  • when the display unit 151 is configured to output visual information from both sides thereof, the visual information may be recognized also via the touch pad 135.
  • a display may be additionally mounted on the touch pad so that a touch screen may be disposed on the rear case 102.
  • a camera 121' may additionally be disposed on the rear case 102 of the terminal body.
  • the camera 121' may have an image capture direction which is substantially opposite to that of the camera 121 (See FIG. 2a), and have a different number of pixels than the camera 121.
  • the camera 121 may have a smaller number of pixels to capture an image of the user's face and transmit such image to another party, and the camera 121' may have a larger number of pixels to capture an image of a general object and not immediately transmit it in most cases.
  • the cameras 121 and 121' may be installed on the terminal body such that they can be rotatable or popped up.
  • a flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'.
  • the flash 123 illuminates the subject.
  • the mirror 124 allows the user to see himself when he wants to capture his own image (i.e., self-image capturing) by using the camera 121'.
  • the information output to both surfaces of the display unit 151 may be controlled by the touch pad 135.
  • the mobile terminal according to an exemplary embodiment of the present invention may include one or more of the components described above and may execute various control functions by using a multi-touch input.
  • FIG. 3 is a flow chart illustrating a method for controlling output of a graphic object to a display unit on the basis of an applied touch input
  • FIG. 4 is a conceptual view illustrating the control method of FIG. 3.
  • the mobile terminal may output a graphic object to the display unit (S310).
  • Graphic objects related to various functions operating in the mobile terminal may be output to the display unit 151.
  • the graphic objects may be various images that may be output to the display unit 151.
  • the graphic objects may be icons, widgets, and the like, corresponding to applications included in the home screen page.
  • the graphic objects may be image files.
  • while a first touch input applied to the graphic object is maintained, the mobile terminal may sense a second touch input, different from the first touch input (S320).
  • the mobile terminal may further include the sensing unit 140 for sensing a touch input applied to the display unit 151.
  • the sensing unit 140 may be configured as a touch sensor.
  • a touch input sensed by the sensing unit 140 may be various types of touch input.
  • the touch input may be a short touch, a long touch, a double touch, drag, flicking, and the like.
  • the sensing unit 140 may sense a multi-touch input.
  • the multi-touch input refers to at least two touch inputs that are simultaneously applied.
  • the multi-touch input may include a first touch input and a second touch input.
  • the first and second touch inputs may be different types of touch input.
  • the first touch input may be a long-touch input, while the second touch input may be a flicking input.
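The touch types named above (short touch, long touch, drag, flicking) can be distinguished in software from a gesture's duration and travel distance. The sketch below is an illustrative classifier, not part of the disclosure; the function name and threshold values are assumptions.

```python
# Illustrative sketch: classify a touch gesture from its duration and travel.
# All thresholds are assumed values for demonstration only.
LONG_TOUCH_MS = 500    # press held at least this long with little movement
FLICK_SPEED = 1.0      # px/ms travel speed that counts as a flick
MOVE_TOLERANCE = 10    # px of drift still treated as a stationary touch

def classify_touch(duration_ms, distance_px):
    """Return 'long_touch', 'short_touch', 'flick', or 'drag'."""
    if distance_px <= MOVE_TOLERANCE:
        # Stationary press: only duration distinguishes short from long.
        return "long_touch" if duration_ms >= LONG_TOUCH_MS else "short_touch"
    speed = distance_px / max(duration_ms, 1)
    # Moving touch: fast travel is a flick, slow travel is a drag.
    return "flick" if speed >= FLICK_SPEED else "drag"
```

A multi-touch input in the sense used here would then be two simultaneously tracked pointers whose gestures classify differently (e.g. one `long_touch` plus one `flick`).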
  • the controller 180 may execute a function corresponding to the multi-touch input. For example, when the multi-touch input is sensed, the controller 180 may control output of the graphic object output to the display unit 151.
  • the mobile terminal may maintain output of the graphic object to which the first touch input has been applied and, in response to the second touch input, output at least one graphic object related to the maintained graphic object to at least one region of the display unit (S330).
  • the controller may control an output state of the graphic object output to the display unit 151 on the basis of the first and second touch inputs.
  • the output state may refer to maintaining output or not maintaining output.
  • the output state may refer to changing of a shape, color, position, and the like, of the graphic object.
  • the controller 180 may control the display unit 151 to maintain output of the graphic object or not to maintain the output.
  • the sensing unit 140 may sense the second touch input different from the first touch input.
  • the controller 180 may maintain the output of the graphic object output to the display unit 151 in response to the maintained first touch input.
  • the controller 180 may output at least one graphic object related to the graphic object currently maintained to be output to the display unit 151 in response to the sensed second touch input.
  • the controller 180 may divide the display unit 151 into a plurality of regions.
  • the plurality of regions may include a first region and a second region.
  • the graphic object maintained to be output may be output to the first region
  • the at least one graphic object related to the graphic object maintained to be output may be output to the second region.
  • the first region and second region may be divided according to various methods.
  • the first and second regions may be divided according to a user selection or may be divided according to a pre-set method.
  • the user may select any one of a plurality of previously stored methods, and may divide the regions by a touch input.
  • in the method for dividing the regions by a user's touch input, the regions may be divided on the basis of a touch trace corresponding to the touch input. For example, in a case in which the user applies a drag input across the center of the display unit 151 in a horizontal direction, the regions may be divided along a trace corresponding to the drag.
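The trace-based division described above can be sketched as deriving a split line from the drag's coordinates. The function name and the (x, y) trace representation are assumptions for illustration only.

```python
# Sketch: divide the screen into two stacked regions along the mean y of a
# roughly horizontal drag trace. Rects are (left, top, width, height).
def split_regions(screen_w, screen_h, trace):
    """trace: list of (x, y) points sampled along the drag."""
    mean_y = round(sum(y for _, y in trace) / len(trace))
    first = (0, 0, screen_w, mean_y)                  # above the trace
    second = (0, mean_y, screen_w, screen_h - mean_y)  # below the trace
    return first, second
```

For example, a horizontal drag across the middle of a 1080x1920 screen yields two 1080x960 regions.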
  • the graphic object that may be output to the second region may be at least one graphic object related to the graphic object maintained to be output.
  • the graphic object related to the graphic object maintained to be output may be set by various references.
  • the various references may be previously set.
  • the various references may refer to whether the at least one graphic object belongs to the same group, whether the at least one graphic object satisfies a pre-set reference, whether the at least one graphic object is of the same type, and the like.
  • the various references may be an image capture time of the image or an image capture position of the image.
  • the relevant graphic object may be a plurality of images captured for a pre-set period of time.
  • the relevant graphic object may be determined by using an image recognition function. For example, in a case in which any one of a plurality of images is set, by the first touch input, as the graphic object maintained to be output, when another of the images is determined, through the image recognition function, to be identical to it at a predetermined level or higher, that image may be determined as a graphic object related to the graphic object maintained to be output.
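The relatedness test described in this passage (captured within a pre-set time window, or sufficiently similar under an image recognition function) might be sketched as follows. The field names and thresholds are illustrative assumptions, and the similarity score is assumed to be precomputed elsewhere rather than derived here.

```python
# Sketch: decide whether a candidate image is "related" to the anchor image.
def is_related(anchor, candidate, window_s=3600, sim_threshold=0.8):
    """anchor/candidate: dicts with 'captured_at' (epoch seconds); the
    candidate also carries 'similarity', an assumed precomputed 0..1 score
    against the anchor (standing in for the image recognition function)."""
    close_in_time = abs(anchor["captured_at"] - candidate["captured_at"]) <= window_s
    looks_same = candidate.get("similarity", 0.0) >= sim_threshold
    # Either reference suffices, matching the disjunctive criteria above.
    return close_in_time or looks_same
```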
  • the graphic object output to the second region may be related to the graphic object being output to the first region, but it is not necessarily limited thereto.
  • various graphic objects other than the graphic object related to the graphic object output to the first region may be output to the second region.
  • the display unit 151 may not be necessarily divided into a plurality of regions.
  • at least one graphic object output according to the second touch input may be output to overlap with the graphic object maintained to be output by the first touch input.
  • the controller 180 may control an output state of a graphic object output to the display unit 151.
  • an image file may be displayed on the display unit 151.
  • the user may apply a first touch input to the display unit 151.
  • the first touch input may be a long-touch input.
  • the user may apply a second touch input, different from the first touch input, to a region of the display unit 151.
  • the second touch input may be a flicking input different from the long touch input.
  • the controller 180 may execute a function corresponding to the second touch input.
  • the function corresponding to the second touch input may be a function of outputting a graphic object which has not been output to the display unit 151.
  • the controller 180 may output at least one graphic object related to the graphic object maintained to be output by the first touch input to at least one region of the display unit 151 according to the second touch input.
  • the controller 180 may divide the display unit into a plurality of regions.
  • the plurality of regions may include a first region 151a and a second region 151b.
  • a graphic object maintained to be output may be output to the first region and at least one graphic object related to the graphic object maintained to be output may be output to the second region.
  • the at least one graphic object related to the graphic object maintained to be output may be similar to the graphic object maintained to be output.
  • an image similar to the image maintained to be output may be output to the second region.
  • the output number of relevant images may be set by the user or may be set in advance.
  • the second region may be divided into a plurality of regions according to the number of the relevant images.
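Dividing the second region according to the number of relevant images can be sketched as computing equal sub-regions. The horizontal-strip layout is an assumption, since the disclosure does not fix a particular layout.

```python
# Sketch: split a region rect into `count` equal horizontal strips, one per
# related image to be displayed. Rects are (left, top, width, height).
def subdivide(region, count):
    left, top, width, height = region
    strip_h = height // count
    return [(left, top + i * strip_h, width, strip_h) for i in range(count)]
```

For example, a 1080x960 second region holding three related images yields three 1080x320 strips.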
  • FIGS. 5A and 5B are conceptual views illustrating a method for controlling each region in a case in which the display unit is divided into a plurality of regions.
  • the mobile terminal may control output of a graphic object output to the display unit 151 on the basis of sensed multi-touch input applied thereto. For example, referring to (a) of FIG. 5A, a first touch input and a second touch input may be applied to the display unit 151.
  • the controller 180 may maintain output of the graphic object output to the display unit 151 on the basis of the first touch input, and may output at least one different graphic object related to the graphic object maintained to be output to at least one region of the display unit 151 on the basis of the second touch input.
  • the controller 180 may divide the display unit 151 into a plurality of regions as illustrated in (b) of FIG. 5A.
  • the display unit 151 may be divided into a first region 151a and a second region 151b.
  • the graphic object maintained to be output may be output to the first region 151a
  • the graphic object related to the graphic object maintained to be output may be output to the second region 151b.
  • the controller 180 may change the graphic object being output to the second region 151b to a different graphic object.
  • the graphic object output to the first region 151a may be continuously maintained to be output.
  • the different graphic object output to the second region may be a graphic object related to the graphic object maintained to be output to the first region 151a.
  • the different graphic object may be an image related to the image maintained to be output.
  • the first touch input may be continuously maintained in the first region.
  • the user may continuously maintain the first touch input in the first region.
  • the output of the graphic object to the first region may be maintained only while the first touch input is maintained.
  • the controller 180 may delete the graphic object from the memory, as well as ceasing to output it, on the basis of a direction of the second touch input. For example, in a case in which the second touch input is a drag input moving from within the region of the display unit 151 to the outside of the display unit 151, the controller 180 may delete the graphic object to which the second touch input has been applied.
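The drag-to-outside deletion behavior can be sketched as a bounds test on the drag endpoint; the function and data layout below are illustrative assumptions, not the disclosed implementation.

```python
# Sketch: treat a drag that ends outside the display bounds as a delete
# gesture, removing the dragged object from the in-memory gallery.
def handle_drag(gallery, index, end_x, end_y, screen_w, screen_h):
    """Delete gallery[index] when the drag endpoint leaves the screen.
    Returns True if the object was deleted."""
    outside = not (0 <= end_x < screen_w and 0 <= end_y < screen_h)
    if outside:
        del gallery[index]
    return outside
```

A fuller version would also trigger redisplay of a replacement related image in the vacated region, as described above.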
  • the controller 180 may maintain outputting of a graphic object to which a first touch input has been applied.
  • the user may apply a second touch input to a graphic object 430 other than the graphic object 420 to which the first touch input has been applied.
  • the second touch input may be a drag input moving from within the region of the display unit 151 to the outside of the display unit 151.
  • the controller 180 may not output the graphic object 430 to which the second touch input has been applied, any longer.
  • the graphic object 430 to which the second touch input has been applied may be deleted from the memory.
  • the controller 180 may output a graphic object 440, which is related to the graphic object maintained to be output and which is different from the deleted graphic object, to the region from which the graphic object 430 was deleted.
  • FIG. 6 illustrates a method for changing a graphic object maintained to be output when a touch input is sensed.
  • the controller 180 may divide the display unit 151 into a plurality of regions on the basis of the applied first and second touch inputs.
  • the display unit 151 may be divided into a first region 151a and a second region 151b.
  • a graphic object 420 maintained to be output is output to the first region 151a and a graphic object 430 related to the graphic object 420 maintained to be output to the first region may be output to the second region 151b.
  • the graphic object 430 output to the second region may be changed to a different graphic object according to a second touch input applied to the second region 151b.
  • the different graphic object may be related to the graphic object maintained to be output to the first region.
  • the user may apply the first touch input to the second region 151b.
  • the user may apply the second touch input to the first region 151a.
  • while the output of the graphic object 430 output to the second region 151b is maintained, the graphic object 420 output to the first region 151a may be changed.
  • a graphic object 440 related to the graphic object maintained to be output to the second region 151b may be output to the first region 151a.
  • the user may change the graphic object maintained to be output by using a touch input.
  • FIGS. 7A through 7C are conceptual views illustrating a method for controlling a plurality of graphic objects output to the display unit.
  • the controller 180 may divide the display unit into a plurality of regions according to a multi-touch input (a first touch input and a second touch input that are simultaneously applied).
  • the display unit 151 may be divided into a first region 151a and a second region 151b.
  • the controller 180 may output relevant graphic objects to the first and second regions, respectively.
  • the graphic objects output to the first region and the second region may be images obtained by imaging the same object.
  • the user may apply at least one touch input.
  • the at least one touch input may be a pinch-in input, a pinch-out input, a double-touch input, or a drag input.
  • at least one touch input may be a pinch-in input.
  • the controller 180 may execute an editing function corresponding to the at least one touch input.
  • the editing function may include magnifying, reducing, cutting, pasting, and position-moving of the graphic object output to the display unit.
  • the editing function corresponding to the at least one touch input may be a function of magnifying the graphic object output to the region to which the at least one touch input is applied.
  • the at least one touch input may be applied to any one region among the first region 151a and the second region 151b.
  • the controller 180 may apply the editing function corresponding to the at least one touch input not only to the graphic object in the region (e.g., the first region) to which the at least one touch input has been applied but also to the graphic object in the region (e.g., the second region) to which it was not applied.
  • the user may apply a pinch-in input to the first region.
  • the controller 180 may magnify a graphic object 700a output to the first region 151a. Also, the controller 180 may magnify a graphic object 700b of the second region 151b corresponding to the magnified graphic object 700a.
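The synchronized magnification described here, where a pinch on one region also magnifies the corresponding object in the other region, can be sketched as propagating a single zoom factor to every region. The dictionary layout is an assumption.

```python
# Sketch: apply the zoom factor from a pinch on one region to all regions,
# so the images being compared stay at the same scale.
def apply_zoom(zoom_states, factor):
    """zoom_states: dict region_name -> current zoom level. The pinch's
    scale factor is propagated to every region, not just the touched one."""
    for region in zoom_states:
        zoom_states[region] *= factor
    return zoom_states
```

Keeping a single effective scale across both regions is what makes the cell-by-cell visual comparison (with the grid guide images mentioned below) meaningful.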
  • the controller 180 may output guide images to respective regions such that graphic objects output to the plurality of regions of the display unit 151 can be conveniently compared.
  • the guide images may be line images forming a grid.
  • based on any one of the magnified graphic objects, the controller 180 may recognize at least one region in which the other remaining graphic objects differ from that graphic object.
  • the controller 180 may use an image recognition function.
  • the controller 180 may output notification information indicating the at least one different region to the plurality of graphic objects.
  • the notification information may be a circular indicator in the different region.
  • the controller 180 may detect different image regions in the magnified regions 700a and 700b by using an image recognition function. In this case, the controller 180 may output circular indicators 710a and 710b to different image regions.
  • the controller 180 may detect at least one different image region by using the image recognition function with respect to the plurality of graphic objects output to the display unit 151.
  • two graphic objects may be output to the display unit 151.
  • the controller 180 may detect at least one different image region 720a, 720b, 730a, and 730b by using the image recognition function with respect to the two graphic objects.
  • the controller 180 may output notification information regarding the at least one different image region 720a, 720b, 730a, and 730b as illustrated in (a) of FIG. 7C.
  • the user may apply a touch input to at least one image region 720a among the at least one image region 720a, 720b, 730a, and 730b.
  • the controller 180 may magnify the regions 720a and 720b corresponding to the region to which the touch input has been applied, and output the same.
  • the touch input may be applied to any one 720a of the two graphic objects 720a and 720b.
  • the controller 180 may magnify the region 720a of the graphic object to which the touch input has been applied and the region 720b of the graphic object to which the touch input has not been applied corresponding to the region of the graphic object to which the touch input has been applied, and output the same.
  • the user may apply a touch input to any one graphic object 720a among the two graphic objects 720a and 720b.
  • the controller 180 may magnify both the region 720a of the graphic object to which the touch input has been applied and the region 720b of the graphic object to which the touch input has not been applied corresponding to the region of the graphic object to which the touch input has been applied, and output the same 740a and 740b.
  • when a plurality of graphic objects are output, different image regions may be detected and information regarding the detection results may be provided to the user.
  • accordingly, the user may compare the plurality of graphic objects more conveniently and intuitively.
  • FIGS. 8A and 8B are conceptual views illustrating cases in which a plurality of graphic objects are output as a list to the display unit.
  • the controller 180 may change output states of the plurality of graphic objects according to first and second touch inputs applied to the display unit. For example, while maintaining an output position of a graphic object output to a region to which the first touch input has been applied, among the plurality of graphic objects, the controller 180 may change an output position of a graphic object other than the graphic object whose output position is maintained.
  • a plurality of graphic objects 800a, 800b, 800c, 800d, 800e, and 800f may be displayed on the display unit 151.
  • a first touch input may be applied to any one 800a of the plurality of graphic objects 800a, 800b, 800c, 800d, 800e, and 800f.
  • a second touch input may be applied to the display unit 151.
  • the controller 180 may fix the output position of the graphic object 800a to which the first touch input has been applied, and change the output positions of the graphic objects 800b, 800c, 800d, 800e, and 800f other than the graphic object whose output position has been fixed, among the plurality of graphic objects.
  • the graphic object whose output position has been fixed may be changed.
  • the output position of the first graphic object 800a is fixed.
  • the user may apply a first touch input to a third graphic object 800d.
  • the controller 180 may fix the output position of the third graphic object 800d to which the first touch input has been applied.
  • the controller 180 may change the output positions of the graphic objects 800a, 800b, 800c, 800e, and 800f other than the third graphic object 800d whose output position has been fixed.
  • the method for changing output positions of the plurality of graphic objects may be performed in pre-set order. For example, as illustrated in FIG. 8A, in a case in which the second touch input is a flicking input of moving from a lower end to an upper end of the display unit, the positions of the plurality of graphic objects may be moved in a direction corresponding to the second touch input.
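Scrolling the list while one graphic object's position stays fixed can be sketched as rotating every item except the pinned one. Treating one flick as a rotation by one slot is an assumption; the disclosure only says positions move in the flick's direction.

```python
# Sketch: advance all list items by `step` positions while the item at
# pinned_index (held by the first touch) keeps its on-screen slot.
def scroll_with_pinned(items, pinned_index, step=1):
    pinned = items[pinned_index]
    # Remove the pinned item, rotate the rest, then restore the pinned slot.
    rest = [it for i, it in enumerate(items) if i != pinned_index]
    step %= len(rest)
    rest = rest[step:] + rest[:step]
    rest.insert(pinned_index, pinned)
    return rest
```

Applying a flick again with a different item pinned reproduces the behavior of FIG. 8B, where the newly long-touched object becomes the fixed one.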
  • FIGS. 9A and 9B illustrate a method for grouping at least a portion of the plurality of graphic objects.
  • the controller 180 may group at least a portion of the graphic objects.
  • the user may select graphic objects to be grouped from among the plurality of graphic objects.
  • the graphic objects to be grouped may be selected by applying a touch input different from the first and second touch inputs.
  • the third touch input may be a short-touch input different from the first and second touch inputs.
  • the user may select at least a portion 800a, 800c, and 800e of the plurality of graphic objects 800a, 800b, 800c, 800d, 800e, and 800f output to the display unit 151 by applying the third touch input.
  • the controller 180 may output only the selected graphic objects 800a, 800c, and 800e to the display unit 151 as illustrated in (b) of FIG. 9A.
  • as another method for grouping at least a portion of a plurality of graphic objects, a multi-touch may be used as illustrated in FIG. 9B.
  • the user may apply a first touch input to any one 800a of a plurality of graphic objects 800a, 800b, 800c, 800d, 800e, and 800f. Thereafter, the user may apply a third touch input to at least a portion 800d, 800e, and 800f of the other remaining graphic objects.
  • the controller 180 may group the graphic objects 800d, 800e, and 800f to which the first and third touch inputs have been applied on the basis of the applied third touch input. Thereafter, as illustrated in (b) of FIG. 9B, the controller 180 may output only the grouped graphic objects 800d, 800e, and 800f, among the plurality of graphic objects.
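The grouping gesture described above, a maintained first touch plus short third touches on further objects, can be sketched as filtering the object list down to the selected members. The identifiers and the inclusion of the held object are illustrative assumptions.

```python
# Sketch: build a group from the object under the maintained first touch
# plus the objects tapped with the third (short) touch, preserving the
# original display order; only the group is then displayed.
def group_selected(objects, held, tapped):
    """objects: all object ids in display order; held: id under the first
    touch; tapped: ids that received the third touch input."""
    selected = set(tapped) | {held}
    return [o for o in objects if o in selected]
```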
  • FIGS. 10A and 10B are conceptual views illustrating a method for controlling grouped graphic objects.
  • the controller 180 may group at least a portion of the plurality of graphic objects 800a, 800b, 800c, 800d, 800e, and 800f. In this case, only the grouped graphic objects 800a, 800c, and 800e may be output to the display unit 151.
  • the user may apply a first touch input to any one of the graphic objects output to the display unit 151.
  • the display unit 151 may be divided into a plurality of regions including a first region 151a and a second region 151b as illustrated in (c) of FIG. 10A.
  • the graphic object 800a positioned in a region to which the first touch input has been applied may be output to the first region 151a, and any one of the grouped graphic objects 800a, 800c, and 800e may be output to the second region 151b.
  • the controller 180 may output graphic objects 800c and 800e, which have not been output, among the grouped graphic objects 800a, 800c, and 800e to the display unit. Namely, the controller 180 may output only the grouped graphic objects as relevant graphic objects.
  • the controller 180 may output any one 800a of the graphic objects 800a, 800b, and 800c belonging to the group to the entire surface of the display unit 151 and output the other remaining graphic objects 800b and 800c belonging to the group to one region of the display unit 151.
  • the controller 180 may delete or store the graphic object by using a touch input applied by the user to the graphic object. For example, as illustrated in (b) of FIG. 10B, a position of any one 800b of the grouped graphic objects 800b and 800c output to a region of the display unit 151 may be moved through a drag input. In this case, when the graphic object is moved to a selection region, the graphic object 800b may be selected. For example, the selection region of the graphic object 800b may be an upper region of the display unit 151.
  • any one 800c of the graphic objects output to a region of the display unit 151 may be moved to a deletion region through a drag input.
  • the user may group only desired graphic objects among a plurality of graphic objects, and edit only the grouped graphic object.
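The select-by-drag and delete-by-drag editing above amounts to mapping the drop position of a dragged object onto an action. The sketch below illustrates that mapping; the region bounds, coordinate convention, and names are assumptions for illustration, not values from the patent.

```python
# Hypothetical drop-region logic: releasing a dragged object in the upper
# (selection) region selects it; releasing it in the deletion region
# removes it from the group; anywhere else it is simply moved.

SELECTION_REGION = range(0, 200)    # assumed y-range of the upper region
DELETION_REGION = range(800, 1000)  # assumed y-range of the deletion region

def handle_drop(group, obj, drop_y):
    """Apply the action implied by where the dragged object is released."""
    if drop_y in SELECTION_REGION:
        return ("selected", obj)
    if drop_y in DELETION_REGION:
        group.remove(obj)           # the object no longer appears in the group
        return ("deleted", obj)
    return ("moved", obj)

group = ["800b", "800c"]
print(handle_drop(group, "800b", 100))  # dropped in the selection region
print(handle_drop(group, "800c", 900))  # dropped in the deletion region
print(group)
```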
  • FIG. 11 is a conceptual view illustrating a case in which a multi-touch is applied while a camera application is being executed.
  • the user may execute a camera application to capture an image.
  • For example, as illustrated in (a) of FIG. 11, the user may execute the camera application.
  • the user may apply a first touch to an image capture button. Also, in a state in which the first touch is maintained, the user may apply a second touch input to the display unit 151. For example, as illustrated in (b) of FIG. 11, in a state in which the user applies a long-touch input to the image capture button, the user may apply a flicking input to at least one region of the display unit 151.
  • the controller 180 may output the most recently captured image 1120 to at least one region of the display unit.
  • the controller 180 may divide the display unit into a plurality of regions.
  • the display unit 151 may be divided into a first region in which an image received from the camera is output and a second region in which an image which was captured before is output.
  • the method for controlling the display unit by using a multi-touch input while the camera application is being executed has been described.
  • the user may compare a currently captured image with a previously captured image.
  • the entire front surface of the mobile terminal according to an exemplary embodiment of the present invention may be configured as the display unit 151.
  • the mobile terminal according to an exemplary embodiment of the present invention may not include the user input unit 131 in the front surface of the mobile terminal. Instead, the display unit 151 may be present in the position of the user input unit 131.
  • screen information having a function of the user input unit 131 may be output to the region corresponding to the region in which the user input unit 131 is positioned.
  • the user may use the function of the user input unit 131 by using the screen information output to the display unit, even without the user input unit 131.
  • the mobile terminal is configured such that the entire front surface thereof is configured as the display unit 151, without the user input unit 131.
  • the mobile terminal may output a plurality of function keys associated with operations of the mobile terminal to the display unit (S1310).
  • the operations of the mobile terminal may be any operation that can be executed in the mobile terminal such as an operation of executing an application, an operation of outputting a home screen, an operation of setting a system, and the like.
  • the plurality of function keys associated with the operations of the mobile terminal may be output to at least a portion of the display unit.
  • functions associated with operations of the mobile terminal may be matched to the plurality of function keys.
  • the functions associated with operations of the mobile terminal may include a function of canceling an operation executed in the mobile terminal, a function of returning to a home screen of the mobile terminal, a function of entering a system set-up of the mobile terminal, and the like.
  • function keys corresponding to the executed application may be displayed on the display unit.
  • function keys related to the Web browser may be displayed.
  • the function related to the Web browser may be a function of returning to a previous screen of a currently output screen.
  • the mobile terminal may output information related to the any one function key to a region of the display unit (S1320).
  • the plurality of function keys may be output to the display unit.
  • the user may apply a touch input to the plurality of functions to execute functions corresponding to the plurality of function keys.
  • the plurality of function keys may be matched to different functions according to types of touch input.
  • a basic function of the plurality of function keys may be executed.
  • the basic function may refer to an intrinsic function of the plurality of function keys.
  • the information related to the plurality of function keys may be information related to the basic function.
  • the basic function is a function of returning to an immediately previous function of a currently executed function
  • the related information may be a list of functions which have been executed previously.
  • the basic function is a function of outputting addresses of Web sites that the user frequently visits
  • the related information may be a list of the addresses of the Web sites that the user frequently visits.
  • the related information may be a list of applications currently executed in the mobile terminal.
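The description above says each function key carries a basic (intrinsic) function plus related information, with different functions matched per touch type. A minimal sketch of that dispatch, assuming a short tap triggers the basic function and a long touch outputs the related information; all key names and the info strings are illustrative assumptions.

```python
# Hypothetical per-key dispatch table: "basic" for a short tap, and
# "related_info" output while a long touch is applied.

FUNCTION_KEYS = {
    "back": {
        "basic": "return_to_previous_screen",
        "related_info": ["list of previously executed functions"],
    },
    "home": {
        "basic": "go_to_home_screen",
        "related_info": ["list of currently running applications"],
    },
}

def on_touch(key, touch_type):
    """Short tap runs the basic function; long touch shows related info."""
    if touch_type == "tap":
        return FUNCTION_KEYS[key]["basic"]
    if touch_type == "long":
        return FUNCTION_KEYS[key]["related_info"]
    raise ValueError(f"unhandled touch type: {touch_type}")

print(on_touch("back", "tap"))
print(on_touch("home", "long"))
```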
  • information related to the any one function key may be output to a region of the display unit.
  • the related information may be output to the screen currently output to the display unit 151 in an overlapping manner.
  • the related information may be output to a region of the display unit, so that it may be output together with other information that may be output to the display unit 151.
  • the mobile terminal may execute a function corresponding to the output information in response to a second touch, different from the first touch, applied to the output information (S1330).
  • the mobile terminal may further include a sensing unit for sensing at least two touch inputs simultaneously.
  • the at least two touch inputs may be a multi-touch input.
  • the controller 180 may execute a function corresponding to the information to which the second touch has been applied.
  • the function corresponding to the information to which the second touch has been applied may be a function of executing an application corresponding to the information.
  • the information to which the second touch has been applied is an address of a Web site that the user frequently visits (e.g., a favorite address)
  • the mobile terminal may access the Web site indicated by the information.
  • the display unit 151 may output an executed screen of the accessed Web site.
  • the second touch may be a pre-set type touch input.
  • the second touch may be an input of dragging the output information.
  • the drag input may be an input of moving in various directions such as in a downward direction, in an upward direction, in a rightward direction, in a leftward direction, in a diagonal direction, and the like.
  • different functions may be executed according to a direction of the second touch.
  • the second touch is a drag input of moving the output information in a downward direction
  • a function of deleting the output information may be executed.
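Different functions executed according to the direction of the second (drag) touch, as described above, can be sketched as a direction-to-function table. Only the downward-drag-deletes pairing comes from the text; the other entries are assumptions for illustration.

```python
# Hypothetical mapping from the drag direction of the second touch to a
# function. Only "down" -> delete is stated in the description; the other
# pairings are placeholders.

DRAG_ACTIONS = {
    "down": "delete_output_information",
    "up": "store_output_information",
    "center": "execute_output_information",
}

def on_second_touch(direction):
    # Unmapped directions fall back to doing nothing.
    return DRAG_ACTIONS.get(direction, "no_op")

print(on_second_touch("down"))  # the documented example: downward drag deletes
print(on_second_touch("left"))
```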
  • a Web browser application may be executed in the mobile terminal according to an exemplary embodiment of the present invention.
  • a plurality of function keys associated with the mobile terminal may have a function related to the Web browser application.
  • the function related to the Web browser application may be a function of returning to a previous screen of a current screen, a function of going to a next screen of a current screen, a function of returning to a home, a function of adding a new window, a bookmark function, and the like.
  • the user may apply a first touch 1410 to any one of the plurality of function keys.
  • the first touch 1410 may be various types of touch such as a long touch, a double touch, flicking, and the like.
  • the first touch 1410 may be a long touch.
  • the key to which the user applies the first touch 1410 may be a function key of returning to a previous screen among the plurality of function keys.
  • information 1440 related to the first touch may be output.
  • the information 1440 may be output in various manners.
  • the information 1440 may be output to screen information, which is currently output to the display unit 151, in an overlapping manner.
  • the information 1440 may be output to a region adjacent to a region to which the plurality of function keys are output.
  • the information 1440 may be output in the form of a thumbnail image.
  • the user may apply a second touch different from the first touch to the information in a state in which the first touch is maintained.
  • the second touch may be a drag input different from the long touch.
  • the drag input may be an input of moving the output information 1440 to the center of the display unit 151.
  • the controller 180 may execute a function corresponding to the second touch. For example, referring to (c) of FIG. 14A, in the case in which the second touch is applied, the controller 180 may execute an application corresponding to the information to which the second touch has been applied. For example, in a case in which the information to which the second touch has been applied is an address of a Web site, the controller 180 may access the Web site. In this case, an executed screen 1440 of the accessed Web site may be output to the display unit 151.
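The two-touch flow described above (a maintained first touch on a function key outputs related information, and a second touch on one information item executes it) behaves like a small state machine. The class and method names below are assumptions sketching that flow, not the patent's implementation.

```python
# Hypothetical sketch: related information is shown only while the first
# touch is held; a second touch on a shown item executes it; releasing
# the first touch removes the information and disables the second touch.

class FunctionKeyController:
    def __init__(self, related_info):
        self.related_info = related_info  # e.g. frequently visited addresses
        self.first_touch_held = False
        self.displayed = []

    def first_touch_down(self):
        # While the first touch is maintained, the related info is output.
        self.first_touch_held = True
        self.displayed = list(self.related_info)

    def first_touch_up(self):
        # Releasing the first touch removes the information again.
        self.first_touch_held = False
        self.displayed = []

    def second_touch(self, index):
        # The second touch only acts while the first touch is maintained.
        if not self.first_touch_held:
            return None
        return f"open:{self.displayed[index]}"

ctrl = FunctionKeyController(["site-a", "site-b"])
ctrl.first_touch_down()
print(ctrl.second_touch(1))  # second touch while the first is held
ctrl.first_touch_up()
print(ctrl.second_touch(0))  # no effect once the first touch is released
```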
  • a home screen page 1450 may be output to the display unit 151.
  • the plurality of function keys associated with the mobile terminal may include a function key of canceling a currently executed function, a function key of entering the home screen page, a function key of setting up a system of the mobile terminal, and the like.
  • the user may apply a first touch 1410 to the key for entering the home screen page among the plurality of function keys.
  • the controller 180 may output information related to the key for entering the home screen page.
  • the function related to the home screen page may be a list of applications currently executed in the mobile terminal.
  • the user may apply a second touch 1420 to any one of items on the list.
  • the second touch 1420 may be a type of touch input different from that of the first touch 1410.
  • the second touch 1420 may be a drag input for moving any one item to a central portion of the display unit 151.
  • the controller 180 may execute a function corresponding to the any one item 1460.
  • the controller 180 may execute the Web browser application.
  • FIG. 15 is a conceptual view illustrating whether to output information according to whether a first touch is maintained.
  • the user may apply a first touch 1410 to any one of the plurality of function keys.
  • the controller 180 may output information related to the function key to which the first touch 1410 has been applied.
  • the plurality of function keys may include a function of returning to a previous screen of a current screen, a function of going to a next screen of a current screen, a function of returning to a home, a function of adding a new window, a bookmark function, and the like.
  • the user may apply a first touch 1410 to the function key of returning to a previous screen of the current screen.
  • the controller 180 may output information related to the function key, namely, the function key of returning to a previous screen of the current screen 1430.
  • the related information may be a plurality of Web browser information that the user accessed previously.
  • the Web browser information may be output in various manners. For example, referring to (b) of FIG. 15, the Web browser information 1470a, 1470b, and 1470c may be output in a thumbnail manner.
  • when the first touch 1410 is released, the controller 180 may no longer output the information related to the function key.
  • FIGS. 16A, 16B, and 16C are conceptual views illustrating a method for editing output information by using a second touch.
  • various functions may be matched to a type of touch in advance.
  • functions such as adding, deleting, position-changing, grouping, and the like, of output information may be matched in advance according to a type of the second touch.
  • a Web browser application may be executed in the mobile terminal.
  • a plurality of function keys may be output to a region of the display unit 151.
  • the plurality of function keys may include a function of returning to a previous screen of a current screen, a function of moving to a next screen of the current screen, a function of returning to a home screen, a function of adding a new window, a bookmark function, and the like.
  • the user may apply a first touch 1410 to any one of the plurality of function keys.
  • the user may apply the first touch 1410 to a function key having the bookmark function.
  • the controller 180 may output information 1610a, 1610b, and 1610c related to the function key to which the first touch 1410 has been applied.
  • the information 1610a, 1610b, and 1610c related to the function key to which the first touch 1410 has been applied may be information related to the bookmark function.
  • the information related to the bookmark function may include a plurality of previously stored bookmark Web browser address lists 1610a, 1610b, and 1610c.
  • the user may apply a second touch to the Web browser page currently output to the display unit 151.
  • the controller 180 may execute a function previously matched to the information related to the function key to which the first touch has been applied. For example, the controller 180 may add an item to the information list related to the function key to which the first touch has been applied.
  • the user may apply an input of dragging a page currently output to the display unit 151 to the regions 1610a, 1610b, and 1610c to which the information related to the bookmark function has been output.
  • the controller may execute a function matched to the second touch.
  • the previously matched function may be a function of adding a current Web browser address to bookmark.
  • the controller 180 may add 1610d the currently output Web browser page to the bookmark Web browser address list.
  • the controller 180 may output notification information 1600 indicating the addition of the page to the Web browser list.
  • the second touch may correspond to a function of grouping the output information.
  • a function of grouping the output information may be matched to the second touch.
  • the controller 180 may output information related to the any one function to one region of the display unit 151.
  • the any one function key may be a function key of returning from a current screen to a previous screen.
  • the information related to the any one function key may be the Web browser address list which the user has accessed previously.
  • the controller 180 may execute a function previously matched to the second touch 1420.
  • the second touch may be a drag input of moving any one of the output information to a region in which the other has been output.
  • the function previously matched to the second touch 1420 may be a function of grouping the item to which the second touch has been applied.
  • an input of dragging any one item of the Web browser address list to a different item may be applied.
  • the controller 180 may group 1470e the any one and the other information and output the same.
  • the controller 180 may group 1470e the any one item to which the second touch 1420 has been applied and the other item and output the same.
  • the second touch 1420 may correspond to a function of deleting any one of the output information.
  • a function of deleting the information, among the output information, to which the second touch 1420 is applied may be matched to the second touch 1420 in advance.
  • the controller 180 may output the Web browser page lists 1470a, 1470b, and 1470c which have been accessed before the current page.
  • a second touch may be applied to any one 1470c of the Web browser page lists 1470a, 1470b, and 1470c.
  • the second touch 1420 may be an input of dragging the any one Web browser page 1470c downward on the display unit 151.
  • the controller 180 may not output the any one Web browser page 1470c to the display unit 151.
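The editing functions described above (dragging the current page onto the list adds it, dragging one item onto another groups them, dragging an item downward deletes it) can be sketched as three list operations. All names and identifiers below are illustrative assumptions.

```python
# Hypothetical list-editing operations matched to second-touch gestures.

def add_item(items, new_item):
    """Drag the current page onto the list: append it (e.g. to bookmarks)."""
    return items + [new_item]

def group_items(items, src, dst):
    """Drag one item onto another: replace both with one grouped entry."""
    grouped = [it for it in items if it not in (src, dst)]
    grouped.append((dst, src))
    return grouped

def delete_item(items, item):
    """Drag an item downward: it is no longer output."""
    return [it for it in items if it != item]

bookmarks = ["1610a", "1610b", "1610c"]
print(add_item(bookmarks, "1610d"))
history = ["1470a", "1470b", "1470c"]
print(group_items(history, "1470b", "1470a"))
print(delete_item(history, "1470c"))
```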
  • Referring to FIG. 17, a method for controlling output information of the display unit by using a multi-touch input will be described.
  • the controller 180 may perform various types of controlling by using a multi-touch input applied to the display unit 151.
  • the controller 180 may execute a function of merging pieces of information by using the multi-touch input.
  • a memo list 1710a, 1710b, 1710c, 1710d, 1710e, and 1710f stored in a memo application may be output to the display unit 151.
  • the user may use a multi-touch input.
  • the controller 180 may sense a first touch 1410 applied to any one item 1710a in the memo list output to the display unit 151.
  • the first touch 1410 may be a long touch.
  • the controller 180 may sense a second touch 1420 applied to an item 1710b different from the any one item.
  • the second touch 1420 may be a drag touch.
  • the second touch 1420 may be a drag input moving the item 1710b to the any one item 1710a to which the first touch 1410 is applied.
  • the controller 180 may execute a function previously matched to the multi-touch input. For example, as illustrated in (b) of FIG. 17, the controller 180 may merge the item to which the first touch is applied and the item to which the second touch is applied, and store the same (1720).
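The merge described above (the long-touched item and the dragged item are combined and stored as one entry) can be sketched as a dictionary operation. The function name, memo ids, and joining convention are assumptions for illustration.

```python
# Hypothetical sketch of the memo-merge function: dragging one memo item
# onto another (while the first is long-touched) merges the two into a
# single entry and drops the separate one.

def merge_memos(memos, target, dragged):
    """Merge the `dragged` memo into `target` and remove its own entry."""
    merged = {**memos}
    merged[target] = merged[target] + "\n" + merged.pop(dragged)
    return merged

memos = {"1710a": "shopping list", "1710b": "milk, eggs"}
print(merge_memos(memos, "1710a", "1710b"))
```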
  • the foregoing method may be implemented as codes that can be read by a processor in a program-recorded medium.
  • the processor-readable medium may include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
  • the processor-readable medium also includes implementations in the form of carrier waves (e.g., transmission via the Internet).
  • the mobile terminal according to the embodiments of the present disclosure is not limited in its application of the configurations and methods, but the entirety or a portion of the embodiments can be selectively combined to be configured into various modifications.

Abstract

The invention relates to a mobile terminal allowing a touch input, and a control method thereof. The mobile terminal comprises a display unit configured to output a graphic object, a sensing unit for sensing a touch input applied to the display unit, and a controller configured to output, when a first touch input applied to the display unit and a second touch input different from the first are sensed while the first touch input is maintained, at least one different graphic object related to the graphic object whose output is maintained, in at least one region of the display unit, in response to the second touch input, while the output of the graphic object in the region to which the first touch input has been applied is maintained.
PCT/KR2014/009207 2013-10-08 2014-09-30 Mobile terminal and control method thereof WO2015053506A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201480055385.XA CN105612487B (zh) 2013-10-08 2014-09-30 移动终端及其控制方法
US14/915,838 US20160196058A1 (en) 2013-10-08 2014-09-30 Mobile terminal and control method thereof
EP14852691.6A EP3056061B1 (fr) 2013-10-08 2014-09-30 Terminal mobile

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR10-2013-0119984 2013-10-08
KR1020130119985A KR102068799B1 (ko) Mobile terminal and control method thereof
KR20130119984 2013-10-08
KR10-2013-0119985 2013-10-08
KR10-2013-0139286 2013-11-15
KR20130139286A KR20150041546A (ko) Mobile terminal and control method thereof

Publications (1)

Publication Number Publication Date
WO2015053506A1 true WO2015053506A1 (fr) 2015-04-16

Family

ID=52813285

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/009207 WO2015053506A1 (fr) 2013-10-08 2014-09-30 Terminal mobile et son procédé de commande

Country Status (1)

Country Link
WO (1) WO2015053506A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110191704A1 (en) 2010-02-04 2011-08-04 Microsoft Corporation Contextual multiplexing gestures
US20130021273A1 (en) * 2011-07-19 2013-01-24 Lg Electronics Inc. Mobile terminal and display controlling method thereof
US20130058019A1 (en) * 2011-09-06 2013-03-07 Lg Electronics Inc. Mobile terminal and method for providing user interface thereof
KR20130038753A (ko) * 2011-10-10 2013-04-18 LG Electronics Inc. Mobile terminal and method for providing user interface thereof

Similar Documents

Publication Publication Date Title
WO2018034402A1 (fr) Terminal mobile et son procédé de commande
WO2016104922A1 (fr) Dispositif électronique pouvant être porté
WO2018030594A1 (fr) Terminal mobile et son procédé de commande
WO2015068911A1 (fr) Terminal mobile et son procédé de commande
WO2015056844A1 (fr) Terminal mobile et son procédé de commande
WO2015020284A1 (fr) Terminal mobile et procédé de commande associé
WO2015122590A1 (fr) Dispositif électronique et son procédé de commande
WO2017082508A1 (fr) Terminal de type montre, et procédé de commande associé
WO2015199280A1 (fr) Terminal mobile et son procédé de commande
WO2015088166A1 (fr) Terminal mobile, et procédé de commande d'une unité d'entrée de face arrière du terminal
WO2015199270A1 (fr) Terminal mobile, et procédé de commande correspondant
WO2014157885A1 (fr) Procédé et dispositif de présentation d'une interface avec menus
WO2015167165A1 (fr) Procédé et dispositif électronique permettant de gérer des objets d'affichage
WO2015050345A1 (fr) Appareil de commande pour terminal mobile et son procédé de commande
WO2015072677A1 (fr) Terminal mobile et son procédé de commande
WO2017034116A1 (fr) Terminal mobile et procédé de commande de celui-ci
WO2012046890A1 (fr) Terminal mobile, dispositif afficheur, et procédé de commande correspondant
WO2016114444A1 (fr) Terminal mobile et son procédé de commande
WO2017052043A1 (fr) Terminal mobile et son procédé de commande
WO2015056854A1 (fr) Terminal mobile et procédé de commande du terminal mobile
WO2016032045A1 (fr) Terminal mobile et son procédé de commande
WO2014112678A1 (fr) Procédé pour fournir des conseils d'achats au moyen d'un terminal mobile, et interface d'utilisateur pour fournir des conseils d'achats au moyen d'un terminal mobile
WO2018105834A1 (fr) Terminal et procédé de commande associé
WO2017105018A1 (fr) Appareil électronique et procédé d'affichage de notification pour appareil électronique
WO2018128224A1 (fr) Terminal mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14852691

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14915838

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2014852691

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014852691

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE