EP2752751A2 - Method for controlling a terminal with a double touch gesture and terminal therefor - Google Patents

Method for controlling a terminal with a double touch gesture and terminal therefor

Info

Publication number
EP2752751A2
Authority
EP
European Patent Office
Prior art keywords
touch
touch gesture
gesture
mobile terminal
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13004583.4A
Other languages
English (en)
French (fr)
Other versions
EP2752751A3 (de)
Inventor
Eunkyung Kim
Changhee Han
Kang Lee
Taehyun Lim
Byoungkwon Roh
Kyoungjin Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of EP2752751A2
Publication of EP2752751A3
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F2203/04102 Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
    • G06F2203/04802 3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to a method of controlling a mobile terminal using a double touch gesture.
  • a terminal may be divided into a mobile terminal and a stationary terminal.
  • the mobile terminal may also be divided into a handheld terminal and a vehicle mount terminal.
  • the terminal includes various diversified functions, including voice communication, text messaging, sending emails, capturing videos, reproducing music or video files, playing games, and receiving broadcasts.
  • a user can also control the different terminal functions through touch actions performed on the display. For example, a user can zoom in or out on contents using a pinch zoom in or a pinch zoom out gesture. However, the gestures used for controlling the mobile terminal are limited.
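As a point of comparison for the pinch zoom mentioned above, the zoom factor of a two-finger pinch is conventionally derived from the change in distance between the two touch points. The following is a minimal illustrative sketch (not taken from the patent); the function name and coordinate convention are assumptions:

```python
import math

def pinch_scale(start_points, end_points):
    """Return the zoom factor implied by a two-finger pinch gesture.

    start_points and end_points are pairs of (x, y) touch coordinates at
    the beginning and end of the gesture. A factor > 1 means the fingers
    moved apart (zoom in); a factor < 1 means they moved together (zoom out).
    """
    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    d0 = distance(*start_points)
    d1 = distance(*end_points)
    if d0 == 0:
        raise ValueError("touch points must be distinct at gesture start")
    return d1 / d0
```

A gesture whose touch points start 100 px apart and end 200 px apart would thus yield a zoom factor of 2.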
  • one aspect of the present invention is to address the above noted and other problems of the related art.
  • Another aspect of the present invention is to provide a mobile terminal and corresponding control method using a double touch gesture including a second touch gesture having a direction input after a first touch gesture, and performing a service function intuitively expected for the second touch gesture.
  • the present invention provides in one aspect a mobile terminal including a wireless communication unit configured to wirelessly communicate with at least one other terminal; a touch screen display; and a controller configured to select an object displayed on the touch screen display based on a first touch gesture input with respect to the object, detect a second touch gesture input within a predetermined time after the first touch gesture, in which the second touch gesture has a dragging direction, perform a first preset function based on the second touch gesture having a first dragging direction, and perform a second preset function based on the second touch gesture having a second dragging direction different from the first direction.
  • the present invention also provides a corresponding method of controlling a mobile terminal.
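The claimed control flow can be pictured as a small state machine: a first touch gesture selects an object, and a second touch gesture arriving within a preset time window dispatches a function chosen by its drag direction. The sketch below is purely illustrative; the class name, the 0.5-second window, and the direction-to-function mapping are assumptions, since the patent leaves the concrete functions and thresholds configurable:

```python
# Illustrative time window between the two gestures (the patent only
# speaks of "a predetermined time").
DOUBLE_TOUCH_WINDOW = 0.5  # seconds

def drag_direction(start, end):
    """Classify a drag vector into one of four cardinal directions."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

class DoubleTouchController:
    """Minimal sketch of the claimed flow: a first touch gesture selects
    an object; a second touch gesture within the time window performs a
    direction-dependent preset function on it."""

    def __init__(self, direction_handlers):
        # e.g. {"up": share_fn, "down": delete_fn} - mapping is illustrative
        self.direction_handlers = direction_handlers
        self.selected = None
        self.first_touch_time = None

    def first_touch(self, obj, timestamp):
        self.selected = obj
        self.first_touch_time = timestamp

    def second_touch(self, start, end, timestamp):
        if self.selected is None:
            return None
        if timestamp - self.first_touch_time > DOUBLE_TOUCH_WINDOW:
            # Too late: the gestures are treated as unrelated.
            self.selected = None
            return None
        handler = self.direction_handlers.get(drag_direction(start, end))
        return handler(self.selected) if handler else None
```

With handlers such as "up = share" and "down = delete", an upward drag shortly after selecting a photo would invoke the share function, matching the gallery examples described in Figs. 12(b) and 12(c).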
  • Fig. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
  • Fig. 2 is an overview including multiple touch gestures which can be set in a mobile terminal according to an embodiment of the present invention
  • Fig. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention
  • Fig. 4 is an overview including service functions which can be set to a double touch gesture according to an embodiment of the present invention
  • Figs. 5(a)-(c) are display screens illustrating generating a home screen using a double touch gesture according to an embodiment of the present invention
  • Figs. 6(a)-(c) are display screens illustrating deleting a home screen using a double touch gesture according to an embodiment of the present invention
  • Figs. 7(a) and 7(b) are display screens illustrating replacing a home screen using a double touch gesture according to an embodiment of the present invention
  • Fig. 8 is a display screen illustrating moving to a quick page in a message screen according to an embodiment of the present invention.
  • Fig. 9 is a display screen illustrating moving to a quick page in an address book screen according to an embodiment of the present invention.
  • Fig. 10 is a display screen illustrating moving to a quick page in a web browsing screen according to an embodiment of the present invention
  • Fig. 11 is a display screen illustrating a quick page in a web browsing screen of a mobile terminal according to an embodiment of the present invention
  • Fig. 12(a) is a display screen illustrating moving to a quick page in a gallery according to an embodiment of the present invention
  • Fig. 12(b) is a display screen illustrating sharing a picture based on a double touch gesture in a gallery according to an embodiment of the present invention
  • Fig. 12(c) is a display screen illustrating deleting a picture based on a double touch gesture in a gallery according to an embodiment of the present invention
  • Fig. 13(a) is a display screen illustrating moving to a quick page based on a double touch gesture in a memo screen according to an embodiment of the present invention
  • Fig. 13(b) is a display screen illustrating deleting a memo based on a double touch gesture in a memo screen according to an embodiment of the present invention
  • Fig. 13(c) is a display screen illustrating writing a memo based on a double touch gesture in a memo screen according to an embodiment of the present invention
  • Fig. 14(a) is a display screen illustrating moving to a quick page based on a double touch gesture in a message screen according to an embodiment of the present invention
  • Fig. 14(b) is a display screen illustrating deleting a message based on a double touch gesture in a message screen according to an embodiment of the present invention
  • Figs. 15(a)-(c) are display screens illustrating switching from a home screen to a quick page according to an embodiment of the present invention
  • Fig. 16 is a display screen illustrating displaying a new message writing window in a message list screen according to an embodiment of the present invention
  • Fig. 17 is a display screen illustrating adding new address book information to an address book list screen according to an embodiment of the present invention.
  • Fig. 18 is a flowchart illustrating adding a new template in a document template according to an embodiment of the present invention.
  • Fig. 19 is a flowchart illustrating a photographing mode in a gallery according to an embodiment of the present invention.
  • a mobile terminal described herein may include a portable phone, a smart phone, a pad- or note-type device, a tablet PC, a laptop computer, a digital broadcast terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and a navigation device.
  • the present invention is applicable to a stationary terminal such as a digital TV or a desktop computer.
  • Fig. 1 is a block diagram of a mobile terminal 100 according to an embodiment of the present invention.
  • the mobile terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190.
  • a mobile terminal having more elements or fewer elements may be also implemented.
  • the mobile terminal 100 may include a multi mode portable terminal which respectively connects to communication networks according to at least two communication methods or at least two operators and a multi standby portable terminal which simultaneously connects to communication networks according to at least two communication methods or at least two operators.
  • the multi standby mobile terminal is a portable terminal which simultaneously connects to three communication networks selected from a plurality of communication methods including, for example, Code Division Multiple Access (CDMA), Global System for Mobile communications (GSM), Wideband Code Division Multiple Access (WCDMA), or Wireless broadband (Wibro) or, in the case of Korea, simultaneously connects to three communication networks selected from a plurality of operators including, for example, SKT, KTF, and LGT.
  • the wireless communication unit 110 may include at least one module which enables a wireless communication between the mobile terminal 100 and a wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
  • the wireless communication unit 110 includes a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast management server refers to a server which generates and transmits the broadcast signal and/or the broadcast related information or a server which receives an already generated broadcast signal and/or broadcast related information and transmits the already generated broadcast signal and/or broadcast related information to the terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal but also a broadcast signal having a form in which the data broadcast signal is coupled to the TV broadcast signal or the radio broadcast signal.
  • the broadcast related information may indicate information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may be provided through a mobile communication network. In this instance, the broadcast related information may be received by the mobile communication module 112.
  • the broadcast related information may exist in a form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
  • the broadcast receiving module 111 may receive a digital broadcast signal using a digital broadcast system such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), or Integrated Services Digital Broadcast-Terrestrial (ISDB-T).
  • the broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may also be stored in the memory 160.
  • the mobile communication module 112 transmits and receives a wireless signal to/from at least one of a base station, an external terminal, and a server on the mobile communication network.
  • the wireless signal may include a voice call signal, a video call signal, or data in various forms for transmitting and receiving text/multimedia messages.
  • the wireless internet module 113 refers to a module for wireless internet connection and may be internal or external to the mobile terminal 100.
  • a wireless internet technology such as Wireless LAN (WLAN/Wi-Fi), Wireless broadband (Wibro), Worldwide Interoperability for Microwave Access (WiMAX), or High Speed Downlink Packet Access (HSDPA) may be used.
  • the short range communication module 114 refers to a module for a short range communication.
  • a short range communication technology such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), or ZigBee may be used.
  • the location information module 115 is a module for obtaining a location of the mobile terminal; a representative example is a global positioning system (GPS) module.
  • the audio/video (A/V) input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 122.
  • the camera 121 processes video frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode.
  • the processed video frame may be displayed on the display unit 151.
  • the video frame processed by the camera 121 may be stored in the memory 160 or externally transmitted through a wireless communication unit 110. Two or more cameras 121 may be included depending on a user environment.
  • the microphone 122 receives an external sound signal in a call mode, a recording mode, or a voice recognition mode and processes the sound signal into electrical voice data.
  • the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output.
  • various noise removal algorithms for removing noise generated while receiving the external sound signal may be implemented.
  • the user input unit 130 generates input data by which a user controls an operation of the terminal.
  • the user input unit 130 may include, for example, a key pad, a dome switch, a touch pad (constant voltage/constant current), a jog wheel, or a jog switch.
  • the user input unit 130 may include an identification module selection switch for generating a selection signal for selecting a certain identification module among a plurality of identification modules.
  • the sensing unit 140 can detect a current state of the mobile terminal 100 such as an opening/closing state of the mobile terminal 100, a location of the mobile terminal 100, whether contacted by the user, an orientation of the mobile terminal, or an acceleration/deceleration of the mobile terminal to generate a sensing signal for controlling an operation of the mobile terminal 100.
  • the sensing unit 140 may include, for example, a touch sensor 141 and a proximity sensor 142.
  • the touch sensor 141 may form an interlayer structure (hereinafter referred to as a "touch screen") with the display unit 151.
  • the touch sensor 141 may be configured to convert a pressure applied to a specific part of the display unit 151 or a change in capacitance generated at the specific part of the display unit 151 into an electrical input signal.
  • the touch sensor 141 may be configured to detect not only a touched location and area but also a pressure by a touch.
  • when there is a touch input to the touch sensor 141, a signal (or signals) corresponding thereto is transmitted to a touch controller.
  • the touch controller processes the signal (or signals) and then transmits a corresponding data to the controller 180.
  • the controller 180 can determine which area of the display unit 151 is touched.
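The step in which the controller determines which area of the display was touched amounts to hit-testing the reported coordinates against the on-screen layout. A minimal sketch follows; the region names and rectangle layout are invented for illustration and do not come from the patent:

```python
def hit_test(touch_point, regions):
    """Return the name of the first display region containing the touch.

    touch_point is an (x, y) coordinate reported by the touch controller.
    regions is a list of (name, (x, y, width, height)) rectangles in
    display coordinates, ordered by priority.
    """
    tx, ty = touch_point
    for name, (x, y, w, h) in regions:
        if x <= tx < x + w and y <= ty < y + h:
            return name
    return None  # touch fell outside every known region
```

For example, on a hypothetical 480-pixel-wide layout with a 50-pixel status bar above a home area, a touch at (150, 30) resolves to the status bar and one at (150, 300) to the home area.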
  • the proximity sensor 142 may be disposed in an internal area of the mobile terminal surrounded by the touch screen or near the touch screen.
  • the proximity sensor 142 refers to a sensor which detects, without mechanical contact, the existence of an object approaching or in proximity to a predetermined detection surface using the force of an electromagnetic field or infrared light. Further, the proximity sensor 142 has a longer lifespan than a contact sensor and has higher utility.
  • Examples of the proximity sensor 142 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared light proximity sensor.
  • when the touch screen is capacitive, it is configured to detect the proximity of a pointer by a change in an electric field due to the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
  • a “proximity touch” refers to an act of rendering a pointer which does not contact the touch screen but approaches the touch screen to be recognized as being located on the touch screen.
  • a “contact touch” refers to an act of actually contacting the pointer on the touch screen.
  • the location of a proximity touch on the touch screen is the location to which the pointer vertically corresponds with respect to the touch screen during the proximity touch.
  • the proximity sensor 142 detects the proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch location, a proximity touch movement, etc.). Information corresponding to the detected proximity touch and the proximity touch pattern may be output on the touch screen.
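The distinction drawn above between a proximity touch and a contact touch can be sketched as a simple classification on the pointer's distance from the screen. The thresholds below are illustrative assumptions (a real capacitive panel senses field change rather than a literal distance):

```python
def classify_touch(distance_mm, contact_threshold_mm=0.0, proximity_range_mm=20.0):
    """Classify pointer distance from the screen into touch types.

    distance_mm: pointer distance from the screen surface.
    A pointer at or below the contact threshold is a contact touch; one
    within the proximity range is a proximity touch; beyond that, no touch.
    """
    if distance_mm <= contact_threshold_mm:
        return "contact touch"
    if distance_mm <= proximity_range_mm:
        return "proximity touch"
    return "no touch"
```

A proximity touch pattern, as described above, would additionally track how this classification and the pointer's location evolve over time (distance, direction, speed, and movement).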
  • the output unit 150 is used to generate an output related to sight, hearing, or touch, and the output unit 150 may include the display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.
  • the display unit 151 displays (outputs) information processed in the mobile terminal 100. For example, when the mobile terminal is in the call mode, a user interface (UI) or a graphic user interface (GUI) related to the call is displayed. When the mobile terminal 100 is in the video call mode or the photographing mode, a photographed and/or received image, a UI, or a GUI is displayed.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.
  • Some displays may be formed in a transparent or a light transmissive type such that an outside can be seen therethrough.
  • This type of display is called a transparent display, and a representative example of the transparent display is the transparent OLED (TOLED).
  • a rear structure of the display unit 151 may also be configured in the light transmissive type. Through this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.
  • two or more display units 151 may exist.
  • a plurality of displays may be disposed on one surface, either separated or integrated, or disposed respectively on different surfaces.
  • the sound output module 152 can output audio data received from the wireless communication unit 110 or stored in the memory 160 upon receipt of a call signal or in the call mode, the recording mode, the voice recognition mode, or the broadcast receiving mode.
  • the sound output module 152 may output a sound signal related to a function (e.g., a call signal receipt sound, a message receipt sound, etc.) which is performed by the mobile terminal 100.
  • the sound output module 152 may include a receiver, a speaker, or a buzzer.
  • the alarm unit 153 outputs a signal for notifying an event generation of the mobile terminal 100.
  • An example of an event generated in the mobile terminal includes a call signal receipt, a message receipt, a key signal input, and a touch input.
  • the alarm unit 153 can output a signal other than a video signal or an audio signal, e.g., a signal for notifying the event generation by vibration.
  • the video signal or the audio signal may be output through the display unit 151 or the sound output module 152. Therefore, the display unit 151 or the sound output module 152 may be classified as a part of the alarm unit 153.
  • the haptic module 154 generates various touch effects which can be felt by the user.
  • a representative example of a touch effect generated by the haptic module 154 is vibration.
  • a strength and a pattern of the vibration of the haptic module 154 may be controlled. For example, different vibrations may be synthesized to be output or sequentially output.
  • the haptic module 154 may generate various touch effects other than vibration, such as the stimulus of a pin arrangement moving vertically against the contacted skin surface, an injection or suction force of air through an injection or suction hole, brushing against a skin surface, contact of an electrode, or an electrostatic force, as well as the effect of reproducing coldness and warmth using an element capable of absorbing or generating heat.
  • the haptic module 154 can not only transmit the touch effect through direct contact but can also convey the touch effect through the muscle sense of, for example, a user's finger or arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.
  • the memory 160 can store a program for operating the controller 180 and may temporarily store input/output data (e.g., an address book, messages, still images, videos, etc.).
  • the memory 160 can also store data on the various patterns of vibration and sound output upon touch input on the touch screen.
  • the memory 160 may include at least one type of a storage medium among a flash memory type memory, a hard disk type memory, a multimedia card micro type memory, a card type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optic disk.
  • the terminal 100 may operate in association with a web storage which performs the storage function of the memory 160 on the Internet.
  • the interface unit 170 serves as a passage to all external devices connected to the mobile terminal 100.
  • the interface unit 170 receives data from an external device, receives power and transmits it to each element within the mobile terminal 100, or transmits data from within the mobile terminal 100 to the external device.
  • a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting to an apparatus having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port may be included in the interface unit 170.
  • An identification module is a chip storing various information for authenticating a user's access to the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
  • the apparatus having the identification module (hereinafter, "identification apparatus") may be manufactured in a smart card form. Therefore, the identification apparatus may be connected to the terminal 100 through a port.
  • the interface unit 170 may be used as a passage for supplying a power from a cradle to the mobile terminal 100 when the mobile terminal 100 is connected to an external cradle or a passage for transmitting various command signals input from the cradle by the user to the mobile terminal.
  • the various command signals or the power input from the cradle may serve as a signal for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
  • the controller 180 controls an overall operation of the mobile terminal.
  • the controller 180 can perform control and processing related to the voice call, the data communication, or the video call.
  • the controller 180 may include a multimedia module 181 for reproducing a multimedia.
  • the multimedia module 181 may be implemented within the controller 180 and may be implemented separately from the controller 180.
  • the controller 180 can also recognize a double touch gesture through the user input unit 130 to perform various functions of the terminal.
  • the double touch gesture indicates two or more touch gestures which are input in association with each other.
  • the double touch gesture according to an embodiment of the present invention includes touch gestures which are distinguished from a pinch zoom-in or pinch zoom-out in that the touch gestures are associated with each other and are input at two points with a time difference.
  • the controller 180 can receive the double touch gesture through the user input unit 130 and generate a new content or delete a content in the home screen according to a corresponding double touch gesture.
  • the controller 180 can also perform a function set to a second touch gesture when the second touch gesture is input while a first touch gesture with respect to an object displayed on the display unit 151 of the terminal is maintained.
  • the second touch gesture has a direction and is input after the first touch gesture, which holds the object, has been maintained for a predetermined time.
  • the user can intuitively perform various functions of the terminal in association with the first and second touch gestures.
  • the first and second touch gestures can be selected in a variety of ways.
  • the second touch gesture has a direction to implement an intuitive user interface.
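As a sketch of the timing rule described above, the following illustrative recognizer accepts a directional second touch only after the first (holding) touch has been maintained for a predetermined time. The class name, method names, and the 0.5 s threshold are assumptions for illustration, not details taken from the patent:

```python
# Illustrative sketch: a directional second touch is only accepted after the
# first (holding) touch has been maintained for a predetermined time.
HOLD_THRESHOLD = 0.5  # seconds; an assumed value for the "predetermined time"

class DoubleTouchRecognizer:
    def __init__(self):
        self.first_down_at = None  # timestamp of the first (holding) touch

    def first_touch_down(self, t):
        """Record when the first touch began (it holds the object)."""
        self.first_down_at = t

    def second_touch(self, t, dx, dy):
        """Return the direction of the second gesture, or None if the first
        touch has not been held long enough to form a double touch gesture."""
        if self.first_down_at is None or t - self.first_down_at < HOLD_THRESHOLD:
            return None
        # Classify by the dominant axis of movement (screen coordinates:
        # positive dy points downward).
        if abs(dx) >= abs(dy):
            return "right" if dx > 0 else "left"
        return "down" if dy > 0 else "up"
```

Because the two touches are separated in time and the second has a direction, such a recognizer can distinguish the double touch gesture from a simultaneous two-finger pinch.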
  • the power supply unit 190 receives an external power and an internal power according to a control of the controller 180 and supplies a power needed for operating each element.
  • Fig. 2 is an overview including touch gestures which can be set in a mobile terminal according to an embodiment of the present invention.
  • the controller 180 can recognize various touch gestures through the user input unit 130.
  • the touch gestures include 'tap', 'drag', 'flick', 'swipe', 'double tap', 'touch and hold', and 'slicing'.
  • the 'tap' touch gesture is a basic gesture which touches a user interface element on the screen and refers to a motion which contacts a finger or a stylus on the screen for a short time.
  • the 'tap' is similar to a 'single click' of a mouse.
  • the 'drag' touch gesture can be used to scroll the screen in a specific direction while maintaining the drag touch on the screen. For example, a drag gesture may be used to scroll the screen in an up and down direction.
  • a 'flick' touch gesture is similar to the drag touch gesture but is done in a 'flicking manner' in a shorter time period.
  • the flick gesture is used to quickly scroll the screen upward and downward in the user interface element which can be scrolled.
  • a 'swipe' touch gesture is similar to the touch and drag, but the user drags the finger or the stylus across the touch screen more forcefully.
  • the swipe gesture is used, for example, to reveal a hidden delete menu corresponding to each item in a list.
  • a 'double tap' touch gesture means two continuous tap motions and is used, for example, to zoom in on content or an image so that it is centered, or for a zoom out function.
  • the multimedia mapped to a corresponding pattern may be executed by performing a double tap gesture on a part of the pattern.
  • a 'touch and hold' touch gesture corresponds to the user touching and holding the touch on the touch screen by the finger or the stylus.
  • the 'touch and hold' touch gesture is also referred to as 'long press.' The user may draw a desired pattern on the screen of the touch pad by using the touch and hold touch gesture.
  • the 'touch and hold' touch gesture is used for a function of displaying an enlarged view of a location of a cursor in an editable text.
  • the 'touch and hold' touch gesture may be used to overlap a specific pattern with another pattern or connect the specific pattern with another pattern.
  • a 'slicing' touch gesture is a gesture similar to cutting a fruit in a quick slicing motion. Further, the controller 180 can recognize a multi touch on the touch screen. In this instance, the touch gestures may further include 'pinch open' and 'pinch close' touch gestures.
  • a 'pinch open' touch gesture is a touch gesture in which two fingers touch the touch screen and are spread apart as if expanding the screen.
  • the pinch open touch gesture is used to zoom in a map screen in a map view.
  • the double tap touch gesture performs an automatic zoom at a predetermined ratio, whereas the pinch open gesture adjusts the zoom-in level based on the extent to which the user spreads the fingers apart.
  • a 'pinch close' touch gesture is a gesture which is opposite to the pinch open touch gesture and corresponds to a touch gesture in which two fingers touch the touch screen and are placed close together as if shrinking the screen.
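The pinch open/close distinction above reduces to comparing the distance between the two touch points before and after the movement. A minimal sketch, with the function name and tolerance value assumed for illustration:

```python
import math

def classify_pinch(p1_start, p2_start, p1_end, p2_end, tolerance=5.0):
    """Classify a two-finger gesture as 'pinch open' (fingers spread apart,
    zoom in) or 'pinch close' (fingers brought together, zoom out) by
    comparing the inter-finger distance before and after the move."""
    d0 = math.dist(p1_start, p2_start)  # distance when the fingers land
    d1 = math.dist(p1_end, p2_end)      # distance when the fingers stop
    if d1 > d0 + tolerance:
        return "pinch open"
    if d1 < d0 - tolerance:
        return "pinch close"
    return None  # movement too small to classify
```

The ratio d1 / d0 would likewise give the continuous zoom level that the pinch open gesture adjusts.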
  • Various embodiments described herein may be embodied by, for example, a recording medium readable by a computer or a similar device thereof by using software, hardware, or a combination thereof.
  • embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and other electrical units for performing functions. In some cases, the embodiments may be implemented by the controller 180.
  • embodiments such as a process or a function may be implemented together with a separate software module for performing at least one function or operation.
  • a software code may be implemented by a software application written in an appropriate programming language. The software code may be stored in the memory 160 and executed by the controller 180.
  • Fig. 3 is a flowchart illustrating a method of controlling a mobile terminal using a double touch gesture according to an embodiment of the present invention.
  • the controller 180 displays screen information for performing a service on the display unit 151 (S1).
  • the screen information may include various objects used for a service to be performed.
  • the controller 180 may display a home screen, a message screen, a document screen, a gallery screen, a dialogue screen, etc. on the display unit 151.
  • the controller 180 then receives the first touch gesture with respect to a certain object displayed on the display unit 151 via the user input unit 130 (S2).
  • the first touch gesture may include a touch on the certain object displayed on the display.
  • the controller 180 selects an object located at a point at which the touch is made by the first touch gesture on the screen (S3).
  • the controller 180 receives the second touch gesture having a direction through the user input unit 130 after a predetermined amount of time while the first touch gesture is maintained on the screen (S4). The controller 180 then performs a preset service function corresponding to the second touch gesture input through the user input unit (S5).
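Steps S3 and S5 above can be sketched as a lookup of the held object followed by dispatch to the preset service function. The display and input steps (S1, S2, S4) are abstracted away, and the lookup tables are illustrative assumptions, not structures from the patent:

```python
# Sketch of steps S3 and S5 from Fig. 3: select the object under the first
# touch, then run the service function preset for the second gesture.
def handle_double_touch(screen_objects, first_touch_point, second_gesture, handlers):
    obj = screen_objects.get(first_touch_point)  # S3: select the touched object
    handler = handlers.get(second_gesture)       # S5: the preset service function
    if obj is None or handler is None:
        return None  # no object under the first touch, or no function bound
    return handler(obj)
```

A usage sketch: binding a rightward second gesture to an "add" function and invoking it on the object held at an assumed screen point.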
  • Fig. 4 is an overview illustrating service functions of the mobile terminal 100, which can be set to a double touch gesture according to an embodiment of the present invention.
  • the controller 180 can set various service functions in association with various gestures described in Fig. 2 .
  • the service function which can be set includes a service which generates, deletes, or controls new objects.
  • the user may want to generate or delete a new content in the home screen.
  • the user can do this using the defined double touch gesture.
  • the user can also select which service function is to be executed based on the first and second touch gestures. Specifically, what touch gestures are combined to set the double touch gesture can be set according to user preference, operator policy, etc.
  • the controller 180 can also select the object on the screen based on the received first touch gesture. Next, the controller 180 can receive the second touch gesture after a predetermined time while selection of the corresponding object and the touch of the first touch gesture are maintained.
  • the second touch gesture has a direction and can be input by the user's other hand.
  • the controller 180 can then perform a service function of the terminal set to the second touch gesture.
  • the first touch gesture includes selecting a specific object and maintaining a touch during a predetermined time. That is, a drag is not necessarily included in the first touch gesture but may be optional.
  • the controller 180 can also hold the object located at a point at which the corresponding first touch gesture is input. Also, the controller 180 can temporarily fix a display screen when the first touch gesture is maintained.
  • the controller 180 receives the second touch gesture through the user input unit 130.
  • the second touch gesture provides an intuitive user interface (UI) which performs a content generation service function of adding new content between preset contents.
  • the added new content may be a content associated with a content of a function currently performed.
  • the content generation service may include adding a home screen, creating a document, adding a new picture in a gallery, writing a text message, and adding a new contact in an address book.
  • other functions that can be performed by the second touch gesture in combination with the first touch gesture include deletion or replacement of the content currently provided. Namely, the home screen may be deleted or replaced based on a double touch gesture operation when switching through home screens.
  • the controller 180 may gather and identify various contents from web browsing, the gallery, the text, the address book (phone book), etc. to one page through the double touch gesture.
  • a page in which the various contents are gathered may be referred to as a quick page, a scrap board, or a screen capture page.
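The per-screen bindings described in the figures that follow can be collected into a dispatch table; the patent notes that which gestures map to which functions can be set by user preference or operator policy, so the entries below are an illustrative summary rather than a fixed specification:

```python
# Illustrative summary of second-gesture bindings per screen context.
# The exact bindings are configurable per the patent, not fixed.
SECOND_GESTURE_BINDINGS = {
    "home_screen": {"right": "add home screen", "up": "delete home screen",
                    "down": "delete home screen", "left": "replace home screen"},
    "gallery":     {"right": "move to quick page", "up": "share picture",
                    "down": "delete picture"},
    "memo":        {"right": "move to quick page", "down": "delete memo",
                    "right-up": "open new memo"},
    "message":     {"right": "move to quick page", "down": "delete message"},
}

def resolve_action(screen, direction):
    """Return the service function name bound to this screen/direction pair,
    or None if nothing is bound."""
    return SECOND_GESTURE_BINDINGS.get(screen, {}).get(direction)
```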
  • FIGs. 5(a)-(c) include display screens illustrating an operation of generating a home screen using a double touch gesture in a home screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 displays a home screen. Then, as shown in Fig. 5(b) , the user performs a first touch gesture using their first finger 1 on the home screen, and then performs the second touch gesture using a second finger 2 on their other hand (while maintaining the first touch gesture).
  • the controller 180 executes or performs a home screen adding function based on the first and second touch gestures while the first touch gesture is maintained.
  • the second touch gesture is a touch and drag or flicking operation and occurs after a predetermined time from the first touch gesture.
  • the second touch gesture has a right direction.
  • the user can intuitively know an adding operation is being performed because the gesture is in the rightward direction.
  • a third home screen 5 is added between the first home screen 3 and the second home screen 4 ( FIG. 5(c) ).
  • a home screen page indicator 5a displayed at a lower portion of the first home screen 3 indicates the third home screen is currently displayed.
  • the home screen page indicator 5b shown in FIG. 5(c) shows a home screen 5 is newly added between the first and second home screens 3 and 4 (i.e., five total home screen pages are now provided).
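The page-adding behavior of Figs. 5(a)-(c) amounts to a list insertion after the held page, with the page indicator simply reflecting the new list length. A minimal sketch (function name and empty-page representation are assumptions):

```python
def add_home_screen(pages, held_index):
    """Insert a new, empty home screen page after the held page and return
    the index at which it appears (as in Figs. 5(a)-(c), where a third
    screen is added between the first and second)."""
    new_page = []  # an empty page; its contents are illustrative
    pages.insert(held_index + 1, new_page)
    return held_index + 1
```

Deleting or replacing a page, as in Figs. 6 and 7, would be the corresponding `pop` or element swap on the same list.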
  • Figs. 6(a)-(c) are display screens illustrating an operation of deleting a home screen based on a double touch gesture performed in a home screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 can perform a delete function based on an upward or downward touch and drag second gesture.
  • the upward or downward direction touch and drag gestures are also performed after a predetermined time from the first touch gesture while the user maintains the first touch gesture with their first finger 1.
  • when the user performs a flicking gesture in an upward or downward direction on the second home screen 4 using their second finger 2, after a predetermined time from the first touch gesture and while maintaining the first finger 1 on the first home screen 3 during the home screen switching action, the controller 180 performs a function of deleting the second home screen 4 and displays the next home screen 6.
  • the second home screen 4 is deleted from the original five home screen pages such that, finally, four home screen pages remain.
  • Figs. 7(a) and 7(b) are display screens illustrating an operation of replacing a home screen based on a double touch gesture performed on a home screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 can replace a home screen when the user performs a left direction gesture with their second finger 2 after a drag and hold gesture using their first finger 1.
  • the left direction gesture is also performed after a predetermined time from the first touch gesture while the first touch gesture is maintained.
  • the controller 180 performs a function of replacing the first home screen 3 and the second home screen 4.
  • although the flicking gesture by the second finger 2 is described herein, the present invention is not limited thereto and the gesture may instead be a drag gesture.
  • the first touch gesture includes the user touching the home screen initially displayed and then dragging the touch to switch between home screens.
  • the controller 180 can determine that a home screen switching process is occurring. Further, the user can then pause the first touch gesture and just hold the first touch gesture at a point between two home screens as shown in the figures. Then, the user can perform the second touch gesture to have the controller 180 execute specific defined functions.
  • the user can also simply touch a point on the displayed home screen as the first touch gesture (rather than the touch, drag and hold operation), and then perform the second touch gesture to execute the desired function.
  • Fig. 8 is a display screen illustrating an operation of moving to a quick page in a message screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a quick page moving function based on the user's right direction gesture with their second finger 2.
  • the right direction gesture is also performed after a predetermined time from the first touch gesture while the touch of the first finger 1 is maintained.
  • the controller 180 can generate a quick page function.
  • a quick page is generated (see Fig. 11, for example).
  • the controller 180 can immediately display the generated quick page or can store generated quick pages in the memory 160 so the user can access the quick pages later.
  • the quick pages can also be transmitted to another terminal.
  • Fig. 9 is a display screen illustrating an operation of moving to a quick page in an address book screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a quick page function based on the user performing the right direction gesture with their second finger 2.
  • the right direction gesture is also performed after a predetermined time from the first touch gesture while the first finger 1 is maintained on the touch screen.
  • the controller 180 performs the quick page function.
  • Fig. 10 is a display screen illustrating an operation of moving to a quick page in a web browsing screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a quick page function when the user performs a right direction gesture using their second finger 2.
  • the right direction gesture is also performed after a predetermined time from the first touch gesture while maintaining the touch on the screen with the first finger 1 is maintained.
  • the controller 180 then performs the quick page function.
  • the second touch gesture in combination with the first touch gesture described above can be used to delete an object on the screen such as the home screen, webpage, etc.
  • Fig. 11 is a display screen illustrating a quick page in a web browsing screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 displays a quick page screen 10 including web browsing information selected through the quick page function described in Fig. 10 .
  • summary information 10a of web browsing screens sent to the quick page is displayed.
  • the summary information 10a may include a date, a content source, and summary contents of the content.
  • the identification information 10b may include information for identifying a category including a corresponding content.
  • the identification information 10b may be an icon including text, a figure, or a color identifying the source which provides the corresponding content.
  • the capture screen 10c also provides the user with a preview of the corresponding content. Accordingly, the user can recall the corresponding content through the capture screen 10c or determine whether to select it.
  • Fig. 12(a) is a display screen illustrating an operation of moving to a quick page in a gallery of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs the quick page function based on the user performing the right direction gesture using their second finger 2.
  • the right direction gesture is also performed after a predetermined time from the first touch gesture while the user maintains the first touch gesture.
  • the quick page function is performed.
  • Fig. 12(b) is a display screen illustrating an operation of sharing a picture based on a double touch gesture in a gallery of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a sharing function based on the user performing an upward direction gesture using their second finger 2.
  • the upward direction gesture is performed after a predetermined time from the first touch gesture while the first touch gesture is maintained.
  • the controller 180 performs a file sharing function of the corresponding picture.
  • Fig. 12(c) is a display screen illustrating an operation of deleting a picture based on a double touch gesture in a gallery of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a delete function based on the user performing a downward direction gesture using their second finger 2.
  • the downward direction gesture is also performed after a predetermined time from the first touch gesture while the first touch gesture is maintained.
  • the controller 180 performs the delete function.
  • Fig. 13(a) is a display screen illustrating an operation of moving to a quick page based on a double touch gesture in a memo screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a quick page function based on the user performing a right direction gesture using their second finger 2.
  • the right direction gesture is also performed after a predetermined time from the first touch gesture while the user maintains the first touch gesture.
  • the controller 180 performs the quick page function.
  • Fig. 13(b) is a display screen illustrating an operation of deleting a memo based on a double touch gesture in a memo screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs the delete function based on the user performing the downward direction gesture using their second finger 2.
  • the downward direction gesture is also performed after a predetermined time from the first touch gesture while the first touch gesture is maintained.
  • the controller 180 performs a delete function of the corresponding memo.
  • Fig. 13(c) is a display screen illustrating an operation of deleting an old memo and writing a new memo based on a double touch gesture in a memo screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a new writing window opening function based on the user performing a right and upward direction gesture using their second finger 2.
  • the right and upward direction gesture is also performed after a predetermined time from the first touch gesture while the first touch gesture is maintained.
  • the controller 180 performs an animation of tearing off the memo screen 12 corresponding to a direction of the gesture in the corresponding memo screen 12 and then displays a new memo screen 12.
  • Fig. 14(a) is a display screen illustrating an operation of moving to a quick page based on a double touch gesture in a message screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a quick page moving function based on the user performing the right direction gesture with their second finger 2.
  • the controller 180 performs the quick page function.
  • the controller 180 performs the delete function based on the user performing the downward direction gesture using their second finger 2.
  • the downward direction gesture is also performed after a predetermined time from the first touch gesture while the first touch gesture is maintained.
  • the controller 180 performs the delete function of the corresponding message.
  • Figs. 15(a)-(c) are display screens illustrating an operation of switching from a home screen to a quick page of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs a function of switching to the quick page based on the user performing a double tap gesture with their second finger 2.
  • when the user touches the first finger 1 on a certain point in the home screen 14 and performs the double tap gesture using their second finger 2, the controller 180 performs a function of switching to a quick page collection page per content, as shown in Fig. 15(b) .
  • a quick page collection per content screen 15 may be formed by classifying the contents of the quick page into separate folders according to each content type.
  • the quick page collection per content screen 15 may include a text message folder 15a, a gallery folder 15b, a web page folder 15c, a video folder 15d, and an address book folder 15e.
  • the text message folder 15a may include contents collected into the quick page by a user's choice from the text message.
  • the gallery folder 15b may include contents collected into the quick page by the user's choice from a photo content.
  • the web page folder 15c may include contents collected into the quick page by the user's choice from a web page content.
  • the video folder 15d may include contents collected into the quick page by the user's choice from a video content.
  • the address book folder 15e may include contents collected into the quick page by the user's choice from an address book content.
  • the quick page collection per content screen 15 may display, in each folder, identification information which may identify contents of a corresponding folder.
  • the identification information may include an icon for identifying a content of the corresponding information.
  • the text message folder 15a may include an icon which symbolizes a text message
  • the gallery folder 15b may include an icon which symbolizes a gallery
  • the web page folder 15c may include an icon which symbolizes a web page
  • the video folder 15d may include an icon which symbolizes a video
  • the address book folder 15e may include an icon which symbolizes an address book.
  • the function of switching to the quick page screen may be performed by the double tap gesture and the first touch gesture.
  • when a certain quick page screen is selected in the quick page collection per content screen 15 shown in Fig. 15(b) , the display may switch to the corresponding quick page screen 16 shown in Fig. 15(c) .
  • contents collected into the quick page by the user's choice among the web page contents are collected in the quick page screen 16.
  • Fig. 16 is a display screen illustrating an operation of displaying a new message writing window in a message list screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 performs an animation which separates message rows upward and downward and displays a new message writing window.
  • the controller 180 creates a new message writing window.
  • the new message writing window may be created between a message row held with the first finger 1 and a message row touched with the second finger 2. The user can then create a new message by writing on the new window.
  • Fig. 17 is an example screen view for explaining an operation of adding new address book information to an address book list screen of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 displays the new address book information input window.
  • the controller 180 creates a new address book information input window.
  • the new address book information input window may also be created between an address book row held using their first finger 1 and an address book row touched using their second finger 2. The user can then input a new contact on the new address book input window.
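The row-insertion behavior of Figs. 16 and 17 — opening a new editable row between the row held by the first finger and the adjacent row touched by the second finger — can be sketched as a list insertion at the boundary between the two indices (function and variable names are assumptions):

```python
def insert_row_between(rows, held_index, touched_index, new_row):
    """Insert a new editable row between the row held by the first finger
    and the adjacent row touched by the second finger, as described for the
    message list and address book screens. Returns the insertion index."""
    # The new row lands at the boundary between the two adjacent rows,
    # i.e. just before the lower of the two.
    insert_at = max(held_index, touched_index)
    rows.insert(insert_at, new_row)
    return insert_at
```

The same pattern covers the new document template of Fig. 18, with template sections in place of list rows.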
  • Fig. 18 is a flowchart illustrating an operation of adding a new template in a document template of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 displays a document template on the display unit 151 of the mobile terminal 100 (S11). Then, the user performs the first touch gesture using their first finger (S12). The first touch gesture may be the touch and hold gesture. The mobile terminal 100 then selects an object touched by the first touch gesture (S13).
  • the user performs a downward touch gesture which tears off another object using the second finger while maintaining the first touch gesture (S14).
  • the mobile terminal 100 displays the new document template (S15).
  • when the document template is displayed and the user touches a certain point using their first finger 1 and performs a downward direction gesture which touches and tears off a point below the corresponding point using their second finger 2, the controller creates a new document template. Further, the new document template may be created between the point held by the first finger and the point touched by the second finger. The user can also input new content in the new document template.
  • Fig. 19 is a flowchart illustrating an operation of performing a photographing mode in a gallery of a mobile terminal according to an embodiment of the present invention.
  • the controller 180 displays pictures on the gallery screen of the mobile terminal 100 (S21). Further, the user performs an input of the first touch gesture with their first finger with respect to a certain picture on the gallery screen (S22). The first touch gesture may be the touch and hold gesture. The mobile terminal 100 then selects a picture touched by the first touch gesture (S23).
  • While the first touch gesture of the first finger is maintained, the user inputs a rightward touch gesture using their second finger, which separates another picture (S24).
  • The mobile terminal 100 then displays the photographing mode for taking a new picture (S25).
  • That is, the controller 180 executes a photographing mode, which may be displayed between the picture area held by the first finger and the picture area touched by the second finger. The user can thus take new pictures in the photographing mode.
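The gallery flow of Fig. 19 (S21–S25) differs from the Fig. 18 flow mainly in the direction of the second finger's drag. One way to sketch this is a dispatch table keyed on drag direction; the action names and threshold below are illustrative assumptions, not the patent's implementation.

```python
from typing import Optional

# Hypothetical mapping from the second finger's drag direction to the
# action performed while the first finger holds an item.
ACTIONS = {
    "down": "create_new_template",       # Fig. 18: tear downward -> new template
    "right": "open_photographing_mode",  # Fig. 19: drag right -> camera mode
}

DRAG_THRESHOLD_PX = 40.0  # assumed minimum drag distance


def classify_drag(dx: float, dy: float) -> Optional[str]:
    """Return the dominant drag direction once it exceeds the threshold."""
    if abs(dx) < DRAG_THRESHOLD_PX and abs(dy) < DRAG_THRESHOLD_PX:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"


def on_second_finger_drag(dx: float, dy: float) -> Optional[str]:
    """Dispatch the action for the second finger's drag (S24 -> S25)."""
    direction = classify_drag(dx, dy)
    return ACTIONS.get(direction) if direction else None
```

For example, `on_second_finger_drag(80.0, 10.0)` would select the photographing mode, while a downward drag would select the new-template action of Fig. 18.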
  • According to embodiments of the present invention, a gesture using two points on the screen is clearly defined. The present invention therefore provides a user interface based on a gesture that is intuitive to the user, yet distinguishable from the pinch zoom-in and pinch zoom-out gestures.
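One way to read the distinction from pinch zoom: in a pinch, both fingers move, while in the double touch gesture the first finger stays roughly put and only the second finger drags. A minimal classification sketch under that reading, with an assumed stationary tolerance:

```python
STATIONARY_TOLERANCE_PX = 10.0  # assumed tolerance for the "held" finger


def classify_two_finger_gesture(travel1: float, travel2: float) -> str:
    """Distinguish the hold-and-drag double touch gesture from a pinch.

    travel1 and travel2 are each finger's total travel distance in
    pixels. In a pinch zoom, both fingers move; in the double touch
    gesture, one finger stays put while the other drags.
    """
    moved1 = travel1 > STATIONARY_TOLERANCE_PX
    moved2 = travel2 > STATIONARY_TOLERANCE_PX
    if moved1 and moved2:
        return "pinch"
    if moved1 != moved2:  # exactly one finger moved
        return "double_touch"
    return "none"
```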
  • The above-described method may be implemented as processor-readable code on a medium on which a program is recorded.
  • Examples of the processor-readable medium include ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices; the processor-readable medium may also be implemented in the form of transmission over the Internet.
  • The double touch gesture may be used to perform services such as newly writing, sharing, and deleting various office documents.
  • The double touch gesture may also be used to perform a service of modifying written content.
  • Although embodiments of the present invention have been described with respect to a mobile terminal, the present invention can also be applied to a stationary terminal.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
EP13004583.4A 2013-01-04 2013-09-19 Verfahren zur Steuerung eines Endgerätes mit doppelter Berührungsgeste und Endgerät dafür Withdrawn EP2752751A3 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130001274A KR102064965B1 (ko) 2013-01-04 2013-01-04 더블 터치 제스처를 이용한 단말기 제어 방법 및 그 단말기

Publications (2)

Publication Number Publication Date
EP2752751A2 true EP2752751A2 (de) 2014-07-09
EP2752751A3 EP2752751A3 (de) 2017-05-31

Family

ID=49263080

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13004583.4A Withdrawn EP2752751A3 (de) 2013-01-04 2013-09-19 Verfahren zur Steuerung eines Endgerätes mit doppelter Berührungsgeste und Endgerät dafür

Country Status (4)

Country Link
US (1) US9710066B2 (de)
EP (1) EP2752751A3 (de)
KR (1) KR102064965B1 (de)
CN (1) CN103914246A (de)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105100428A (zh) * 2015-06-05 2015-11-25 努比亚技术有限公司 一种联系人显示方法和系统
WO2021057699A1 (zh) * 2019-09-29 2021-04-01 华为技术有限公司 具有柔性屏幕的电子设备的控制方法及电子设备

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5063150B2 (ja) * 2007-03-16 2012-10-31 株式会社ソニー・コンピュータエンタテインメント データ処理プログラム、データ処理装置、及びデータ処理方法
US9542091B2 (en) 2010-06-04 2017-01-10 Apple Inc. Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator
KR102213490B1 (ko) * 2013-09-03 2021-02-08 삼성전자주식회사 스크린을 제어하는 전자 장치 및 방법
CN109361810B (zh) * 2013-12-20 2021-03-23 华为终端有限公司 通知栏消息管理的方法及装置
US10120528B2 (en) * 2013-12-24 2018-11-06 Dropbox, Inc. Systems and methods for forming share bars including collections of content items
US9811245B2 (en) 2013-12-24 2017-11-07 Dropbox, Inc. Systems and methods for displaying an image capturing mode and a content viewing mode
US11435895B2 (en) 2013-12-28 2022-09-06 Trading Technologies International, Inc. Methods and apparatus to enable a trading device to accept a user input
US20150309601A1 (en) * 2014-04-28 2015-10-29 Shimane Prefectural Government Touch input system and input control method
US9898162B2 (en) 2014-05-30 2018-02-20 Apple Inc. Swiping functions for messaging applications
US9971500B2 (en) 2014-06-01 2018-05-15 Apple Inc. Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application
KR101731316B1 (ko) 2014-08-29 2017-05-02 엔에이치엔엔터테인먼트 주식회사 파일 일괄 처리 방법
USD771690S1 (en) * 2014-09-18 2016-11-15 Cheetah Mobile Inc. Display screen or portion thereof with animated graphical user interface
USD771094S1 (en) * 2014-09-18 2016-11-08 Cheetah Mobile Inc. Display screen or portion thereof with graphical user interface
USD805102S1 (en) 2015-01-27 2017-12-12 Twitter, Inc. Media sharing device with graphical user interface
CN112152908A (zh) 2015-02-16 2020-12-29 钉钉控股(开曼)有限公司 通讯方法
CN106034068A (zh) 2015-03-20 2016-10-19 阿里巴巴集团控股有限公司 群聊中进行私聊的方法、装置、客户端、服务器及系统
KR20170004702A (ko) 2015-07-03 2017-01-11 엘지전자 주식회사 디스플레이 장치 및 제어 방법
US10503361B2 (en) 2015-09-30 2019-12-10 Samsung Electronics Company, Ltd. Interactive graphical object
CN105610695B (zh) 2015-12-21 2021-01-12 阿里巴巴集团控股有限公司 对象分配方法及装置
CN105681056B (zh) 2016-01-13 2019-03-19 阿里巴巴集团控股有限公司 对象分配方法及装置
CN105812237B (zh) 2016-03-07 2020-12-04 钉钉控股(开曼)有限公司 快速添加提醒对象的方法及装置
CN105812865A (zh) * 2016-03-14 2016-07-27 联想(北京)有限公司 一种信息处理方法及电子设备
KR102586424B1 (ko) * 2016-04-18 2023-10-11 삼성전자주식회사 이벤트 알림 처리 방법 및 이를 지원하는 전자 장치
CN107306286B (zh) 2016-04-21 2020-12-04 钉钉控股(开曼)有限公司 离线考勤的处理方法及装置
CN107305459A (zh) * 2016-04-25 2017-10-31 阿里巴巴集团控股有限公司 语音和多媒体消息的发送方法及装置
CN107368995A (zh) 2016-05-13 2017-11-21 阿里巴巴集团控股有限公司 任务处理方法及装置
US11182853B2 (en) 2016-06-27 2021-11-23 Trading Technologies International, Inc. User action for continued participation in markets
CN107846345A (zh) 2016-09-18 2018-03-27 阿里巴巴集团控股有限公司 通讯方法及装置
WO2018083627A1 (en) 2016-11-02 2018-05-11 Onshape Inc. Second touch zoom control
CN106775294B (zh) * 2016-11-24 2019-04-12 维沃移动通信有限公司 一种图像删除方法及移动终端
US11054988B2 (en) * 2017-06-30 2021-07-06 Huawei Technologies Co., Ltd. Graphical user interface display method and electronic device
CN107728918A (zh) * 2017-09-27 2018-02-23 北京三快在线科技有限公司 浏览连续页面的方法、装置及电子设备
KR102441509B1 (ko) * 2017-12-15 2022-09-07 현대자동차주식회사 단말기, 차량 및 단말기 제어방법
CN109831579B (zh) * 2019-01-24 2021-01-08 维沃移动通信有限公司 一种内容删除方法、终端及计算机可读存储介质
CN110032326A (zh) * 2019-03-29 2019-07-19 网易(杭州)网络有限公司 移动终端显示画面的控制方法、装置、设备和存储介质
KR102246435B1 (ko) 2019-06-19 2021-04-30 구자범 터치 패턴 입력을 이용한 보기 전환 장치 및 그 방법
CN111356217B (zh) * 2020-02-17 2023-07-21 Oppo广东移动通信有限公司 终端控制方法、装置、终端设备以及存储介质
CN112162674B (zh) * 2020-10-20 2022-04-08 珠海格力电器股份有限公司 控制中心的展示方法和装置、存储介质、电子装置

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120081267A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Desktop reveal expansion

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US7925996B2 (en) * 2004-11-18 2011-04-12 Microsoft Corporation Method and system for providing multiple input connecting user interface
US8014760B2 (en) * 2006-09-06 2011-09-06 Apple Inc. Missed telephone call management for a portable multifunction device
US8169414B2 (en) * 2008-07-12 2012-05-01 Lim Seung E Control of electronic games via finger angle using a high dimensional touchpad (HDTP) touch user interface
CN101458591A (zh) 2008-12-09 2009-06-17 三星电子(中国)研发中心 一种具有多点触摸屏硬件结构的手机输入系统
US8224392B2 (en) 2009-04-29 2012-07-17 Lg Electronics Inc. Mobile terminal capable of recognizing fingernail touch and method of controlling the operation thereof
JP2011150478A (ja) * 2010-01-20 2011-08-04 Fujitsu Toshiba Mobile Communications Ltd 携帯端末
EP2393000B1 (de) 2010-06-04 2019-08-07 Lg Electronics Inc. Mobiles Endgerät zur Bereitstellung eines Multiplayer-Spiels und Verfahren zur Steuerung des Betriebs des mobilen Endgeräts
KR101781852B1 (ko) * 2011-01-04 2017-09-26 엘지전자 주식회사 이동단말기 및 그 제어방법
JP2012155408A (ja) * 2011-01-24 2012-08-16 Kyocera Corp 携帯型電子機器


Also Published As

Publication number Publication date
KR20140089245A (ko) 2014-07-14
US20140191986A1 (en) 2014-07-10
US9710066B2 (en) 2017-07-18
KR102064965B1 (ko) 2020-01-10
EP2752751A3 (de) 2017-05-31
CN103914246A (zh) 2014-07-09

Similar Documents

Publication Publication Date Title
US9710066B2 (en) Method for controlling a terminal using a double touch gesture and terminal thereof
KR102129795B1 (ko) 이동 단말기 및 그 제어방법
US9973612B2 (en) Mobile terminal with touch screen and method of processing data using the same
EP2302497A2 (de) Mobiles Endgerät und Anzeigesteuerungsverfahren dafür
KR20130010799A (ko) 휴대 전자기기 및 이의 제어방법
KR20130010364A (ko) 휴대 전자기기 및 이의 제어방법
KR20110016337A (ko) 이동 통신 단말기에서의 데이터 표시방법 및 이를 적용한 이동 통신 단말기
KR20120105695A (ko) 이동 단말기 및 그 제어방법
KR101609388B1 (ko) 3차원 메뉴를 표시하는 이동 단말기 및 이동 단말기의 제어방법
KR20110129750A (ko) 이동 단말기 및 그 제어 방법
KR102074347B1 (ko) 이동 단말기 및 그것의 제어방법
KR20100105005A (ko) 듀얼 디스플레이부를 구비한 이동 통신 단말기에서의 아이템 제어 방법 및 이를 적용한 이동 통신 단말기
KR101294306B1 (ko) 휴대 전자기기 및 이의 제어방법
KR20140113155A (ko) 휴대 전자기기 및 이의 제어방법
KR20120035772A (ko) 이동단말기 및 그의 멀티 홈 화면 상 아이콘 제어 방법
KR101778969B1 (ko) 휴대 전자기기 및 이의 스크롤 방법
EP3056061B1 (de) Mobiles endgerät
EP2975510A1 (de) Endgerät und Betriebsverfahren dafür
KR20140014681A (ko) 대화 스레드내에서 대화 정보 표시 기능을 가지는 단말기 및 그 제어 방법
KR20100117417A (ko) 이동 통신 단말기에서의 애플리케이션 실행 방법 및 이를 적용한 이동 통신 단말기
KR20140118061A (ko) 휴대 단말기 및 그 제어 방법
KR102018552B1 (ko) 이동 단말기 및 이의 제어방법
KR102046462B1 (ko) 이동 단말기 및 이동 단말기의 제어 방법
KR20130083201A (ko) 이동 단말기 및 그 제어방법, 이를 위한 기록매체
KR20130090467A (ko) 이동 단말기 및 그 제어방법, 이를 위한 기록매체

Legal Events

Date Code Title Description
17P Request for examination filed

Effective date: 20130919

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0488 20130101AFI20170427BHEP

Ipc: G06F 3/0483 20130101ALI20170427BHEP

Ipc: G06F 3/0481 20130101ALI20170427BHEP

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190225

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20201104