KR20090034215A - Mobile terminal and it's character input method - Google Patents


Info

Publication number
KR20090034215A
KR20090034215A
Authority
KR
South Korea
Prior art keywords
key
input
character
mobile terminal
module
Prior art date
Application number
KR1020070099479A
Other languages
Korean (ko)
Inventor
주완호
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020070099479A
Publication of KR20090034215A


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/0233 — Character input methods
    • G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0486 — Drag-and-drop
    • G06F3/04883 — Interaction techniques based on GUIs using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
    • G06F3/04886 — Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • G06F3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • H04M1/23 — Construction or mounting of dials or of equivalent devices; means for facilitating the use thereof

Abstract

A mobile terminal and a character input method using the same are provided so that, when characters are input by touch, a character can be conveniently entered with a single key input regardless of the number of characters assigned to one key. A touched key on the keypad displayed on the touch screen is detected (S102). While the key is touched, it is detected whether a drag is performed (S103). When the touch on the key is released, the character or function corresponding to the drag direction is input (S107, S108).

Description

MOBILE TERMINAL AND ITS CHARACTER INPUT METHOD

The present invention relates to a character input method using a portable terminal, and more particularly, to a method of inputting characters through a touch screen of the portable terminal.

A portable terminal is a device that can be carried around and has one or more functions such as voice and video calling, information input and output, and data storage.

As such functions become diversified, the mobile terminal is implemented in the form of a comprehensive multimedia player with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.

To implement such complex functions, various new attempts are being made in terms of hardware or software for these multimedia devices. For example, a user interface environment is provided that allows a user to search for or select functions easily and conveniently.

In addition, as the portable terminal is regarded as a personal item expressing the user's individuality, various designs are required. Such designs include folder, slide, bar, and rotation types.

An object of the present invention is to provide a method for simply inputting characters in a portable terminal having a touch screen, and a portable terminal using the same.

Another object of the present invention is to provide a method for reducing the number of key inputs when characters are input by touch in a portable terminal, and a portable terminal using the same.

A further object of the present invention is to provide a method for reducing the key input time when characters are input in a portable terminal having a keypad in which a plurality of characters are assigned to one key, and a portable terminal using the same.

A portable terminal according to an embodiment of the present invention for achieving the above objects includes an output unit for displaying a touch-type keypad and characters input through the keypad, and a controller for detecting the touch time and drag direction of a key touched on the keypad displayed on the output unit and inputting the character or function corresponding to that time and drag direction.

In addition, a character input method according to the present invention for achieving the above objects includes detecting an arbitrary key touched on a keypad displayed on a touch screen, detecting whether the key is dragged while being touched, and, when the touch on the key is released, inputting the character or function corresponding to the drag direction.
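The claimed mapping can be pictured as a small per-key table indexed by the detected drag direction. The key names, direction labels, and character assignments below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the claimed lookup: each key carries a table of
# characters (or functions), and the drag direction detected at release
# selects one entry. All layouts and names here are assumed for illustration.

KEY_TABLE = {
    "key_abc": {"none": "a", "left": "b", "right": "c", "up": "A"},
    "key_def": {"none": "d", "left": "e", "right": "f"},
}

def resolve(key, direction):
    """Return the character/function selected by the drag direction."""
    return KEY_TABLE.get(key, {}).get(direction)

print(resolve("key_abc", "right"))  # c
```

A touch released without any drag would use the "none" entry, so the same single-touch gesture covers every character on the key.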

According to the present invention, when characters are input by touch in a mobile terminal having a touch screen, a character (or function) can be easily input with a single key input regardless of the number of characters assigned to one key.

The present invention has the effect of reducing the number of key inputs when characters are input by touch in a mobile terminal.

The present invention also has the effect of reducing the key input time when characters are input by touch in a mobile terminal.

Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings. However, detailed descriptions of well-known technologies and configurations that would obscure the gist of the present invention will be omitted. In addition, in describing the present invention with reference to the drawings, components performing the same function are denoted by the same reference numerals.

The portable terminal described herein includes a mobile phone, a smart phone, a notebook computer, a navigation device, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and the like.

FIG. 1 is a block diagram showing the configuration of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 illustrated in FIG. 1 may include a wireless communication unit 110, an A/V input unit 120, an operation unit 130, a sensing unit 140, an output unit 150, a storage unit 160, an interface unit 170, a controller 180, and a power supply unit 190. It should be noted that, in an actual implementation, the above components may be combined into one or more components, or one component may be subdivided into two or more components as necessary.

Hereinafter, the components will be described in turn.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a GPS module 115, and the like.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel.

The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.

Meanwhile, the broadcast related information may be provided through a mobile communication network, and in this case, may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 receives broadcast signals using various broadcast systems. In particular, digital broadcast signals can be received using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the digital broadcast systems described above but also for any broadcast system providing broadcast signals.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the storage 160.

In addition, the mobile communication module 112 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call call signal, or a text / multimedia message.

The wireless internet module 113 refers to a module for wireless internet access, and the wireless internet module 113 may be internal or external.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

In addition, the Global Positioning System (GPS) module 115 receives position information from a plurality of satellites.

Meanwhile, the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera module 121 and a microphone module 122. The camera module 121 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frame may be displayed on the display module 151.

The image frame processed by the camera module 121 may be stored in the storage 160 or transmitted to the outside through the wireless communication unit 110. Two or more camera modules 121 may be provided according to the configuration of the terminal.

The microphone module 122 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data. The processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output in the call mode. The microphone module 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.

The operation unit 130 generates key input data input by the user for controlling the operation of the terminal. The operation unit 130 may be configured as a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like. In particular, when the touch pad forms a mutual layer structure with the display module 151 described later, it may be referred to as a touch screen.

The sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its position, and the presence or absence of user contact, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the portable terminal 100 is a slide phone, it may sense whether the slide phone is opened or closed. The sensing unit 140 is also responsible for sensing functions such as whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device.

The interface unit 170 serves as an interface with all external devices connected to the mobile terminal 100, for example, wired/wireless headsets, external chargers, wired/wireless data ports, card sockets (e.g., memory card, SIM/UIM/UICC card), audio input/output (I/O) terminals, video I/O terminals, earphones, and the like. The interface unit 170 receives data or power from an external device and delivers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display module 151, a sound output module 152, an alarm output module 153, and the like.

The display module 151 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, a user interface (UI) or a graphic user interface (GUI) related to a call is displayed. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays a photographed and / or received image, a UI, or a GUI.

Meanwhile, as described above, when the display module 151 and the touch pad form a mutual layer structure to constitute a touch screen, the display module 151 may be used as an input device in addition to an output device.

The display module 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.

In addition, two or more display modules 151 may exist according to the implementation form of the mobile terminal 100. For example, the external display module (not shown) and the internal display module (not shown) may be simultaneously provided in the portable terminal 100.

The sound output module 152 outputs audio data received from the wireless communication unit 110 or stored in the storage unit 160 during call signal reception, in a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like.

In addition, the sound output module 152 outputs a sound signal related to a function (for example, a call signal reception sound, a message reception sound, etc.) performed by the portable terminal 100. The sound output module 152 may include a speaker, a buzzer, and the like.

The alarm output module 153 outputs a signal for notifying occurrence of an event of the portable terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, and key signal input. The alarm output module 153 outputs a signal for notifying occurrence of an event in a form other than an audio signal or a video signal.

For example, the signal may be output in the form of vibration. When a call signal is received or a message is received, the alarm output module 153 may output a vibration to notify this. Alternatively, when a key signal is input, the alarm output module 153 may output a vibration in response to the key signal input. Through the vibration output as described above, the user can recognize the occurrence of the event. Of course, the signal for notification of event occurrence may be output through the display module 151 or the sound output module 152.

The storage unit 160 may store programs for the processing and control of the controller 180, and may also perform a function of temporarily storing input/output data (for example, a phone book, messages, still images, and videos).

The storage unit 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM, and ROM. In addition, the mobile terminal 100 may operate a web storage that performs the storage function of the storage unit 160 on the Internet.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs related control and processing for voice calls, data communication, video calls, and the like. In addition, the controller 180 may include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be configured as hardware within the controller 180 or as software separate from the controller 180.

The power supply unit 190 receives external or internal power under the control of the controller 180 and supplies the power required for the operation of each component.

In the above, the portable terminal related to the present invention has been described in terms of components according to functions. Hereinafter, referring to FIG. 2 and FIG. 3, the portable terminal related to the present invention will be further described in terms of components according to appearance.

In the following description, for clarity of explanation, a slider-type portable terminal is described as an example among various types of portable terminals such as folder, bar, swing, and slider types. However, the present invention is not limited to the slider-type portable terminal and can be applied to all types of portable terminals including the aforementioned types.

FIG. 2 is a front perspective view of an example of a portable terminal related to the present invention. The slide-type portable terminal of the present invention includes a first body 100A and a second body 100B configured to be slidable in at least one direction on the first body 100A.

A state in which the first body 100A is disposed to overlap the second body 100B may be referred to as a closed configuration, and, as shown in the drawing, a state in which the first body 100A exposes at least a portion of the second body 100B may be referred to as an open configuration. The portable terminal mainly operates in a standby mode in the closed state, and the standby mode may be released by a user's operation. The portable terminal mainly operates in a call mode or the like in the open state, and may be switched to the standby mode by a user's operation or after a lapse of a predetermined time.

The case (casing, housing, cover, etc.) forming the exterior of the first body 100A is formed by the first front case 100A-1 and the first rear case 100A-2. Various electronic components are embedded in the space formed by the first front case 100A-1 and the first rear case 100A-2. At least one intermediate case may be further disposed between the first front case 100A-1 and the first rear case 100A-2.

The cases may be formed by injection-molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

In the first body 100A, specifically in the first front case 100A-1, the display module 151, the first sound output module 152-1, the first camera module 121-1, or the first manipulation unit 130-1 may be disposed.

The display module 151 includes a liquid crystal display (LCD), organic light emitting diodes (OLED), and the like, which visually express information.

In addition, since the touch pad is superimposed on the display module 151 in a layer structure, the display module 151 may operate as a touch screen to enable input of information by a user's touch.

The first sound output module 152-1 may be implemented in the form of a receiver or a speaker.

The first camera module 121-1 may be implemented to be suitable for capturing an image or a video of a user.

The first manipulation unit 130-1 receives a command for recording or capturing a call image.

Like the first body 100A, the case forming the external appearance of the second body 100B is formed by the second front case 100B-1 and the second rear case 100B-2.

The second manipulation unit 130-2 may be disposed on the front surface of the second body 100B, specifically, the second front case 100B-1.

The third manipulation unit 130-3, the microphone module 122, and the interface unit 170 may be disposed on at least one of the second front case 100B-1 or the second rear case 100B-2.

The first to third manipulation units 130-1, 130-2, and 130-3 may be collectively referred to as a manipulating portion 130, and any manner that the user can operate while receiving a tactile feeling may be employed. For example, the manipulating portion may be implemented as dome switches or touch pads that receive commands or information by a user's push or touch operation, or as a wheel or jog type that rotates a key, or a joystick type.

In functional terms, the first manipulation unit 130-1 is for inputting commands such as start, end, and scroll, and the second manipulation unit 130-2 is for inputting numbers, characters, symbols, and the like.

In addition, the third operation unit 130-3 may operate as a hot-key for activating a special function in the mobile terminal.

The microphone module 122 may be implemented in a form suitable for receiving a user's voice, other sounds, and the like.

The interface unit 170 serves as a passage through which the portable terminal according to the present invention can exchange data with an external device. For example, the interface unit 170 may be at least one of a connection terminal for connecting to an earphone by wire or wirelessly, a port for short-range communication (for example, an IrDA port, a Bluetooth port, or a wireless LAN port), and a power supply terminal for supplying power to the terminal.

Since the interface unit 170 has already been described above, a detailed description thereof will be omitted.

A power supply unit 190 for supplying power to the portable terminal is mounted on the second rear case 100B-2 side. The power supply unit 190 may be, for example, a rechargeable battery, and may be detachably coupled for charging.

FIG. 3 is a rear perspective view of the portable terminal shown in FIG. 2.

Referring to FIG. 3, a second camera module 121-2 may be additionally mounted on the rear surface of the second rear case 100B-2 of the second body 100B. The second camera module 121-2 may have a photographing direction substantially opposite to that of the first camera module 121-1 (see FIG. 2), and may have a different number of pixels from the first camera module.

For example, it is preferable that the first camera module 121-1 has a low pixel count, sufficient to photograph the user's face and transmit it to a counterpart during a video call, and that the second camera module 121-2 has a high pixel count, since it usually photographs a general subject that is not transmitted immediately.

A flash 121-3 and a mirror 121-4 may be further disposed adjacent to the second camera module 121-2. The flash 121-3 shines light toward the subject when the subject is photographed by the second camera module 121-2. The mirror 121-4 allows the user to see his or her face when taking a self-portrait using the second camera module 121-2.

The second sound output module 152-2 may be further disposed on the second rear case 100B-2.

The second sound output module 152-2 may implement a stereo function together with the first sound output module 152-1 (see FIG. 2), and may be used for a call in a speaker phone mode.

In addition, an antenna 111-1 for receiving a broadcast signal may be disposed on one side of the second rear case 100B-2 in addition to an antenna for a call. The antenna 111-1 may be installed to be pulled out of the second body 100B.

A portion of the slide module 100C for slidably coupling the first body 100A and the second body 100B is disposed on the first rear case 100A-2 side of the first body 100A.

The other part of the slide module 100C may be disposed on the side of the second front case 100B-1 of the second body 100B and may not be exposed to the outside as shown in the drawing.

In the above description, the second camera module 121-2 has been described as being disposed on the second body 100B, but the present invention is not necessarily limited thereto.

For example, at least one of the elements 111-1, 121-2 to 121-3, and 152-2 described as being disposed on the second rear case 100B-2 may instead be mounted on the first body 100A, mainly on the first rear case 100A-2. In that case, there is an advantage that the element(s) disposed on the first rear case 100A-2 are protected by the second body 100B in the closed state. Furthermore, even if the second camera module 121-2 is not separately provided, the first camera module 121-1 may be formed to be rotatable so that photographing is possible even in the photographing direction of the second camera module 121-2.

FIG. 4 is an exemplary view illustrating a character input screen displayed on a touch screen of a mobile terminal according to the present invention. The touch screen 151 is divided into a plurality of display areas; a keypad is displayed in one display area 310, and characters input through the keypad are displayed in another display area 320.

A plurality of characters are assigned to each key constituting the keypad.

In the present embodiment, the characters assigned to the keypad are English letters, but the invention is not necessarily limited to English; various special characters, codes, and the alphabets of specific countries may also be assigned to each key.

In applying the present invention, the letters assigned to each key of the keypad may be assigned in the same order as in the prior art. Therefore, a user who is already familiar with the conventional character arrangement does not need to learn a new arrangement.

In the conventional character input method, it is cumbersome to press a key as many times as the position of the desired character among the characters assigned to that key. In the present invention, however, the user can enter any character with a single key operation, regardless of the number or order of the characters assigned to each key.

Hereinafter, a specific character input method of the present invention will be described with reference to FIGS. 5 to 7.

FIG. 5 is a flowchart illustrating a text input method of a mobile terminal according to the present invention, describing how a character (or function) is input through the keypad displayed on the touch screen in the text input mode.

The controller 180 of the portable terminal displays a keypad on the touch screen of the output unit 150 in the text input mode (S101). Each key constituting the keypad is displayed with an indicator showing how its characters are input, namely the drag direction to follow after the key is touched.

After displaying the keypad, the controller 180 detects whether an arbitrary key is touched (S102) and determines whether the key is dragged after the touch or released immediately (S103, S104). If the touch is released immediately, the character (or function) for which no drag direction is set, or whose drag direction is set to 'in place', is input (S105).

However, if the key is dragged after being touched (S103), the controller 180 detects the drag direction (S106) and determines whether the touch is released (S107). That is, when the user touches an arbitrary key, drags it in a specific direction, and releases the touch, the controller 180 inputs the character (or function) set for that drag direction of the touched key (S108).

In the text input mode, a text input process by touch and drag is repeatedly performed.
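The decision flow of steps S102 to S108 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function name, the key map, and the direction labels are hypothetical.

```python
# Minimal sketch of the FIG. 5 decision flow: a key touch either ends
# in place (S104-S105) or with a drag in some direction (S106-S108).

def resolve_character(key_map, touched_key, drag_direction):
    """Return the character (or function) selected by one key touch.

    key_map maps each key to a dict of direction -> character. The
    "in_place" entry is used when the key is released without a drag;
    otherwise the detected drag direction selects the character.
    """
    directions = key_map[touched_key]
    if drag_direction is None:               # released immediately (S105)
        return directions.get("in_place")
    return directions.get(drag_direction)    # released after a drag (S108)


# Hypothetical assignment for the '4' key (letters g/h/i):
key_map = {"4": {"left": "g", "in_place": "h", "right": "i"}}

print(resolve_character(key_map, "4", None))    # in-place release
print(resolve_character(key_map, "4", "left"))  # drag left, then release
```

This assignment is consistent with the 'happy' example below: releasing the '4' key in place yields 'h' with a single touch, regardless of how many characters share the key.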

FIG. 6 is a diagram illustrating a keypad for explaining the text input method of a mobile terminal according to the present invention. The keypad may display an icon or indicator 410 showing the direction to drag for each character. The indicator 410 may also be displayed in another area of the touch screen when a key of the keypad is touched.

According to the present invention, when the user touches an arbitrary key and then drags it, the character corresponding to the drag direction is input from among the characters assigned to that key.

For example, suppose the user enters 'happy'.

First, the '4' key, to which 'h' is assigned, is touched and released, and 'h' is input. Next, the '2' key is touched, dragged to the left (←), and released, and 'a' is input. Next, the '7' key, to which 'p' is assigned, is touched, dragged to the left (←), and released, and 'p' is input; 'p' is input once more in the same manner. Finally, the '9' key, to which 'y' is assigned, is touched, dragged to the right (→), and released, and 'y' is input.

To input the number assigned to a key, the key may be touched and held for a specific time before being released. That is, when a key is released without dragging, a different character may be input depending on how long the key was touched.
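One way to realize this time-based selection is sketched below. The threshold value and all names are assumptions made for illustration; the patent does not specify them.

```python
# Sketch of in-place release handling: a quick tap inputs the letter,
# while holding past a threshold inputs the digit on the same key.

LONG_PRESS_SECONDS = 0.8  # assumed threshold; the patent leaves it open

def resolve_in_place(letter, digit, touch_duration):
    """Select between the letter and the digit assigned to a key,
    based on how long the key was touched before release."""
    if touch_duration >= LONG_PRESS_SECONDS:
        return digit
    return letter

print(resolve_in_place("h", "4", 0.2))  # quick tap
print(resolve_in_place("h", "4", 1.0))  # long press
```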

As described above, in the character input method according to the present invention, the number of key inputs equals the number of characters to be input. Compared with the related art, the number of key inputs is therefore reduced, and the character input time decreases accordingly.

Meanwhile, the drag directions for character input are not limited to four; as shown in FIG. 7, eight or more directions may be set depending on the configuration.

In addition, drags combining a plurality of the above directions may be used, and not only characters but also various function keys (for example, an enter key, a delete key, a copy key, a paste key, and a mode switch key) may be assigned.
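Classifying a drag into one of eight directions, as FIG. 7 suggests, can be sketched as below. The 45-degree sectors, the dead-zone radius, and the names are my assumptions, not details taken from the patent.

```python
import math

# Sketch: map a touch-release drag vector to one of eight directions,
# or "in_place" if the finger barely moved.

DIRECTIONS = ["right", "up_right", "up", "up_left",
              "left", "down_left", "down", "down_right"]

def classify_drag(dx, dy, dead_zone=10):
    """Map a drag vector (pixels, y pointing up) to a direction label.

    Drags shorter than dead_zone count as an in-place release; each
    direction owns the 45-degree sector centred on its axis.
    """
    if math.hypot(dx, dy) < dead_zone:
        return "in_place"
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]

print(classify_drag(2, 1))     # tiny movement
print(classify_drag(-50, 0))   # drag left
print(classify_drag(30, 30))   # diagonal drag
```

With a four-direction keypad, the same routine works with 90-degree sectors; the sector width is simply the full circle divided by the number of configured directions.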

To input a special character, capital letter, number, or symbol not set on the keypad, the character input mode may be changed by pressing a specific key. When the character input mode is changed, the keypad may display the characters or symbols newly assigned to each key. Alternatively, the characters or symbols may be displayed in a table in place of the keypad so that the user can select them.

As described above, the present invention has the effect of allowing a character (or function) to be input with a single key input, regardless of the number of characters set on one key.

In the above, preferred embodiments of the present invention have been described with reference to the accompanying drawings. The terms and words used in the present specification and claims should not be construed as limited to their common or dictionary meanings, but should be interpreted with meanings and concepts that correspond to the technical spirit of the present invention.

Therefore, the embodiments described in the specification and shown in the drawings are merely the most preferred embodiments of the present invention and do not represent all of its technical spirit. It should be understood that various equivalents and variations capable of replacing them may exist at the time of filing of the present application.

FIG. 1 is a block diagram showing the configuration of a mobile terminal according to an embodiment of the present invention;

FIG. 2 is a front perspective view of an example of a slide-type portable terminal according to the present invention;

FIG. 3 is a side cross-sectional view showing an example of a configuration for detecting whether the slide-type portable terminal related to the present invention is slid open;

FIG. 4 is an exemplary view showing a character input screen displayed on the touch screen of a mobile terminal according to the present invention;

FIG. 5 is a flowchart for explaining a character input method of a mobile terminal according to the present invention;

FIG. 6 is an exemplary view showing a keypad for explaining the character input method of a mobile terminal according to the present invention;

FIG. 7 is an exemplary view for explaining the drag directions of a key for inputting characters in a mobile terminal according to the present invention.

Claims (6)

  1. A character input method using a mobile terminal, comprising: detecting a touch on an arbitrary key of a keypad displayed on a touch screen;
    detecting whether the key is dragged while being touched; and
    inputting a character or a function corresponding to the dragged direction when the touch on the key is released.
  2. The character input method using a mobile terminal of claim 1, wherein a character or a function to be input is assigned to each of a plurality of drag directions, including up, down, left, right, and in place.
  3. The character input method using a mobile terminal of claim 2, wherein the 'in place' drag direction corresponds to the key being released immediately after being touched.
  4. The character input method using a mobile terminal of claim 2, wherein, when the drag direction is 'in place', a different character is input according to the time for which the key is touched.
  5. A mobile terminal comprising: an output unit for displaying a keypad operated by a touch method and the characters input through the keypad; and
    a controller for detecting the touch time and drag direction of a key touched on the keypad displayed on the output unit, and inputting a character or a function corresponding to that time and drag direction.
  6. The mobile terminal of claim 5, wherein the output unit displays an indicator indicating the drag direction for inputting each character or function assigned to each key, under the control of the controller.
KR1020070099479A 2007-10-02 2007-10-02 Mobile terminal and it's character input method KR20090034215A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020070099479A KR20090034215A (en) 2007-10-02 2007-10-02 Mobile terminal and it's character input method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020070099479A KR20090034215A (en) 2007-10-02 2007-10-02 Mobile terminal and it's character input method

Publications (1)

Publication Number Publication Date
KR20090034215A true KR20090034215A (en) 2009-04-07

Family

ID=40760230

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020070099479A KR20090034215A (en) 2007-10-02 2007-10-02 Mobile terminal and it's character input method

Country Status (1)

Country Link
KR (1) KR20090034215A (en)


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010147394A3 (en) * 2009-06-17 2011-03-31 Kim Hoyon Chinese language and chinese character input system and method
WO2010147394A2 (en) * 2009-06-17 2010-12-23 Kim Hoyon Chinese language and chinese character input system and method
US9367534B2 (en) 2009-09-30 2016-06-14 Lg Electronics Inc. Mobile terminal and method for controlling the same
KR20110035376A (en) * 2009-09-30 2011-04-06 엘지전자 주식회사 Mobile terminal and method of controlling the same
KR20110074822A (en) * 2009-12-26 2011-07-04 김기주 Recognition method of multi-touch of touch button on touch screen, text input method on touch screen and modifying method of object
KR20190006470A (en) * 2009-12-26 2019-01-18 김기주 Recognition method of multi-touch of touch button on touch screen, text input method on touch screen and modifying method of object
WO2011142606A2 (en) * 2010-05-13 2011-11-17 (주)아이티버스 Character input method and character input apparatus using a touch screen, and recording medium and character input apparatus using a touch screen and in which a program for implementing the character input method is recorded
KR101253645B1 (en) * 2010-05-13 2013-04-11 (주)아이티버스 Method and apparatus for inputting characters using touch screen and recording medium storing a program to execute thereof
WO2011142606A3 (en) * 2010-05-13 2012-03-01 (주)아이티버스 Character input method and character input apparatus using a touch screen, and recording medium and character input apparatus using a touch screen and in which a program for implementing the character input method is recorded
KR20110128536A (en) * 2010-05-24 2011-11-30 엘지전자 주식회사 Mobile terminal and operation control method thereof
WO2011156282A3 (en) * 2010-06-07 2012-04-12 Google Inc. Selecting alternate keyboard characters via motion input
US8612878B2 (en) 2010-06-07 2013-12-17 Google Inc. Selecting alternate keyboard characters via motion input
WO2011156282A2 (en) * 2010-06-07 2011-12-15 Google Inc. Selecting alternate keyboard characters via motion input
KR101139131B1 (en) * 2010-12-15 2012-04-30 은익수 Korean character input method for touch screen
KR20190137222A (en) * 2018-06-01 2019-12-11 조선대학교산학협력단 Touch keypad with indivisual switch function and method for inputting letters using it


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
J201 Request for trial against refusal decision
AMND Amendment
B601 Maintenance of original decision after re-examination before a trial
J301 Trial decision

Free format text: TRIAL DECISION FOR APPEAL AGAINST DECISION TO DECLINE REFUSAL REQUESTED 20140526

Effective date: 20150123