KR101629315B1 - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same

Info

Publication number
KR101629315B1
Authority
KR
South Korea
Prior art keywords
text input
movement
rotation
display unit
terminal
Prior art date
Application number
KR1020100002125A
Other languages
Korean (ko)
Other versions
KR20110082238A (en)
Inventor
손성기
김일국
이영훈
장진아
최현보
김재연
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020100002125A priority Critical patent/KR101629315B1/en
Publication of KR20110082238A publication Critical patent/KR20110082238A/en
Application granted granted Critical
Publication of KR101629315B1 publication Critical patent/KR101629315B1/en

Landscapes

  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a mobile terminal and a control method thereof that allow the terminal to be used with greater consideration for user convenience. According to at least one embodiment of the present invention, even if the size and/or number of the key buttons of the user input unit provided for text input in the mobile terminal is limited, the terminal user can easily input text using the key buttons of limited size and/or number while grasping the terminal main body.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a mobile terminal and a control method thereof that allow the terminal to be used with greater consideration for user convenience.

Terminals can be divided into mobile/portable terminals and stationary terminals depending on whether they are movable. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

As such terminals have become multifunctional, they are implemented in the form of multimedia players with composite functions such as capturing still images or moving pictures, playing music or video files, gaming, and receiving broadcasts.

In order to support and enhance the functionality of such terminals, improvements to the structural and/or software parts of the terminal may be considered.

In the case of a mobile terminal, the size and/or number of the key buttons of the user input unit provided for text input is limited because of the limited size of the terminal. It is therefore not easy for the terminal user to input text in the mobile terminal using key buttons of such limited size and/or number. In addition, when the terminal user grasps the terminal main body with one hand and operates the key buttons with the other hand, text input becomes even more difficult.

SUMMARY OF THE INVENTION

The present invention has been proposed to solve the above-mentioned problems, and an object of the present invention is to provide a mobile terminal and a control method thereof that enable text to be input easily using key buttons of limited size and/or number, even if the size and/or number of the key buttons of the user input unit provided for text input in the mobile terminal is limited.

According to an aspect of the present invention, there is provided a mobile terminal including: a display unit for displaying a text input window; a sensor for sensing movement of the terminal; and a control unit for controlling a predetermined text-input-related function to be performed in the text input window according to the movement of the terminal detected by the sensor during execution of a text input menu.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method including: displaying a text input window; sensing movement of the terminal; and performing a predetermined text-input-related function in the text input window according to the sensed movement.

Effects of the mobile terminal and the control method according to the present invention will be described as follows.

According to at least one embodiment of the present invention, even if the size and/or number of the key buttons of the user input unit provided for text input in the mobile terminal is limited, the terminal user can easily input text using the key buttons of limited size and/or number while grasping the terminal main body.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a flowchart of an embodiment of a method of controlling a mobile terminal according to the present invention.
FIGS. 4 to 11 are state diagrams of a display screen on which an embodiment of the control method of a mobile terminal according to the present invention is implemented.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably merely for ease of drafting the specification, and do not themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules for enabling wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms. For example, it may exist in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), OMA-BCAST, and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to suit other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The position information module 115 is a module for obtaining the position of the mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites and accurate time information, and then applies trigonometry to the calculated information to accurately calculate three-dimensional current position information according to latitude, longitude, and altitude. At present, a method of calculating position and time information using three satellites and then correcting errors in the calculated position and time information using one more satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position in real time.
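
Purely as an illustrative sketch (not part of this disclosure), deriving speed from continuously calculated positions can be done by dividing the great-circle distance between two successive fixes by the elapsed time; the helper names below are hypothetical.

    import kotlin.math.*

    // Great-circle distance between two lat/lon fixes (haversine), in meters.
    fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
        val r = 6_371_000.0                      // mean Earth radius in meters
        val dLat = Math.toRadians(lat2 - lat1)
        val dLon = Math.toRadians(lon2 - lon1)
        val a = sin(dLat / 2).pow(2) +
                cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
        return 2 * r * atan2(sqrt(a), sqrt(1 - a))
    }

    // Speed in m/s from two consecutive fixes taken (t2 - t1) milliseconds apart.
    fun speedMetersPerSecond(lat1: Double, lon1: Double, t1Millis: Long,
                             lat2: Double, lon2: Double, t2Millis: Long): Double {
        val dt = (t2Millis - t1Millis) / 1000.0
        return if (dt > 0) distanceMeters(lat1, lon1, lat2, lon2) / dt else 0.0
    }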

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external acoustic signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external acoustic signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, whether the user is in contact with it, the orientation of the mobile terminal, and the like, and generates a sensing signal for controlling the operation of the mobile terminal 100. The sensing unit may include at least one of a geomagnetic sensor and a gravity sensor for sensing such motion. For example, when the mobile terminal 100 is in the form of a slide phone, it can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141. This will be discussed later in connection with the touch screen.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses, and may include a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 depending on the implementation form of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display units may be disposed on one surface, spaced apart from one another or formed integrally, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller (not shown). The touch controller processes the signal (s) and transmits the corresponding data to the controller 180. Thus, the control unit 180 can know which area of the display unit 151 is touched or the like.
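
As a minimal sketch of the signal path just described, assuming an Android-style view and a hypothetical controller interface (neither is mandated by this description), raw touch coordinates could be forwarded to the controller as follows:

    import android.view.MotionEvent
    import android.view.View

    // Hypothetical interface standing in for the control unit 180.
    interface TouchController {
        fun onTouched(x: Float, y: Float, pressure: Float)
    }

    // Forward raw touch events to the controller so it can tell which
    // area of the display unit was touched and with what pressure.
    fun attachTouchSensor(view: View, controller: TouchController) {
        view.setOnTouchListener { _, event ->
            if (event.actionMasked == MotionEvent.ACTION_DOWN ||
                event.actionMasked == MotionEvent.ACTION_MOVE) {
                controller.onTouched(event.x, event.y, event.pressure)
            }
            // Returning false lets other handlers (e.g., click) still run.
            false
        }
    }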

The proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan and higher utility than a contact-type sensor.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contacting it so that the pointer is recognized as being positioned on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which a proximity touch is made with the pointer on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, as vibration. Since the video or audio signal can also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be classified as a kind of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be combined and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, a jetting or suction force of air through a jet port or a suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of coldness or warmth using a heat-absorbing or heat-emitting element.

The haptic module 154 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sensation of the finger or arm. At least two haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The projector module 155 is a component for performing an image projection function using the mobile terminal 100, and can display, on an external screen or wall, an image identical to or at least partially different from the image displayed on the display unit 151, in accordance with a control signal of the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image producing means (not shown) for producing the image to be output using the light generated by the light source, and a lens (not shown) for enlarging and outputting the image to the outside at a predetermined focal distance. Further, the projector module 155 may include a device (not shown) capable of adjusting the image projection direction by mechanically moving the lens or the entire module.

The projector module 155 can be classified into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, a DLP (Digital Light Processing) module, and the like according to the type of display means. In particular, the DLP module, which enlarges and projects an image produced by reflecting the light generated from the light source onto a DMD (Digital Micromirror Device) chip, may be advantageous for miniaturization of the projector module 155.

Preferably, the projector module 155 may be provided longitudinally on the side, front or back side of the mobile terminal 100. It goes without saying that the projector module 155 may be provided at any position of the mobile terminal 100 as occasion demands.

The memory unit 160 may store programs for the processing and control of the control unit 180, and may temporarily store input/output data (e.g., a phonebook, messages, audio, and the like). The memory unit 160 may also store the frequency of use of each item of data (for example, the frequency of use of each telephone number, each message, and each piece of multimedia).

The memory unit 160 may further store data related to a predetermined text input related function to be performed according to the motion when the motion of the mobile terminal is sensed. This will be explained again later.

In addition, the memory unit 160 may store data on vibration and sound of various patterns output when the touch is input on the touch screen.

The memory 160 may include at least one type of storage medium from among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic disk, and an optical disk. The mobile terminal 100 may operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user into the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal has been correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal, performing related control and processing for voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code can be implemented as a software application written in a suitable programming language. The software code is stored in the memory 160 and can be executed by the controller 180.

FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.

The disclosed mobile terminal 100 includes a bar-shaped terminal body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (casing, housing, cover, and the like) that forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the audio output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, the interface 170, and the like may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The audio output unit 152 and the camera 121 are disposed in a region adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in a region adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the mobile terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion.

The contents input by the first or second operation unit 131 or 132 may be set variously. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the audio output unit 152 or switching the display unit 151 into a touch recognition mode.

FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may be additionally mounted on the rear surface of the terminal body, that is, on the rear case 102. The camera 121' has a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A), and may have the same or a different number of pixels from the camera 121.

For example, it is preferable that the camera 121 has a relatively low pixel count, so that an image of the user's face can be captured and transmitted to the other party during a video call or the like, while the camera 121' has a relatively high pixel count, since it often photographs a general subject that is not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or capable of popping up.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed with the camera 121'. The mirror 124 allows the user to look at his or her own face or the like when taking a picture of himself or herself (self-photographing) using the camera 121'.

An acoustic output 152 'may be additionally disposed on the rear surface of the terminal body. The sound output unit 152 'may implement the stereo function together with the sound output unit 152 (see FIG. 2A), and may be used for the implementation of the speakerphone mode during a call.

In addition to the antenna for communication, a broadcast signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), can be installed to be able to be drawn out from the terminal body.

A power supply unit 190 for supplying power to the mobile terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing touch. Like the display unit 151, the touch pad 135 may also be of a light-transmissive type. In this case, if the display unit 151 is configured to output visual information on both of its sides (i.e., toward both the front and the rear of the mobile terminal), the visual information can also be recognized through the touch pad 135. The information output on both sides may all be controlled by the touch pad 135.

Meanwhile, a display dedicated to the touch pad 135 may be separately installed, so that a touch screen may be disposed in the rear case 102 as well.

The touch pad 135 operates in correlation with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to and behind the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

Hereinafter, embodiments related to a control method that can be implemented in the mobile terminal will be described with reference to the accompanying drawings.

The following embodiments can be implemented more easily when the display module 151 is configured as a touch screen. Hereinafter, it is assumed that the display module 151 is a touch screen. Further, in the following embodiments, it is assumed that a text input menu is being executed in the mobile terminal. The text input menu refers to any menu in which text can be input during its execution. Examples of such menus include a document creation menu, a message (including a short message, a multimedia message, an instant message, and the like) creation menu, a memo creation menu, and the like. In addition, since text can be input in a text input box during web browsing, or text can be input in order to register a name in the phonebook, the web browsing menu and the phonebook menu may also be regarded as kinds of text input menus.

First, referring to FIG. 2A, the direction of the rotational movement of the mobile terminal according to the present invention will be described.

As shown in FIG. 2A, a text input window 410 and a keypad 405 are displayed on the touch screen 151. That is, when a key of the keypad 405 is operated, the corresponding alphabet is input into the text input window 410.

A separate hardware keypad for inputting text into the mobile terminal 100 may be provided as the user input unit 130 instead of the keypad 405 on the touch screen 151.

When the keypad 405 is operated, text is input in the text input window 410 in the left-to-right direction.

Hereinafter, the left-to-right direction will be referred to as the "text input direction" of the text input window 410. The rotation axis perpendicular to the text input direction is referred to as the "first rotation axis", and the rotation axis parallel to the text input direction is referred to as the "second rotation axis".

The mobile terminal 100 can be rotated in two directions with respect to the first rotation axis in the space. The two directions are referred to as a first rotation direction and a second rotation direction, respectively.

Also, the mobile terminal 100 may be rotated in two directions with respect to the second rotation axis in the space. The two directions are referred to as a third rotation direction and a fourth rotation direction, respectively.

When the text input menu is being executed in the mobile terminal 100, the text input window 400 may be displayed on the touch screen 151 (S31). At this time, the mobile terminal 100 is rotationally moved.

Then, the sensor 140 may detect the rotational movement (S32).

Then, the control unit may control a predetermined text-input-related function to be performed in the text input window according to the direction and the number of times of the detected rotational movement (S33).
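
A minimal sketch of the S31-S33 flow, assuming an Android gyroscope as the motion sensor (the description itself only requires a sensor such as a geomagnetic or gravity sensor); the action names, threshold value, and axis-to-rotation-direction mapping are assumptions made for illustration:

    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager

    // Text-input-related functions that the control unit can dispatch to.
    enum class TextInputAction { CURSOR_LEFT, CURSOR_RIGHT, CURSOR_UP, CURSOR_DOWN }

    class RotationDispatcher(
        private val sensorManager: SensorManager,
        private val onAction: (TextInputAction) -> Unit
    ) : SensorEventListener {

        private val thresholdRadPerSec = 2.0f    // assumed angular-velocity threshold

        // S31: called once the text input window is being displayed.
        fun start() {
            val gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)
            sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME)
        }

        fun stop() = sensorManager.unregisterListener(this)

        // S32: sense the rotational movement of the terminal.
        override fun onSensorChanged(event: SensorEvent) {
            val wx = event.values[0]             // angular velocity about the device x-axis (rad/s)
            val wy = event.values[1]             // angular velocity about the device y-axis (rad/s)
            // S33: map direction to a text-input-related function.
            // The axis/direction mapping below is an assumption; a real implementation
            // would also debounce so one rotate-and-return gesture fires one action.
            when {
                wy > thresholdRadPerSec  -> onAction(TextInputAction.CURSOR_LEFT)
                wy < -thresholdRadPerSec -> onAction(TextInputAction.CURSOR_RIGHT)
                wx > thresholdRadPerSec  -> onAction(TextInputAction.CURSOR_UP)
                wx < -thresholdRadPerSec -> onAction(TextInputAction.CURSOR_DOWN)
            }
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
    }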

In FIG. 2A, the first and second rotation axes and the first through fourth rotation directions are defined in advance only for ease of description of the present invention; of course, they may also be understood as axes or directions different from those defined in FIG. 2A.

When the mobile terminal is rotated in a predetermined direction while the terminal user grasps the terminal main body, the corresponding text input related function is performed, thereby facilitating text input.

Hereinafter, the text input related functions will be described in more detail.

FIG. 3 is a flowchart of an embodiment of a control method of a mobile terminal according to the present invention, and FIGS. 4 to 11 are state diagrams of a display screen on which an embodiment of a control method of a mobile terminal according to the present invention is implemented.

In FIG. 4, only the text input window is shown on the touch screen 151 for simplicity of illustration. To help in understanding the rotational movement direction of the mobile terminal, FIG. 4 also shows the mobile terminal viewed from direction A in FIG. 2A, i.e., from the side.

As shown in (4-1) of FIG. 4, a predetermined text has been input in the text input window 400. As an example, the input text forms a sentence. The sentence is composed of a plurality of words, and a space is input between the words. A space symbol 423 may be displayed for each space in the text input window 400 so that the terminal user can visually recognize that the space has been input. If the input text is a message, the space symbol 423 may be configured not to be transmitted to the other party even when the message is completed and transmitted to the other party.

The text input window 400 may display a cursor 410 indicating the position at which text will be input when the keypad 405 is operated.

The mobile terminal 100 is rotated in the first rotation direction with respect to the first rotation axis, as shown in (4-2) of FIG.

Then, the sensor senses the rotational movement of the mobile terminal 100.

Then, the controller 180 moves the cursor 410 to the left by one alphabet in accordance with the sensed rotational movement.

The controller 180 may control the cursor 410 to move only when the sensed rotational movement is at or above a predetermined angular velocity. Alternatively, the controller may control the cursor 410 to move when the sensed rotational movement is at or above a predetermined angular distance, even if it is below the predetermined angular velocity. Alternatively, the controller may control the cursor to move whenever a rotational movement is detected, regardless of the angular velocity and the angular distance.
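
The two trigger policies mentioned above (a minimum angular velocity, or a minimum accumulated angular distance) could be sketched as follows; the numeric thresholds are assumed values, not taken from this description:

    // Decides whether one sensed rotation sample should trigger cursor movement.
    class RotationTrigger(
        private val minAngularVelocity: Float = 2.0f,   // rad/s, assumed
        private val minAngularDistance: Float = 0.35f   // rad (about 20 degrees), assumed
    ) {
        enum class Policy { BY_VELOCITY, BY_DISTANCE, ALWAYS }

        private var accumulatedAngle = 0.0f

        // angularVelocity: rad/s about the first rotation axis; dtSeconds: sample interval.
        fun shouldMoveCursor(angularVelocity: Float, dtSeconds: Float,
                             policy: Policy): Boolean = when (policy) {
            Policy.BY_VELOCITY -> kotlin.math.abs(angularVelocity) >= minAngularVelocity
            Policy.BY_DISTANCE -> {
                accumulatedAngle += kotlin.math.abs(angularVelocity) * dtSeconds
                if (accumulatedAngle >= minAngularDistance) { accumulatedAngle = 0.0f; true }
                else false
            }
            Policy.ALWAYS -> angularVelocity != 0.0f    // move on any detected rotation
        }
    }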

As shown in (4-3) of FIG. 4, the rotationally moved mobile terminal 100 is returned to the original terminal position. When the rotationally moved mobile terminal 100 is returned to its original position, the controller 180 may control the cursor 410 not to move.

Accordingly, the terminal user can input the text at the position of the cursor 410 moved to the left.

In the above description, the cursor 410 is moved in the corresponding direction when the mobile terminal 100 is rotationally moved, and the cursor 410 maintains the moved position when the mobile terminal 100 returns to its original position. However, the present embodiment is not limited to this: the cursor 410 may instead be configured to remain at its original position while the mobile terminal 100 is rotationally moved, and to move in the corresponding direction when the mobile terminal 100 returns to its original position.

Hereinafter, for convenience of explanation, "returning to the original position after being rotationally moved" may simply be expressed as "rotational movement".

Meanwhile, although not shown, the cursor 410 may be moved to the right when the mobile terminal 100 is rotated in the second rotation direction with respect to the first rotation axis. It will be apparent to those skilled in the art that the same can be implemented with the same principles as described above, so that the detailed description will be omitted for the sake of simplicity.

This will be described below with reference to FIG. 5.

As shown in (5-1) of FIG. 5, a predetermined text has been input in the text input window 400.

The mobile terminal 100 is rotated in the third rotation direction with respect to the second rotation axis, as shown in (5-2) of FIG. 5.

Then, the sensor senses the rotational movement of the mobile terminal 100.

The controller 180 moves the cursor 410 upward by one line according to the detected rotation.

As shown in (5-3) of FIG. 5, the rotationally moved mobile terminal 100 is returned to the original terminal position. When the rotationally moved mobile terminal 100 is returned to its original position, the controller 180 may control the cursor 410 not to move.

Accordingly, the terminal user can input text at the position of the cursor 410 that has been moved up by one line.

As described above, the text-input-related function (for example, cursor movement) may be performed when the terminal is rotationally moved at or above a predetermined angular velocity and/or a predetermined angular distance, and it may be performed before or after the terminal returns to the original terminal position after the rotational movement; a repeated detailed description of this is therefore omitted. The same applies to the following description.

Meanwhile, although not shown, the cursor 410 may be moved to the lower line when the mobile terminal 100 is rotated in the fourth rotation direction with respect to the second rotation axis. It will be apparent to those skilled in the art that the same can be implemented with the same principles as described above, so that the detailed description will be omitted for the sake of simplicity.

The text-input-related functions are not limited to movement of the cursor and may be more varied. This will be further described with reference to FIG. 6.

As shown in (6-1) of FIG. 6, a predetermined text has been input in the text input window 400. The cursor 410 is displayed in the text input window 400.

The mobile terminal may be rotationally moved twice in succession (i.e., a double rotational movement) in the first rotation direction with respect to the first rotation axis. The double rotational movement refers to a first rotational movement in the first rotation direction followed by a second rotational movement in the same direction within a predetermined time (for example, 0.5 seconds) after the terminal returns to the original terminal position.

Then, as shown in (6-2) of FIG. 6, an enter may be input at the position of the cursor 410. An enter symbol 425 may be displayed at the entered position so that the terminal user can visually recognize that the enter has been input in the text input window 400. If the input text is a message, the enter symbol 425 may be configured not to be transmitted to the other party even when the message is completed and transmitted to the other party.
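
The double rotational movement could be detected, for example, by checking that two rotations in the same direction occur within the predetermined interval; the 0.5-second value comes from the passage above, and the rest is an illustrative assumption:

    // Detects a "double rotation": two rotations in the same direction within maxGapMillis.
    class DoubleRotationDetector(private val maxGapMillis: Long = 500) {
        private var lastDirection: Int? = null      // e.g., +1 = first rotation direction
        private var lastTimeMillis: Long = 0

        // Returns true when the current rotation completes a double rotation,
        // in which case the caller would input an enter at the cursor position.
        fun onRotation(direction: Int, nowMillis: Long): Boolean {
            val isDouble = lastDirection == direction &&
                    nowMillis - lastTimeMillis <= maxGapMillis
            lastDirection = if (isDouble) null else direction   // reset after a match
            lastTimeMillis = nowMillis
            return isDouble
        }
    }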

In FIG. 4, the cursor 410 is moved by one alphabet when the mobile terminal 100 is rotationally moved in the first or second rotation direction with respect to the first rotation axis. However, this embodiment is not limited to this. This will be further described with reference to FIG. 7.

As shown in (7-1) of FIG. 7, a predetermined text has been input in the text input window 400. The cursor 410 is displayed in the text input window 400. A space is adjacent to the left side of the cursor 410, and the word "you" is adjacent to the right side of the cursor 410.

The mobile terminal 100 is rotated in the first rotation direction with respect to the first rotation axis.

Then, as shown in (7-2) of FIG. 7, the cursor 410 is moved to the left past the space. Therefore, the word "miss" is adjacent to the left side of the cursor 410, and the space is adjacent to the right side of the cursor 410.

The mobile terminal 100 is rotated once more in the first rotation direction with respect to the first rotation axis.

Then, as shown in (7-3) of FIG. 7, the cursor 410 can be moved to the left past one alphabet, the "s" of the word "miss". That is, if there is a word in the direction in which the cursor 410 is to be moved, the cursor can be shifted by one alphabet each time the rotational movement is performed.

Alternatively, as shown in (7-4) of FIG. 7, if there is a word in the direction in which the cursor 410 is to be moved, the cursor may move past the entire word "miss" at once rather than moving by one alphabet.
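
The two cursor-movement behaviours of (7-3) and (7-4), stepping one alphabet at a time or jumping past the whole word, could be sketched on a plain string and a cursor index as follows (illustrative only; the sample text is assumed):

    // Move the cursor one position to the left, as in (7-3).
    fun moveLeftByOne(cursor: Int): Int = maxOf(0, cursor - 1)

    // Move the cursor left past an entire word (or a single space), as in (7-4).
    fun moveLeftByWord(text: String, cursor: Int): Int {
        var i = cursor
        if (i > 0 && text[i - 1] == ' ') return i - 1        // step back over a single space
        while (i > 0 && text[i - 1] != ' ') i--              // skip to the start of the word
        return i
    }

    fun main() {
        val text = "I miss you"
        var cursor = 7                         // just before "you"
        cursor = moveLeftByWord(text, cursor)  // past the space -> 6
        cursor = moveLeftByWord(text, cursor)  // past the whole word "miss" -> 2
        println(cursor)                        // prints 2
    }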

In the above description, the cursor 410 is moved or an enter is input as the text-input-related function when the mobile terminal 100 is rotationally moved. However, this embodiment is not limited thereto. This will be further described with reference to FIG. 8.

As shown in (8-1) of FIG. 8, a predetermined text has been input in the text input window 400. The cursor 410 is displayed in the text input window 400. The word "your" is adjacent to the left side of the cursor 410.

The mobile terminal 100 is rotationally moved in the first rotation direction with respect to the first rotation axis.

Then, as shown in (8-2) of FIG. 8, the "r" of the word "your" on the left side of the cursor 410 is deleted while the cursor 410 moves to the left by one alphabet position. That is, the rotational movement in the first rotation direction may correspond to a backspace function.

Then, the mobile terminal 100 is rotationally moved in the second rotation direction with respect to the first rotation axis.

Then, as shown in (8-3) of FIG. 8, the cursor 410 moves to the right by one alphabet position and a space is input at that position. That is, the rotational movement in the second rotation direction may correspond to a space function.
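
The backspace-like behaviour of the first rotation direction and the space-inserting behaviour of the second rotation direction could be modelled on a text buffer and a cursor index, as in the sketch below (an assumption for illustration, not the implementation of this embodiment). The same two operations also reproduce the "misssyou" to "miss you" correction walked through with reference to FIG. 9 below.

    // Editing state: the text buffer and the cursor position within it.
    class TextField(initial: String) {
        val buffer = StringBuilder(initial)
        var cursor: Int = buffer.length
            private set

        fun moveCursorTo(position: Int) {
            cursor = position.coerceIn(0, buffer.length)
        }

        // First rotation direction: delete the alphabet to the left of the cursor (backspace).
        fun backspace() {
            if (cursor > 0) {
                buffer.deleteCharAt(cursor - 1)
                cursor--                       // text to the right shifts left by one position
            }
        }

        // Second rotation direction: insert a space at the cursor and move it right.
        fun insertSpace() {
            buffer.insert(cursor, ' ')
            cursor++                           // text to the right shifts right by one position
        }
    }

    fun main() {
        val field = TextField("misssyou")
        field.moveCursorTo(5)                  // touch between "misss" and "you"
        field.backspace()                      // "missyou", cursor now at 4
        field.insertSpace()                    // "miss you", cursor now at 5
        println(field.buffer)                  // prints "miss you"
    }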

Hereinafter, the space and backspace functions will be described in more detail with reference to FIG. 9.

As shown in (9-1) of FIG. 9, a predetermined text has been input in the text input window 400. The cursor 410 is displayed in the text input window 400. The phrase "misssyou" is on the left side of the cursor 410. The phrase "misssyou" is a typo of "miss you". Hereinafter, a process for correcting this typo will be described.

The position between "misss" and "you" in the phrase "misssyou" is touched. That is, the desired position is touched in order to move the cursor 410 there.

Then, as shown in (9-2) of FIG. 9, the cursor 410 is positioned between "misss" and "you".

Then, the mobile terminal 100 is rotationally moved in the first rotation direction with respect to the first rotation axis.

Then, the last "s" of "misss" is deleted, and the cursor 410 is moved to the left by one alphabet position. When the cursor 410 is moved to the left by one alphabet position, all the text located on the right side of the cursor 410, i.e., "you", is also shifted to the left by one alphabet position. That is, the phrase "misssyou" is converted into "missyou". The cursor 410 is then positioned between "miss" and "you" of the converted phrase "missyou".

Then, the mobile terminal 100 is rotationally moved in the second rotation direction with respect to the first rotation axis.

Then, the cursor 410 is moved to the right by one alphabet position, and a space is input at the moved position. When the cursor 410 is moved to the right, all the text located on the right side of the cursor 410, i.e., "you", is shifted to the right by one alphabet position. That is, the phrase "missyou" is converted into "miss you". As a result, the correction of the above-described typo is completed.

In FIGS. 8 and 9, one alphabet is deleted each time the mobile terminal 100 is rotationally moved in the first rotation direction. However, this embodiment is not limited to this. This will be further described with reference to FIG. 10.

As shown in (10-1) of FIG. 10, a predetermined text has been input in the text input window 400. The cursor 410 is displayed in the text input window 400, and a space is on the left side of the cursor 410.

The mobile terminal 100 is rotationally moved in the first rotation direction with respect to the first rotation axis.

Then, as shown in (10-2) of FIG. 10, the space on the left side of the cursor 410 is deleted while the cursor 410 moves to the left by one alphabet position. Therefore, the word "you" is now adjacent to the left side of the cursor 410.

The mobile terminal 100 is rotationally moved once more in the first rotation direction with respect to the first rotation axis.

Then, as shown in (10-3) of FIG. 10, the "u" of the word "you" can be deleted while the cursor 410 is shifted to the left by one alphabet position. That is, if there is a word in the leftward direction toward which the cursor 410 is to be moved, one alphabet of that word can be deleted each time the rotational movement is performed.

Alternatively, as shown in (10-4) of FIG. 10, when there is a word in the leftward direction toward which the cursor 410 is to be moved, the entire word "you" may be deleted at once instead of one alphabet at a time, and the cursor may move to the left by the number of positions corresponding to the number of alphabets in the deleted word.
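
The alternative deletion behaviours of (10-3) and (10-4), deleting one alphabet per rotation or deleting the whole word to the left of the cursor at once, could be sketched as follows (illustrative only; the sample text is assumed):

    // Delete one alphabet to the left of the cursor; returns the new text and cursor.
    fun deleteOneLeft(text: String, cursor: Int): Pair<String, Int> =
        if (cursor > 0) text.removeRange(cursor - 1, cursor) to cursor - 1
        else text to cursor

    // Delete the entire word (or single space) to the left of the cursor at once,
    // moving the cursor left by the number of deleted characters.
    fun deleteWordLeft(text: String, cursor: Int): Pair<String, Int> {
        if (cursor == 0) return text to 0
        var start = cursor
        if (text[start - 1] == ' ') start--                   // a lone space is deleted as-is
        else while (start > 0 && text[start - 1] != ' ') start--
        return text.removeRange(start, cursor) to start
    }

    fun main() {
        val (text, cursor) = deleteOneLeft("I miss you ", 11)  // delete the trailing space
        println("$text|$cursor")                               // prints "I miss you|10"
        val (t2, c2) = deleteWordLeft(text, cursor)            // delete the whole word "you"
        println("$t2|$c2")                                     // prints "I miss |7"
    }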

Hereinafter, another text-input-related function will be described with reference to FIG. 11.

As shown in (11-1) of FIG. 11, a predetermined text has been input in the text input window 400.

The mobile terminal 100 is rotationally moved in the fourth rotation direction with respect to the second rotation axis.

Then, as shown in (11-2) of FIG. 11, an enter may be input at the position corresponding to the cursor 410.

The mobile terminal 100 is rotated in the third rotation direction with respect to the second rotation axis.

Then, as shown in (11-3) in FIG. 11, all of the previously input text can be deleted.

It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

For example, in the above description, the cursor is moved, a space or an enter is input, or previously input text is deleted repeatedly according to the number of times the mobile terminal 100 is rotationally moved.

However, the present embodiment is not limited to this: the cursor may instead be moved, a space may be input, or the previously input text may be deleted continuously according to the length of time from when the mobile terminal 100 is rotationally moved until it returns to the original terminal position.

It has also been described above that the previously input text is deleted when the mobile terminal 100 is rotationally moved in the first rotation direction, and that a space is input when it is rotationally moved in the second rotation direction. However, the present embodiment is not limited to this; a space may instead be input on rotation in the first rotation direction, and the previously input text may be deleted on rotation in the second rotation direction. The same applies to the third and fourth rotation directions. That is, the matching of a specific text-input-related function to a specific rotation direction in the above description is merely an example, and other text-input-related functions may be matched to a given rotation direction.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the control unit 180 of the terminal.

Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
120: A/V input unit 130: user input unit
140: sensing unit 150: output unit
160: memory 170: interface unit
180: control unit 190: power supply unit

Claims (18)

1. A mobile terminal comprising:
a display unit for displaying a text input window;
a sensor for sensing movement of the terminal; and
a control unit for performing a predetermined text-input-related function in the text input window according to the terminal movement detected by the sensor during execution of a text input menu,
wherein the sensed movement includes a rotational movement about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto, and
wherein the control unit controls a space to be input to the text input window when the rotational movement is performed in a first rotation direction with respect to the rotation axis.

2. (Deleted)

3. The mobile terminal of claim 1, wherein the control unit controls the text-input-related function to be performed when the rotational movement is equal to or greater than a predetermined angular velocity.

4. The mobile terminal of claim 1, wherein the control unit controls the text-input-related function to be performed when the rotational movement is equal to or greater than a predetermined angular range.

5. The mobile terminal of claim 1, wherein the control unit controls the text-input-related function to be performed before or after the terminal returns to its original position when the rotational movement is performed.

6. (Deleted)

7. (Deleted)

8. (Abandoned due to the setting registration fee) The mobile terminal of claim 1, wherein a cursor is displayed in the text input window, and the space is input adjacent to the position of the cursor.

9. A mobile terminal comprising:
a display unit for displaying a text input window;
a sensor for sensing movement of the terminal; and
a control unit for performing a predetermined text-input-related function in the text input window according to the terminal movement detected by the sensor during execution of a text input menu,
wherein the sensed movement includes a rotational movement about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto, and
wherein the control unit controls an enter to be input to the text input window when the rotational movement is performed twice in succession in a first rotation direction with respect to the rotation axis.

10. A mobile terminal comprising:
a display unit for displaying a text input window;
a sensor for sensing movement of the terminal; and
a control unit for performing a predetermined text-input-related function in the text input window according to the terminal movement detected by the sensor during execution of a text input menu,
wherein the sensed movement includes a rotational movement about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto, and
wherein the control unit controls text previously input in the text input window to be deleted when the rotational movement is performed in a second rotation direction with respect to the rotation axis.

11. (Abandoned due to the setting registration fee) The mobile terminal of claim 10, wherein a cursor is displayed in the text input window, and the control unit controls the previously input text adjacent to the cursor position to be deleted.

12. (Abandoned due to the setting registration fee) The mobile terminal of claim 11, wherein the control unit controls one alphabetic character or one word of the previously input text to be deleted each time the rotational movement is performed.

13. (Deleted)

14. A mobile terminal comprising:
a display unit for displaying a text input window;
a sensor for sensing movement of the terminal; and
a control unit for performing a predetermined text-input-related function in the text input window according to the terminal movement detected by the sensor during execution of a text input menu,
wherein the sensed movement includes a rotational movement about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto, and
wherein the control unit controls an enter to be input to the text input window when the rotational movement is performed in a third rotation direction with respect to the rotation axis parallel to the text input direction of the display unit.

15. A mobile terminal comprising:
a display unit for displaying a text input window;
a sensor for sensing movement of the terminal; and
a control unit for performing a predetermined text-input-related function in the text input window according to the terminal movement detected by the sensor during execution of a text input menu,
wherein the sensed movement includes a rotational movement about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto, and
wherein the control unit controls all of the text input to the text input window to be deleted when the rotational movement is performed in a fourth rotation direction with respect to the rotation axis parallel to the text input direction of the display unit.

16. A mobile terminal comprising:
a display unit for displaying a text input window;
a sensor for sensing movement of the terminal; and
a control unit for performing a predetermined text-input-related function in the text input window according to the terminal movement detected by the sensor during execution of a text input menu,
wherein the sensed movement includes a rotational movement about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto,
wherein the control unit controls a space to be input to the text input window when the rotational movement is performed in a first rotation direction with respect to a first rotation axis perpendicular to the text input direction of the display unit, and controls the text input in the text input window to be deleted by one alphabetic character or one word when the rotational movement is performed in a second rotation direction with respect to the first rotation axis, and
wherein the control unit controls an enter to be input to the text input window when the rotational movement is performed in a third rotation direction with respect to a second rotation axis parallel to the text input direction of the display unit, and controls all of the text input in the text input window to be deleted when the rotational movement is performed in a fourth rotation direction with respect to the second rotation axis.

17. (Deleted)

18. (Abandoned due to the setting registration fee) A method of controlling a mobile terminal, the method comprising:
displaying a text input window on a display unit;
sensing a rotational movement of the terminal about a rotation axis that is the text input direction of the display unit or a direction perpendicular thereto; and
inputting a space to the text input window when the rotational movement is performed in a first rotation direction with respect to the rotation axis.
KR1020100002125A 2010-01-11 2010-01-11 Mobile terminal and method for controlling the same KR101629315B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100002125A KR101629315B1 (en) 2010-01-11 2010-01-11 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100002125A KR101629315B1 (en) 2010-01-11 2010-01-11 Mobile terminal and method for controlling the same

Publications (2)

Publication Number Publication Date
KR20110082238A KR20110082238A (en) 2011-07-19
KR101629315B1 true KR101629315B1 (en) 2016-06-14

Family

ID=44920297

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100002125A KR101629315B1 (en) 2010-01-11 2010-01-11 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR101629315B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101292050B1 (en) * 2011-07-29 2013-08-01 엘지전자 주식회사 Mobile terminal and method of controlling operation thereof

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100693591B1 (en) 2004-03-31 2007-03-14 주식회사 팬택앤큐리텔 Wireless communication terminal and its method for recognizing inclined direction as direction key input signal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100612851B1 (en) * 2004-07-14 2006-08-21 삼성전자주식회사 Phone using movement of the phone as input-key and key inputting method for using movement of the phone
KR20060114541A (en) * 2005-05-02 2006-11-07 삼성전자주식회사 Mobile communication terminal for controlling function using moving sensor
KR101487528B1 (en) * 2007-08-17 2015-01-29 엘지전자 주식회사 Mobile terminal and operation control method thereof

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100693591B1 (en) 2004-03-31 2007-03-14 주식회사 팬택앤큐리텔 Wireless communication terminal and its method for recognizing inclined direction as direction key input signal

Also Published As

Publication number Publication date
KR20110082238A (en) 2011-07-19


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant