KR102020329B1 - Mobile terminal and control method thereof - Google Patents


Publication number
KR102020329B1
Authority
KR
South Korea
Prior art keywords
virtual keyboard
keyboard
output
display unit
mobile terminal
Prior art date
Application number
KR1020120151189A
Other languages
Korean (ko)
Other versions
KR20140081434A (en)
Inventor
김민주 (Kim Min-ju)
진보필 (Jin Bo-pil)
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020120151189A
Publication of KR20140081434A
Application granted
Publication of KR102020329B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/40 Circuits

Abstract

The present invention relates to a mobile terminal that outputs a virtual keyboard, and to a control method thereof. The mobile terminal comprises: a display unit that outputs a first virtual keyboard; a user input unit configured to sense touch input to the display unit; a memory that stores essential words and a second virtual keyboard, corresponding to those words, to be output to the display unit; and a controller that generates a word based on touch input to the first virtual keyboard, outputs the generated word to the display unit, and outputs the second virtual keyboard when the generated word matches any one of the essential words.

Description

Mobile terminal and its control method {MOBILE TERMINAL AND CONTROL METHOD THEREOF}

The present invention relates to a mobile terminal, and more particularly, to a mobile terminal for outputting a virtual keyboard and a control method thereof.

Terminals may be divided into mobile or portable terminals and stationary terminals according to their mobility. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can directly carry them.

As their functions diversify, terminals are implemented in the form of multimedia players having complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts. Further, in order to support and enhance such functions, improvements to both the structural parts and the software parts of the terminal may be considered.

With these improvements, the mobile terminal can output a virtual keyboard to the display unit and receive text through touch inputs to the virtual keyboard. In this case, each key of the virtual keyboard may correspond to a number, a letter, or a symbol, entered one character or syllable at a time. Although text may be input through touch inputs to these keys, the mobile terminal needs to be enhanced so that letters can be input in units of words.

The present invention provides a mobile terminal capable of outputting a virtual keyboard for inputting letters in word units and a control method thereof.

One embodiment of the present invention relates to a mobile terminal. The mobile terminal includes: a display unit that outputs a first virtual keyboard; a user input unit configured to sense touch input to the display unit; a memory that stores essential words and a second virtual keyboard to be output to the display unit when at least one of the essential words is input; and a control unit that generates a word based on touch input to the first virtual keyboard, outputs the generated word to the display unit, and outputs the second virtual keyboard when the generated word matches any one of the essential words.
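The controller behavior described above can be sketched in code. The following is a hypothetical illustration only, assuming example trigger words and keyboard identifiers that are not taken from the patent: characters entered on the first virtual keyboard accumulate into a word, and completing a word that matches a stored essential word brings up the corresponding second virtual keyboard.

```python
class VirtualKeyboardController:
    """Illustrative sketch of the claimed controller logic (names are assumptions)."""

    def __init__(self, essential_words):
        # essential_words maps a stored trigger word to the id of the
        # second virtual keyboard that should be output for it
        self.essential_words = essential_words
        self.current_word = ""
        self.second_keyboard = None  # id of the second keyboard, once shown

    def on_key_touch(self, char):
        """Handle one touch input on the first virtual keyboard."""
        if char == " ":            # word boundary: check the generated word
            self._check_word()
            self.current_word = ""
        else:
            self.current_word += char

    def _check_word(self):
        # output the second virtual keyboard if the word matches
        if self.current_word in self.essential_words:
            self.second_keyboard = self.essential_words[self.current_word]


controller = VirtualKeyboardController({"meet": "contact", "at": "time"})
for ch in "meet ":
    controller.on_key_touch(ch)
assert controller.second_keyboard == "contact"
```

A real implementation would also handle backspace, word boundaries other than a space, and dismissal of the second keyboard; this sketch only shows the match-and-output step.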

According to an embodiment of the present disclosure, the control unit may simultaneously output the generated word and the first and second virtual keyboards to the display unit, with the first and second virtual keyboards separated from each other by a space in which information other than the virtual keyboards is output. The controller may select at least one of the keys included in the second virtual keyboard based on a touch input to the second virtual keyboard, and output information corresponding to the selected key(s) to the display unit together with the generated word. The controller may stop outputting the second virtual keyboard when the touch input to the second virtual keyboard is released.

According to an embodiment of the present disclosure, the controller may change the arrangement order of the keys according to how many times each of the keys included in the second virtual keyboard has been used.
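As a minimal sketch of this reordering behavior (key labels and class names are illustrative assumptions, not from the patent), each key tracks a usage count and the layout is rebuilt with the most-used keys first:

```python
from collections import Counter


class SecondKeyboard:
    """Hypothetical second virtual keyboard that reorders keys by usage."""

    def __init__(self, keys):
        self.keys = list(keys)
        self.use_count = Counter()   # times each key has been selected

    def select(self, key):
        self.use_count[key] += 1

    def ordered_keys(self):
        # Most frequently used keys come first; Python's sort is stable,
        # so ties keep their original arrangement order
        return sorted(self.keys, key=lambda k: -self.use_count[k])


kb = SecondKeyboard(["AM", "PM", "o'clock", "tomorrow"])
kb.select("tomorrow")
kb.select("tomorrow")
kb.select("PM")
assert kb.ordered_keys() == ["tomorrow", "PM", "AM", "o'clock"]
```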

According to an embodiment of the present disclosure, the second virtual keyboard may be a time combination keyboard including a combination of keys for inputting time information, a place combination keyboard including a combination of keys for inputting place information that includes the generated word, or a contact combination keyboard including a combination of keys for inputting contact information that includes the generated word.

According to an embodiment of the present disclosure, the controller may generate schedule information based on a touch input to at least one of the first and second virtual keyboards. When the second virtual keyboard is the contact combination keyboard, an image stored for at least one contact whose information includes the generated word may be output as a key of the contact combination keyboard.
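The three keyboard variants above can be sketched as a simple dispatch. All data here (place names, contact entries, image file names) are hypothetical examples used only to illustrate how the generated word filters the keys of the place and contact keyboards:

```python
TIME_KEYS = ["AM", "PM", "o'clock", "tomorrow"]
PLACES = ["station", "office"]                    # stored place names
CONTACTS = {"Kim": "kim.jpg", "Lee": "lee.jpg"}   # contact name -> stored image


def build_second_keyboard(word, keyboard_type):
    """Return the keys of the second virtual keyboard for a generated word."""
    if keyboard_type == "time":
        # time keys do not depend on the generated word
        return TIME_KEYS
    if keyboard_type == "place":
        # keys are place names that include the generated word
        return [p for p in PLACES if word in p]
    if keyboard_type == "contact":
        # each key shows the image stored for a matching contact
        return [(name, img) for name, img in CONTACTS.items() if word in name]
    return []


assert build_second_keyboard("sta", "place") == ["station"]
assert build_second_keyboard("Kim", "contact") == [("Kim", "kim.jpg")]
```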

In addition, an embodiment of the present invention relates to a method of controlling a mobile terminal. The method may include: outputting a first virtual keyboard to a display unit; generating a word based on a touch input to the first virtual keyboard and outputting the generated word to the display unit; and, when the generated word matches any one of the essential words stored in a memory, outputting a second virtual keyboard corresponding to the matched word to the display unit.

According to an embodiment of the present disclosure, the method of controlling the mobile terminal may further include generating schedule information based on a touch input to at least one of the first and second virtual keyboards.

According to the present invention, a word is generated based on a touch input to the first virtual keyboard, and a second virtual keyboard including keys associated with the generated word is displayed, so the user can easily input letters in word units using the second virtual keyboard. Thus, when inputting text, the number of keyboard inputs can be minimized.

In addition, according to the present invention, since the second virtual keyboard varies with the word input through the first virtual keyboard, and a schedule can be generated using the first and second virtual keyboards, a quick and easy user interface can be provided.

FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
FIGS. 2A and 2B are perspective views showing the appearance of a mobile terminal according to the present invention.
FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating a mobile terminal outputting a time combination keyboard according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating a mobile terminal outputting a place combination keyboard according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating a mobile terminal outputting a contact combination keyboard according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating a mobile terminal outputting first and second virtual keyboards according to an exemplary embodiment.
FIGS. 8 and 9 are diagrams illustrating a mobile terminal generating schedule information according to an embodiment of the present invention.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, those skilled in the art will readily appreciate that the configurations according to the embodiments described herein may also be applied to fixed terminals, such as a digital TV or a desktop computer, except for components applicable only to mobile terminals.

FIG. 1 is a block diagram illustrating a mobile terminal 100 according to an embodiment of the present invention. Referring to FIG. 1, the mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential; a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components 110 to 190 of the mobile terminal 100 will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives a broadcast signal and broadcast-related information from an external broadcast management server through a broadcast channel. Here, broadcast-related information means information related to a broadcast channel, a broadcast program, or a broadcast service provider. Broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal and broadcast-related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data according to transmission and reception of voice call signals, video call signals, text messages or multimedia messages.

The wireless internet module 113 is a module for wireless internet access and may be embedded in or external to the mobile terminal 100. Wireless internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, etc. may be used.

The location information module 115 is a module for acquiring the location of the mobile terminal 100; a representative example is a Global Positioning System (GPS) module.

Still referring to FIG. 1, the A/V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121, a microphone 122, and the like. The camera 121 processes image frames, such as still images or moving images, obtained by its image sensor in a video call mode or a photographing mode. The image frames processed by the camera 121 may be displayed on the display unit 151, stored in the memory 160, or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.

The microphone 122 processes an external sound signal into electrical voice data in a call mode, a recording mode, a voice recognition mode, and the like. In the call mode, the voice data processed by the microphone 122 may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. The microphone 122 may implement various noise-removal algorithms for removing noise generated while the external sound signal is being input.

The user input unit 130 generates input data used by the user to control the operation of the mobile terminal 100. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive or capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as whether there is user contact, the open/closed state of the mobile terminal 100, its position, orientation, acceleration, and deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may detect whether the slide phone is opened or closed. In addition, the sensing unit 140 may detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device.

The sensing unit 140 may include a proximity sensor 141. In addition, the sensing unit 140 may include a touch sensor (not shown) that detects a touch operation on the display unit 151.

The touch sensor may have the form of a touch film, a touch sheet, a touch pad, or the like. The touch sensor may be configured to convert a pressure applied to a specific portion of the display unit 151 or a change in capacitance generated at a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect a touch pressure as well as a touched position and area.

When the touch sensor and the display unit 151 have a mutual layer structure, the display unit 151 may be used as an input device in addition to the output device. The display unit 151 may be referred to as a “touch screen”.

If there is a touch input via the touch screen, signals corresponding thereto are sent to a touch controller (not shown). The touch controller processes signals transmitted from the touch sensor and then transmits data corresponding to the processed signals to the controller 180. As a result, the controller 180 can determine which area of the display unit 151 is touched.
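The last step of this pipeline, resolving a touched coordinate to a screen region, can be illustrated with a simple hit test. The layout values and row contents below are hypothetical, chosen only to show how the controller could map a coordinate reported by the touch controller to a virtual-keyboard key:

```python
def key_at(x, y, key_width=40, key_height=60, rows=("qwert", "asdfg")):
    """Return the virtual-keyboard key whose rectangle contains (x, y), or None.

    Assumes a uniform grid of key_width x key_height rectangles, with the
    top-left key at (0, 0). All dimensions here are illustrative.
    """
    row = y // key_height
    col = x // key_width
    if 0 <= row < len(rows) and 0 <= col < len(rows[row]):
        return rows[row][col]
    return None


assert key_at(45, 10) == "w"    # second column, first row
assert key_at(5, 70) == "a"     # first column, second row
assert key_at(500, 10) is None  # outside the keyboard area
```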

When the touch screen is capacitive, the touch screen may be configured to detect the proximity of the sensing object by the change of the electric field according to the proximity of the sensing object. The touch screen may be classified as the proximity sensor 141.

The proximity sensor 141 refers to a sensor that detects the presence or absence of a sensing target without mechanical contact by using electromagnetic force or infrared rays. The proximity sensor 141 has a longer life and higher utilization than a contact sensor. Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

For convenience of description, the act of bringing a sensing object close to the touch screen without contact is referred to below as a "proximity touch," and the act of bringing the sensing object into contact with the touch screen is referred to as a "contact touch."

The proximity sensor 141 detects the presence or absence of a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the presence or absence of the proximity touch and the proximity touch pattern may be output on the touch screen.

The output unit 150 generates an output related to visual, auditory, tactile, and the like. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal 100 operates in a call mode, the display unit 151 displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 operates in a video call mode or a photographing mode, the display unit 151 displays a photographed image, a received image, a UI, or a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

At least one display (or display element) included in the display unit 151 may be configured to be transparent or light-transmissive so that the outside can be seen through it. Such a display may be referred to as a transparent display; a representative example is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be light-transmissive. That is, the display unit 151 may include overlapping first and second surfaces, each having a transparent or light-transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151. The display unit 151 may then be referred to as a transparent display unit 155.

There may be two or more display units 151 according to the implementation form of the mobile terminal 100. For example, a plurality of display units may be spaced apart or integrally located on one surface of the mobile terminal 100, or may be located on different surfaces.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, or in a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output sound signals related to functions performed by the mobile terminal 100 (for example, a call signal reception sound or a message reception sound). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying occurrence of an event of the mobile terminal 100. Examples of events generated in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying occurrence of an event by vibration, in addition to a video signal or an audio signal. Since the video signal or the audio signal may be output through the display unit 151 or the sound output module 152, the display unit 151 and the sound output module 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity, pattern, and the like of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through a jet or suction opening, a brush against the skin surface, contact with an electrode, an electrostatic force, and the reproduction of a sense of warmth or cold using an element capable of generating or absorbing heat.

The haptic module 154 may not only deliver a haptic effect through direct contact, but may also be configured to allow the user to feel the haptic effect through a muscle sense such as a finger or an arm. Two or more haptic modules 154 may be provided according to configuration aspects of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and may temporarily store input and output data (for example, a phone book, a message, a still image, a video, etc.). The memory 160 may store data regarding vibration and sound of various patterns output when a touch is input on the touch screen.

The memory 160 may include at least one storage medium among flash memory, a hard disk, multimedia card micro type memory, card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.

The identification module is a chip that stores various types of information for authenticating usage rights of the mobile terminal 100. The identification module may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or through which various command signals input by the user at the cradle are delivered to the mobile terminal 100. Such command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The controller 180 controls the overall operation of the mobile terminal 100, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for playing multimedia; the multimedia module 181 may be implemented within the controller 180 or separately from it. The controller 180 may also perform pattern recognition processing capable of recognizing handwriting input and drawing input on the touch screen as text and images, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more functions and operations described herein. Software code can be implemented in software applications written in the appropriate programming languages. Such software code may be stored in the memory 160 and executed by the controller 180.

Hereinafter, a method of processing user input to the mobile terminal 100 will be described.

The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal 100 and may include a plurality of manipulation units. The manipulation units may be collectively referred to as a manipulating portion, and any manner of manipulation may be employed as long as it is operated in a tactile manner by the user.

The display unit 151 may display various types of visual information. Such visual information may be displayed in the form of letters, numbers, symbols, graphics, icons, and the like, and may be formed of a 3D stereoscopic image. For inputting visual information, at least one of letters, numbers, symbols, graphics, and icons may be displayed in a predetermined arrangement to be implemented in the form of a keypad. These keypads can be called "soft keys."

The display unit 151 may operate as a single whole area or may be divided into a plurality of regions. In the latter case, the plurality of regions may be configured to operate in association with each other. For example, an output window and an input window may be displayed on the upper and lower portions of the display unit 151, respectively; each is an area allocated for the output or input of information. Soft keys displaying numbers for inputting a telephone number may be output in the input window. When a soft key is touched, the number corresponding to the touched soft key is displayed in the output window. When a manipulation unit is operated, a call connection to the telephone number displayed in the output window may be attempted, or the text displayed in the output window may be input to an application.
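The output-window/input-window interaction above can be sketched as follows. The class and method names are illustrative assumptions, not the patent's implementation; the sketch only shows how digit soft keys in the input window accumulate into the output window until the call is placed:

```python
class DialerScreen:
    """Hypothetical split display: input window (soft keys) + output window."""

    def __init__(self):
        self.output_window = ""   # upper region: accumulated telephone number
        self.dialed = None        # number for which a call was attempted

    def touch_softkey(self, digit):
        # each soft key in the input window displays one digit;
        # touching it appends that digit to the output window
        self.output_window += digit

    def press_call(self):
        # operating the manipulation unit attempts a call connection
        # to the number currently shown in the output window
        self.dialed = self.output_window


screen = DialerScreen()
for d in "0102":
    screen.touch_softkey(d)
screen.press_call()
assert screen.dialed == "0102"
```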

The display unit 151 or the touch pad may be configured to detect a touch scroll. The user may move an object displayed on the display unit 151, for example, a cursor or a pointer located on an icon, by scrolling the display unit 151 or the touch pad. In addition, when the finger is moved on the display unit 151 or the touch pad, a path along which the finger moves may be visually displayed on the display unit 151. This may be useful for editing an image displayed on the display unit 151.

In response to the display unit 151 and the touch pad being touched together within a predetermined time range, one function of the mobile terminal 100 may be executed. When touched together, the user may clamp the main body of the mobile terminal 100 using the thumb and the index finger. In this case, one function of the mobile terminal 100 to be executed may be, for example, activation or deactivation of the display unit 151 or the touch pad.

FIGS. 2A and 2B are perspective views showing the appearance of a mobile terminal 100 according to the present invention. FIG. 2A shows the front and one side of the mobile terminal 100, and FIG. 2B shows the rear and the other side.

Referring to FIG. 2A, the mobile terminal 100 includes a bar-shaped terminal body. However, the mobile terminal 100 is not limited thereto and may be implemented in various forms in which two or more bodies are coupled to be relatively movable, such as a slide type, a folder type, a swing type, or a swivel type.

The terminal body includes a case (casing, housing, cover, etc.) forming an external appearance. In an embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are built in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally located between the front case 101 and the rear case 102.

The cases may be formed by injection-molding synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the audio output unit 152, the camera 121, the user input unit 130 (see FIG. 1), the microphone 122, the interface 170, and the like may be located in the terminal body, mainly in the front case 101.

The display unit 151 occupies a main portion of the front case 101. The sound output unit 152 and the camera 121 are positioned in an area adjacent to one end of the display unit 151, and the first user input unit 131 and the microphone 122 are located in an area adjacent to the other end. The second user input unit 132 and the interface 170 may be located at sides of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive a command for controlling the operation of the mobile terminal 100. The user input unit 130 may include a plurality of manipulation units 131 and 132.

The first and second manipulation units 131 and 132 may receive various commands. For example, the first manipulation unit 131 may receive commands such as start, end, and scroll. The second manipulation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 and switching the display unit 151 to a touch selection mode.

Referring to FIG. 2B, a rear camera 121 ′ may be additionally mounted on the rear surface of the terminal body, that is, the rear case 102. The rear camera 121 ′ has a photographing direction opposite to the front camera 121 (see FIG. 2A), and may be configured to have different pixels from the front camera 121.

For example, the front camera 121 may have a relatively low resolution and the rear camera 121' a high resolution. Accordingly, when the front camera 121 is used during a video call to photograph the user's face and transmit it to the counterpart in real time, the size of the transmitted data may be reduced. The rear camera 121', on the other hand, may be used to capture high-quality images for storage.

Meanwhile, the cameras 121 and 121 'may be installed in the terminal body to rotate or pop up.

The flash 123 and the mirror 124 may be further positioned adjacent to the rear camera 121'. The flash 123 emits light toward the subject when the user photographs the subject with the rear camera 121'. The mirror 124 allows the user to view his or her own face when taking a self-portrait with the rear camera 121'.

The rear sound output unit 152 'may be additionally located at the rear of the terminal body. The rear sound output unit 152 ′ may perform a stereo function together with the front sound output unit 152 (see FIG. 2A), and may perform a speakerphone function during a call.

An antenna 116 for receiving broadcast signals may be additionally positioned on the side of the terminal body, in addition to the antenna used for calls. The antenna 116, which constitutes a part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be pulled out of the terminal body.

A power supply unit 190 for supplying power to the mobile terminal 100 is mounted on the terminal body. The power supply unit 190 may be built into the terminal body, or may be configured to be directly detachable from the outside of the terminal body.

The rear case 102 may be additionally equipped with a touch pad 135 for sensing a touch. Like the display unit 151 (refer to FIG. 2A), the touch pad 135 may be configured to have a light transmission type. In addition, a rear display unit for outputting visual information may be additionally mounted on the touch pad 135. In this case, information output from both the front display unit 151 and the rear display unit may be controlled by the touch pad 135.

The touch pad 135 operates in association with the display unit 151. The touch pad 135 may be disposed parallel to the rear of the display unit 151. The touch pad 135 may be equal in size to or smaller than the display unit 151.

Referring back to the display unit 151 of the present invention, the display unit 151 may output a virtual keyboard including a plurality of keys. A virtual keyboard refers to an input device in which keys are arranged according to a certain standard. The controller 180 may detect a touch input to at least one of the keys included in the virtual keyboard and output the symbol information corresponding to that key to the display unit 151. Hereinafter, the display unit 151 outputting a virtual keyboard will be described in detail.

FIG. 3 is a flowchart illustrating a control method of a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 3, a control method of a mobile terminal according to an embodiment of the present disclosure includes the step of outputting a first virtual keyboard to a display unit (S110). In general, the first virtual keyboard may be the default virtual keyboard output to the mobile terminal 100. For example, at least one of a Korean keyboard, an English keyboard, and a symbol keyboard may be output as the first virtual keyboard.

Next, an operation of outputting a word generated by a touch input to the first virtual keyboard to the display unit (S120) may be performed. A touch input may be sensed on at least one of the keys included in the first virtual keyboard. In this case, the controller 180 may output the numbers, symbols, consonants, vowels, etc. corresponding to each touched key to the display unit 151 in response to the touch input.

In addition, the controller 180 may generate a word that can stand alone as an independent unit, based on successively detected touch inputs to the first virtual keyboard, and output the generated word. For example, when touch inputs to the keys corresponding to the Korean jamo "ㅇ", "ㅗ", "ㅎ", and "ㅜ" are sequentially detected, the controller 180 may output the word "오후" ("afternoon") to the display unit 151.

Next, a step (S130) of comparing the generated word with any one of the essential words stored in the memory may proceed. If the generated word matches any one of the essential words, an operation (S140) of outputting a second virtual keyboard corresponding to the one on the display unit may be performed.

The essential word may mean a word included in a precondition for outputting a second virtual keyboard different from the first virtual keyboard to the display unit 151. These essential words may be stored in the memory 160 along with a special virtual keyboard corresponding to the essential words.

For example, the word "afternoon" may be stored in the memory 160 as an essential word, and the second virtual keyboard to be output when the word "afternoon" is input may also be stored in the memory 160. In this case, the second virtual keyboard may be a time combination keyboard capable of inputting text related to time. For example, "14:00", "3 o'clock", and the like may be configured as keys of the time combination keyboard.

There may be a plurality of such essential words and corresponding special virtual keyboards. In addition, there may be a plurality of essential words corresponding to one special virtual keyboard.
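The lookup described in steps S130 and S140 above can be sketched as a simple table from essential words to special virtual keyboards. The following is only an illustrative Python sketch: the names `ESSENTIAL_WORDS` and `select_second_keyboard`, and the particular word-to-keyboard pairs, are assumptions for illustration, not part of the disclosure (which stores the essential words and keyboards in the memory 160).

```python
# Illustrative sketch: map each essential word to the special virtual keyboard
# it triggers. Several essential words may map to the same special keyboard,
# which a dictionary expresses directly.
ESSENTIAL_WORDS = {
    "afternoon": "time_combination_keyboard",
    "am": "time_combination_keyboard",
    "pm": "time_combination_keyboard",
    "family": "contact_combination_keyboard",
    "apgu": "place_combination_keyboard",
}

def select_second_keyboard(generated_word):
    """Return the special keyboard matched by the generated word (S130),
    or None to keep only the first virtual keyboard."""
    return ESSENTIAL_WORDS.get(generated_word.lower())
```

If a match is found, the returned keyboard would be output to the display unit (S140); otherwise input continues on the first virtual keyboard.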

Although not shown in the drawings, a step of outputting text to the display unit 151 based on a touch input to at least one of the first and second virtual keyboards may proceed. That is, the controller 180 may compose text using not only the first virtual keyboard but also the second virtual keyboard corresponding to the word generated by a touch input to the first virtual keyboard, and output the composed text to the display unit 151.

According to the present invention, text that would otherwise have to be typed key by key on the first virtual keyboard can be entered simply using the second virtual keyboard. In addition, since the second virtual keyboard output corresponds to the word generated by the touch input to the first virtual keyboard, various special virtual keyboards may be used depending on the generated word. Accordingly, a quick and simple user interface can be provided.

FIG. 4 is a diagram illustrating a mobile terminal outputting a time combination keyboard according to an embodiment of the present invention.

Referring to FIG. 4, the mobile terminal 100 according to an embodiment of the present disclosure may include a display unit 151 that outputs a first virtual keyboard 200. In response to a touch input to the first virtual keyboard, the controller 180 may select at least one of the plurality of keys and output the text corresponding to the selected key to the display unit 151. For example, the text corresponding to the selected key may be output to the text output window 300.

If the word generated by the touch input to the first virtual keyboard 200 matches at least one of the essential words stored in the memory 160, the controller 180 may output the second virtual keyboard corresponding to the matched word to the display unit 151.

For example, when the word "afternoon" is generated by a touch input to the first virtual keyboard 200, the controller 180 may output a time combination keyboard composed of a combination of keys for inputting time information. Essential words for outputting the time combination keyboard may include, for example, "am", "pm", "AM", "PM", and the like.

In this case, the controller 180 may output the second virtual keyboard 210 in place of the first virtual keyboard 200, or may output the second virtual keyboard 220 together with the first virtual keyboard 200.

When the first and second virtual keyboards 200 and 220 are output together, they may be spaced apart from each other so as to form a space for outputting information other than the first and second virtual keyboards 200 and 220. For example, the text 300 generated by touch inputs to the virtual keyboards may be output in the space formed between the first and second virtual keyboards 200 and 220.

The controller 180 may select at least one of the keys included in the second virtual keyboard based on a touch input to the second virtual keyboard, and output the letters corresponding to the selected key together with the generated word. For example, when a touch input to the "3 o'clock" key is detected, the letters "3 o'clock" may be output together with the word "afternoon".

When the touch input to the second virtual keyboard is released, the controller 180 may stop outputting the second virtual keyboard. This is because the second virtual keyboard is a special virtual keyboard and thus may not be needed again.

FIG. 5 is a diagram for describing a mobile terminal outputting a place combination keyboard according to an exemplary embodiment.

Referring to FIG. 5, the mobile terminal 100 according to an embodiment of the present invention may include a display unit 151 for outputting the first virtual keyboard 200. The text generated by the touch input to the first virtual keyboard 200 may be output to the text input window 300.

If the word generated by the touch input to the first virtual keyboard 200 matches at least one of the essential words stored in the memory 160, the controller 180 may output the second virtual keyboard corresponding to the matched word to the display unit 151.

For example, when the word "Apgu" is generated by a touch input to the first virtual keyboard 200, the controller 180 may output a place combination keyboard composed of a combination of keys for inputting place information including the word "Apgu". Essential words for outputting the place combination keyboard may include, for example, words indicating places stored in a map application, such as names of countries, cities, counties, wards, and towns.

The keys included in the place combination keyboard may be composed of place names including the word generated by the touch input to the first virtual keyboard 200. For example, when the essential word "Apgu" is generated, place names such as "Apgujeong Station" and "Apgujeong Rodeo" may be output to the display unit 151 as keys of the place combination keyboard.

In this case, the controller 180 may change the arrangement order of the keys according to how often each key included in the second virtual keyboard is used. For example, a history of place names input by the user may be stored as a database in the memory 160, and the keys of the second virtual keyboard may be arranged in order of the most frequently input (or most frequently used) place names. That is, referring to FIG. 5, when the keys are initially arranged in the order "Apgujeong Station", "Apgujeong Rodeo" but "Apgujeong Rodeo" has been input more frequently than "Apgujeong Station", the keys may be rearranged in the order "Apgujeong Rodeo", "Apgujeong Station".
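The frequency-based reordering just described can be sketched as a sort over a usage-count history. The following Python sketch is illustrative only; the name `rearrange_keys` and the dictionary standing in for the history database in the memory 160 are assumptions.

```python
def rearrange_keys(keys, usage_counts):
    """Arrange keys in descending order of use count. Keys with equal (or no)
    recorded history keep their original relative order, because Python's
    sort is stable."""
    return sorted(keys, key=lambda key: -usage_counts.get(key, 0))
```

With the example of FIG. 5, a history in which "Apgujeong Rodeo" was input more often than "Apgujeong Station" moves it to the front of the keyboard.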

FIG. 6 is a view for explaining a mobile terminal for outputting a contact combination keyboard according to an embodiment of the present invention.

Referring to FIG. 6, the mobile terminal 100 according to an embodiment of the present disclosure may include a display unit 151 that outputs a first virtual keyboard 200. The text generated by the touch input to the first virtual keyboard 200 may be output to the text input window 300.

Referring to FIG. 6, a contact combination keyboard composed of a combination of keys for inputting contact information may be output as the second virtual keyboard. Essential words for outputting the contact combination keyboard may include, for example, words corresponding to names, group names, titles, and the like stored in the phone book.

For example, when the word "family" is generated by a touch input to the first virtual keyboard 200, the controller 180 may output a contact combination keyboard composed of a combination of keys for inputting contact information including the word "family".

That is, the controller 180 may search for contact information including the word "family" among the contacts stored in the phone book, and output the searched contact information as keys of the contact combination keyboard. At this time, the keys of the contact combination keyboard may be composed of images included in the contact information. Although not shown in the figure, a name included in the contact information may be output as a key of the contact combination keyboard.
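The phone-book search described above can be sketched as a substring filter over contact records. This Python sketch is illustrative; the function name `contact_keys` and the record layout (`name`/`group` fields) are assumptions, since the actual contact data lives in the phone book in the memory 160.

```python
def contact_keys(phone_book, word):
    """Return the contacts whose name or group name contains the generated
    word; each matching contact becomes one key of the contact keyboard."""
    return [contact for contact in phone_book
            if word in contact.get("name", "") or word in contact.get("group", "")]
```

Each returned record could then be rendered as a key showing the contact's image or name, as the paragraph above describes.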

FIG. 7 illustrates a mobile terminal for outputting first and second virtual keyboards according to an exemplary embodiment.

Referring to FIG. 7, the mobile terminal 100 according to an embodiment of the present disclosure may include a display unit 151 for outputting first and second virtual keyboards 200 and 270. The word "family" may be generated through the first virtual keyboard 200, and a second virtual keyboard 270 whose keys correspond to contacts including the word "family" may be output.

In addition, a first key 272 of the second virtual keyboard 270 may correspond to a contact named "sister" among the contacts stored in the memory 160, and a second key 274 may correspond to a contact named "brother". The contacts corresponding to "sister" and "brother" may be included in a group called "family".

Referring to FIG. 7, the keys constituting the second virtual keyboard 270 may output image data included in the contacts as thumbnail images. Although not shown in the drawings, some of the data constituting a contact may instead be output as a key of the second virtual keyboard 270; for example, data corresponding to a phone number or name included in the contact may be output as a key.

In this case, the controller 180 may detect a touch input to the second virtual keyboard 270. The controller 180 may select any one of the keys included in the second virtual keyboard 270 using the coordinates at which the touch input is detected. The controller 180 can output the text corresponding to the selected key to the text output window 300.

For example, when a touch input to the first key 272 is detected, the controller 180 may select the first key 272 and output it to the display unit 151 in a manner that distinguishes it from the other keys. The controller 180 may output some of the data included in the first key 272 to the text output window 300; in particular, text corresponding to a phone number or name included in the contact may be output.

In addition, when touch inputs are detected on a plurality of keys of the second virtual keyboard 270, the controller 180 may output portions of the data corresponding to those keys to the text output window 300. For example, when a touch input is detected on each of the first key 272 and the second key 274, the controller 180 may output the texts "sister" and "brother" to the text output window 300. At this time, a conjunction such as "and" may be added to link the texts "sister" and "brother"; that is, the text "sister and brother" may be output to the text output window 300.
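Joining the texts of several selected keys with a conjunction can be sketched as follows. The helper name `join_selected` is illustrative, and in practice the conjunction would be chosen per language rather than hard-coded.

```python
def join_selected(texts, conjunction="and"):
    """Join the texts of multiple selected keys, inserting the conjunction
    between entries, e.g. ["sister", "brother"] -> "sister and brother"."""
    if len(texts) <= 1:
        return "".join(texts)
    return (" %s " % conjunction).join(texts)
```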

FIGS. 8 and 9 are diagrams for describing a mobile terminal for generating schedule information according to an embodiment of the present invention.

The mobile terminal 100 according to an embodiment of the present invention may generate schedule information. For example, referring to FIG. 8, a schedule generation mode for generating schedule information according to a user input may be executed. When the schedule generation mode is executed, the controller 180 can output a calendar for each day, week, month, and year to the display unit 151. In this case, the controller 180 may select a specific date based on a user input and generate schedule information related to the specific date.

The embodiments of the present invention described above with reference to FIGS. 3 to 7 may be applied in this schedule generation mode. That is, a first virtual keyboard corresponding to at least one of a Korean keyboard, an English keyboard, and a symbol keyboard is output, a word generated based on a touch input to the first virtual keyboard is output, and a second virtual keyboard corresponding to the generated word may be output. In this case, the user may generate schedule information using the first and second virtual keyboards.

In addition, when the schedule generation mode is executed, the controller 180 may display not only the second virtual keyboard corresponding to the generated word but also at least one of the special virtual keyboards: the time combination keyboard 282, the person combination keyboard 284, and the place combination keyboard 286. The user may switch the second virtual keyboard to any one of the time combination keyboard 282, the person combination keyboard 284, and the place combination keyboard 286 using a virtual keyboard conversion key.

Although FIG. 8 shows only the second virtual keyboard being output, a first virtual keyboard corresponding to at least one of a Korean keyboard, an English keyboard, and a symbol keyboard may be output simultaneously with the second virtual keyboard.

The controller 180 may generate schedule information based on touch inputs to the first and second virtual keyboards. At least one contact may be included in the schedule information based on a touch input to the person combination keyboard. For example, referring to FIG. 7, schedule information of "3 pm Apgujeong family (older sister)" may be generated, and the schedule information may include the contacts "older sister" and "brother".

At this time, the controller 180 may transmit the schedule information "3 pm Apgujeong family (older sister)" to "older sister" and "brother" as a text message or an e-mail. That is, the controller 180 may transmit the generated schedule information, in the form of a message, to at least one contact included in the schedule information.
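Delivering the generated schedule to each contact embedded in it can be sketched as composing one outgoing message per contact. The record layout and the name `schedule_messages` below are hypothetical; the actual delivery would go through the terminal's messaging or e-mail facilities.

```python
def schedule_messages(schedule):
    """Compose one outgoing message body (text message or e-mail) per
    contact included in the schedule information."""
    return {name: schedule["text"] for name in schedule.get("contacts", [])}
```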

Referring to FIG. 9, the category combination keyboard 290 may be output to the display unit 151 as a special virtual keyboard. In the schedule generation mode, categories related to the schedule may be configured as keys of the category combination keyboard 290. In this case, a key constituting the category combination keyboard 290 may be edited based on a user input, or a key corresponding to a new category may be added.

The arrangement of keys of the category combination keyboard 290 may be changed according to the number of times the user uses each key. In addition, the arrangement may be changed according to the user's preference.

According to an embodiment of the present disclosure, when the schedule generation mode is executed, a special virtual keyboard related to at least one of time, place, person, and category is output, so the user can easily generate schedule information using the special virtual keyboard. The text required to generate the schedule information can be input directly through the special virtual keyboard, and the number of touch inputs to the virtual keyboard can be reduced.

While preferred embodiments and applications of the present invention have been shown and described above, the present invention is not limited to the specific embodiments and applications described. Various modifications can be made by those skilled in the art without departing from the gist of the invention claimed in the claims, and such modifications should not be understood separately from the technical spirit or scope of the present invention.

Claims (10)

A display unit configured to output a first virtual keyboard;
A user input unit configured to detect a touch input to the display unit;
A memory configured to store essential words and a second virtual keyboard that is output to the display unit when at least one of the essential words is input; And
Generating a word based on a touch input to the first virtual keyboard, outputting the generated word to the display unit,
And a controller for outputting the second virtual keyboard when the generated word matches any one of the essential words.
The control unit changes the arrangement order of the keys according to the number of use of each of the keys included in the second virtual keyboard.
The method of claim 1,
The controller may simultaneously output the generated word and the first and second virtual keyboards to the display unit.
And the first and second virtual keyboards are spaced apart from each other to form a space for outputting information other than the first and second virtual keyboards.
The method of claim 2,
The controller may select at least one of the keys included in the second virtual keyboard based on a touch input to the second virtual keyboard, and display the letter corresponding to the selected at least one key together with the generated word. A mobile terminal characterized in that output to the unit.
The method of claim 3, wherein
The control unit does not output the second virtual keyboard when the touch input to the second virtual keyboard is released.
delete
The method of claim 1,
The second virtual keyboard may include a time combination keyboard composed of a combination of keys for inputting time information, a place combination keyboard composed of a combination of keys for inputting place information including the generated word, and a contact including the generated word. The mobile terminal, characterized in that any one of the contact combination keyboard composed of a combination of keys for entering information.
The method of claim 6,
The controller generates schedule information based on a touch input to at least one of the first and second virtual keyboards.
The method of claim 7, wherein
The second virtual keyboard is the contact combination keyboard,
And an image stored in at least one contact including the generated word is output as a key of the contact combination keyboard.
delete
delete
KR1020120151189A 2012-12-21 2012-12-21 Mobile terminal and control method thereof KR102020329B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120151189A KR102020329B1 (en) 2012-12-21 2012-12-21 Mobile terminal and control method thereof


Publications (2)

Publication Number Publication Date
KR20140081434A KR20140081434A (en) 2014-07-01
KR102020329B1 true KR102020329B1 (en) 2019-09-10

Family

ID=51732734


Country Status (1)

Country Link
KR (1) KR102020329B1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070074131A1 (en) * 2005-05-18 2007-03-29 Assadollahi Ramin O Device incorporating improved text input mechanism




Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant