KR101498039B1 - a mobile telecommunication device and a method of displaying characters using the same - Google Patents

a mobile telecommunication device and a method of displaying characters using the same

Info

Publication number
KR101498039B1
Authority
KR
South Korea
Prior art keywords
character
mobile communication
displayed
communication terminal
characters
Prior art date
Application number
KR1020080051612A
Other languages
Korean (ko)
Other versions
KR20090125482A (en)
Inventor
서민철
조선휘
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020080051612A priority Critical patent/KR101498039B1/en
Priority claimed from EP09007233.1A external-priority patent/EP2131272A3/en
Publication of KR20090125482A publication Critical patent/KR20090125482A/en
Application granted granted Critical
Publication of KR101498039B1 publication Critical patent/KR101498039B1/en

Classifications

    • G06F 1/1643: Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1684: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • H04M 1/72519: Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M 1/72522: Portable communication terminals with means for supporting locally a plurality of applications to increase the functionality

Abstract

Disclosed are a mobile communication terminal and a character display method using the same. The mobile communication terminal includes an input device for sensing a real-touch of a contact point and a proximity-touch according to the distance from the contact point, and a control unit for controlling the terminal to display the corresponding character in advance through a character input window in response to a signal provided from the input device when a text message is created.

Description

[0001] The present invention relates to a mobile communication terminal and a character display method using the same, and more particularly, to a mobile communication terminal having a proximity-touch sensing function and a character display method using the same.

Generally, a terminal refers to a device having at least one of a voice/video call function, an information input/output function, and a data storage function. Terminals can be divided into mobile and fixed types according to mobility, and mobile terminals can be further divided into handheld and mounted types depending on whether the user can carry them directly.

In recent years, in addition to the above-mentioned functions, terminals have come to perform various functions such as capturing still images or video, playing multimedia files such as music or video files, gaming, and broadcast reception, and are implemented in the form of comprehensive multimedia players.

In order to implement complex functions in multimedia devices, various new attempts have been made in terms of hardware or software. For example, a user interface (UI) environment implemented in various forms is provided to allow a user to easily and conveniently search for or select a function.

Further, efforts for supporting and increasing the functions of the mobile terminal continue. Such efforts include not only changes and improvements in structural components that form the mobile terminal, but also improvements in software or hardware.

Recently, mobile terminals capable of inputting key signals through a touch screen have been actively studied. When such a terminal is used, it is inconvenient to press the corresponding key repeatedly in order to select one of the plurality of characters assigned to a single key, just as in the character input method of a general keypad.

It is an object of the present invention to provide a mobile communication terminal using a proximity touch method rather than a direct touch method.

It is another object of the present invention to provide a mobile communication terminal that displays characters differently according to the recognized distance of the pointer from the contact point.

It is another object of the present invention to provide a mobile communication terminal capable of recognizing a proximity touch and providing convenience in inputting characters.

In order to achieve the above objects, a mobile communication terminal according to the present invention is characterized in that a character is displayed in advance through an input window before the corresponding key is actually pressed.

A detailed feature of the mobile communication terminal according to the present invention is to control characters displayed according to the distance from the contact point.

Another detailed feature of the mobile communication terminal according to the present invention is that characters displayed in advance are displayed in a form different from characters already input.

Another characteristic feature of the mobile communication terminal according to the present invention is that characters displayed in advance are controlled according to the holding time of the proximity touch state.

Another detailed feature of the mobile communication terminal according to the present invention is that characters displayed in advance are controlled according to the proximity distance from the contact point.

Another characteristic feature of the mobile communication terminal according to the present invention is that the strength and the holding time of the vibration are controlled according to the output character.

Another characteristic feature of the mobile communication terminal according to the present invention is that the character is input either when an actual touch of the contact point is made or when the proximity-touch state is released.

Another characteristic feature of the mobile communication terminal according to the present invention is that, when the pointer is moved to another button without releasing the proximity-touch state, the character corresponding to the new position is displayed in advance through the character input window.

The above-described mobile communication terminal according to the present invention can have the following effects.

First, you can preview the character you want to input without directly touching it, so you can check in advance whether the desired character is being input.

Second, it is possible to show the effect that the button is actually pressed by using the proximity touch.

Third, the sensory effect can be enhanced by giving different vibration feedback depending on the character.

The above objects, features and advantages will become more apparent from the following detailed description in conjunction with the accompanying drawings. Hereinafter, a mobile communication terminal according to the present invention will be described in detail with reference to the drawings. Detailed descriptions of known technologies that could obscure the gist of the present invention are omitted.

The terminal described in this specification may be a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, a digital camera, or an MP3 player (MPEG Layer 3 Player, MP3P). Of course, the terminals to which the present invention is applicable are not limited to the above types and include any terminal having a memory capable of storing character information together with functions for displaying characters and detecting a touch.

Referring to FIG. 1, a terminal related to the present invention will be described in terms of functional components. FIG. 1 is a block diagram of a terminal according to an embodiment of the present invention.

The terminal 100 shown in FIG. 1 includes a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, a power supply unit 190, and the like. FIG. 1 shows a terminal having various components; however, not all of the illustrated components are required, and the terminal may be implemented with more or fewer components than those shown.

Hereinafter, the components constituting the terminal 100 shown in FIG. 1 will be described in order.

The wireless communication unit 110 may include at least one component that performs wireless communication between the terminal 100 and the wireless communication system or wireless communication between the terminal 100 and the network in which the terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

Here, the broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

Broadcast-related information can exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, it may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), DVB-H (Digital Video Broadcast-Handheld), ISDB-T (Integrated Services Digital Broadcast-Terrestrial), and the like. Of course, the broadcast receiving module 111 may be configured to be suitable for any broadcasting system that provides broadcast signals, not only the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

In addition, the mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data depending on a voice call signal, a video call signal, or a character / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the terminal 100.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, etc. may be used as the short distance communication technology.

In addition, the position information module 115 is a module for confirming or obtaining the position of the terminal 100. One example is a Global Positioning System (GPS) module. The GPS module receives position information from a plurality of satellites, and the position information may include coordinate information expressed in latitude and longitude. For example, the GPS module can calculate the current position by triangulation using precisely measured times and distances from three or more satellites, that is, three different distances. A method of obtaining distance and time information from three satellites and correcting the resulting error with one additional satellite may be used. In particular, the GPS module can acquire not only latitude, longitude, and altitude but also three-dimensional velocity information and accurate time from the position information received from the satellites.

An audio/video (A/V) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video communication mode or the photographing mode. The processed image frames can then be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. The camera 121 may include two or more cameras according to the configuration of the terminal 100.

The microphone 122 receives an external sound signal through a microphone in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it as electrical voice data. The processed voice data can be converted into a form that can be transmitted to the mobile communication base station through the mobile communication module 112 when the voice data is in the call mode, and output. The microphone 122 may implement various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 130 receives input operations from the user and generates input data for controlling the operation of the terminal 100. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like. In particular, when the touch pad forms a mutual layer structure with the display unit 151 described later, it can be called a touch screen.

The sensing unit 140 senses the current state of the terminal 100, such as the open/close state of the terminal 100, the position of the terminal 100, the presence or absence of user contact, and the orientation of the terminal 100, and generates a sensing signal for controlling the operation of the terminal 100. For example, when the terminal 100 is in the form of a slide phone, it is possible to sense whether the slide phone is opened or closed. The sensing unit 140 can also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device.

The interface unit 170 serves as an interface to all external devices connected to the terminal 100. Examples include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.

Here, the identification module is a chip that stores various information for authenticating the usage right of the terminal 100, and includes a User Identification Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an 'identification device') can be manufactured in a smart card format, so the identification device can be connected to the terminal 100 through the corresponding port. The interface unit 170 receives data or power from an external device and transfers it to each component in the terminal 100, or transmits data from the terminal 100 to an external device.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.

The display unit 151 displays and outputs information to be processed by the terminal 100. For example, when the terminal is in the call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the terminal 100 is in the video communication mode or the photographing mode, the captured or / and received image or UI and GUI are displayed.

Meanwhile, as described above, when the display unit 151 and the touch pad form a mutual layer structure to constitute a touch screen, the display unit 151 can be used as an input device in addition to an output device. The display unit 151 may be a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, or a three-dimensional (3D) display. In addition, depending on the implementation of the terminal 100, there may be two or more display units 151. For example, the terminal 100 may be provided with an external display unit (not shown) and an internal display unit (not shown) at the same time.

The audio output module 152 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode. In addition, the audio output module 152 outputs acoustic signals related to functions performed in the terminal 100 (e.g., call signal reception sound, message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the terminal 100. Examples of events that occur in a terminal include receiving a call signal, receiving a message, and inputting a key signal. The alarm unit 153 may output a signal for informing occurrence of an event in a form other than an audio signal or a video signal. For example, it is possible to output a signal in a vibration mode. When a call signal is received or a message is received, the alarm unit 153 can output a vibration to notify it. Or when the key signal is input, the alarm unit 153 can output the vibration by the feedback to the key signal input. The user can recognize the occurrence of an event through the vibration output as described above. Of course, a signal for notifying the occurrence of an event may also be output through the display unit 151 or the sound output module 152.

The memory 160 may store programs for the processing and control of the control unit 180 and may temporarily store input/output data (e.g., a phone book, messages, still images, and the like).

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), and an optical disc. In addition, the terminal 100 may use web storage that performs the storage function of the memory 160 over the Internet.

The control unit 180 typically controls the overall operation of the terminal, for example, performing the control and processing associated with voice calls, data communication, video calls, and the like. In addition, the control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the control unit 180 or separately from it.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a computer-readable recording medium, for example, using software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), microprocessors, and other electrical units for performing functions. In some cases, such embodiments may be implemented by the control unit 180.

According to a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that each perform at least one function or operation. The software code may be implemented by a software application written in a suitable programming language, stored in the memory 160, and executed by the control unit 180.

Hereinafter, the organic operational relationship among the components of the terminal for performing the character control operation according to the present invention will be described in detail with reference to FIG. 1. First, the display unit 151 of the present invention includes a touch screen, and the sensing unit 140 is configured to sense a real-touch and a proximity-touch on the touch screen. The memory 160 stores personal information data, that is, data including information such as a name, a telephone number, a job title, an e-mail address, homepage information, and a group name, together with the character information corresponding to the personal information data.

When an event requiring the personal information data occurs, the control unit 180 reads the personal information data corresponding to the event condition and the corresponding character data from the memory 160 and displays them through the display unit 151. In addition, the form of a character displayed through the display unit 151 is controlled according to a signal provided from the sensing unit 140, which recognizes the approach and touch of a pointer on the touch screen.

In the foregoing, the terminal related to the present invention has been examined in terms of its functional components. Hereinafter, the terminal related to the present invention will be further described with reference to FIGS. 2 and 3 in terms of its external components. For simplicity of explanation, a slider-type terminal will be taken as an example among the various types of terminals such as folder, bar, swing, and slider types. However, the present invention is not limited to slider-type terminals and can be applied to all types of terminals, including the types mentioned above.

FIG. 2 is a perspective view of a terminal according to an exemplary embodiment of the present invention. The terminal 100 according to the present invention includes a first body 200 and a second body 205 configured to slide along at least one direction on the first body 200.

Meanwhile, when the terminal 100 according to the present invention is a folder type, it includes a first body and a second body that is configured to fold or unfold at least one side of the first body.

A state in which the first body 200 is disposed to overlap with the second body 205 may be referred to as a closed configuration, and a state in which the first body 200 exposes at least a portion of the second body 205 may be referred to as an open configuration.

Although the terminal 100 operates mainly in the standby mode in the closed state, the standby mode is also canceled by the user's operation. The terminal 100 operates mainly in the communication mode in the open state, but is also switched to the standby mode after the user's operation or a predetermined time elapses.

The casing (casing, housing, cover, etc.) constituting the outer appearance of the first body 200 is formed by the first front case 220 and the first rear case 225. Various electronic components are embedded in the space formed by the first front case 220 and the first rear case 225. At least one intermediate case may be additionally disposed between the first front case 220 and the first rear case 225.

The cases may be formed by injection molding of a synthetic resin or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output module 152, the camera 121 or the first user input unit 210 may be disposed in the first body 200, specifically, the first front case 220.

The display unit 151 includes a liquid crystal display (LCD), an OLED (Organic Light Emitting Diodes), and the like that visually express information.

In addition, the touch pad is superimposed on the display unit 151 in a layer structure, so that the display unit 151 may operate as a touch screen to enable information input by a user's touch.

The sound output module 152 may be implemented in the form of a speaker.

The camera 121 may be implemented so as to be suitable for photographing an image or a moving image for a user or the like.

As in the case of the first body 200, the case constituting the outer appearance of the second body 205 is formed by the second front case 230 and the second rear case 235.

The second user input unit 215 may be disposed on the front surface of the second body 205, specifically, the second front case 230.

The third user input unit 245, the microphone 122, and the interface unit 170 may be disposed in at least one of the second front case 230 and the second rear case 235.

The first to third user input units 210, 215, and 245 may be collectively referred to as the user input unit 130, and any method may be employed as long as it allows the user to operate it in a tactile manner.

For example, the user input unit 130 may be implemented as a dome switch or a touch pad capable of receiving a command or information by a push or touch operation of the user, or as a wheel, a jog type, a joystick, or the like.

In a functional aspect, the first user input unit 210 is for inputting commands such as start, end, scrolling, etc., and the second user input unit 215 is for inputting numbers, letters, symbols and the like. Also, the third user input 245 may operate as a hot-key for activating a special function in the terminal.

The microphone 122 may be implemented in a form suitable for receiving voice, sound, etc. of the user.

The interface unit 170 is a channel through which the terminal related to the present invention can exchange data with an external device. For example, the interface unit 170 may be a wired or wireless connection terminal for connecting an earphone, a port for short-range communication (for example, an IrDA port, a Bluetooth port, or a wireless LAN port), or a power supply terminal for supplying power to the terminal.

The interface unit 170 may be a card socket that accommodates an external card such as a SIM (subscriber identification module) or a UIM (user identity module), a memory card for information storage, and the like.

A power supply unit 190 for supplying power to the terminal 100 is mounted on the second rear case 235 side.

The power supply unit 190 may be, for example, a rechargeable battery and may be detachably coupled for charging or the like.

FIG. 3 is a rear perspective view of the terminal shown in FIG. 2.

Referring to FIG. 3, a camera 121 may be additionally mounted on the rear surface of the second rear case 235 of the second body 205. The camera 121 mounted on the second body 205 has a photographing direction substantially opposite to that of the camera 121 of the first body 200, and may have a pixel resolution different from that of the camera 121 of the first body 200.

For example, the camera 121 of the first body 200 preferably has a low resolution sufficient to capture the user's face and transmit the captured image to the other party during a video call, while the camera 121 of the second body 205 preferably has a high resolution, since it photographs general subjects whose images are not transmitted immediately.

A flash 250 and a mirror 255 may be additionally disposed adjacent to the camera 121 of the second body 205. The flash 250 illuminates a subject when it is photographed by the camera 121 of the second body 205. The mirror 255 allows the user to view his or her own face when taking a self-portrait using the camera 121 of the second body 205.

The second rear case 235 may further include an acoustic output module 152. The sound output module 152 of the second body 205 may implement the stereo function together with the sound output module 152 of the first body 200 and may be used for the talk in the speakerphone mode.

In addition, an antenna 260 for receiving broadcast signals may be disposed on one side of the second rear case 235 in addition to the antenna for communication. The antenna 260 may be installed to be detachable from the second body 205.

A part of the slide module 265 that slidably connects the first body 200 and the second body 205 is disposed on the first rear case 225 side of the first body 200.

The other part of the slide module 265 may be disposed on the second front case 230 side of the second body 205 and may not be exposed to the outside as in this figure.

In the above description, the camera 121 or the like is disposed on the second body 205, but the present invention is not limited thereto.

For example, at least one of the components described as being disposed on the second rear case 235, such as the camera 121 of the second body 205, may instead be disposed on the first body 200, specifically on the first rear case 225. In that case, the component(s) disposed on the first rear case 225 are protected by the second body 205 in the closed configuration. Furthermore, even if the camera 121 of the second body 205 is not separately provided, the camera 121 of the first body 200 may be formed to be rotatable so that it can also photograph in the photographing direction of the camera 121 of the second body 205.

The terminal 100 shown in FIGS. 1 to 3 can be configured to operate in a communication system capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

Hereinafter, a wireless communication system in which a terminal according to an embodiment of the present invention can operate will be described with reference to FIG. 4.

The communication system may utilize different air interfaces and/or physical layers. For example, wireless interfaces that can be used by the communication system include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications Systems (UMTS) (in particular Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like. However, the present invention is applicable to all communication systems, including CDMA wireless communication systems.

Referring to FIG. 4, the CDMA wireless communication system includes a plurality of terminals 100, a plurality of base stations (BSs) 270, base station controllers (BSCs) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to be coupled to a Public Switched Telephone Network (PSTN) 290 and also to the BSCs 275. The BSCs 275 may be coupled in pairs with the BSs 270 through backhaul lines. The backhaul lines may be provided according to at least one of E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Thus, a plurality of BSCs 275 may be included in the system shown in FIG. 4.

Each BS 270 may include at least one sector, and each sector may include an omnidirectional antenna or an antenna pointing in a particular radial direction from the BS 270. Alternatively, each sector may include two or more antennas of various types. Each BS 270 may be configured to support a plurality of frequency assignments, each of which has a specific spectrum.

The BS 270 may also be referred to as a Base Station Transceiver Subsystem (BTS). In this case, the term "base station" may refer to the combination of one BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site," or each of the sectors of a particular BS 270 may be referred to as a cell site.

As shown in FIG. 4, a broadcasting transmitter (BT) 295 transmits a broadcast signal to the terminals 100 operating within the system. The broadcast receiving module 111 shown in FIG. 1 is provided in the terminal 100 to receive the broadcast signal transmitted by the BT 295. FIG. 4 also illustrates several Global Positioning System (GPS) satellites 300, which help locate at least one of the plurality of terminals 100. Although two satellites are shown in FIG. 4, useful location information may be obtained with more or fewer satellites. The location information module 115 shown in FIG. 1 cooperates with the satellites 300 to obtain the desired location information. The location may be tracked using any technology capable of tracking location, not only GPS technology. In addition, at least one of the GPS satellites 300 may optionally or additionally handle satellite DMB transmission.

In a typical operation of the wireless communication system, the BS 270 receives reverse link signals from the various terminals 100, which at that time are connecting calls, transmitting or receiving messages, or performing other communication operations. Each reverse link signal received by a particular BS 270 is processed by that BS 270, and the resulting data is transmitted to the connected BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including the coordination of soft handoffs between the BSs 270. The BSC 275 also transmits the received data to the MSC 280, which provides additional transmission services for connection with the PSTN 290. Similarly, the PSTN 290 connects to the MSC 280, the MSC 280 connects to the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the terminals 100.

Hereinafter, a method of inputting characters using a mobile communication terminal according to an embodiment of the present invention will be described. Hereinafter, the character input in the text message creation process is described as an example, but this is for the purpose of explaining the operation of the present invention and does not mean that the application of the present invention is implemented only when a text message is created. That is, it can be applied to all operations requiring input of a keypad, such as input of a telephone number, creation of a telephone book, memo, and schedule management. In the following description, a touch screen is taken as an example. However, if a proximity sensor is provided on a general keypad, the character input method by proximity touch recognition according to the present invention can be sufficiently applied.

A method of controlling the screen display using the relationship between the proximity-touch and the real-touch according to an embodiment of the present invention will now be described. FIG. 5 is an exemplary view showing the principle of the proximity-touch. A proximity-touch refers to a case where the pointer does not actually touch the screen but approaches it within a predetermined distance. The pointer is a tool for actually touching, or proximity-touching, a specific portion of the displayed screen, for example, a stylus pen or a finger. In this specification, the term proximity-touch means that the pointer is recognized while positioned in the space vertically above a predetermined point on the surface of the touch screen.

In this case, the control unit 180 can recognize the proximity-touch as a predetermined signal input. That is, the mobile terminal 100 according to an embodiment of the present invention recognizes a proximity-touch when the pointer approaches within a predetermined distance of the screen, where the predetermined distance means the vertical distance between the pointer and the screen. In FIG. 5, "D0" represents a real-touch, and "D1", "D2", and "D3" represent proximity-touches at respective predetermined vertical distances from the screen.
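
The mapping from vertical distance to the levels D0 to D3 is not specified numerically in this document; a minimal Python sketch of such a classification, with purely illustrative threshold values and names, is given below.

```python
# Minimal sketch: classify the pointer's vertical distance into the touch
# levels D0 (real-touch) and D1-D3 (proximity-touch). The numeric thresholds
# (in millimetres) are illustrative assumptions, not values from the patent.

D0, D1, D2, D3, OUT_OF_RANGE = "D0", "D1", "D2", "D3", "OUT"

# Upper bound of each level, ordered from the screen surface outward.
LEVEL_BOUNDS = [(0.0, D0), (5.0, D1), (10.0, D2), (20.0, D3)]

def classify_distance(distance_mm: float) -> str:
    """Return the touch level for a given pointer-to-screen distance."""
    for bound, level in LEVEL_BOUNDS:
        if distance_mm <= bound:
            return level
    return OUT_OF_RANGE  # beyond D3: no proximity-touch is recognized

if __name__ == "__main__":
    for d in (0.0, 3.0, 8.0, 15.0, 30.0):
        print(d, "->", classify_distance(d))
```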

In addition, real-touch refers to a case where a pointer is actually touched on the screen. In this case, the controller 180 can recognize the real-touch as a predetermined signal input. This may be implemented in a mobile terminal 100 having a touchscreen.

That is, the mobile terminal 100 according to an embodiment of the present invention can detect the proximity-touch or the real-touch through the sensing unit 140. The sensing unit 140 may include various sensors to perform various sensing functions. For example, a proximity sensor or a tactile sensor may be provided to detect the proximity-touch or the real-touch.

The proximity sensor refers to a proximity switch that detects, without mechanical contact, the presence of an object approaching its sensing surface or an object in its vicinity, using the force of an electromagnetic field or infrared rays. A proximity switch outputs an ON/OFF signal when the sensing object comes within the sensing distance determined by the sensor, rather than through mechanical contact, so its lifetime is considerably longer than that of a contact-type switch and its utility is very high. Its operating principle is that an oscillation circuit oscillates a high-frequency sine wave, and when the sensing object approaches the sensing surface, the oscillation amplitude of the circuit is attenuated or stopped; this change is converted into an electrical signal to detect the object. Therefore, even if a non-metallic substance comes between the high-frequency oscillation proximity switch and the sensing object, the proximity switch can detect the sensing object without interference from that substance.

Further, the tactile sensor is a sensor that detects contact of a specific object with a degree or more that a person feels. The tactile sensor can sense various information such as the roughness of the contact surface, the rigidity of the contact object, and the temperature of the contact point.

Meanwhile, the sensing unit 140 may sense the proximity distance or the proximity speed. The proximity distance means the distance between the surface of the touch screen and the pointer, in particular the shortest distance between them. The proximity speed means the speed at which the pointer approaches the screen or the speed at which it moves away from the screen.
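
A proximity speed of this kind could be estimated from successive distance samples. The following sketch assumes a simple (distance, timestamp) sample format that is not part of this document.

```python
# Minimal sketch: estimate proximity speed (positive when the pointer
# approaches the screen, negative when it moves away) from two successive
# distance samples. The (distance_mm, timestamp_s) format is an assumption.

def proximity_speed(prev_sample, curr_sample) -> float:
    """Approach speed in mm/s; positive means the pointer is approaching."""
    prev_d, prev_t = prev_sample
    curr_d, curr_t = curr_sample
    dt = curr_t - prev_t
    if dt <= 0:
        return 0.0
    return (prev_d - curr_d) / dt  # decreasing distance -> positive speed

if __name__ == "__main__":
    print(proximity_speed((18.0, 0.00), (6.0, 0.10)))   # fast approach
    print(proximity_speed((6.0, 0.10), (9.0, 0.25)))    # moving away
```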

The mobile terminal 100 according to an embodiment of the present invention can control the display of the screen based on the relationship between the actual touch and the proximity touch. For example, character display and input can be controlled based on the relationship between the actual touch and the proximity touch.

FIG. 6 is a diagram illustrating an example of character input according to an embodiment of the present invention. FIG. 6A shows character input in the "English input" state, and FIG. 6B shows an example in which characters are input in the "Hangul (Korean) input" state. As described above, the present invention can be applied when a proximity sensor is provided on a general keypad, and also when a keypad layout is displayed on a touch screen.

FIGS. 6A and 6B each show a proximity-touch state. In FIG. 6A, "Hello I'" has already been input and "m" is about to be input. When the user brings the pointer (a finger or the like) close to the button (key) to which the characters "MNO" are assigned, close enough to be recognized as a proximity-touch, the corresponding character is displayed in advance in the character input window. FIG. 6B is an embodiment of Korean input: if, while typing a word, the user moves the pointer close to the button (key) to which "ㄴ" is assigned, "ㄴ" is displayed in the character input window. The user can therefore preview the character to be input next, reducing the time wasted on mistyping.

There are two ways to select a character: the character displayed in the character input window as a result of the proximity-touch may be selected by an actual touch, or it may be selected by releasing the proximity-touch while the desired character is displayed in advance.
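
The preview-and-commit behaviour just described could be organized as a small event handler along the lines of the sketch below; the event names, keypad mapping, and class layout are assumptions made for illustration only.

```python
# Minimal sketch: preview a character while a key is proximity-touched and
# commit it either on a real touch or when the proximity-touch is released.
# Event names ("hover", "touch", "release") are hypothetical.

KEYMAP = {"MNO": "mno", "JKL": "jkl"}  # first assigned character is previewed

class CharInput:
    def __init__(self):
        self.text = ""        # characters already input
        self.preview = None   # character shown in advance in the input window

    def on_event(self, kind: str, key: str = None) -> str:
        if kind == "hover" and key in KEYMAP:        # proximity-touch detected
            self.preview = KEYMAP[key][0]
        elif kind in ("touch", "release") and self.preview:
            self.text += self.preview                # either action commits
            self.preview = None
        return self.text + (self.preview or "")

inp = CharInput()
print(inp.on_event("hover", "MNO"))    # "m" previewed
print(inp.on_event("release"))         # "m" committed
```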

FIG. 7 is a diagram illustrating an example of a character input method according to an embodiment of the present invention. Usually, multiple characters are assigned to one key, so with a conventional keypad the user must press the corresponding button repeatedly within a predetermined period in order to cycle through and input the assigned characters. In this embodiment, when the user moves a finger over the key of the desired character using the proximity-touch, the first assigned character is displayed; moving the pointer up and down without touching changes the displayed character to the next assigned one. Touching the key inputs the character, and moving the pointer beyond D3 returns the display to its original state. This operation is performed within the proximity-touch sensing range (D1, D2, D3): if the pointer moves down toward the screen and back up again within a certain time, the movement is recognized as if the button had been pressed, so an operation similar to pressing a button can be performed without an actual touch.
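
A hedged sketch of a detector for this press-like dip gesture (a first distance, a second distance, then the first distance again within a time limit, as also recited in claim 1) follows; the 0.5 s window and the level names are assumed values.

```python
# Minimal sketch: detect the sequence D1 -> D2 -> D1 (pointer dips toward the
# screen and comes back up) within a time window and treat it as a virtual
# button press that cycles the previewed character. The 0.5 s window is an
# illustrative assumption.

PRESS_WINDOW_S = 0.5

class PressGestureDetector:
    def __init__(self):
        self.history = []  # list of (level, timestamp)

    def feed(self, level: str, t: float) -> bool:
        """Return True when a D1 -> D2 -> D1 dip completes within the window."""
        if not self.history or self.history[-1][0] != level:
            self.history.append((level, t))
        self.history = [(lv, ts) for lv, ts in self.history if t - ts <= PRESS_WINDOW_S]
        if [lv for lv, _ in self.history[-3:]] == ["D1", "D2", "D1"]:
            self.history.clear()
            return True
        return False

detector = PressGestureDetector()
chars, idx = "MNOmno", 0
for level, t in [("D1", 0.0), ("D2", 0.1), ("D1", 0.2), ("D1", 0.3), ("D2", 0.4), ("D1", 0.5)]:
    if detector.feed(level, t):
        idx = (idx + 1) % len(chars)   # switch to the next assigned character
        print("previewed character:", chars[idx])
```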

FIG. 8 is a diagram illustrating an example of a character input method according to another embodiment of the present invention. When the user brings a finger near the key of the desired character using the proximity-touch, the first assigned character is displayed, and the assigned characters are then displayed in order according to how long the proximity state is maintained without touching. For example, if the pointer is placed over the button used to enter the characters "MNO", the letter "M" is displayed first; after a certain period of time, the next assigned character "N" is displayed; and in this manner the display cycles at a predetermined interval through "M → N → O → m → n → o → M → N ...".
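
The hold-time cycling described above might be implemented as in the following sketch; the cycling interval is an assumed value, not one taken from this document.

```python
# Minimal sketch: pick which assigned character to preview based on how long
# the proximity-touch has been held over the key. The 0.8 s step is an
# illustrative assumption.

CYCLE_STEP_S = 0.8

def previewed_char(assigned: str, hold_time_s: float) -> str:
    """Cycle through the characters assigned to a key as the hover persists."""
    index = int(hold_time_s // CYCLE_STEP_S) % len(assigned)
    return assigned[index]

if __name__ == "__main__":
    for t in (0.0, 1.0, 2.0, 5.0):
        print(t, "->", previewed_char("MNOmno", t))
```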

FIG. 9 is a diagram illustrating an example of a character input method according to another embodiment of the present invention. This embodiment shows the case where the user moves to another button without releasing the proximity-touch state, and addresses what must be considered to prevent an undesired character from being input when the proximity-touch is eventually released. When the user moves to another button without releasing the proximity-touch state, the character corresponding to the new position is displayed in advance through the character input window.

At this time, care must be taken to stay within the proximity-touch area. As shown in the figure, suppose "j" is already displayed in advance in the input window because the user proximity-touched the "JKL" button, but the user actually wants to input "m"; the user then moves the pointer to the "MNO" button. The previously displayed "j" disappears from the input window and the new letter "M" is displayed in advance. To select the desired "m", the user either maintains the proximity-touch for a predetermined time or moves the pointer up and down within the proximity-touch range (e.g., D1 → D2 → D1 ...) so as to obtain the same effect as pressing the button in the proximity-touch state.
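
One possible realization of this behaviour, updating the preview whenever the hover moves to another key and committing only after a hold timeout, is sketched below; all names and the timeout value are assumptions.

```python
# Minimal sketch: while the proximity-touch is maintained, moving the pointer
# to another key replaces the previewed character; the character is committed
# only after a hold timeout (the press-like dip gesture is not shown here).
# The 1.0 s hold timeout is an illustrative assumption.

HOLD_TO_COMMIT_S = 1.0
KEYMAP = {"JKL": "jkl", "MNO": "mno"}

def track_hover(events):
    """events: list of (timestamp_s, key). Returns (committed_text, preview)."""
    text, preview, hover_key, hover_since = "", "", None, 0.0
    for t, key in events:
        if key != hover_key:                       # pointer moved to another key
            hover_key, hover_since = key, t
            preview = KEYMAP[key][0]               # previous preview disappears
        elif t - hover_since >= HOLD_TO_COMMIT_S:  # held long enough: commit
            text, preview = text + preview, ""
            hover_since = t
    return text, preview

print(track_hover([(0.0, "JKL"), (0.3, "MNO"), (1.4, "MNO")]))  # ('m', '')
```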

FIG. 10 is a diagram illustrating a character input method according to another embodiment of the present invention.

In this embodiment, all the characters assigned to a given key are displayed in advance around the key display area. For example, when a proximity-touch is made on the key to which "ABC" is assigned, all of its characters, such as "A, B, C, a, b, c ...", are displayed around the key as shown. To make it easy to move the pointer, it may be desirable to display them in a 3x3 arrangement around the proximity-touched key. If the user wants to switch from the current mode to Korean, for example, switching between Korean and English can be facilitated by a function key separately allocated on the side of the terminal body. In this way, the proximity-touch waiting time otherwise required to cycle from "A" to "c" can be reduced.
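
Arranging the characters assigned to the hovered key in a 3x3 grid around it could be done roughly as in the sketch below; the cell offsets are purely illustrative and not taken from this document.

```python
# Minimal sketch: lay out up to eight characters assigned to a key in a 3x3
# grid of cells around the proximity-touched key (which stays in the centre).
# Cell coordinates are relative offsets (col, row) and are illustrative only.

NEIGHBOUR_OFFSETS = [(-1, -1), (0, -1), (1, -1),
                     (-1,  0),          (1,  0),
                     (-1,  1), (0,  1), (1,  1)]

def popup_layout(assigned: str):
    """Map each assigned character to a cell offset around the hovered key."""
    return {offset: ch for offset, ch in zip(NEIGHBOUR_OFFSETS, assigned)}

if __name__ == "__main__":
    for offset, ch in popup_layout("ABCabc").items():
        print(offset, ch)
```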

As another display method for the case where more characters are assigned to a key than are displayed on the keypad, the method shown in FIG. 11 is possible.

First, the default characters shown on the keypad are displayed, with arrows displayed around them to indicate that more characters are allocated. When a proximity-touch or an actual touch is made on an arrow, the corresponding characters are displayed. For example, suppose "A, B, C, a, b, c ..." are assigned to the "ABC" key. According to the present embodiment, once a proximity-touch is made on the "ABC" key, "a, A, α" is displayed, with arrows on the left and right indicating that more characters are assigned. If the user wants "c", a proximity-touch or an actual touch is performed on the right arrow; after paging through "b, B, β" to "c, C, γ", the desired character "c" is selected. Of course, in accordance with the proximity-touch on "c", the character "c" is displayed in advance on the screen in a form different from the other characters.
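
The arrow-based paging could look like the following sketch; the page size of three and the pager interface are assumptions made for illustration.

```python
# Minimal sketch: page through the characters assigned to a key, three at a
# time, using left/right arrows, as in the "a, A, α" / "b, B, β" / "c, C, γ"
# example above. The page size of three is an assumption.

PAGES = [("a", "A", "α"), ("b", "B", "β"), ("c", "C", "γ")]

class ArrowPager:
    def __init__(self, pages):
        self.pages, self.index = pages, 0

    def arrow(self, direction: str):
        """direction is 'left' or 'right'; returns the newly displayed page."""
        step = 1 if direction == "right" else -1
        self.index = (self.index + step) % len(self.pages)
        return self.pages[self.index]

pager = ArrowPager(PAGES)
print(pager.arrow("right"))  # ('b', 'B', 'β')
print(pager.arrow("right"))  # ('c', 'C', 'γ'): "c" can now be proximity-touched
```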

FIG. 12 is a diagram illustrating a character input method according to another embodiment of the present invention. As shown in the figure, the magnitude of the vibration is determined by the speed at which the pointer approaches the actual touch point through the proximity-touch distances, such as "D3 → D2 → D1 → D0". When a character is input, strong and weak touches are distinguished, and by providing different vibration effects for each, the user can recognize whether he or she touched strongly or weakly.

If the pointer moves quickly from D3 to the actual touch point, the touch position is pressed harder at the moment of contact than if it moves slowly. Accordingly, when the user moves quickly, a strong, coarse vibration is given, and when the user moves slowly, a weak, soft vibration is given, so that the user can recognize whether the button was pressed hard or lightly.
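
Mapping the approach speed to the strength and duration of the vibration might look like the sketch below; the threshold and feedback values are assumed, and no real motor API is used.

```python
# Minimal sketch: choose vibration strength and duration from the speed at
# which the pointer approached the actual touch point. The threshold (mm/s),
# amplitude, and duration values are illustrative assumptions.

FAST_APPROACH_MM_S = 120.0

def vibration_for(approach_speed_mm_s: float):
    """Return (amplitude 0..1, duration in ms) for the haptic feedback."""
    if approach_speed_mm_s >= FAST_APPROACH_MM_S:
        return 1.0, 80    # fast approach: strong, coarse vibration
    return 0.3, 40        # slow approach: weak, soft vibration

if __name__ == "__main__":
    print(vibration_for(200.0))  # pressed hard
    print(vibration_for(50.0))   # pressed lightly
```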

Meanwhile, the keypad screen of the touch screen can provide a visual effect of the button being pressed in accordance with the proximity-touch. In general, when entering characters on a physical keypad, the user can visually see the resilient button being depressed as it is pressed. By reproducing such an effect using the proximity-touch, a visual impression that the button is being pressed can be given to the user. In other words, since each step from D3 down to the actual touch point can be recognized, an image effect can be displayed in which the button appears slightly pressed at D3, pressed further at D2 and D1, and fully pressed at the moment of the actual touch.
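
The graded visual press effect could be driven directly by the recognized proximity level, as in this last sketch; the depth fractions are assumed values.

```python
# Minimal sketch: map each recognized proximity level to a visual "press
# depth" used when drawing the key, so the button appears to sink as the
# pointer approaches. The fractions are illustrative assumptions.

PRESS_DEPTH = {"OUT": 0.0, "D3": 0.2, "D2": 0.5, "D1": 0.8, "D0": 1.0}

def key_press_depth(level: str) -> float:
    """0.0 = not pressed at all, 1.0 = fully pressed (actual touch)."""
    return PRESS_DEPTH.get(level, 0.0)

if __name__ == "__main__":
    for level in ("D3", "D2", "D1", "D0"):
        print(level, "->", key_press_depth(level))
```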

While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The present invention is not limited to the drawings. In addition, all or some of the embodiments may be selectively combined so that various modifications may be made to the embodiments described above.

FIG. 1 is a block diagram of a terminal according to an embodiment of the present invention.

FIG. 2 is a front perspective view of a terminal according to an embodiment of the present invention.

FIG. 3 is a rear view of a terminal according to an embodiment of the present invention.

FIG. 4 is a schematic configuration diagram of a wireless communication system in which a terminal according to an embodiment of the present invention can operate.

FIG. 5 is an exemplary diagram for explaining the operation of the proximity touch.

FIG. 6 is an exemplary view showing an embodiment of a character preview display function by proximity touch.

FIG. 7 is a diagram illustrating an example of a character input method according to an embodiment of the present invention.

FIG. 8 is a diagram illustrating an example of a character input method according to another embodiment of the present invention.

FIG. 9 is a diagram illustrating an example of a character input method according to another embodiment of the present invention.

FIG. 10 is a diagram illustrating an embodiment of a character input method according to another embodiment of the present invention.

FIG. 11 is a diagram illustrating an example of a character input method according to another embodiment of the present invention.

FIG. 12 is a diagram illustrating an embodiment of a character input method according to another embodiment of the present invention.

Claims (22)

  1. A mobile communication terminal comprising:
    an input device for detecting a real-touch of a contact point and a proximity-touch according to the distance from the contact point; and
    a control unit for controlling the display mode of a character displayed through a character input window according to a signal provided from the input device when a text message is created,
    wherein the control unit displays in advance in the input window one character among a plurality of characters assigned to a predetermined key when a proximity-touch on the predetermined key is detected through the input device, and
    wherein, when the detected distance of the proximity-touch changes sequentially to a first distance, a second distance, and the first distance again, the control unit switches to another character among the plurality of characters assigned to the predetermined key and displays it in advance in the input window.
  2. The mobile communication terminal of claim 1, wherein the input device is a touch screen.
  3. The mobile communication terminal of claim 2, wherein when the proximity touch is detected by the input device, the controller displays a predetermined number of characters among the characters assigned to adjacent keys in advance in the vicinity of the key display area.
  4. The mobile communication terminal of claim 3, wherein arrows are displayed when more characters are allocated than the number of characters displayed, and the corresponding characters are displayed when a proximity-touch or an actual touch is made.
  5. The mobile communication terminal according to claim 3, wherein all the characters assigned to the adjacent keys are displayed in advance in the vicinity of the key display area.
  6. The mobile communication terminal of claim 2, wherein the control unit controls the key button to be displayed in a form in which the key button is pressed according to a proximity distance from the contact point.
  7. delete
  8. The mobile communication terminal according to claim 1, wherein the character displayed in advance is displayed in a different form from the other characters inputted.
  9. The mobile communication terminal of claim 8, wherein characters displayed in advance are displayed in a color different from other characters input in the mobile communication terminal.
  10. The mobile communication terminal of claim 8, wherein the character displayed in advance is blinking.
  11. The mobile communication terminal of claim 1, wherein the controller controls characters displayed in advance in accordance with a retention time of the proximity touch state.
  12. delete
  13. The mobile communication terminal of claim 1, wherein the control unit controls the vibration of a motor in accordance with the output character.
  14. The mobile communication terminal of claim 13, wherein the controller controls the intensity or duration of the vibration differently.
  15. The mobile communication terminal of claim 1, wherein the control unit processes input of a corresponding character when an actual touch of a contact is made.
  16. The mobile communication terminal of claim 1, wherein when the proximity touch state is released, the controller processes the input of the corresponding character.
  17. The mobile communication terminal of claim 16, wherein the controller controls a character corresponding to the moved position to be displayed in advance through the character input window when the proximity touch state is not released and the user moves to another button.
  18. A method of displaying characters using a mobile communication terminal, the method comprising:
    switching to a character creation mode;
    confirming whether a proximity-touch on a predetermined key is detected according to a signal provided through an input device; and
    displaying a character in advance in the character input window in response to the proximity-touch on the predetermined key when the proximity-touch is detected through the input device,
    wherein the displaying in advance includes switching to another character among the plurality of characters assigned to the predetermined key and displaying it in advance in the input window when the detected distance of the proximity-touch changes sequentially to a first distance, a second distance, and the first distance again.
  19. delete
  20. The method of claim 18, wherein characters displayed in advance in the character input window have colors different from those of the input characters.
  21. The method as claimed in claim 18, further comprising, when an actual touch of the contact is made, processing the input of the corresponding character.
  22. The method as claimed in claim 18, further comprising, when the proximity touch state is released, processing the input of the corresponding character.
KR1020080051612A 2008-06-02 2008-06-02 a mobile telecommunication device and a method of displaying characters using the same KR101498039B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080051612A KR101498039B1 (en) 2008-06-02 2008-06-02 a mobile telecommunication device and a method of displaying characters using the same

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020080051612A KR101498039B1 (en) 2008-06-02 2008-06-02 a mobile telecommunication device and a method of displaying characters using the same
EP09007233.1A EP2131272A3 (en) 2008-06-02 2009-05-29 Mobile communication terminal having proximity sensor and display controlling method therein
US12/476,213 US8482532B2 (en) 2008-06-02 2009-06-01 Mobile communication terminal having proximity sensor and display controlling method therein

Publications (2)

Publication Number Publication Date
KR20090125482A KR20090125482A (en) 2009-12-07
KR101498039B1 (en) 2015-03-03

Family

ID=41686979

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080051612A KR101498039B1 (en) 2008-06-02 2008-06-02 a mobile telecommunication device and a method of displaying characters using the same

Country Status (1)

Country Link
KR (1) KR101498039B1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006236143A (en) * 2005-02-25 2006-09-07 Sony Ericsson Mobilecommunications Japan Inc Input processing program, portable terminal device and input processing method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006236143A (en) * 2005-02-25 2006-09-07 Sony Ericsson Mobilecommunications Japan Inc Input processing program, portable terminal device and input processing method

Also Published As

Publication number Publication date
KR20090125482A (en) 2009-12-07

Similar Documents

Publication Publication Date Title
KR101496512B1 (en) Mobile terminal and control method thereof
KR101952179B1 (en) Mobile terminal and control method for the mobile terminal
KR101995486B1 (en) Mobile terminal and control method thereof
KR101631958B1 (en) Input device and mobile terminal having the same
KR101613555B1 (en) Mobile terminal
KR101574117B1 (en) Mobile Terminal and Method Of Executing Call Function Using Same
KR101886753B1 (en) Mobile terminal and control method thereof
KR101559178B1 (en) Method for inputting command and mobile terminal using the same
KR20150025385A (en) Mobile terminal and controlling method thereof
US8103296B2 (en) Mobile terminal and method of displaying information in mobile terminal
KR101461954B1 (en) Terminal and method for controlling the same
US8922494B2 (en) Mobile terminal and method of controlling the same
KR101617461B1 (en) Method for outputting tts voice data in mobile terminal and mobile terminal thereof
KR101466027B1 (en) Mobile terminal and its call contents management method
US20130250034A1 (en) Mobile terminal and control method thereof
KR101474438B1 (en) Apparatus and method for setting communication service interception mode of mobile terminal
US8565828B2 (en) Mobile terminal having touch sensor-equipped input device and control method thereof
KR101462932B1 (en) Mobile terminal and text correction method
KR101595029B1 (en) Mobile terminal and method for controlling the same
KR101456001B1 (en) Terminal and method for controlling the same
KR101078929B1 (en) Terminal and internet-using method thereof
KR101566353B1 (en) Mobile Terminal And Method Of Displaying Information In Mobile Terminal
KR101561703B1 (en) The method for executing menu and mobile terminal using the same
KR101562582B1 (en) Mobile terminal using flexible lcd and picture enlarging method thereof
KR100988377B1 (en) Apparatus and method for managing menu of mobile terminal

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
LAPS Lapse due to unpaid annual fee