KR101469280B1 - Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same - Google Patents


Info

Publication number
KR101469280B1
Authority
KR
South Korea
Prior art keywords
touch screen
touch
input medium
objects
portable terminal
Prior art date
Application number
KR1020080030446A
Other languages
Korean (ko)
Other versions
KR20090105160A (en)
Inventor
김종환
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020080030446A
Publication of KR20090105160A
Application granted
Publication of KR101469280B1

Abstract

The present invention relates to a portable terminal capable of sensing a proximity touch operation of an input medium, and a method of providing a graphical user interface using the portable terminal.
The portable terminal according to the present invention can distinguish between an input medium directly touching the touch screen and approaching it in a proximity touch. In particular, the portable terminal displays a predetermined lower-layer object on the touch screen, overlays at least a part of an upper-layer object on the lower-layer object, identifies the operation of the input medium proximity-touching the upper-layer object and moving along the surface of the touch screen, and displays the upper-layer object moving on the touch screen in response to the movement of the input medium. Accordingly, the portable terminal can provide various graphical user interfaces according to the direct or proximity touch operation of the input medium.
Mobile terminal, touch screen, proximity touch, user interface

Description

Technical Field

The present invention relates to a portable terminal having a proximity touch sensing function and a method of providing a graphical user interface using the portable terminal.

The present invention relates to a portable terminal, and more particularly, to a portable terminal capable of detecting a proximity touch operation of an input medium and a method of providing a graphical user interface using the portable terminal.

Description of the Related Art

Generally, a portable terminal is a portable device having one or more functions such as performing voice and video calls, inputting and outputting information, and storing data.

As the functions of portable terminals have diversified, the portable terminal has come to support complex functions such as capturing still images and video, playing music or video files, playing games, and receiving broadcasts, and is implemented as a comprehensive multimedia device.

In order to implement complex functions in multimedia devices, various new attempts have been made in terms of hardware or software. For example, a user interface environment is provided for a user to easily and conveniently search for or select a function.

In recent years, in order to provide various user interface environments, a touch screen in which a touch pad is coupled to a display module has been employed in portable terminals, making it possible for a user to input various user commands while viewing the screen implemented on the display module.

However, in a conventional portable terminal using a touch screen, the user can only select a function by directly touching a menu or icon displayed on the screen with a finger; there is a problem in that a more varied user interface environment cannot be provided.

SUMMARY OF THE INVENTION

The present invention has been made to solve the above-mentioned problems, and it is an object of the present invention to provide various user-friendly graphical user interfaces according to the different touch operations of a user, using a portable terminal having a proximity touch sensing function.

In order to accomplish the above object, the portable terminal according to the present invention comprises: a touch screen which generates different touch input signals according to the proximity touch or direct touch operation of an input medium, displays a predetermined lower-layer object, and displays at least a part of an upper-layer object overlaid on the lower-layer object; and a controller which identifies the operation of the input medium moving along the surface of the touch screen while proximity-touching the upper-layer object, and controls the upper-layer object to move on the touch screen in response to the movement of the input medium.

Accordingly, a user command specific to the proximity touch operation of the input medium can be input while the predetermined lower-layer object and the upper-layer object form a multi-layer structure on the touch screen of the portable terminal, so that a more varied user interface environment can be provided.

In addition, the controller may identify the operation of the input medium moving along the surface of the touch screen while directly touching the lower-layer object or the upper-layer object, and control at least one of the lower-layer object and the upper-layer object to move on the touch screen.

Accordingly, a user command specific to the direct touch operation of the input medium can be input while the predetermined lower-layer object and the upper-layer object form a multi-layer structure on the touch screen, so that a more varied user interface environment can be provided.
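As a rough illustration of the two behaviors described above, the following Python sketch models a controller that moves only the upper-layer object on a proximity drag, and moves whichever object was touched on a direct drag. All names here (`LayerObject`, `on_touch_move`) are hypothetical; the patent describes the behavior, not an implementation.

```python
from dataclasses import dataclass


@dataclass
class LayerObject:
    """Position of an object displayed on the touch screen."""
    x: float = 0.0
    y: float = 0.0


def on_touch_move(touch_type, target, dx, dy, upper):
    """Move displayed objects according to the kind of touch.

    touch_type: "proximity" or "direct"; target: the object under the
    input medium; upper: the upper-layer object (e.g. a pop-up window).
    """
    if touch_type == "proximity":
        # A proximity drag moves only the upper-layer object.
        if target is upper:
            target.x += dx
            target.y += dy
    elif touch_type == "direct":
        # A direct drag moves whichever layer object was touched.
        target.x += dx
        target.y += dy
```

A proximity drag over the lower-layer object is ignored in this model, which matches the claim that the proximity gesture is specific to the upper-layer object.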

The controller may remove the upper layer object from the visible region of the touch screen when a predetermined position of the upper layer object is out of the visible region of the touch screen.

Accordingly, when the upper layer object is a pop-up window such as an advertisement, the user can easily remove the pop-up window.

Alternatively, when a predetermined position of the upper-layer object moves out of the visible region of the touch screen, the controller may display on the touch screen a user command input window for receiving a user command on whether to remove the upper-layer object.

Accordingly, it is possible to prevent information loss due to a mis-entered user command by giving the user a chance to reconfirm whether to remove the upper-layer object.
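A minimal sketch of this removal-with-confirmation flow might look as follows. The function name and the `confirm_removal` callback (standing in for the confirmation window) are hypothetical, introduced only for illustration.

```python
def handle_drag_release(pos, screen_size, confirm_removal):
    """Decide what happens to the upper-layer object when a drag ends.

    pos: (x, y) anchor position of the upper-layer object;
    screen_size: (width, height) of the visible region;
    confirm_removal: callable returning True if the user confirms
    removal in the user command input window.
    """
    x, y = pos
    w, h = screen_size
    if 0 <= x < w and 0 <= y < h:
        return "kept"        # anchor still visible: nothing to do
    # Anchor left the visible region: ask before discarding, so a
    # mis-entered drag does not silently destroy the pop-up window.
    return "removed" if confirm_removal() else "restored"
```

In the advertisement pop-up example from the text, dragging the window off-screen and confirming would dismiss it, while declining would restore it to the screen.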

Preferably, the controller identifies the operation of the input medium touching the lower-layer object for a predetermined time or longer, and displays the lower-layer object above the upper-layer object.

Accordingly, after the object initially displayed in the lower layer is changed to an upper layer object, the user command according to the present invention can be executed according to the proximity touch operation of the input medium.

Preferably, the controller determines that a proximity touch operation of the input medium has occurred only when the input medium proximity-touches the upper-layer object or the lower-layer object for a predetermined time or longer.

In this way, by allowing a user command to be input only when the proximity touch is held for a certain time, it is possible to prevent the portable terminal from malfunctioning due to an unintended operation of the input medium.

According to another aspect of the present invention, there is provided a method of providing a graphical user interface using a portable terminal, the method comprising: a first step of detecting, on a touch screen which displays a predetermined lower-layer object and displays at least a part of an upper-layer object overlaid on the lower-layer object, an operation in which an input medium proximity-touches the upper-layer object; a second step of measuring the time during which the input medium proximity-touches the touch screen; and a third step of, when the measured proximity touch time is equal to or longer than a predetermined time, identifying the operation of the input medium moving along the surface of the touch screen while proximity-touching the upper-layer object, and displaying the upper-layer object moving on the touch screen in response to the movement of the input medium.

Accordingly, various graphical user interfaces can be provided according to the operation of the input medium proximity-touching or directly touching a predetermined lower-layer object or upper-layer object displayed on the touch screen of the portable terminal.

According to the portable terminal and the method of providing a graphical user interface using the same according to the present invention, when a predetermined lower-layer object and an upper-layer object form a multi-layer structure on the touch screen of the portable terminal, various user commands can be input according to the proximity touch or direct touch operation of the input medium. Accordingly, there is an advantage in that a variety of graphical user interfaces can be provided in a user-friendly manner in accordance with the functions executed in the portable terminal.

The portable terminal described in this specification includes a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistants), a PMP (Portable Multimedia Player), and a navigation system.

Hereinafter, a portable terminal and a method of providing a graphical user interface using the same according to the present invention will be described in detail with reference to the drawings.

Referring to FIG. 1, a mobile terminal according to the present invention will be described in terms of components according to functions.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.

The portable terminal 100 includes a wireless communication unit 110, an audio/video input unit 120, an operation unit 130, a sensing unit 140, an output unit 150, a storage unit 160, an interface unit 170, a control unit 180, a power supply unit 190, and the like. It should be noted that when these components are implemented in a practical application, two or more components may be merged into one component, or one component may be subdivided into two or more components as necessary.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short distance communication module 114 and a GPS module 115.

The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server (not shown) through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

Meanwhile, the broadcast related information may be provided through a mobile communication network, and in this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms. For example, it may exist in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, it can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the digital broadcasting systems described above but also for any broadcasting system that provides broadcast signals.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the storage unit 160.

Further, the mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data depending on a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built into the portable terminal 100 or externally attached.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

In addition, the GPS (Global Position System) module 115 receives navigation information from a plurality of satellites.

The A / V (Audio / Video) input unit 120 is for inputting an audio signal or a video signal. The audio / video input unit 120 may include a camera module 121 and a microphone module 122. The camera module 121 processes image frames such as still images or moving images obtained by the image sensor in the video communication mode or the photographing mode. Then, the processed image frame can be displayed on the display module 151.

The image frame processed by the camera module 121 may be stored in the storage unit 160 or transmitted to the outside through the wireless communication unit 110. Two or more camera modules 121 may be provided according to the configuration of the terminal.

The microphone module 122 receives an external sound signal through a microphone in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. The microphone module 122 may implement various noise reduction algorithms for eliminating noise generated while receiving an external sound signal.

The operation unit 130 generates key input data that the user inputs to control the operation of the terminal. The operation unit 130 may include a keypad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like. In particular, when the touch pad forms a mutual layer structure with the display module 151 described later, it can be called a touch screen.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/close state of the portable terminal 100 and the position of the portable terminal 100, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is a slide phone, it can sense whether the slide phone is opened or closed. The sensing unit 140 is also responsible for sensing functions related to whether the power supply unit 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like.

The interface unit 170 serves as an interface with all external devices connected to the portable terminal 100. Examples include a wired/wireless headset, an external charger, a wired/wireless data port, a card socket (e.g., for a memory card or a SIM/UIM card), audio I/O (input/output) terminals, and an earphone jack. The interface unit 170 receives data or power from an external device and transfers it to each component in the portable terminal 100, or transmits data in the portable terminal 100 to an external device.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display module 151, an audio output module 152, an alarm output module 153, and the like.

The display module 151 displays and outputs information processed by the portable terminal 100. For example, when the portable terminal 100 is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the captured image and / or the received image or UI and GUI are displayed.

Meanwhile, as described above, when the display module 151 and the touch pad form a mutual layer structure to constitute a touch screen, the display module 151 can be used as an input device in addition to an output device. The display module 151 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. There may also be two or more display modules 151 depending on the implementation of the portable terminal 100. For example, the portable terminal 100 may include both an external display module (not shown) and an internal display module (not shown).

The audio output module 152 outputs audio data received from the wireless communication unit 110 or stored in the storage unit 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the portable terminal 100 (e.g., a call signal reception sound or a message reception sound). The audio output module 152 may include a speaker, a buzzer, and the like.

The alarm output module 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events generated in the portable terminal 100 include a call signal request for requesting a telephone call, a message reception, a key signal input, and an alarm for notifying a predetermined time. The alarm output module 153 outputs a signal for notifying the occurrence of an event in a form other than an audio signal or a video signal. For example, it is possible to output a signal in a vibration mode. When a call signal is received or a message is received, the alarm output module 153 may output a vibration to inform it. Alternatively, when the key signal is input, the alarm output module 153 can output the vibration as the feedback to the key signal input. The user can recognize the occurrence of an event through the vibration output as described above. Of course, a signal for notifying the occurrence of an event may also be output through the display module 151 or the sound output module 152.

The storage unit 160 may store programs for the processing and control performed by the controller 180, and may also perform a function of temporarily storing input/output data (e.g., a phone book, messages, still images, and moving pictures).

The storage unit 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), RAM, and ROM. The portable terminal 100 may also operate in association with web storage that performs the storage function of the storage unit 160 over the Internet.

The control unit 180 typically controls the overall operation of the portable terminal 100, performing, for example, the control and processing associated with voice calls, data communication, and video calls. In addition, the control unit 180 may include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be implemented as hardware within the controller 180 or as software separate from the controller 180.

In addition, the controller 180 identifies the operation of an input medium (e.g., a user's finger) proximity-touching or directly touching the touch screen, and provides different graphical user interfaces on the touch screen accordingly. For example, the controller 180 displays a predetermined lower-layer object on the touch screen, displays at least a part of an upper-layer object overlaid on the lower-layer object, and identifies the operation of the input medium proximity-touching the upper-layer object and then moving along the surface of the touch screen, so that the upper-layer object moves on the touch screen. The controller 180 also activates the lower-layer object and the upper-layer object displayed on the touch screen, thereby identifying the operation of the input medium proximity-touching or directly touching the lower-layer object and the upper-layer object. The detailed functions of the controller 180 are described in more detail below.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

In the foregoing, the portable terminal related to the present invention has been examined in terms of components according to their functions. Hereinafter, the portable terminal related to the present invention will be further described with reference to FIGS. 2 and 3 in terms of components according to their external form. For simplicity of explanation, a slider-type portable terminal will be taken as an example among the various types of portable terminals, such as folder, bar, swing, and slider types. However, the present invention is not limited to the slider-type portable terminal and can be applied to all types of portable terminals, including the above-mentioned types.

FIG. 2 is a perspective view of a portable terminal according to an embodiment of the present invention.

The portable terminal 100 of the present invention includes a first body 100A and a second body 100B configured to be slidable along at least one direction on the first body 100A.

The state in which the first body 100A is disposed so as to overlap the second body 100B may be referred to as a closed configuration, and the state in which the first body 100A exposes at least a part of the second body 100B may be referred to as an open configuration.

The portable terminal 100 operates mainly in a standby mode in the closed configuration, and the standby mode may be released by a user operation. The portable terminal 100 operates mainly in a call mode in the open configuration, and switches to the standby mode upon a user operation or after a predetermined time elapses.

The casing (casing, housing, cover, etc.) constituting the outer appearance of the first body 100A is formed by the first front case 100A-1 and the first rear case 100A-2. Various electronic components are embedded in the space formed by the first front case 100A-1 and the first rear case 100A-2. At least one intermediate case may be additionally disposed between the first front case 100A-1 and the first rear case 100A-2.

The cases may be formed by injection molding of a synthetic resin or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display module 151, a first audio output module 152-1, a first camera module 121-1, and a first operation unit 130-1 may be disposed on the first body 100A, specifically, on the first front case 100A-1.

The display module 151 includes a liquid crystal display (LCD), organic light emitting diodes (OLED), and the like, which visually represent information.

In addition, the display module 151 is overlapped with the touch pad in a layer structure, so that the display module 151 may operate as a touch screen to enable information input by a user's touch.

The first sound output module 152-1 may be implemented in the form of a receiver or a speaker.

The first camera module 121-1 may be adapted to photograph an image or a moving image of a user.

Like the first body 100A, the case constituting the outer appearance of the second body 100B is formed by the second front case 100B-1 and the second rear case 100B-2.

The second operation unit 130-2 may be disposed on the front face of the second body 100B, specifically, the second front case 100B-1.

The third operation unit 130-3, the microphone module 122, and the interface unit 170 may be disposed on at least one of the second front case 100B-1 and the second rear case 100B-2.

The first to third operation units 130-1, 130-2, and 130-3 may be collectively referred to as the operation unit 130, and any manner may be employed as long as it is operated in a tactile manner.

For example, the operation unit 130 may be embodied as a dome switch or a touch pad capable of receiving a command or information through a user's push or touch operation, or as a wheel, a jog dial, a joystick, or the like.

In terms of function, the first operation unit 130-1 is for inputting commands such as start, end, and scroll, and the second operation unit 130-2 is for inputting numbers, characters, symbols, and the like.

Also, the third operating unit 130-3 may operate as a hot-key for activating a special function in the portable terminal.

The microphone module 122 may be implemented in a form suitable for receiving voice, sound, etc. of the user.

The interface unit 170 is a channel through which the portable terminal 100 related to the present invention can exchange data with an external device. For example, the interface unit 170 may be at least one of a wired or wireless connection terminal for connecting an earphone, a port for short-range communication (e.g., an IrDA port, a Bluetooth port, or a wireless LAN port), and power supply terminals for supplying power to each component in the portable terminal 100.

The interface unit 170 may be a card socket for accommodating an external card such as a subscriber identification module (SIM) or a user identity module (UIM) or a memory card for storing information.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the second rear case 100B-2.

The power supply unit 190 may be, for example, a rechargeable battery, and may be detachably coupled for charging.

FIG. 3 is a rear perspective view of the portable terminal of FIG. 2.

Referring to FIG. 3, a second camera module 121-2 may be additionally mounted on the rear surface of the second rear case 100B-2 of the second body 100B. The second camera module 121-2 has a photographing direction substantially opposite to that of the first camera module 121-1 (see FIG. 2), and may have a different pixel count from the first camera module 121-1.

For example, it is desirable for the first camera module 121-1 to have a low pixel count sufficient to capture the user's face and transmit it to the other party during a video call, and for the second camera module 121-2 to have a high pixel count, since it often photographs general subjects.

A flash 121-3 and a mirror 121-4 may be additionally disposed adjacent to the second camera module 121-2. The flash 121-3 shines light toward the subject when the subject is photographed by the second camera module 121-2. The mirror 121-4 allows the user to view his or her face when taking a self-portrait using the second camera module 121-2.

A second audio output module 152-2 may be additionally disposed in the second rear case 100B-2.

The second audio output module 152-2 may implement a stereo function together with the first audio output module 152-1 (see FIG. 2), and may be used for calls in a speakerphone mode.

On one side of the second rear case 100B-2, a broadcast signal reception antenna 111-1 may be arranged in addition to an antenna for communication. The antenna 111-1 may be installed to be able to be drawn out from the second body 100B.

A part of the slide module 100C that slidably connects the first body 100A and the second body 100B is disposed on the first rear case 100A-2 side of the first body 100A.

The other part of the slide module 100C may be disposed on the second front case 100B-1 side of the second body 100B and may not be exposed to the outside as in this figure.

Although the second camera module 121-2 and the like have been described as being disposed on the second body 100B, the present invention is not necessarily limited thereto.

For example, at least one of the components described as being disposed on the second rear case 100B-2, such as the second camera module 121-2 (i.e., the components 111-1, 121-2, 121-3, and 152-2), may instead be mounted on the first body 100A, mainly on the first rear case 100A-2. In that case, there is an advantage that the component(s) disposed on the first rear case 100A-2 are protected by the second body 100B in the closed configuration. Furthermore, even if the second camera module 121-2 is not separately provided, the first camera module 121-1 may be formed to be rotatable so that it can photograph in the photographing direction of the second camera module 121-2.

The portable terminal described above is not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

FIG. 4 shows the structure of a touch screen related to the present invention.

As shown in FIG. 4, the display module 151 according to the present invention forms a mutual layer structure with a touch pad 400, thereby constituting a touch screen 500.

Referring to FIG. 4, the touch pad 400 includes a rectangular conductive film 411 formed of a transparent conductive material such as ITO (Indium Tin Oxide), and metal electrodes 412-1 to 412-4 formed at the corners of the conductive film 411. A protective film 420 may be formed on the conductive film 411.

The touch pad 400 is a position detecting device of the capacitive sensing type. When an AC voltage is applied between the transmission-side metal electrodes (T) 412-1 and 412-4 and the reception-side metal electrodes (R) 412-2 and 412-3, electric field lines are formed between them. The formed electric field lines extend to the outside of the touch pad 400 through the protective film 420. Accordingly, when an input medium (for example, the user's finger) approaches or directly touches the touch pad 400 and partially blocks the electric field lines, the current flowing into the reception-side metal electrodes (R) 412-2 and 412-3 changes. This is because the human body has a capacitance of several pF with respect to ground, so that when the user brings a finger close to or touches the touch pad 400, the electric field lines formed on the touch pad 400 are distorted.

A processor (not shown) in the portable terminal 100 can detect, from the change in current at the reception-side metal electrodes (R) 412-2 and 412-3, the position at which the touch occurs and the proximity distance between the input medium and the touch screen 500.
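As a rough numerical illustration of how a touch position can be recovered from corner-electrode currents, the sketch below uses the standard ratio approximation for four-corner capacitive sensing. The formula and function name are illustrative only, not taken from the patent.

```python
def estimate_touch_point(i1, i2, i3, i4, width, height):
    """Estimate the touch position from the current changes at the four
    corner electrodes of a capacitive touch pad.

    i1..i4: current changes at the corner electrodes
    (i1: top-left, i2: top-right, i3: bottom-right, i4: bottom-left).
    The ratio of currents on one side to the total current gives the
    normalized coordinate along that axis.
    """
    total = i1 + i2 + i3 + i4
    x = width * (i2 + i3) / total   # more current on the right -> larger x
    y = height * (i3 + i4) / total  # more current at the bottom -> larger y
    return x, y
```

With equal currents at all four corners, the estimate falls at the center of the screen, as expected for a touch equidistant from the electrodes.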

The input medium includes not only the user's finger but also any object that can recognize the touch input by distorting the electric force line formed on the touch pad 400.

FIG. 5 is a diagram for explaining the principle of detecting the proximity distance of an input medium using the touch screen of FIG. 4.

Referring to FIG. 5, when an AC voltage 430 is applied to the transmission-side metal electrode 412-1 among the metal electrodes 412-1 to 412-4 formed on the transparent conductive film 411, electric field lines 501 to 503 are formed between the transmission-side metal electrode 412-1 and the reception-side metal electrode 412-2. The electric field lines 501 to 503 extend in the vertical direction (i.e., the z direction) of the touch screen 500.

The amount of the electric force lines 501 to 503 blocked by the finger 510 varies according to the distance between the user's finger 510 and the touch screen 500. That is, as the finger 510 approaches the touch screen 500, the influence of the finger 510 on the electric force lines 501 to 503 increases.

The influence exerted by the finger 510 on the electric force lines 501 to 503 changes the currents applied to the current detecting units 440-1 and 440-2 connected to the metal electrodes 412-1 and 412-2, and the current detecting units 440-1 and 440-2 detect this change and provide it to an analog-to-digital converter (ADC) 450. The analog-to-digital converter (ADC) 450 then converts the amount of current change, input in the form of an analog signal, into a digital value and provides the digital value to the touch time measuring unit 460.

Then, the touch time measuring unit 460 recognizes, from the information on the amount of current change provided from the analog-to-digital converter (ADC) 450, that the finger 510 has come within the touch recognition effective distance (i.e., 'd1' in FIG. 5) at which the touch screen 500 starts to recognize proximity, and measures the time during which the finger 510 remains within the touch recognition effective distance. Accordingly, when the finger 510 remains within the touch recognition effective distance for a predetermined time (e.g., one second) or longer, the touch time measuring unit 460 can recognize that the user is performing a proximity touch or direct touch operation according to the present invention. On the other hand, when the finger 510 remains within the touch recognition effective distance for a time shorter than the predetermined time, the touch time measuring unit 460 can determine that the proximity touch or direct touch operation according to the present invention is not being performed.
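The chain described above (current detecting units → ADC 450 → touch time measuring unit 460) amounts to a dwell-time check. The following Python sketch illustrates that check using the example values in the text (a 30 mm effective distance, a one-second dwell); the function and variable names are illustrative, not from the patent.

```python
# Hedged sketch of the touch-time measurement: a touch (proximity or
# direct) is recognized only if the input medium stays within the touch
# recognition effective distance d1 for a minimum dwell time.

D1_MM = 30.0          # touch recognition effective distance (example value)
MIN_DWELL_S = 1.0     # predetermined time (e.g., one second)

def recognize_touch(samples):
    """samples: list of (timestamp_s, distance_mm) pairs, in time order.
    Returns True once the medium has stayed within d1 for MIN_DWELL_S."""
    entered_at = None
    for t, dist in samples:
        if dist <= D1_MM:
            if entered_at is None:
                entered_at = t                # medium entered the effective distance
            if t - entered_at >= MIN_DWELL_S:
                return True                   # dwelled long enough: touch recognized
        else:
            entered_at = None                 # left the effective distance: reset
    return False
```

A sample trace that dips inside 30 mm and stays there for over a second is recognized; one that leaves and re-enters is not, matching the "shorter than a predetermined time" case above.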

If the touch time measuring unit 460 determines that a touch input has occurred according to a proximity touch or direct touch operation of the finger 510 on the touch screen 500, the touch time measuring unit 460 provides the touch input occurrence information and the information on the amount of current change to the distance detecting unit 470.

The distance detecting unit 470 detects the distance between the finger 510 and the touch screen 500, that is, the distance in the vertical upward direction (i.e., the z direction) of the touch screen 500, based on the provided information on the amount of current change.

Specifically, when the finger 510 is located between distance d1 (e.g., 30 mm) and distance d2 (e.g., 20 mm) in the vertical upward direction (i.e., the z direction) of the touch panel 400 (i.e., located between d1 and d2), the distance detecting unit 470 determines that the finger 510 is within the touch recognition effective distance at which the touch screen 500 starts to detect whether an external input medium is touched, and a function corresponding to the proximity touch operation according to the present invention can be provided. Here, the proximity touch refers to the state in which an input medium (e.g., the user's finger) is positioned within the touch recognition effective distance of the touch screen 500 in order to input a user command without directly touching the touch screen 500, and is distinguished from the direct touch operation in which the input medium directly touches the touch screen 500.

In addition, when the finger 510 is located between distance d2 (e.g., 20 mm) and distance d3 (e.g., 10 mm) in the vertical upward direction (i.e., the z direction) of the touch screen 500 (i.e., located between d2 and d3), the distance detecting unit 470 may determine that the finger 510 is closely adjacent to the touch screen 500.

Further, when the finger 510 is located at a distance closer than d3 (e.g., 10 mm) in the vertical upward direction (i.e., the z direction) of the touch screen 500 (i.e., located within d3), or when the finger 510 directly touches the surface of the touch screen 500, the distance detecting unit 470 determines that the finger 510 has directly touched the touch screen 500 within an error range.

Although in FIG. 5 the touch operation of the finger 510 is divided into three steps according to the distance between the finger 510 and the touch screen 500, the touch operation may be divided more precisely into four or more steps.
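The three-step distinction above (d1/d2/d3) amounts to a simple threshold classification. A minimal sketch, assuming the example thresholds of 30 mm, 20 mm, and 10 mm from the text (the level names are illustrative):

```python
# Illustrative classification of the distance steps described above.

D1, D2, D3 = 30.0, 20.0, 10.0   # mm, example thresholds from the text

def touch_level(distance_mm):
    if distance_mm > D1:
        return "out_of_range"     # beyond the touch recognition effective distance
    if distance_mm > D2:
        return "proximity_touch"  # between d1 and d2
    if distance_mm > D3:
        return "close_proximity"  # between d2 and d3: closely adjacent
    return "direct_touch"         # within d3: treated as direct touch within error
```

Dividing the touch operation into four or more steps, as the text allows, would simply add further thresholds to this chain.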

Next, the position detecting unit 480 calculates the position on the touch screen 500 indicated by the finger 510, that is, the horizontal coordinates in the x and y directions on the touch screen 500, from the information on the amount of current change. The y direction is perpendicular to the x and z directions shown in FIG. 5.

The vertical distance between the finger 510 and the touch screen 500 measured in this manner and the horizontal coordinates of the finger 510 on the touch panel 400 are provided to the controller 180. Accordingly, the controller 180 determines the user's command according to the vertical distance and the horizontal coordinates, performs a control operation according to the user command, and provides a predetermined graphical user interface (GUI) on the display module 151.

FIG. 6 is a view for explaining the principle of position detection of an input medium using the touch screen of FIG.

Referring to FIG. 6, when an AC voltage is applied from the AC voltage source to the transmission-side metal electrodes (T) 412-1 and 412-4 of the touch panel 400, electric force lines (not shown) are formed between the transmission-side metal electrodes (T) 412-1 and 412-4 and the reception-side metal electrodes (R) 412-2 and 412-3.

When the user's finger 510 approaches the touch panel 400 or directly touches it, a current change occurs in the metal electrodes 412-1 to 412-4. The position detecting unit 480 detects the horizontal coordinates (i.e., x-y coordinates) at which the finger 510 is located on the touch panel 400 based on the amounts of current detected by the current detecting units 440-1 to 440-4, and provides the detected coordinates to the controller 180. Accordingly, the controller 180 recognizes the horizontal coordinates on the touch screen 500 touched by the finger 510, performs the user command corresponding to the touch operation, and provides a predetermined graphical user interface (GUI) on the display module 151.
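The patent does not spell out how the four corner currents are combined into x-y coordinates, but a common interpolation for surface-capacitance panels derives each coordinate from the ratio of the currents drawn through opposing edges. The following is only an illustrative sketch under that assumption; the corner naming (tl/tr/bl/br) is not from the patent.

```python
def position_from_currents(i_tl, i_tr, i_bl, i_br):
    """Normalized (x, y) in [0, 1] from the four corner electrode currents.
    A finger closer to an electrode draws proportionally more current
    through it, so coordinate = share of current on the near edges."""
    total = i_tl + i_tr + i_bl + i_br
    x = (i_tr + i_br) / total   # more current on the right edge -> larger x
    y = (i_bl + i_br) / total   # more current on the bottom edge -> larger y
    return x, y
```

With equal currents at all four corners the computed position is the panel centre; skewing the current toward the right-hand electrodes moves x toward 1.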

Although FIGS. 5 and 6 show the touch time measuring unit 460, the distance detecting unit 470, and the position detecting unit 480 as separate components according to their functions, the touch time measuring unit 460, the distance detecting unit 470, and the position detecting unit 480 may be formed within the controller 180.

Although FIGS. 4 to 6 illustrate a touch screen 500 having a touch panel 400 of the capacitance sensing type, the type of the touch panel 400 and the arrangement of the metal electrodes 412-1 to 412-4 formed on the touch panel 400 are not limited, as long as the touch panel can distinguish the proximity touch and the direct touch of the input medium on the touch screen 500 and can provide the function of detecting the distance between the input medium and the touch screen 500 and the position indicated by the input medium.

For example, the touch panel 400 may be implemented to detect the proximity position between the touch panel 400 and the input medium using a photoelectric sensor employing a laser diode and a light emitting diode, a high frequency oscillation proximity sensor, or a magnetic proximity sensor, or may be implemented by combining the capacitance sensing method with a touch panel of the resistive (pressure) sensing type, in which metal electrodes are formed on an upper or lower plate and a voltage change according to the position pressed by the input medium is detected.

FIGS. 7A to 7C are views illustrating a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to the first embodiment of the present invention.

The portable terminal 100 shown in FIGS. 7A to 7C includes a touch screen 500 in which the touch panel 400 forms a mutual layer structure with the display module 151 so that a user can input a user command through the screen while viewing the screen. In particular, the touch screen 500 can distinguish between the proximity touch operation and the direct touch operation of the input medium (e.g., the user's finger) 510 and can generate different input signals accordingly. Thus, the user can input different user commands by proximity-touching or directly touching the touch screen 500 using the input medium 510.

FIG. 7A is a diagram for explaining a graphical user interface provided on the touch screen in a proximity touch operation of the input medium according to the first embodiment of the present invention.

Referring to FIG. 7A, a web browser 600 is executed on the touch screen 500 of the portable terminal 100, a web page 610 is displayed in the web browser 600, and a pop-up window 620 is displayed superimposed on the web page 610.

The objects defined in the present invention include letters, numbers, symbols, graphics, photographs, images, moving pictures, or a combination thereof that can be displayed on the touch screen 500. Accordingly, each of the web browser 600 and the web page 610 may be classified as one independent object, and a plurality of objects may be included in the web browser 600 and the web page 610.

The web page 610 may include various objects such as a company logo, a menu bar, an advertisement, an image, text, and graphics. A URL (Uniform Resource Locator) of another web page may be linked to the company logo, menu bar, advertisement, image, text, and graphics included in the web page 610, and the screen of the linked web page can be displayed on the touch screen 500 by directly touching the corresponding logo, menu bar, advertisement, image, text, or graphic.

Also, the pop-up window 620 refers to a new window that a specific web site suddenly generates to display certain contents. The pop-up window 620 is mainly used to present advertisements or announcements, or to install new programs.

In FIG. 7A, only a partial area of the web page 610 is enlarged and displayed, but the entire area of the web page 610 may instead be reduced and displayed. A pop-up window 620 presenting an advertisement is superimposed on the upper portion of the web page 610. When two or more different objects 610 and 620 are displayed on the touch screen 500 in a multi-layer structure in this way, the object displayed underneath (i.e., the web page 610 or the web browser 600) is defined as a lower layer object, and the object displayed superimposed on the lower layer object (i.e., the pop-up window 620) is defined as an upper layer object. Although FIG. 7A shows the entire area of the pop-up window 620 superimposed on the web page 610, even if only a part of the pop-up window 620 is displayed superimposed on the web page 610, the web page 610 and the pop-up window 620 may still be classified as a lower layer object and an upper layer object, respectively.
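The lower/upper layer relationship defined above can be modelled as a z-ordered list, as in this illustrative sketch (the class and object names are not from the patent):

```python
# Minimal sketch of the layer relationship: objects on the touch screen
# are kept in a z-ordered list, and an object added later (higher index)
# overlays those added earlier.

class Screen:
    def __init__(self):
        self.objects = []           # index 0 = bottom-most (lower layer)

    def add(self, obj):
        self.objects.append(obj)    # new objects appear on top

    def is_upper_layer(self, a, b):
        """True if object a is displayed superimposed on object b."""
        return self.objects.index(a) > self.objects.index(b)

screen = Screen()
screen.add("web_page_610")          # lower layer object
screen.add("popup_620")             # upper layer object superimposed on it
```

The same relation holds whether the upper object covers the lower one entirely or only partially, as the text notes.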

In FIG. 7A, reference numeral 511 indicates that the input medium (e.g., the user's finger) 510 proximity-touches the pop-up window 620 displayed on the touch screen 500 within the touch recognition effective distance of the touch screen 500.

In this way, when the web page 610 and the pop-up window 620 are displayed on the touch screen 500 and the input medium (e.g., the user's finger) 510 proximity-touches the pop-up window 620 for a predetermined time or longer, the controller 180 determines that a user command based on the proximity touch operation is input.

When the input medium 510 moves in the proximity-touch state in the direction of the arrow 520 while proximity-touching a predetermined position of the pop-up window 620, the controller 180 identifies that the input medium 510 inputs a proximity position movement command, and controls the pop-up window 620 to move on the touch screen 500 in correspondence with the movement of the input medium 510, as shown in FIG. 7B. That is, the pop-up window 620 may be displayed to move on the touch screen 500 following the input medium 510 along the movement path of the input medium 510. Here, the proximity position movement means that the input medium 510 moves horizontally along the surface of the touch screen 500 within the touch recognition effective distance without directly touching the touch screen 500.

Further, when the input medium 510 continues the proximity position movement while proximity-touching the pop-up window 620 so that at least a certain area of the pop-up window 620 is out of the visible area of the touch screen 500, the controller 180 removes the pop-up window 620 from the touch screen 500. For example, when the central portion of the pop-up window 620 is out of the visible area of the touch screen 500, the controller 180 may determine that a user command for removing the pop-up window 620 has been input and remove the pop-up window 620 from the touch screen 500. In addition, when the input medium 510 itself is moved out of the visible area of the touch screen 500 while continuing the proximity position movement, the controller 180 may determine that a user command for removing the pop-up window 620 has been input and control the pop-up window 620 to disappear from the touch screen 500.
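The proximity position movement and the removal rule above can be sketched as follows. The screen size and object geometry are invented for illustration, and the "centre leaves the visible area" criterion follows the example in the text; all names are illustrative.

```python
# Sketch of the proximity drag: while a proximity position movement is in
# progress the pop-up follows the input medium, and once its centre leaves
# the visible area it is treated as a removal command.

SCREEN_W, SCREEN_H = 320, 480       # assumed visible area, in pixels

class Popup:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.visible = True

    def center(self):
        return (self.x + self.w / 2, self.y + self.h / 2)

def proximity_drag(popup, dx, dy):
    """Move the popup with the input medium; remove it if its centre
    leaves the visible area of the touch screen."""
    popup.x += dx
    popup.y += dy
    cx, cy = popup.center()
    if not (0 <= cx <= SCREEN_W and 0 <= cy <= SCREEN_H):
        popup.visible = False       # interpreted as a removal command
```

The alternative trigger in the text, the input medium itself leaving the visible area, would be an additional check on the medium's own coordinates rather than the pop-up's.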

As described above, according to the present invention, a pop-up window 620 such as an advertisement displayed on the touch screen 500 can easily be removed by the proximity touch operation of the input medium 510. In addition, according to the present invention, user commands specific to the proximity touch operation can be input, arousing the interest of the user and providing a more diverse graphical user interface environment.

FIGS. 8A to 8C are views illustrating a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to a second embodiment of the present invention.

Referring to FIG. 8A, a web browser 600 is executed on the touch screen 500 of the portable terminal 100, a web page 610 is displayed in the web browser 600, and a pop-up window 620 is displayed superimposed on the web page 610.

In FIG. 8A, a check mark (✓) 512 indicates that the input medium (e.g., the user's finger) has directly touched the pop-up window 620.

When the input medium moves in the direction of the arrow 520 along the surface of the touch screen 500 while directly touching a predetermined position of the pop-up window 620, the controller 180 identifies that the input medium inputs a direct touch position movement command according to the present invention, and controls both the pop-up window 620 and the web page 610 to move on the touch screen 500 in correspondence with the movement of the input medium 510. For example, as shown in FIG. 8B, when the input medium 510 moves in the right direction of the touch screen 500 as indicated by the arrow 520, the controller 180 can control both the pop-up window 620 and the web page 610 to move leftward on the touch screen 500.

Meanwhile, the direction in which the pop-up window 620 and the web page 610 move may be changed according to the direct touch position movement command of the input medium 510. For example, when the input medium 510 moves to the right of the touch screen 500, the controller 180 may instead control the pop-up window 620 and the web page 610 to follow the input medium 510 along its movement path and move to the right on the touch screen 500.

In FIG. 8C, the check mark (✓) 512 indicates that the input medium (e.g., the user's finger) has directly touched the web page 610.

When the input medium moves in the direction of the arrow 520 along the surface of the touch screen 500 while directly touching a predetermined position of the web page 610, the controller 180 likewise identifies that the input medium inputs the direct touch position movement command according to the present invention, and controls both the pop-up window 620 and the web page 610 to move on the touch screen 500 in correspondence with the movement of the input medium 510, as shown in FIG. 8B.
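The contrast between the first and second embodiments — a proximity drag moves only the touched upper layer object, while a direct drag moves the web page and the pop-up window together — can be sketched as a single dispatcher. The object representation and names are illustrative, not the patent's implementation.

```python
# Sketch: proximity position movement affects only the touched object,
# direct touch position movement affects every displayed object.

def apply_move(objects, touched, dx, dy, proximity):
    """objects: dict name -> [x, y]; touched: name of the touched object."""
    targets = [touched] if proximity else list(objects)
    for name in targets:
        objects[name][0] += dx
        objects[name][1] += dy

layout = {"web_page_610": [0, 0], "popup_620": [40, 60]}
apply_move(layout, "popup_620", 10, 0, proximity=True)   # only the pop-up moves
apply_move(layout, "popup_620", -5, 0, proximity=False)  # both objects move
```

Note that in this sketch the drag direction simply follows the input medium; as the text explains, the controller may equally map a rightward drag to a leftward movement of the objects.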

FIGS. 9A to 9E illustrate a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to a third embodiment of the present invention.

FIG. 9A illustrates a state in which a plurality of photo objects 700 are displayed on the touch screen 500 of the portable terminal 100 in the form of thumbnails. The plurality of photo objects 700 overlap each other to form a multi-layer structure.

Referring to FIG. 9A, a partial area of the first photo object 710 is covered by the second and third photo objects 720 and 730, which are displayed superimposed on it. Accordingly, the first photo object 710 corresponds to a lower layer object with respect to the second and third photo objects 720 and 730, and the second and third photo objects 720 and 730 correspond to upper layer objects with respect to the first photo object 710. In addition, the fourth photo object 740 is partially displayed superimposed on the second photo object 720. The second photo object 720 corresponds to a lower layer object with respect to the fourth photo object 740, and the fourth photo object 740 corresponds to an upper layer object with respect to the second photo object 720.

In FIG. 9A, reference numeral 511 indicates that the input medium (e.g., the user's finger) 510 proximity-touches the first photo object 710 displayed on the touch screen 500 within the touch recognition effective distance of the touch screen 500.

In this way, while the first photo object 710 is displayed on the touch screen 500 as a lower layer object with respect to the second and third photo objects 720 and 730, when the input medium 510 proximity-touches the first photo object 710 for a predetermined time (e.g., one second) or longer, the controller 180 determines that a user command based on the proximity touch operation is input and, as shown in FIG. 9B, displays the first photo object 710 on top of the second and third photo objects 720 and 730.

Referring to FIG. 9B, as the first photo object 710 is displayed on top of the second and third photo objects 720 and 730, the first photo object 710 is changed to an upper layer object with respect to the second and third photo objects 720 and 730, and the second and third photo objects 720 and 730 are changed to lower layer objects with respect to the first photo object 710.
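The reordering shown in FIGS. 9A and 9B corresponds to a raise-to-top operation on the z-order, sketched here with an illustrative list representation (index 0 is bottom-most):

```python
# Sketch: proximity-touching a lower layer object for the predetermined
# time (e.g., one second) raises it to the top of the z-order, turning it
# into an upper layer object with respect to the others.

def raise_to_top(z_order, obj):
    """z_order: list, index 0 = bottom. Move obj to the end (top-most)."""
    z_order.remove(obj)
    z_order.append(obj)
    return z_order

z = ["photo_710", "photo_720", "photo_730"]   # 710 partly covered by 720, 730
raise_to_top(z, "photo_710")                  # after a >= 1 s proximity touch
```

After the operation the former upper layer objects 720 and 730 sit below 710, matching FIG. 9B.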

Referring to FIG. 9C, when the input medium 510 proximity-touches a predetermined position of the first photo object 710, which has been changed to an upper layer object, for a predetermined time (e.g., one second) or longer and then moves in the proximity-touch state in the direction of the arrow 520, the controller 180 identifies that the input medium 510 inputs the proximity position movement command according to the present invention and, as shown in FIG. 9D, controls the first photo object 710 to move on the touch screen 500 in correspondence with the movement of the input medium 510. That is, the first photo object 710 may be displayed to move on the touch screen 500 following the input medium 510 along the movement path of the input medium 510.

In addition, when the input medium 510 continues the proximity position movement while proximity-touching the first photo object 710 so that at least a certain area of the first photo object 710 is out of the visible area of the touch screen 500, the controller 180 removes the first photo object 710 from the touch screen 500. For example, when the central portion of the first photo object 710 is out of the visible area of the touch screen 500, the controller 180 may determine that a user command for removing the first photo object 710 has been input and remove the first photo object 710 from the touch screen 500. Likewise, when the input medium 510 is moved out of the visible area of the touch screen 500 while continuing the proximity position movement, the controller 180 may determine that a user command for removing the first photo object 710 has been input and remove the first photo object 710 from the touch screen 500.

In addition, as described above, when at least a certain area of the first photo object 710 is out of the visible area of the touch screen 500 and the first photo object 710 is removed from the touch screen 500, the controller 180 may, as shown in FIG. 9E, display on the touch screen 500 a user command input window 701 for receiving a user command regarding whether to delete the first photo object 710 from the storage unit 160 of the portable terminal 100. The user command input window 701 includes a first text object 702 for receiving a delete command from the user and a second text object 703 for receiving a command not to delete. The controller 180 identifies which of the first and second text objects 702 and 703 the input medium 510 directly touches, and accordingly deletes the first photo object 710 from the storage unit 160 or retains it in the storage unit 160 as it is.

That is, the user can directly confirm whether to delete the first photo object 710 by touching any one of the first and second text objects 702 and 703 using the input medium 510. Accordingly, it is possible to prevent the first photo object 710 from being deleted by an unintended touch operation of the user.

In FIG. 9E, the user command input window 701 is configured to receive a user command related to whether or not to delete the first photo object 710. However, the user command input window 701 may also be configured to receive a user command for copying or moving the first photo object 710 to another folder in the storage space.
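The confirmation step in FIG. 9E can be sketched as follows, modelling the storage unit 160 as a set and the two text objects as string commands (all names and the set model are illustrative):

```python
# Sketch: after the photo object is dragged off-screen, the user command
# input window asks whether to delete it from storage; touching the first
# text object (702) deletes it, touching the second (703) keeps it.

def confirm_and_delete(storage, photo, touched_text_object):
    """touched_text_object: 'delete' (first text object 702) or
    'keep' (second text object 703). Returns True if photo was retained."""
    if touched_text_object == "delete":
        storage.discard(photo)     # remove from the storage unit 160
    return photo in storage

storage = {"photo_710", "photo_720"}
kept = confirm_and_delete(storage, "photo_710", "delete")
```

Extending the window with copy or move commands, as the text suggests, would add further branches targeting another folder rather than deletion.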

FIGS. 10A to 10C are views illustrating a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to a fourth embodiment of the present invention.

FIG. 10A illustrates a state in which a plurality of photo objects 700 are displayed on the touch screen 500 of the portable terminal 100 in the form of thumbnails. The plurality of photo objects 700 overlap each other to form a multi-layer structure.

In FIG. 10A, the check mark (✓) 512 indicates that the input medium (e.g., the user's finger) has directly touched the fourth photo object 740 among the plurality of photo objects 700. The fourth photo object 740 corresponds to an upper layer object with respect to the second photo object 720. Conversely, the second photo object 720 corresponds to a lower layer object with respect to the fourth photo object 740.

When the input medium moves in the direction of the arrow 520 (i.e., the right direction) along the surface of the touch screen 500 while directly touching a predetermined position of the fourth photo object 740, the controller 180 identifies that the input medium inputs the direct touch position movement command according to the present invention and controls all of the plurality of photo objects 700 to move on the touch screen 500 in correspondence with the movement of the input medium 510.

Referring to FIG. 10B, as the input medium moves rightward along the surface of the touch screen 500, the seventh and eighth photo objects 770 and 780, which were located outside the right boundary of the touch screen 500, enter the visible area of the touch screen 500.

That is, when the input medium moves to the right while directly touching the touch screen 500, the plurality of photo objects 700 move to the left on the touch screen 500. Accordingly, the fifth and sixth photo objects 750 and 760 displayed on the left side of the visible area of the touch screen 500 disappear beyond the left boundary of the touch screen 500. On the other hand, the seventh and eighth photo objects 770 and 780 arranged outside the right boundary of the visible area of the touch screen 500 are displayed in the visible area of the touch screen 500.
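The scrolling behaviour in FIGS. 10A and 10B — every photo object shifts by the same offset and visibility is recomputed against the visible area — can be sketched as follows. The coordinates, object width, and screen width are invented for illustration.

```python
# Sketch: dragging the grid moves every photo object by the same offset;
# objects whose area no longer intersects the visible area disappear,
# while objects previously beyond the boundary come into view.

SCREEN_W = 320          # assumed visible width, in pixels
OBJ_W = 100             # assumed thumbnail width

def scroll(photos, dx):
    """photos: dict name -> x position of the object's left edge.
    Returns the set of names at least partly inside the visible area."""
    for name in photos:
        photos[name] += dx
    return {n for n, x in photos.items() if x + OBJ_W > 0 and x < SCREEN_W}

photos = {"photo_750": 0, "photo_710": 110, "photo_770": 330}
visible = scroll(photos, -120)   # objects shift left; 770 enters, 750 leaves
```

Here photo_750 leaves across the left boundary while photo_770, previously outside the right boundary, becomes visible, mirroring the text.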

Meanwhile, the direction in which the plurality of photo objects 700 move may be changed according to the direct touch position movement command of the input medium. For example, when the input medium 510 moves to the right of the touch screen 500, the controller 180 may instead control the plurality of photo objects 700 to follow the input medium and move in the right direction on the touch screen 500.

In FIG. 10C, a check mark (✓) 512 indicates that the input medium (e.g., the user's finger) has directly touched the first photo object 710.

When the input medium moves in the direction of the arrow 520 along the surface of the touch screen 500 while directly touching a predetermined position of the first photo object 710 corresponding to a lower layer object, the controller 180 likewise identifies that the input medium inputs the direct touch position movement command according to the present invention and controls all of the plurality of photo objects 700 to move on the touch screen 500 in correspondence with the movement of the input medium 510.

FIGS. 11A to 11C are views illustrating a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to a fifth embodiment of the present invention.

Referring to FIG. 11A, a predetermined menu screen 800 is displayed on the entire visible area of the touch screen 500 of the portable terminal 100, and a pop-up window 810 for sound setting is displayed superimposed on the upper portion of the menu screen 800. The menu screen 800 and the pop-up window 810 overlap each other to form a multi-layer structure.

Referring to FIG. 11A, the entire area of the pop-up window 810 is superimposed on the menu screen 800. Accordingly, the pop-up window 810 corresponds to an upper layer object of the menu screen 800, and the menu screen 800 corresponds to a lower layer object of the pop-up window 810.

In FIG. 11A, reference numeral 511 indicates that the input medium (e.g., the user's finger) 510 proximity-touches the pop-up window 810 displayed on the touch screen 500 within the touch recognition effective distance of the touch screen 500.

In this way, when the pop-up window 810 is displayed as an upper layer object on the menu screen 800 of the touch screen 500 and the input medium 510 proximity-touches the pop-up window 810 for a predetermined time or longer and then moves in the proximity-touch state, the controller 180 identifies that the input medium 510 inputs the proximity position movement command according to the present invention and controls the pop-up window 810 to move on the touch screen 500 in correspondence with the movement of the input medium 510, as shown in FIG. 11B. That is, the pop-up window 810 may be displayed to move on the touch screen 500 following the input medium 510 along the movement path of the input medium 510.

When the input medium 510 continues the proximity position movement while proximity-touching the pop-up window 810 so that at least a certain area of the pop-up window 810 is out of the visible area of the touch screen 500, the controller 180 removes the pop-up window 810 from the touch screen 500. For example, when the central portion of the pop-up window 810 is out of the visible area of the touch screen 500, the controller 180 may determine that a user command for removing the pop-up window 810 has been input and remove the pop-up window 810 from the touch screen 500. Likewise, when the input medium 510 is moved out of the visible area of the touch screen 500 while continuing the proximity position movement, the controller 180 may determine that a user command for removing the pop-up window 810 has been input and remove the pop-up window 810 from the touch screen 500.

FIGS. 12A and 12B are views illustrating a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to a sixth embodiment of the present invention.

Referring to FIG. 12A, a predetermined menu screen 800 is displayed on the entire visible area of the touch screen 500 of the portable terminal 100, and a pop-up window 810 for sound setting is displayed superimposed on the upper portion of the menu screen 800. The menu screen 800 and the pop-up window 810 overlap each other to form a multi-layer structure.

In FIG. 12A, the check mark (✓) 512 indicates that the input medium (e.g., the user's finger) has directly touched the pop-up window 810.

When the input medium moves in the direction of the arrow 520 along the surface of the touch screen 500 while directly touching a predetermined position of the pop-up window 810, the controller 180 identifies that the input medium inputs the direct touch position movement command according to the present invention and controls the pop-up window 810 to move on the touch screen 500 in correspondence with the movement of the input medium 510, as shown in FIG. 12B. That is, the pop-up window 810 may be displayed to move on the touch screen 500 following the input medium 510 along the movement path of the input medium 510.

Further, when the input medium 510 continues the direct touch position movement while directly touching the pop-up window 810 so that at least a certain area of the pop-up window 810 is out of the visible area of the touch screen 500, the controller 180 removes the pop-up window 810 from the touch screen 500. For example, when the central portion of the pop-up window 810 is out of the visible area of the touch screen 500, the controller 180 may determine that a user command for removing the pop-up window 810 has been input and remove the pop-up window 810 from the touch screen 500. Likewise, when the input medium 510 is moved out of the visible area of the touch screen 500 while continuing the direct touch position movement, the controller 180 may determine that a user command for removing the pop-up window 810 has been input and remove the pop-up window 810 from the touch screen 500.

FIGS. 13A to 13C are views illustrating a method of providing a graphical user interface using a portable terminal having a proximity touch recognition function according to a seventh embodiment of the present invention.

FIG. 13A is a diagram illustrating a graphical user interface provided on the touch screen in a proximity touch operation of the input medium according to the seventh embodiment of the present invention.

Referring to FIG. 13A, a game function (e.g., a chess game) is executed on the portable terminal 100, a background screen (e.g., a chessboard) 900 is displayed on the touch screen 500, and a game character (e.g., a chess piece) 910 is displayed superimposed on the background screen 900. The background screen 900 and the game character 910 overlap each other to form a multi-layer structure. Here, the background screen 900 corresponds to a lower layer object, and the game character 910 corresponds to an upper layer object.

FIG. 13A illustrates a chess game as an example of the game function. However, the background screen 900 may be composed of various indoor and outdoor spaces or objects, and the game character 910 may be any object, such as a person, an animal, or a thing, that can be moved on the background screen 900 according to certain rules or story contents.

In FIG. 13A, reference numeral 511 indicates that the input medium (e.g., the user's finger) 510 proximity-touches the game character 910 displayed on the touch screen 500 within the touch recognition effective distance of the touch screen 500.

In this way, when the background screen 900 and the game character 910 are displayed on the touch screen 500 and the input medium (e.g., the user's finger) 510 proximity-touches the game character 910 for a predetermined time or longer, the controller 180 determines that a user command based on the proximity touch operation is input.

When the input medium 510 moves in the proximity-touch state in the direction of the arrow 520 while proximity-touching a predetermined position of the game character 910, the controller 180 identifies that the input medium 510 inputs the proximity position movement command and, as shown in FIG. 13B, keeps the background screen 900 fixed on the touch screen 500 while controlling the game character 910 to move on the touch screen 500 in correspondence with the movement of the input medium 510. That is, the game character 910 may be displayed to move on the touch screen 500 following the input medium 510 along the movement path of the input medium 510.

Further, when the input medium 510 continues the proximity position movement while proximity-touching the game character 910 so that at least a certain area of the game character 910 is out of the visible area of the touch screen 500, the controller 180 removes the game character 910 from the touch screen 500.

As described above, according to the present invention, while the game function is being executed in the portable terminal 100, the game character 910 displayed on the touch screen 500 can be moved on the background screen 900 according to the proximity touch operation of the input medium 510.

As a result, according to the present invention, a user command specific to the proximity touch operation can be input, thereby providing various game operation functions and a more diverse graphical user interface environment.

FIGS. 14A to 14C are views illustrating a method for providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to an eighth embodiment of the present invention.

FIG. 14A shows a state in which the game function is executed in the portable terminal 100, the background screen 900 is displayed on the touch screen 500, and the game character 910 is superimposed on the background screen 900.

In FIG. 14A, the check mark 512 indicates that the input medium (e.g., the user's finger) has directly touched the game character 910.

When the input medium 510 moves in the direction of the arrow 520 along the surface of the touch screen 500 while directly touching a predetermined position of the game character 910, the controller 180 controls both the game character 910 and the background screen 900 to move on the touch screen 500 in response to the movement of the input medium 510, as shown in FIG. 14B. For example, when the input medium 510 moves toward the lower right of the touch screen 500 as indicated by the arrow 520, the controller 180 may control both the background screen 900 and the game character 910 to move toward the lower right on the touch screen 500.

Meanwhile, the direction in which the background screen 900 and the game character 910 move may be changed according to the movement of the direct touch position of the input medium 510. For example, when the input medium 510 moves toward the upper right of the touch screen 500, the controller 180 may control the background screen 900 and the game character 910 to move toward the upper right on the touch screen 500, following the input medium 510.

In FIG. 14C, the check mark 512 indicates that the input medium (e.g., the user's finger) has directly touched the background screen 900.

Likewise, when the input medium 510 moves in the direction of the arrow 520 along the surface of the touch screen 500 while directly touching a predetermined position of the background screen 900, the controller 180 controls both the background screen 900 and the game character 910 to move on the touch screen 500 in correspondence with the movement of the input medium 510, as shown in FIG. 14B.
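The difference between the seventh embodiment (a proximity touch moves only the touched object) and this eighth embodiment (a direct touch moves all layers together) reduces to a single layer-selection rule. The sketch below is illustrative only; the function signature and the dictionary-based layer representation are hypothetical, not part of the patent.

```python
def apply_drag(layers, touched_index, dx, dy, direct):
    """Apply a drag of (dx, dy) to a stack of layers.

    layers: list of dicts with 'x'/'y' offsets, ordered bottom to top
            (e.g., [background_screen, game_character]).
    direct: True for a direct touch, False for a proximity touch.
    """
    if direct:
        # Direct touch: background and character scroll together,
        # following the movement of the input medium (FIGS. 14A-14C).
        targets = layers
    else:
        # Proximity touch: only the touched object moves; the other
        # layers stay fixed on the touch screen (FIGS. 13A-13B).
        targets = [layers[touched_index]]
    for layer in targets:
        layer["x"] += dx
        layer["y"] += dy
    return layers
```

For instance, a proximity drag on the top layer leaves the background offset unchanged, while the same drag delivered as a direct touch shifts every layer by the same amount.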

The portable terminal having the above-described proximity touch recognition function and the method of providing a graphical user interface using the same are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

FIG. 2 is a perspective view of an example of a portable terminal according to the present invention.

FIG. 3 is a rear perspective view of the portable terminal of FIG. 2.

FIG. 4 illustrates a structure of a touch screen associated with the present invention.

FIG. 5 is a view for explaining the principle of detecting a proximity distance of an input medium using the touch screen of FIG. 4.

FIG. 6 is a view for explaining the principle of position detection of an input medium using the touch screen of FIG. 4.

FIGS. 7A to 7C are views illustrating a method of providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a first embodiment of the present invention.

FIGS. 8A to 8C illustrate a method of providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a second embodiment of the present invention.

FIGS. 9A to 9E illustrate a method of providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a third embodiment of the present invention.

FIGS. 10A to 10C illustrate a method for providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a fourth embodiment of the present invention.

FIGS. 11A to 11C illustrate a method of providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a fifth embodiment of the present invention.

FIGS. 12A to 12B illustrate a method of providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a sixth embodiment of the present invention.

FIGS. 13A to 13C illustrate a method for providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to a seventh embodiment of the present invention.

FIGS. 14A to 14C illustrate a method for providing a graphical user interface using a mobile terminal having a proximity touch recognition function according to an eighth embodiment of the present invention.

Claims (11)

  1. A portable terminal comprising:
    a touch screen configured to display a plurality of objects disposed on different layers; and
    a controller configured to receive a proximity touch or a direct touch from an input medium with respect to any one of the plurality of objects, to move and display the one object on the touch screen when the proximity touch is received for the one object for a predetermined time or longer and the input medium moves while maintaining the proximity touch, and to move and display the plurality of objects on the touch screen when the direct touch is received for the one object and the input medium moves while maintaining the direct touch.
  2. delete
  3. The portable terminal of claim 1, wherein the controller removes the one object from the touch screen when the input medium moves out of the visible area of the touch screen while maintaining the proximity touch.
  4. The portable terminal of claim 1, wherein the controller displays an input window for receiving a user command on whether to remove the one object from the touch screen when the input medium moves out of the visible area of the touch screen while maintaining the proximity touch.
  5. The portable terminal of claim 1, wherein the controller changes the one object to an object of an upper layer when the proximity touch is received for the one object for a predetermined time or longer.
  6. delete
  7. A method of providing a graphical user interface using a portable terminal, the method comprising:
    displaying a plurality of objects disposed on different layers on a touch screen;
    receiving a proximity touch or a direct touch from an input medium with respect to any one of the plurality of objects;
    moving and displaying the one object on the touch screen when the proximity touch is received for a predetermined time or longer and the input medium moves while maintaining the proximity touch; and
    moving and displaying the plurality of objects on the touch screen when the direct touch is received for the one object and the input medium moves while maintaining the direct touch.
  8. delete
  9. The method of claim 7, further comprising removing the one object from the touch screen when the input medium moves out of the visible area of the touch screen while maintaining the proximity touch.
  10. The method of claim 7, further comprising displaying an input window for receiving a user command on whether to remove the one object from the touch screen when the input medium moves out of the visible area of the touch screen while maintaining the proximity touch.
  11. The method of claim 7, wherein moving and displaying the one object on the touch screen comprises changing the one object to an object of an upper layer.
KR1020080030446A 2008-04-01 2008-04-01 Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same KR101469280B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080030446A KR101469280B1 (en) 2008-04-01 2008-04-01 Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020080030446A KR101469280B1 (en) 2008-04-01 2008-04-01 Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same

Publications (2)

Publication Number Publication Date
KR20090105160A KR20090105160A (en) 2009-10-07
KR101469280B1 true KR101469280B1 (en) 2014-12-04

Family

ID=41534892

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080030446A KR101469280B1 (en) 2008-04-01 2008-04-01 Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same

Country Status (1)

Country Link
KR (1) KR101469280B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101033316B1 (en) * 2009-11-19 2011-05-09 주식회사 티엘아이 Capacitive touch detect system
KR101646616B1 (en) 2010-11-30 2016-08-12 삼성전자주식회사 Apparatus and Method for Controlling Object
KR101763263B1 (en) 2010-12-24 2017-07-31 삼성전자주식회사 3d display terminal apparatus and operating method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060058732A (en) * 1998-01-26 2006-05-30 웨인 웨스터만 Method and apparatus for integrating manual input
KR20070036077A (en) * 2004-06-29 2007-04-02 코닌클리케 필립스 일렉트로닉스 엔.브이. Multi-layered display of a graphical user interface
KR20070044758A (en) * 2005-10-25 2007-04-30 엘지전자 주식회사 Mobile communication terminal having a touch panel and touch key pad and controlling method thereof
KR20070113022A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen responds to user input

Also Published As

Publication number Publication date
KR20090105160A (en) 2009-10-07

Similar Documents

Publication Publication Date Title
EP2192474B1 (en) Method for operating execution icon of mobile terminal
US9122392B2 (en) Mobile terminal, display device and controlling method thereof
KR101611302B1 (en) Mobile terminal capable of receiving gesture input and control method thereof
KR101482115B1 (en) Controlling a Mobile Terminal with a Gyro-Sensor
US9030418B2 (en) Mobile terminal capable of sensing proximity touch
KR101729523B1 (en) Mobile terminal and operation control method thereof
US10042534B2 (en) Mobile terminal and method to change display screen
KR101888457B1 (en) Apparatus having a touch screen processing plurality of apllications and method for controlling thereof
US8373666B2 (en) Mobile terminal using proximity sensor and control method thereof
KR101646254B1 (en) Method for removing icon in mobile terminal and mobile terminal using the same
KR102040611B1 (en) Mobile terminal and controlling method thereof
EP2360665A2 (en) Mobile terminal and control method thereof
KR20100000744A (en) Mobile terminal capable of previewing different channel
KR101549558B1 (en) Mobile terminal and control method thereof
KR20110130603A (en) Electronic device and method of controlling the same
KR20090024006A (en) Mobile terminal and it's touch recognition method
US8279174B2 (en) Display device and method of controlling the display device
RU2402179C2 (en) Device of mobile communication equipped with sensor screen and method of its control
KR20100131605A (en) The method for executing menu and mobile terminal using the same
KR101617461B1 (en) Method for outputting tts voice data in mobile terminal and mobile terminal thereof
CN101655769B (en) Portable terminal and driving method thereof
KR101549557B1 (en) Mobile terminal and control method thereof
US20110312387A1 (en) Mobile terminal and method of controlling the same
KR20100005440A (en) Controlling a mobile terminal with a gyro-sensor
KR20100003587A (en) Controlling a mobile terminal

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20171024

Year of fee payment: 4

LAPS Lapse due to unpaid annual fee