KR20110090447A - Method for providing user interface according to holding type of terminal and apparatus - Google Patents

Method for providing user interface according to holding type of terminal and apparatus

Info

Publication number
KR20110090447A
Authority
KR
South Korea
Prior art keywords
terminal
contact
user interface
gripping form
providing
Prior art date
Application number
KR1020100010220A
Other languages
Korean (ko)
Inventor
홍상우
Original Assignee
에스케이텔레콤 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스케이텔레콤 주식회사
Priority to KR1020100010220A
Publication of KR20110090447A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PURPOSE: A method and apparatus are provided for supplying a user interface according to the holding type of a terminal, so that menu icons can be arranged by grip type based on a user's contact. CONSTITUTION: A sensor unit (14) senses the user's contact. A display unit (13) displays menu icons according to the grip type. A controller (17) checks how long the contact is sensed, determines the grip type if the contact is sensed for longer than a predetermined time, and arranges function keys according to the determined grip type.

Description

Method and device for providing user interface according to the gripping form of terminal {Method for providing user interface according to holding type of terminal and apparatus}

The present invention relates to a method and apparatus for providing a user interface of a terminal, and more particularly to a method and apparatus for providing a user interface according to the gripping form of a terminal, in which a user's contact is detected in a terminal equipped with a touch sensor or proximity sensor, the grip type is classified according to the detected contact, and the function keys and menu icons of the terminal are arranged according to that grip type.

Recently, with the development and spread of terminal-related technology, the terminal has become a personal necessity. With the proliferation of terminals, operators have provided various services to meet users' needs. A terminal user selects functions such as making a phone call or sending and receiving text messages using the buttons installed on one side of the device, and enters Korean, English, and numerals to execute those functions. In addition, terminals have evolved to integrate the functions of other electronic devices beyond calling and text messaging. For example, a terminal may include a great many functions, such as MP3 music playback, the video recording and playback functions of a digital camera, an electronic dictionary function, or a digital TV function.

However, the increase in terminal functions is accompanied by an increase in the devices needed to control those functions, and this multi-functionalization is a cause of the complexity and high price of portable terminals. Because the terminal cannot grow beyond a certain size, it needs a structure that lets the user access functions quickly and easily.

An object of the present invention, which addresses the conventional problems, is to provide a method and apparatus for providing a user interface according to the gripping form of a terminal, in which the user's touch is sensed through a touch sensor or proximity sensor and the function keys are arranged according to the grip type identified from that contact.

In addition, an object of the present invention is to provide a method and apparatus for providing a user interface according to the gripping form of a terminal, in which menu icons are arranged according to the grip type identified from the user's contact.

To achieve the above object, a terminal for providing a user interface according to the gripping form of the present invention includes a sensor unit for detecting the user's contact, a display unit for displaying menu icons according to the grip type identified from the contact, and a controller configured to check the time for which the contact is detected, classify the grip type according to the contact, and arrange the function keys according to the classified grip type when the contact is detected for more than a preset time.

In addition, the sensor unit according to the present invention is disposed on both sides of the terminal and is composed of a plurality of proximity sensors or pressure sensors.

The apparatus according to the present invention may further include a storage unit that stores the menu icons displayed according to the user's grip type.

The apparatus according to the present invention may further include an input unit including function keys whose arrangement changes according to the contact.

In addition, the function key according to the present invention includes at least one of a volume key, a camera key, and a shortcut key.

To achieve the above object, a method for providing a user interface according to the gripping form of a terminal of the present invention includes the steps of: the terminal detecting the user's contact; the terminal checking the time for which the contact is detected; and, when the contact is detected for a predetermined time or more, the terminal classifying the grip type according to the contact and arranging the function keys according to the classified grip type.

In addition, in the detecting step according to the present invention, the terminal detects the contact according to an infrared change or a pressure change.

In addition, the classifying step according to the present invention includes the terminal identifying which of its two sides the contact occurs on, and the terminal identifying the gripping form corresponding to the identified side.

In addition, the checking step according to the present invention further includes the terminal waiting to set the function keys if the contact is detected for less than the predetermined time.

In addition, the arranging step according to the present invention includes rearranging, by the terminal, the keys by mapping a specific function to a function key located on the left side of the terminal when the gripping form is a right hand grip.

In addition, the arranging step according to the present invention includes rearranging, by the terminal, the keys by mapping a specific function to a function key located on the right side of the terminal when the gripping form is a left hand grip.

The arranging according to the present invention may further include rearranging the menu icons by the terminal according to the gripping form.

In addition, the rearranging according to the present invention may further include arranging the menu icons from the right side to the left side when the gripping form is a right hand grip.

In addition, the rearranging according to the present invention may further include arranging the menu icons from the left side to the right side when the gripping form is a left hand grip.

According to the present invention, the terminal is provided with a touch sensor or a proximity sensor, detects the user's contact, and arranges the function keys according to the grip type identified from that contact, so that the user can easily select a function key in a position appropriate to how the terminal is being held.

In addition, the user may be provided with a screen on which menu icons are arranged according to the gripping form of the terminal.

FIG. 1 is a block diagram illustrating the configuration of a terminal that provides a user interface according to a user's grip form according to an embodiment of the present invention.
FIG. 2 is a diagram describing the input unit of FIG. 1 in detail.
FIG. 3 is a flowchart illustrating a method of providing a user interface according to the gripping form of a terminal according to an embodiment of the present invention.
FIGS. 4 to 9 are screen examples for explaining a user interface according to the gripping form of a terminal in an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. However, in the following description and the accompanying drawings, detailed descriptions of well-known functions or configurations that may obscure the subject matter of the present invention will be omitted. In addition, it should be noted that like elements are denoted by the same reference numerals as much as possible throughout the drawings.

The terms and words used in the following specification and claims should not be construed as limited to their ordinary or dictionary meanings; rather, on the principle that an inventor may appropriately define terms in order to explain the invention in the best way, they should be interpreted with meanings and concepts that accord with the technical spirit of the present invention. Therefore, the embodiments described in this specification and the configurations shown in the drawings are only the most preferred embodiments of the present invention and do not represent all of its technical ideas, and it should be understood that various equivalents and variations could have been substituted for them at the time of this application.

Before the description, note that the terminal according to the embodiment of the present invention is any terminal capable of distinguishing a user's gripping form by means of a touch sensor or proximity sensor, and the invention may be applied not only to terminals using a mobile communication network but also to various other terminals such as general wired terminals, fixed terminals, and IP (Internet Protocol) terminals.

FIG. 1 is a block diagram illustrating the configuration of a terminal that provides a user interface according to a user's grip form according to an embodiment of the present invention.

Referring to FIG. 1, in an embodiment of the present invention, a terminal 10 providing a user interface according to a user's gripping form may include a communication unit 11, an input unit 12, a display unit 13, a sensor unit 14, a storage unit 15, an audio processor 16, and a controller 17.

The communication unit 11 performs the communication of the terminal 10; for example, it performs data communication including voice and video calls. To this end, the communication unit 11 includes an RF transmitter for up-converting and amplifying the frequency of a transmitted signal and an RF receiver for low-noise-amplifying and down-converting a received signal. In particular, the communication unit 11 according to an exemplary embodiment of the present invention may execute a call function in response to the input of a number key among the menu icons displayed on the screen according to the gripping form of the terminal 10.

The input unit 12 receives number or character information and includes a plurality of input keys and function keys for setting various functions. The input unit 12 may include a volume key, a direction key, a side key, and shortcut keys set for specific functions. In addition, the input unit 12 generates key signals related to user settings and function control of the terminal 10 and transmits them to the controller 17. In particular, the input unit 12 according to an embodiment of the present invention may include function keys located on both sides of the terminal 10 for executing specific functions. These function keys may be set or released according to the gripping form of the terminal 10.

The display unit 13 may be a liquid crystal display (LCD), an organic light emitting diode (OLED) display, or the like; when an LCD is used, the display unit 13 may include an LCD controller (not shown), a memory capable of storing LCD data, an LCD display element, and so on. When a display device such as an LCD or OLED is implemented as a touch screen, the display unit 13 may also operate as an input unit. In particular, in the embodiment of the present invention, the display unit 13 rearranges and displays the menu icons according to the gripping form of the terminal 10.

The sensor unit 14 may emit light, detect a physical signal such as contact from the user or an external object, and transmit the detected signal to a signal processor (not shown). The sensor unit 14 is disposed on both sides of the terminal 10 and may be composed of a plurality of proximity sensors or pressure sensors. For proximity sensing, the sensor unit 14 may use an infrared sensor that detects an external object approaching the detection area by means of infrared rays. Such sensors are classified into an active type, which radiates infrared light and detects the resulting change, and a passive type, which has no light emitting part of its own and detects only changes in the infrared light received from the outside. In particular, when the sensor unit 14 is of the active type, it may be composed of a light emitting unit (not shown) including a light emitting diode (LED) that emits infrared rays (IR) and a light receiving unit (not shown) made of a detector, such as a diode or transistor (TR), capable of sensing the reflected light.

The light emitting unit emits light toward an arbitrary external object in response to a signal from the controller 17, and the light receiving unit detects the light reflected from that object through the detector. In detail, the light emitting unit emits a predetermined amount of infrared light, and the light receiving unit detects a change in voltage corresponding to the amount of infrared light reflected by the object. In particular, the sensor unit 14 in the embodiment of the present invention consists of a plurality of sensors on both sides of the terminal 10 and detects the grip type of the terminal 10 using proximity sensors that respond to changes in infrared light or pressure sensors that respond to changes in pressure. The sensor unit 14 may also detect a physical or chemical quantity, such as the distribution of temperature or heat or the intensity of a spectrum, and convert it into an electrical quantity suitable for signal processing. The sensor unit 14 transmits a data signal to the controller 17 when the user's contact lasts for a predetermined time or more. To this end, the sensor unit 14 may detect the infrared light entering the light receiving unit using a comparator, or detect it through an analog-to-digital converter (ADC) of a microcontroller. Alternatively, the sensor unit 14 may be configured with pressure sensors to detect contact with the terminal; such pressure sensors include piezoelectric elements, capacitive sensors, and strain gauge sensors.
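
The detection just described lends itself to a small illustration. The following is a minimal, platform-neutral Kotlin sketch of thresholding one reading from the light receiving unit to decide whether contact has occurred; the ADC scale and the threshold value are assumptions for illustration, not values given in the patent, and a comparator in hardware could make the same decision.

```kotlin
// Sketch: decide "contact" from one raw reading of the light receiving unit.
// The 0..1023 ADC scale and the threshold are illustrative assumptions.
const val IR_CONTACT_THRESHOLD = 512  // reflected-IR level above which contact is assumed

/** Returns true when the reflected-IR reading suggests the user's hand is on the sensor. */
fun isContact(adcReading: Int): Boolean = adcReading >= IR_CONTACT_THRESHOLD

fun main() {
    println(isContact(120))  // weak reflection: no hand near the side -> false
    println(isContact(860))  // strong reflection: hand gripping the side -> true
}
```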

The sensor unit 14 may be installed adjacent to the input unit 12 or attached to the input unit 12 in a film form so as to sense a user's contact.

The storage unit 15 stores an application program for distinguishing a gripping form of the terminal 10 as well as an application program required for the function operation of the terminal 10 according to an exemplary embodiment of the present invention. The storage unit 15 may largely include a program area and a data area.

The program area stores an operating system (OS) for booting the terminal 10 and an application program for operating the sensor unit 14 that detects a user's contact. When the terminal 10 activates each function in response to a user's request, it provides that function by using the corresponding application program under the control of the controller 17. In particular, the program area according to an embodiment of the present invention stores a program for determining how long a user's touch has been detected and programs for rearranging the function keys and menu icons according to the gripping form of the terminal 10.

The data area is an area where data generated according to use of the terminal 10 is stored. In particular, the data area according to an embodiment of the present invention stores data for classifying a grip type according to a time when a contact of the terminal 10 is detected, and a menu icon displayed according to the grip type of the terminal 10.

The audio processor 16 includes a speaker SPK for reproducing audio data transmitted and received during the call function of the terminal 10, and a microphone MIC for collecting a user's voice or other audio signal during the call. Here, the audio processor 16 may further include a speaker for outputting audio data received during a video call separately from the speaker installed for the voice call.

The controller 17 initializes each component of the terminal 10 and performs the necessary signal control. In particular, the controller 17 according to an embodiment of the present invention executes a mode that provides a user interface by classifying the grip type according to the user's contact. In this mode, the controller 17 may activate the function of the sensor unit 14 to determine whether contact with the terminal 10 is detected.

When contact is detected through the sensor unit 14 of the terminal 10, the controller 17 checks the time for which the contact is detected. The controller 17 classifies the grip type when the contact is detected for more than a predetermined time; that is, it classifies the user's grip type according to the contact detected by the sensor unit 14. In detail, the controller 17 may identify which of the two sides of the terminal 10 the contact occurs on and classify the gripping form corresponding to the identified side.
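
As a rough illustration of this classification step, the Kotlin sketch below maps the contacted side and the contact duration to a grip type. The Side/Grip names, the 300 ms hold threshold, and the particular side-to-hand mapping are assumptions for illustration; the patent only states that the contacted side is identified and matched to a gripping form after a preset time.

```kotlin
// Sketch: classify the grip from the contacted side, accepted only after a preset hold time.
enum class Side { LEFT, RIGHT }
enum class Grip { LEFT_HAND, RIGHT_HAND, UNDECIDED }

const val HOLD_THRESHOLD_MS = 300L  // assumed preset time used to filter accidental touches

fun classifyGrip(contactedSide: Side?, contactDurationMs: Long): Grip {
    // Too-short or absent contact leaves the terminal waiting instead of reconfiguring keys.
    if (contactedSide == null || contactDurationMs < HOLD_THRESHOLD_MS) return Grip.UNDECIDED
    return when (contactedSide) {
        Side.RIGHT -> Grip.RIGHT_HAND  // assumed mapping of contacted side to handedness
        Side.LEFT -> Grip.LEFT_HAND
    }
}
```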

The controller 17 may reset the function key arrangement of the terminal 10 according to the classified gripping form. The function keys may include keys such as a volume key, a camera key, or a shortcut key located on the side of the terminal. Here, the controller 17 maps specific functions to the keys located on the side determined by the gripping form of the terminal 10. For example, the controller 17 rearranges the function keys on the left side of the terminal 10 when the gripping form is a right hand grip, and rearranges the function keys on the right side of the terminal 10 when the gripping form is a left hand grip.
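
Continuing the same sketch, the function-key rearrangement can be pictured as mapping a chosen function onto the keys of one side, following the patent's example (right hand grip to the left-side keys, left hand grip to the right-side keys). The FunctionKey type and any key identifiers used with it are hypothetical.

```kotlin
// Sketch: map a function (e.g. volume) onto the side chosen by the classified grip.
data class FunctionKey(val id: String, val side: Side)

fun arrangeFunctionKeys(grip: Grip, keys: List<FunctionKey>, function: String): Map<String, String> {
    val targetSide = when (grip) {
        Grip.RIGHT_HAND -> Side.LEFT         // per the example above: right hand grip -> left-side keys
        Grip.LEFT_HAND -> Side.RIGHT         // left hand grip -> right-side keys
        Grip.UNDECIDED -> return emptyMap()  // setting is deferred until a grip is recognized
    }
    return keys.filter { it.side == targetSide }.associate { it.id to function }
}
```

For instance, calling this with Grip.RIGHT_HAND and a key list containing one left-side key and one right-side key would assign the chosen function to the left-side key only.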

The controller 17 may also rearrange the menu icons displayed on the screen according to the gripping form of the terminal 10. For example, the controller 17 may arrange the menu icons from the right side toward the left side of the screen when the grip form of the terminal 10 is a right hand grip, and from the left side toward the right side of the screen when it is a left hand grip.
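
The icon rearrangement can be sketched in the same way: for a right hand grip each row of the menu grid fills from the right edge, otherwise from the left. The grid width and the idea of representing icons as a flat list are illustrative assumptions.

```kotlin
// Sketch: order menu icons row by row according to the classified grip.
fun arrangeIconRows(grip: Grip, icons: List<String>, columns: Int): List<List<String>> {
    val rows = icons.chunked(columns)                  // split the flat icon list into grid rows
    return when (grip) {
        Grip.RIGHT_HAND -> rows.map { it.reversed() }  // fill each row from right to left
        else -> rows                                   // left-to-right for a left hand grip (or default)
    }
}
```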

FIG. 2 is a diagram describing the input unit of FIG. 1 in detail.

Referring to FIGS. 1 and 2, the terminal 10 providing a user interface according to a user's gripping form may include the input unit 12 and the display unit 13. The input unit 12 may include a plurality of function keys 14a to 14e, which may be matched with the sensor units 14 on both sides of the terminal 10. Through this, the terminal 10 can detect the user's grip form and arrange the function keys according to the detected grip form. That is, when contact by the user of the terminal 10 is detected for a predetermined time or more, the function keys are activated according to the gripping form to provide the user interface.

FIG. 3 is a flowchart illustrating a method of providing a user interface according to the gripping form of a terminal according to an embodiment of the present invention.

Referring to FIGS. 1 to 3, in the method of providing a user interface according to the gripping form of the terminal 10, the controller 17 first executes a grip type classification mode according to a user's request in step S21. The grip type classification mode is a mode in which the function keys and menu icons can be arranged according to the grip type of the terminal 10. This mode may be set to run at the user's request or to run automatically.

When the grip type classification mode is executed, the controller 17 activates the sensor unit 14 in step S23. The controller 17 can then detect an infrared change or a pressure change at the sensor unit 14 caused by the user's contact in order to distinguish the gripping form of the terminal 10. As described above, the sensor unit 14 may be located on both sides of the terminal 10 and may be composed of a plurality of proximity sensors or pressure sensors.

When the sensor unit 14 is activated, the controller 17 determines whether contact is detected in step S25. If a touch is detected, the terminal 10 checks the time for which the touch is detected in step S27. In step S29, the controller 17 uses this time to determine whether the user's contact has been detected for a predetermined time or more; that is, it checks the contact duration against a threshold set to prevent malfunctions of the terminal 10. If no contact is detected, the terminal 10 keeps the sensor unit 14 active in step S23 and continues checking for contact.

If contact is detected for a predetermined time or more, the terminal 10 distinguishes the grip type according to the contact in step S31. Otherwise, if the contact is detected for less than the preset time, the terminal 10 waits before setting the function keys and continues checking how long the contact is detected.

Once the grip type has been classified, the terminal 10 arranges the function keys according to the classified grip type in step S33. For example, when the classified grip type is a right hand grip, the terminal 10 may map a specific function to a function key located on the left side of the terminal 10; when it is a left hand grip, the terminal 10 may map the function to a function key located on the right side. Here, the function key may be one of a volume key, a camera key, and a shortcut key of the terminal 10.

In step S35, the terminal 10 may rearrange the menu icons on the screen according to the classified grip type. For example, the terminal 10 may arrange the menu icons from the right side toward the left side when the classified grip form is a right hand grip, and from the left side toward the right side when it is a left hand grip.

The order of arranging the function keys and rearranging the menu icons according to the gripping form of the terminal 10 is not limited to the above. That is, once the grip type has been classified from the user's contact, the terminal 10 may rearrange the menu icons before arranging the function keys, or may arrange the menu icons and function keys at the same time.
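
Putting the pieces together, the sketch below strings the flowchart steps into one routine, reusing classifyGrip, arrangeFunctionKeys, and arrangeIconRows from the earlier sketches (assumed to live in the same Kotlin file). The "volume" function, the key list, and the 3-column grid are assumptions for illustration only.

```kotlin
// Sketch of the FIG. 3 flow: classify the grip (S25-S31), then arrange keys (S33) and icons (S35).
fun runGripClassificationMode(
    contactedSide: Side?,        // side reported by the sensor unit
    contactDurationMs: Long,     // how long the contact has lasted
    keys: List<FunctionKey>,
    icons: List<String>
): Pair<Map<String, String>, List<List<String>>>? {
    val grip = classifyGrip(contactedSide, contactDurationMs)
    if (grip == Grip.UNDECIDED) return null                    // contact too short: keep waiting
    val keyMapping = arrangeFunctionKeys(grip, keys, "volume") // S33
    val iconRows = arrangeIconRows(grip, icons, columns = 3)   // S35
    return keyMapping to iconRows
}
```

A caller would poll the sensor unit, invoke this routine once the contact duration is known, and then apply the returned key mapping and icon layout to the input unit and the display unit; as noted above, whether keys or icons are updated first is immaterial.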

FIGS. 4 to 9 are screen examples for explaining a user interface according to the gripping form of a terminal in an embodiment of the present invention.

Referring to FIGS. 1 to 9, the method of providing a user interface according to the gripping form of the terminal 10 will be described through an embodiment. First, when the grip type classification mode is executed, the terminal 10 activates the sensor unit 14.

When the sensor unit 14 is activated, the terminal 10 determines whether a user's contact is detected. If a touch is detected, the terminal 10 checks the time for which the touch is detected. When the contact is detected for more than a predetermined time, the terminal 10 distinguishes the grip type according to the contact, and once the grip type has been classified, it arranges the function keys accordingly.

For example, as illustrated in FIG. 4, the terminal 10 may identify through the sensor unit 14 that the user's grip form is a left hand grip. In this case, as shown in FIG. 5, the terminal 10 sets specific functions by mapping them to the function keys 14a and 14b located on the right side of the terminal 10 according to the classified gripping form. Here, the terminal 10 may map the volume keys among the function keys.

In addition, as illustrated in FIG. 6, the terminal 10 may identify through the sensor unit 14 that the user's grip form is a right hand grip. In this case, as illustrated in FIG. 7, the terminal 10 sets specific functions by mapping them to the function keys 14e and 14f located on the left side of the terminal 10 according to the classified gripping form. Here, the terminal 10 may map the volume keys among the function keys, or may instead map a camera key or a shortcut key of the terminal 10 as a function key.

In addition, the terminal 10 may rearrange the menu icons on the screen according to the classified gripping form. For example, as illustrated in FIG. 8, when the classified grip form is a left hand grip, the terminal 10 may arrange the numeric keys of the menu icons from the left side toward the right side. As illustrated in FIG. 9, the terminal 10 may arrange the numeric keys of the menu icons from the right side toward the left side when the classified grip form is a right hand grip.

The embodiments of the present invention disclosed in the specification and drawings are merely specific examples presented for clarity of explanation and are not intended to limit the scope of the present invention. It is apparent to those skilled in the art that other modifications based on the technical idea of the present invention can be carried out in addition to the embodiments disclosed herein.

The present invention can be applied to various types of user devices such as mobile communication terminals, PMPs, PDAs, notebook computers, and MP3 players. By detecting a user's contact in a terminal equipped with a touch sensor or a proximity sensor, classifying the gripping type according to the detected contact, and arranging the function keys and menu icons of the terminal according to the classified gripping type, the invention allows the user to operate the terminal conveniently in whichever hand it is held, thereby increasing the utilization of the terminal.

10: terminal 11: communication unit
12: input unit 13: display unit
14: sensor unit 15: storage unit
16: audio processor 17: control unit

Claims (14)

A terminal for providing a user interface according to a gripping form, comprising:
a sensor unit that detects a user's contact;
a display unit that displays menu icons according to the gripping form classified from the contact; and
a controller configured to check the time for which the contact is detected, classify the gripping form according to the contact, and arrange function keys according to the classified gripping form when the contact is detected for more than a preset time.
The terminal of claim 1, wherein the sensor unit is disposed on both sides of the terminal and is composed of a plurality of proximity sensors or pressure sensors.
The terminal of claim 1, further comprising:
a storage unit that stores the menu icons displayed according to the user's grip type.
The terminal of claim 1, further comprising:
an input unit including function keys whose arrangement changes according to the contact.
The terminal of claim 1, wherein the function keys comprise at least one of a volume key, a camera key, and a shortcut key.
A method for providing a user interface according to the gripping form of a terminal, the method comprising:
detecting, by the terminal, the user's contact;
checking, by the terminal, the time for which the contact is detected;
classifying, by the terminal, the gripping form according to the contact when the contact is detected for more than a predetermined time; and
arranging, by the terminal, function keys according to the classified gripping form.
The method of claim 6, wherein in the detecting step, the terminal detects the contact according to an infrared change or a pressure change.
The method of claim 6, wherein the classifying step comprises:
identifying, by the terminal, which of the two sides of the terminal the contact occurs on; and
identifying, by the terminal, the gripping form corresponding to the identified side.
The method of claim 6, wherein the checking step further comprises:
waiting, by the terminal, to set the function keys when the contact is detected for less than the preset time.
The method of claim 6, wherein the arranging step comprises:
rearranging, by the terminal, the keys by mapping a specific function to a function key located on the left side of the terminal when the gripping form is a right hand grip.
The method of claim 6, wherein the arranging step comprises:
rearranging, by the terminal, the keys by mapping a specific function to a function key located on the right side of the terminal when the gripping form is a left hand grip.
The method of claim 6, wherein the arranging step further comprises:
rearranging, by the terminal, the menu icons according to the gripping form.
The method of claim 12, wherein the rearranging step further comprises:
arranging, by the terminal, the menu icons from right to left when the gripping form is a right hand grip.
The method of claim 12, wherein the rearranging step further comprises:
arranging, by the terminal, the menu icons from left to right when the gripping form is a left hand grip.
KR1020100010220A 2010-02-04 2010-02-04 Method for providing user interface according to holding type of terminal and apparatus KR20110090447A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100010220A KR20110090447A (en) 2010-02-04 2010-02-04 Method for providing user interface according to holding type of terminal and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100010220A KR20110090447A (en) 2010-02-04 2010-02-04 Method for providing user interface according to holding type of terminal and apparatus

Publications (1)

Publication Number Publication Date
KR20110090447A true KR20110090447A (en) 2011-08-10

Family

ID=44928229

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100010220A KR20110090447A (en) 2010-02-04 2010-02-04 Method for providing user interface according to holding type of terminal and apparatus

Country Status (1)

Country Link
KR (1) KR20110090447A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013176472A1 (en) * 2012-05-21 2013-11-28 Samsung Electronics Co., Ltd. Method and apparatus of controlling user interface using touch screen
US10338705B2 (en) 2012-05-21 2019-07-02 Samsung Electronics Co., Ltd. Method and apparatus of controlling user interface using touch screen
US11061496B2 (en) 2012-05-21 2021-07-13 Samsung Electronics Co., Ltd. Method and apparatus of controlling user interface using touch screen
US9335847B2 (en) 2013-04-09 2016-05-10 Samsung Electronics Co., Ltd. Object display method and apparatus of portable electronic device
KR101482867B1 (en) * 2013-07-23 2015-01-15 원혁 Method and apparatus for input and pointing using edge touch
WO2015012478A1 (en) * 2013-07-23 2015-01-29 Won Hyuk Method and apparatus for input and pointer using border touch
WO2019117461A1 (en) * 2017-12-13 2019-06-20 조돈우 Braille generation device and control method thereof
CN109522276A (en) * 2018-09-27 2019-03-26 努比亚技术有限公司 Application icon management method, terminal and computer readable storage medium
CN109522276B (en) * 2018-09-27 2021-11-26 努比亚技术有限公司 Application icon management method, terminal and computer-readable storage medium

Legal Events

Date Code Title Description
A201 Request for examination
N231 Notification of change of applicant
E902 Notification of reason for refusal
E601 Decision to refuse application