KR20090022466A - Method for selecting a menu - Google Patents

Method for selecting a menu Download PDF

Info

Publication number
KR20090022466A
Authority
KR
South Korea
Prior art keywords
menu
area
menu selection
touch screen
selection object
Prior art date
Application number
KR1020070087828A
Other languages
Korean (ko)
Inventor
정덕화
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020070087828A priority Critical patent/KR20090022466A/en
Priority to US12/196,104 priority patent/US8219936B2/en
Priority to PCT/KR2008/005066 priority patent/WO2009028892A2/en
Priority to US12/202,025 priority patent/US8432365B2/en
Priority to PCT/KR2008/005133 priority patent/WO2009028921A2/en
Publication of KR20090022466A publication Critical patent/KR20090022466A/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/247 Telephone sets including user guidance or feature selection means facilitating their use
    • H04M1/2477 Telephone sets including user guidance or feature selection means facilitating their use for selecting a function from a menu display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface, and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method of selecting a menu in a mobile communication terminal. Instead of selecting a menu by key input as in the conventional method, the relative distance between a menu selection object and the mobile communication terminal is measured to provide display, search, and execution steps for the menu. To this end, a sensor determines whether the menu selection object enters a first area or a second area, or touches the touch screen, and a controller determines whether to display, search, select, or execute the menu according to the detected signal.

The present invention makes it possible to select and execute a menu without any key input, and enables varied and intuitive menu selection through a step-by-step configuration based on the object's approach distance to the terminal.

Description

Method For Selecting a Menu

FIG. 1 is a block diagram showing the configuration of a mobile communication terminal according to an embodiment of the present invention;

FIGS. 2A-2C are state diagrams illustrating the display, search, and execution steps of a menu according to approach distance, in accordance with an embodiment of the present invention.

FIG. 3 is a flowchart illustrating a menu selection method according to an embodiment of the present invention.

※ Description of the main parts of the drawing ※

11: sensor  12: control unit

13: memory unit  14: display unit

21: Menu window

The present invention relates to a method of selecting a menu in a mobile communication terminal and, more particularly, to a menu selection method for displaying, navigating, and executing a menu according to the approach distance to a touch screen, without a key input.

Today, thanks to the development of information and communication technology, various portable information devices capable of searching for necessary information anytime, anywhere have been developed and studied. A typical example is the Personal Digital Assistant (PDA). The PDA has evolved beyond simply managing personal schedules and contacts: models with a built-in wireless call module have been introduced, enabling not only voice calls but also Internet access, wireless e-mail, and fax transmission and reception over a wireless communication network.

A mobile communication terminal stores a large number of menus, allowing the user to access a variety of functions and increasing user convenience. However, since menus are generally selected through key input, it is inconvenient to press the corresponding keys several times when the menus are numerous or hierarchically structured.

Accordingly, to improve on the conventional method of selecting a menu in a mobile communication terminal, the present invention measures and uses the relative distance between the touch screen of the terminal and a menu selection object. Its purpose is to provide display, search, and execution steps for a menu according to the approach distance of the menu selection object.

The present invention relates to a method of selecting a menu in a terminal.

More specifically, the menu selection method according to an embodiment of the present invention comprises: determining whether a menu selection object enters a first area; displaying a menu window according to the determination result; determining whether the menu selection object enters a second area; searching a menu included in the menu window according to the determination result; determining whether the menu selection object touches the touch screen; and selecting and executing the menu at the contacted portion according to the determination result.

Preferably, the method further comprises determining whether an object approaching the touch screen is a menu selection object intended for menu selection. Here, the second area is the space within a predetermined distance from the touch screen surface in the direction perpendicular to the touch screen, and the first area is the space extending a further predetermined distance above the second area in the same perpendicular direction.

Preferably, the step of displaying the menu window when the menu selection object enters the first area further comprises moving the displayed menu window to a position corresponding to the menu selection object, and when the menu selection object enters the second area, the displayed menu window is fixed at the position set at the moment of entry into the second area.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, so that those skilled in the art may fully understand and implement the present invention.

FIG. 1 is a block diagram showing the configuration of a mobile communication terminal according to an embodiment of the present invention.

The mobile communication terminal according to the embodiment of the present invention includes a sensor 11, a control unit 12, a memory unit 13, and a display unit 14. The sensor 11 detects a menu selection object in proximity to the touch screen and determines its three-dimensional position. That is, the sensor 11 detects an object approaching the touch screen and transmits this information to the control unit 12.

A menu selection object is an object that the user brings toward the touch screen in order to execute a menu. In general, a user's finger is used, but a pen or other device may also be used.

In addition, the sensor 11 detects the three-dimensional position of the menu selection object approaching the touch screen. That is, when the menu selection object approaches the touch screen, the sensor 11 recognizes the position as (x, y, z) coordinates consisting of x, y, and z axes. At this time, the plane parallel to the touch screen is set as the (x, y) plane, and the axis perpendicular to the touch screen is set as the z axis.

There are various ways for the sensor 11 to determine the three-dimensional position of an approaching menu selection object. As an example, the sensor 11 may irradiate a beam and measure the time it takes for the beam to return after being reflected by the menu selection object, thereby obtaining the z coordinate; adding this to the (x, y) plane coordinates yields the three-dimensional position of the menu selection object.
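As an illustrative sketch (not part of the patent disclosure), the time-of-flight measurement described above can be expressed as follows; the speed constant, function names, and units are assumptions chosen for the example:

```python
# Illustrative sketch of the time-of-flight z measurement described above.
# The constants and function names are hypothetical, not from the patent.

SPEED_OF_LIGHT_MM_PER_S = 3.0e11  # ~3.0e8 m/s, expressed in mm/s

def z_from_round_trip(round_trip_seconds: float) -> float:
    """Distance (mm) from the screen: the beam travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_LIGHT_MM_PER_S * round_trip_seconds / 2.0

def position_3d(x_mm: float, y_mm: float, round_trip_seconds: float):
    """Combine the (x, y) plane coordinates with the measured z."""
    return (x_mm, y_mm, z_from_round_trip(round_trip_seconds))
```

A 0.2 ns round trip, for instance, would correspond to an object about 30 mm above the screen.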

Alternatively, infrared can be used to determine the three-dimensional position.

The sensor 11 provides the controller 12 with the three-dimensional position of the menu selection object. The controller 12 determines whether to display, search, or execute a menu according to the three-dimensional position received from the sensor 11 and provides the result to the display unit 14. In addition, the controller 12 determines whether an object detected by the sensor 11 is an object intended for menu selection.

The display unit 14 displays a menu according to the command of the controller 12.

The menus, the first region, and the second region are set and stored in the memory unit 13, and the controller 12 determines the display, search, and execution of a menu according to the database stored in the memory unit 13.

FIGS. 2A-2C are state diagrams illustrating the display, search, and execution steps of a menu according to approach distance, according to an embodiment of the present invention.

FIG. 2A illustrates a state in which the menu window 21 is displayed when the menu selection object enters the first area; FIG. 2B illustrates a state in which the user searches the menu when the menu selection object enters the second area; and FIG. 2C illustrates a state in which a menu is selected and executed when the menu selection object contacts the touch screen.

The second area is the space within a predetermined distance from the touch screen surface in the direction perpendicular to the touch screen, and the first area is the space extending a further predetermined distance above the second area in the same direction.

As shown in FIGS. 2A to 2C, the first region B and the second region A may be immediately adjacent to each other, or the two regions may be clearly divided by placing a buffer region between them. Alternatively, a buffer region may be placed between the touch screen surface and the second region A.

As described above, the stacking relationship and heights of the first area B, the second area A, and any buffer area may be set differently according to the embodiment; the set values are stored as a database in the memory unit, and the display, search, and execution steps of the menu are performed according to the commands of the controller.
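The layered regions described above can be sketched as a simple classifier over the measured z coordinate. The threshold values, region names, and optional buffer handling below are illustrative assumptions, not values from the patent:

```python
# Illustrative zone classifier for the layered regions described above.
# Threshold values (in mm) are arbitrary examples, not from the patent.

SECOND_AREA_TOP_MM = 30.0   # second area A: 0 < z <= 30
FIRST_AREA_SPAN_MM = 50.0   # first area B extends 50 mm above area A
BUFFER_MM = 5.0             # optional buffer between the two areas

def classify(z_mm: float, use_buffer: bool = False) -> str:
    """Map a z coordinate to the region it falls in."""
    if z_mm <= 0.0:
        return "touch"            # contact with the screen surface
    buffer = BUFFER_MM if use_buffer else 0.0
    first_bottom = SECOND_AREA_TOP_MM + buffer
    if z_mm <= SECOND_AREA_TOP_MM:
        return "second_area"      # search/navigate the menu
    if z_mm <= first_bottom:
        return "buffer"           # dead zone between the areas
    if z_mm <= first_bottom + FIRST_AREA_SPAN_MM:
        return "first_area"       # display the menu window
    return "outside"              # too far: no menu action
```

With `use_buffer=True`, a reading between the areas (e.g. 32 mm) lands in the buffer rather than flickering between display and search states, which is one plausible motivation for the buffer region the text mentions.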

FIG. 2A illustrates a state in which the menu window 21 is displayed when the menu selection object enters the first area B. When the menu selection object enters the first area B, the sensor detects this, calculates the three-dimensional position, and provides it to the controller. The controller receives the position and compares it with the database stored in the memory unit to determine whether to display, search, or execute the menu.

In FIG. 2A, since the menu selection object has entered the first area B, the menu window 21 is displayed according to the database of the preset memory unit, and the display unit receives the command and displays the menu window 21. Whether to display the menu window depends on the z coordinate of the three-dimensional position of the menu selection object. Therefore, the controller compares the z coordinate transmitted from the sensor with the database stored in the memory unit and, when it determines that the object has entered the first area, transmits a command to the display unit, which then displays the menu window.

In this step, the displayed menu window 21 may be moved to a position corresponding to the menu selection object. That is, when the menu selection object moves horizontally within the first area, the menu window may be displayed at the corresponding position. In this case, the position of the menu window is determined by the (x, y) plane coordinates of the menu selection object.
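This window-following behavior can be sketched as projecting the object's (x, y) coordinates onto the display while keeping the window fully on screen. The screen and window dimensions are illustrative assumptions:

```python
# Illustrative sketch: position the menu window under the object's (x, y)
# coordinates, clamped so the window stays fully on screen. All sizes are
# hypothetical examples, not from the patent.

SCREEN_W, SCREEN_H = 480, 800     # example screen size in pixels
WIN_W, WIN_H = 200, 120           # example menu window size

def window_position(x: float, y: float) -> tuple:
    """Center the menu window on (x, y), clamped to the screen bounds."""
    left = min(max(x - WIN_W / 2, 0), SCREEN_W - WIN_W)
    top = min(max(y - WIN_H / 2, 0), SCREEN_H - WIN_H)
    return (left, top)
```

Clamping matters near the screen edges: without it, a finger hovering at a corner would push part of the menu window off the display.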

FIG. 2B illustrates a state in which the menu is searched when the menu selection object enters the second area A. When the menu selection object enters the second area A, the sensor detects this, calculates its three-dimensional position, and provides it to the controller. The controller receives the 3D position, compares it with the database stored in the memory unit, and determines whether to display, search, or execute the menu.

In FIG. 2B, since the object has entered the second area A, the menu is searched according to the database of the preset memory unit. When the menu selection object enters the second area A, the menu window 21 is fixed at the position set at the moment of entry, and the user navigates the menu by moving the menu selection object horizontally at that height.

FIG. 2C illustrates a state in which a menu is selected and executed when the menu selection object contacts the touch screen. After the menu is searched in the second area A, when the menu selection object moves vertically and contacts the corresponding menu, the contacted menu is selected and executed. Therefore, when the menu selection object contacts the touch screen, the sensor detects the contact and reports it to the controller. The controller compares the contact position with the database stored in the memory unit, executes the corresponding menu, and the display unit displays the result.
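The touch-selection step above amounts to a hit test: mapping the contact's (x, y) coordinates to the menu item whose on-screen rectangle contains them. The item layout below is a hypothetical example, not taken from the patent:

```python
# Illustrative hit test for the touch-selection step described above.
# The menu item layout is a hypothetical example, not from the patent.

# Each item: (name, left, top, width, height) in pixels.
MENU_ITEMS = [
    ("call",     20,  40, 120, 40),
    ("message",  20,  90, 120, 40),
    ("settings", 20, 140, 120, 40),
]

def hit_test(x: float, y: float):
    """Return the name of the touched menu item, or None for a miss."""
    for name, left, top, w, h in MENU_ITEMS:
        if left <= x <= left + w and top <= y <= top + h:
            return name
    return None
```

A contact outside every item rectangle returns `None`, which a controller could treat as "no menu executed".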

FIG. 3 is a flowchart illustrating a menu selection method according to an embodiment of the present invention.

The sensor detects an object approaching the touch screen (S305) and transmits this to the controller.

Upon receiving this, the controller determines whether the object is an object intended for menu selection (S310). Since the objects that may approach the touch screen are varied and unpredictable, it is desirable to selectively accept only menu selection objects whose purpose is to execute a menu. Normally, the menu selection object is a user's finger, a pen, or a similar device. Other objects should be configured so that, even if they are detected by the sensor, they do not affect the execution of the menu.

To this end, the controller may further perform a step of determining whether the approaching object is an object for menu selection and, if not, ignoring the approach of the object (S315).

If the approaching object is a menu selection object, the sensor measures its position in three-dimensional coordinates (S320) and transmits it to the controller, which determines whether the object has entered the first area (S325). If the object has entered the first area, the menu window is displayed (S330); otherwise, the position continues to be measured and the menu window is not displayed.

While the menu selection object remains in proximity to the touch screen, the sensor continues to calculate its three-dimensional position and send it to the controller. Once the menu window is displayed, the controller determines whether the object enters the second area (S335). If it does, the menu window is fixed at the corresponding position and the user navigates the menu by moving the object horizontally (S340); otherwise, the menu is not searched and the position continues to be monitored.

While the menu selection object approaches the touch screen, the sensor continuously calculates the three-dimensional position and transmits it to the controller. When the menu has been searched and the menu selection object contacts the corresponding menu (S345), the sensor detects the contact and transmits it to the controller, which executes the menu (S350). The controller also transmits the result to the display unit so that the executed menu is displayed for the user.
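The S305–S350 flow above can be sketched as a small state machine driven by sensor samples. The thresholds, sample format, and state names are all illustrative assumptions, not details from the patent:

```python
# Illustrative state machine for the flow of FIG. 3 (S305-S350). The sample
# format, thresholds, and state names are assumptions, not from the patent.

FIRST_AREA_MM = (30.0, 80.0)   # first area: display the menu window
SECOND_AREA_MM = (0.0, 30.0)   # second area: fix the window, navigate

class MenuSelector:
    def __init__(self):
        self.state = "idle"    # idle -> displayed -> navigating -> executed

    def on_sample(self, is_menu_object: bool, z_mm: float) -> str:
        """Advance the state given one sensor sample."""
        if not is_menu_object:         # S310/S315: ignore other objects
            return self.state
        if z_mm <= 0.0 and self.state == "navigating":
            self.state = "executed"    # S345/S350: touch selects and runs
        elif (SECOND_AREA_MM[0] < z_mm <= SECOND_AREA_MM[1]
              and self.state in ("displayed", "navigating")):
            self.state = "navigating"  # S335/S340: window fixed, searching
        elif (FIRST_AREA_MM[0] < z_mm <= FIRST_AREA_MM[1]
              and self.state == "idle"):
            self.state = "displayed"   # S325/S330: show the menu window
        return self.state
```

Feeding the machine a descending sequence of heights (first area, then second area, then contact) walks it through the display, search, and execute steps in order, while samples from non-menu objects leave the state unchanged.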

The present invention has been described above with reference to preferred embodiments; however, those skilled in the art to which the present invention pertains will understand from this detailed description that the invention may be implemented in other forms within its essential technical scope.

Here, the essential technical scope of the present invention is defined by the claims, and all differences within their equivalent range should be construed as being included in the present invention.

According to the present invention, more intuitive menu selection is possible by displaying, navigating, and executing a menu according to the distance to the touch screen without using key input, which resolves the user's inconvenience of direct key input.

Claims (4)

1. A menu selection method comprising: determining whether a menu selection object enters a first area; displaying a menu window according to the determination result; determining whether the menu selection object enters a second area; searching a menu included in the menu window according to the determination result; determining whether the menu selection object touches the touch screen; and selecting and executing the menu at the contacted portion according to the determination result, wherein the second area is the space within a predetermined distance from the touch screen surface in the direction perpendicular to the touch screen, and the first area is the space extending a further predetermined distance above the second area in the same direction.

2. The method of claim 1, further comprising determining whether an object approaching the touch screen is a menu selection object for menu selection.

3. The method according to claim 1 or 2, wherein the displaying of the menu window further comprises moving the displayed menu window to a position corresponding to the menu selection object.

4. The method according to claim 1 or 2, wherein, when the menu selection object enters the second area, the displayed menu window is fixedly displayed at the portion set when the second area is entered.
KR1020070087828A 2007-08-30 2007-08-30 Method for selecting a menu KR20090022466A (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020070087828A KR20090022466A (en) 2007-08-30 2007-08-30 Method for selecting a menu
US12/196,104 US8219936B2 (en) 2007-08-30 2008-08-21 User interface for a mobile device using a user's gesture in the proximity of an electronic device
PCT/KR2008/005066 WO2009028892A2 (en) 2007-08-30 2008-08-28 A user interface for a mobile device using a user's gesture in the proximity of an electronic device
US12/202,025 US8432365B2 (en) 2007-08-30 2008-08-29 Apparatus and method for providing feedback for three-dimensional touchscreen
PCT/KR2008/005133 WO2009028921A2 (en) 2007-08-30 2008-09-01 Apparatus and method for providing feedback for three-dimensional touchscreen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020070087828A KR20090022466A (en) 2007-08-30 2007-08-30 Method for selecting a menu

Publications (1)

Publication Number Publication Date
KR20090022466A true KR20090022466A (en) 2009-03-04

Family

ID=40692332

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020070087828A KR20090022466A (en) 2007-08-30 2007-08-30 Method for selecting a menu

Country Status (1)

Country Link
KR (1) KR20090022466A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011002414A2 (en) * 2009-06-29 2011-01-06 Razer (Asia-Pacific) Pte Ltd A user interface
WO2011002414A3 (en) * 2009-06-29 2011-04-21 Razer (Asia-Pacific) Pte Ltd A user interface
US20130050143A1 (en) * 2011-08-31 2013-02-28 Samsung Electronics Co., Ltd. Method of providing of user interface in portable terminal and apparatus thereof
KR20140104822A (en) * 2013-02-21 2014-08-29 삼성전자주식회사 Method for displaying for virtual keypad an electronic device thereof

Similar Documents

Publication Publication Date Title
JP5620440B2 (en) Display control apparatus, display control method, and program
AU2013276998B2 (en) Mouse function provision method and terminal implementing the same
US20140325443A1 (en) Method and apparatus for operating menu in electronic device including touch screen
EP2770423A2 (en) Method and apparatus for operating object in user device
CN103759737B (en) A kind of gestural control method and navigation equipment
KR101872272B1 (en) Method and apparatus for controlling of electronic device using a control device
US20040169638A1 (en) Method and apparatus for user interface
US20110074829A1 (en) Mobile communication terminal including touch interface and method thereof
TW201516765A (en) Touch device having switching function, system of the touch device, and method for controlling the switching function of the touch device
KR20200051768A (en) Task switching method and terminal
CN102750035B (en) The determination method and apparatus of display position of cursor
JP6183820B2 (en) Terminal and terminal control method
CN109002223A (en) A kind of touch interface display methods and mobile terminal
US20120218207A1 (en) Electronic device, operation control method, and storage medium storing operation control program
KR20090022466A (en) Method for selecting a menu
KR20150018748A (en) Method and system for transfering content among devices using mobile device sensor
KR101151300B1 (en) Mobile terminal and method for displaying object using approach sensing of touch tool thereof
JP2014182429A (en) Information processor, information processing method and information processing program
US9501166B2 (en) Display method and program of a terminal device
KR101207451B1 (en) Mobile Terminal Having Non-Contacting Sensor And Method Of Searching Item List Using Same
JP6245334B2 (en) Display program
JP5482549B2 (en) Display device, display method, and display program
JP6569546B2 (en) Display device, display control method, and display control program
CN101308434A (en) Electronic device and its software user interface guiding and browsing method
KR102117450B1 (en) Display device and method for controlling thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application