KR100981877B1 - Method For Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu - Google Patents

Method For Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu

Info

Publication number
KR100981877B1
Authority
KR
South Korea
Prior art keywords
user
graphic object
menu
displayed
defined menu
Prior art date
Application number
KR20080091504A
Other languages
Korean (ko)
Other versions
KR20100032560A (en)
Inventor
박종현
전재욱
조영준
조윤찬
Original Assignee
성균관대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 성균관대학교산학협력단
Priority to KR20080091504A
Publication of KR20100032560A
Application granted
Publication of KR100981877B1


Abstract

Disclosed are a user-defined menu configuration method and an apparatus having a user-defined menu configuration function for easily configuring a menu optimized for a user. First, the coordinate value of a touched position is obtained; if no graphic object is displayed at the touched position, a user-defined menu creation mode is activated, and menu information set through the activated user-defined menu creation mode is displayed. The menu can therefore be configured to match the user's habits or preferences, which improves usability.
Touch screen, custom, menu, create, edit

Description

Method for Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu

The present invention relates to menu configuration and, more particularly, to a user-defined menu configuration method and an apparatus having a user-defined menu configuration function that can be applied to a data processing device having a touch-type input device.

Recently, touch screens capable of simultaneously performing input and display in one device without having a separate keypad have been applied to various data processing devices.

A touch screen has the advantage that input operations are simplified and the user can easily grasp the meaning of the displayed screen, because events can be executed interactively and intuitively simply by touching a graphic object shown on the display with a finger or a pen.

In addition, since a touch screen integrates the display device and the touch pad, it requires no keypad installation space as in a conventional portable terminal, so a display device with a larger visible screen can be adopted in the portable terminal.

Depending on the driving method, touch screens are classified into capacitive, infrared sensing, surface acoustic wave, piezoelectric, integral strain gauge, and resistive types. Among these, the resistive type, which offers high transmittance and a fast response, or the capacitive type is generally used.

In general, a device having a touch screen has no separate keypad or, at most, a minimal keypad. Events such as various programs and functions must therefore be executed through touch operations.

However, since the menu included in a device having a touch screen is generally fixed by the device manufacturer before sale, it is difficult for a user to use a menu optimized for himself or herself, which causes inconvenience in use.

For example, when a menu frequently used by a user is located at the bottom of the menu tree, the user must traverse the menu tree step by step through touch operations, which increases the delay before a specific program or function is executed.

Accordingly, a first object of the present invention is to provide a user-defined menu configuration method for easily configuring a menu optimized for a user.

In addition, a second object of the present invention is to provide an apparatus having a user-defined menu configuration function for easily configuring a menu optimized for a user.

According to an aspect of the present invention for achieving the first object, a method of configuring a user-defined menu includes obtaining a coordinate value of a touched position, activating a user-defined menu creation mode when a graphic object is not displayed at the touched position, and displaying menu information set through the activated user-defined menu creation mode. The activating of the user-defined menu creation mode when the graphic object is not displayed at the touched position may include selecting a graphic object and setting an execution event corresponding to the selected graphic object. The selecting of the graphic object may include editing at least one of a shape, a size, a color, and a transparency of the selected graphic object. The activating of the user-defined menu creation mode when the graphic object is not displayed at the touched position may further include setting a position at which the selected graphic object is to be displayed. The activating of the user-defined menu creation mode may further include obtaining a touched time and executing the user-defined menu creation mode when the touched time is longer than a preset reference time and the graphic object is not displayed at the touched position. The method may further include activating a user-defined menu editing mode for editing a menu corresponding to the displayed graphic object when a graphic object is displayed at the touched position. The activating of the user-defined menu editing mode may include changing an execution event corresponding to the displayed graphic object. In the displaying of the menu information set through the activated user-defined menu creation mode, a graphic object corresponding to a predetermined menu set through the activated user-defined menu creation mode may be displayed. In the obtaining of the coordinate value of the touched position, when there are a plurality of coordinate values of the touched position, a coordinate value corresponding to the center of gravity of the figure formed by the plurality of coordinate values may be obtained as the touched coordinate value.

In addition, an apparatus having a user-defined menu configuration function according to an aspect of the present invention for achieving the second object includes a touch screen which obtains a coordinate value of a touched position and provides the obtained coordinate value, and a controller which activates a user-defined menu creation mode when a graphic object is not displayed at the touched position and displays menu information set through the activated user-defined menu creation mode on the touch screen. The controller may determine whether a graphic object is displayed at the touched position based on the coordinate value provided from the touch screen. When the user-defined menu creation mode is activated, the controller may provide a user interface for selecting a graphic object and a user interface for setting an execution event corresponding to the selected graphic object. When the user-defined menu creation mode is activated, the controller may provide a user interface for editing at least one of a shape, a size, a color, and a transparency of the selected graphic object. When the user-defined menu creation mode is activated, the controller may provide a user interface for selecting a position in the display area of the touch screen where the selected graphic object is to be displayed. The controller may activate the user-defined menu creation mode when the touch screen is touched, the touched time is longer than a preset reference time, and no graphic object is displayed at the touched position. The controller may activate a user-defined menu editing mode when a graphic object is displayed at the touched position. When the user-defined menu editing mode is activated, the controller may provide a user interface for changing an execution event corresponding to the graphic object displayed at the touched position. The apparatus having the user-defined menu configuration function may further include a storage unit which stores menu information generated through the activated user-defined menu creation mode.

According to the user-defined menu configuration method and the apparatus having the user-defined menu configuration function described above, when a predetermined position of the touch screen is touched for longer than a preset reference time and no graphic object is displayed at the touched position, the user-defined menu creation mode is activated, so that the user can select or edit the type and shape of a graphic object and is provided with a user interface for setting the execution event to be executed when the graphic object is manipulated. Alternatively, when a predetermined position of the touch screen is touched for longer than the preset reference time and a graphic object is already displayed at the touched position, a user-defined menu editing mode for editing the menu corresponding to the displayed graphic object is activated.

Therefore, the position and/or shape of the graphic object representing a predetermined menu can be configured according to the user's intention, and the user can directly select the execution event corresponding to the graphic object, so that the menu can be configured to match the user's habits or preferences, which improves usability.

As the invention allows for various changes and numerous embodiments, particular embodiments are illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the present invention to the specific embodiments; the invention should be understood to include all modifications, equivalents, and substitutes within its spirit and scope. Like reference numerals are used for like elements in describing each drawing.

The terms first, second, and the like may be used to describe various components, but the components should not be limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items or any one of the plurality of related listed items.

When a component is referred to as being "connected" or "coupled" to another component, it may be directly connected or coupled to that other component, but it should be understood that other components may exist in between. On the other hand, when a component is referred to as being "directly connected" or "directly coupled" to another component, it should be understood that no other component exists in between.

The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to limit the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, the terms "comprise" and "have" are intended to indicate that a feature, number, step, operation, component, part, or combination thereof described in the specification is present, and do not exclude the possibility of the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The same reference numerals are used for the same components in the drawings, and duplicate descriptions of the same components are omitted.

FIG. 1 is a flowchart illustrating a method of configuring a user-defined menu according to an exemplary embodiment of the present invention, and illustrates a process of configuring a user-defined menu in response to a user's touch operation on a device having a touch screen. FIG. 2 is a conceptual diagram for explaining a process of acquiring a coordinate value corresponding to a touch operation.

Referring to FIGS. 1 and 2, when power is applied to the device having a touch screen, the device first determines whether a touch operation has occurred (step 101), and if it is determined that a touch operation has occurred, obtains the coordinate value of the touched position (step 103).

Here, the device having the touch screen may determine whether a touch operation has occurred based on the presence or absence of coordinate values provided from the touch screen.

In addition, as illustrated in FIG. 2, when the touch screen is touched by an object with a large contact area, such as a finger, and a plurality of coordinate values correspond to the touched area, a predetermined coordinate value is extracted from among the plurality of coordinate values to obtain a coordinate value representing the touched area.

For example, when the device having a touch screen obtains a plurality of coordinate values corresponding to a touched area, the coordinate value corresponding to the center of gravity 211 of the figure 210 formed by the region of the plurality of coordinate values may be extracted and used as the coordinate value of the touched area.
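As an illustrative sketch only (not code from the patent), the extraction of a representative coordinate can be modeled by treating the center of gravity of the touched figure as the arithmetic mean of the reported coordinate points; the Point type and the function name below are assumptions introduced for the example.

```kotlin
// Hypothetical centroid step: when a finger touch yields several raw
// coordinates, the representative touch point is taken here as the
// arithmetic mean (center of gravity) of the reported points.
data class Point(val x: Float, val y: Float)

fun representativeTouchPoint(rawPoints: List<Point>): Point {
    require(rawPoints.isNotEmpty()) { "a touch must report at least one coordinate" }
    val cx = rawPoints.map { it.x }.average().toFloat()   // average() returns a Double
    val cy = rawPoints.map { it.y }.average().toFloat()
    return Point(cx, cy)
}
```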

In addition, the device having the touch screen acquires the time during which the touch screen has been touched (step 105) and compares the touched time with a preset reference time (step 107). If the touched time is determined to be longer than the preset reference time, the device determines whether a graphic object is displayed at the touched position (step 109).

Here, the device having the touch screen may determine whether a graphic object is displayed based on the screen configuration information of the touch screen and the coordinate value obtained in step 103. That is, when the coordinate value obtained in step 103 falls within the display coordinates of a graphic object included in the screen configuration information of the touch screen, it may be determined that the graphic object is displayed at the touched position.
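The hit test of step 109 can be sketched as follows, under the assumption that the screen configuration information lists each displayed graphic object with a rectangular display bound; the type and function names are illustrative, not taken from the patent.

```kotlin
// Illustrative hit test for step 109: find the graphic object whose display
// bounds contain the touched coordinate, or null if none does (the condition
// that triggers the user-defined menu creation mode).
data class TouchPoint(val x: Float, val y: Float)
data class DisplayedObject(
    val id: String,
    val left: Float, val top: Float, val right: Float, val bottom: Float
)

fun objectAt(touch: TouchPoint, screenObjects: List<DisplayedObject>): DisplayedObject? =
    screenObjects.firstOrNull { obj ->
        touch.x in obj.left..obj.right && touch.y in obj.top..obj.bottom
    }
```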

Here, the graphic object is a concept that includes all graphic information symbolically representing a predetermined execution event, such as an icon, an image, text, a video, or flash content.

If it is determined in step 109 that the graphic object is not displayed at the touched position, the device with the touch screen activates the user-defined menu creation mode (step 111).

When the user-defined menu creation mode is activated in step 111, the device having the touch screen displays, on the touch screen, a user interface that allows the user to directly configure a menu performing a predetermined function, so that a new menu is created (step 120).

Thereafter, the device having the touch screen stores menu information newly generated by the user in the storage unit (step 131). The menu information stored in the storage unit may include, for example, a graphic object representing a generated menu and execution event information connected to the graphic object.
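A plausible shape for the stored menu information, given as a hedged illustration only, pairs the selected graphic object with its display position and the execution event bound to it; the field names and the in-memory store below are assumptions.

```kotlin
// Assumed structure for the menu information stored in step 131: the graphic
// object representing the created menu plus the execution event connected to it.
data class UserMenuEntry(
    val objectId: String,        // identifier of the selected graphic object
    val displayX: Float,         // display position of the object on the screen
    val displayY: Float,
    val executionEvent: String   // e.g. "voice_call" or "page_up"
)

class MenuStorage {
    private val entries = mutableListOf<UserMenuEntry>()

    fun save(entry: UserMenuEntry) { entries += entry }   // step 131
    fun all(): List<UserMenuEntry> = entries.toList()
}
```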

If it is determined in step 109 that a graphic object is displayed at the touched position of the touch screen, the device having the touch screen activates the user-defined menu editing mode (step 133) and provides a user interface for editing the menu corresponding to the graphic object displayed on the touch screen, allowing the user to edit the preset menu (step 135).

Here, the user may edit the shape, size, display position, color, transparency, etc. of the graphic object through the user-defined menu editing mode, and edit an execution event corresponding to the graphic object.

Thereafter, the device with the touch screen proceeds to step 131 and stores the edited menu information in the storage unit.

If it is determined in step 107 that the touched time is shorter than the preset reference time, the device having the touch screen determines whether a graphic object is displayed at the touched position (step 137), and if it is determined that a graphic object is displayed at the touched position, executes the event corresponding to the displayed graphic object (step 139).

Alternatively, if it is determined in step 137 that no graphic object is displayed at the touched position, the device having the touch screen determines that an abnormal touch operation has occurred due to a user's mistake.
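The branching of steps 107 through 139 can be summarized in a hedged sketch: a long touch on empty space activates the creation mode, a long touch on a displayed object activates the editing mode, a short touch on an object executes its event, and a short touch on empty space is treated as an accidental contact. The enum and parameter names below are illustrative, not defined by the patent.

```kotlin
enum class TouchAction { CREATE_MENU, EDIT_MENU, EXECUTE_EVENT, IGNORE }

fun dispatchTouch(
    touchedMillis: Long,      // how long the position was touched (step 105)
    referenceMillis: Long,    // preset reference time (step 107)
    objectDisplayed: Boolean  // is a graphic object shown at the position? (steps 109/137)
): TouchAction = when {
    touchedMillis > referenceMillis && !objectDisplayed -> TouchAction.CREATE_MENU   // step 111
    touchedMillis > referenceMillis && objectDisplayed  -> TouchAction.EDIT_MENU     // step 133
    objectDisplayed                                     -> TouchAction.EXECUTE_EVENT // step 139
    else                                                -> TouchAction.IGNORE        // accidental touch
}
```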

FIG. 3 is a flowchart illustrating in detail the menu creation process in the method of configuring a user-defined menu shown in FIG. 1.

Referring to FIG. 3, first, a graphic object to be displayed in the display area of the touch screen and to represent a predetermined menu is selected (step 121). The graphic object may be selected through a user interface for file selection and may be any one of various icons previously stored in the device having the touch screen. Alternatively, the graphic object may be selected from an image, a video, or flash content photographed or downloaded by the user, or may be composed of text directly input by the user.

Thereafter, editing may be performed on the selected graphic object (step 123). Here, the device having the touch screen may provide a user interface, such as a separate program or menu, for editing the graphic object, and the user may edit the shape, size, color, transparency, and the like of the selected graphic object through the provided program or user interface.

According to the user's selection, the graphic object editing step (step 123) may not be performed.

Next, the position on the touch screen where the graphic object selected in step 121 is to be displayed is set (step 125). Here, the user may input the coordinate value of the position where the graphic object is to be displayed through the provided user interface, or may set the display position of the graphic object by dragging and dropping the selected graphic object to a predetermined position on the touch screen.

The display position setting step (step 125) of the graphic object may be omitted according to a user's selection. If step 125 is not performed, the graphic object selected in step 121 may be displayed in the display area (ie, the touched position) of the touch screen corresponding to the coordinate value obtained in step 103 of FIG. 1.

Thereafter, an execution event corresponding to the graphic object is set (step 127). Here, the execution event refers to an event that is executed when the graphic object is manipulated. The execution event may be a software program installed on the device having the touch screen, a menu item, or a specific function provided by the device having the touch screen.

For example, when the device having the touch screen is a mobile communication terminal, the execution event corresponding to the graphic object may be a function such as a voice call or a video call.
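As an illustration of step 127 (and of the later execution in step 139), the binding between a graphic object and its execution event can be modeled as a simple callback table; the class, method, and object names below are assumptions, and the actual events are device-specific programs or functions.

```kotlin
// Illustrative binding of execution events to graphic objects.
class UserDefinedMenu {
    private val bindings = mutableMapOf<String, () -> Unit>()

    // Step 127: associate the selected graphic object with an execution event.
    fun bindEvent(objectId: String, action: () -> Unit) {
        bindings[objectId] = action
    }

    // Step 139 at runtime: a short touch on the object executes its event.
    fun execute(objectId: String) {
        bindings[objectId]?.invoke()
    }
}

fun main() {
    val menu = UserDefinedMenu()
    menu.bindEvent("voiceCallIcon") { println("starting a voice call (placeholder)") }
    menu.execute("voiceCallIcon")
}
```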

FIG. 4 illustrates an example of a user interface screen displayed on a touch screen in the process of configuring a user-defined menu according to an embodiment of the present invention. In FIG. 4, a portable terminal is illustrated as an example of a device having a touch screen.

Referring to FIG. 4, first, when the user touches a predetermined position of the touch screen for longer than a preset reference time (310), the portable terminal determines whether a graphic object is displayed at the touched position, and if it determines that no graphic object is displayed, displays the user interface (320) for executing the user-defined menu creation mode on the touch screen.

Subsequently, when the user selects the menu creation item from the menu items (320) displayed on the user interface screen (330), the portable terminal displays user interfaces (340, 350) on the touch screen that allow the user to select a graphic object representing the menu.

In addition, the portable terminal displays a user interface (360) on the touch screen, and when the user selects a predetermined execution event, sets the execution event to correspond to the selected graphic object. The portable terminal then stores all menu information set in the user-defined menu creation mode.

Here, the execution event corresponding to the graphic object may correspond to a function included in different application programs or devices, or to a sub-menu included in a single application program.

For example, the execution event corresponding to the graphic object may correspond to various application programs or functions such as image shooting, voice call, video call, phone book, and video playback, or to sub-menus within a single application program (for example, shooting mode or timer shooting).

FIG. 5 illustrates another example of a user interface screen displayed on a touch screen in the process of configuring a user-defined menu according to an embodiment of the present invention. In FIG. 5, a tablet PC is illustrated as an example of a device having a touch screen.

Referring to FIG. 5, when the user touches a predetermined position of the touch screen for longer than the preset reference time while an execution screen of a predetermined application is displayed on the touch screen (410), the tablet PC determines whether a graphic object representing a predetermined menu is displayed at the touched position.

If no graphic object is displayed, the user-defined menu creation mode is activated (420), the graphic object selected by the user is displayed at the touched position (or a desired position), and a user interface for setting the function corresponding to the graphic object is provided (430). The menu created by the user is then displayed on the touch screen (440).

FIG. 5 shows, as an example, a case where page up (PgUp) and page down (PgDn) buttons (that is, graphic objects) are created at predetermined positions of the display area where a presentation screen is displayed, and the page up and page down functions are associated with the respective buttons.

As shown in FIGS. 4 and 5, the method of configuring a user-defined menu according to an exemplary embodiment of the present invention enables the user to easily configure a menu optimized for himself or herself, thereby improving the usability of the apparatus or the application.

FIG. 6 is a block diagram illustrating the configuration of an apparatus having a user-defined menu configuration function according to an embodiment of the present invention. Referring to FIG. 6, the apparatus includes a display unit 510, an input sensing unit 520, a controller 540, and a storage unit 550.

The display unit 510 may be, for example, a display device such as a liquid crystal display (LCD) or an organic light emitting diode (OLED) display, and displays graphic information, such as an icon representing a menu, under the control of the controller 540.

In addition, when the user-defined menu creation mode or the user-defined menu editing mode is activated, the display unit 510 displays a user interface for creating or editing a menu based on the control of the controller 540.

The input sensing unit 520 may be, for example, a device such as a touch pad or a touch panel; it generates an electrical signal corresponding to the user's touch operation, converts the generated electrical signal into a digital signal, obtains the coordinate value of the touched position, and provides the coordinate value to the controller 540.

In detail, the input sensing unit 520 may include a coordinate value acquisition module 521. When there are a plurality of coordinate values corresponding to the touched area, the coordinate value acquisition module 521 may extract the coordinate value corresponding to the center of gravity of the figure formed by the corresponding area, obtain it as the coordinate value of the touched area, and provide the obtained coordinate value to the controller 540.

The input sensing unit 520 may be installed on the display area of the display unit 510 to be configured as a touch screen 530 or may be installed separately from the display unit 510.

Hereinafter, the apparatus having the user-defined menu configuration function according to an embodiment of the present invention will be described using an example in which the input sensing unit 520 is installed on the display unit 510 to constitute a touch screen 530.

In the embodiment of the present invention, the touch screen 530 may be of any one of various types, such as a resistive type, a capacitive type, an infrared sensing type, or a surface acoustic wave type.

The controller 540 may be implemented as, for example, a processor. The controller 540 obtains the touched time based on the coordinate values provided from the touch screen 530, and if it determines that the touched time is longer than the preset reference time, determines whether a graphic object is displayed at the touched position.

Here, the controller 540 may obtain the touched time by measuring the time during which the same coordinate value is continuously input, and may determine whether a graphic object is displayed at the touched position by comparing the coordinate value of the touched position with screen configuration information that includes the display-position coordinates of the graphic objects displayed on the touch screen 530.
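The duration measurement described above can be sketched as follows, assuming the touch screen reports the same coordinate repeatedly while the finger stays down; the class and method names are illustrative, not from the patent.

```kotlin
// Assumed sketch: the controller remembers when the current coordinate first
// appeared and keeps extending the measured time while the coordinate is unchanged.
class TouchDurationTracker(private val referenceMillis: Long) {
    private var lastPoint: Pair<Float, Float>? = null
    private var startMillis: Long = 0L

    /** Feed one coordinate report; returns true once the same point has been
     *  held longer than the preset reference time. */
    fun onCoordinate(x: Float, y: Float, nowMillis: Long): Boolean {
        val point = x to y
        if (point != lastPoint) {       // a different position restarts the timer
            lastPoint = point
            startMillis = nowMillis
        }
        return nowMillis - startMillis > referenceMillis
    }
}
```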

If the touched time is longer than the preset reference time and the graphic object is not displayed at the touched position, the controller 540 activates the user-defined menu creation mode.

Here, the user-defined menu generation mode may be implemented by a software program and may be performed by the menu editing module 543 which may be executed by a processor constituting the controller 540.

When the user-defined menu creation mode is activated, the menu editing module 543 displays, through the touch screen 530, a user interface that allows the user to configure a menu directly, and stores the menu setting information set by the user through the provided user interface in the storage unit 550.

Here, the user interface provided for executing the user-defined menu creation mode may include functions for selecting a graphic object, setting an execution event corresponding to the graphic object, and editing the shape, size, display position, color, and transparency of the selected graphic object.

Alternatively, if it is determined that the touched time is longer than the preset reference time and the graphic object is displayed at the touched position, the controller 540 activates the user-defined menu editing mode.

In this case, the user-defined menu editing mode may be performed by the menu editing module 543. When the user-defined menu editing mode is activated, the menu editing module provides a user interface for editing the menu corresponding to the graphic object displayed on the touch screen 530, and stores the menu information edited by the user in the storage unit 550.

The user interface provided for executing the user-defined edit mode may include a function of editing the shape, size, display position, color, transparency, etc. of the graphic object, and editing an execution event corresponding to the graphic object.

Alternatively, when the touched time is shorter than the preset reference time and a graphic object is displayed at the touched position, the controller 540 executes the event corresponding to the graphic object displayed at the touched position; when the touched time is shorter than the reference time and no graphic object is displayed at the touched position, the controller 540 determines that an abnormal touch operation has occurred due to a user's mistake and does not execute any event.

Here, the execution of the event corresponding to the displayed graphic object may be performed by the menu execution module 541 which may be implemented as a software program.

The storage unit 550 may be composed of various nonvolatile media and stores the menu configuration information set through the user-defined menu creation mode or the user-defined menu editing mode.

As shown in FIG. 6, when the touched time is longer than the reference time, the apparatus having the user-defined menu configuration function according to an embodiment of the present invention activates the user-defined menu creation mode, which allows new menus to be created, if no graphic object such as an icon is displayed at the touched position, and activates the user-defined menu editing mode, which allows the displayed graphic object and the execution event corresponding to it to be edited, if a graphic object is displayed at the touched position. The user can therefore easily configure a desired menu at a desired position.

The apparatus having the user-defined menu configuration function shown in FIG. 6 can be applied to various information processing devices that include a touch screen or a touch pad, such as a portable terminal, a mobile communication terminal, a computer monitor, a tablet PC, or an electronic blackboard.

Although the present invention has been described with reference to the embodiments above, those skilled in the art will understand that it can be variously modified and changed without departing from the spirit and scope of the invention as set forth in the claims below.

FIG. 1 is a flowchart illustrating a user-defined menu configuration method according to an embodiment of the present invention.

FIG. 2 is a conceptual diagram for describing a process of obtaining a coordinate value corresponding to a touch operation.

FIG. 3 is a flowchart illustrating in detail the menu creation process in the method of configuring a user-defined menu shown in FIG. 1.

FIG. 4 illustrates an example of a user interface screen displayed on a touch screen in the process of configuring a user-defined menu according to an embodiment of the present invention.

FIG. 5 illustrates another example of a user interface screen displayed on a touch screen in the process of configuring a user-defined menu according to an embodiment of the present invention.

FIG. 6 is a block diagram illustrating the configuration of an apparatus having a user-defined menu configuration function according to an embodiment of the present invention.

Explanation of symbols on the main parts of the drawings

510: display unit 520: input detection unit

530: touch screen 540: control unit

550: storage unit

Claims (18)

  1. A method of configuring a user-defined menu, the method comprising:
    obtaining a coordinate value of a touched position;
    activating a user-defined menu creation mode when a graphic object is not displayed at the touched position;
    selecting a graphic object through the activated user-defined menu creation mode;
    setting an execution event corresponding to the selected graphic object; and
    displaying the graphic object corresponding to a predetermined menu set through the activated user-defined menu creation mode.
  2. (Deleted)
  3. The method of claim 1, wherein the selecting of the graphic object comprises:
    editing at least one of a shape, a size, a color, and a transparency of the selected graphic object.
  4. The method of claim 1, wherein the activating of the user-defined menu creation mode when the graphic object is not displayed at the touched position further comprises:
    setting a position at which the selected graphic object is to be displayed.
  5. The method of claim 1, wherein the activating of the user-defined menu creation mode when the graphic object is not displayed at the touched position further comprises:
    obtaining a touched time; and executing the user-defined menu creation mode when the touched time is longer than a preset reference time and a graphic object is not displayed at the touched position.
  6. The method of claim 1, further comprising:
    activating a user-defined menu editing mode for editing a menu corresponding to the displayed graphic object when a graphic object is displayed at the touched position.
  7. The method of claim 6, wherein the activating of the user-defined menu editing mode comprises:
    changing an execution event corresponding to the displayed graphic object.
  8. (Deleted)
  9. The method of claim 1, wherein the obtaining of the coordinate value of the touched position comprises:
    when there are a plurality of coordinate values of the touched position, obtaining a coordinate value corresponding to the center of gravity of the figure formed by the plurality of coordinate values as the touched coordinate value.
  10. An apparatus having a user-defined menu configuration function, the apparatus comprising:
    a touch screen which obtains a coordinate value of a touched position and provides the obtained coordinate value; and
    a controller which, when a graphic object is not displayed at the touched position, activates a user-defined menu creation mode, matches a predetermined graphic object selected through the activated user-defined menu creation mode with an execution event corresponding to the selected graphic object, and displays, through the touch screen, the graphic object corresponding to a predetermined menu set through the activated user-defined menu creation mode.
  11. The apparatus of claim 10, wherein the controller
    determines whether a graphic object is displayed at the touched position based on the coordinate value provided from the touch screen.
  12. (Deleted)
  13. The apparatus of claim 10, wherein the controller
    provides a user interface for editing at least one of a shape, a size, a color, and a transparency of the selected graphic object when the user-defined menu creation mode is activated.
  14. The apparatus of claim 10, wherein the controller
    provides a user interface for selecting a position in the display area of the touch screen where the selected graphic object is to be displayed when the user-defined menu creation mode is activated.
  15. The apparatus of claim 10, wherein the controller
    obtains a touched time from the touch screen and activates the user-defined menu creation mode when the touched time is longer than a preset reference time and a graphic object is not displayed at the touched position.
  16. The apparatus of claim 10, wherein the controller
    activates a user-defined menu editing mode when a graphic object is displayed at the touched position.
  17. The apparatus of claim 16, wherein the controller
    provides a user interface for changing an execution event corresponding to the graphic object displayed at the touched position when the user-defined menu editing mode is activated.
  18. The apparatus of claim 10, further comprising
    a storage unit which stores menu information generated through the activated user-defined menu creation mode.
KR20080091504A 2008-09-18 2008-09-18 Method For Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu KR100981877B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20080091504A KR100981877B1 (en) 2008-09-18 2008-09-18 Method For Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20080091504A KR100981877B1 (en) 2008-09-18 2008-09-18 Method For Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu

Publications (2)

Publication Number Publication Date
KR20100032560A KR20100032560A (en) 2010-03-26
KR100981877B1 true KR100981877B1 (en) 2010-09-10

Family

ID=42181721

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20080091504A KR100981877B1 (en) 2008-09-18 2008-09-18 Method For Configurating User-defined Menu And Apparatus For Having Function For Configuration Of User-defined Menu

Country Status (1)

Country Link
KR (1) KR100981877B1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101129103B1 (en) * 2010-09-15 2012-03-23 양다몬 Input apparatus using center of gravity of figure
KR101948645B1 (en) * 2011-07-11 2019-02-18 삼성전자 주식회사 Method and apparatus for controlling contents using graphic object
US9547417B2 (en) 2013-03-29 2017-01-17 Deere & Company Retracting shortcut bars, status shortcuts and edit run page sets
CN103345348A (en) * 2013-06-13 2013-10-09 深圳Tcl新技术有限公司 Method and device for controlling display menu
CN104936040A (en) * 2015-05-19 2015-09-23 乐视致新电子科技(天津)有限公司 Page display method and page display device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070006477A (en) * 2005-07-08 2007-01-11 삼성전자주식회사 Method for arranging contents menu variably and display device using the same
KR20070010415A (en) * 2005-07-18 2007-01-24 삼성전자주식회사 Method and apparatus for providing touch screen based user interface,and electronic devices including the same
KR20080061713A (en) * 2006-12-28 2008-07-03 삼성전자주식회사 Method for providing contents list by touch on touch screen and multimedia device thereof

Also Published As

Publication number Publication date
KR20100032560A (en) 2010-03-26


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20130530

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20140617

Year of fee payment: 5

LAPS Lapse due to unpaid annual fee