US20140351755A1 - Facilitating display of a menu and selection of a menu item via a touch screen interface - Google Patents
Facilitating display of a menu and selection of a menu item via a touch screen interface
Info
- Publication number
- US20140351755A1 (U.S. patent application Ser. No. 14/324,582)
- Authority
- US
- United States
- Prior art keywords
- operating tool
- menu
- pointing direction
- display panel
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Abstract
Description
- This application is a continuation of and claims the benefit under 35 U.S.C. § 120 of U.S. patent application Ser. No. 12/821,399, titled "FACILITATING DISPLAY OF A MENU AND SELECTION OF A MENU ITEM VIA A TOUCH SCREEN INTERFACE," filed on Jun. 23, 2010, which claims the benefit under 35 U.S.C. § 119 of Japanese Patent Application JP 2009-158153, filed on Jul. 2, 2009. The entire contents of these applications are hereby incorporated by reference in their entireties.
- 1. Field of the Invention
- The present invention relates to an information processing apparatus and an information processing method.
- 2. Description of the Related Art
- There is known an information processing apparatus that detects movement of an operating tool such as a finger of a user on a display panel and performs interaction with the user. For example, when the user selects a desired GUI (Graphical User Interface) object on the display panel, the information processing apparatus displays an operation menu containing one or more operation items selectable for the object and asks the user to select a desired operation item. Then, when the operating tool is in touch with a display area of the object for a predetermined time period, the information processing apparatus recognizes input of the menu starting operation and displays the operation menu.
- [Patent Document 1] Japanese Patent Application Laid-Open No. 2005-352619
- [Patent Document 2] Japanese Patent Application Laid-Open No. 2007-80291
- [Patent Document 3] Japanese Patent Application Laid-Open No. 2007-226571
- However, the user has to keep the operating tool in a touch state for a predetermined time period until the operation menu is displayed. If the predetermined time period is shortened, it becomes difficult to discriminate between the general object selecting operation (tapping) and the menu starting operation. Besides, after the operation menu is displayed, the user needs to perform a complicated operation to select the desired operation item, so it cannot necessarily be said that the user enjoys a favorable operation environment.
- Particularly, in an information processing apparatus such as a portable information processing terminal with restricted I/O interface specifications, the user needs to perform a large number of operations, including selecting an option menu after selecting the object. In addition, when the operation menu is displayed after selection of a plurality of objects and a desired operation item is then selected, the user likewise needs to perform a complicated operation.
- In light of the foregoing, it is desirable to provide an information processing apparatus and an information processing method capable of facilitating display of an operation menu for an object and selection of an operation item.
- According to a first embodiment of the present invention, there is provided an information processing apparatus including an operating tool detector for detecting a touch state of an operating tool with a display panel, a display controller for, when change of a pointing direction of the operating tool is detected by the operating tool detector on an object selected on the display panel, controlling the display panel to display near the object an operation menu containing one or more operation items selectable for the object, and an operation item selecting portion for, when the operation menu is displayed, selecting one of the operation items in accordance with the change in the pointing direction of the operating tool detected by the operating tool detector from the operation menu.
- The operation item selecting portion may select the operation item on an extension of the pointing direction of the operating tool from the operation menu.
- The operation item selecting portion may select the operation item placed in a direction defined by a change amount obtained by multiplying a change amount of the pointing direction of the operating tool by a coefficient a, the coefficient a being larger than 1, from the operation menu. The display controller may control the display panel to rotate the operation menu by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool by a coefficient (1−a) and then, display the operation menu.
- When the operation menu is displayed and simultaneously a non-touch state of the operating tool is detected by the operating tool detector, the operation item selected by the operation item selecting portion may be executed.
- When the operation menu is displayed and simultaneously, movement of a predetermined distance or more of the operating tool in touch with the display panel is detected by the operating tool detector, the display controller may control the display panel to stop display of the operation menu.
- According to a second embodiment of the present invention, there is provided an information processing method, including the steps of when change in a pointing direction of an operating tool is detected on an object selected on a display panel, controlling the display panel to display, near the object, an operation menu containing one or more operation items selectable for the object, and when the operation menu is displayed, selecting one of the operation items in accordance with the change of the pointing direction of the operating tool from the operation menu.
- According to the embodiments of the present invention described above, it is possible to provide an information processing apparatus and an information processing method capable of facilitating display of an operation menu for an object and selection of an operation item.
- FIG. 1 is a view illustrating an overview of an information processing apparatus according to an embodiment of the present invention;
- FIG. 2 is a block diagram illustrating a principal functional structure of the information processing apparatus according to the embodiment of the present invention;
- FIG. 3A is a view illustrating a detection result of an operating tool and a position of the operating tool on a display panel;
- FIG. 3B is a view illustrating a detection result of an operating tool and a position of the operating tool on a display panel;
- FIG. 3C is a view illustrating a detection result of an operating tool and a position of the operating tool on a display panel;
- FIG. 4 is a flowchart illustrating an information processing method according to an embodiment of the present invention;
- FIG. 5 is a view illustrating a processing example by the information processing method (display of an operation menu);
- FIG. 6 is a view illustrating a processing example by the information processing method (execution of an operation item);
- FIG. 7 is a view illustrating a processing example by the information processing method (stop of display of the operation menu);
- FIG. 8 is a view illustrating a processing example by the information processing method (selection of an operation item);
- FIG. 9 is a view illustrating a first modification for selection of an operation item;
- FIG. 10 is a view illustrating a second modification for selection of an operation item; and
- FIG. 11 is a view illustrating another display example of the operation menu.
- Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- FIG. 1 is a view illustrating an overview of an information processing apparatus 100 according to an embodiment of the present invention.
- The information processing apparatus 100 according to the embodiment of the present invention detects a touch state of an operating tool M such as a user finger with a display panel 101. The information processing apparatus 100 is a personal computer, a PDA, a portable music player or the like. In the following description, the information processing apparatus 100 has a built-in type display panel 101; however, the information processing apparatus 100 may be connected to a display panel 101 via communication means.
- When change in a pointing direction of the operating tool M is detected on an object O selected on the display panel 101, the information processing apparatus 100 controls the display panel 101 in such a manner that an operation menu OM containing one or more operation items I selectable for the object O is displayed near the object O. Then, while the operation menu OM is displayed, the information processing apparatus 100 selects an operation item I from the operation menu OM in accordance with change in the pointing direction of the operating tool M.
- For example, in FIG. 1, while the object O is selected, the pointing direction of the operating tool M is changed on the object O and the operation menu OM is displayed. In accordance with the change of the pointing direction of the operating tool M on the operation menu OM, an operation item I (for example, operation item I6) is selected. Here, the pointing direction of the operating tool is the direction pointed out by a finger, for example, when the operating tool is a finger. In FIG. 1 and other figures, the selected operation item I and object O are illustrated hatched.
- Accordingly, a user can input a menu starting operation by changing the pointing direction of the operating tool M, and does not need to keep the touch state of the operating tool M for a predetermined time period. Besides, the user can select a desired operation item I by changing the pointing direction of the operating tool M, and does not need to perform a complicated operation in selecting the operation item I. Further, the user can efficiently perform the operations of selecting an object O, displaying an operation menu OM and selecting an operation item I as a series of operations.
- FIG. 2 is a block diagram illustrating a principal functional structure of the information processing apparatus 100 according to the embodiment of the present invention. The information processing apparatus 100 has the display panel 101, an operating tool detector 107, a storage 109, a display controller 111 and a controller 113.
- The display panel 101 functions as a touch sensor 103 and a display unit 105. The touch sensor 103 detects a touch state of the operating tool M. The touch sensor 103 is an optical sensor, an electric capacitance sensor, a pressure sensor or any other sensor. In the following description, it is assumed that the touch sensor 103 detects the touch state of the operating tool M based on a light-receiving state of the display panel 101.
- The display unit 105 displays processing results of applications, contents and an object O under control of the display controller 111, and particularly displays an operation menu OM containing one or more operation items I selectable for the object O selected on the display panel 101. Here, the object O is a GUI object such as an icon, a button or a thumbnail.
- The operating tool detector 107 detects the touch state of the operating tool M with the display panel 101 by the touch sensor 103. The operating tool detector 107 uses the light-receiving state of the display panel 101 as a basis to detect presence or absence of touch of the operating tool M with the display panel 101, a touch position, a touch area and a pointing direction. The method of detecting the operating tool M by the touch sensor 103 will be described later.
- The storage 109 stores information processing programs, application programs, object O data and the like, and particularly stores data of the operation menu OM. The controller 113 controls the overall operation of the information processing apparatus 100 by controlling each portion through execution of an information processing program.
- Particularly, the controller 113 has a function as an operation item selecting portion to select an operation item I from the operation menu OM in accordance with change in the pointing direction of the operating tool M detected by the operating tool detector 107 while the operation menu OM is displayed.
- Here, change in the pointing direction of the operating tool M can be discriminated from conventional button down, button up, click, double click, touch, drag, drop, flick and the like, and is detected without interference with these operations.
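- The division of roles described above can be pictured with a minimal, purely illustrative sketch; the class and method names below are hypothetical and are not taken from the disclosure, and the concrete sensor and drawing calls are left abstract.

```python
class OperatingToolDetector:
    """Role of the operating tool detector 107: turn raw sensor output into
    touch presence, touch position, touch area and pointing direction."""
    def __init__(self, touch_sensor):
        self.touch_sensor = touch_sensor            # plays the part of the touch sensor 103

    def detect(self):
        sensor_image = self.touch_sensor.read()     # hypothetical sensor API
        return analyze_sensor_image(sensor_image)   # see the detection sketch further below


class DisplayController:
    """Role of the display controller 111: have the display unit 105 draw the
    selected object O and, when asked, the operation menu OM near it."""
    def __init__(self, display_unit):
        self.display_unit = display_unit

    def show_menu(self, obj, items):
        self.display_unit.draw_menu(near=obj, items=items)   # hypothetical call

    def hide_menu(self):
        self.display_unit.clear_menu()                        # hypothetical call


class Controller:
    """Role of the controller 113, including the operation item selecting portion."""
    def __init__(self, detector, display_controller, storage):
        self.detector = detector
        self.display_controller = display_controller
        self.storage = storage                      # holds operation menu OM data
```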
- On the display panel 101, RGB pixels and light-receiving sensors (both not shown) are arranged in a matrix. The light-receiving sensors function as the touch sensor 103 to receive light emitted from the display panel 101 and reflected by the operating tool M, and detect the touch state of the operating tool M based on the light-receiving state. The operating tool detector 107 performs digital processing on an output result of the touch sensor 103 thereby to generate a sensor image S.
- The operating tool detector 107 calculates a luminance value expressing the light-receiving state corresponding to each pixel based on the sensor image S, and processes the luminance value into a binary value with use of a predetermined threshold. In the binary processing, the luminance value of each pixel is classified into the first or second category, and each area of the sensor image S is classified into the first or second area A1 or A2 corresponding to the respective categories. The first and second areas A1 and A2 correspond to the large and small luminance areas, which are specified as a touch area and a non-touch area of the operating tool M, respectively.
- The operating tool detector 107 uses the existence of the first area A1 as a basis to detect presence or absence of touch of the operating tool M with the display panel 101. Besides, the operating tool detector 107 calculates the center-of-gravity position and the area of the first area A1 thereby to detect the touch position and the touch area of the operating tool M.
- Particularly, the operating tool detector 107 specifies a long axis direction D of the first area A1 thereby to detect the pointing direction of the operating tool M. The pointing direction of the operating tool M is defined as the direction pointing toward the upper part of the display panel 101 along the long axis direction D of the first area A1. The controller 113 calculates an angle difference between the pointing directions of the operating tool M before and after rotation thereby to calculate the rotational angle of the operating tool M.
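- As a rough sketch of the processing just described (threshold, binary classification, center of gravity, long axis), the following assumes the sensor image is available as a NumPy array of luminance values; the threshold value and the elongation test are assumptions, since the specification only requires that a long axis direction be identifiable.

```python
import numpy as np

def analyze_sensor_image(sensor_image, threshold=128, min_elongation=1.2):
    """Classify pixels into the first area A1 (touch) and second area A2 (non-touch),
    then derive touch position, touch area and pointing direction from A1.

    Returns (touching, (cx, cy), area, angle_deg); angle_deg is None when the
    long axis direction D cannot be specified (detection error)."""
    a1 = sensor_image >= threshold                 # first area A1: large luminance
    area = int(a1.sum())
    if area == 0:
        return False, None, 0, None                # no touch of the operating tool M

    ys, xs = np.nonzero(a1)
    cx, cy = xs.mean(), ys.mean()                  # center of gravity -> touch position

    # Second central moments of A1; the principal axis of this pixel distribution
    # is the long axis direction D of the (roughly elliptic) touch area.
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    spread = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    major, minor = (mu20 + mu02 + spread) / 2, (mu20 + mu02 - spread) / 2
    if np.sqrt(major / max(minor, 1e-9)) < min_elongation:
        return True, (cx, cy), area, None          # nearly circular, as in FIG. 3C

    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    dx, dy = np.cos(theta), np.sin(theta)
    if dy > 0:                                     # resolve the 180-degree ambiguity so the
        dx, dy = -dx, -dy                          # direction points toward the upper part of the panel
    return True, (cx, cy), area, float(np.degrees(np.arctan2(-dy, dx)))
```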
- Hereinafter, the method for detecting the pointing direction of the operating tool M will be described with reference to FIGS. 3A to 3C. FIGS. 3A to 3C are views illustrating detection results of the operating tool M and positions of the operating tool M on the display panel 101.
- In FIG. 3A, the touch area A1 of a finger end as the operating tool M is grasped as an elliptic area A1 on the sensor image S. In this case, the operating tool detector 107 specifies the long axis direction D of the elliptic area A1 and detects, as the pointing direction of the operating tool M, the direction pointing toward the upper part of the display panel 101 along the specified long axis direction D. Usually, the touch area A1 of the finger end with the display panel 101 is grasped as an elliptic area A1 in which the pointing direction of the finger is the long axis direction D.
- In FIG. 3B, the finger end is rotated from the state of FIG. 3A, and a touch area A1′ of the rotated finger end is grasped as an elliptic area A1′ on the sensor image S. In this case, the operating tool detector 107 specifies the long axis direction D of the elliptic area A1′ and detects the direction pointing toward the upper part of the display panel 101 along the specified long axis direction D as the pointing direction of the operating tool M after rotation. Then, the controller 113 uses the angle difference between the pointing directions of the operating tool M before and after rotation as a basis to calculate the rotational angle of the operating tool M.
- On the other hand, in FIG. 3C, a touch area A1″ of the finger end is grasped as an approximately circular area A1″ on the sensor image S. In this case, the operating tool detector 107 may not be able to specify the long axis direction D of the touch area A1″, and the controller 113 regards this as a detection error.
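- The rotational angle used by the controller 113 is simply the difference between the pointing directions before and after rotation; a small, self-contained helper (the wrap-around handling is an implementation choice, not something the specification prescribes):

```python
def rotation_amount(prev_angle_deg, new_angle_deg):
    """Signed change of the pointing direction, wrapped into (-180, 180]."""
    delta = (new_angle_deg - prev_angle_deg) % 360.0
    return delta - 360.0 if delta > 180.0 else delta

print(rotation_amount(100.0, 55.0))   # -45.0 -> a 45-degree clockwise rotation
```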
- FIG. 4 is a flowchart illustrating an information processing method according to an embodiment of the present invention. FIGS. 5 to 8 are views illustrating processing examples of the information processing method.
- As illustrated in FIG. 4, the operating tool detector 107 detects the touch state of the operating tool M for each detection frame (S101). The controller 113 determines whether or not the touch state of the operating tool M has changed from that in the last detected frame (S103). When the determination result is positive, the controller 113 performs the processing of step S105 and later; when the determination result is negative, it goes back to the processing of step S101.
- In step S105, the controller 113 determines whether or not the operation menu OM is already displayed. When the operation menu OM is not yet displayed, the controller 113 performs the processing of step S107 and later; when it is already displayed, the controller 113 performs the processing of step S115 and later.
- In step S107, the controller 113 determines whether or not an object O for display of the operation menu is selected on the display panel 101. The object O is selected on the display panel 101 by tapping of the operating tool M or the like. When the determination result is positive, the controller 113 determines whether or not the operating tool M has not moved a predetermined distance or more on the selected object O and has rotated a predetermined angle or more (S109, S111).
- Here, the moving distance of the operating tool M is the change amount of the touch position of the operating tool M moving in touch with the display panel 101, and the rotational amount of the operating tool M means the change amount of the pointing direction of the operating tool M. Besides, movement of a predetermined distance or more means, for example, movement of the operating tool M to the outside of the display area of the selected object O, and rotation by a predetermined angle or more means, for example, rotation by such a rotational angle that input of the menu starting operation is not misidentified.
- Then, when the determination result is positive, the controller 113 displays the operation menu OM (S113) and goes back to the processing of step S101. On the other hand, when the determination result in step S107, S109 or S111 is negative, the controller 113 goes back to the processing of step S101.
- Here, the operation menu OM contains one or more operation items I selectable for the selected object O, which are displayed near the object O. In the operation menu OM, the selected operation item I is brought into focus and, for example, displayed enlarged. Besides, the operation menu OM is displayed in consideration of the position of the operating tool M estimated from its pointing direction, so as to prevent the displayed operation items I from being covered by the operating tool M, which would reduce their visibility.
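- The flow of FIG. 4 can be summarized as a per-frame state machine. The sketch below is a simplified reading of steps S101 to S125; the threshold values and the printed actions are placeholders, and real code would call into the display controller 111 instead.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

MENU_START_ROTATION_DEG = 20.0   # "predetermined angle" for the menu starting operation (assumed)
DISMISS_DISTANCE_PX = 80.0       # "predetermined distance" for cancelling/dismissing (assumed)
SELECT_ROTATION_DEG = 2.0        # rotation resolution while the menu is shown (assumed)

@dataclass(frozen=True)
class Touch:
    touching: bool
    position: Tuple[float, float] = (0.0, 0.0)
    angle: Optional[float] = None          # pointing direction in degrees, None if unknown

@dataclass
class MenuStateMachine:
    object_selected: bool = False
    menu_displayed: bool = False
    anchor_position: Tuple[float, float] = (0.0, 0.0)   # where the object was selected
    anchor_angle: float = 90.0                          # pointing direction at selection time
    last_touch: Optional[Touch] = None

    def on_frame(self, touch: Touch):                   # S101: one detection frame
        if touch == self.last_touch:                    # S103: touch state unchanged
            return
        if not self.menu_displayed:                     # S105: menu not displayed yet
            if self.object_selected and touch.touching and touch.angle is not None:       # S107
                moved = _distance(touch.position, self.anchor_position)
                turned = abs(touch.angle - self.anchor_angle)
                if moved < DISMISS_DISTANCE_PX and turned >= MENU_START_ROTATION_DEG:     # S109, S111
                    self.menu_displayed = True          # S113: display the operation menu OM
        else:
            if not touch.touching:                      # S115: operating tool released
                print("execute the focused operation item")                               # S117
                self.menu_displayed = False
            elif _distance(touch.position, self.anchor_position) >= DISMISS_DISTANCE_PX:  # S119
                self.menu_displayed = False             # S121: stop displaying the menu
            elif (self.last_touch and self.last_touch.angle is not None
                  and touch.angle is not None
                  and abs(touch.angle - self.last_touch.angle) >= SELECT_ROTATION_DEG):   # S123
                print("move the focus toward", touch.angle)                               # S125
        self.last_touch = touch

def _distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
```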
- In FIG. 5, labels of music albums 1 to 7 are displayed on the display panel 101 as objects O, and the label of album 3 is selected by the operating tool M. As illustrated in FIG. 5, when the object O is selected and the operating tool M is rotated a predetermined angle or more, input of the menu starting operation is recognized and the operation menu OM is displayed. In FIG. 5, the operation menu OM containing the operation items I for selecting from songs 1 to 7 stored in album 3 is displayed, and the operation item I4 on the extension of the pointing direction of the operating tool M is selectable on the operation menu OM. Here, when the operating tool M is moved a predetermined distance or more, the input of the menu starting operation is not recognized, which prevents operation mistakes. With this structure, the user can input the menu starting operation easily by changing the pointing direction of the operating tool M.
- When it is determined in step S105 that the operation menu OM is displayed, the controller 113 determines in step S115 whether or not the operating tool M has changed into the non-touch state. When the determination result is positive, the controller 113 executes the operation item I selected on the operation menu OM (S117) and goes back to step S101. When the determination result is negative, the processing of step S119 is performed.
- As shown in FIG. 6, the operation item I selected on the operation menu OM is executed by changing the operating tool M into the non-touch state while the operation menu OM is displayed. In FIG. 6, once the operating tool M is changed into the non-touch state, replay of the operation item I4 selected on the operation menu OM is started. Thus, the user can easily instruct execution of the operation item I by bringing the operating tool M into the non-touch state.
- When the determination result in step S115 is negative, that is, the operating tool M has not changed into the non-touch state, the controller 113 determines in step S119 whether or not the operating tool M has moved a predetermined distance or more. Here, movement of a predetermined distance or more means, for example, movement to the outside of the display area of the operation menu OM. When the determination result is positive, display of the operation menu OM is stopped (S121) and the processing goes back to step S101; when the determination result is negative, the processing of step S123 is performed.
- As illustrated in FIG. 7, display of the operation menu OM is stopped when the operating tool M has moved a predetermined distance or more while the operation menu OM is displayed. In FIG. 7, once the operating tool M is moved outside the display area of the object O of the selected album 3, display of the operation menu OM is stopped. With this structure, the user can easily stop display of the operation menu OM by moving the operating tool M a predetermined distance or more.
- When the determination result in step S119 is negative, that is, the operating tool M has not moved a predetermined distance or more, the controller 113 determines whether or not the operating tool M has rotated a predetermined angle or more (S123). Here, rotation by a predetermined angle or more means, for example, rotation of the pointing direction of the operating tool M by the detection accuracy or more. When the determination result is positive, selection of the operation item I is performed (S125), the display of the operation menu OM and the like is updated, and the controller 113 returns to the processing of step S101.
- As illustrated in FIG. 8, the operation item I is selected in accordance with the change in the pointing direction of the operating tool M while the operation menu OM is displayed, and on the operation menu OM the focus is moved to the selected operation item I. In FIG. 8, when the pointing direction of the operating tool M is rotated 90° clockwise, the focus is moved from the operation item I4 to the operation item I6 placed in the 90° clockwise direction. Here, out of the operation items I contained in the operation menu OM, the operation item I placed on the extension of the pointing direction of the operating tool M on the display panel 101 is selected. Thus, the user can easily select the desired operation item I in accordance with the change in the pointing direction of the operating tool M. Here, when display of the operation menu OM is started, predetermined operation items I may be set to be selectable, or all of them may be set to be unselectable.
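- Selecting "the operation item placed on the extension of the pointing direction" amounts to picking the item whose direction from the menu center is closest to the detected angle; a minimal sketch, with item angles assumed for illustration:

```python
def item_on_extension(pointing_angle_deg, item_angles_deg):
    """Index of the operation item closest to the extension of the pointing direction."""
    def angular_gap(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(range(len(item_angles_deg)),
               key=lambda i: angular_gap(pointing_angle_deg, item_angles_deg[i]))

item_angles = [0, 30, 60, 90, 120, 150, 180]   # seven items fanned out every 30 degrees (assumed)
print(item_on_extension(95.0, item_angles))    # -> 3, the item sitting near 90 degrees
```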
- FIGS. 9 and 10 are views illustrating first and second modifications for selecting an operation item I. FIG. 11 is a view illustrating another display example of the operation menu OM.
- In the above-mentioned embodiment, a case in which the operation item I is selected on the extension of the pointing direction of the operating tool M is described. In this case, the change range of the pointing direction of the operating tool M is restricted, and the operability in selection is sometimes reduced; for example, there is some difficulty in rotating the pointing direction of the finger by 180°.
- Therefore, as illustrated in FIG. 9, in the first modification, what is selected is not the operation item I on the extension of the pointing direction of the operating tool M but the operation item I placed in the direction defined by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by a coefficient a (1<a).
- For example, it is assumed that the pointing direction of the operating tool M is rotated 45° clockwise while the operation item I4 on the extension of the pointing direction of the operating tool M is selected. In this case, when the coefficient a=2 is given, the focus is moved to the operation item I6 placed in the 90° clockwise direction (=45°×2) from the selected operation item I4 on the operation menu OM, and the operation item I6 is selected. With this structure, the user can select a desired operation item I more easily than when the operation item I on the extension of the pointing direction of the operating tool M is selected.
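- In other words, the focus moves by a times the change amount rather than by the change amount itself; with items spaced at a fixed pitch, the number of items the focus advances can be sketched as follows (the pitch and coefficient are the assumed values from the example):

```python
def focus_shift(delta_deg, a=2.0, item_pitch_deg=45.0):
    """First modification: items skipped for a finger rotation of delta_deg when the
    change amount is multiplied by the coefficient a (45-degree item pitch assumed)."""
    return round(a * delta_deg / item_pitch_deg)

print(focus_shift(45.0))   # 2 -> the focus jumps from operation item I4 to I6
```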
- In the first modification, the operation item I placed in the direction defined by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by the coefficient a is selected. In this case, as the operation item is selected by highly valuing the change of the pointing direction of the operating tool M, the operability in selection is improved as compared with selecting the operation item I on the extension of the pointing direction. However, as the position of the selected operation item I does not match the pointing direction of the operating tool M (for example, in FIG. 9, not the operation item I6 but the operation item I5 is positioned on the extension of the pointing direction of the operating tool M), it is difficult to select the operation item I by an intuitive operation.
- Therefore, as illustrated in FIG. 10, in the second modification, the operation item I placed in the direction defined by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by the coefficient a (1<a) is selected, and the operation menu OM is rotated by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by a coefficient (1−a).
- For example, it is assumed that the pointing direction of the operating tool M is rotated 45° clockwise while the operation item I4 is selected on the extension of the pointing direction of the operating tool M. In this case, when the coefficient a=2 is given, the focus is moved to the operation item I6 placed in the 90° clockwise direction (=45°×2) from the operation item I4 on the operation menu OM, and, as illustrated by the arrow MD, the operation menu OM is rotated clockwise by −45° (=45°×(1−2)), that is, counterclockwise by 45°. Then, the operation item I6 lies on the extension of the pointing direction of the operating tool M and is selected. With this structure, the user can select the desired operation item I by a more intuitive operation than in the case where only the operation item I placed in the direction defined by the change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by the coefficient a is selected.
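- The effect of the second modification is that the amplified focus motion and the counter-rotation of the menu cancel exactly on screen, so the newly focused item lands back on the finger's extension. A small sketch of the bookkeeping, using the same assumed a=2 and the 45° example:

```python
def apply_second_modification(focus_angle, menu_rotation, delta_deg, a=2.0):
    """Advance the focus by a*delta within the menu and rotate the menu by (1-a)*delta."""
    focus_angle += a * delta_deg              # menu-relative direction of the focused item
    menu_rotation += (1.0 - a) * delta_deg    # how the whole operation menu OM is turned
    return focus_angle, menu_rotation

focus, rotation = apply_second_modification(focus_angle=0.0, menu_rotation=0.0, delta_deg=45.0)
print(focus, rotation, focus + rotation)
# 90.0 -45.0 45.0 : the focused item (I6) ends up at 45 degrees on screen,
# exactly the new pointing direction of the operating tool M.
```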
- FIG. 11 illustrates a display example of the operation menu OM containing one or more operation items I selectable for a plurality of objects O. In FIG. 11, for example, statistical processing such as maximum, minimum, average and sum is performed on data contained in a plurality of cells that form a spreadsheet (object O).
- In this case, the user performs dragging of the operating tool M on the display panel 101 to select a plurality of cells containing data for the statistical processing, and then rotates the operating tool M on the cell at the dragging end by a predetermined angle or more. The controller 113 then recognizes input of the menu starting operation and displays an approximately sector-shaped operation menu OM around the cell at the end. Following the menu starting operation, the user can select the operation item I in accordance with the change in the pointing direction of the operating tool M (for example, in FIG. 11, the operation item I3 is selected).
- Here, even if the pointing direction of the operating tool M changes slightly during dragging of the operating tool M, the controller 113 does not recognize input of the menu starting operation as long as the change in the pointing direction is less than the predetermined angle.
display panel 101, the information processing apparatus 100 controls the display panel 101 (display unit 105) to display, near the object O, the operation menu OM containing one or more operation items I selectable for the object O. The information processing apparatus 100 then selects an operation item I on the operation menu OM in accordance with the change in the pointing direction of the operating tool M while the operation menu OM is displayed. - With this structure, the user can input a menu starting operation by changing the pointing direction of the operating tool M, and does not need to keep the operating tool M in touch with the panel for a predetermined time period. In addition, the user can select a desired operation item I simply by changing the pointing direction of the operating tool M, without any complicated operation for selecting it. Further, the user can perform the selection of the object O, the display of the operation menu OM and the selection of the operation item I efficiently as one continuous series of operations.
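As a rough illustration of this flow (dragging, rotating by a predetermined angle to start the menu, then following further changes in the pointing direction), the following is a minimal event-driven sketch. The class name, the 15° threshold and the print placeholders are assumptions made for illustration only, not details taken from the embodiment.

```python
MENU_START_ANGLE_DEG = 15.0  # assumed "predetermined angle" for the menu starting operation

class MenuController:
    """Recognizes the menu starting operation after a drag and follows direction changes."""

    def __init__(self):
        self.menu_open = False
        self.reference_direction_deg = None  # pointing direction at the end of dragging

    def on_drag_end(self, pointing_direction_deg: float) -> None:
        # Remember the pointing direction on the cell where the dragging ended.
        self.reference_direction_deg = pointing_direction_deg

    def on_direction_change(self, pointing_direction_deg: float) -> None:
        if self.reference_direction_deg is None:
            return
        delta = pointing_direction_deg - self.reference_direction_deg
        if not self.menu_open:
            # Slight changes are ignored; only a rotation of the predetermined
            # angle or more is recognized as input of the menu starting operation.
            if abs(delta) >= MENU_START_ANGLE_DEG:
                self.menu_open = True
                print("display sector-shaped operation menu OM around the cell")
        else:
            # While the menu is displayed, the focused operation item follows
            # the change in the pointing direction (see the earlier sketches).
            print(f"move focus according to a {delta:+.1f} deg direction change")

controller = MenuController()
controller.on_drag_end(0.0)
controller.on_direction_change(5.0)    # ignored, below the threshold
controller.on_direction_change(30.0)   # menu starting operation recognized
controller.on_direction_change(75.0)   # selection follows the direction change
```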
- It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
- For example, in the above-described embodiment, the touch state of the operating tool M is detected using an optical touch sensor. However, the sensor may instead be a capacitive sensor, a pressure sensor or any other touch sensor.
- In addition, in the above-described embodiment, the pointing direction of the operating tool M is detected based on the touch state of the operating tool M. However, the pointing direction of the operating tool M may instead be detected from both the touch state and the proximity state of the operating tool M. In this case, for example, the sensor image output by the touch/proximity sensor is quantized into three levels to distinguish the touch area, the proximity area, and the area that is neither touched nor in proximity. Then, the direction from the center of gravity of the proximity area toward the center of gravity of the touch area is detected as the pointing direction of the operating tool M.
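A minimal sketch of this touch/proximity variant is given below, assuming a two-dimensional sensor image and illustrative threshold values; the quantization levels, function names and the requirement that both areas be non-empty are assumptions, not values from the embodiment.

```python
import math
import numpy as np

TOUCH_THRESHOLD = 200      # assumed sensor level marking the touch area
PROXIMITY_THRESHOLD = 80   # assumed sensor level marking the proximity area

def pointing_direction_deg(sensor_image: np.ndarray) -> float:
    """Direction from the proximity-area centroid toward the touch-area centroid."""
    touch = sensor_image >= TOUCH_THRESHOLD
    proximity = (sensor_image >= PROXIMITY_THRESHOLD) & ~touch  # three-level split

    # Centers of gravity (assumes both areas are non-empty); indices are (row, col).
    touch_cy, touch_cx = np.argwhere(touch).mean(axis=0)
    prox_cy, prox_cx = np.argwhere(proximity).mean(axis=0)

    # Angle of the vector from the proximity centroid to the touch centroid,
    # expressed in the image coordinate system.
    return math.degrees(math.atan2(touch_cy - prox_cy, touch_cx - prox_cx))
```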
- The present application contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2009-158153 filed in the Japan Patent Office on Jul. 2, 2009, the entire content of which is hereby incorporated by reference.
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/324,582 US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-158153 | 2009-07-02 | ||
JP2009158153A JP5402322B2 (en) | 2009-07-02 | 2009-07-02 | Information processing apparatus and information processing method |
US12/821,399 US8806336B2 (en) | 2009-07-02 | 2010-06-23 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
US14/324,582 US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/821,399 Continuation US8806336B2 (en) | 2009-07-02 | 2010-06-23 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140351755A1 true US20140351755A1 (en) | 2014-11-27 |
Family
ID=42797427
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/821,399 Expired - Fee Related US8806336B2 (en) | 2009-07-02 | 2010-06-23 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
US14/324,582 Abandoned US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/821,399 Expired - Fee Related US8806336B2 (en) | 2009-07-02 | 2010-06-23 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Country Status (3)
Country | Link |
---|---|
US (2) | US8806336B2 (en) |
EP (1) | EP2270642B1 (en) |
JP (1) | JP5402322B2 (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4865063B2 (en) * | 2010-06-30 | 2012-02-01 | 株式会社東芝 | Information processing apparatus, information processing method, and program |
WO2012164988A1 (en) * | 2011-05-30 | 2012-12-06 | 本田技研工業株式会社 | Input device |
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US10001898B1 (en) | 2011-07-12 | 2018-06-19 | Domo, Inc. | Automated provisioning of relational information for a summary data visualization |
US9202297B1 (en) * | 2011-07-12 | 2015-12-01 | Domo, Inc. | Dynamic expansion of data visualizations |
US20130061122A1 (en) * | 2011-09-07 | 2013-03-07 | Microsoft Corporation | Multi-cell selection using touch input |
DE112012003889T5 (en) * | 2011-10-11 | 2014-06-05 | International Business Machines Corporation | Method, apparatus and computer program for pointing to an object |
JP5816516B2 (en) * | 2011-10-24 | 2015-11-18 | 京セラ株式会社 | Electronic device, control program, and process execution method |
US9645733B2 (en) * | 2011-12-06 | 2017-05-09 | Google Inc. | Mechanism for switching between document viewing windows |
JP2013161205A (en) * | 2012-02-03 | 2013-08-19 | Sony Corp | Information processing device, information processing method and program |
CN104246677A (en) * | 2012-04-20 | 2014-12-24 | 索尼公司 | Information processing device, information processing method, and program |
WO2013187872A1 (en) * | 2012-06-11 | 2013-12-19 | Intel Corporation | Techniques for select-hold-release electronic device navigation menu system |
KR102150289B1 (en) * | 2012-08-30 | 2020-09-01 | 삼성전자주식회사 | User interface appratus in a user terminal and method therefor |
US9696879B2 (en) | 2012-09-07 | 2017-07-04 | Google Inc. | Tab scrubbing using navigation gestures |
JP5844707B2 (en) | 2012-09-28 | 2016-01-20 | 富士フイルム株式会社 | Image display control device, image display device, program, and image display method |
JP6033061B2 (en) * | 2012-11-30 | 2016-11-30 | Kddi株式会社 | Input device and program |
KR102095039B1 (en) * | 2013-06-04 | 2020-03-30 | 삼성전자 주식회사 | Apparatus and method for receiving touch input in an apparatus providing a touch interface |
JP2015118598A (en) * | 2013-12-19 | 2015-06-25 | 船井電機株式会社 | Selection device |
CN103905640A (en) * | 2014-03-12 | 2014-07-02 | 惠州Tcl移动通信有限公司 | Mobile terminal and false-dialing preventing method thereof |
US20150346998A1 (en) * | 2014-05-30 | 2015-12-03 | Qualcomm Incorporated | Rapid text cursor placement using finger orientation |
JP5729513B1 (en) * | 2014-06-06 | 2015-06-03 | 株式会社セガゲームス | Program and terminal device |
JP6260469B2 (en) * | 2014-06-25 | 2018-01-17 | 富士通株式会社 | Data sequence selection method, data sequence selection program, and portable terminal |
US10310675B2 (en) * | 2014-08-25 | 2019-06-04 | Canon Kabushiki Kaisha | User interface apparatus and control method |
US20160062636A1 (en) * | 2014-09-02 | 2016-03-03 | Lg Electronics Inc. | Mobile terminal and control method thereof |
JP6063437B2 (en) * | 2014-12-19 | 2017-01-18 | 株式会社スクウェア・エニックス | Program, computer apparatus, computer processing method, and system |
CN105786371A (en) * | 2014-12-25 | 2016-07-20 | 富泰华工业(深圳)有限公司 | Electronic device and interface display control method |
JP6501533B2 (en) * | 2015-01-26 | 2019-04-17 | 株式会社コロプラ | Interface program for icon selection |
JP6153990B2 (en) * | 2015-11-19 | 2017-06-28 | 富士フイルム株式会社 | Image display control program, image display control device, and image display control method |
US10191611B2 (en) * | 2015-11-27 | 2019-01-29 | GitSuite LLC | Graphical user interface defined cursor displacement tool |
JP2017060861A (en) * | 2016-12-16 | 2017-03-30 | 株式会社スクウェア・エニックス | Program, computer device, computer processing method, and system |
US11328223B2 (en) | 2019-07-22 | 2022-05-10 | Panasonic Intellectual Property Corporation Of America | Information processing method and information processing system |
CN111309206A (en) * | 2020-02-04 | 2020-06-19 | 北京达佳互联信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
Family Cites Families (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02242323A (en) * | 1989-03-15 | 1990-09-26 | Matsushita Electric Ind Co Ltd | Method and device for selecting pop-up menu |
WO1996009579A1 (en) * | 1994-09-22 | 1996-03-28 | Izak Van Cruyningen | Popup menus with directional gestures |
US5689667A (en) * | 1995-06-06 | 1997-11-18 | Silicon Graphics, Inc. | Methods and system of controlling menus with radial and linear portions |
JPH09204426A (en) * | 1996-01-25 | 1997-08-05 | Sharp Corp | Method for editing data |
JPH10198517A (en) * | 1997-01-10 | 1998-07-31 | Tokyo Noukou Univ | Method for controlling display content of display device |
JP3744116B2 (en) * | 1997-04-08 | 2006-02-08 | 松下電器産業株式会社 | Display input device |
JP2000267808A (en) * | 1999-03-16 | 2000-09-29 | Oki Electric Ind Co Ltd | Input method linking touch panel input device with display device |
JP3358583B2 (en) * | 1999-03-30 | 2002-12-24 | 松下電器産業株式会社 | Car navigation device and its selection screen display method |
US6346938B1 (en) * | 1999-04-27 | 2002-02-12 | Harris Corporation | Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model |
JP2001265523A (en) * | 2000-03-21 | 2001-09-28 | Sony Corp | Information input/output system, information input/ output method and program storage medium |
DE50014953D1 (en) * | 2000-08-24 | 2008-03-20 | Siemens Vdo Automotive Ag | Method and navigation device for querying destination information and navigating in a map view |
JP4007549B2 (en) | 2002-06-28 | 2007-11-14 | クラリオン株式会社 | Peripheral information presentation device and method in navigation, and presentation program |
JP4181372B2 (en) * | 2002-09-27 | 2008-11-12 | 富士フイルム株式会社 | Display device, image information management terminal, image information management system, and image display method |
JP4105609B2 (en) | 2003-01-06 | 2008-06-25 | アルパイン株式会社 | 3D display method for navigation and navigation apparatus |
JP2004356819A (en) * | 2003-05-28 | 2004-12-16 | Sharp Corp | Remote control apparatus |
JP4526307B2 (en) | 2004-06-09 | 2010-08-18 | 富士通テン株式会社 | Function selection device |
US7376510B1 (en) * | 2004-11-05 | 2008-05-20 | Navteq North America, Llc | Map display for a navigation system |
JP2006139615A (en) * | 2004-11-12 | 2006-06-01 | Access Co Ltd | Display device, menu display program, and tab display program |
JP4738019B2 (en) * | 2005-02-23 | 2011-08-03 | 任天堂株式会社 | GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD, AND GAME SYSTEM |
JP4207941B2 (en) | 2005-09-02 | 2009-01-14 | パナソニック株式会社 | Image display device and image generation device |
US7644372B2 (en) * | 2006-01-27 | 2010-01-05 | Microsoft Corporation | Area frequency radial menus |
JP4922625B2 (en) | 2006-02-23 | 2012-04-25 | 京セラミタ株式会社 | Electronic device device by touch panel input, program for input operation of touch panel |
US7552402B2 (en) * | 2006-06-22 | 2009-06-23 | Microsoft Corporation | Interface orientation using shadows |
US8316324B2 (en) * | 2006-09-05 | 2012-11-20 | Navisense | Method and apparatus for touchless control of a device |
KR100774927B1 (en) * | 2006-09-27 | 2007-11-09 | 엘지전자 주식회사 | Mobile communication terminal, menu and item selection method using the same |
JP4171509B2 (en) | 2006-11-27 | 2008-10-22 | 富士通株式会社 | Input processing method and input processing apparatus for implementing the same |
WO2008078603A1 (en) * | 2006-12-22 | 2008-07-03 | Panasonic Corporation | User interface device |
US20080294332A1 (en) * | 2007-01-17 | 2008-11-27 | 3-D-V-U Israel (2000) Ltd. | Method for Image Based Navigation Route Corridor For 3D View on Mobile Platforms for Mobile Users |
US20100306703A1 (en) * | 2007-04-26 | 2010-12-02 | Nokia Corporation | Method, device, module, apparatus, and computer program for an input interface |
US8074178B2 (en) * | 2007-06-12 | 2011-12-06 | Microsoft Corporation | Visual feedback display |
JP5060856B2 (en) | 2007-07-17 | 2012-10-31 | パイオニア株式会社 | Navigation system and navigation method |
GB2451274B (en) * | 2007-07-26 | 2013-03-13 | Displaylink Uk Ltd | A system comprising a touchscreen and one or more conventional display devices |
KR100837283B1 (en) * | 2007-09-10 | 2008-06-11 | (주)익스트라스탠다드 | Mobile device equipped with touch screen |
US20090101415A1 (en) * | 2007-10-19 | 2009-04-23 | Nokia Corporation | Apparatus, method, computer program and user interface for enabling user input |
KR20090047828A (en) * | 2007-11-08 | 2009-05-13 | 삼성전자주식회사 | The method for displaying content and the electronic apparatus thereof |
JP2009140368A (en) | 2007-12-07 | 2009-06-25 | Sony Corp | Input device, display device, input method, display method, and program |
US8032297B2 (en) * | 2008-05-08 | 2011-10-04 | Gabriel Jakobson | Method and system for displaying navigation information on an electronic map |
US20100080491A1 (en) * | 2008-09-26 | 2010-04-01 | Nintendo Co., Ltd. | Storage medium storing image processing program for implementing controlled image display according to input coordinate, information processing device and method for image processing |
US8284170B2 (en) * | 2008-09-30 | 2012-10-09 | Apple Inc. | Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor |
JP5367339B2 (en) * | 2008-10-28 | 2013-12-11 | シャープ株式会社 | MENU DISPLAY DEVICE, MENU DISPLAY DEVICE CONTROL METHOD, AND MENU DISPLAY PROGRAM |
US8627233B2 (en) * | 2009-03-27 | 2014-01-07 | International Business Machines Corporation | Radial menu with overshoot, fade away, and undo capabilities |
US8549432B2 (en) * | 2009-05-29 | 2013-10-01 | Apple Inc. | Radial menus |
JP5792424B2 (en) * | 2009-07-03 | 2015-10-14 | ソニー株式会社 | MAP INFORMATION DISPLAY DEVICE, MAP INFORMATION DISPLAY METHOD, AND PROGRAM |
- 2009-07-02 JP JP2009158153A patent/JP5402322B2/en not_active Expired - Fee Related
- 2010-06-23 US US12/821,399 patent/US8806336B2/en not_active Expired - Fee Related
- 2010-06-23 EP EP20100167047 patent/EP2270642B1/en not_active Not-in-force
- 2014-07-07 US US14/324,582 patent/US20140351755A1/en not_active Abandoned
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7034881B1 (en) * | 1997-10-31 | 2006-04-25 | Fuji Photo Film Co., Ltd. | Camera provided with touchscreen |
US20070250794A1 (en) * | 2001-05-18 | 2007-10-25 | Miura Britt S | Multiple menus for use with a graphical user interface |
US20070097096A1 (en) * | 2006-03-25 | 2007-05-03 | Outland Research, Llc | Bimodal user interface paradigm for touch screen devices |
US8472665B2 (en) * | 2007-05-04 | 2013-06-25 | Qualcomm Incorporated | Camera-based user input for compact devices |
US20090327963A1 (en) * | 2008-06-28 | 2009-12-31 | Mouilleseaux Jean-Pierre M | Radial menu selection |
US20100192101A1 (en) * | 2009-01-29 | 2010-07-29 | International Business Machines Corporation | Displaying radial menus in a graphics container |
US20140320434A1 (en) * | 2013-04-26 | 2014-10-30 | Lothar Pantel | Method for gesture control |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9251722B2 (en) | 2009-07-03 | 2016-02-02 | Sony Corporation | Map information display device, map information display method and program |
US10755604B2 (en) | 2009-07-03 | 2020-08-25 | Sony Corporation | Map information display device, map information display method and program |
US20160231904A1 (en) * | 2013-10-22 | 2016-08-11 | Nokia Technologies Oy | Apparatus and method for providing for receipt of indirect touch input to a touch screen display |
US11360652B2 (en) * | 2013-10-22 | 2022-06-14 | Nokia Technologies Oy | Apparatus and method for providing for receipt of indirect touch input to a touch screen display |
US11054916B2 (en) * | 2019-03-19 | 2021-07-06 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method of the display apparatus |
US11586298B2 (en) | 2019-03-19 | 2023-02-21 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method of the display apparatus |
Also Published As
Publication number | Publication date |
---|---|
EP2270642B1 (en) | 2015-05-13 |
US8806336B2 (en) | 2014-08-12 |
JP5402322B2 (en) | 2014-01-29 |
US20110004821A1 (en) | 2011-01-06 |
JP2011013980A (en) | 2011-01-20 |
EP2270642A2 (en) | 2011-01-05 |
CN101943989A (en) | 2011-01-12 |
EP2270642A3 (en) | 2013-12-11 |
Similar Documents
Publication | Title |
---|---|
US8806336B2 (en) | Facilitating display of a menu and selection of a menu item via a touch screen interface | |
EP3232315B1 (en) | Device and method for providing a user interface | |
US7924271B2 (en) | Detecting gestures on multi-event sensitive devices | |
US20190155420A1 (en) | Information processing apparatus, information processing method, and program | |
CN107066137B (en) | Apparatus and method for providing user interface | |
EP2372516B1 (en) | Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display | |
US9411496B2 (en) | Method for operating user interface and recording medium for storing program applying the same | |
US10108331B2 (en) | Method, apparatus and computer readable medium for window management on extending screens | |
EP2192477B1 (en) | Portable terminal with touch screen and method for displaying tags in the portable terminal | |
US9696871B2 (en) | Method and portable terminal for moving icon | |
US20090102809A1 (en) | Coordinate Detecting Device and Operation Method Using a Touch Panel | |
US20100245242A1 (en) | Electronic device and method for operating screen | |
US20100073303A1 (en) | Method of operating a user interface | |
EP2699986B1 (en) | Touch screen selection | |
US20080165255A1 (en) | Gestures for devices having one or more touch sensitive surfaces | |
US20090128504A1 (en) | Touch screen peripheral device | |
EP2474896A2 (en) | Information processing apparatus, information processing method, and computer program | |
US20120210273A1 (en) | Method and apparatus for providing graphic user interface in mobile terminal | |
US20130300710A1 (en) | Method and electronic device thereof for processing function corresponding to multi-touch | |
JP6123879B2 (en) | Display device, display method, program thereof, and terminal device | |
WO2022012664A1 (en) | Background program control method and apparatus, and electronic device | |
JP2010211323A (en) | Input system, portable terminal, input/output device, input system control program, computer-readable recording medium and method for controlling input system | |
US20140085340A1 (en) | Method and electronic device for manipulating scale or rotation of graphic on display | |
KR101418922B1 (en) | Method and apparatus for controlling plural objects | |
CN101943989B (en) | Information processor and information processing method |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYAZAWA, YUSUKE;HOMMA, FUMINORI;NARITA, TOMOYA;AND OTHERS;SIGNING DATES FROM 20140714 TO 20140727;REEL/FRAME:035389/0267 |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |