US10372296B2 - Information processing apparatus, computer-readable recording medium, and information processing method


Info

Publication number
US10372296B2
Authority
US
United States
Prior art keywords
menu item
menu
selecting
pointing
drag
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/276,954
Other versions
US20170255346A1 (en)
Inventor
Koki Hatada
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Assigned to FUJITSU LIMITED. Assignor: HATADA, KOKI
Publication of US20170255346A1
Application granted
Publication of US10372296B2

Classifications

    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0486: Drag-and-drop
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • Patent Document 1 Japanese Laid-open Patent Publication No. 2001-184458
  • Patent Document 2 Japanese Laid-open Patent Publication No. 08-180138
  • Patent Document 3 Japanese Laid-open Patent Publication No. 05-061596
  • the context menu may hide the already displayed objects.
  • an information processing apparatus includes a processor.
  • the processor executes a process.
  • the process includes first selecting a menu item in a context menu.
  • the process includes second selecting, regarding the menu item selected at the first selecting, an end position that is a result of a movement due to a predetermined operation.
  • the process includes displaying, at the end position selected at the second selecting, a content indicated by the menu item.
  • FIG. 1 is a functional block diagram illustrating the configuration of an information processing apparatus according to a first embodiment
  • FIG. 2 is a schematic diagram illustrating an example of a menu operation process according to the first embodiment
  • FIG. 3 is a flowchart illustrating an example of the flow of the menu operation process according to the first embodiment
  • FIG. 4 is a schematic diagram illustrating another example of the menu operation process according to the first embodiment
  • FIG. 5 is a flowchart illustrating another example of the flow of the menu operation process according to the first embodiment
  • FIG. 6 is a functional block diagram illustrating the configuration of an information processing apparatus according to a second embodiment
  • FIG. 7 is a schematic diagram illustrating an example of a menu operation process according to the second embodiment.
  • FIG. 8 is a flowchart illustrating an example of the flow of the menu operation process according to the second embodiment
  • FIG. 9 is a schematic diagram illustrating another example of the menu operation process according to the second embodiment.
  • FIG. 10 is a flowchart illustrating another example of the flow of the menu operation process according to the second embodiment.
  • FIG. 11 is a functional block diagram illustrating the configuration of an information processing apparatus according to a third embodiment
  • FIG. 12 is a schematic diagram illustrating an example of a menu operation process according to a third embodiment
  • FIG. 13 is a flowchart illustrating an example of the flow of the menu operation process according to the third embodiment
  • FIG. 14 is a schematic diagram illustrating another example of the menu operation process according to the third embodiment.
  • FIG. 15 is a flowchart illustrating another example of the flow of the menu operation process according to the third embodiment.
  • FIG. 16 is a schematic diagram illustrating an example of a menu operation process according to a fourth embodiment
  • FIG. 17 is a flowchart illustrating an example of the flow of the menu operation process according to the fourth embodiment.
  • FIG. 18 is a schematic diagram illustrating an example of a display when a context menu includes therein both menu items available to be dragged and menu items unavailable to be dragged;
  • FIG. 19 is a schematic diagram illustrating another example of a display when a context menu includes therein both menu items available to be dragged and menu items unavailable to be dragged;
  • FIG. 20 is a schematic diagram illustrating an example of a computer that executes an information processing program.
  • FIG. 1 is a functional block diagram illustrating the configuration of an information processing apparatus according to a first embodiment.
  • After displaying a context menu, an information processing apparatus 1 according to the first embodiment detects a predetermined operation on the menu item selected in the context menu, thereby simultaneously selecting both the command indicated by that menu item and the position at which the subject command is executed, and then executes the command at the selected position.
  • the predetermined operation mentioned here is, for example, a drag operation or a pointing operation.
  • in the following, a description will be given assuming that the predetermined operation is the drag operation.
  • the information processing apparatus 1 includes a display device 10 and a control unit 20 .
  • the display device 10 may be, for example, a device with a display screen, such as a monitor of a personal computer (PC), a monitor of a television, a projector, a head mounted display (HMD), a smartphone, a tablet, or the like.
  • the control unit 20 includes an internal memory that stores therein control data and programs in which various kinds of procedures are prescribed, whereby the control unit 20 executes various kinds of processes. Furthermore, the control unit 20 corresponds to, for example, an electronic circuit in an integrated circuit, such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or the like. Alternatively, the control unit 20 corresponds to an electronic circuit, such as a central processing unit (CPU), a micro processing unit (MPU), or the like.
  • control unit 20 includes an operation acquisition unit 21 , a pointing operation acquisition unit 22 , a context menu display unit 23 , a menu item selection operation decision unit 24 , a target selection operation decision unit 25 , a menu item display unit 26 , and a command calculation unit 27 .
  • the operation acquisition unit 21 acquires an operation performed by a user on the screen.
  • the user performs an operation on the screen using a mouse or a touch panel.
  • the operation mentioned here includes, for example, a click of a mouse, a movement of a mouse pointer, a touchdown, a touch-up, and a movement of a touch position.
  • the pointing operation acquisition unit 22 acquires, from the operation acquired by the operation acquisition unit 21, a pointing operation on the screen. For example, if the operation acquired by the operation acquisition unit 21 is a click, the pointing operation acquisition unit 22 acquires the position at which the click operation is performed and then treats this as a pointing operation at the acquired position. Furthermore, if the operation acquired by the operation acquisition unit 21 is a touch-up, the pointing operation acquisition unit 22 first acquires the position at which the touchdown was performed and then treats the operation as a pointing operation at the acquired position. Then, when the pointing operation acquisition unit 22 acquires a pointing operation, if an object is displayed at the pointing position, the pointing operation acquisition unit 22 sets the displayed object as the target for the pointing. If no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets the canvas of the screen as the target for the pointing.
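The hit-testing behaviour described above (an object displayed at the pointing position becomes the target; otherwise the canvas does) can be sketched as follows. This is a minimal illustration, not the patent's implementation; all names (`DisplayedObject`, `resolve_pointing_target`) are assumptions.

```python
from dataclasses import dataclass

@dataclass
class DisplayedObject:
    name: str
    x: float          # top-left corner
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        """Return True when the pointing position falls inside this object."""
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def resolve_pointing_target(objects, px, py):
    """Return the topmost object under (px, py), or the string 'canvas'."""
    # Iterate from front (last drawn) to back so that overlapping objects
    # resolve to the one drawn on top.
    for obj in reversed(objects):
        if obj.contains(px, py):
            return obj
    return "canvas"
```

For example, `resolve_pointing_target([], 10, 10)` yields `"canvas"`, matching the fallback case in the text.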
  • the context menu display unit 23 displays, in the vicinity of the pointing position, a context menu used to select an operation related to the target for the pointing.
  • menu items are vertically aligned in a straight line; however, they may also be horizontally aligned in a straight line, or arranged in a circle.
  • the context menu display unit 23 decides the menu items included in the context menu. For example, if the target for the pointing is a canvas, the context menu display unit 23 displays the context menu including the menu items of an “addition of a label” and a “change in a color”. If the target for the pointing is a label, the context menu display unit 23 displays the context menu including the menu items of a “deletion of a label” and a “change in a color”.
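The decision above amounts to a lookup keyed on the type of the pointing target. A minimal sketch, with illustrative names and the item lists taken from the examples in the text:

```python
# Hypothetical mapping from pointing-target type to context-menu items.
MENU_ITEMS_BY_TARGET = {
    "canvas": ["addition of a label", "change in a color"],
    "label":  ["deletion of a label", "change in a color"],
}

def build_context_menu(target_type: str) -> list:
    """Return the menu items to display for the given pointing target type."""
    return MENU_ITEMS_BY_TARGET.get(target_type, [])
```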
  • the menu item selection operation decision unit 24 detects a first operation that is used to select a menu item in the context menu.
  • the first operation is, for example, an operation to select a menu item and to start a drag.
  • the target selection operation decision unit 25 detects a second operation that is used to select the target in which the command indicated by the menu item is executed.
  • the target mentioned here is, for example, the position in which the command is executed.
  • the second operation is, for example, an operation to end the drag.
  • the menu item display unit 26 displays, in accordance with the drag operation, the menu item selected by the first operation. Namely, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item selected by the first operation.
  • the command calculation unit 27 calculates the command to be executed. For example, it is assumed that the menu item of the “addition of a label” that is included in the context menu and that is displayed when a canvas on a screen is targeted for the pointing is selected by an operation to start a drag. Then, if the drag of the menu item of the selected “addition of a label” is started and the drag is ended at an arbitrary position, the command calculation unit 27 calculates a command to add the label at the position in which the drag has been ended. Furthermore, if the second operation is not performed, the command calculation unit 27 may calculate a command to add a label at the pointing position of the pointing that is performed in order to display the context menu acquired by the pointing operation acquisition unit 22 .
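The behaviour of the command calculation unit 27 described above can be sketched as a small function: the command indicated by the dragged menu item executes at the drag-end position when one exists, and otherwise falls back to the original pointing position at which the context menu was displayed. The function name and the tuple representation are assumptions for illustration.

```python
def calculate_command(menu_item, pointing_pos, drag_end_pos=None):
    """Return (command, position) at which the command is to be executed.

    If no second operation (drag end) was performed, the command falls
    back to the pointing position used to display the context menu.
    """
    position = drag_end_pos if drag_end_pos is not None else pointing_pos
    return (menu_item, position)
```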
  • FIG. 2 is a schematic diagram illustrating an example of a menu operation process according to the first embodiment. As illustrated in FIG. 2, it is assumed that the area indicated by a reference numeral a1 is an area in which objects are densely packed.
  • the pointing operation acquisition unit 22 sets the canvas on the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays a context menu near the pointing position.
  • the menu items of an “addition of a label” and an “addition of an image” are included.
  • the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”.
  • the first operation is an operation to start a drag.
  • the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
  • the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is executed.
  • the second operation is an operation to end the drag.
  • the command calculation unit 27 calculates the command to perform the “addition of a label” at the position at which the drag a3 has been ended. Consequently, as indicated by a reference numeral a4, the label is added at the position at which the drag a3 has been ended.
  • the information processing apparatus 1 can change the position in which the menu item of the “addition of a label” selected from among the menu items included in the context menu is executed. Even if objects displayed on a screen are densely packed, the information processing apparatus 1 can improve the efficiency of the operation to display a context menu. Furthermore, even if a context menu is displayed when the objects displayed on the screen are densely packed, the information processing apparatus 1 can improve the visibility of the objects displayed on the screen.
  • FIG. 3 is a flowchart illustrating an example of the flow of the menu operation process according to the first embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S 11 ). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S 11 ), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects the pointing to the screen.
  • the context menu display unit 23 displays the context menu near the detected position (Step S 12 ).
  • the menu item selection operation decision unit 24 decides whether an operation to start a drag of the menu item is detected (Step S 13 ). If the menu item selection operation decision unit 24 decides that an operation to start a drag of the menu item is not detected (No at Step S 13 ), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 14 ). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item that is targeted for the drag.
  • the target selection operation decision unit 25 decides whether an operation to end the drag of the menu item is detected (Step S 15 ). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S 15 ), the target selection operation decision unit 25 repeats the decision process until the target selection operation decision unit 25 detects the subject operation.
  • the menu item display unit 26 prevents the subject menu item from being displayed (Step S 16 ).
  • the command calculation unit 27 calculates the command indicated by the menu item (Step S 17 ). Then, the menu operation process is ended.
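The flow of Steps S 11 to S 17 above can be sketched as a small state machine driven by input events. The class and method names are assumptions; the real apparatus reacts to mouse or touch events and also updates the display, which is omitted here.

```python
class MenuOperation:
    """State machine for the drag-based menu operation (Steps S11-S17)."""

    def __init__(self):
        self.state = "WAIT_POINTING"   # waiting at Step S11
        self.selected_item = None
        self.result = None

    def pointing(self, pos):
        if self.state == "WAIT_POINTING":
            # S12: display the context menu near pos (display itself omitted)
            self.state = "WAIT_DRAG_START"

    def drag_start(self, item):
        if self.state == "WAIT_DRAG_START":
            # S13-S14: the other menu items are hidden; item follows the drag
            self.selected_item = item
            self.state = "DRAGGING"

    def drag_end(self, pos):
        if self.state == "DRAGGING":
            # S15-S17: hide the item and calculate its command at pos
            self.result = (self.selected_item, pos)
            self.state = "DONE"
```

A typical run would be `pointing` → `drag_start` → `drag_end`, after which `result` holds the command and its execution position.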
  • a movement of a menu item in a context menu has been described as a drag operation.
  • the movement is not limited to this and, instead of the drag operation, a movement of a menu item included in a context menu may also be performed by a pointing operation. Therefore, a description will now be given of a menu operation process performed when a movement of a menu item in a context menu is a pointing operation.
  • FIG. 4 is a schematic diagram illustrating another example of the menu operation process according to the first embodiment.
  • the area indicated by the reference numeral a1 is an area in which the objects are densely packed.
  • the pointing operation acquisition unit 22 sets the canvas on the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays a context menu near the pointing position.
  • the menu items of the “addition of a label” and the “addition of an image” are included.
  • the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”.
  • the first operation is an operation to perform a pointing on the menu item.
  • the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is executed.
  • the second operation is an operation to perform a pointing on the screen.
  • the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the pointing a4′ has been performed. Therefore, as indicated by a reference numeral a5′, the label is added at the position in which the pointing a4′ has been performed.
  • the information processing apparatus 1 can change the position in which the menu item of the “addition of a label” selected from among the menu items included in the context menu is executed. Even if objects displayed on a screen are densely packed, the information processing apparatus 1 can improve the efficiency of the operation to display a context menu. Furthermore, even if a context menu is displayed when the objects displayed on the screen are densely packed, the information processing apparatus 1 can improve the visibility of the objects displayed on the screen.
  • FIG. 5 is a flowchart illustrating another example of the flow of the menu operation process according to the first embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S 21 ). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S 21 ), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects the pointing to the screen.
  • the context menu display unit 23 displays the context menu near the detected position (Step S 22 ).
  • the menu item selection operation decision unit 24 decides whether a pointing to the menu item is detected (Step S 23 ). If the menu item selection operation decision unit 24 decides that a pointing to the menu item is not detected (No at Step S 23 ), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 24 ).
  • the target selection operation decision unit 25 decides whether a pointing to a screen is detected (Step S 25 ). If the target selection operation decision unit 25 decides that a pointing to a screen is not detected (No at Step S 25 ), the target selection operation decision unit 25 repeats the decision process until the subject operation is detected.
  • the menu item display unit 26 prevents the subject menu item from being displayed (Step S 26 ).
  • the command calculation unit 27 calculates the command indicated by the menu item (Step S 27 ). Then, the menu operation process is ended.
  • the information processing apparatus 1 selects a menu item included in a context menu and selects, regarding the selected menu item, an end position that is the result of a movement due to a predetermined operation.
  • the information processing apparatus 1 displays the content indicated by the menu item at the selected end position.
  • the information processing apparatus 1 displays the content indicated by the menu item at the end position that is the result of the movement due to a drag operation of the selected menu item.
  • the information processing apparatus 1 is not limited to this and may also be used in a case in which, on the basis of a drag trajectory of the selected menu item, selection of the menu item is canceled.
  • FIG. 6 is a functional block diagram illustrating the configuration of an information processing apparatus according to a second embodiment.
  • the components having the same configuration as those in the information processing apparatus 1 illustrated in FIG. 1 are assigned the same reference numerals; therefore, overlapped descriptions of the configuration and the operation thereof will be omitted.
  • the second embodiment is different from the first embodiment in that a continuous operation decision unit 31 is added.
  • the continuous operation decision unit 31 decides, on the basis of a movement trajectory of a menu item, whether to continue to select the menu item.
  • if the movement trajectory of the menu item returns to the vicinity of the starting position, the continuous operation decision unit 31 decides not to continue to select the menu item. Namely, the continuous operation decision unit 31 decides to stop selecting the menu item. In other words, after a drag of a menu item is started and the menu item is moved away from the context menu by a predetermined distance or more, if the menu item is dragged such that it returns to the vicinity of the starting position, the continuous operation decision unit 31 decides that an operation to stop selecting the menu item has been performed.
  • being in the vicinity of the starting position means, for example, that the distance between the starting position and the end position is within a predetermined threshold.
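The close-approach decision described here can be sketched as follows: selection is treated as cancelled only when the item first moved at least a predetermined distance away from the drag start and then ended within a threshold of it. The parameter names and default values (`min_travel`, `threshold`) are illustrative assumptions, not values from the patent.

```python
import math

def is_cancel_by_return(trajectory, min_travel=100.0, threshold=20.0):
    """Decide whether a drag trajectory signals 'stop selecting'.

    trajectory: list of (x, y) points, beginning at the drag start.
    The item must first move at least min_travel away from the start,
    then end within threshold of the start.
    """
    if len(trajectory) < 2:
        return False
    start = trajectory[0]
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    moved_away = any(dist(p, start) >= min_travel for p in trajectory)
    returned = dist(trajectory[-1], start) <= threshold
    return moved_away and returned
```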
  • if the movement trajectory of the menu item is a trajectory that satisfies a predetermined condition after the start of the movement, the continuous operation decision unit 31 decides not to continue to select the menu item. Namely, the continuous operation decision unit 31 decides to stop selecting the menu item.
  • the trajectory that satisfies a predetermined condition is, as an example, a trajectory indicating a scratch gesture.
  • the scratch gesture is a gesture that repeatedly travels back and forth across a certain center line in a short time.
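One plausible way to detect such a scratch gesture is to count how many times the drag trajectory crosses a center line within a short time window. The crossing count, the time window, and the choice of the mean x-coordinate as the center line are all illustrative assumptions:

```python
def is_scratch_gesture(points, times, min_crossings=4, max_duration=1.0):
    """Decide whether a trajectory looks like a scratch gesture.

    points: list of (x, y) sample positions; times: matching timestamps
    in seconds. Detection requires at least min_crossings sign changes
    around the mean x-coordinate within max_duration seconds.
    """
    if len(points) < 3:
        return False
    center_x = sum(p[0] for p in points) / len(points)
    crossings = 0
    for prev, cur in zip(points, points[1:]):
        # A sign change of (x - center_x) means the path crossed the line.
        if (prev[0] - center_x) * (cur[0] - center_x) < 0:
            crossings += 1
    duration = times[-1] - times[0]
    return crossings >= min_crossings and duration <= max_duration
```

A rapid left-right-left motion produces many crossings in well under a second, while an ordinary one-way drag crosses the mean at most once.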
  • FIG. 7 is a schematic diagram illustrating an example of a menu operation process according to the second embodiment.
  • the pointing operation acquisition unit 22 sets a canvas of a screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays the context menu near the pointing position.
  • the menu items of the “addition of a label” and the “addition of an image” are included.
  • the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”.
  • the first operation is an operation to start a drag.
  • the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
  • the continuous operation decision unit 31 decides that an operation to stop selecting the menu item of the “addition of a label” is performed.
  • the menu item display unit 26 prevents the menu item of the “addition of a label” from being displayed.
  • the information processing apparatus 1 can decide whether the drag operation is selection of a menu item or a stop of the selection of the menu item.
  • FIG. 8 is a flowchart illustrating an example of the flow of the menu operation process according to the second embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S 31 ). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S 31 ), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects a pointing to the screen.
  • the context menu display unit 23 displays a context menu near the detected position (Step S 32 ).
  • the menu item selection operation decision unit 24 decides whether an operation to start a drag of a menu item is detected (Step S 33 ). If the menu item selection operation decision unit 24 decides that an operation to start a drag of a menu item is not detected (No at Step S 33 ), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 34 ). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
  • the continuous operation decision unit 31 decides whether a close approach of the menu item to the starting position of the drag is detected (Step S 35 ).
  • a close approach to the starting position means that the menu item is in the vicinity of the starting position, that is, for example, that the distance between the starting position and the end position is within a predetermined threshold.
  • If the continuous operation decision unit 31 decides that a close approach of the menu item to the starting position of the drag is detected (Yes at Step S 35 ), the menu item display unit 26 prevents the subject menu item from being displayed (Step S 36 ). Then, the menu operation process is ended.
  • the target selection operation decision unit 25 decides whether an operation to end the drag of the menu item is detected (Step S 37 ). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S 37 ), the target selection operation decision unit 25 repeats the decision process until the target selection operation decision unit 25 detects the subject operation.
  • the menu item display unit 26 prevents the subject menu item from being displayed (Step S 38 ).
  • the command calculation unit 27 calculates the command indicated by the menu item (Step S 39 ). Then, the menu operation process is ended.
  • FIG. 9 is a schematic diagram illustrating another example of the menu operation process according to the second embodiment.
  • the pointing operation acquisition unit 22 sets a canvas of a screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays a context menu near the pointing position.
  • the menu items of the “addition of a label” and the “addition of an image” are included.
  • the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”.
  • the first operation is an operation to start a drag.
  • the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
  • a trajectory of the drag operation of the menu item is a scratch gesture b4.
  • the continuous operation decision unit 31 decides that an operation to stop selecting the menu item of the “addition of a label” has been performed.
  • the menu item display unit 26 prevents the menu item of the “addition of a label” from being displayed.
  • the information processing apparatus 1 can decide whether the drag operation is selection of a menu item or a stop of selection of the menu item.
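One plausible way to recognize the scratch gesture b 4 is to count reversals of the drag direction along one axis; the x-axis choice and the minimum reversal count below are assumptions, since the patent does not define the gesture geometrically:

```python
def is_scratch_gesture(trajectory, min_reversals=3):
    """Decide whether a drag trajectory looks like a scratch (rub-out)
    gesture by counting reversals of the horizontal movement direction.
    `trajectory` is a list of (x, y) samples; the reversal count and the
    use of the x axis alone are illustrative assumptions."""
    reversals = 0
    prev_dx = 0
    for (x0, _), (x1, _) in zip(trajectory, trajectory[1:]):
        dx = x1 - x0
        if dx != 0:
            if prev_dx != 0 and (dx > 0) != (prev_dx > 0):
                reversals += 1  # horizontal direction flipped
            prev_dx = dx
    return reversals >= min_reversals
```

A back-and-forth trajectory such as (0, 0) → (30, 0) → (0, 0) → (30, 0) → (0, 0) reverses direction three times and is treated as a scratch, while a monotone drag is not.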
  • FIG. 10 is a flowchart illustrating another example of the flow of the menu operation process according to the second embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S 41 ). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S 41 ), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects a pointing to the screen.
  • the context menu display unit 23 displays a context menu near the detected position (Step S 42 ).
  • the menu item selection operation decision unit 24 decides whether an operation to start a drag of the menu item is detected (Step S 43 ). If the menu item selection operation decision unit 24 decides that an operation to start a drag of the menu item is not detected (No at Step S 43 ), the menu item selection operation decision unit 24 repeats the decision process until the subject operation is detected.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 44 ). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
  • the continuous operation decision unit 31 decides whether a scratch gesture is detected (Step S 45 ). Namely, the continuous operation decision unit 31 decides whether the trajectory of the drag operation of the menu item is a trajectory that indicates a scratch gesture after the start of the drag. If the continuous operation decision unit 31 decides that a scratch gesture is detected (Yes at Step S 45 ), the menu item display unit 26 prevents the menu item targeted for the drag from being displayed (Step S 46 ). Then, the menu operation process is ended.
  • the target selection operation decision unit 25 decides whether an operation to end the drag of the menu item is detected (Step S 47 ). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S 47 ), the target selection operation decision unit 25 repeats the decision process until the subject operation is detected.
  • the menu item display unit 26 prevents the subject menu item from being displayed (Step S 48 ).
  • the command calculation unit 27 calculates the command indicated by the menu item (Step S 49 ). Then, the menu operation process is ended.
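The flow of FIG. 10 after the drag of a menu item starts (Steps S 45 to S 49) can be condensed into the following sketch; the (kind, payload) event representation and the function names are assumptions, not taken from the patent:

```python
def run_drag(events, is_scratch):
    """Condensed sketch of FIG. 10 after a drag starts: a trajectory that
    indicates a scratch gesture cancels the selection and hides the item
    (Steps S45-S46); otherwise the drag end executes the command at the
    end position (Steps S47-S49)."""
    trajectory = []
    for kind, payload in events:
        if kind == "move":
            trajectory.append(payload)
            if is_scratch(trajectory):
                return ("cancelled", None)   # Steps S45-S46
        elif kind == "end":
            return ("execute", payload)      # Steps S47-S49
    return ("pending", None)                 # drag still in progress
```

For example, a drag that ends at (5, 5) without a scratch yields ("execute", (5, 5)), while a trajectory classified as a scratch yields ("cancelled", None).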
  • the information processing apparatus 1 decides, on the basis of the movement trajectory of the menu item selected in the context menu, whether to continue to select the menu item.
  • the information processing apparatus 1 can decide whether a move operation moves the menu item, continues the operation of the menu item, or stops the operation of the menu item.
  • the information processing apparatus 1 decides whether the movement trajectory of the menu item is the trajectory that returns to the vicinity of the starting position of the movement after the movement is started. If the information processing apparatus 1 decides the movement trajectory of the menu item is the trajectory that returns to the vicinity of the starting position of the movement after the movement is started, the information processing apparatus 1 stops selecting the menu item. With this configuration, by using the movement trajectory of the menu item in the context menu, the information processing apparatus 1 can decide whether the move operation moves the menu item or stops the operation of the menu item.
  • the information processing apparatus 1 decides whether the movement trajectory of the menu item is the trajectory that indicates the scratch gesture after the movement is started. If the information processing apparatus 1 decides that the movement trajectory of the menu item is the trajectory that indicates a scratch gesture after the movement is started, the information processing apparatus 1 stops selecting the menu item. With this configuration, by using the movement trajectory of the menu item in the context menu, the information processing apparatus 1 can easily decide that the movement operation is a stop of the operation of the menu item.
  • the information processing apparatus 1 displays the content indicated by the menu item at the end position that is the result of the movement due to the drag operation of the selected menu item.
  • the information processing apparatus 1 cancels the selection of the menu item on the basis of the drag trajectory of the selected menu item.
  • the information processing apparatus 1 is not limited to these but may also further continuously operate the menu item on the basis of the drag trajectory of the selected menu item.
  • FIG. 11 is a functional block diagram illustrating the configuration of an information processing apparatus according to a third embodiment.
  • the components having the same configuration as those in the information processing apparatus 1 illustrated in FIG. 6 are assigned the same reference numerals; therefore, overlapping descriptions of the configuration and the operation thereof will be omitted.
  • the third embodiment is different from the second embodiment in that the menu item selection operation decision unit 24 is changed to a menu item selection operation decision unit 24 A.
  • the third embodiment is different from the second embodiment in that the continuous operation decision unit 31 is changed to a continuous operation decision unit 31 A.
  • the menu item selection operation decision unit 24 A detects the first operation that is used to select a menu item in the context menu.
  • the first operation is, for example, an operation to start a drag. For example, if the same menu item in the context menu is touched several times, the menu item selection operation decision unit 24 A decides that the first operation has been performed.
  • the touch mentioned here means a “touchdown”.
  • the continuous operation decision unit 31 A decides, on the basis of a movement trajectory of the menu item, whether to continue to select the menu item.
  • the continuous operation decision unit 31 A decides whether an amount of movement of the movement trajectory of the menu item indicated by a second touch and the subsequent touches is zero. If the amount of movement of the movement trajectory of the menu item indicated by the second touch and the subsequent touches is zero, the continuous operation decision unit 31 A decides to continue to select the menu item. For example, it is assumed that the first operation has been performed on a certain menu item indicated by the first touch and the second touch.
  • the continuous operation decision unit 31 A decides that the operation to select a menu item is continued.
  • the continuous operation decision unit 31 A decides whether a predetermined time has elapsed for a time period from the end of the first movement trajectory of the menu item to the start of the second movement trajectory of the subject menu item. If the predetermined time has not elapsed for a time period from the end of the first movement trajectory of the menu item to the start of the second movement trajectory of the subject menu item, the continuous operation decision unit 31 A decides that the operation to select a menu item is continued.
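The two continuation rules of the continuous operation decision unit 31 A described above, zero movement of the later touch and a new drag started within a predetermined time, can be sketched as a single decision; the parameter names and the one-second limit are illustrative assumptions:

```python
def continues_selection(movement_amount, elapsed_since_last_drag,
                        time_limit=1.0):
    """Sketch of unit 31A's continuation rules: selection of the menu
    item continues when the movement amount of the second touch's
    trajectory is zero, or when the next drag of the same item starts
    before the predetermined time has elapsed since the previous drag
    ended.  The 1.0-second limit is an assumed value."""
    return movement_amount == 0 or elapsed_since_last_drag < time_limit
```

A stationary second touch (movement amount 0) continues the selection regardless of elapsed time; a moving touch continues it only if it starts soon enough after the previous drag.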
  • FIG. 12 is a schematic diagram illustrating an example of a menu operation process according to a third embodiment.
  • the pointing operation acquisition unit 22 sets a canvas of the screen to the target for the pointing. Then, when the pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays a context menu near the pointing position.
  • the menu items of the “addition of a label” and the “addition of an image” are included.
  • the menu item selection operation decision unit 24 A decides that the first operation has been performed.
  • the first operation is an operation to start a drag.
  • the target selection operation decision unit 25 detects the second operation that is used to select the position at which the command indicated by the menu item of the “addition of a label” is executed.
  • the second operation is an operation to end the drag.
  • the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag c 4 has been ended.
  • the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag c 4 has been ended. Consequently, as indicated by a reference numeral c 5 , the label is added to the position in which the drag c 4 has been ended.
  • the continuous operation decision unit 31 A decides whether an amount of movement of the movement trajectory of the menu item of the “addition of a label” due to the second touch c 3 is zero. If an amount of movement of the movement trajectory of the menu item of the “addition of a label” due to the second touch c 3 is zero, the continuous operation decision unit 31 A decides that the operation to select the menu item is continued. Here, because the touch position of the second touch c 3 does not move, the continuous operation decision unit 31 A decides that the operation to select the menu item of the “addition of a label” is continued.
  • the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”. If the drag c 6 of the menu item of the “addition of a label” has been ended, the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag c 6 has been ended and then the label is added.
  • the information processing apparatus 1 can continuously operate the menu item of the context menu.
  • FIG. 13 is a flowchart illustrating an example of the flow of the menu operation process according to the third embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to the screen is detected (Step S 51 ). If the pointing operation acquisition unit 22 decides that a pointing to the screen is not detected (No at Step S 51 ), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects a pointing to the screen.
  • the context menu display unit 23 displays the context menu near the detected position (Step S 52 ).
  • the menu item selection operation decision unit 24 A decides whether two or more touches on the menu item are detected (Step S 53 ). If the menu item selection operation decision unit 24 A decides that two or more touches on the menu item are not detected (No at Step S 53 ), the menu item selection operation decision unit 24 A repeats the decision process until the subject operation is detected.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 54 ). Then, the menu item display unit 26 duplicates the detected touched menu items to the respective touch positions and displays each of the duplicated menu items (Step S 55 ).
  • the menu item display unit 26 decides whether one or more movements of the touch positions are detected (Step S 56 ). If the menu item display unit 26 decides that no movement of the touch position is detected (No at Step S 56 ), the menu item display unit 26 proceeds to Step S 58 .
  • At Step S 56 , if the menu item display unit 26 decides that one or more movements of the touch positions are detected (Yes at Step S 56 ), the menu item display unit 26 changes the display position of the menu item to each of the touch positions (Step S 57 ). Then, the menu item display unit 26 proceeds to Step S 58 .
  • Then, the target selection operation decision unit 25 decides whether release of one or more touches is detected (Step S 58 ). If the target selection operation decision unit 25 decides that no release of the touches is detected (No at Step S 58 ), the menu item display unit 26 proceeds to Step S 56 in order to detect a movement of the touch position.
  • the menu item display unit 26 prevents the menu item from which the touch has been released from being displayed (Step S 59 ). Then, on the basis of the position in which the touch has been released, the command calculation unit 27 executes the command that is indicated by the menu item from which the touch has been released (Step S 60 ).
  • the continuous operation decision unit 31 A decides whether there is a touch in which no movement of the touch position is detected (Step S 61 ). For example, the continuous operation decision unit 31 A decides whether an amount of movement of the movement trajectory of the menu item due to the touch in which no movement of the touch position is detected is zero. If the continuous operation decision unit 31 A decides that there is a touch in which no movement of the touch position is detected (Yes at Step S 61 ), the continuous operation decision unit 31 A proceeds to Step S 56 in order to detect a movement of a touch position. This is because the continuous operation decision unit 31 A decides that the operation is a continuous operation of the menu item.
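The multi-touch flow of FIG. 13 (Steps S 55 to S 61) can be sketched as a small session object; the class and method names below are assumptions, not taken from the patent:

```python
class MenuDragSession:
    """Illustrative sketch of FIG. 13: two or more touches on a menu item
    duplicate it to each touch position (Step S55); moving a touch moves
    its copy (Step S57); releasing a touch hides the copy and executes
    the command at the release position (Steps S59-S60); any remaining
    touch keeps the selection alive (Step S61)."""

    def __init__(self, menu_item, touch_positions):
        self.menu_item = menu_item
        # One duplicated menu item per touch id.
        self.items = dict(touch_positions)
        self.executed = []  # (command, position) pairs

    def move(self, touch_id, position):
        self.items[touch_id] = position  # the copy follows the touch

    def release(self, touch_id):
        position = self.items.pop(touch_id)
        self.executed.append((self.menu_item, position))

    def selection_continues(self):
        return bool(self.items)
```

With two touches on the “addition of a label” item, dragging and releasing one touch adds a label at its release position while the stationary touch keeps the selection alive for a further operation.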
  • FIG. 14 is a schematic diagram illustrating another example of the menu operation process according to the third embodiment.
  • the pointing operation acquisition unit 22 sets the canvas of the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays the context menu near the pointing position.
  • the menu items of the “addition of a label” and the “addition of an image” are included.
  • the menu item selection operation decision unit 24 A detects the first operation that is used to select the menu item of the “addition of a label”.
  • the first operation is an operation to start a drag.
  • the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
  • the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is performed.
  • the second operation is an operation to end the drag.
  • the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag d 2 has been ended.
  • the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag d 2 has been ended. Consequently, as indicated by a reference numeral d 3 , the label is added to the position in which the drag d 2 has been ended.
  • the continuous operation decision unit 31 A decides whether a predetermined time has elapsed for a time period from the end of the movement trajectory of the menu item of the “addition of a label” to the start of the subsequent movement trajectory of the menu item of the same “addition of a label”.
  • the continuous operation decision unit 31 A decides that a predetermined time has not elapsed for a time period from the end of the movement trajectory of the menu item of the “addition of a label” to the start of the subsequent movement trajectory of the menu item of the same “addition of a label”.
  • the menu item selection operation decision unit 24 A detects the first operation that is used to select the menu item of the “addition of a label”. Then, the menu item display unit 26 changes, on the basis of the drag operation d 4 , the display position of the menu item of the “addition of a label”.
  • the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is executed.
  • the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag d 4 has been ended. At this point, the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag d 4 has been ended. Consequently, as indicated by the reference numeral d 5 , the label is added at the position in which the drag d 4 has been ended.
  • the continuous operation decision unit 31 A prevents the menu item of the “addition of a label” from being displayed.
  • the information processing apparatus 1 can continuously operate the menu item included in the context menu.
  • FIG. 15 is a flowchart illustrating another example of the flow of the menu operation process according to the third embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to the screen is detected (Step S 71 ). If the pointing operation acquisition unit 22 decides that a pointing to the screen is not detected (No at Step S 71 ), the pointing operation acquisition unit 22 repeats the decision process until a pointing to the screen is detected.
  • the context menu display unit 23 displays the context menu near the detected position (Step S 72 ).
  • the menu item selection operation decision unit 24 A decides whether an operation to start a drag of the menu item is detected (Step S 73 ). If the menu item selection operation decision unit 24 A decides that an operation to start a drag of the menu item is not detected (No at Step S 73 ), the menu item selection operation decision unit 24 A repeats the decision process until the subject operation is detected.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 74 ). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
  • the target selection operation decision unit 25 decides whether an operation to end the drag of the menu item is detected (Step S 75 ). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S 75 ), the target selection operation decision unit 25 repeats the decision process until the subject operation is detected.
  • If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is detected (Yes at Step S 75 ), the command calculation unit 27 executes, on the basis of the position in which the drag has been ended, the command indicated by the menu item (Step S 76 ).
  • the menu item selection operation decision unit 24 A decides whether an operation to start a drag of the menu item is detected (Step S 77 ). If the menu item selection operation decision unit 24 A decides that an operation to start a drag of the menu item is detected (Yes at Step S 77 ), the target selection operation decision unit 25 proceeds to Step S 75 in order to detect the end of the drag of the menu item.
  • If the menu item selection operation decision unit 24 A decides that an operation to start a drag of the menu item is not detected (No at Step S 77 ), the continuous operation decision unit 31 A decides whether a predetermined time has elapsed from the end of the drag performed last time (Step S 78 ). If the continuous operation decision unit 31 A decides that a predetermined time has not elapsed from the end of the drag performed last time (No at Step S 78 ), the menu item selection operation decision unit 24 A proceeds to Step S 77 in order to detect the start of the drag of the menu item.
  • the menu item display unit 26 prevents the menu item targeted for the drag from being displayed (Step S 79 ). Then, the menu operation process is ended.
  • the information processing apparatus 1 selects a menu item due to the first touch and the second touch.
  • the information processing apparatus 1 allows the menu item to be moved due to the first touch and executes, at the end position of the movement, the command indicated by the menu item.
  • the information processing apparatus 1 decides whether an amount of movement of the movement trajectory of the menu item due to the second touch is zero. If the information processing apparatus 1 decides that an amount of movement due to the second touch is zero, the information processing apparatus 1 continues to select the menu item. With this configuration, the information processing apparatus 1 can allow the menu item in the context menu to be continuously operated.
  • the information processing apparatus 1 decides whether a predetermined time has elapsed for a time period from the end of a first movement trajectory of the menu item to the start of a second movement trajectory of the subject menu item. If the information processing apparatus 1 decides that a predetermined time has not elapsed, the information processing apparatus 1 continues to select the subject menu item. With this configuration, the information processing apparatus 1 can allow the menu item in the context menu to be continuously operated.
  • the information processing apparatus 1 displays the content indicated by the menu item at the end position that is the result of a movement of the selected menu item due to the drag operation.
  • the information processing apparatus 1 cancels a selection of the menu item on the basis of the drag trajectory of the selected menu item.
  • the information processing apparatus 1 further continuously operates the menu item on the basis of the drag trajectory of the selected menu item.
  • the information processing apparatus 1 is not limited to this and may also further change the display of the menu item in accordance with the target of the dragged menu item.
  • the target mentioned here is an object, a canvas, or the like.
  • For example, if a menu item is a “deletion of a label”, the object targeted for the deletion is a label. Namely, if the menu item is not dragged to the position of the label targeted for the deletion, the command indicated by the “deletion of a label” is not able to be executed.
  • a description will be given of a case in which the information processing apparatus 1 further changes the display of the menu item in accordance with the target of a dragged menu item.
  • the fourth embodiment differs from the third embodiment in that an operation related to the fourth embodiment is added to the command calculation unit 27 .
  • the command calculation unit 27 calculates a command to be executed.
  • the target mentioned here indicates, for example, the position in which a command is executed or indicates, for example, an object or a canvas located at the subject position.
  • the command calculation unit 27 decides whether the command indicated by the menu item can be executed at the position to which the menu item has been dragged.
  • the command calculation unit 27 may also decide whether a command can be executed for each dragged position or can be executed at the position selected by the second operation.
  • FIG. 16 is a schematic diagram illustrating an example of a menu operation process according to a fourth embodiment.
  • five objects are displayed. Here, it is assumed that these objects are labels.
  • the pointing operation acquisition unit 22 sets the canvas of the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22 , the context menu display unit 23 displays the context menu near the pointing position.
  • the menu items of the “addition of a label” and the “deletion of a label” are included.
  • the menu item selection operation decision unit 24 A detects the first operation that is used to select the menu item of the “deletion of a label”.
  • the first operation is an operation to start a drag.
  • the menu item display unit 26 changes the display position of the menu item of the “deletion of a label” on the basis of the drag operation.
  • the command calculation unit 27 decides whether the command indicated by the menu item of the “deletion of a label” can be executed at the dragged position.
  • Because a position e 3 in which the drag has been ended is the position of the label, the command calculation unit 27 decides that the command indicated by the menu item of the “deletion of a label” can be executed at the dragged position.
  • the command calculation unit 27 calculates the command indicated by the menu item of the “deletion of a label” at the position e 3 in which the drag e 2 has been ended.
  • the command calculation unit 27 calculates the command to perform the “deletion of a label” at the position e 3 in which the drag e 2 has been ended. Consequently, as indicated by the reference numeral e 4 , the label is deleted.
  • the menu item selection operation decision unit 24 A detects the first operation that is used to select the menu item of the “deletion of a label”.
  • the first operation is an operation to start the drag.
  • the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “deletion of a label”.
  • the command calculation unit 27 decides whether the command indicated by the menu item of the “deletion of a label” can be executed at the dragged position.
  • the command calculation unit 27 decides that the command indicated by the menu item of the “deletion of a label” is not able to be executed at the dragged position. This is because the command calculation unit 27 is not able to specify the label in which the command is executed.
  • the menu item display unit 26 changes the display of the menu item in order to indicate that the command is not able to be performed.
  • For example, the menu item display unit 26 changes the color of the subject menu item to a color different from the normal color; if the normal color is black, the menu item display unit 26 changes the color of the subject menu item to red.
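The executability decision of FIG. 16 and the color change above can be sketched together; the mapping from commands to required drop targets and the concrete colors are illustrative assumptions:

```python
def menu_item_display_state(menu_item, target_at_position):
    """Sketch of the FIG. 16 behaviour: the 'deletion of a label' command
    can only be executed when the dragged position lies on a label, so
    the menu item's display is switched to an abnormal color otherwise.
    The command-to-target mapping and colors are assumed values."""
    required_target = {"deletion of a label": "label"}
    needed = required_target.get(menu_item)
    executable = needed is None or target_at_position == needed
    return "black" if executable else "red"  # normal vs. abnormal display
```

Dragging “deletion of a label” over a label keeps the normal display, while dragging it over the bare canvas switches the item to the abnormal (red) display.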
  • the information processing apparatus 1 can improve the operation efficiency with respect to a menu item.
  • FIG. 17 is a flowchart illustrating an example of the flow of the menu operation process according to the fourth embodiment.
  • the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S 81 ). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S 81 ), the pointing operation acquisition unit 22 repeats the decision process until a pointing to the screen is detected.
  • the context menu display unit 23 displays the context menu near the detected position (Step S 82 ).
  • the menu item selection operation decision unit 24 A decides whether an operation to start a drag of the menu item is detected (Step S 83 ). If the menu item selection operation decision unit 24 A decides that an operation to start a drag of the menu item is not detected (No at Step S 83 ), the menu item selection operation decision unit 24 A repeats the decision process until the subject operation is detected.
  • the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S 84 ). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
  • the command calculation unit 27 decides whether the command indicated by the menu item can be executed at the dragged position (Step S 85 ). If the command calculation unit 27 decides that the command indicated by the menu item is not able to be executed at the dragged position (No at Step S 85 ), the menu item display unit 26 sets the display of the subject menu item to an abnormal display (Step S 86 ). Namely, because the command calculation unit 27 is not able to perform the command indicated by the menu item at the dragged position, the menu item display unit 26 changes the display of the subject menu item to the display indicating that the command is not able to be executed. Then, the menu item display unit 26 proceeds to Step S 88 .
  • At Step S 85 , if the command calculation unit 27 decides that the command indicated by the menu item can be executed at the dragged position (Yes at Step S 85 ), the menu item display unit 26 sets the subject menu item to a normal display (Step S 87 ). Then, the menu item display unit 26 proceeds to Step S 88 .
  • the target selection operation decision unit 25 decides whether an operation to end the drag of the menu item is detected (Step S 88 ). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S 88 ), the command calculation unit 27 proceeds to Step S 85 in order to perform execution decision of the command at the dragged position.
  • the menu item display unit 26 prevents the subject menu item from being displayed (Step S 89 ).
  • the command calculation unit 27 calculates the command indicated by the menu item at the position in which the drag has been ended (Step S 90 ). Then, the menu operation process is ended.
  • the information processing apparatus 1 displays a context menu and then displays, at the end position that is the result of a movement due to the drag operation of the selected menu item, the content indicated by the menu item.
  • the context menu may also include both menu items available to be dragged and menu items unavailable to be dragged; in this case, a drag of the menu items unavailable to be dragged is set invalid.
  • a menu item unavailable to be dragged is a menu item for which changing the position is meaningless, for example, a menu item whose command is executed over the entire screen.
  • FIG. 18 is a schematic diagram illustrating an example of a display when the context menu includes therein both the menu items available to be dragged and the menu items unavailable to be dragged.
  • the menu items available to be dragged and the menu items unavailable to be dragged are included.
  • For the menu items of the “addition of a label” and the “addition of an image”, the mark indicating that a drag can be performed is displayed.
  • For the menu items of a “change in the background color” and a “zoom of the screen”, the mark indicating that a drag can be performed is not displayed. Namely, because these menu items change the background color or the zoom of the entire screen, the menu item may be executed at any position.
  • the target selection operation decision unit 25 performs, on the menu item unavailable to be dragged, a process of setting the drag invalid.
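A hypothetical way to implement this is to treat screen-wide commands as non-draggable, so that no drag mark is shown for them and a drag on them is set invalid; the set of commands below mirrors FIG. 18, but the rule itself is an assumption:

```python
# Assumed rule: commands that apply to the entire screen need no position,
# so their menu items are unavailable to be dragged (cf. FIG. 18).
POSITION_FREE_COMMANDS = {"change in the background color",
                          "zoom of the screen"}

def is_draggable(menu_item):
    """A menu item is draggable unless its command is position-free."""
    return menu_item not in POSITION_FREE_COMMANDS

def accept_drag_start(menu_item):
    """Sketch of unit 25's handling: drags on menu items unavailable to
    be dragged are set invalid (the drag start is ignored)."""
    return menu_item if is_draggable(menu_item) else None
```

With this rule, “addition of a label” shows the drag mark and accepts drags, while “zoom of the screen” shows no mark and rejects the drag start.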
  • FIG. 19 is a schematic diagram illustrating another example of a display when the context menu includes therein both the menu items available to be dragged and the menu items unavailable to be dragged.
  • the menu items available to be dragged and the menu items unavailable to be dragged are included.
  • the “addition of a label” and the “addition of an image” are displayed.
  • the “change in the background color” and the “zoom of the screen” are displayed.
  • the menu item display unit 26 displays the menu item at the drag position in accordance with the drag.
  • If the continuous operation decision unit 31 A decides not to select the menu item, the menu item display unit 26 returns the menu item to the position of the menu item that is located before the menu item is dragged.
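The snap-back behaviour of FIG. 19 can be sketched as follows; the class and method names are assumptions:

```python
class SnapBackItem:
    """Sketch of FIG. 19: while a menu item is dragged it follows the
    drag position, but when the decision is made not to select it, it
    returns to its position before the drag."""

    def __init__(self, position):
        self.home = position      # position before the drag
        self.position = position  # current display position

    def drag_to(self, position):
        self.position = position  # follow the drag

    def drop(self, selected):
        if not selected:
            self.position = self.home  # snap back when not selected
```

Dragging the item away and dropping it without selection restores its pre-drag position.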
  • the information processing apparatus 1 can improve the operational efficiency with respect to the menu item.
  • the information processing apparatus 1 can be implemented by mounting each of the functions described above, such as the display device 10, the control unit 20, and the like, on an information processing apparatus, such as a known personal computer, a workstation, or the like.
  • the components of each device illustrated in the drawings are not always physically configured as illustrated in the drawings.
  • the specific shape of a separate or integrated device is not limited to the drawings; all or part of the device can be configured by functionally or physically separating or integrating any of the units depending on various loads or use conditions.
  • the operation acquisition unit 21 and the pointing operation acquisition unit 22 may also be integrated as a single unit.
  • the continuous operation decision unit 31 may also be separated into a decision unit that detects a stop operation and a decision unit that detects a continuous operation.
  • FIG. 20 is a schematic diagram illustrating an example of a computer that executes an information processing program.
  • a computer 200 includes a CPU 203 that executes various kinds of arithmetic processing, an input device 215 that accepts an input of data from a user, and a display control unit 207 that controls a display device 209. Furthermore, the computer 200 includes a drive device 213 that reads a program or the like from a storage medium and a communication control unit 217 that gives and receives data to and from another computer via a network. Furthermore, the computer 200 includes a memory 201 that temporarily stores therein various kinds of information and an HDD 205. Then, the memory 201, the CPU 203, the HDD 205, the display control unit 207, the drive device 213, the input device 215, and the communication control unit 217 are connected by a bus 219.
  • the drive device 213 is a device used for, for example, a removable disk 211.
  • the CPU 203 reads an information processing program 205a, loads the program in the memory 201, and executes the program as a process.
  • the process is associated with each of the functioning units included in the information processing apparatus 1 .
  • Information processing related information 205b is associated with the information stored in a storage unit that is not illustrated. Then, for example, the removable disk 211 stores therein each of the pieces of information, such as the information processing program 205a.
  • the information processing program 205a is not always stored in the HDD 205 from the beginning.
  • the program is stored in a “portable physical medium”, such as a flexible disk (FD), a CD-ROM, a DVD, a magneto-optical disk, an IC card, or the like, that is to be inserted into the computer 200.
  • the computer 200 may also read and execute the information processing program 205a from the portable physical medium.
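The distinction described above between menu items available to be dragged and menu items unavailable to be dragged (whose commands act on the entire screen) can be sketched as follows. This is a minimal Python sketch; the item list, function names, and the use of `None` for an ignored drag are illustrative assumptions, not details from the patent.

```python
# Menu items whose command acts on the entire screen are marked non-draggable,
# mirroring the examples of FIGS. 18 and 19.
ITEMS = [
    ("addition of a label", True),
    ("addition of an image", True),
    ("change in the background color", False),  # whole-screen command
    ("zoom of the screen", False),              # whole-screen command
]


def drag_mark_shown(name, items=ITEMS):
    """The drag mark is displayed only for items available to be dragged."""
    return dict(items).get(name, False)


def on_drag_start(name, items=ITEMS):
    """Set the drag invalid for menu items unavailable to be dragged."""
    if not dict(items).get(name, False):
        return None  # drag ignored; the item's position is irrelevant
    return name      # drag proceeds with this item
```

Under this sketch, starting a drag on the “change in the background color” item simply has no effect, which corresponds to the target selection operation decision unit 25 setting the drag invalid.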

Abstract

An information processing apparatus includes a menu item selection operation decision unit that selects a menu item in a context menu; a target selection operation decision unit that selects, regarding the selected menu item, an end position that is the result of a movement due to a predetermined operation; and a command calculation unit that calculates, on the basis of the selected end position, a command indicated by the menu item.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2016-040482, filed on Mar. 2, 2016, the entire contents of which are incorporated herein by reference.
FIELD
The embodiments discussed herein are related to an information processing apparatus, or the like.
BACKGROUND
There is a known technology that displays, in an overlapped manner on a screen when a pointing is performed on the screen by an operation tool, such as a mouse, a touch, or the like, an operation menu related to the pointed target near the pointed position. When a user selects a single menu item in the displayed operation menu, the command related to the pointed target is determined and executed on the basis of the selected menu item. This type of operation menu is typically referred to as a “context menu”. The context menu is widely used as a user interface of an application on a typical personal computer (PC) or of an application operated by a touch on a tablet, a smartphone, or the like.
Patent Document 1: Japanese Laid-open Patent Publication No. 2001-184458
Patent Document 2: Japanese Laid-open Patent Publication No. 08-180138
Patent Document 3: Japanese Laid-open Patent Publication No. 05-061596
However, there is a problem in that, if objects displayed on a screen are densely packed, an operation to display a context menu is not able to be efficiently performed. Furthermore, from another point of view, there is a problem in that, if objects displayed on a screen are densely packed, even if a context menu is displayed, the visibility of the objects that have already been displayed is decreased.
For example, if objects displayed on a screen are densely packed, the area of the screen with no object is small, and it is difficult for a user to perform a pointing while avoiding the objects. Consequently, a pointing operation to display a context menu is not able to be efficiently performed.
Furthermore, even if a context menu is displayed, because the context menu is displayed on the screen in an overlapped manner, the context menu may possibly hide the already displayed objects. Thus, even if a context menu is displayed, there may be a case of decreasing the visibility of the already displayed objects.
SUMMARY
According to an aspect of an embodiment, an information processing apparatus includes a processor. The processor executes a process. The process includes first selecting a menu item in a context menu. The process includes second selecting, regarding the menu item selected at the first selecting, an end position that is a result of a movement due to a predetermined operation. The process includes displaying, at the end position selected at the second selecting, a content indicated by the menu item.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a functional block diagram illustrating the configuration of an information processing apparatus according to a first embodiment;
FIG. 2 is a schematic diagram illustrating an example of a menu operation process according to the first embodiment;
FIG. 3 is a flowchart illustrating an example of the flow of the menu operation process according to the first embodiment;
FIG. 4 is a schematic diagram illustrating another example of the menu operation process according to the first embodiment;
FIG. 5 is a flowchart illustrating another example of the flow of the menu operation process according to the first embodiment;
FIG. 6 is a functional block diagram illustrating the configuration of an information processing apparatus according to a second embodiment;
FIG. 7 is a schematic diagram illustrating an example of a menu operation process according to the second embodiment;
FIG. 8 is a flowchart illustrating an example of the flow of the menu operation process according to the second embodiment;
FIG. 9 is a schematic diagram illustrating another example of the menu operation process according to the second embodiment;
FIG. 10 is a flowchart illustrating another example of the flow of the menu operation process according to the second embodiment;
FIG. 11 is a functional block diagram illustrating the configuration of an information processing apparatus according to a third embodiment;
FIG. 12 is a schematic diagram illustrating an example of a menu operation process according to a third embodiment;
FIG. 13 is a flowchart illustrating an example of the flow of the menu operation process according to the third embodiment;
FIG. 14 is a schematic diagram illustrating another example of the menu operation process according to the third embodiment;
FIG. 15 is a flowchart illustrating another example of the flow of the menu operation process according to the third embodiment;
FIG. 16 is a schematic diagram illustrating an example of a menu operation process according to a fourth embodiment;
FIG. 17 is a flowchart illustrating an example of the flow of the menu operation process according to the fourth embodiment;
FIG. 18 is a schematic diagram illustrating an example of a display when a context menu includes therein both menu items available to be dragged and menu items unavailable to be dragged;
FIG. 19 is a schematic diagram illustrating another example of a display when a context menu includes therein both menu items available to be dragged and menu items unavailable to be dragged; and
FIG. 20 is a schematic diagram illustrating an example of a computer that executes an information processing program.
DESCRIPTION OF EMBODIMENTS
Preferred embodiments of the present invention will be explained with reference to accompanying drawings. The present invention is not limited to the embodiments.
[a] First Embodiment
Configuration of an Information Processing Apparatus According to a First Embodiment
FIG. 1 is a functional block diagram illustrating the configuration of an information processing apparatus according to a first embodiment. An information processing apparatus 1 according to the first embodiment simultaneously selects, after displaying a context menu, by detecting a predetermined operation of the menu item selected in the context menu, the command indicated by the menu item and the position at which the subject command is executed and then executes the command at the selected position. The predetermined operation mentioned here is, for example, a drag operation or a pointing operation. Hereinafter, a description will be given by using the predetermined operation as the drag operation.
As illustrated in FIG. 1, the information processing apparatus 1 includes a display device 10 and a control unit 20. The display device 10 may be, for example, a device with a display screen, such as a monitor of a personal computer (PC), a monitor of a television, a projector, a head mounted display (HMD), a smartphone, a tablet, or the like.
The control unit 20 includes an internal memory that stores therein control data and programs in which various kinds of procedures are prescribed, whereby the control unit 20 executes various kinds of processes. Furthermore, the control unit 20 corresponds to, for example, an electronic circuit in an integrated circuit, such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or the like. Alternatively, the control unit 20 corresponds to an electronic circuit, such as a central processing unit (CPU), a micro processing unit (MPU), or the like. Furthermore, the control unit 20 includes an operation acquisition unit 21, a pointing operation acquisition unit 22, a context menu display unit 23, a menu item selection operation decision unit 24, a target selection operation decision unit 25, a menu item display unit 26, and a command calculation unit 27.
The operation acquisition unit 21 acquires an operation performed on a user's screen. The user performs an operation to the screen using a mouse or a touch panel. The operation mentioned here includes, for example, a click of a mouse, a movement of a mouse pointer, a touchdown, a touch-up, and a movement of a touch position.
The pointing operation acquisition unit 22 acquires, from the operation acquired by the operation acquisition unit 21, a pointing operation to the screen. For example, if the operation acquired by the operation acquisition unit 21 is a click, the pointing operation acquisition unit 22 acquires the position in which the click operation is performed and then acquires the operation as the pointing operation to the acquired position. Furthermore, if the operation acquired by the operation acquisition unit 21 is a touch-up, the pointing operation acquisition unit 22 first acquires the position in which the touchdown is performed and then acquires the operation as the pointing operation to the acquired position. Then, when the pointing operation acquisition unit 22 acquires the pointing operation, if an object is displayed at the pointing position, the pointing operation acquisition unit 22 sets the displayed object as the target for the pointing. If no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets a canvas of the screen as the target for the pointing.
If a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays a context menu, in the vicinity of the pointing position, that is used to select an operation related to the target for the pointing. In the context menu, for example, menu items are vertically aligned in a straight line; however, the menu items may also be horizontally aligned in a straight line or arranged in a circle. Furthermore, on the basis of the target for the pointing, the context menu display unit 23 decides the menu items included in the context menu. For example, if the target for the pointing is a canvas, the context menu display unit 23 displays the context menu including the menu items of an “addition of a label” and a “change in a color”. If the target for the pointing is a label, the context menu display unit 23 displays the context menu including the menu items of a “deletion of a label” and a “change in a color”.
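The two decisions above — choosing the target for the pointing (a displayed object or the canvas) and choosing the menu items from that target — might be sketched as follows. This is hypothetical Python; the bounding-box hit test and the concrete item lists are illustrative assumptions, not an implementation from the patent.

```python
def hit_test(objects, point):
    """Return the id of the object displayed at `point`, or None for the canvas.

    `objects` maps object ids to bounding boxes (x, y, width, height)."""
    for obj_id, (x, y, w, h) in objects.items():
        if x <= point[0] <= x + w and y <= point[1] <= y + h:
            return obj_id
    return None  # no object at the pointing position -> target is the canvas


def menu_items_for(target):
    """Decide the context-menu items on the basis of the target for the pointing."""
    if target is None:  # canvas
        return ["addition of a label", "change in a color"]
    return ["deletion of a label", "change in a color"]  # e.g. a label object
```

For example, a pointing at an empty region yields the canvas items, while a pointing inside a label's bounding box yields the label items.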
The menu item selection operation decision unit 24 detects a first operation that is used to select a menu item in the context menu. The first operation is, for example, an operation to select a menu item and to start a drag.
The target selection operation decision unit 25 detects a second operation that is used to select the target in which the command indicated by the menu item is executed. The target mentioned here is, for example, the position in which the command is executed. The second operation is, for example, an operation to end the drag.
The menu item display unit 26 displays, in accordance with the drag operation, the menu item selected by the first operation. Namely, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item selected by the first operation.
On the basis of the menu item selected by the first operation, the target selected by the second operation, and the second operation itself, the command calculation unit 27 calculates the command to be executed. For example, it is assumed that the menu item of the “addition of a label”, which is included in the context menu displayed when a canvas on a screen is targeted for the pointing, is selected by an operation to start a drag. Then, if the drag of the selected menu item of the “addition of a label” is started and the drag is ended at an arbitrary position, the command calculation unit 27 calculates a command to add the label at the position in which the drag has been ended. Furthermore, if the second operation is not performed, the command calculation unit 27 may calculate a command to add a label at the pointing position of the pointing that was performed in order to display the context menu acquired by the pointing operation acquisition unit 22.
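The fallback behavior of the command calculation unit 27 — using the drag end position when the second operation was performed and otherwise the original pointing position — can be sketched as follows. The function name and the dictionary-shaped command are illustrative assumptions, not from the patent.

```python
def calculate_command(menu_item, drag_end=None, pointing_position=None):
    """Combine the selected menu item with the selected target position.

    If the second operation (end of the drag) was not performed, fall back
    to the position of the pointing that displayed the context menu."""
    position = drag_end if drag_end is not None else pointing_position
    return {"command": menu_item, "position": position}
```

So `calculate_command("addition of a label", drag_end=(120, 80))` places the label at the drag end, while omitting `drag_end` places it at the original pointing position.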
Example of a Menu Operation Process
An example of a menu operation process according to the first embodiment will be described with reference to FIG. 2. FIG. 2 is a schematic diagram illustrating an example of a menu operation process according to the first embodiment. As illustrated in FIG. 2, it is assumed that the area indicated by a reference numeral a1 is an area in which objects are densely packed.
In this state, it is assumed that pointing is performed at the position that is indicated by a reference numeral a2 and that is easily tapped. Then, because no object is displayed on the pointing position, the pointing operation acquisition unit 22 sets the canvas on the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays a context menu near the pointing position. Here, in the context menu, the menu items of an “addition of a label” and an “addition of an image” are included.
Then, it is assumed that the menu item of the “addition of a label” is selected and a drag a3 is started. Then, the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”. Here, the first operation is an operation to start a drag. The menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
Then, it is assumed that the drag a3 of the menu item of the “addition of a label” has been ended. Then, the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is executed. Here, the second operation is an operation to end the drag.
Then, the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag a3 has been ended. Here, the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag a3 has been ended. Consequently, as indicated by a reference numeral a4, the label is added at the position in which the drag a3 has been ended.
Thus, after the context menu is displayed, the information processing apparatus 1 can change the position in which the menu item of the “addition of a label” selected from among the menu items included in the context menu is executed. Even if objects displayed on a screen are densely packed, the information processing apparatus 1 can improve the efficiency of the operation to display a context menu. Furthermore, even if a context menu is displayed when the objects displayed on the screen are densely packed, the information processing apparatus 1 can improve the visibility of the objects displayed on the screen.
Example of a Flowchart of the Menu Operation Process
FIG. 3 is a flowchart illustrating an example of the flow of the menu operation process according to the first embodiment.
As illustrated in FIG. 3, the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S11). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S11), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects the pointing to the screen.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to a screen is detected (Yes at Step S11), the context menu display unit 23 displays the context menu near the detected position (Step S12).
Then, the menu item selection operation decision unit 24 decides whether an operation to start a drag of the menu item is detected (Step S13). If the menu item selection operation decision unit 24 decides that an operation to start a drag of the menu item is not detected (No at Step S13), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
In contrast, if the menu item selection operation decision unit 24 decides that an operation to start a drag of the menu item is detected (Yes at Step S13), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S14). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item that is targeted for the drag.
Then, the target selection operation decision unit 25 decides whether an operation to end the drag of the menu item is detected (Step S15). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S15), the target selection operation decision unit 25 repeats the decision process until the target selection operation decision unit 25 detects the subject operation.
In contrast, if the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is detected (Yes at Step S15), the menu item display unit 26 prevents the subject menu item from being displayed (Step S16).
Then, on the basis of the position in which the drag has been ended, the command calculation unit 27 calculates the command indicated by the menu item (Step S17). Then, the menu operation process is ended.
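The flow of FIG. 3 (Steps S11 to S17) can be summarized as a small state machine. This is a hedged Python sketch; the state names and the callback interface are illustrative assumptions rather than the patent's implementation.

```python
class MenuOperation:
    """Minimal state machine following the flow of FIG. 3 (S11-S17).

    States: idle (waiting for a pointing) -> menu_shown -> dragging -> idle."""

    def __init__(self, execute):
        self.state = "idle"
        self.menu_item = None
        self.execute = execute  # callback: (menu_item, position) -> None

    def on_pointing(self, position):        # S11/S12: show the context menu
        if self.state == "idle":
            self.state = "menu_shown"       # menu displayed near `position`

    def on_drag_start(self, menu_item):     # S13/S14: select item, hide the rest
        if self.state == "menu_shown":
            self.menu_item = menu_item
            self.state = "dragging"

    def on_drag_end(self, position):        # S15-S17: calculate and run command
        if self.state == "dragging":
            self.execute(self.menu_item, position)
            self.state = "idle"
```

A drag-end event outside the `dragging` state is ignored, which matches the flowchart's looping decision steps.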
In the first embodiment, a movement of a menu item in a context menu has been described as a drag operation. However, the movement is not limited to this and, instead of the drag operation, a movement of a menu item included in a context menu may also be performed by a pointing operation. Therefore, a description will be given of a menu operation process performed when a movement of a menu item in a context menu is performed by a pointing operation.
Another Example of the Menu Operation Process
Another example of the menu operation process according to the first embodiment will be described with reference to FIG. 4. FIG. 4 is a schematic diagram illustrating another example of the menu operation process according to the first embodiment. In FIG. 4 also, similarly to FIG. 2, it is assumed that the area indicated by the reference numeral a1 is an area in which the objects are densely packed.
In this state, it is assumed that a pointing is performed at the position that is indicated by the reference numeral a2 and that is easily tapped. Then, because no object is displayed on the pointing position, the pointing operation acquisition unit 22 sets the canvas on the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays a context menu near the pointing position. Here, in the context menu, the menu items of the “addition of a label” and the “addition of an image” are included.
Then, it is assumed that, as indicated by a reference numeral a3′, a pointing is performed on the menu item of the “addition of a label”. Then, the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”. Here, the first operation is an operation to perform a pointing on the menu item.
Then, it is assumed that, as indicated by a reference numeral a4′, a pointing is performed on the screen. Then, the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is executed. Here, the second operation is an operation to perform a pointing on the screen.
Then, the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the pointing a4′ has been performed. Therefore, as indicated by a reference numeral a5′, the label is added at the position in which the pointing a4′ has been performed.
Consequently, after having displayed the context menu, the information processing apparatus 1 can change the position in which the menu item of the “addition of a label” selected from among the menu items included in the context menu is executed. Even if objects displayed on a screen are densely packed, the information processing apparatus 1 can improve the efficiency of the operation to display a context menu. Furthermore, even if a context menu is displayed when the objects displayed on the screen are densely packed, the information processing apparatus 1 can improve the visibility of the objects displayed on the screen.
Another Example of a Flowchart of the Menu Operation Process
FIG. 5 is a flowchart illustrating another example of the flow of the menu operation process according to the first embodiment.
As illustrated in FIG. 5, the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S21). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S21), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects the pointing to the screen.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to a screen is detected (Yes at Step S21), the context menu display unit 23 displays the context menu near the detected position (Step S22).
Then, the menu item selection operation decision unit 24 decides whether a pointing to the menu item is detected (Step S23). If the menu item selection operation decision unit 24 decides that a pointing to the menu item is not detected (No at Step S23), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
In contrast, if the menu item selection operation decision unit 24 decides that a pointing to the menu item is detected (Yes at Step S23), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S24).
Then, the target selection operation decision unit 25 decides whether a pointing to a screen is detected (Step S25). If the target selection operation decision unit 25 decides that a pointing to a screen is not detected (No at Step S25), the target selection operation decision unit 25 repeats the decision process until the subject operation is detected.
In contrast, if the target selection operation decision unit 25 decides that a pointing to a screen is detected (Yes at Step S25), the menu item display unit 26 prevents the subject menu item from being displayed (Step S26).
Then, on the basis of the position in which the pointing is detected, the command calculation unit 27 calculates the command indicated by the menu item (Step S27). Then, the menu operation process is ended.
Effect of the First Embodiment
In this way, in the first embodiment described above, the information processing apparatus 1 selects a menu item included in a context menu and selects, regarding the selected menu item, an end position that is the result of a movement due to a predetermined operation. The information processing apparatus 1 displays the content indicated by the menu item at the selected end position. With this configuration, after having displayed the context menu, the information processing apparatus 1 can change the position in which the menu item selected from among the menu items included in the context menu is executed. Furthermore, even if objects displayed on a screen are densely packed, the information processing apparatus 1 can improve the efficiency of the operation to display a context menu. Furthermore, even if a context menu is displayed when the objects displayed on the screen are densely packed, the information processing apparatus 1 can improve the visibility of the objects displayed on the screen.
[b] Second Embodiment
In the first embodiment, after having displayed a context menu, the information processing apparatus 1 displays the content indicated by the menu item at the end position that is the result of the movement due to a drag operation of the selected menu item. However, the information processing apparatus 1 is not limited to this and may also be used in a case in which, on the basis of a drag trajectory of the selected menu item, selection of the menu item is canceled.
Thus, in a second embodiment, a description will be given of a case in which the information processing apparatus 1 cancels the selection of the menu item on the basis of a drag trajectory of the selected menu item.
Configuration of an Information Processing Apparatus According to a Second Embodiment
FIG. 6 is a functional block diagram illustrating the configuration of an information processing apparatus according to a second embodiment. The components having the same configuration as those in the information processing apparatus 1 illustrated in FIG. 1 are assigned the same reference numerals; therefore, overlapped descriptions of the configuration and the operation thereof will be omitted. The second embodiment is different from the first embodiment in that a continuous operation decision unit 31 is added.
The continuous operation decision unit 31 decides, on the basis of a movement trajectory of a menu item, whether to continue selecting the menu item.
As an example, if the movement trajectory of a menu item is a trajectory that returns to the vicinity of the starting position of the movement after the start of the movement, the continuous operation decision unit 31 decides not to continue selecting the menu item. Namely, the continuous operation decision unit 31 decides to stop selecting the menu item. In other words, after a drag of a menu item is started and the menu item is moved from the context menu by a predetermined distance or more, if the menu item is dragged such that it returns to the vicinity of the starting position, the continuous operation decision unit 31 decides that an operation to stop selecting the menu item has been performed. “The vicinity of the starting position” means, for example, that the distance between the starting position and the end position is within a predetermined threshold.
As another example, if the movement trajectory of the menu item is a trajectory that satisfies a predetermined condition after the start of the movement, the continuous operation decision unit 31 decides not to continue selecting the menu item. Namely, the continuous operation decision unit 31 decides to stop selecting the menu item. The trajectory that satisfies a predetermined condition is, as an example, a trajectory indicating a scratch gesture. The scratch gesture is a gesture that repeatedly travels back and forth across a certain center line in a short time.
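Both stop-selection conditions — a trajectory that returns near its starting position after moving a predetermined distance, and a scratch gesture — can be sketched as follows. This is hypothetical Python; the distance thresholds and the horizontal-reversal heuristic for the scratch gesture are illustrative assumptions, not values from the patent.

```python
import math


def stops_selection(trajectory, min_travel=50.0, near_threshold=10.0):
    """Decide to stop selecting when the drag trajectory returns near its start.

    `trajectory` is a list of (x, y) points; both thresholds are illustrative."""
    if len(trajectory) < 2:
        return False
    start, end = trajectory[0], trajectory[-1]
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    # The item must first move a predetermined distance away from the menu...
    moved_far = any(dist(start, p) >= min_travel for p in trajectory)
    # ...and then end within a predetermined threshold of the starting position.
    return moved_far and dist(start, end) <= near_threshold


def is_scratch_gesture(trajectory, min_reversals=4):
    """Detect a scratch gesture as repeated horizontal direction reversals."""
    reversals, prev_dx = 0, 0
    for (x0, _), (x1, _) in zip(trajectory, trajectory[1:]):
        dx = x1 - x0
        if dx * prev_dx < 0:  # horizontal direction changed sign
            reversals += 1
        if dx != 0:
            prev_dx = dx
    return reversals >= min_reversals
```

A drag that loops out and back triggers `stops_selection`, while a straight drag to a new position does not; a rapid zigzag triggers `is_scratch_gesture`.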
Example of a Menu Operation Process
An example of a menu operation process according to the second embodiment will be described with reference to FIG. 7. FIG. 7 is a schematic diagram illustrating an example of a menu operation process according to the second embodiment.
It is assumed that a pointing is performed at the position indicated by the reference numeral b1. Because no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets a canvas of a screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays the context menu near the pointing position. Here, in the context menu, the menu items of the “addition of a label” and the “addition of an image” are included.
Then, it is assumed that the menu item of the “addition of a label” is selected and then a drag b2 is started. Then, the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”. Here, the first operation is an operation to start a drag. The menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
Then, it is assumed that, after the menu item is moved from the context menu by a predetermined distance or more, the operation indicated by the drag b3 is performed such that the menu item returns to the vicinity of the starting position. Then, the continuous operation decision unit 31 decides that an operation to stop selecting the menu item of the “addition of a label” is performed. Then, the menu item display unit 26 prevents the menu item of the “addition of a label” from being displayed.
Consequently, by using the drag trajectory of the menu item of the context menu, the information processing apparatus 1 can decide whether the drag operation is selection of a menu item or a stop of the selection of the menu item.
Example of a Flowchart of the Menu Operation Process
FIG. 8 is a flowchart illustrating an example of the flow of the menu operation process according to the second embodiment.
As illustrated in FIG. 8, the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S31). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S31), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects a pointing to the screen.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to a screen is detected (Yes at Step S31), the context menu display unit 23 displays a context menu near the detected position (Step S32).
Then, the menu item selection operation decision unit 24 detects whether an operation to start a drag of a menu item is detected (Step S33). If the menu item selection operation decision unit 24 decides that an operation to start a drag of a menu item is not detected (No at Step S33), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
In contrast, if the menu item selection operation decision unit 24 decides that an operation to start a drag of a menu item is detected (Yes at Step S33), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S34). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
Then, the continuous operation decision unit 31 decides whether a close approach of the menu item to the starting position of the drag is detected (Step S35). A close approach to the starting position means that the menu item has entered the vicinity of the starting position, i.e., that, for example, the distance between the starting position and the end position is within a predetermined threshold.
If the continuous operation decision unit 31 decides that a close approach of the menu item to the starting position of the drag is detected (Yes at Step S35), the menu item display unit 26 prevents the subject menu item from being displayed (Step S36). Then, the menu operation process is ended.
In contrast, if the continuous operation decision unit 31 decides that a close approach of the menu item to the starting position of the drag is not detected (No at Step S35), the target selection operation decision unit 25 detects whether an operation to end the drag of the menu item is detected (Step S37). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S37), the target selection operation decision unit 25 repeats the decision process until the target selection operation decision unit 25 detects the subject operation.
In contrast, if the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is detected (Yes at Step S37), the menu item display unit 26 prevents the subject menu item from being displayed (Step S38).
Then, on the basis of the position in which the drag has been ended, the command calculation unit 27 calculates the command indicated by the menu item (Step S39). Then, the menu operation process is ended.
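The overall flow of FIG. 8 can be sketched as a small event handler. The event tuple format and the action names below are assumptions made for the example; the near-start decision of Step S35 is supplied by the caller rather than recomputed here.

```python
def menu_drag_flow(events):
    # events: list of (kind, payload) tuples in the order they occur:
    #   ("pointing", position)               -> show the context menu (S32)
    #   ("drag_start", item_name)            -> hide the other items (S34)
    #   ("drag_end", (position, near_start)) -> cancel (S36) or execute (S39)
    # Returns the display/command actions the flow would perform.
    actions = []
    for kind, payload in events:
        if kind == "pointing":
            actions.append(("show_context_menu", payload))
        elif kind == "drag_start":
            actions.append(("hide_other_items", payload))
        elif kind == "drag_end":
            position, near_start = payload
            actions.append(("hide_item", position))
            if not near_start:  # Step S35: not a return to the start
                actions.append(("execute_command", position))
    return actions
```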
Another Example of the Menu Operation Process
Another example of the menu operation process according to the second embodiment will be described with reference to FIG. 9. FIG. 9 is a schematic diagram illustrating another example of the menu operation process according to the second embodiment.
It is assumed that a pointing is performed at the position indicated by the reference numeral b1. Then, because no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets a canvas of a screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays a context menu near the pointing position. Here, in the context menu, the menu items of the “addition of a label” and the “addition of an image” are included.
Then, it is assumed that the menu item of the “addition of a label” is selected and the drag b2 is started. Then, the menu item selection operation decision unit 24 detects the first operation that is used to select the menu item of the “addition of a label”. Here, the first operation is an operation to start a drag. The menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
Then, it is assumed that a trajectory of the drag operation of the menu item is a scratch gesture b4. Then, the continuous operation decision unit 31 decides that an operation to stop selecting the menu item of the “addition of a label” has been performed. Then, the menu item display unit 26 prevents the menu item of the “addition of a label” from being displayed.
Consequently, by using the drag trajectory of the menu item included in the context menu, the information processing apparatus 1 can decide whether the drag operation is selection of a menu item or a stop of selection of the menu item.
Another Example of a Flowchart of the Menu Operation Process
FIG. 10 is a flowchart illustrating another example of the flow of the menu operation process according to the second embodiment.
As illustrated in FIG. 10, the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S41). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S41), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects a pointing to the screen.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to a screen is detected (Yes at Step S41), the context menu display unit 23 displays a context menu near the detected position (Step S42).
Then, the menu item selection operation decision unit 24 detects whether an operation to start a drag of the menu item is detected (Step S43). If the menu item selection operation decision unit 24 decides that an operation to start a drag of the menu item is not detected (No at Step S43), the menu item selection operation decision unit 24 repeats the decision process until the menu item selection operation decision unit 24 detects the subject operation.
In contrast, if the menu item selection operation decision unit 24 decides that an operation to start a drag of the menu item is detected (Yes at Step S43), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S44). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
Then, the continuous operation decision unit 31 decides whether a scratch gesture is detected (Step S45). Namely, the continuous operation decision unit 31 decides whether the trajectory of the drag operation of the menu item is a trajectory that indicates a scratch gesture after the start of the drag. If the continuous operation decision unit 31 decides that a scratch gesture is detected (Yes at Step S45), the menu item display unit 26 prevents the menu item targeted for the drag from being displayed (Step S46). Then, the menu operation process is ended.
In contrast, if the continuous operation decision unit 31 decides that a scratch gesture is not detected (No at Step S45), the target selection operation decision unit 25 detects whether an operation to end the drag of the menu item is detected (Step S47). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S47), the target selection operation decision unit 25 repeats the decision process until the target selection operation decision unit 25 detects the subject operation.
In contrast, if the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is detected (Yes at Step S47), the menu item display unit 26 prevents the subject menu item from being displayed (Step S48).
Then, on the basis of the end position of the drag, the command calculation unit 27 calculates the command indicated by the menu item (Step S49). Then, the menu operation process is ended.
Effect of the Second Embodiment
In this way, in the second embodiment described above, the information processing apparatus 1 decides, on the basis of the movement trajectory of the menu item selected in the context menu, whether to continue to select the menu item. With this configuration, by using the movement trajectory of the menu item in the context menu, the information processing apparatus 1 can decide whether a move operation moves the menu item, continues the operation of the menu item, or stops the operation of the menu item.
Furthermore, in the second embodiment described above, the information processing apparatus 1 decides whether the movement trajectory of the menu item is the trajectory that returns to the vicinity of the starting position of the movement after the movement is started. If the information processing apparatus 1 decides that the movement trajectory of the menu item is the trajectory that returns to the vicinity of the starting position of the movement after the movement is started, the information processing apparatus 1 stops selecting the menu item. With this configuration, by using the movement trajectory of the menu item in the context menu, the information processing apparatus 1 can decide whether the move operation moves the menu item or stops the operation of the menu item.
Furthermore, in the second embodiment described above, the information processing apparatus 1 decides whether the movement trajectory of the menu item is the trajectory that indicates the scratch gesture after the movement is started. If the information processing apparatus 1 decides that the movement trajectory of the menu item is the trajectory that indicates a scratch gesture after the movement is started, the information processing apparatus 1 stops selecting the menu item. With this configuration, by using the movement trajectory of the menu item in the context menu, the information processing apparatus 1 can easily decide that the movement operation is a stop of the operation of the menu item.
[c] Third Embodiment
In the first embodiment, after the information processing apparatus 1 displays a context menu, the information processing apparatus 1 displays the content indicated by the menu item at the end position that is the result of the movement due to the drag operation of the selected menu item. In the second embodiment, the information processing apparatus 1 cancels the selection of the menu item on the basis of the drag trajectory of the selected menu item. However, the information processing apparatus 1 is not limited to these but may also further continuously operate the menu item on the basis of the drag trajectory of the selected menu item.
Thus, in the third embodiment, a description will be given of a case in which the information processing apparatus 1 further continuously operates the menu item on the basis of the drag trajectory of the selected menu item.
Configuration of an Information Processing Apparatus According to a Third Embodiment
FIG. 11 is a functional block diagram illustrating the configuration of an information processing apparatus according to a third embodiment. The components having the same configuration as those in the information processing apparatus 1 illustrated in FIG. 6 are assigned the same reference numerals; therefore, overlapped descriptions of the configuration and the operation thereof will be omitted. The third embodiment is different from the second embodiment in that the menu item selection operation decision unit 24 is changed to a menu item selection operation decision unit 24A. Furthermore, the third embodiment is different from the second embodiment in that the continuous operation decision unit 31 is changed to a continuous operation decision unit 31A.
The menu item selection operation decision unit 24A detects the first operation that is used to select a menu item in the context menu. The first operation is, for example, an operation to start a drag. For example, if the same menu item in the context menu is touched several times, the menu item selection operation decision unit 24A decides that the first operation has been performed. The touch mentioned here means a “touchdown”.
The continuous operation decision unit 31A decides, on the basis of a movement trajectory of the menu item, whether to continue to select the menu item.
As an example, if the continuous operation decision unit 31A decides that the first operation has been performed due to the same menu item being touched several times, the continuous operation decision unit 31A decides whether an amount of movement of the movement trajectory of the menu item indicated by a second touch and the subsequent touches is zero. If the amount of movement of the movement trajectory of the menu item indicated by the second touch and the subsequent touches is zero, the continuous operation decision unit 31A decides to continue to select the menu item. For example, it is assumed that the first operation has been performed on a certain menu item indicated by the first touch and the second touch. Then, even if the menu item is dragged due to the first touch, if the touch position of the second touch is not moved, i.e., if an amount of movement is zero, the continuous operation decision unit 31A decides that the operation to select the menu item is continued.
As another example, the continuous operation decision unit 31A decides whether a predetermined time has elapsed for a time period from the end of the first movement trajectory of the menu item to the start of the second movement trajectory of the subject menu item. If the predetermined time has not elapsed for a time period from the end of the first movement trajectory of the menu item to the start of the second movement trajectory of the subject menu item, the continuous operation decision unit 31A decides that the operation to select a menu item is continued.
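Both continuation checks of the continuous operation decision unit 31A can be sketched as follows. The tolerance and timeout values, the function names, and the sample format are assumptions for illustration only.

```python
import math

def second_touch_holds(second_touch_samples, tolerance=0.0):
    # Selection continues when the second touch has an amount of movement
    # of zero, i.e. every sample stays at (within tolerance of) the first.
    if not second_touch_samples:
        return False
    sx, sy = second_touch_samples[0]
    return all(
        math.hypot(x - sx, y - sy) <= tolerance
        for x, y in second_touch_samples
    )

def within_continuation_window(first_drag_end_t, second_drag_start_t,
                               timeout=2.0):
    # Selection also continues when the next drag of the same menu item
    # starts before the predetermined time has elapsed since the last drag.
    return (second_drag_start_t - first_drag_end_t) <= timeout
```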
Example of a Menu Operation Process
An example of a menu operation process according to the third embodiment will be described with reference to FIG. 12. FIG. 12 is a schematic diagram illustrating an example of a menu operation process according to a third embodiment.
It is assumed that a pointing is performed at the position indicated by a reference numeral c1. Then, because no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets a canvas of the screen to the target for the pointing. Then, when the pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays a context menu near the pointing position. Here, in the context menu, the menu items of the “addition of a label” and the “addition of an image” are included.
Then, it is assumed that the menu item of the “addition of a label” is selected by the first touch c2 and the second touch c3. Then, the menu item selection operation decision unit 24A decides that the first operation has been performed. Here, the first operation is an operation to start a drag.
Then, it is assumed that a drag c4 of the menu item of the “addition of a label” is started from the first touch c2. Furthermore, the touch position of the second touch c3 does not move. Then, the menu item display unit 26 changes the display position of the menu item of the “addition of a label” on the basis of the drag operation.
Then, the drag c4 of the menu item of the “addition of a label” is ended. Then, the target selection operation decision unit 25 detects the second operation that is used to select the position at which the command indicated by the menu item of the “addition of a label” is executed. Here, the second operation is an operation to end the drag.
Then, the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag c4 has been ended. Here, the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag c4 has been ended. Consequently, as indicated by a reference numeral c5, the label is added to the position in which the drag c4 has been ended.
Subsequently, the continuous operation decision unit 31A decides whether an amount of movement of the movement trajectory of the menu item of the “addition of a label” due to the second touch c3 is zero. If an amount of movement of the movement trajectory of the menu item of the “addition of a label” due to the second touch c3 is zero, the continuous operation decision unit 31A decides that the operation to select the menu item is continued. Here, because the touch position of the second touch c3 does not move, the continuous operation decision unit 31A decides that the operation to select the menu item of the “addition of a label” is continued.
Then, if a drag c6 of the menu item of the “addition of a label” is started from the second touch c3, the user can continuously operate the menu item of the “addition of a label”. Namely, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”. If the drag c6 of the menu item of the “addition of a label” has been ended, the command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag c6 has been ended and then the label is added.
Consequently, the information processing apparatus 1 can continuously operate the menu item of the context menu.
Example of a Flowchart of a Menu Operation Process
FIG. 13 is a flowchart illustrating an example of the flow of the menu operation process according to the third embodiment.
As illustrated in FIG. 13, the pointing operation acquisition unit 22 decides whether a pointing to the screen is detected (Step S51). If the pointing operation acquisition unit 22 decides that a pointing to the screen is not detected (No at Step S51), the pointing operation acquisition unit 22 repeats the decision process until the pointing operation acquisition unit 22 detects a pointing to the screen.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to the screen is detected (Yes at Step S51), the context menu display unit 23 displays the context menu near the detected position (Step S52).
Then, the menu item selection operation decision unit 24A decides whether two or more touches of the menu items are detected (Step S53). If the menu item selection operation decision unit 24A decides that two or more touches of the menu item are not detected (No at Step S53), the menu item selection operation decision unit 24A repeats the decision process until the subject operation is detected.
In contrast, if the menu item selection operation decision unit 24A decides that two or more touches of the menu items are detected (Yes at Step S53), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S54). Then, the menu item display unit 26 duplicates the touched menu item at each of the detected touch positions and displays each of the duplicated menu items (Step S55).
The menu item display unit 26 decides whether one or more movements of the touch positions are detected (Step S56). If the menu item display unit 26 decides that no movement of the touch position is detected (No at Step S56), the menu item display unit 26 proceeds to Step S58.
In contrast, if the menu item display unit 26 decides that one or more movements of the touch positions are detected (Yes at Step S56), the menu item display unit 26 changes the display position of the menu item to each of the touch positions (Step S57). Then, the menu item display unit 26 proceeds to Step S58.
At Step S58, the target selection operation decision unit 25 decides whether release of one or more touches is detected (Step S58). If the target selection operation decision unit 25 decides that no release of the touches is detected (No at Step S58), the menu item display unit 26 proceeds to Step S56 in order to detect a movement of the touch position.
In contrast, if the target selection operation decision unit 25 decides that one or more releases of the touches are detected (Yes at Step S58), the menu item display unit 26 prevents the menu item from which the touch has been released from being displayed (Step S59). Then, on the basis of the position in which the touch has been released, the command calculation unit 27 executes the command that is indicated by the menu item from which the touch has been released (Step S60).
Then, the continuous operation decision unit 31A decides whether there is a touch in which no movement of the touch position is detected (Step S61). For example, the continuous operation decision unit 31A decides whether an amount of movement of the movement trajectory of the menu item due to the touch in which no movement of the touch position is detected is zero. If the continuous operation decision unit 31A decides that there is a touch in which no movement of the touch position is detected (Yes at Step S61), the continuous operation decision unit 31A proceeds to Step S56 in order to detect a movement of a touch position. This is because the continuous operation decision unit 31A decides that the operation is a continuous operation of the menu item.
In contrast, if the continuous operation decision unit 31A decides that there is no touch in which no movement of the touch position is detected (No at Step S61), the menu operation process is ended.
Another Example of the Menu Operation Process
Another example of the menu operation process according to the third embodiment will be described with reference to FIG. 14. FIG. 14 is a schematic diagram illustrating another example of the menu operation process according to the third embodiment.
It is assumed that a pointing is performed at the position indicated by a reference numeral d1. Then, because no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets the canvas of the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays the context menu near the pointing position. Here, in the context menu, the menu items of the “addition of a label” and the “addition of an image” are included.
Then, it is assumed that the menu item of the “addition of a label” is selected and a drag d2 is started. Then, the menu item selection operation decision unit 24A detects the first operation that is used to select the menu item of the “addition of a label”. Here, the first operation is an operation to start a drag. The menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the “addition of a label”.
Then, it is assumed that the drag d2 of the menu item of the “addition of a label” has been ended. Then, the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is performed. Here, the second operation is an operation to end the drag. The command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag d2 has been ended. Here, the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag d2 has been ended. Consequently, as indicated by a reference numeral d3, the label is added to the position in which the drag d2 has been ended.
Then, the continuous operation decision unit 31A decides whether a predetermined time has elapsed for a time period from the end of the movement trajectory of the menu item of the “addition of a label” to the start of the subsequent movement trajectory of the menu item of the same “addition of a label”. Here, it is assumed that the continuous operation decision unit 31A decides that a predetermined time has not elapsed for a time period from the end of the movement trajectory of the menu item of the “addition of a label” to the start of the subsequent movement trajectory of the menu item of the same “addition of a label”. At this point, it is assumed that the menu item selection operation decision unit 24A detects the first operation that is used to select the menu item of the “addition of a label”. Then, the menu item display unit 26 changes, on the basis of the drag operation d4, the display position of the menu item of the “addition of a label”.
Then, it is assumed that a drag d4 of the menu item of the “addition of a label” has been ended. Then, the target selection operation decision unit 25 detects the second operation that is used to select the position in which the command indicated by the menu item of the “addition of a label” is executed. The command calculation unit 27 calculates the command indicated by the menu item of the “addition of a label” at the position in which the drag d4 has been ended. At this point, the command calculation unit 27 calculates the command to perform the “addition of a label” at the position in which the drag d4 has been ended. Consequently, as indicated by the reference numeral d5, the label is added at the position in which the drag d4 has been ended.
Then, if a predetermined time has elapsed for a time period from the end of the movement trajectory of the menu item of the “addition of a label” to the start of the subsequent movement trajectory of the menu item of the same “addition of a label”, as indicated by a reference numeral d6, the continuous operation decision unit 31A prevents the menu item of the “addition of a label” from being displayed.
Consequently, the information processing apparatus 1 can continuously operate the menu item included in the context menu.
Another Example of the Flowchart of the Menu Operation Process
FIG. 15 is a flowchart illustrating another example of the flow of the menu operation process according to the third embodiment.
As illustrated in FIG. 15, the pointing operation acquisition unit 22 decides whether a pointing to the screen is detected (Step S71). If the pointing operation acquisition unit 22 decides that a pointing to the screen is not detected (No at Step S71), the pointing operation acquisition unit 22 repeats the decision process until a pointing to the screen is detected.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to the screen is detected (Yes at Step S71), the context menu display unit 23 displays the context menu near the detected position (Step S72).
Subsequently, the menu item selection operation decision unit 24A detects whether an operation to start a drag of the menu item is detected (Step S73). If the menu item selection operation decision unit 24A decides that an operation to start a drag of the menu item is not detected (No at Step S73), the menu item selection operation decision unit 24A repeats the decision process until the subject operation is detected.
In contrast, if the menu item selection operation decision unit 24A decides that an operation to start a drag of the menu item is detected (Yes at Step S73), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S74). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
Then, the target selection operation decision unit 25 detects whether an operation to end the drag of the menu item is detected (Step S75). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S75), the target selection operation decision unit 25 repeats the decision process until the target selection operation decision unit 25 detects the subject operation.
In contrast, if the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is detected (Yes at Step S75), the command calculation unit 27 executes, on the basis of the position in which the drag has been ended, the command indicated by the menu item (Step S76).
Then, the menu item selection operation decision unit 24A detects whether an operation to start a drag of the menu item is detected (Step S77). If the menu item selection operation decision unit 24A decides that an operation to start a drag of the menu item is detected (Yes at Step S77), the target selection operation decision unit 25 proceeds to Step S75 in order to detect the end of the drag of the menu item.
In contrast, if the menu item selection operation decision unit 24A decides that an operation to start a drag of the menu item is not detected (No at Step S77), the continuous operation decision unit 31A decides whether a predetermined time has elapsed from the end of the drag performed last time (Step S78). If the continuous operation decision unit 31A decides that a predetermined time has not elapsed from the end of the drag performed last time (No at Step S78), the menu item selection operation decision unit 24A proceeds to Step S77 in order to detect the start of the drag of the menu item.
In contrast, if the continuous operation decision unit 31A decides that a predetermined time has elapsed from the end of the drag performed last time (Yes at Step S78), the menu item display unit 26 prevents the menu item targeted for the drag from being displayed (Step S79). Then, the menu operation process is ended.
Effect of the Third Embodiment
In this way, in the third embodiment described above, the information processing apparatus 1 selects a menu item due to the first touch and the second touch. The information processing apparatus 1 allows the menu item to be moved due to the first touch and executes, at the end position of the movement, the command indicated by the menu item. The information processing apparatus 1 decides whether an amount of movement of the movement trajectory of the menu item due to the second touch is zero. If the information processing apparatus 1 decides that an amount of movement due to the second touch is zero, the information processing apparatus 1 continues to select the menu item. With this configuration, the information processing apparatus 1 can allow the menu item in the context menu to be continuously operated.
Furthermore, in the third embodiment described above, the information processing apparatus 1 decides whether a predetermined time has elapsed for a time period from the end of a first movement trajectory of the menu item to the start of a second movement trajectory of the subject menu item. If the information processing apparatus 1 decides that a predetermined time has not elapsed, the information processing apparatus 1 continues to select the subject menu item. With this configuration, the information processing apparatus 1 can allow the menu item in the context menu to be continuously operated.
[d] Fourth Embodiment
In the first embodiment, after having displayed a context menu, the information processing apparatus 1 displays the content indicated by the menu item at the end position that is the result of a movement of the selected menu item due to the drag operation. In the second embodiment, the information processing apparatus 1 cancels a selection of the menu item on the basis of the drag trajectory of the selected menu item. In the third embodiment, the information processing apparatus 1 further continuously operates the menu item on the basis of the drag trajectory of the selected menu item. However, the information processing apparatus 1 is not limited to this and may also further change the display of the menu item in accordance with the target of the dragged menu item. The target mentioned here is an object, a canvas, or the like.
For example, if a menu item is a “deletion of a label”, with the operation decided by the target selection operation decision unit 25, an object targeted for deletion needs to be selected. Here, the object targeted for the deletion is a label. Namely, if a menu item is not dragged to the position of the label targeted for the deletion, the command indicated by the “deletion of a label” is not able to be executed. Thus, in a fourth embodiment, a description will be given of a case in which the information processing apparatus 1 further changes the display of the menu item in accordance with the target of a dragged menu item.
Configuration of an Information Processing Apparatus According to the Fourth Embodiment
Because the functional configuration of the information processing apparatus 1 according to the fourth embodiment is the same as that of the third embodiment, the configuration thereof will be omitted. The fourth embodiment differs from the third embodiment in that an operation related to the fourth embodiment is added to the command calculation unit 27.
On the basis of the menu item selected by the first operation, the target on which the selected command is executed by the second operation, and the second operation itself, the command calculation unit 27 calculates a command to be executed. The target mentioned here indicates, for example, the position at which a command is executed, or an object or a canvas located at the subject position. Furthermore, the command calculation unit 27 decides whether the command indicated by the menu item can be executed at the position to which the menu item has been dragged. Furthermore, the command calculation unit 27 may decide whether the command can be executed either at each dragged position or at the position selected by the second operation.
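The executability decision made by the command calculation unit 27 can be sketched as a hit test against the objects on the canvas. The class and function names below, and the rule that a "deletion of a label" requires a label at the dragged position while an "addition of a label" can be executed anywhere, follow the fourth embodiment's example; everything else is an illustrative assumption.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Label:
    """An object (label) displayed on the canvas, as an axis-aligned box."""
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        return (self.x <= px < self.x + self.width and
                self.y <= py < self.y + self.height)

def find_target_label(labels, px, py) -> Optional[Label]:
    """Return the label located at the dragged position, if any; when no
    object is present there, the canvas itself is the target."""
    for label in labels:
        if label.contains(px, py):
            return label
    return None

def can_execute(menu_item: str, labels, px, py) -> bool:
    """Decide whether the command indicated by the menu item can be
    executed at the dragged position (illustrative rules only)."""
    if menu_item == "deletion of a label":
        # Deletion needs an object (a label) at the position; otherwise the
        # label in which the command is executed cannot be specified.
        return find_target_label(labels, px, py) is not None
    if menu_item == "addition of a label":
        # Addition can be performed anywhere on the canvas.
        return True
    return False
```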
Example of a Menu Operation Process
An example of a menu operation process according to the fourth embodiment will be described with reference to FIG. 16. FIG. 16 is a schematic diagram illustrating an example of a menu operation process according to a fourth embodiment. In FIG. 16, before a menu operation process is performed, five objects are displayed. Here, it is assumed that these objects are labels.
It is assumed that a pointing is performed at a reference numeral e1. Then, because no object is displayed at the pointing position, the pointing operation acquisition unit 22 sets the canvas of the screen to the target for the pointing. Then, when a pointing operation is acquired by the pointing operation acquisition unit 22, the context menu display unit 23 displays the context menu near the pointing position. Here, in the context menu, the menu items of the “addition of a label” and the “deletion of a label” are included.
In the following, a description will be given of a case in which the command indicated by the menu item can be executed. It is assumed that the menu item of the “deletion of a label” is selected and a drag e2 is started. Then, the menu item selection operation decision unit 24A detects the first operation that is used to select the menu item of the “deletion of a label”. Here, the first operation is an operation to start a drag. The menu item display unit 26 changes the display position of the menu item of the “deletion of a label” on the basis of the drag operation.
Then, it is assumed that the drag e2 of the menu item of the “deletion of a label” has been ended. Then, the command calculation unit 27 decides whether the command indicated by the menu item of the “deletion of a label” can be executed at the dragged position. Here, because a position e3 in which the drag has been ended is the position of the label, the command calculation unit 27 decides that the command indicated by the menu item of the “deletion of a label” can be executed at the dragged position.
Then, the command calculation unit 27 calculates the command indicated by the menu item of the “deletion of a label” at the position e3 in which the drag e2 has been ended. Here, the command calculation unit 27 calculates the command to perform the “deletion of a label” at the position e3 in which the drag e2 has been ended. Consequently, as indicated by the reference numeral e4, the label is deleted.
In the following, a description will be given of a case in which the command indicated by a menu item is not able to be executed. It is assumed that the menu item of the "deletion of a label" is selected and a drag e5 is started. Then, the menu item selection operation decision unit 24A detects the first operation that is used to select the menu item of the "deletion of a label". Here, the first operation is an operation to start the drag. The menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item of the "deletion of a label".
Then, it is assumed that the drag e5 of the menu item of the “deletion of a label” has been ended. Then, the command calculation unit 27 decides whether the command indicated by the menu item of the “deletion of a label” can be executed at the dragged position. Here, because a dragged position e6 is not present on the label, the command calculation unit 27 decides that the command indicated by the menu item of the “deletion of a label” is not able to be executed at the dragged position. This is because the command calculation unit 27 is not able to specify the label in which the command is executed. Thus, the menu item display unit 26 changes the display of the menu item in order to indicate that the command is not able to be performed. As an example, the menu item display unit 26 changes the color of the subject menu item to a color different from the normal color. If the normal color is black, the menu item display unit 26 changes the color of the subject menu item to red.
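The display change performed by the menu item display unit 26 reduces to a choice between the normal color and a color indicating that the command cannot be executed. The black/red pair follows the example given above; the function name is an illustrative assumption.

```python
NORMAL_COLOR = "black"    # example normal color from the description
ABNORMAL_COLOR = "red"    # example color indicating the command cannot run

def menu_item_color(command_executable: bool) -> str:
    """Pick the display color of the dragged menu item: the normal color
    when the command can be executed at the current position, and a
    different color to indicate that it cannot."""
    return NORMAL_COLOR if command_executable else ABNORMAL_COLOR
```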
Consequently, the information processing apparatus 1 can improve the operation efficiency with respect to a menu item.
Example of a Flowchart of a Menu Operation Process
FIG. 17 is a flowchart illustrating an example of the flow of the menu operation process according to the fourth embodiment.
As illustrated in FIG. 17, the pointing operation acquisition unit 22 decides whether a pointing to a screen is detected (Step S81). If the pointing operation acquisition unit 22 decides that a pointing to a screen is not detected (No at Step S81), the pointing operation acquisition unit 22 repeats the decision process until a pointing to the screen is detected.
In contrast, if the pointing operation acquisition unit 22 decides that a pointing to a screen is detected (Yes at Step S81), the context menu display unit 23 displays the context menu near the detected position (Step S82).
Then, the menu item selection operation decision unit 24A detects whether an operation to start a drag of the menu item is detected (Step S83). If the menu item selection operation decision unit 24A decides that an operation to start a drag of the menu item is not detected (No at Step S83), the menu item selection operation decision unit 24A repeats the decision process until the subject operation is detected.
In contrast, if the menu item selection operation decision unit 24A decides that an operation to start a drag of the menu item is detected (Yes at Step S83), the menu item display unit 26 prevents the other menu items included in the context menu from being displayed (Step S84). Furthermore, the menu item display unit 26 changes, on the basis of the drag operation, the display position of the menu item targeted for the drag.
Then, the command calculation unit 27 decides whether the command indicated by the menu item can be executed on the dragged position (Step S85). If the command calculation unit 27 decides that the command indicated by the menu item is not able to be executed on the dragged position (No at Step S85), the menu item display unit 26 sets the display of the subject menu item to an abnormal display (Step S86). Namely, because the command calculation unit 27 is not able to perform the command indicated by the menu item at the dragged position, the menu item display unit 26 changes the display of the subject menu item to the display indicating that the command is not able to be executed. Then, the menu item display unit 26 proceeds to Step S88.
In contrast, if the command calculation unit 27 decides that the command indicated by the menu item can be executed on the dragged position (Yes at Step S85), the menu item display unit 26 sets the subject menu item to a normal display (Step S87). Then, the menu item display unit 26 proceeds to Step S88.
Then, the target selection operation decision unit 25 detects whether an operation to end the drag of the menu item is detected (Step S88). If the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is not detected (No at Step S88), the command calculation unit 27 proceeds to Step S85 in order to perform execution decision of the command at the dragged position.
In contrast, if the target selection operation decision unit 25 decides that an operation to end the drag of the menu item is detected (Yes at Step S88), the menu item display unit 26 prevents the subject menu item from being displayed (Step S89).
Then, the command calculation unit 27 calculates the command indicated by the menu item at the position in which the drag has been ended (Step S90). Then, the menu operation process is ended.
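The flow of FIG. 17 (Steps S81 to S90) can be summarized, as a non-limiting sketch, by a small state machine driven by input events. The event names, the `can_execute_at` callback standing in for the command calculation unit's executability decision, and the returned action log are all illustrative assumptions.

```python
def run_menu_operation(events, can_execute_at):
    """Minimal sketch of the FIG. 17 flow: `events` is a sequence of
    (kind, position) tuples and `can_execute_at(pos)` stands in for the
    command calculation unit 27.  Returns a log of the actions each unit
    would perform; names are illustrative."""
    log = []
    state = "await_pointing"                              # Step S81 loop
    for kind, pos in events:
        if state == "await_pointing" and kind == "pointing":
            log.append(("show_context_menu", pos))        # Step S82
            state = "await_drag_start"
        elif state == "await_drag_start" and kind == "drag_start":
            log.append(("hide_other_menu_items", pos))    # Steps S83-S84
            state = "dragging"
        elif state == "dragging" and kind == "drag_move":
            # Steps S85-S87: normal vs abnormal display at each position.
            display = "normal" if can_execute_at(pos) else "abnormal"
            log.append(("set_display", display))
        elif state == "dragging" and kind == "drag_end":  # Step S88
            log.append(("hide_menu_item", pos))           # Step S89
            log.append(("execute_command", pos))          # Step S90
            state = "done"
    return log
```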
In the first to the fourth embodiments described above, the information processing apparatus 1 displays a context menu and then displays, at the end position that is the result of a movement due to the drag operation of the selected menu item, the content indicated by the menu item. However, the information processing apparatus 1 is not limited to this. The context menu may also include both a menu item available to be dragged and a menu item unavailable to be dragged, and the drag may be disabled for the latter. The menu item unavailable to be dragged is a menu item for which changing the position is meaningless, for example, a menu item whose command is executed over the entire screen.
Here, an example of a display of a context menu that includes therein both a menu item available to be dragged and a menu item unavailable to be dragged will be described with reference to FIG. 18. FIG. 18 is a schematic diagram illustrating an example of a display when the context menu includes therein both the menu items available to be dragged and the menu items unavailable to be dragged. As illustrated in FIG. 18, in the context menu, the menu items available to be dragged and the menu items unavailable to be dragged are included. As the menu items available to be dragged, the "addition of a label" and the "addition of an image" are displayed. Then, in each of the menu items available to be dragged, the mark indicating that a drag can be performed is displayed. As the menu items unavailable to be dragged, a "change in the background color" and a "zoom of the screen" are displayed. Then, in each of the menu items unavailable to be dragged, the mark indicating that a drag can be performed is not displayed. Namely, with the menu items indicated by the "change in the background color" and the "zoom of the screen", because the background color and the zoom of the entire screen are changed, the menu item may be located anywhere. Furthermore, in order to indicate that the menu item unavailable to be dragged is unable to be dragged, the following process may also be performed. For example, the target selection operation decision unit 25 performs, on the menu item unavailable to be dragged, a process of setting the drag invalid.
Furthermore, another example of the display of the context menu that includes therein both the menu items available to be dragged and the menu items unavailable to be dragged will be described with reference to FIG. 19. FIG. 19 is a schematic diagram illustrating another example of a display when the context menu includes therein both the menu items available to be dragged and the menu items unavailable to be dragged. As illustrated in FIG. 19, in the context menu, the menu items available to be dragged and the menu items unavailable to be dragged are included. As the menu items available to be dragged, the “addition of a label” and the “addition of an image” are displayed. As the menu items unavailable to be dragged, the “change in the background color” and the “zoom of the screen” are displayed. In order to indicate that the menu item unavailable to be dragged is unable to be dragged, the following process is performed. For example, the menu item display unit 26 displays the menu item at the drag position in accordance with the drag. In a case of the menu item unavailable to be dragged, when the drag reaches a predetermined distance, the continuous operation decision unit 31A decides not to select the menu item. Then, the menu item display unit 26 returns the menu item to the position of the menu item that is located before the menu item is dragged.
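The snap-back behavior of FIG. 19 can be sketched as follows. The example menu item names come from the description above; the concrete value of the "predetermined distance" and the function name are illustrative assumptions.

```python
import math

# Example menu items of each kind, from the description above.
DRAGGABLE = {"addition of a label", "addition of an image"}
UNDRAGGABLE = {"change in the background color", "zoom of the screen"}

SNAP_BACK_DISTANCE = 30.0  # the "predetermined distance"; value is illustrative

def drag_position(menu_item, start, current):
    """Follow the drag for a menu item available to be dragged; for a menu
    item unavailable to be dragged, return the item to its original
    position once the drag reaches the predetermined distance."""
    if menu_item in DRAGGABLE:
        return current
    dist = math.hypot(current[0] - start[0], current[1] - start[1])
    return start if dist >= SNAP_BACK_DISTANCE else current
```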
Consequently, the information processing apparatus 1 can improve the operational efficiency with respect to the menu item.
Others
Furthermore, the information processing apparatus 1 can be implemented by mounting each of the functions, such as the display device 10, the control unit 20, or the like described above, on an information processing apparatus, such as a known personal computer, workstation, or the like.
Furthermore, the components of each device illustrated in the drawings are not always physically configured as illustrated in the drawings. In other words, the specific shape of a separate or integrated device is not limited to the drawings; however, all or part of the device can be configured by functionally or physically separating or integrating any of the units depending on various loads or use conditions. For example, the operation acquisition unit 21 and the pointing operation acquisition unit 22 may also be integrated as a single unit. In contrast, the continuous operation decision unit 31 may also be separated into a decision unit that detects a stop operation and a decision unit that detects a continuous operation.
Furthermore, the various processes described in the embodiments can be implemented by a program prepared in advance and executed by a computer such as a personal computer or a workstation. Accordingly, in the following, an example of a computer that executes an information processing program that implements the same function as that performed by the information processing apparatus 1 illustrated in FIG. 1 will be described. FIG. 20 is a schematic diagram illustrating an example of a computer that executes an information processing program.
As illustrated in FIG. 20, a computer 200 includes a CPU 203 that executes various kinds of arithmetic processing, an input device 215 that accepts an input of data from a user, and a display control unit 207 that controls a display device 209. Furthermore, the computer 200 includes a drive device 213 that reads a program or the like from a storage medium and a communication control unit 217 that gives and receives data to and from another computer via a network. Furthermore, the computer 200 includes a memory 201 that temporarily stores therein various kinds of information and an HDD 205. Then, the memory 201, the CPU 203, the HDD 205, the display control unit 207, the drive device 213, the input device 215, and the communication control unit 217 are connected by a bus 219.
The drive device 213 is a device used for, for example, a removable disk 211.
The CPU 203 reads an information processing program 205 a, loads the program in the memory 201, and executes the program as a process. The process is associated with each of the functioning units included in the information processing apparatus 1. Information processing related information 205 b is associated with the information stored in a storage unit that is not illustrated. Then, for example, the removable disk 211 stores therein each of the pieces of the information, such as the information processing program 205 a or the like.
Furthermore, the information processing program 205 a is not always stored in the HDD 205 from the beginning. For example, the program is stored in a "portable physical medium", such as a flexible disk (FD), a CD-ROM, a DVD disk, a magneto-optic disk, an IC card, or the like, that is to be inserted into the computer 200. Then, the computer 200 may also read and execute the information processing program 205 a from the portable physical medium.
According to an aspect of an embodiment, it is possible to improve the efficiency of an operation to display a context menu. Furthermore, even if the context menu is displayed, it is possible to improve the visibility of an object that is already displayed.
All examples and conditional language recited herein are intended for pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.

Claims (7)

What is claimed is:
1. An information processing apparatus comprising:
a processor that executes a process including:
first selecting a menu item in a context menu that is displayed at a location that is not on an object to which the menu item is to be executed;
moving the selected menu item to an end position, or pointing to the end position by a predetermined operation, the end position existing within the object;
second selecting the end position; and
executing, at the end position selected at the second selecting, the menu item, wherein
the first selecting includes selecting the menu item by a first touch and a second touch at a same time, wherein the menu item is then duplicated as a first menu item and a second menu item,
the moving includes moving the first menu item by the first touch to the end position by the predetermined operation,
the second selecting includes selecting the end position, and the process further includes continuing, when deciding that the second touch is continued at the first selecting, processes of the moving, the second selecting and the executing for the second menu item and the second touch.
2. The information processing apparatus according to claim 1, the process further including:
deciding, on a basis of a trajectory in which the menu item is moved by the predetermined operation, whether to continue to select the menu item.
3. The information processing apparatus according to claim 2, wherein
the deciding includes deciding whether the trajectory is a first trajectory of returning to a vicinity of a starting position of the trajectory, and the process further includes:
stopping selecting the menu item when the deciding decides that the trajectory is the first trajectory.
4. The information processing apparatus according to claim 2, wherein
the deciding includes deciding whether the trajectory is a second trajectory that satisfies a predetermined condition, and the process further includes:
stopping selecting the menu item when the deciding decides that the trajectory is the second trajectory.
5. The information processing apparatus according to claim 2, wherein
the deciding includes deciding whether a predetermined time has elapsed since the menu item is moved to the end position and is left positioned thereat, and the process further includes:
continuing, when the deciding decides that the predetermined time has not elapsed, processes of the moving, the second selecting and the executing.
6. A non-transitory computer-readable recording medium having stored therein an information processing program that causes a computer to execute a process comprising:
first selecting a menu item in a context menu that is displayed at a location that is not on an object to which the menu item is to be executed;
moving the selected menu item to an end position, or pointing to the end position by a predetermined operation, the end position existing within the object;
second selecting the end position; and
executing, at the end position selected at the second selecting, the menu item, wherein
the first selecting includes selecting the menu item by a first touch and a second touch at a same time, wherein the menu item is then duplicated as a first menu item and a second menu item,
the moving includes moving the first menu item by the first touch to the end position by the predetermined operation,
the second selecting includes selecting the end position, and the process further includes continuing, when deciding that the second touch is continued at the first selecting, processes of the moving, the second selecting and the executing for the second menu item and the second touch.
7. An information processing method comprising:
first selecting, performed by a computer, a menu item in a context menu that is displayed at a location that is not on an object to which the menu item is to be executed;
moving, performed by the computer, the selected menu item to an end position, or pointing to the end position by a predetermined operation, the end position existing within the object;
second selecting, performed by the computer, the end position; and
executing, performed by the computer, at the end position selected at the second selecting, the menu item, wherein
the first selecting includes selecting the menu item by a first touch and a second touch at a same time, wherein the menu item is then duplicated as a first menu item and a second menu item,
the moving includes moving the first menu item by the first touch to the end position by the predetermined operation,
the second selecting includes selecting the end position, and the process further includes continuing, when deciding that the second touch is continued at the first selecting, processes of the moving, the second selecting and the executing for the second menu item and the second touch.
US15/276,954 2016-03-02 2016-09-27 Information processing apparatus, computer-readable recording medium, and information processing method Active 2037-03-30 US10372296B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016040482A JP6677019B2 (en) 2016-03-02 2016-03-02 Information processing apparatus, information processing program, and information processing method
JP2016-040482 2016-03-02

Publications (2)

Publication Number Publication Date
US20170255346A1 US20170255346A1 (en) 2017-09-07
US10372296B2 true US10372296B2 (en) 2019-08-06

Family

ID=59722227

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/276,954 Active 2037-03-30 US10372296B2 (en) 2016-03-02 2016-09-27 Information processing apparatus, computer-readable recording medium, and information processing method

Country Status (2)

Country Link
US (1) US10372296B2 (en)
JP (1) JP6677019B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10599296B2 (en) * 2009-04-15 2020-03-24 Sony Corporation Menu display apparatus, menu display method and program

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0561596A (en) 1991-09-03 1993-03-12 Hitachi Ltd Character input/cursor instruction judging method for onlined handwritten input device
JPH06161698A (en) 1992-11-27 1994-06-10 Matsushita Electric Ind Co Ltd Window system
JPH08180138A (en) 1994-12-27 1996-07-12 Nagano Nippon Denki Software Kk Character recognizing device
JP2001184458A (en) 1999-10-15 2001-07-06 Matsushita Electric Ind Co Ltd Device and method for character input and computer- readable recording medium
US20030179235A1 (en) 2002-03-22 2003-09-25 Xerox Corporation Method and system for overloading loop selection commands in a system for selecting and arranging visible material in document images
US6694056B1 (en) 1999-10-15 2004-02-17 Matsushita Electric Industrial Co., Ltd. Character input apparatus/method and computer-readable storage medium
US20070188482A1 (en) * 2006-02-14 2007-08-16 Seiko Epson Corporation Image display system, image display method, image display program, recording medium, data processing device, and image display device
US20100146451A1 (en) * 2008-12-09 2010-06-10 Sungkyunkwan University Foundation For Corporate Collaboration Handheld terminal capable of supporting menu selection using dragging on touch screen and method of controlling the same
US20130219433A1 (en) 2012-02-20 2013-08-22 Takahiro Arai Electronic apparatus and switching method
US20130246970A1 (en) * 2012-03-16 2013-09-19 Nokia Corporation Electronic devices, associated apparatus and methods
US8743070B2 (en) * 2003-06-16 2014-06-03 Sony Corporation Touch screen input method and device
US20140152586A1 (en) * 2012-11-30 2014-06-05 Kabushiki Kaisha Toshiba Electronic apparatus, display control method and storage medium
US20140215401A1 (en) * 2013-01-29 2014-07-31 Lg Electronics Inc. Mobile terminal and control method thereof
US20150042584A1 (en) * 2013-08-06 2015-02-12 Samsung Electronics Co., Ltd. Electronic device and method for editing object using touch input
US20160370958A1 (en) * 2013-07-12 2016-12-22 Sony Corporation Information processing device, information processing method, and computer program
US20160378318A1 (en) * 2013-07-12 2016-12-29 Sony Corporation Information processing device, information processing method, and computer program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000148750A (en) * 1998-11-09 2000-05-30 Hitachi Software Eng Co Ltd Method and device for command control over document processor
JP2003303050A (en) * 2003-05-06 2003-10-24 A I Soft Inc Drawing creation apparatus and method thereof
KR100883115B1 (en) * 2007-03-28 2009-02-10 삼성전자주식회사 Mobile device having touchscreen with predefined execution zone and related method for executing function thereof
KR102064836B1 (en) * 2012-06-25 2020-01-13 삼성전자주식회사 An apparatus displaying a menu for mobile apparatus and a method thereof

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0561596A (en) 1991-09-03 1993-03-12 Hitachi Ltd Character input/cursor instruction judging method for onlined handwritten input device
JPH06161698A (en) 1992-11-27 1994-06-10 Matsushita Electric Ind Co Ltd Window system
JPH08180138A (en) 1994-12-27 1996-07-12 Nagano Nippon Denki Software Kk Character recognizing device
JP2001184458A (en) 1999-10-15 2001-07-06 Matsushita Electric Ind Co Ltd Device and method for character input and computer- readable recording medium
US6694056B1 (en) 1999-10-15 2004-02-17 Matsushita Electric Industrial Co., Ltd. Character input apparatus/method and computer-readable storage medium
US20030179235A1 (en) 2002-03-22 2003-09-25 Xerox Corporation Method and system for overloading loop selection commands in a system for selecting and arranging visible material in document images
JP2003296012A (en) 2002-03-22 2003-10-17 Xerox Corp System for inputting and displaying graphic and method of using interface
US8743070B2 (en) * 2003-06-16 2014-06-03 Sony Corporation Touch screen input method and device
US20070188482A1 (en) * 2006-02-14 2007-08-16 Seiko Epson Corporation Image display system, image display method, image display program, recording medium, data processing device, and image display device
US20100146451A1 (en) * 2008-12-09 2010-06-10 Sungkyunkwan University Foundation For Corporate Collaboration Handheld terminal capable of supporting menu selection using dragging on touch screen and method of controlling the same
US20130219433A1 (en) 2012-02-20 2013-08-22 Takahiro Arai Electronic apparatus and switching method
JP2015038666A (en) 2012-02-20 2015-02-26 株式会社東芝 Electronic apparatus, switching method and switching program
US20130246970A1 (en) * 2012-03-16 2013-09-19 Nokia Corporation Electronic devices, associated apparatus and methods
US20140152586A1 (en) * 2012-11-30 2014-06-05 Kabushiki Kaisha Toshiba Electronic apparatus, display control method and storage medium
US20140215401A1 (en) * 2013-01-29 2014-07-31 Lg Electronics Inc. Mobile terminal and control method thereof
US20160370958A1 (en) * 2013-07-12 2016-12-22 Sony Corporation Information processing device, information processing method, and computer program
US20160378318A1 (en) * 2013-07-12 2016-12-29 Sony Corporation Information processing device, information processing method, and computer program
US20150042584A1 (en) * 2013-08-06 2015-02-12 Samsung Electronics Co., Ltd. Electronic device and method for editing object using touch input

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Wilders Security Forum-When I Drag an Icon on My Desktop, It Creates a Duplicate Copy of the Dragged Copy", published to web on May 7, 2013 to https://www.wildernesssecurity.com/threads/when-i-drag-an-icon-on-my-desktop-it-creates-a-duplicate-copy-of-the-dragged-icon.346685, retrieved Dec. 7, 2018. *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10599296B2 (en) * 2009-04-15 2020-03-24 Sony Corporation Menu display apparatus, menu display method and program

Also Published As

Publication number Publication date
JP6677019B2 (en) 2020-04-08
JP2017157046A (en) 2017-09-07
US20170255346A1 (en) 2017-09-07

Similar Documents

Publication Publication Date Title
US9626021B2 (en) Information processing apparatus, information processing method and program
US9684443B2 (en) Moving object on rendered display using collar
US20160092062A1 (en) Input support apparatus, method of input support, and computer program
US20110267371A1 (en) System and method for controlling touchpad of electronic device
US20140075302A1 (en) Electronic apparatus and handwritten document processing method
US10318152B2 (en) Modifying key size on a touch screen based on fingertip location
US9830069B2 (en) Information processing apparatus for automatically switching between modes based on a position of an inputted drag operation
US9880684B2 (en) Information processing apparatus, method for controlling information processing apparatus, and storage medium
US20150199020A1 (en) Gesture ui device, gesture ui method, and computer-readable recording medium
US20150169134A1 (en) Methods circuits apparatuses systems and associated computer executable code for providing projection based human machine interfaces
US20130246975A1 (en) Gesture group selection
US10379729B2 (en) Information processing apparatus, information processing method and a non-transitory storage medium
US20140372939A1 (en) Systems and methods for assisting in selection and placement of graphical objects in a graphical user interface
US20170212658A1 (en) Display control device, display control method, and recording medium
US20130007612A1 (en) Manipulating Display Of Document Pages On A Touchscreen Computing Device
US9632697B2 (en) Information processing apparatus and control method thereof, and non-transitory computer-readable medium
US20150153834A1 (en) Motion input apparatus and motion input method
CN104185829A (en) Display control device, display control method, and program
US10372296B2 (en) Information processing apparatus, computer-readable recording medium, and information processing method
US10802702B2 (en) Touch-activated scaling operation in information processing apparatus and information processing method
JP2014146127A (en) Information processing device, information processing method, and program
JP5620895B2 (en) Display control apparatus, method and program
EP3210101B1 (en) Hit-test to determine enablement of direct manipulations in response to user actions
JP2016115215A (en) Image display system, image display method, and program
JP2013114466A (en) Display system, display method and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HATADA, KOKI;REEL/FRAME:040167/0904

Effective date: 20160909

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4