WO2013011863A1 - Information processing device, operation screen display method, control program, and recording medium - Google Patents


Info

Publication number
WO2013011863A1
Authority
WO
WIPO (PCT)
Prior art keywords
icon
unit
operation screen
trajectory
user
Prior art date
Application number
PCT/JP2012/067526
Other languages
French (fr)
Japanese (ja)
Inventor
正義 神原
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Publication of WO2013011863A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to user interface technology for an information processing apparatus including an input unit and a display unit.
  • A tablet terminal has a flat outer shape and includes a touch panel serving as both a display unit and an input unit. By touching an object displayed on the touch panel with a finger, a pen, or the like, the user can perform various operations on the tablet terminal body.
  • The tablet terminal can discriminate various contact operations performed by the user on the screen via the touch panel and can display objects in accordance with the contact operation. Examples of contact actions include tapping (lightly striking), flicking (quickly sweeping), pinching (pinching with two fingers), and dragging an object displayed on the screen with a finger or pen.
  • The tablet terminal discriminates these various contact actions and, according to the result, selects or moves objects, scrolls a list, enlarges or reduces an image, and so on.
  • As described above, the tablet terminal realizes a more intuitive operation through the touch panel and is therefore favored by many people.
  • For example, Patent Document 1 discloses a mobile communication terminal including a touch panel display unit.
  • In the mobile communication terminal of Patent Document 1, when the user selects an object (a URL, an e-mail address, a character string, an image, or the like) by surrounding it with a finger, a pen, or the like, the mobile communication terminal extracts a keyword from the selected object and accesses a related site.
  • Patent Document 2 discloses a portable device having a touch panel display.
  • The portable device of Patent Document 2 displays a through image (an image captured by a camera or the like) on the touch panel display, detects a specific target in the through image when the user selects it by touching the surrounding area, and can display a reduced image of the specific target at the edge of the touch panel display as a release button.
  • Patent Document 3 discloses a website search system using a touch panel.
  • When input is received on the touch panel, the website search system of Patent Document 3 accepts it as a search keyword and displays a first mother icon corresponding to the accepted keyword. The website search system then searches websites with a search engine according to the keyword and displays thumbnail images of the found websites around the first mother icon.
  • Patent Document 4 discloses an information processing apparatus including a display panel having a contact sensor.
  • the information processing apparatus disclosed in Patent Document 4 detects rotation of the operating body (finger) by a predetermined angle or more while an object is selected, and displays operation items related to the object around the object.
  • Patent Document 5 discloses an information processing apparatus including a touch panel unit.
  • the information processing apparatus disclosed in Patent Document 5 acquires a trajectory of a user's touch position, specifies an object image selected by the trajectory, and moves the selected object image to a position corresponding to an end point of the trajectory.
  • Patent Documents 6 to 9 disclose information processing apparatuses that realize menu display for the purpose of improving user convenience and operability.
  • Patent Document 6 discloses an information display device that reduces the amount of movement of the cursor on the display screen and reduces the movement of the user's line of sight. Specifically, in the information display device, when the cursor is moved by a pointing device that controls the movement of the cursor position, the information display device detects the locus shape of the cursor. Then, the information display device searches for menu information associated with the trajectory shape, and displays the searched menu information near the end of the cursor trajectory.
  • Patent Document 7 discloses a window display control device that detects a movement trajectory of a mouse or the like, selects a pop-up menu associated with the trajectory information, and displays it on a display screen. Specifically, the window display control device displays a pop-up menu starting from the cursor position of the mouse, or arranges a plurality of predetermined pop-up menus on the locus of the mouse.
  • Patent Document 8 discloses a method and apparatus for providing a menu display that speeds up selection of a plurality of items. Specifically, the apparatus displays a combination of an angle marking menu and a straight line menu. In the angle marking menu, an item selected by a stroke pattern created by a pen or the like is determined. In the straight line menu, an item is selected by selecting a position with a pen or the like.
  • Patent Document 9 discloses a menu display control device and an electronic blackboard system to which the menu display control device is applied.
  • In this system, the menu display control device recognizes an input history of a plurality of coordinates that the user inputs to the device by drawing a line or the like on a coordinate input surface with a fingertip.
  • The menu display control device then determines, based on the input history, whether or not to display a processing menu, and displays it. More specifically, when the user, after drawing a circle on the coordinate input surface with the index finger, touches the coordinate input surface with the middle finger while keeping the index finger in contact with the screen, the menu display control device displays the processing menu at the position touched by the middle finger.
  • JP 2010-218322 A (published September 30, 2010); JP 2010-182023 A (published August 19, 2010); JP 2009-134738 A (published June 18, 2009); JP 2011-13980 A (published January 20, 2011); JP 2006-244353 A (published September 14, 2006); JP 8-305535 A (published November 22, 1996); JP 10-307664 A (published November 17, 1998); JP 11-507455 A (published June 29, 1999); JP 2001-265475 A (published September 28, 2001)
  • Good operability in a tablet terminal depends on whether the final result that the user is aiming for can be displayed with a simple contact operation and a small number of operations, and whether that result can be displayed, based on the contact operation, in a natural flow that does not contradict the user's intuition.
  • Such an improvement in operability is realized by appropriately grasping the user's purpose, the user's state, and the user's tendencies.
  • That is, the tablet terminal is required to "see" the user's intention from every point of view: what the user wants to do now, what the user wants to do next, how the user is operating at the moment, where the user is, and whether the way the user moves is natural.
  • However, the techniques of Patent Documents 1 to 9 are not necessarily sufficient to detect such user intentions.
  • Specifically, Patent Document 1 discloses that an object is selected by an operation of surrounding it, but it does not disclose that items related to the object are extracted and displayed in response to that operation.
  • Patent Document 2 discloses that an icon corresponding to an object is displayed by an operation of enclosing the object; however, it does not disclose that, along with the object being selected by that operation, items related to the object are extracted and displayed.
  • Patent Document 3 discloses that thumbnails related to an object are displayed around the object when it is selected; however, the operation of selecting the object and the display of the related thumbnails are not connected.
  • Patent Document 4 discloses that an object is selected by touching it and that icons related to the object are displayed around it. However, in order to display the icons around the object, a complicated operation separate from the selection (such as pressing the finger against the touch surface and twisting it) is required, so the number of operations until the desired result (the icons) is displayed increases and the operation becomes very complicated.
  • Patent Documents 6 and 7 do not mention that an event of selecting an object occurs before the menu information is displayed, and therefore the arrangement positions of the items of the menu information cannot be controlled in accordance with the selection of an object.
  • The technique of Patent Document 8 has the problem that the operation becomes very complicated. Specifically, in Patent Document 8 there are two types of menu list, and the method of selecting an item differs for each type. As a result, there are two patterns of pen movement and item-selection flow, and the pen operation must be switched depending on the type.
  • In the technique of Patent Document 9, the touch panel must be operated using a plurality of fingers, which is complicated, and the technique cannot be applied to an apparatus operated with a pen.
  • The operability problem described above is not limited to small, highly portable tablet terminals; it arises commonly in information processing apparatuses of any size that include a touch-panel display unit and input unit, and indeed in information processing apparatuses of any form, not limited to touch panels, that include a display unit and an input unit.
  • the present invention has been made in view of the above-described problems, and an object thereof is to realize excellent operability in an information processing apparatus including an input unit and a display unit.
  • In order to solve the above problem, an information processing apparatus of the present invention includes: trajectory acquisition means for acquiring a trajectory along which an indicator pointing at a position on the screen of a display unit has moved; object specifying means for specifying, as the selected object, an object that at least partially overlaps a selection area specified based on the trajectory acquired by the trajectory acquisition means; related item extraction means for extracting, as related items, the items stored in a related information storage unit in association with the object specified by the object specifying means; and operation screen processing means for arranging icons of the extracted related items at specific arrangement positions and displaying them on the display unit.
  • According to the above configuration, the trajectory acquisition means acquires the trajectory of the action (the movement of the indicator) performed by the user to select an object, and based on that trajectory the object specifying means identifies the object selected by the operation.
  • Next, the related item extraction means extracts items related to the identified object. Since the related information storage unit stores each object in association with the items related to it, the related items extracted by the related item extraction means are items related to the object selected by the user.
  • The operation screen processing means then arranges the icons of the extracted related items at specific arrangement positions on the display unit. Which related item's icon is placed at which arrangement position is determined as follows: the operation screen processing means refers to the distances between the arrangement positions specified in advance for arranging icons and the end point of the trajectory acquired by the trajectory acquisition means, and determines that the icon of a related item with a higher priority is placed at an arrangement position closer to that end point (a brief sketch of this assignment follows).
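  • The following is a minimal, non-normative sketch (in Python) of how such an assignment could be implemented; the names `RelatedItem` and `assign_icons_to_positions`, the example priorities, and the use of Euclidean distance are illustrative assumptions, not details taken from this disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class RelatedItem:
    name: str
    priority: float  # higher value = more likely to be selected next


def assign_icons_to_positions(items, positions, end_point):
    """Pair higher-priority items with arrangement positions closer to the
    end point of the trajectory acquired by the trajectory acquisition means.

    items: list[RelatedItem]; positions: list[(x, y)]; end_point: (x, y).
    Returns a list of (item, position) pairs.
    """
    # Arrangement positions nearest the end point of the user's gesture come first.
    by_distance = sorted(positions, key=lambda p: math.dist(p, end_point))
    # Related items most likely to be selected next come first.
    by_priority = sorted(items, key=lambda it: it.priority, reverse=True)
    return list(zip(by_priority, by_distance))


# Example: three predefined icon slots; the trajectory ended near (50, 250).
items = [RelatedItem("mail", 0.2), RelatedItem("copy", 0.9), RelatedItem("print", 0.5)]
slots = [(40, 260), (160, 40), (220, 220)]
for item, pos in assign_icons_to_positions(items, slots, end_point=(50, 250)):
    print(item.name, "->", pos)  # "copy" lands in the slot closest to the end point
```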
  • In this way, the tablet terminal 100 can output, as a result, icons of related items related to the selected object displayed so that they can be selected.
  • The arrangement of the icons displayed here takes the user's operation into consideration: icons with a higher priority are displayed near the position where the user finished the previous operation (the end point of the trajectory).
  • The user is expected to perform an operation of selecting the icon of a related item immediately after performing the operation of selecting the object, and this is a natural flow of operation.
  • Without the above arrangement, the user would have to move the indicator (a finger, a pen, or a cursor operated with a mouse) from the position where the previous operation was completed to the position where the target icon happens to be displayed, which is troublesome.
  • Such annoyance becomes more prominent as the screen of the display unit becomes larger, and becomes a particularly serious problem when the user is operating with one hand and the reachable area is limited. When objects are managed hierarchically and the selection operation is repeated, the annoyance increases further.
  • In the configuration of the present invention, by arranging icons that are more likely to be selected near the position where the previous action was completed, the user can, with high probability, select the desired icon displayed near that position without a large movement of the indicator. As a result, the troublesome selection operation described above can be avoided with high probability.
  • That is, the information processing apparatus of the present invention can anticipate the related items that the user is likely to select next after selecting an object, and can display them so that the user can select them.
  • Moreover, the icons arranged on the operation screen by the operation screen processing means are all icons of related items extracted as items related to the object selected by the user.
  • In addition, since the related items that are likely to be selected next are displayed as icons in descending order of the possibility of being selected, the user can immediately identify the icon of the next target item without greatly moving the indicator.
  • In this way, the information processing apparatus can display the final result desired by the user in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations. As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
  • In the information processing apparatus of the present invention, the related information storage unit may store, for each related item, the number of times the icon of that related item has been selected by the user, and the operation screen processing means may arrange an icon closer to the end point of the trajectory the more times it has been selected (see the sketch after this group of items).
  • According to the above configuration, when the extracted icons are arranged at the specific arrangement positions, the operation screen processing means places the icons of related items that have been selected more often closer to the end point of the trajectory.
  • The icon (related item) most frequently selected by the user in the past is considered to be the icon with the highest priority, that is, the one most likely to be selected next.
  • Since the operation screen processing means arranges the icons based on the past number of selections, it can provide the user with an operation screen (result) in which icons that are more likely to be selected are arranged closer to the end point of the trajectory.
  • Thereby, the final result desired by the user can be displayed in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations.
  • As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
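  • As a rough illustration of the selection-count idea above, the following sketch keeps a per-object, per-item selection counter and uses it directly as the priority; the dictionary layout and function names are assumptions made for this example only.

```python
from collections import defaultdict

# Hypothetical related-information store: object id -> {related item id -> times selected}.
selection_counts = defaultdict(lambda: defaultdict(int))


def record_selection(object_id: str, item_id: str) -> None:
    """Increment the stored count each time the user selects a related item's icon."""
    selection_counts[object_id][item_id] += 1


def priority_by_selection_count(object_id: str, item_id: str) -> int:
    """Items selected more often in the past get a higher priority, so their icons
    would be placed closer to the end point of the trajectory."""
    return selection_counts[object_id][item_id]


record_selection("Picture1", "mail")
record_selection("Picture1", "mail")
record_selection("Picture1", "print")
print(priority_by_selection_count("Picture1", "mail"))   # 2
print(priority_by_selection_count("Picture1", "print"))  # 1
```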
  • In the information processing apparatus of the present invention, the related information storage unit may store an attribute for each related item, and the operation screen processing means may arrange the icon of a related item whose attribute is more similar to that of the selected object closer to the end point of the trajectory.
  • According to the above configuration, when the extracted icons are arranged at the specific arrangement positions, the operation screen processing means places the icons of related items whose attributes are similar to those of the selected object near the end point of the trajectory.
  • A related item whose attributes have a high degree of similarity to those of the selected object, that is, one that is similar in nature or classification, is considered to be more closely related to the selected object and therefore to have a higher priority, that is, a higher possibility of being selected next.
  • Since the operation screen processing means arranges the icons based on the similarity to the attributes of the object, it can provide the user with an operation screen (result) in which icons that are more likely to be selected are arranged closer to the end point of the trajectory.
  • Thereby, the final result desired by the user can be displayed in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations. As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
  • When the selected object and the related items are photographs, the related information storage unit may store the shooting date and time of each photograph as an attribute, and the operation screen processing means may arrange the icon of a photograph whose shooting date and time is closer to that of the photograph that is the selected object closer to the end point of the trajectory (one possible ordering rule is sketched below).
  • According to the above configuration, when the extracted icons are arranged at the specific arrangement positions, the operation screen processing means places the icons of photographs whose shooting date and time are close to that of the selected photograph near the end point of the trajectory.
  • A photograph whose attribute is similar to that of the selected photograph, that is, whose shooting date and time is close to it, can be said to have a higher priority, that is, a higher possibility of being selected.
  • Since the operation screen processing means arranges the photograph icons based on the similarity of the shooting date and time, it can provide the user with an operation screen (result) in which photographs that are more likely to be selected are arranged closer to the end point of the trajectory.
  • Thereby, the final result desired by the user can be displayed in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations.
  • As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
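  • One conceivable way to turn the shooting date and time into such a priority is sketched below; treating the negated time difference in seconds as the priority value is an assumption for illustration, not a rule stated in this disclosure.

```python
from datetime import datetime


def priority_by_shooting_time(selected_taken_at: datetime, candidate_taken_at: datetime) -> float:
    """Smaller difference from the selected photo's shooting time -> larger priority,
    so sorting by descending priority places the closest photos nearest the end point."""
    return -abs((candidate_taken_at - selected_taken_at).total_seconds())


selected = datetime(2012, 7, 7, 14, 30)          # shooting time of the enclosed photo
candidates = {
    "photo_a": datetime(2012, 7, 7, 15, 0),      # same afternoon
    "photo_b": datetime(2011, 12, 24, 9, 0),     # months earlier
}
ranked = sorted(candidates,
                key=lambda k: priority_by_shooting_time(selected, candidates[k]),
                reverse=True)
print(ranked)  # ['photo_a', 'photo_b'] -- photo_a would be placed nearer the end point
```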
  • When the related items are video content, the related information storage unit may store, as an attribute, a recommendation degree indicating how strongly the video content is recommended to the user, and the operation screen processing means may arrange the icon of video content with a higher recommendation degree closer to the end point of the trajectory (a short sketch of this ordering follows below).
  • According to the above configuration, when the extracted icons are arranged at the specific arrangement positions, the operation screen processing means places the icons of video content with a high recommendation degree near the end point of the trajectory.
  • The user has a strong tendency to select recommended video content, so video content with a higher recommendation degree is considered to have a higher priority, that is, a higher possibility of being selected by the user.
  • Since the operation screen processing means arranges the icons based on the recommendation degree of the video content, it can provide the user with an operation screen (result) in which icons that are more likely to be selected are arranged closer to the end point of the trajectory.
  • Thereby, the final result desired by the user can be displayed in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations.
  • As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
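  • For the recommendation-degree case, the ordering reduces to a simple descending sort, as in the short sketch below (the recommendation values and content names are made up for illustration).

```python
# Hypothetical recommendation degrees attached to pieces of video content.
recommendation = {"movie_a": 0.92, "movie_b": 0.35, "movie_c": 0.70}

# Sorting by descending recommendation degree gives the order in which the icons
# would be placed outward from the end point of the trajectory.
print(sorted(recommendation, key=recommendation.get, reverse=True))
# ['movie_a', 'movie_c', 'movie_b']
```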
  • In the information processing apparatus of the present invention, it is preferable that the trajectory acquisition means acquires the trajectory of the indicator that has moved on the screen of the display unit so as to surround an object displayed on the display unit, that the object specifying means specifies an object at least partially included in the region surrounded by the trajectory as the selected object, and that the operation screen processing means arranges the extracted icons of the related items side by side on the contour line of a ring.
  • According to the above configuration, the trajectory acquisition means acquires the trajectory of the enclosing motion, and based on that trajectory the object specifying means specifies the object selected by the user through the enclosing motion.
  • Next, the related item extraction means extracts items related to the identified object. Since the related information storage unit stores each object in association with the items related to it, the related items extracted by the related item extraction means are items related to the object selected by the user.
  • Then, the operation screen processing means arranges the icons of the extracted related items in a ring shape, which is easily associated with the enclosing operation. Even when arranging icons on the contour line of the ring, the operation screen processing means refers to the distance between each specific arrangement position on the contour line and the end point of the trajectory, and decides to place higher-priority icons closer to the end point.
  • The icons are arranged as described above, and the generated operation screen is presented to the user.
  • That is, triggered by the extremely natural and simple user action of "enclosing" an object with the indicator (a pen, a finger, or a cursor controlled with a mouse) to designate it, the information processing apparatus of the present invention can provide the user with an operation screen in which icons of related items related to the selected object are arranged in a ring.
  • Since the trajectory of "enclosing" has a shape that encloses something, the shape of the ring resembles the movement trajectory of the indicator obtained by the enclosing operation. The result (icons arranged in a ring) is therefore easily associated with the enclosing operation performed by the user.
  • In addition, a menu list with related item icons arranged in a circle has the following advantage compared with a linear, one-dimensional menu list.
  • In a linear one-dimensional menu list, icons are arranged from top to bottom or from left to right, so priorities are unintentionally assigned to the icons according to their positions. In a circular menu list, all icons arranged on the circle can be treated equally.
  • Accordingly, the information processing apparatus of the present invention can display the final result desired by the user in an even more natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations.
  • As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
  • the operation screen processing means determines the position and size of the ring so that an icon is arranged around the selected object.
  • That is, the information processing apparatus can output an operation screen in which icons are arranged around an object in response to the extremely natural and simple user action of "enclosing" that object.
  • the user can obtain, as a result, an operation screen in which icons of related items are arranged so as to surround the object that he / she has previously enclosed and selected.
  • the positional relationship between these icons and the object matches the positional relationship between the object and the locus of movement of the indicator by the action previously performed by the user.
  • the movement trajectory of the indicator obtained by surrounding the object is similar to the shape of the ring in which the icon is arranged.
  • a menu list in which icons of related items are arranged in a circle around an object has the following advantages compared to a linear one-dimensional menu list.
  • In a linear one-dimensional menu list, icons are arranged from top to bottom or from left to right, so priorities are unintentionally assigned to the icons according to their positions, whereas in a circular menu list all icons arranged on the circle can be treated equally.
  • Moreover, even if a one-dimensional menu list is displayed near the previously selected object, it is difficult to express the relationship between the object and each icon.
  • When a circular menu list is displayed around the previously selected object, on the other hand, the user can naturally recognize that there is a relationship between the previously selected (enclosed) object and the icons surrounding it.
  • the information processing apparatus can display the final result desired by the user in a natural flow that does not contradict the user's intuition, while having a simple operation and a small number of operations. As a result, there is an effect that it is possible to realize excellent operability in the information processing apparatus including the input unit and the display unit.
  • In the information processing apparatus of the present invention, it is preferable that the operation screen processing means determines, as the shape of the ring, the trajectory acquired by the trajectory acquisition means, or a shape similar or approximate to it.
  • According to the above configuration, the user performs an operation of freely enclosing an object with an arbitrary shape, and the trajectory at that time is held by the trajectory acquisition means. When creating the operation screen, the operation screen processing means arranges each icon on the contour line of a ring that is the same as, or similar to, the acquired trajectory, so as to surround a predetermined area (or the object itself).
  • As a result, the ring in which the icons are arranged is displayed on the operation screen in a shape that matches or resembles the movement trajectory of the operating body obtained by enclosing the object.
  • Since the icons are arranged in the shape drawn by the user, the user can obtain an operation screen in which icons are arranged in a desired shape by enclosing the object in that shape. This increases the enjoyment of displaying the operation screen and operating the information processing apparatus.
  • Accordingly, the information processing apparatus can display the final result desired by the user in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations. As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
  • Furthermore, the operation screen processing means may arrange the icon of the related item with the highest priority at the point on the contour line of the ring closest to the end point of the trajectory, and arrange the icons of the remaining related items at the other positions on the contour line.
  • With this configuration, the operation screen processing means can always place the icon with the highest priority at the shortest distance from the end point of the trajectory on the contour line of the ring. The icon most likely to be selected by the user is therefore displayed closest to the position of the user's indicator (such as a finger), and with high probability the user can select the target icon without moving the indicator greatly (see the sketch below for one way such ring positions could be computed).
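  • A minimal sketch of one way such ring positions could be computed, assuming a circular ring centered on the selected object: the point on the circle closest to the trajectory end point receives the highest-priority icon, and the remaining icons follow at equal angular steps. The function name and the equal-angle spacing are illustrative assumptions.

```python
import math


def ring_positions(center, radius, end_point, n_icons):
    """Return n_icons (x, y) positions on a circle of the given radius around center.

    The first position is the point on the circle closest to end_point (where the
    highest-priority icon would go); the rest follow at equal angular intervals.
    """
    base = math.atan2(end_point[1] - center[1], end_point[0] - center[0])
    step = 2 * math.pi / n_icons
    return [
        (center[0] + radius * math.cos(base + i * step),
         center[1] + radius * math.sin(base + i * step))
        for i in range(n_icons)
    ]


# The enclosing gesture ended below and to the left of the object at (100, 100),
# so the first slot (highest priority) lands on the lower-left of the ring.
for pos in ring_positions(center=(100, 100), radius=60, end_point=(40, 160), n_icons=4):
    print(tuple(round(c, 1) for c in pos))
```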
  • In the information processing apparatus of the present invention, the trajectory acquisition means may acquire the trajectories of the indicator that occurred during a predetermined period before the indicator moved to select the object displayed on the display unit, and when the operation screen processing means determines that the trajectories acquired during that period are biased toward a specific area of the screen of the display unit, it may determine the arrangement positions so that the icons are placed in that specific area.
  • According to the above configuration, the trajectory acquisition means acquires trajectories not only for the operation of selecting an object but also for operations that occurred during a past predetermined period. From the acquired trajectories, the operation screen processing means can grasp at which positions on the display unit movement of the indicator occurred during that period; if the trajectories are concentrated in a specific area of the display unit, a bias in the movement positions of the indicator can be detected.
  • Such a bias in the movement positions suggests that the information processing apparatus is being used under a special usage condition in which only that area can be touched (or the other areas are difficult to reach, so the indicator cannot be moved there).
  • The operation screen processing means therefore determines the position of the ring so that the icons are arranged in the area where the bias was detected (that is, in the limited area where the indicator can be moved). As a result, the icons are displayed in an area that the user can actually reach, and the user can immediately select the desired icon within that area.
  • For example, when the user operates the touch panel with one hand while holding the terminal with that hand, the touch positions tend to be biased toward the lower-left area of the touch panel (when operating with the left hand) or the lower-right area of the screen (when operating with the right hand).
  • In such a situation, if an object or icon requiring a contact operation is displayed at the top of the screen, or at the bottom of the screen on the side opposite the holding hand, the operation becomes complicated: the user cannot touch the target object immediately and must perform an extra operation such as dragging it into a reachable area or switching to two-handed operation.
  • The information processing apparatus of the present invention can solve this problem by observing the user's usage situation with the configuration described above. That is, it can detect the bias in the contact positions of the operating body and arrange the icons in an area that the user's indicator is estimated to reach immediately (one way to detect such a bias is sketched below).
  • For example, the icons are displayed so as to fit within the lower-left (or lower-right) area of the touch panel screen, so there is no need to drag icons and the desired icon can be selected immediately.
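  • The following sketch shows one possible way to detect such a bias: count recent contact points per screen quadrant and report a quadrant only if it holds most of them. The quadrant granularity, the 0.8 threshold, and the coordinate convention (origin at the upper left) are assumptions for illustration.

```python
from collections import Counter


def biased_region(points, screen_w, screen_h, ratio=0.8):
    """Return the screen quadrant ('lower-left', 'upper-right', ...) containing at
    least `ratio` of the recent contact points, or None if no quadrant dominates.

    points: iterable of (x, y) contact coordinates collected over a past period.
    """
    counts = Counter()
    for x, y in points:
        horiz = "left" if x < screen_w / 2 else "right"
        vert = "upper" if y < screen_h / 2 else "lower"  # origin at the upper left
        counts[f"{vert}-{horiz}"] += 1
    total = sum(counts.values())
    if total == 0:
        return None
    region, n = counts.most_common(1)[0]
    return region if n / total >= ratio else None


# A left-handed user holding the terminal tends to touch the lower-left area.
recent = [(30, 420), (55, 400), (40, 460), (70, 430), (300, 100)]
print(biased_region(recent, screen_w=320, screen_h=480))  # lower-left
```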
  • Alternatively, the operation screen processing means may determine the arrangement positions so that the icons are arranged in a specific area that overlaps the trajectory of movement of the indicator used to select the object on the screen of the display unit.
  • the icon is arranged at or near the position selected by the user with an indicator such as a finger. It can be said that the position where the user has selected the object is a selectable area for the user. Therefore, it is possible to reliably display an icon in the selectable area.
  • In the information processing apparatus of the present invention, the input unit and the display unit may constitute a touch panel, and the trajectory acquisition means may acquire the trajectory along which the indicator has moved on the touch panel.
  • Alternatively, the input unit of the information processing apparatus may input, to the information processing apparatus, an instruction to move a cursor displayed on the display unit, and the trajectory acquisition means may acquire the trajectory along which the cursor has moved in accordance with that instruction.
  • In order to solve the above problem, an operation screen display method of the present invention is an operation screen display method in an information processing apparatus, including: a trajectory acquisition step of acquiring a trajectory along which an indicator pointing at a position on the screen of a display unit of the information processing apparatus has moved; an object specifying step of specifying, as the selected object, an object that at least partially overlaps a selection area specified based on the trajectory acquired in the trajectory acquisition step; a related item extraction step of extracting, as related items, the items stored in association with the specified object; and an operation screen processing step of arranging icons of the extracted related items at specific arrangement positions and displaying them on the display unit, wherein the icon of a related item with a higher priority is placed closer to the end point of the trajectory acquired in the trajectory acquisition step.
  • Thereby, as with the information processing apparatus described above, the final result desired by the user can be displayed in a natural flow that does not contradict the user's intuition, with a simple operation and a small number of operations.
  • As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
  • The information processing apparatus may be realized by a computer.
  • In that case, a control program for the information processing apparatus that causes a computer to realize the information processing apparatus by causing the computer to operate as each of the above-described means, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
  • In order to solve the above problem, an information processing apparatus of the present invention is an information processing apparatus including a touch panel and includes: trajectory acquisition means for acquiring a trajectory along which an indicator pointing at a position on the screen of a display unit has moved; object specifying means for specifying, as the selected object, an object that at least partially overlaps a selection area specified based on the trajectory acquired by the trajectory acquisition means; a related information storage unit that stores objects and items related to the objects in association with each other; related item extraction means for extracting, as related items, the items associated with the object specified by the object specifying means; and operation screen processing means for arranging icons of the related items extracted by the related item extraction means at specific arrangement positions and displaying them on the display unit, wherein the operation screen processing means arranges, among the extracted related items, the icon of a related item with a higher priority closer to the end point of the trajectory acquired by the trajectory acquisition means.
  • (a) is a diagram showing the user performing a contact operation of "enclosing" an object in order to select the target object; (b) is a diagram showing an example of the contact information generated by the contact information generation unit in accordance with the contact operation shown in (a); (c) is a diagram showing an example of the map information of the video frame displayed on the display unit during the period from t0 to tn in which contact is detected.
  • A diagram showing an example of the related information stored in the related information storage unit.
  • (b) is a diagram showing an example of the contact information generated by the contact information generation unit.
  • A diagram showing another specific example of the operation screen obtained as a result of the operation screen generation processing.
  • A functional block diagram showing the main configuration of a tablet terminal according to another embodiment of the present invention. A diagram explaining the processing of each part of the operation screen processing unit of the tablet terminal: (a) explains an example of object display processing by the operation screen processing unit, and (b) shows an example of an icon arrangement pattern specialized for the ring shape determined by the ring shape determination unit of the operation screen processing unit.
  • (c) is a diagram showing an example of the operation screen at time ta; (d) is a diagram showing an example of the operation screen at time tb; (e) is a diagram showing an example of the operation screen at time tc; (f) is a diagram showing an example of the operation screen at time tn.
  • (a) explains an example of a situation in which the user is operating with the left hand; (b) is a diagram showing a specific example of the contact information generated in that situation.
  • (a) to (c) are diagrams showing specific examples of the association between icon arrangement positions and priorities determined by the icon arrangement determining unit.
  • (a) and (b) are diagrams explaining how the ring of icons is rotated by the user dragging within the contactable area.
  • (a) is a diagram showing a specific example of the icon arrangement; (b) is a diagram showing a result in which priorities have been associated with the icons.
  • Embodiment 1: An embodiment of the present invention will be described below with reference to FIGS. 1 to 14.
  • In the present embodiment, as an example, the tablet terminal is realized by a small, highly portable smartphone that can be operated with one hand.
  • However, the information processing apparatus of the present invention is not limited to this example; an information processing apparatus of any size (for example, a notebook-sized tablet PC or an electronic blackboard equipped with a large touch panel) may be applied.
  • FIG. 2 is a block diagram illustrating a hardware configuration of the tablet terminal 100 according to the present embodiment.
  • The tablet terminal 100 includes at least a control unit 10, an input unit 11, a display unit 12, and a storage unit 19. Furthermore, the tablet terminal 100 may include an operation unit 13, an external interface 14, a communication unit 15, a wireless communication unit 16, an audio output unit 17, and an audio input unit 18 in order to realize its inherent functions.
  • When the tablet terminal 100 is a multi-function mobile communication terminal such as a smartphone, it may further include, although omitted here, a call processing unit, an imaging unit that performs imaging (a lens, an image sensor, and the like), a broadcast-image receiving unit (a tuner, a demodulation unit, and the like), a GPS, sensors (an acceleration sensor, a tilt sensor, and the like), and the various other components that a smartphone typically includes.
  • the input unit 11 is for inputting an instruction signal for the user to operate the tablet terminal 100 via the touch panel.
  • Specifically, the input unit 11 includes a touch surface that accepts contact of an indicator (an object that points at a position on the screen of the display unit 12; here, for example, a finger or a pen), and a touch sensor that detects contact/non-contact (approach/non-approach) between the indicator and the touch surface as well as the contact (approach) position.
  • the touch sensor may be realized by any sensor as long as it can detect contact / non-contact between the indicator and the touch surface. For example, it is realized by a pressure sensor, a capacitance sensor, an optical sensor, or the like.
  • The display unit 12 displays objects to be processed by the tablet terminal 100 (any display objects, such as icons) and processing results, and displays the operation screen for the user to operate the tablet terminal 100 as a GUI (Graphical User Interface) screen.
  • the display unit 12 is realized by a display device such as an LCD (Liquid Crystal Display).
  • In the present embodiment, the input unit 11 and the display unit 12 are integrally formed and constitute a touch panel. Therefore, in such an embodiment, the body that is moved (operated) to point at a screen position, that is, the operating body (here, a finger or a pen), is at the same time the indicator that points at a position on the screen of the display unit 12.
  • In the present embodiment, the touch panel of the tablet terminal 100 is realized by a projected capacitive touch panel, and the touch sensor is formed as a matrix of transparent electrode patterns made of ITO (Indium Tin Oxide) or the like on a transparent substrate such as glass or plastic.
  • The control unit 10 can detect the position where the indicator has contacted or approached the touch surface by detecting changes in the current or voltage of the transparent electrode patterns.
  • In the following, "contact" in expressions such as "contact detection", "contact operation", and "contact position" refers not only to the state in which the indicator and the touch surface are in complete contact, but also to the state in which the indicator and the touch surface are close enough for the touch sensor to detect.
  • the operation unit 13 is for the user to directly input an instruction signal to the tablet terminal 100.
  • the operation unit 13 is realized by an appropriate input mechanism such as a button, switch, key, or jog dial.
  • the operation unit 13 is a switch for turning on / off the power of the tablet terminal 100.
  • the external interface 14 is an interface for connecting an external device to the tablet terminal 100.
  • the external interface 14 is realized by, for example, but not limited to, a socket for inserting an external recording medium (memory card or the like), an HDMI (High Definition Multimedia Interface) terminal, a USB (Universal Serial Bus) terminal, or the like.
  • the control unit 10 of the tablet terminal 100 can exchange data with an external device via the external interface 14.
  • the communication unit 15 communicates with an external device via a communication network.
  • Specifically, the communication unit 15 is connected to various communication terminals via a communication network and realizes data transmission and reception between the tablet terminal 100 and those terminals. Further, when the tablet terminal 100 is a mobile communication terminal such as a smartphone, the communication unit 15 transmits and receives voice call data, e-mail data, and the like to and from other devices via the mobile phone network.
  • the wireless communication unit 16 communicates with an external device wirelessly.
  • The wireless communication unit 16 is not particularly limited and may realize any wireless communication means, such as infrared communication (IrDA, IrSS, and the like), Bluetooth communication, WiFi communication, or a contactless IC card function, and may realize a plurality of such means.
  • the control unit 10 of the tablet terminal 100 can communicate with devices in the vicinity of the tablet terminal 100 via the wireless communication unit 16, and can exchange data with these devices.
  • the sound output unit 17 outputs sound data processed by the tablet terminal 100 as sound, and is realized by a speaker, a headphone terminal, headphones, and the like.
  • the voice input unit 18 receives voice input generated outside the tablet terminal 100, and is realized by a microphone or the like.
  • The storage unit 19 stores (1) the control program executed by the control unit 10 of the tablet terminal 100, (2) the OS program, (3) application programs for the control unit 10 to execute the various functions of the tablet terminal 100, (4) various data read when those application programs are executed, and (5) data used for calculations and the calculation results produced while the control unit 10 executes the various functions.
  • The data (1) to (4) above are stored in a non-volatile storage device such as a ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), or HDD (Hard Disk Drive), whereas the data (5) is stored in a volatile storage device such as a RAM (Random Access Memory). Which data is stored in which storage device is determined as appropriate based on the purpose of use, convenience, cost, physical restrictions, and the like of the tablet terminal 100.
  • the control unit 10 performs overall control of each unit included in the tablet terminal 100.
  • the control unit 10 is realized by, for example, a CPU (central processing unit).
  • That is, the functions of the tablet terminal 100 are realized by the CPU serving as the control unit 10 reading a program stored in a ROM or the like into a RAM or the like and executing it.
  • Various functions (particularly, the operation screen display function of the present invention) realized by the control unit 10 will be described later with reference to other drawings.
  • FIG. 3 is a plan view showing the appearance of the tablet terminal 100.
  • the tablet terminal 100 includes an input unit 11 and a display unit 12 as a touch panel.
  • the tablet terminal 100 includes an operation unit 13, an external interface 14, a wireless communication unit 16, an audio output unit 17, an audio input unit 18, and the like, although these are not essential components.
  • When the wireless communication unit 16 is realized by infrared communication means, an infrared transmitting/receiving unit is provided as the wireless communication unit 16 on a side surface of the tablet terminal 100.
  • FIG. 4 is a diagram illustrating how the user holds and operates the tablet terminal 100. More specifically, FIG. 4(a) illustrates the tablet terminal 100 being held with one hand and operated with that same hand, and FIG. 4(b) illustrates it being held with one hand and operated with the other hand.
  • In the present embodiment, the tablet terminal 100 is assumed to be a palm-sized information processing apparatus that can be held with one hand. As shown in FIG. 4(a), the user can operate the touch surface of the input unit 11 with the thumb of the hand holding the tablet terminal 100. For example, when an icon to be operated is at a position the thumb cannot reach, the user can flick to draw the icon near the thumb and then select the icon by enclosing or tapping it with the thumb.
  • As shown in FIG. 4(b), the user may also hold the tablet terminal 100 with one hand and operate the touch surface of the input unit 11 with a finger of the other hand.
  • Alternatively, the user may hold a horizontally long tablet terminal 100 on both sides with both hands and operate the touch surface of the input unit 11 with both thumbs.
  • FIG. 1 is a functional block diagram illustrating a main configuration of the tablet terminal 100 according to the present embodiment.
  • As shown in FIG. 1, the control unit 10 of the tablet terminal 100 includes at least a contact information generation unit 21, an object specifying unit 22, a related item extraction unit 23, and an operation screen processing unit 24 as functional blocks for realizing the operation screen display function of the present invention.
  • the operation screen processing unit 24 includes an icon rank determining unit 31 and an icon arrangement determining unit 33.
  • Each functional block of the control unit 10 described above can be realized by a CPU (Central Processing Unit) reading a program stored in a non-volatile storage device realized by a ROM (Read Only Memory) or the like into a volatile storage device such as a RAM (Random Access Memory) and executing it.
  • Specifically, the storage unit 19 includes a frame map storage unit 41, a related information storage unit 42, an icon storage unit 43, and a contact information storage unit 44 as storage units from which the units of the control unit 10 read data, or to which they write data, when executing the operation screen display function.
  • The contact information generation unit 21 processes the signal output from the touch sensor of the input unit 11 and generates contact information that the other units of the control unit 10 can handle.
  • The contact information includes at least contact coordinate information indicating the coordinate positions touched by the indicator (for example, a finger). From this, each unit of the control unit 10 can acquire the trajectory along which the indicator has moved.
  • As described later, contact time information indicating the time at which each contact occurred may further be associated, as necessary, with each point constituting the trajectory.
  • the contact information storage unit 44 stores the contact information generated by the contact information generation unit 21.
  • The contact information may also be temporarily stored in a storage unit (such as a cache, not shown) so that the object specifying unit 22 can use it immediately.
  • In the present embodiment, the contact information is stored in the contact information storage unit 44 so that it can be used when the operation screen processing unit 24 and its internal units execute the operation screen generation processing (including the processing for displaying icons).
  • Whether the contact information storage unit 44 is realized by a non-volatile storage device, that is, whether the contact information is stored in a non-volatile manner, is determined as appropriate based on the purpose of the operation screen display function executed by the operation screen processing unit 24, the assumed usage environment, the purpose of use of the tablet terminal 100 itself, convenience, cost, physical restrictions, and the like.
  • More specifically, from the time the touch sensor of the input unit 11 detects contact between the touch surface and the indicator (in this embodiment, a finger) until it detects non-contact, the contact information generation unit 21 acquires the signal output from the touch sensor. This signal includes information indicating that "contact" has been detected and information indicating the contact position, and based on it the contact information generation unit 21 generates contact coordinate information expressing the contact position in coordinates. The contact information generation unit 21 also measures the time from when contact is detected until it becomes non-contact, and associates contact time information with the contact coordinate information. The contact information generation unit 21 may acquire and use absolute time information held by a clock unit mounted on the tablet terminal 100.
  • Alternatively, the contact information generation unit 21 may start timing when contact is detected and obtain relative contact time information. For example, it may measure elapsed time with the moment contact is first detected (t0) taken as 0.00 seconds, continue measuring until the moment contact is last detected (tn), and thereby acquire relative contact time information corresponding to each contact position.
  • The contact information generation unit 21 then generates contact information by associating the obtained contact time information with the contact coordinate information. In the present embodiment, the generated contact information is supplied to the object specifying unit 22 and used by the object specifying unit 22. In particular, the contact information generation unit 21 acquires the coordinate information of the position where the contact operation was completed, that is, the end point of the trajectory (a minimal data-structure sketch of such contact information follows).
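  • A minimal data-structure sketch of such contact information is given below: each sample pairs contact coordinates with a relative contact time measured from the first detection (t0 = 0.00 s). The class and field names are illustrative assumptions, not taken from this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ContactInfo:
    """Trajectory of one contact operation, as the contact information
    generation unit 21 might represent it."""
    points: List[Tuple[float, float]] = field(default_factory=list)  # contact coordinates
    times: List[float] = field(default_factory=list)                 # seconds since t0

    def add_sample(self, x: float, y: float, t: float) -> None:
        self.points.append((x, y))
        self.times.append(t)

    @property
    def end_point(self) -> Tuple[float, float]:
        """Coordinates where the contact operation was completed (tn)."""
        return self.points[-1]


info = ContactInfo()
for (x, y), t in [((50, 250), 0.00), ((80, 200), 0.12), ((120, 210), 0.25)]:
    info.add_sample(x, y, t)
print(info.end_point)  # (120, 210)
```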
  • the object specifying unit 22 specifies an object selected by the user's contact operation.
  • the object specifying unit 22 compares the contact information generated by the contact information generating unit 21 with the map information of the video frame displayed on the display unit 12 while the contact is occurring. Thereby, the object specifying unit 22 can specify the object pointed to by the contact operation from among the objects being displayed on the display unit 12.
  • the frame map storage unit 41 stores map information of the video frame output to the display unit 12 at the time of contact.
  • the map information is information indicating the layout of the video frame displayed on the touch panel.
  • the map information includes information for individually identifying each displayed object, and information on the shape, size, and display position of each object. That is, the map information is obtained by plotting each object in the coordinate system of the touch panel.
  • FIG. 5 is a diagram for explaining the operation of the object specifying unit 22. More specifically, FIG. 5A is a diagram showing that the user has performed a contact operation of “enclosing” an object in order to select the target object.
  • FIG. 5B is a diagram illustrating an example of contact information generated by the contact information generation unit 21 in accordance with the contact operation illustrated in FIG.
  • FIG. 5C is a diagram illustrating an example of map information of a video frame displayed on the display unit 12 during a period from t0 to tn in which contact is detected.
  • the object specifying unit 22 acquires contact information as shown in FIG. 5B from the contact information generating unit 21.
  • the coordinate system of the contact information corresponds to the coordinate system of the touch panel of the tablet terminal 100, and has the upper-left corner of the panel as its origin.
  • the start point is indicated as t0 and the end point is indicated as tn.
  • contact time information may also be associated with each point in between.
  • the object specifying unit 22 acquires the map information shown in FIG. 5C (that is, the layout of the video frame displayed on the display unit 12 during the period from t0 to tn) from the frame map storage unit 41. Then, the object specifying unit 22 compares the contact information with the map information, and identifies, as the selected object, an object 80 that completely or substantially overlaps the area (selection area) specified by the trajectory of the user's finger obtained from the contact information. In the example shown in FIG. 5, the object specifying unit 22 specifies “Photo 1” in FIG. 5C as the selected object. The object specifying unit 22 supplies information on the specified object to the related item extracting unit 23.
  • the object specifying unit 22 specifies the area surrounded by the trajectory (the hatched area in the thick frame in FIG. 5C) as the selection area.
  • the object specifying unit 22 obtains the selection region based on a predetermined rule, which is not limited to the above example.
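  • The comparison of the contact information with the map information could be sketched roughly as follows, assuming the map information stores each object as an axis-aligned rectangle and approximating “completely or substantially overlaps” by the bounding box of a non-empty trajectory; MappedObject, specify_object, and the 0.8 threshold are illustrative assumptions, not the embodiment's implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class MappedObject:
    object_id: str   # e.g. "Photo 1"
    x: int           # upper-left corner of the object's display area
    y: int
    width: int
    height: int

def selection_bounds(points: List[Tuple[int, int]]):
    """Approximate the selection area by the bounding box of the trajectory."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return min(xs), min(ys), max(xs), max(ys)

def overlap_ratio(obj: MappedObject, bounds) -> float:
    """Fraction of the object's display area covered by the selection area."""
    left, top, right, bottom = bounds
    ix = max(0, min(right, obj.x + obj.width) - max(left, obj.x))
    iy = max(0, min(bottom, obj.y + obj.height) - max(top, obj.y))
    return (ix * iy) / float(obj.width * obj.height)

def specify_object(trajectory: List[Tuple[int, int]],
                   map_info: List[MappedObject],
                   threshold: float = 0.8) -> Optional[MappedObject]:
    """Return the object that completely or substantially overlaps the area
    enclosed by the trajectory, or None if no object overlaps enough."""
    bounds = selection_bounds(trajectory)
    best, best_ratio = None, 0.0
    for obj in map_info:
        ratio = overlap_ratio(obj, bounds)
        if ratio > best_ratio:
            best, best_ratio = obj, ratio
    return best if best_ratio >= threshold else None
```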
  • FIGS. 5A to 5C show, as an example, a method for identifying an object when the contact action for the user to select an object is “enclose”.
  • the configuration is not limited to the above.
  • the user may select the object by a simple “tap” contact action (single tap or double tap), by a “check” contact action of drawing a mark (such as a check mark or a cross mark) on the object, by a “flick” contact action of sweeping the object diagonally from top to bottom, or by touching the object for a predetermined period of time.
  • the tablet terminal 100 may assign any of the contact actions described above to “the action for selecting an object”, and the object specifying unit 22 can be configured to appropriately specify the object pointed to by the user according to the assigned contact action.
  • the related item extracting unit 23 extracts related items related to the object specified by the object specifying unit 22, that is, the object selected by the user. When an object is selected, an item deeply related to the selected object is extracted by the related item extraction unit 23.
  • for example, when the object is data such as a “photo”, operations such as “display”, “edit”, “send as an email”, “transfer to a peripheral device (TV etc.)”, and “print” are executed on the photo. Therefore, an item corresponding to an “action” executed on an object that is an “action target” may be extracted as a related item of the object.
  • an item corresponding to the relationship of “action partner” when an action is executed on an object that is an action target may be extracted as a related item.
  • the user may want the photos or data contained in the object. In this way, items belonging to the lower layer of the object may be extracted as related items.
  • the related information storage unit 42 stores related information indicating the relationship between objects and items.
  • FIG. 6 is a diagram illustrating an example of related information stored in the related information storage unit 42.
  • the related information is information in which at least “related items” are associated with each “object”.
  • the related information indicates the relationship between an object and items by this association.
  • the related item extracting unit 23 refers to the related information stored in the related information storage unit 42 and extracts items related to the specified object as related items.
  • the object specifying unit 22 specifies that the selected object is “Photo 1”.
  • the related item extraction unit 23 extracts a related item group 60 associated with the object “Photo” from the related information.
  • Information on the related items extracted by the related item extracting unit 23 is supplied to the operation screen processing unit 24.
  • the extracted related items are displayed so as to be selectable (for example, as icons) as items related to the previously selected object.
  • an icon may be assigned to each “related item”.
  • the icon “1: TV” is associated with the related item “display on television (transfer to television)” associated with the object “photo”.
  • the icon “1: TV” is, for example, an icon on which an illustration of a TV or the like is drawn, and is preferably a picture reminiscent of “sending a photograph to the TV for display”.
  • Other related items are also assigned icons with appropriate patterns that recall the contents of the related items.
  • the related item extracting unit 23 may supply the operation screen processing unit 24 with icons (or icon identification information) corresponding to the extracted related items. Thereby, the operation screen processing unit 24 can proceed to display the icon specified by the related item extraction unit 23.
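  • A rough stand-in for the related information and for the extraction performed by the related item extracting unit 23 might look as follows; the dictionary layout, the helper name extract_related_items, and the “selections” values are assumptions (the counts are made-up numbers chosen only so that the ordering matches the example described later for FIG. 8).

```python
# Hypothetical in-memory stand-in for the related information storage unit 42.
# Each related item carries icon identification information and the
# "number of selections" attribute used later for ranking.
RELATED_INFO = {
    "Photo": [
        {"item": "display on television", "icon": "1: TV",                  "selections": 12},
        {"item": "print",                 "icon": "2: Printer",             "selections": 30},
        {"item": "send as an email",      "icon": "3: Mail",                "selections": 6},
        {"item": "photo display",         "icon": "4: Photo display",       "selections": 21},
        {"item": "information display",   "icon": "5: Information display", "selections": 1},
        {"item": "edit",                  "icon": "6: Palette",             "selections": 4},
        {"item": "delete",                "icon": "7: Trash",               "selections": 2},
        {"item": "copy to memory card",   "icon": "8: Memory card",         "selections": 9},
    ],
}

def extract_related_items(object_type: str):
    """Look up the items related to the selected object (related item extraction)."""
    return RELATED_INFO.get(object_type, [])
```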
  • the related information holds information indicating the nature or classification of the related item, that is, “attribute” for each related item.
  • the attribute is an index for estimating how likely the related item is to be selected by the user.
  • information of “number of selections” is stored in the related information storage unit 42 for each related item.
  • the attribute “number of selections” is information indicating how many times the related item has been selected so far by user operations. The number of selections may be a cumulative count since the tablet terminal 100 was first used, or it may be reset each time an event such as power-off, elapse of a predetermined period, or history deletion occurs, and counted from the beginning thereafter.
  • the attribute of the related item is not limited to the above, and the related information may hold any type of attribute as long as it is information indicating the nature or classification of the related item.
  • the attribute of the related item is read by the operation screen processing unit 24 and each unit as necessary.
  • the operation screen processing unit 24 performs processing (operation screen generation processing) for generating an operation screen on which the object and the related items (their icons) related to the selected object are displayed so as to be selectable by the user.
  • FIG. 7 is a diagram showing specific examples of the icon images stored in the icon storage unit 43. As shown in FIG. 7, in this embodiment, each icon image can be identified by icon identification information. For example, the icon identification information “1: TV” is associated with an icon image depicting a TV. In addition, although not shown, a portrait or an avatar image of a person may be used as an icon representing personal information such as an acquaintance who is often called.
  • the operation screen processing unit 24 reads the icon images assigned to the related items extracted by the related item extraction unit 23 from the icon storage unit 43, generates an operation screen so that these are displayed at appropriate positions and at an appropriate timing, and outputs it to the display unit 12 via a display control unit (not shown).
  • the operation screen processing unit 24 has a function of displaying icons of related items related to the object selected by the contact operation along a predetermined icon arrangement pattern.
  • the operation screen processing unit 24 includes at least an icon rank determining unit 31 and an icon arrangement determining unit 33.
  • the icon rank determination unit 31 and the icon arrangement determination unit 33 are responsible for some functions of the operation screen generation process executed by the operation screen processing unit 24.
  • the icon ranking determining unit 31 assigns a priority order to the related items extracted by the related item extracting unit 23 based on the attributes of the related items. Alternatively, the icon rank determining unit 31 may assign the priority order to the icons associated with the extracted related items.
  • the related information includes a field of “number of selections” as one of the attributes of the related item. Therefore, the icon order determination unit 31 assigns a priority to each related item or each icon in descending order of the “selection count” of the extracted related items.
  • the related item having a larger number of selections is more likely to be selected by the user, and accordingly, a higher priority is given.
  • FIG. 8 is a diagram illustrating an example of the priority order assigned to each icon by the icon order determination unit 31.
  • it is assumed that the related item extracting unit 23 has extracted the related item group 60 from the related information shown in FIG. 6.
  • the icon rank determination unit 31 reads, from the related information (FIG. 6) stored in the related information storage unit 42, the attribute “number of selections” and the icon identification information associated with each of the extracted related items. Then, the icon order determination unit 31 assigns a priority to each icon in descending order of the corresponding “number of selections”. In the example shown in FIGS. 6 and 8, sorting the icons in descending order of the “number of selections” gives “2: Printer”, “4: Photo display”, “1: TV”, “8: Memory card”, “3: Mail”, “6: Palette”, “7: Trash”, and “5: Information display”, and the priorities from first place to eighth place are assigned to these in this order. The result of assigning the priorities is returned to the operation screen processing unit 24.
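  • The ranking by “number of selections” described above amounts to a descending sort; a minimal sketch, assuming the related items are dictionaries like the hypothetical table sketched earlier:

```python
def assign_priorities(related_items):
    """Higher "number of selections" -> higher priority (1 is the highest).

    Returns a list of (priority, icon_id) pairs.
    """
    ranked = sorted(related_items, key=lambda item: item["selections"], reverse=True)
    return [(rank, item["icon"]) for rank, item in enumerate(ranked, start=1)]

# With the hypothetical table above this reproduces the order described for FIG. 8:
# [(1, '2: Printer'), (2, '4: Photo display'), (3, '1: TV'), (4, '8: Memory card'),
#  (5, '3: Mail'), (6, '6: Palette'), (7, '7: Trash'), (8, '5: Information display')]
```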
  • the icon arrangement determination unit 33 determines the arrangement of icons of related items. The function of the icon arrangement determination unit 33 will be described with reference to FIG.
  • FIG. 9 is a diagram illustrating a specific example of the operation of the icon arrangement determining unit 33. More specifically, FIG. 9A illustrates a specific example of the icon arrangement pattern acquired by the icon arrangement determining unit 33.
  • FIG. 9B is a diagram illustrating a specific example of the icon arrangement determined by the icon arrangement determining unit 33.
  • the icon arrangement determining unit 33 acquires an icon arrangement pattern, and associates the above-described priority order with each icon arrangement position defined in the icon arrangement pattern.
  • the icon arrangement pattern defines, as a pattern, how many icons are arranged and where they are arranged on the touch panel.
  • the tablet terminal 100 holds one fixed icon arrangement pattern in an arbitrary area (not shown) of the storage unit 19, and the icon arrangement determination unit 33 acquires this icon arrangement pattern.
  • the tablet terminal 100 is not limited to the above configuration; the storage unit 19 may hold a plurality of types of icon arrangement patterns, and the icon arrangement determination unit 33 may be configured to select one icon arrangement pattern as necessary.
  • a configuration in which an arrangement pattern determination unit (not shown) dynamically determines an icon arrangement pattern according to the input movement trajectory of the contact operation and supplies the icon arrangement pattern to the icon arrangement determination unit 33 may be adopted.
  • in FIG. 9A, as an example, an icon arrangement pattern that defines that eight icons are arranged at equal intervals on the outline of a ring (a vertically long ellipse) arranged over the entire screen of the touch panel is held in the storage unit 19.
  • FIG. 9A is not intended to limit the icon arrangement pattern of the present invention to this specific example.
  • the icon arrangement determining unit 33 associates priorities with the arrangement positions of the eight icons defined in the icon arrangement pattern shown in FIG. 9A.
  • the icon arrangement determining unit 33 associates the icon arrangement position with the priority order based on the contact information stored in the contact information storage unit 44.
  • the icon arrangement determination unit 33 acquires the contact information generated previously from the contact information storage unit 44.
  • the icon arrangement determination unit 33 may acquire only the contact coordinate information of the end point tn of the locus among the contact information.
  • the icon arrangement determining unit 33 plots the end point tn of the locus on the acquired icon arrangement pattern.
  • the icon arrangement determination unit 33 associates higher priorities in order from the arrangement position whose distance to the end point tn is shortest.
  • the icon arrangement determining unit 33 associates the priority orders from the first place to the eighth place in this order.
  • the icon arrangement determining unit 33 can generate the rank association result shown in FIG. 9B. According to this ranking association result, the second rank is associated with the A position, the fourth with the B position, the fifth with the C position, the seventh with the D position, the eighth with the E position, the sixth with the F position, the third with the G position, and the first with the H position.
  • the icon arrangement determining unit 33 arranges the icons according to the priority of each icon determined by the icon order determining unit 31 and the arrangement position for each priority determined by itself. For example, the icon arrangement determining unit 33 determines to arrange the icon “1: TV”, to which the priority “third place” is assigned, at the arrangement position associated with the third rank (the G position) in FIG. 9B.
  • the icon arrangement result determined by the icon arrangement determining unit 33 in this way is returned to the operation screen processing unit 24.
  • the operation screen processing unit 24 generates an operation screen in which each icon extracted according to the determination of the icon rank determination unit 31 and the icon arrangement determination unit 33 is arranged at a predetermined position, and is displayed on the display unit 12 via the display control unit. Output.
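  • The distance-based association between priorities and arrangement positions, and the resulting icon placement, could be sketched as follows; the elliptical position generator, the coordinate values, and the function names are illustrative assumptions rather than the embodiment's implementation.

```python
import math

def ellipse_positions(cx, cy, rx, ry, count=8):
    """Icon arrangement pattern: `count` arrangement positions at equal angular
    intervals on the outline of a ring (ellipse) centred at (cx, cy)."""
    return [(cx + rx * math.cos(2 * math.pi * i / count),
             cy + ry * math.sin(2 * math.pi * i / count))
            for i in range(count)]

def associate_positions_with_priorities(positions, end_point):
    """The closer an arrangement position is to the trajectory end point tn,
    the higher (smaller-numbered) the priority associated with it."""
    tx, ty = end_point
    by_distance = sorted(positions,
                         key=lambda p: math.hypot(p[0] - tx, p[1] - ty))
    return {rank: pos for rank, pos in enumerate(by_distance, start=1)}

def arrange_icons(prioritised_icons, priority_to_position):
    """Place each icon at the arrangement position whose associated priority
    matches the priority assigned to that icon."""
    return {icon: priority_to_position[rank] for rank, icon in prioritised_icons}

# Example with made-up screen coordinates; (500, 900) stands in for the end point tn.
positions = ellipse_positions(cx=360, cy=640, rx=300, ry=520)
layout = arrange_icons([(1, "2: Printer"), (2, "4: Photo display"), (3, "1: TV")],
                       associate_positions_with_priorities(positions, (500, 900)))
```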
  • the broken line indicating the outline of the ring represents the shape of the ring held as information inside the tablet terminal 100, and may not actually be displayed on the display unit 12. Similarly, the broken lines indicating the outline of the ring in the drawings shown below are not actually displayed on the display unit 12.
  • FIG. 10 is a diagram illustrating a specific example of the operation screen obtained as a result of the operation screen generation process executed by the operation screen processing unit 24.
  • the example shown in FIG. 10 is a specific example of the operation screen obtained when the object 80 (object “Photo 1”) is selected, as in FIGS.
  • the operation screen processing unit 24 reads the icon images of “1: TV”, “2: Printer”, “3: Mail”, “4: Photo display”, “5: Information display”, “6: Palette”, “7: Trash”, and “8: Memory card” extracted according to the above-described procedure from the icon storage unit 43 (FIG. 7).
  • the operation screen processing unit 24 generates an operation screen according to the priority order determined as shown in FIG. 8 and the arrangement position determined as shown in FIG. 9B.
  • the generated operation screen is output to the display unit 12 and displayed on the display unit 12 as shown in FIG.
  • the operation screen processing unit 24 may arrange the selected object 80 in the center of the screen as shown in FIG.
  • the operation screen processing unit 24 performs the process of arranging the object 80 in the center, but this is not an essential structure.
  • however, since the icons are arranged in a ring shape along the icon arrangement pattern, it is preferable that the operation screen processing unit 24 arrange the object 80 in the center so that the icons do not overlap the object and are easy to see.
  • FIG. 11 is a flowchart showing a flow of operation screen display processing by the tablet terminal 100.
  • when the contact operation for selecting an object is a double tap or the like, that is, when non-contact occurs temporarily for an extremely short time within a single contact operation, the contact information generation unit 21 may be configured to treat such a brief non-contact as part of one continuous contact operation.
  • the object specifying unit 22 compares the contact information generated in S104 (for example, (b) in FIG. 5) with the map information (for example, (c) in FIG. 5) stored in the frame map storage unit 41. Then, the object overlapping the area where the locus touched by the user exists is specified as the selected object (S105). In the example shown in FIG. 5C, the object 80 “Photo 1” is specified.
  • the related item extraction unit 23 refers to the related information (for example, FIG. 6) in the related information storage unit 42 based on the object specified in S105, and extracts the related item of the specified object (S106). Alternatively, identification information of icons assigned to related items may be extracted.
  • the operation screen processing unit 24 executes an operation screen generation process.
  • the icon order determination unit 31 determines the priority order for each of the related items extracted in S106 (S107).
  • the icon order determination unit 31 associates the determined priority order with the related item or the icon of the related item (for example, FIG. 8).
  • the icon arrangement determining unit 33 reads a predetermined icon arrangement pattern (for example, (a) in FIG. 9) from the storage unit 19, and assigns priority to some icon arrangement positions defined in the icon arrangement pattern.
  • the icon arrangement determining unit 33 associates the icon arrangement positions with the priorities based on the contact information obtained in S104. More specifically, the icon arrangement determination unit 33 performs the association so that a higher rank is assigned to an icon arrangement position whose distance from the end point of the finger trajectory is shorter (for example, (a) and (b) in FIG. 9).
  • S107 and S108 may be executed in parallel, or may be executed sequentially in an arbitrary order in series.
  • the icon arrangement determining unit 33 determines the arrangement of each extracted icon according to the priority determined by the icon order determining unit 31 and the arrangement position determined by itself (S109). The icon arrangement determining unit 33 returns the icon arrangement result to the operation screen processing unit 24.
  • the operation screen processing unit 24 acquires the icon image of the related item extracted in S106 from the icon storage unit 43 (for example, FIG. 7). Then, according to the determination in S109, an operation screen on which the acquired icon image is arranged is generated (S110). In the above-described example, the operation screen processing unit 24 generates an operation screen in which the selected object is arranged in the center and each icon is arranged in a ring shape around the selected object.
  • each icon is arranged such that an icon with a higher priority, that is, a larger “selection count”, in other words an icon that is more likely to be selected by the user, is placed nearer to the end point tn.
  • as described above, in response to one contact operation of the user for selecting an object, the tablet terminal 100 can output a result in which the icons of related items related to the selected object are displayed in a selectable manner.
  • the arrangement of the icons displayed so as to be selectable here is an arrangement in consideration of the user's contact operation. In other words, the icons that are more likely to be selected by the user are displayed closer to the position (end point tn of the trajectory) where the user has finished the previous contact operation.
  • the user is expected to perform the contact operation of selecting the icon of the related item related to the object immediately after performing the contact operation of selecting the previous object, and this is a natural flow of operation.
  • the user moves the indicator (such as a finger) from the position where the previous contact operation is completed to the position where the target icon is displayed next.
  • the user only needs to select, with a high probability, a desired icon displayed near the position where the previous contact operation was completed. As a result, it is possible to avoid the troublesome selection operation described above with a high probability.
  • with the tablet terminal 100 of the present invention, it is not necessary for the user to unnaturally move the indicator from one place to another on the screen, and the target final result can be reached with a simple contact operation. In addition, the shorter the moving distance of the indicator, the more it is possible to suppress the induction of erroneous operations.
  • the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition while having a simple contact operation and a small number of operations.
  • the tablet terminal 100 including the touch panel has an effect that it is possible to realize excellent operability.
  • the configuration of the tablet terminal 100 of the present invention is not limited to the above.
  • the tablet terminal 100 can determine a gesture other than the contact operation “enclose” as a gesture for selecting an object.
  • the tablet terminal 100 may hold
  • FIG. 12 is a diagram for explaining the operation of the contact information generating unit 21 and the object specifying unit 22. More specifically, FIG. 12A is a diagram illustrating that the user has performed a contact operation of “checking” an object with a check mark in order to select the target object. FIG. 12B is a diagram illustrating an example of the contact information generated by the contact information generation unit 21 in accordance with the contact operation illustrated in FIG. 12A.
  • the user performs a contact operation of “checking” the object 80 (here, Photo 2) displayed on the touch panel of the tablet terminal 100 with a check mark.
  • the contact operation is performed in the period from t0 to tn so that the contact point passes the position of the broken line in FIG. 12A.
  • the contact information generation unit 21 generates the contact information shown in FIG. 12B. Of the movement trajectory of the user's “check” contact, the start point is indicated as t0 and the end point as tn; however, the contact information generation unit 21 may also associate contact time information with each point in between.
  • the object specifying unit 22 acquires the contact information shown in FIG. 12B from the contact information generating unit 21. Then, the object specifying unit 22 acquires the map information shown in FIG. 5C from the frame map storage unit 41. Then, the object specifying unit 22 compares the contact information with the map information, and identifies, as the selected object, an object 80 (here, Photo 2) that completely or substantially overlaps the region where the trajectory of the user's finger obtained from the contact information exists. Alternatively, in this modification, the object that overlaps the point with the largest Y coordinate (that is, the turning point of the check mark) among the points constituting the trajectory may be specified as the selected object.
  • the related item extracting unit 23 extracts related items of the specified object 80 (photo 2).
  • the related item group 60 (FIG. 6) is extracted as in the first embodiment, and the icon order determination unit 31 assigns a priority to each related item as in the first embodiment (FIG. 8).
  • the icon arrangement determining unit 33 may acquire the icon arrangement pattern shown in FIG. 13 from the storage unit 19 instead of the icon arrangement pattern shown in FIG.
  • FIG. 13 is a diagram illustrating another specific example of the icon arrangement pattern acquired by the icon arrangement determining unit 33.
  • the icon arrangement determining unit 33 obtains the distance between the end point tn of the trajectory and the icon arrangement position determined at the center of each block. Then, the placement position and the priority order are associated with each other so that a higher priority order is assigned to the placement position whose distance from the end point tn is shorter.
  • the numbers shown at the respective icon arrangement positions in FIG. 13 indicate the priorities associated by the icon arrangement determining unit 33 in this modification. As shown in FIG. 13, the priority “first place” is naturally associated with the arrangement position of the upper-right block closest to the end point tn.
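  • The block-based arrangement pattern of FIG. 13 could be approximated by taking the centre of each block of a grid laid over the touch panel; the grid dimensions and the helper name block_centre_positions are assumptions. The distance-based association sketched earlier can then be applied to these block centres, and positions whose associated priorities exceed the number of extracted icons (for example the ninth and tenth) are simply left without icons.

```python
def block_centre_positions(screen_w, screen_h, cols=3, rows=4, skipped=()):
    """Another icon arrangement pattern: one candidate arrangement position at
    the centre of each block of a cols x rows grid covering the touch panel.
    Blocks listed in `skipped` as (row, col) pairs (for example those covering
    the selected object) get no arrangement position."""
    bw, bh = screen_w / cols, screen_h / rows
    return [(c * bw + bw / 2, r * bh + bh / 2)
            for r in range(rows) for c in range(cols)
            if (r, c) not in skipped]
```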
  • FIG. 14 is a diagram illustrating another specific example of the operation screen obtained as a result of the operation screen generation process executed by the operation screen processing unit 24.
  • the example shown in FIG. 14 shows an operation obtained when the object 80 (here, photo 2) is selected based on the contact information shown in FIG. 12B and the icon arrangement pattern shown in FIG. It is a specific example of a screen.
  • the operation screen processing unit 24 reads the icon images of “1: TV”, “2: Printer”, “3: Mail”, “4: Photo display”, “5: Information display”, “6: Palette”, “7: Trash”, and “8: Memory card” extracted according to the above-described procedure, and generates the operation screen according to the priorities determined by the icon order determination unit 31 and the arrangement positions determined by the icon arrangement determination unit 33 as shown in FIG. 13.
  • the generated operation screen is output to the display unit 12 and displayed on the display unit 12 as shown in FIG.
  • the operation screen processing unit 24 may arrange the selected object 80 at the center of the screen as shown in FIG.
  • in this modification, since the operation screen processing unit 24 arranges the icons around the periphery of the touch panel screen along the icon arrangement pattern shown in FIG. 13, it is preferable that the object 80 be arranged in the center so that the icons do not overlap the object and are easy to see.
  • the icon arrangement determination unit 33 determines that no icon is arranged in the blocks associated with the priority orders “9th” and “10th”.
  • the user does not need to unnaturally move the indicator around the screen, and can reach the target final result with a simple contact operation.
  • the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition while having a simple contact operation and a small number of operations.
  • the tablet terminal 100 including the touch panel has an effect that it is possible to realize excellent operability.
  • Embodiment 2 Another embodiment of the information processing apparatus according to the present invention will be described below with reference to FIGS. 15 to 24B.
  • members having the same functions as those in the drawings described in the first embodiment are denoted by the same reference numerals, and description of the same contents as those in the first embodiment is omitted.
  • in the present embodiment, the tablet terminal 100 is configured to accept an “enclose” contact operation as the operation for selecting an object, and the operation screen processing unit 24 is configured to arrange icons in a ring shape in association with the “enclose” contact operation.
  • FIG. 15 is a functional block diagram illustrating a main configuration of the tablet terminal 100 according to the present embodiment.
  • the tablet terminal 100 has a configuration in which the control unit 10 further includes a ring shape determining unit 30 as a functional block, as compared with the tablet terminal 100 (FIG. 1) of the first embodiment.
  • although not essential, the control unit 10 of the tablet terminal 100 may further include a gesture determination unit 25 and an animation determination unit 32 as functional blocks as necessary.
  • the ring shape determination unit 30 determines the icon arrangement positions when the operation screen processing unit 24 arranges icons; in particular, when icons are arranged in a ring shape, the ring shape determination unit 30 determines the shape of the ring.
  • the operation screen processing unit 24 is configured to arrange icons in accordance with an arbitrary fixed icon arrangement pattern.
  • in the present embodiment, the ring shape determination unit 30 determines an icon arrangement pattern that defines that icons are arranged in a ring shape, and the operation screen processing unit 24 arranges and displays each icon in a ring shape according to the determined icon arrangement pattern.
  • the ring shape determining unit 30 is a functional block configured to determine the icon arrangement pattern by specializing the arrangement pattern determining unit (not shown) described in the first embodiment to the shape of the ring.
  • the operation screen processing unit 24 arranges the extracted icons.
  • the ring shape information determined by the ring shape determination unit 30 may further include ring size information and / or ring position information.
  • control unit 10 of the tablet terminal 100 further includes the gesture determination unit 25.
  • this is because, when various types of contact actions (gestures) other than “enclose” are assumed to be performed on the input unit 11, it is necessary to determine whether a given gesture is “enclose” or another gesture.
  • the gesture determination unit 25 determines what gesture it is for the contact operation performed on the input unit 11. For example, the gesture determination unit 25 can determine gestures such as “tap”, “flick”, “pinch”, “drag”, and “enclose”. A known technique can be appropriately employed as an algorithm for determining a gesture.
  • the gesture determination unit 25 instructs each unit of the control unit 10 to execute processing corresponding to the determined gesture according to the determination result.
  • when the gesture determination unit 25 determines that the detected contact action is an “enclose” gesture, it is preferable that it instruct the contact information generation unit 21 to store the generated contact information in the contact information storage unit 44. As a result, the operation screen processing unit 24 can refer to all information about the “enclose” gesture (such as its position, size, trajectory, contact time, and contact point movement timing), and when the contact operation is a gesture other than “enclose”, unnecessary writing to the contact information storage unit 44 can be avoided. However, the contact information generation unit 21 may be configured to write all contact information in the contact information storage unit 44 regardless of the determination result of the gesture determination unit 25.
  • control unit 10 of the tablet terminal 100 may further include an animation determination unit 32.
  • the animation determination unit 32 determines an animation to be given to all objects to be arranged on the operation screen, that is, objects, icons, rings, and the like. Thereby, when displaying an object and an icon, the visual effect (namely, animation) can be given to how to display.
  • the animation determination unit 32 may give a visual effect such as fade-in (change of transparency) in addition to the movement of the object, icon, or ring.
  • instead of causing the icons to appear on the contour line of the ring from the beginning, the animation determination unit 32 may move the icons from different places so that they finally settle on the contour line of the ring.
  • alternatively, the animation determination unit 32 may add a movement that first diffuses each icon and then moves each icon so that it is finally arranged on the outline of the ring.
  • FIG. 16 is a diagram illustrating the processing contents of each unit of the operation screen processing unit 24 in the present embodiment. More specifically, FIG. 16A is a diagram for explaining an example of object display processing executed by the operation screen processing unit 24, and FIG. 16B is a diagram illustrating the operation screen processing unit 24. It is a figure which shows an example of the icon arrangement pattern specialized in the ring shape which the ring shape determination part 30 determined.
  • the operation screen processing unit 24 determines to reposition the object 80 selected by the previous “enclose” contact operation at the center.
  • the animation determination unit 32 may give the object 80 an animation in which the object 80 gradually moves from the original position to the center.
  • the ring shape determination unit 30 of the operation screen processing unit 24 determines an icon arrangement pattern based on the ring shape. Specifically, by acquiring the icon arrangement pattern stored in the storage unit 19, the ring shape determination unit 30 can determine the shape, position, and size of the ring, the number of icons, the icon arrangement positions for arranging the icons, and the like.
  • the ring shape determination unit 30 determines an icon arrangement pattern that defines that eight icons are arranged uniformly along the outline of an elliptical ring around the object 80. An example is shown.
  • the example of FIG. 16B, in which the shape of the reference “ring” that defines the icon arrangement positions in the icon arrangement pattern is an ellipse, is merely an example, and there is no intention to limit the shape of the ring of the present invention to this.
  • the “ring” does not necessarily mean a shape formed by a curve.
  • the ring shape determining unit 30 may define the shape of the ring as a circle, square, rectangle, or other polygon, or may be a complex shape, irregular shape, or non-geometric shape.
  • any figure having an outline that separates the inside from the outside may be defined as a ring.
  • “ring” does not necessarily mean a closed curve.
  • the operation screen processing unit 24 can place an icon on the outline of the ring of any shape defined as described above according to the determination of the ring shape determination unit 30.
  • in the above description, the ring shape determination unit 30 determines the icon arrangement pattern by acquiring from the storage unit 19 an icon arrangement pattern in which a predetermined shape, a predetermined position, and a predetermined size are defined for the ring for arranging icons.
  • however, the present invention is not limited to such a configuration; the ring shape determination unit 30 according to the present embodiment may determine the icon arrangement pattern by dynamically determining the shape of the ring based on the input movement trajectory of the “enclose” contact operation (the contact information stored in the contact information storage unit 44). More specifically, the ring shape determination unit 30 can use the shape of the trajectory of the “enclose” operation as it is as the ring shape for arranging icons.
  • the size of the ring can be determined based on the size of the area enclosed by the “enclose” operation.
  • the position of the ring can be determined based on the position of the enclosed region.
  • the ring shape determination unit 30 may further determine the ring shape based on the map information stored in the frame map storage unit 41. That is, the size and position of the ring may be determined according to the display position and size of the enclosed object.
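  • Dynamically deriving the ring from the “enclose” trajectory could be sketched as follows: the trajectory is scaled and translated so that a similar shape is centred on the screen, and arrangement positions are then sampled along its outline. The margin value, the index-based sampling, and the function names are assumptions.

```python
def ring_from_trajectory(points, screen_w, screen_h, margin=80):
    """Reuse the shape of the "enclose" trajectory as the ring, scaled as large
    as possible and centred on the screen. `points` are (x, y) tuples."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    w = max(max(xs) - min(xs), 1)   # avoid division by zero for degenerate strokes
    h = max(max(ys) - min(ys), 1)
    scale = min((screen_w - 2 * margin) / w, (screen_h - 2 * margin) / h)
    cx, cy = (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
    return [((x - cx) * scale + screen_w / 2,
             (y - cy) * scale + screen_h / 2) for x, y in points]

def positions_on_ring(ring, count=8):
    """Pick `count` icon arrangement positions spread over the ring outline.
    Sampling by index is a simplification of "equal intervals"; it is roughly
    even when the stroke was drawn at a fairly constant speed."""
    step = len(ring) / count
    return [ring[int(i * step)] for i in range(count)]
```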
  • the operation of the operation screen processing unit 24 in such a case will be described with reference to FIGS. 17 and 18.
  • FIG. 17 is a diagram illustrating a specific example of the contact information stored in the contact information storage unit 44. More specifically, FIG. 17A is a diagram showing that the user has performed a contact operation of “enclosing” an object in an arbitrary shape in order to select a target object. FIG. 17B is a diagram illustrating an example of contact information generated by the contact information generation unit 21 in accordance with the contact operation illustrated in FIG.
  • the user performs a contact operation of “enclosing” an object (here, photo 1) displayed on the touch panel of the tablet terminal 100 in an arbitrary shape (for example, a heart shape).
  • the contact operation is performed in the period from t0 to tn so that the contact point passes through the position of the broken line in FIG. 17A.
  • the gesture determination unit 25 acquires contact information as illustrated in FIG. 17B from the contact information generation unit 21.
  • the start point is indicated as t0 and the end point is indicated as tn.
  • the contact time information may be associated with each point in between.
  • the gesture determination unit 25 determines that this contact operation is a gesture of “enclose” based on the contact information shown in FIG.
  • the gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information shown in FIG. 17B in the contact information storage unit 44.
  • each unit of the operation screen processing unit 24 can refer to the contact information shown in FIG. 17B stored in the contact information storage unit 44.
  • FIG. 18 is a diagram illustrating another example of the icon arrangement pattern determined by the ring shape determining unit 30.
  • the operation screen processing unit 24 can place the selected object 80 (here, Photo 1) in the center.
  • the ring shape determination unit 30 acquires the contact information stored in the contact information storage unit 44. Based on the movement trajectory of the fingertip (contact point) obtained from the contact information, the ring shape determination unit 30 determines a shape that is the same as or similar to the trajectory as the ring shape for arranging the icons. In the present embodiment, as an example, the ring shape determination unit 30 determines to place the ring at the center of the screen and to make it as large as possible on the touch screen. As shown in FIGS. 17A and 17B, the object 80 is surrounded by a heart shape.
  • the ring shape determination unit 30 determines the ring shape 81 so that the similar shape of the heart-shaped locus is arranged in the center of the screen as shown by the broken line in FIG. At this time, for each icon arrangement position on the ring shape 81, the ring shape determination unit 30 may determine that the icons are arranged at equal intervals, or the icon can be placed at an arbitrary position on the contour line according to another rule. You may decide to arrange.
  • the icon arrangement determination unit 33 of the operation screen processing unit 24 uses the end point tn of the trajectory as a reference and, as described above, associates a priority order with each icon arrangement position according to its distance from the end point tn.
  • the operation screen processing unit 24 arranges an icon on the outline of the ring shape 81 determined by the ring shape determination unit 30 as shown in FIG. At this time, the icon arrangement determination unit 33 determines the icon arrangement position so that the priority order given to each icon by the icon order determination unit 31 matches the priority order associated with the arrangement position.
  • the ring shape determining unit 30 of the operation screen processing unit 24 may determine the approximate shape of the trajectory as the ring shape when the trajectory has an extremely complicated shape. By rounding a fine and distorted line of a locus with a straight line or a curve, the amount of information defining the shape of the ring can be reduced, and the processing load for arranging icons can be reduced.
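  • Rounding a fine, distorted trajectory with straight segments can be illustrated with a standard line-simplification routine (Ramer-Douglas-Peucker); the tolerance value and the function names are assumptions, and the embodiment does not prescribe any particular algorithm.

```python
import math

def _point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

def simplify_outline(points, tolerance=15.0):
    """Round off fine, distorted parts of an outline, reducing the amount of
    information that defines the ring shape. `points` are (x, y) tuples and
    `tolerance` is in pixels."""
    if len(points) < 3:
        return list(points)
    start, end = points[0], points[-1]
    index, max_dist = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_line_distance(points[i], start, end)
        if d > max_dist:
            index, max_dist = i, d
    if max_dist <= tolerance:
        return [start, end]
    left = simplify_outline(points[:index + 1], tolerance)
    right = simplify_outline(points[index:], tolerance)
    return left[:-1] + right
```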
  • in the above description, the operation screen processing unit 24 displays the selected object (for example, the object 80 in FIG. 16A and FIG. 16B, or the object 80 in FIG. 18) in the center of the screen when the icons are arranged.
  • the ring shape determination unit 30 is configured to determine the position of the ring so as to arrange icons around the center object 80.
  • alternatively, the operation screen processing unit 24 may be configured to maintain the display position of the selected object 80 as it is. Even in this case, the ring shape determination unit 30 may determine the position and size of the ring so that the ring is displayed large in the center of the screen, as shown in FIG. 19.
  • FIG. 19 shows another arrangement example of the object and the circular icon group.
  • alternatively, the ring shape determination unit 30 may determine the position and size of the ring shape so that the object 80 at its original position is located at the center of the ring, as shown in FIG. 20.
  • FIG. 20 shows another arrangement example of the object and the circular icon group.
  • the animation determination unit 32 may solve the above problem by adding an animation to the ring for arranging icons, as shown in FIG. 21. Specifically, the animation determination unit 32 adds an animation to the ring so that the ring shape determined by the ring shape determination unit 30 based on the original display position and size of the object 80 is gradually arranged large in the center of the screen over a certain period of time. As a result, the ring of icons, once arranged small around the object 80, gradually changes its shape with the passage of time and is finally arranged large in the center of the screen.
  • FIG. 21A shows a state in which the ring of icons is initially arranged small around the object 80, FIG. 21B shows a state in which the ring of icons is being expanded, and FIG. 21C shows a state in which the ring of icons, having changed from the state of FIG. 21A, is finally arranged large at the center of the screen.
  • the icon can be displayed in a sufficient size without impairing the relevance between the user's contact operation and the displayed result.
  • the animation determination unit 32 may give an animation that gradually increases the size of each icon according to the size of the ring, or may give an animation in which the size of each icon is fixed independently of the size of the ring and the interval between the icons is gradually increased.
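  • The gradual change of the ring's position, shape, and size can be illustrated by linear interpolation between the initial small ring around the object and the final large ring; the duration, the frame rate, and the function names are assumptions.

```python
def interpolate_ring(small_ring, large_ring, progress):
    """Morph the icon arrangement positions from the initial small ring around
    the object (progress=0.0) to the final large ring in the centre of the
    screen (progress=1.0). Both rings are equally long lists of (x, y)."""
    return [(sx + (lx - sx) * progress, sy + (ly - sy) * progress)
            for (sx, sy), (lx, ly) in zip(small_ring, large_ring)]

def ring_animation_frames(small_ring, large_ring, duration_s=0.4, fps=60):
    """Yield one list of icon positions per displayed frame over `duration_s`."""
    steps = max(1, int(duration_s * fps))
    for i in range(steps + 1):
        yield interpolate_ring(small_ring, large_ring, i / steps)
```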
  • the operation screen processing unit 24 is configured to simultaneously arrange a plurality of icons around the selected object.
  • the present invention is not limited to this, and when the operation screen processing unit 24 includes the animation determination unit 32, the animation determination unit 32 may determine the display timing of each icon.
  • FIG. 22 is a diagram showing a modification of the related item icon display method.
  • the animation determination unit 32 refers to the contact information stored in the contact information storage unit 44, and recognizes that the “enclose” gesture has occurred clockwise from t0 to tn.
  • the animation determination unit 32 determines that the first to eighth icons appear one by one at regular intervals in a clockwise direction so as to match this movement.
  • the animation determination unit 32 controls the display timing of the icons so that the icons appear in order at regular intervals as shown in (b), (c), (d), (e), ... of FIG. 22, and the operation screen shown in (f) of FIG. 22 is finally obtained.
  • the operation screen can be provided in a natural flow that does not contradict the user's intuition.
  • it is preferable that the icon arrangement determining unit 33 roughly match the display position of the first icon that appears with the start point of the finger trajectory (the contact position at time t0), and roughly match the display position of the last icon with the end point (the contact position at time tn).
  • in this case, the icon of the priority “first place” is arranged at the H arrangement position closest to the end point tn of the trajectory ((f) in FIG. 22), and it appears fourth ((e) in FIG. 22) rather than second.
  • the animation determination unit 32 sequentially causes each icon to appear in accordance with not only the finger movement (clockwise or counterclockwise) but also the speed at which the finger moves when the object is surrounded.
  • the contact information indicates a trajectory enclosing the object in a clockwise direction from time t0 to time tn. More specifically, it can be seen from this contact information that the contact position (the tip of the finger) is to the left of the object at time ta, at the upper left of the object at time tb, and at the upper right of the object at time tc.
  • for example, the animation determination unit 32 causes the first icon to appear immediately below the object at time t0 and then, matching the speed of the finger, determines that the icons appear up to the left of the object (the third icon) at time ta, up to the upper left of the object (the fourth icon) at time tb, up to the upper right of the object (the sixth icon) at time tc, and that all the icons have appeared at time tn.
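  • Matching the appearance timing of the icons to the pace of the finger could be sketched as follows, assuming the timed trajectory is available as (t, x, y) samples; the mapping of icons to fractions of the path length and the function name are assumptions.

```python
import math

def appearance_times(samples, icon_count):
    """Give each icon an appearance time so that the icons pop up in the drawing
    direction and at the pace of the finger: icon k appears when the finger had
    covered the fraction k/(icon_count-1) of the total path length, so a fast
    stroke makes the icons appear quickly and a slow stroke makes them appear
    slowly. `samples` are (t, x, y) tuples from t0 to tn."""
    if not samples or icon_count < 1:
        return []
    if len(samples) < 2:
        return [samples[0][0]] * icon_count
    cum = [0.0]   # cumulative path length at each sample
    for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    if total == 0:   # the finger did not actually move
        return [samples[0][0]] * icon_count
    times = []
    for k in range(icon_count):
        target = total * k / max(1, icon_count - 1)
        idx = next((i for i, d in enumerate(cum) if d >= target), len(cum) - 1)
        times.append(samples[idx][0])
    return times
```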
  • FIG. 23B shows the operation screen at time t0.
  • FIG. 23C shows an operation screen at the time point ta.
  • FIG. 23D shows the operation screen at the time tb.
  • FIG. 23E shows the operation screen at the time tc.
  • FIG. 23 (f) shows the operation screen at time tn.
  • in this case as well, the icon of the priority “first place” arranged at the H arrangement position closest to the end point tn of the trajectory ((f) in FIG. 23) does not appear first; it appears fourth ((d) in FIG. 23).
  • [Operation screen display flow] FIGS. 24A and 24B are flowcharts showing the flow of the operation screen display processing by the tablet terminal 100 in the present embodiment.
  • acquisition of the contact coordinate information indicating the contact position is started, and the contact coordinate information is acquired over time (S202). This tracking of the contact position is continued until non-contact between the touch surface and the finger is detected (NO in S203).
  • the gesture determination unit 25 may determine the gesture of the contact operation based on the contact information (S205). In the present embodiment, if the determined gesture is not “enclose” (NO in S206), the gesture determination unit 25 instructs each unit of the control unit 10 to execute a process according to the determined other gesture. Each unit performs processing according to the determined gesture (S207).
  • on the other hand, when the determined gesture is “enclose” (YES in S206), the gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information in the contact information storage unit 44.
  • the contact information generation unit 21 stores the contact information generated in S204 in the contact information storage unit 44 (S208).
  • the object specifying unit 22 includes contact information stored in the contact information storage unit 44 (for example, FIG. 5B or FIG. 17B) and map information stored in the frame map storage unit 41 ( For example, by comparing with (c) of FIG. 5, an object that overlaps the area surrounded by the user is specified as the selected object (S209). In the example shown in FIG. 5C, the object 80 “Photo 1” is specified.
  • the related item extraction unit 23 refers to the related information (for example, FIG. 6) in the related information storage unit 42 based on the object specified in S209, and extracts the related item of the specified object (S210). Alternatively, identification information of icons assigned to related items may be extracted.
  • the operation screen processing unit 24 acquires, as necessary, the contact information generated in S204 from the contact information storage unit 44 (S211). Then, the ring shape determination unit 30 of the operation screen processing unit 24 dynamically determines an icon arrangement pattern based on the ring shape on the basis of the contact information (S212). Alternatively, the operation screen processing unit 24 may determine the icon arrangement pattern by acquiring an icon arrangement pattern held in the storage unit 19. Here, as the icon arrangement pattern based on the ring shape, the ring shape determination unit 30 determines at which position and in what size the ring is arranged with respect to the object, how many icons are arranged, on the outline of what shape of ring each icon is placed, and where on the outline each icon is placed (for example, FIG. 9A, FIG. 16B, FIGS. 18 to 23).
  • the operation screen processing unit 24 executes an operation screen generation process. Specifically, the icon ranking determining unit 31 of the operation screen processing unit 24 gives priority to each of the related items extracted in S210 (S213). Specifically, the icon rank determination unit 31 reads the extracted attribute of the related item (for example, “the number of selections” in FIG. 6) from the related information storage unit 42. Then, the order of each related item is determined in order of the possibility of being selected by the user determined based on the attribute (for example, FIG. 8). The icon order determination unit 31 may associate a priority order with the icon image or icon identification information associated with the related item.
  • the icon arrangement determining unit 33 of the operation screen processing unit 24 associates each icon arrangement position defined in the icon arrangement pattern determined in S212 with the priority (S214). Specifically, the icon arrangement determining unit 33 acquires the coordinates of the end point of the trajectory from the contact information obtained in S211 and associates the priorities in order from the arrangement position with the shortest distance from the end point.
  • S213 and S214 may be executed in parallel, or may be executed sequentially in an arbitrary order in series. Furthermore, S213 may be executed before S211 or S212 as long as it is a step after S210.
  • the icon placement determination unit 33 sets each priority so that the priority assigned to each icon by the icon order determination unit 31 in S213 matches the priority order associated with each icon placement position in S214. It is determined to arrange an icon (S215).
  • the animation determination unit 32 may give an animation, as necessary, to the arrangement objects (the object, the ring, and each icon) whose arrangement positions have been determined in the upstream processes (S216).
  • for example, the animation determination unit 32 may determine the appearance timing of each arrangement object, decide to gradually change the position, shape, and size of the arrangement objects, or determine fade-in (changes in transparency) and other visual effects.
  • the operation screen processing unit 24 acquires the icon images of the related items extracted in S210 from the icon storage unit 43 (for example, FIG. 7). Then, according to the content determined in each upstream process, the operation screen is generated by arranging the object specified in S209 and arranging the acquired icon images on the outline of the ring arranged around the object, or at predetermined positions with respect to the object (S217). For example, the operation screen processing unit 24 places the object in the center and places each icon on the ring shape arranged around it, that is, on the ring-shaped outline determined in S212. Moreover, in the finally obtained operation screen, the icons are arranged closer to the trajectory end point of the “enclose” contact operation the higher their possibility of being selected.
  • the video signal of the operation screen generated as described above is output to the display unit 12.
  • as described above, the tablet terminal 100 can output an operation screen (final result) on which icons are arranged around the object, in response to the extremely natural and simple user contact operation of “enclosing” the object.
  • the user can obtain an operation screen in which icons of related items are arranged so as to surround the object as a result.
  • the positional relationship between these icons and the object matches the positional relationship between the object and the trajectory of the finger by the previous contact operation previously performed by the user.
  • the finger trajectory obtained by surrounding is similar to a ring shape in which icons are arranged.
  • the transition from the phenomenon of touching the object so as to “enclose” it to the phenomenon of obtaining an operation screen with icons arranged around the object can be said to be a natural transition that does not contradict the user's intuition.
  • the tablet terminal 100 can preliminarily detect related items that the user will select after selecting an object, and can display the related items in a selectable manner for the user.
  • the icons displayed around the object are all related item icons extracted as items related to the object. That is, after the user surrounds and selects an object, the user can immediately designate “motion”, “motion target”, “action partner”, and the like related to the object from surrounding icons.
  • the icons that are more likely to be selected are arranged closer to the movement completion position of the previous contact operation. Accordingly, the user selects (touches) an icon displayed near the movement completion position of the previous contact operation with a high probability as a desired icon. As a result, it is possible to avoid the troublesome selection operation of moving the indicator unnaturally from place to place on the screen with high probability.
  • with the tablet terminal 100 of the present invention, it is not necessary for the user to unnaturally move the indicator from one place to another on the screen, and the target final result can be reached with a simple contact operation. In addition, the shorter the moving distance of the indicator, the more it is possible to suppress the induction of erroneous operations.
  • the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition while having a simple contact operation and a small number of operations.
  • the tablet terminal 100 including the touch panel has an effect that it is possible to realize excellent operability.
  • the ring shape determining unit 30 dynamically determines the icon arrangement pattern (specifically, the ring shape) based on the movement trajectory of the indicator.
  • the tablet terminal 100 can output, as the final result, an operation screen on which icons are arranged around the object, in response to the extremely natural and simple user contact operation of “enclosing” the object. That is, the user can obtain, as a result, an operation screen in which the icons of related items are arranged so as to surround the object.
  • each icon is arranged so as to surround the object on the same or similar ring-shaped contour line as the obtained trajectory.
  • the positional relationship between these icons and the object matches the positional relationship between the object and the locus of the finger by the contact operation previously performed by the user. Further, the trajectory of the finger obtained by enclosing it matches the ring shape where the icon is arranged.
  • the icons displayed in the surroundings after selecting an object indicate related items that are deeply related to the object and are likely to be selected next.
  • the tablet terminal 100 can display the final result desired by the user with a more natural flow that does not contradict the user's intuition, while having a simple contact operation and a small number of operations.
  • as a result, the tablet terminal 100 including the touch panel achieves the effect of realizing excellent operability.
  • a tablet terminal 100 in which the first and second embodiments are appropriately combined also falls within the scope of the present invention. That is, the control unit 10 of the tablet terminal 100 according to each of the first and second embodiments may include some or all of the gesture determination unit 25, the ring shape determining unit 30, and the animation determination unit 32, even where these are not essential to that embodiment.
  • in the above embodiments, the attribute associated with each related item in the related information is the “selection count”, and a configuration has been described in which the icon rank determination unit 31 refers to the attribute “selection count” and gives a higher priority to related items that have been selected by the user more often.
  • the configuration of the tablet terminal 100 of the present invention is not limited to the above.
  • the icon rank determination unit 31 may assign priorities according to various attributes other than the attribute “selection count”.
  • for example, a related item whose attributes have a higher correlation with the attributes of the selected object may be determined to be a related item that is more likely to be selected by the user.
  • specifically, the icon rank determination unit 31 may determine the order of the related items so that related items (other photographs) whose shooting date and time is closer to that of the selected object “photograph” are given a higher priority. Accordingly, the icon arrangement determining unit 33 decides to place, among the extracted related items (“the plurality of other photographs”), the icons of photographs taken closer in time to the selected “photograph” closer to the end point of the trajectory, as sketched below.
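A minimal sketch of this date-based ordering is shown below; the attribute name "shot_at" and the ISO-8601 date format are assumptions made for illustration, not details of the embodiment.

```python
from datetime import datetime

def rank_photos_by_shoot_time(selected_photo, related_photos):
    """Order related photos so that those shot closer in time to the selected
    photo come first, i.e. receive a higher priority."""
    t0 = datetime.fromisoformat(selected_photo["shot_at"])
    return sorted(related_photos,
                  key=lambda p: abs(datetime.fromisoformat(p["shot_at"]) - t0))
```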
  • as a result, the user will, with high probability, select a desired icon displayed near the movement completion position of the previous contact operation, and the troublesome selection operation of moving the indicator unnaturally from place to place on the screen can be avoided with high probability.
  • alternatively, the icon rank determination unit 31 may determine the likelihood of selection not from a single attribute but by comprehensively using a plurality of attributes (for example, “shooting date”, “photographer”, “camera model”, “photo title”, and so on). For example, the icon rank determination unit 31 may compare a plurality of attributes of the selected object with a plurality of attributes of each related item to obtain an overall similarity, and determine that a related item with a higher similarity is more likely to be selected. Here, a related item whose attributes are similar to those of the selected object is considered more likely to be selected by the user and is therefore given a higher priority; a sketch of such a combined score follows.
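One conceivable way to turn several attribute comparisons into a single likelihood score is sketched below; the attribute names, the equality-based comparison, and the uniform weighting are assumptions made for illustration only.

```python
def similarity_score(selected_obj, related_item,
                     attributes=("shooting_date", "photographer",
                                 "camera_model", "photo_title")):
    """Return an overall similarity in [0, 1]: the fraction of the listed
    attributes on which the selected object and the related item agree."""
    matches = sum(1 for a in attributes
                  if selected_obj.get(a) is not None
                  and selected_obj.get(a) == related_item.get(a))
    return matches / len(attributes)

def rank_by_similarity(selected_obj, related_items):
    # Higher overall similarity -> higher priority (earlier in the list).
    return sorted(related_items,
                  key=lambda item: similarity_score(selected_obj, item),
                  reverse=True)
```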
  • as another example, suppose the selected object is the tool “display video content list” and a plurality of “video content” items are to be arranged as related-item icons.
  • information such as “recommendation degree”, “genre”, and “performer name” is associated with each of the related items and stored in the related information storage unit 42.
  • the “recommendation degree” is information indicating how strongly viewing is recommended to the user; it is determined in advance based on user preference information, viewing history, and the like, or by the video content provider.
  • in this case, the icon rank determination unit 31 may determine the order of the related items so that related items (video content) with a higher “recommendation degree” are given a higher priority. Accordingly, the icon arrangement determining unit 33 determines that icons of the extracted “video content” with a higher “recommendation degree” are placed closer to the end point of the trajectory.
  • a related item having a higher recommendation level is more likely to be selected by the user, and therefore, a higher priority is given.
  • as a result, the user will, with high probability, select the desired icon displayed near the movement completion position of the contact operation.
  • alternatively, the icon rank determination unit 31 may compare user preference information set in advance with the attributes of each related item (such as “genre” and “performer name”) and determine that the higher the similarity to the user's preference, the higher the likelihood of selection. Here, a related item whose attributes are close to the user's preference is considered more likely to be selected by the user and is therefore given a higher priority.
  • in addition, a “selected date and time” indicating when the related item was most recently selected may be stored as one of the attributes of the related item.
  • the icon rank determination unit 31 may refer to the attribute “selected date and time” and give a higher priority to related items whose icons were last selected more recently.
  • conversely, a higher priority may be given to related items whose icons were last selected longer ago.
  • as shown in FIGS. 4A and 4B, when the tablet terminal 100 is a small portable terminal that can be operated with either one hand or both hands, the area of the screen that the user can touch with a finger is assumed to differ between one-handed and two-handed operation. As shown in FIG. 4B, when operating with both hands, any area of the touch panel can be touched. On the other hand, as shown in FIG. 4A, when operating with one hand, the contact position tends to be biased toward the lower-left area of the screen (when operating with the left hand) or the lower-right area of the screen (when operating with the right hand).
  • the tablet terminal 100 of the present invention solves the above problem by observing the usage status of the user.
  • the tablet terminal 100 of the present invention can be configured to detect the bias of the contact position of the finger and place the icon in an area where the user's finger is expected to reach immediately.
  • in this configuration, the contact information generation unit 21 of the tablet terminal 100 generates contact information indicating the user's contact operations that occurred within a predetermined period (for example, the past several seconds to several minutes), regardless of contact/non-contact switching and regardless of whether the contact operation is an “enclosing” gesture, and stores it in the contact information storage unit 44.
  • FIG. 25 is a diagram for explaining the operation of the tablet terminal 100 of the present invention, which can present an operation screen in accordance with the user's usage situation. More specifically, FIG. 25A is a diagram illustrating an example of a situation where the user is operating with the left hand, and FIG. 25B is a diagram showing a specific example of the contact information generated in that situation.
  • the contact information generation unit 21 generates contact information as shown in FIG. 25B for the above-described series of contact operations over a predetermined period (for example, the past several seconds to several minutes) and stores it in the contact information storage unit 44.
  • the contact information generation unit 21 may be configured to delete the oldest trajectory each time a new trajectory is stored.
  • the operation screen processing unit 24 refers back to the past several seconds to several minutes of contact information stored in the contact information storage unit 44 and detects whether the finger contact positions are biased. In the example shown in FIGS. 25A and 25B, the finger trajectories are biased toward the area 82 at the lower left of the screen. The operation screen processing unit 24 detects this bias and identifies the user-accessible area as the area 82 at the lower left of the screen. Note that the area 82 at the lower left of the screen and the area 83 at the lower right of the screen are defined in advance; a sketch of such a bias check is shown below.
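The bias detection could, for example, look roughly like the following; the region boundaries (screen halves) and the 70% threshold are assumptions introduced for illustration, with y assumed to grow downward as in typical screen coordinates.

```python
def detect_accessible_area(recent_points, screen_w, screen_h, threshold=0.7):
    """Decide whether recent contact points (x, y) are biased toward the
    lower-left or lower-right area of the screen.

    Returns "lower_left", "lower_right", or None when no clear bias exists."""
    if not recent_points:
        return None
    n = len(recent_points)
    lower_left = sum(1 for x, y in recent_points
                     if x < screen_w / 2 and y > screen_h / 2)
    lower_right = sum(1 for x, y in recent_points
                      if x >= screen_w / 2 and y > screen_h / 2)
    if lower_left / n >= threshold:
        return "lower_left"    # e.g. area 82: one-handed use with the left hand
    if lower_right / n >= threshold:
        return "lower_right"   # e.g. area 83: one-handed use with the right hand
    return None
```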
  • the ring shape determining unit 30 of the operation screen processing unit 24 then determines the shape, size, and arrangement position of the ring so that the ring on which the icons are arranged fits within the area 82 at the lower left of the screen.
  • FIG. 26 is a diagram illustrating an example of the operation screen when icons are arranged according to the icon arrangement pattern (or ring shape) determined by the arrangement pattern determining unit (or ring shape determining unit 30). As shown in FIG. 26, the related items of the selected object 80 are displayed so as to fit within the area 82 at the lower left of the screen, so the user can select the target icon with the thumb without any large, unnatural movement.
  • as described above, the operation screen processing unit 24 arranges the extracted icons so that icons more likely to be selected next by the user are placed nearer the end point of the trajectory of the contact operation that selected the object. For this reason, the user can, with high probability, avoid the troublesome operation of moving the finger unnaturally around the screen. Since one-handed operation offers less freedom than two-handed operation, avoiding this inconvenience is particularly effective in improving operability during one-handed operation.
  • alternatively, in order to determine the usage situation in which the user is operating with one hand, the tablet terminal 100 may determine from the thickness of the line of the finger trajectory whether the user is operating with the thumb; if so, it may judge that the operation is one-handed and display the icons at the bottom of the screen.
  • alternatively, a sensor may be provided in the casing of the tablet terminal 100, and the tablet terminal 100 may determine whether the casing is gripped with four fingers or with five fingers and, accordingly, whether the operation is one-handed or two-handed.
  • alternatively, the operation screen processing unit 24 of the tablet terminal 100 may refer to the contact coordinate information of the area enclosed by the finger trajectory, identify the trajectory area, determine that the trajectory area and its vicinity are the user-accessible area, and arrange the icon ring there.
  • when the icon arrangement determining unit 33 determines, in view of the determined ring shape size and the icon size, that not all of the icons extracted by the related item extraction unit 23 can be displayed, it may decide to reduce the number of icons to be displayed.
  • for example, the icon arrangement determining unit 33 may determine the number of icons to be displayed based on the absolute size of the finger trajectory (or of the enclosed area), with reference to the contact information. In this way, the user can intentionally adjust the number of icons displayed next by enclosing the object with a smaller or larger ring.
  • in this case, the icon arrangement determining unit 33 can remove icons starting from those with the lowest priority determined by the icon rank determination unit 31, as in the sketch below.
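A rough sketch of such a reduction is shown below; the use of Ramanujan's approximation for the ellipse circumference and the fixed margin are assumptions made only to give a concrete sizing rule.

```python
import math

def limit_icons(icons_by_priority, ring_a, ring_b, icon_size, margin=4):
    """Drop the lowest-priority icons until the remainder fits on the ring.

    icons_by_priority -- icons sorted from highest to lowest priority
    ring_a, ring_b    -- semi-axes of the elliptical ring (pixels)
    icon_size         -- icon diameter (pixels)
    """
    # Ramanujan's approximation of the circumference of an ellipse.
    h = ((ring_a - ring_b) ** 2) / ((ring_a + ring_b) ** 2)
    circumference = math.pi * (ring_a + ring_b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))

    max_icons = max(1, int(circumference // (icon_size + margin)))
    return icons_by_priority[:max_icons]   # icons are cut from the low-priority end
```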
  • in the above embodiments, when associating priorities with the icon arrangement positions defined in a predetermined icon arrangement pattern, the icon arrangement determining unit 33 associated higher priorities with arrangement positions in order of increasing distance from the end point tn of the trajectory.
  • the icon arrangement determining unit 33 may associate the icon arrangement position with the priority order by the method described below.
  • FIG. 27 (a) to 27 (c) are diagrams showing specific examples of association between icon arrangement positions and priorities performed by the icon arrangement determining unit 33.
  • in the following, the predetermined icon arrangement pattern referred to by the icon arrangement determining unit 33 is assumed to be one in which eight icons are arranged at equal intervals on the outline of an elliptical ring, as shown, with the end point of the finger trajectory at the time the object was selected located at the position of the end point tn in FIG. 27.
  • first, the icon arrangement determining unit 33 specifies the point P on the outline of the ring that is closest to the end point tn of the trajectory. Then, following the outline of the ring, the icon arrangement determining unit 33 associates higher priorities with the icon arrangement positions A to H (FIG. 9A) whose distance along the outline from the point P is shorter.
  • specifically, as shown in FIG. 27(a), the icon arrangement determining unit 33 associates the eighth priority with icon arrangement position A, the sixth with B, the fourth with C, the second with D, the first with E, the third with F, the fifth with G, and the seventh with H; a sketch of this rule follows.
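The rule of FIG. 27(a) can be sketched as follows. The code assumes the arrangement positions are given as points in cyclic order around the ring and approximates the distance along the outline by a polygonal path through those points; this representation is an assumption for illustration.

```python
import math

def rank_positions_by_outline_distance(ring_positions, point_p):
    """Order arrangement positions by increasing distance measured along the
    ring outline from the point P nearest the trajectory end point tn.

    ring_positions -- (x, y) arrangement positions in cyclic order around the ring
    point_p        -- point on the outline closest to the end point tn
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    n = len(ring_positions)
    seg = [dist(ring_positions[i], ring_positions[(i + 1) % n]) for i in range(n)]
    perimeter = sum(seg)

    # Arc coordinate of each arrangement position along the polygonal outline.
    arc = [0.0]
    for i in range(1, n):
        arc.append(arc[-1] + seg[i - 1])

    # Approximate P's arc coordinate by that of the arrangement position nearest to it.
    p_arc = arc[min(range(n), key=lambda i: dist(ring_positions[i], point_p))]

    def outline_dist(i):
        d = abs(arc[i] - p_arc)
        return min(d, perimeter - d)   # take the shorter way around the ring

    return [ring_positions[i] for i in sorted(range(n), key=outline_dist)]
```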
  • this association method is advantageous when the tablet terminal 100 displays an operation screen on which the icon ring can be rotated both clockwise and counterclockwise by the user's drag operation.
  • it is assumed that the user rotates the displayed icon ring by a drag operation to bring the target icon to the position of the fingertip (that is, near the end point tn). An icon placed at a position whose distance along the outline from the end point tn is shorter can be brought to the fingertip with fewer drag (rotation) operations. Therefore, the user can reach the target final result with a small number of operations, and the troublesome selection operation can be avoided.
  • the icon arrangement determining unit 33 determines to arrange an icon having a high priority at an icon arrangement position where the distance to follow the outline from the end point tn is short.
  • an icon having a high possibility of being selected by the user can be arranged at an icon arrangement position having a short distance on the contour line from the end point tn. For this reason, the user can reach the target icon at a shorter distance with high probability by running the finger from the position where the object has been selected.
  • the icon arrangement determining unit 33 associates the highest priority with the arrangement position closest to the end point tn of the locus among the predetermined icon arrangement positions A to H.
  • the first place is associated with the icon arrangement position E.
  • then, starting from E (first), the second and subsequent priorities are sequentially associated with each arrangement position in the counterclockwise direction. That is, as shown in FIG. 27(b), the second priority is associated with D, the third with C, the fourth with B, the fifth with A, the sixth with H, the seventh with G, and the eighth with F.
  • this association method is advantageous when the tablet terminal 100 displays an operation screen on which the icon ring can be rotated clockwise by the user's drag operation.
  • it is assumed that the user rotates the displayed icon ring clockwise by a drag operation to bring the target icon to the position of the fingertip (that is, near the end point tn).
  • as the ring rotates clockwise, the icons shown in FIG. 27(b) arrive at the user's fingertip (near the end point tn) in counterclockwise order, starting from the first-ranked icon. This order of arrival matches the priority order shown in FIG. 27(b) (the order in which the user is likely to select), so the user can find the target icon with high probability with a smaller number of drag operations.
  • alternatively, the icon arrangement determining unit 33 associates the first priority with icon arrangement position E, as in the example shown in FIG. 27(b), and then sequentially associates the second and subsequent priorities with each arrangement position in the clockwise direction from E. That is, as shown in FIG. 27(c), the second priority is associated with F, the third with G, the fourth with H, the fifth with A, the sixth with B, the seventh with C, and the eighth with D.
  • in this case, as the ring rotates counterclockwise, the icons shown in FIG. 27(c) arrive at the user's fingertip (near the end point tn) in clockwise order, starting from the first-ranked icon. This order of arrival matches the priority order shown in FIG. 27(c), so the user can find the target icon with high probability with a smaller number of drag operations; both rules are sketched compactly below.
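Both rotational rules reduce to the same idea: start at the arrangement position nearest the end point tn and assign priorities by walking the ring against the direction in which it will be rotated. A sketch under the assumption that the arrangement positions are supplied in clockwise order:

```python
import math

def rank_positions_for_rotation(ring_positions_cw, end_point, ring_rotates="cw"):
    """Order arrangement positions from 1st to last priority for a rotatable ring.

    ring_positions_cw -- arrangement positions listed in clockwise order
    end_point         -- trajectory end point tn
    ring_rotates      -- "cw" if the icon ring is rotated clockwise by dragging
                         (FIG. 27(b)-style), "ccw" otherwise (FIG. 27(c)-style)
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    n = len(ring_positions_cw)
    start = min(range(n), key=lambda i: dist(ring_positions_cw[i], end_point))

    # A clockwise-rotating ring delivers the counterclockwise neighbours first,
    # so priorities are assigned counterclockwise from the start, and vice versa.
    step = -1 if ring_rotates == "cw" else 1
    return [ring_positions_cw[(start + step * k) % n] for k in range(n)]
```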
  • the tablet terminal 100 as the information processing apparatus of the present invention has been described on the assumption that it is a small and portable smartphone that can be operated with one hand.
  • the information processing apparatus of the present invention is not limited to a small smartphone, but can be applied to a tablet PC having a notebook size screen, an electronic blackboard having a larger screen than that, and the like.
  • Tablet PCs, electronic blackboards, etc. have a larger display screen than smartphones. Therefore, in order for the user to select a desired object or icon, an indicator (finger or pen) or the like must be moved more than when operating the smartphone.
  • the movement of the indicator can be further reduced when selecting a desired object or icon. That is, if the information processing apparatus of the present invention is used in a tablet PC having a large display screen or an electronic blackboard, the advantage of the present invention that realizes excellent operability can be enjoyed more.
  • FIG. 29 and FIG. 31 are views showing a state where the tablet terminal 100a as the information processing apparatus of the present invention is realized by a tablet PC.
  • in the figures corresponding to these examples, (a) is a diagram showing a specific example of the icon arrangement pattern on which the trajectory of the indicator and its end point tn have been plotted by the icon arrangement determining unit 33, and (b) is a diagram showing the result of associating priorities with the icon arrangement positions.
  • FIGS. 29 and 31 it is assumed that two photo objects are displayed on the entire screen of the display unit 12 of the tablet terminal 100a, and the user selects one of them by moving the indicator. Specifically, it is assumed that the user performs a contact operation by moving a finger (indicator) so as to surround the target photographic object.
  • the user draws a circle so as to surround the photo object from around the lower right of the target photo object, and similarly finishes drawing the circle around the lower right.
  • the icon arrangement determining unit 33 acquires the trajectory of the finger drawn by this contact operation and its end point tn, and plots them on the icon arrangement pattern.
  • then, in accordance with a predetermined rule defined for the pattern (for example, a rule that associates higher priorities in order of increasing distance from the end point tn), the icon arrangement determining unit 33 associates the priorities 1 to 8 with the icon arrangement positions A to H, based on the end point tn.
  • when the icon arrangement determining unit 33 associates priorities with the icon arrangement positions according to the above rule, the result is as shown in (b) of FIG. 30.
  • the number assigned to each icon arrangement position represents the associated priority order.
  • the operation screen processing unit 24 then arranges each icon at its arrangement position in accordance with the determination by the icon arrangement determining unit 33, so that the arrangement matches the priorities previously assigned to the icons by the icon rank determination unit 31.
  • as a result, icons with a higher priority (that is, icons highly likely to be selected by the user) are displayed near the end point tn, where the fingertip is considered to be located. Therefore, even on the large screen of the tablet terminal 100a, the user can, with high probability, move on to the operation of selecting the target icon without moving the finger greatly.
  • the method of the contact operation varies depending on the user.
  • the user may draw a circle so as to surround the photo object from the upper left of the target photo object, and finish drawing the circle around the upper left as well.
  • in this case, the icon arrangement determining unit 33 plots the finger trajectory and its end point tn on the icon arrangement pattern, as shown in (a) of FIG. 32.
  • the icon arrangement determining unit 33 associates the higher priority with the icon arrangement position having a shorter straight line distance with the end point tn.
  • the result is as shown in (b) of FIG. 32.
  • in this case as well, icons with a higher priority (icons highly likely to be selected by the user) are displayed near the end point tn, where the fingertip is considered to be located. Therefore, even on the large screen of the tablet terminal 100a, the user can, with high probability, move on to the operation of selecting the target icon without moving the finger greatly.
  • in an information processing apparatus with a large screen, the moving distance of the indicator when selecting an object is long, so the end point (fingertip) can end up anywhere on the screen. The icon to be selected next may then be displayed at a position far from that end position. In that case, the user has to move the indicator unnaturally from place to place on the screen, and operability deteriorates. Note that the larger the screen of the information processing apparatus, the more troublesome the operation of moving the indicator over a long distance becomes, and the more serious this problem is.
  • in contrast, with the tablet terminal 100a of the present invention, wherever on the screen the end point (fingertip) ends up, icons that are highly likely to be selected by the user are displayed at positions close to that end position, so the user can select the target icon with high probability without moving the finger far. For example, as in the examples described above, even when the same photo object is enclosed, the way of enclosing it differs from user to user, and correspondingly different results are obtained. In the contact operation shown in FIG. 29, the highest priority is associated with icon arrangement position E, as shown in FIG. 30, and in the contact operation shown in FIG. 31, the highest priority is associated with icon arrangement position G, as shown in FIG. 32. In this way, the larger the screen of the information processing apparatus, the more the final result differs depending on how the user encloses the object, and the serious problem described above can be resolved.
  • thus, with the tablet terminal 100a of the present invention, the user does not need to move the indicator unnaturally from place to place on the screen and can reach the target final result with a simple contact operation. In addition, the shorter the moving distance of the indicator, the more the induction of erroneous operations can be suppressed. As a result, excellent operability is realized. As described above, when the information processing apparatus of the present invention is applied to a tablet PC with a large display screen, an electronic blackboard, or the like, the obtained effect is particularly great.
  • in the above embodiments, the icon arrangement determining unit 33 is configured to recognize the icon arrangement positions based on a predetermined icon arrangement pattern (for example, (a) in FIGS. 9, 30, and 32).
  • the icon arrangement determination unit 33 may determine an icon arrangement pattern by itself based on a finger trajectory acquired by the contact information generation unit 21 when the user performs a contact operation.
  • FIG. 33 is a diagram illustrating an example of an icon arrangement pattern determined by the icon arrangement determining unit 33.
  • the icon arrangement determining unit 33 acquires information on the ring shape and the arrangement position determined by the ring shape determining unit 30, and plots the end point tn of the trajectory with respect thereto.
  • the icon arrangement determination unit 33 specifies a point P on the contour line of the ring that is closest to the end point of the locus, and determines that point as the icon arrangement position with the highest priority (first place).
  • a more detailed icon arrangement position of the first icon is not particularly limited. For example, a position where the center of the icon coincides with the point P may be determined as the first icon arrangement position.
  • next, the icon arrangement determining unit 33 determines the remaining icon arrangement positions based on the previously determined first icon arrangement position. For example, when seven more icons are to be arranged, the remaining icon arrangement positions A to G are all determined at equal intervals on the contour line with the first icon arrangement position as the reference, as shown in FIG. 33. The association between the remaining icon arrangement positions A to G and the priorities may be performed according to any of the rules described above; a sketch of this dynamic pattern determination follows.
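A minimal sketch of this dynamic determination, assuming the ring is an axis-aligned ellipse whose outline is sampled densely to find the point P (the sampling approach and the equal parameter spacing are simplifying assumptions, so the spacing is only approximately equal in arc length):

```python
import math

def dynamic_icon_positions(center, a, b, end_point, n_icons, samples=720):
    """Determine icon arrangement positions on an elliptical ring.

    1. Find the point P on the outline closest to the trajectory end point;
       P becomes the highest-priority (1st) arrangement position.
    2. Place the remaining positions at equal parameter intervals around the
       outline, starting from P.
    """
    cx, cy = center

    def point(theta):
        return (cx + a * math.cos(theta), cy + b * math.sin(theta))

    # Step 1: sample the outline and keep the sample nearest the end point.
    thetas = [2 * math.pi * k / samples for k in range(samples)]
    theta_p = min(thetas, key=lambda t: math.dist(point(t), end_point))

    # Step 2: the first position is P itself; the rest follow around the ring.
    return [point(theta_p + 2 * math.pi * k / n_icons) for k in range(n_icons)]
```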
  • the operation screen processing unit 24 can always arrange the icon with the highest priority in the shortest distance from the end point tn on the outline of the ring.
  • the icon arrangement determination unit 33 may determine the arrangement position of the icon with the highest priority (first place) as follows.
  • that is, the icon arrangement determining unit 33 first acquires, from the trajectory, a point tn-1 in addition to the end point tn. Then, the icon arrangement determining unit 33 specifies the position at which the extension of the straight line running from the point tn-1 through the end point tn first intersects the contour line of the ring (here, the dashed ellipse), that is, the intersection point Q, and determines that point as the icon arrangement position with the highest priority.
  • this configuration is effective when the end point tn is inside the ring.
  • this is because the intersection point Q can be considered the point on the outline of the ring that the indicator would naturally reach earliest if it continued moving in the same direction; a sketch of computing Q for an elliptical ring follows.
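For an axis-aligned elliptical ring, the intersection point Q can be computed analytically by substituting the ray through tn-1 and tn into the ellipse equation; the following sketch makes that assumption about the ring's representation.

```python
import math

def intersection_q(center, a, b, t_prev, t_end):
    """Return the first point Q where the ray from t_prev through t_end crosses
    the ellipse ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1, or None if there is no crossing.

    Intended for the case where the end point t_end lies inside the ring."""
    cx, cy = center
    # Ray: p(s) = t_end + s * d for s >= 0, with direction d = t_end - t_prev.
    dx, dy = t_end[0] - t_prev[0], t_end[1] - t_prev[1]
    x0, y0 = t_end[0] - cx, t_end[1] - cy

    # Substituting into the ellipse equation gives A*s^2 + B*s + C = 0.
    A = (dx / a) ** 2 + (dy / b) ** 2
    B = 2 * (x0 * dx / a ** 2 + y0 * dy / b ** 2)
    C = (x0 / a) ** 2 + (y0 / b) ** 2 - 1
    if A == 0:
        return None
    disc = B * B - 4 * A * C
    if disc < 0:
        return None
    roots = ((-B - math.sqrt(disc)) / (2 * A), (-B + math.sqrt(disc)) / (2 * A))
    s = min((r for r in roots if r >= 0), default=None)   # first hit along the ray
    return None if s is None else (t_end[0] + s * dx, t_end[1] + s * dy)
```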
  • the configuration of the tablet terminal 100 as the information processing apparatus of the present invention, that is, the configuration in which icons with a higher priority are arranged and displayed near the end point of the movement trajectory of the indicator, may also be applied to an information processing apparatus in which the input unit 11 and the display unit 12 are provided separately.
  • an information processing apparatus in which the input unit 11 is configured by an input device such as a mouse and the display unit 12 is configured by a display device such as an LCD can be considered.
  • in this case, the cursor displayed on the display unit 12 indicates a position on the screen of the display unit 12, and when the user operates the mouse to perform an input operation, the cursor moves.
  • that is, the mouse is the operating body, the cursor is the indicator, and the indicator displayed on the display unit 12 moves as the operating body moves.
  • the information processing apparatus updates the cursor position it holds in conjunction with the movement of the mouse, and also holds the trajectory of that movement.
  • as a result, an operation screen is obtained in which icons with a higher priority (a higher possibility of being selected by the user) are arranged near the position where the user finished moving the mouse, and the user can immediately move on to the next operation.
  • various input devices such as a keyboard, a joystick, a digitizer, a tablet, and a stylus pen can be employed in addition to the mouse.
  • in other words, the information processing apparatus of the present invention is an information processing apparatus including a touch panel, comprising: contact operation acquisition means for acquiring the trajectory of movement of an indicator that has moved on the touch panel; object specifying means for specifying, as the selected object, an object at least partially overlapping the selection region specified based on the trajectory acquired by the contact operation acquisition means; related item extraction means for extracting, as related items, the items associated with the selected object; and operation screen processing means for arranging the icons of the related items extracted by the related item extraction means at specific positions and displaying them on the touch panel, wherein the operation screen processing means arranges, among the extracted related items, the icons of related items with a higher priority (for example, in descending order of the possibility of being selected by the user) nearer the end point of the trajectory acquired by the contact operation acquisition means.
  • similarly, the operation screen display method of the present invention is an operation screen display method in an information processing apparatus including a touch panel, comprising: a contact operation acquisition step of acquiring the trajectory of movement of an indicator that has moved on the touch panel; an object specifying step of specifying, as the selected object, an object at least partially overlapping the selection region specified based on the trajectory acquired in the contact operation acquisition step; a related item extraction step of extracting, as related items, the items associated with the object specified in the object specifying step, with reference to a related information storage unit that stores objects and items related to those objects in association with each other; and an operation screen processing step of arranging the icons of the related items extracted in the related item extraction step at specific positions and displaying them on the touch panel, wherein, in the operation screen processing step, among the extracted related items, the icons of related items with a higher priority (for example, in descending order of the possibility of being selected by the user) are arranged nearer the end point of the trajectory acquired in the contact operation acquisition step.
  • each block of the tablet terminal 100 in particular, the contact information generating unit 21, the object specifying unit 22, the related item extracting unit 23, the operation screen processing unit 24, the gesture determining unit 25, the ring shape determining unit 30, the icon rank determining unit 31, the animation determination unit 32, and the icon arrangement determination unit 33 may be configured by hardware logic, or may be realized by software using a CPU as follows.
  • the tablet terminal 100 includes a CPU (central processing unit) that executes instructions of a control program that realizes each function, a ROM (read only memory) that stores the program, a RAM (random access memory) that develops the program, A storage device (recording medium) such as a memory for storing the program and various data is provided.
  • An object of the present invention is to provide a recording medium on which a program code (execution format program, intermediate code program, source program) of a control program of the tablet terminal 100, which is software that realizes the functions described above, is recorded so as to be readable by a computer. This can also be achieved by supplying the tablet terminal 100 and reading and executing the program code recorded on the recording medium by the computer (or CPU or MPU).
  • examples of the recording medium include tapes such as magnetic tapes and cassette tapes; disks including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical disks such as CD-ROM, MO, MD, DVD, and CD-R; cards such as IC cards (including memory cards) and optical cards; and semiconductor memories such as mask ROM, EPROM, EEPROM, and flash ROM.
  • the tablet terminal 100 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network.
  • the communication network is not particularly limited.
  • for example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone network, a mobile communication network, a satellite communication network, or the like can be used.
  • the transmission medium constituting the communication network is not particularly limited.
  • for example, wired media such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL lines can be used, and wireless media such as infrared (IrDA, remote control), Bluetooth (registered trademark), IEEE 802.11 wireless, HDR, mobile phone networks, satellite links, and terrestrial digital networks can also be used.
  • the present invention can also be realized in the form of a computer data signal embedded in a carrier wave in which the program code is embodied by electronic transmission.
  • the present invention can be widely applied to information processing apparatuses including an input unit and a display unit.
  • for example, the present invention can be suitably used for a digital TV, a personal computer, a smartphone, a tablet PC, a notebook computer, a mobile phone, a PDA (Personal Digital Assistant), an electronic book reader, an electronic dictionary, a portable or home game machine, an electronic blackboard, and the like, each provided with an input unit and a display unit.
  • 10 Control unit
    11 Input unit (touch panel)
    12 Display unit (touch panel)
    13 Operation unit
    14 External interface
    15 Communication unit
    16 Wireless communication unit
    17 Audio output unit
    18 Audio input unit
    19 Storage unit
    21 Contact information generation unit (trajectory acquisition means / contact operation acquisition means)
    22 Object specifying unit (object specifying means)
    23 Related item extraction unit (related item extraction means)
    24 Operation screen processing unit (operation screen processing means)
    25 Gesture determination unit (gesture determination means)
    30 Ring shape determining unit (ring shape determination means / arrangement pattern determination means)
    31 Icon rank determination unit (icon rank determination means)
    32 Animation determination unit (animation determination means)
    33 Icon arrangement determining unit (icon arrangement determination means)
    41 Frame map storage unit
    42 Related information storage unit
    43 Icon storage unit
    44 Contact information storage unit
    100 Tablet terminal (information processing apparatus)

Abstract

This information processing device, which is provided with an input unit and a display unit, achieves superior operability. A tablet terminal (100) is characterized by being provided with: a touch information generation unit (21) that acquires the trajectory along which an indicator that points out a position of the screen of the display unit (12) has moved; an object specification unit (22) that, as a selected object, specifies an object of which at least a portion overlaps a selected region specified on the basis of the trajectory; a related item extraction unit (23) that extracts, as a related item, an item associated with the specified object by referring to a related information recording unit (42) that associates and records objects and items related to the objects; and an operation screen processing unit (24) that disposes the icon of the extracted related item at a specific position, and displays the icon at the display unit (12). The tablet terminal is further characterized by the operation screen processing unit (24) disposing the icons of related items having a higher priority rank among the extracted related items closer to the endpoint of the trajectory.

Description

Information processing apparatus, operation screen display method, control program, and recording medium
 The present invention relates to user interface technology for an information processing apparatus including an input unit and a display unit.
 In recent years, so-called tablet terminals such as smartphones and tablet PCs have been spreading rapidly. A tablet terminal has a flat outer shape and includes a touch panel serving as a display unit and an input unit. By touching objects displayed on the touch panel with a finger, a pen, or the like, the user can perform various operations on the tablet terminal body.
 A tablet terminal can discriminate, through the touch panel, various contact actions performed by the user on the screen and can display objects in accordance with those contact actions. Contact actions include, for example, tapping (lightly striking), flicking, pinching (nipping with the fingers), and dragging an object displayed on the screen with a finger (or pen). The tablet terminal discriminates these various contact actions and, according to the result, selects or moves objects, scrolls lists, and enlarges or reduces images. As described above, tablet terminals realize more intuitive operation through the touch panel and are supported by many people.
 For example, Patent Document 1 discloses a mobile communication terminal including a touch-panel display unit. In the mobile communication terminal of Patent Document 1, an object (a URL, an e-mail address, a character string, an image, or the like) can be selected by touching it with a finger (or pen) so as to trace it or so as to enclose it. When an object is selected by such an action, the mobile communication terminal extracts a keyword from the selected object and accesses a related site.
 Patent Document 2 discloses a portable device having a touch-panel display. The portable device of Patent Document 2 displays a through image (such as an image captured by a camera) on the touch-panel display, detects a specific target in the through image selected by touching so as to enclose it, and can display a reduced image of the specific target at an edge of the touch-panel display as a release button.
 Patent Document 3 discloses a website search system using a touch panel. When a keyword displayed in the keyword display area is touched by hand, the website search system of Patent Document 3 accepts it as a search keyword and displays a first mother icon corresponding to the accepted keyword. The website search system then searches for websites with a search engine according to the keyword and displays thumbnail images of the retrieved websites around the first mother icon.
 Patent Document 4 discloses an information processing apparatus including a display panel with a contact sensor. The information processing apparatus of Patent Document 4 detects, while an object is selected, a rotation of the operating body (finger) by a predetermined angle or more, and displays operation items related to the object around the object.
 Patent Document 5 discloses an information processing apparatus including a touch panel unit. The information processing apparatus of Patent Document 5 acquires the trajectory of the user's touch position, specifies the object image selected by the trajectory, and moves the selected object image to a position corresponding to the end point of the trajectory.
 Meanwhile, not only in tablet terminals but in information processing apparatuses in general, regardless of the size of the apparatus, the user interface technique of displaying a menu and accepting a selection is very widely used. With this technique, the information processing apparatus displays a menu so that the user can select a desired item and accepts the user's selection, allowing the user to operate the apparatus by selecting a desired item. For example, Patent Documents 6 to 9 disclose information processing apparatuses that realize menu display for the purpose of improving user convenience and operability.
 Specifically, Patent Document 6 discloses an information display device that reduces the amount of cursor movement on the display screen and thereby reduces the movement of the user's line of sight. In this information display device, when the cursor is moved by a pointing device that controls the cursor position, the device detects the shape of the cursor's trajectory. The information display device then searches for menu information associated with the trajectory shape and displays the retrieved menu information near the end of the cursor trajectory.
 Patent Document 7 discloses a window display control device that detects the trajectory of movement of a mouse or the like, selects a pop-up menu associated with the trajectory information, and displays it on the display screen. Specifically, the window display control device displays a pop-up menu starting from the cursor position of the mouse, or arranges a plurality of predetermined pop-up menus along the trajectory of the mouse.
 Patent Document 8 discloses a method and apparatus for providing a menu display that speeds up the selection of multiple items. Specifically, the apparatus displays a combination of an angular marking menu and a linear menu. In the angular marking menu, the selected item is determined by a stroke pattern created with a pen or the like; in the linear menu, an item is selected by selecting a position with a pen or the like.
 Patent Document 9 discloses a menu display control device and an electronic blackboard system to which it is applied. Specifically, the menu display control device recognizes an input history of a plurality of coordinates entered into the device when the user draws a line or the like on the coordinate input surface with a fingertip. The menu display control device determines, based on the input history, whether a processing menu should be displayed, and displays the processing menu. More specifically, when the user draws a circle on the coordinate input surface with the index finger and then, while keeping the index finger on the screen, touches the coordinate input surface with the middle finger, the menu display control device displays the processing menu near the position touched by the middle finger.
Patent Document 1: JP 2010-218322 A (published September 30, 2010)
Patent Document 2: JP 2010-182023 A (published August 19, 2010)
Patent Document 3: JP 2009-134738 A (published June 18, 2009)
Patent Document 4: JP 2011-13980 A (published January 20, 2011)
Patent Document 5: JP 2006-244353 A (published September 14, 2006)
Patent Document 6: JP 8-305535 A (published November 22, 1996)
Patent Document 7: JP 10-307674 A (published November 17, 1998)
Patent Document 8: JP 11-507455 A (published June 29, 1999)
Patent Document 9: JP 2001-265475 A (published September 28, 2001)
 The operability of a tablet terminal depends on how simple the contact operation is and how few operations are required to display the final result that is the user's goal, and on whether the result based on that contact operation is displayed in a natural flow that does not contradict the user's intuition.
 Such improvements in operability are realized by properly grasping the user's purpose, state, and tendencies. The tablet terminal is required to "sense" the user's intention from every point of view: for example, what the user wants to do now and what the user will want to do next, how the user is operating at the moment, where the user is, and what kind of display would feel natural in response to the user's movements.
 The configurations of the devices of Patent Documents 1 to 9 described above are not necessarily sufficient for sensing the user's intention.
 More specifically, Patent Document 1 discloses selecting an object by the action of enclosing it, but does not disclose extracting and displaying items related to that object in response to the action. Patent Document 2 discloses displaying an icon corresponding to an object in response to the action of enclosing the object, but does not disclose selecting the object by that action and, along with the selection, extracting and displaying items related to the object. Patent Document 3 discloses displaying an object when it is selected and displaying thumbnails related to the object around it, but the action of enclosing an object is not linked to the display of the thumbnails. Patent Document 4 discloses selecting an object by touching it and displaying icons related to the object around it; however, in order to display the icons around the object, a cumbersome action separate from the selection (pressing a finger against the touch surface and twisting it to change its angle) must be performed, so the number of operations increases and the operation required to display the desired result (the icons) becomes very complicated.
 Patent Documents 6 and 7 make no mention of an object being selected before the menu information is displayed, and therefore cannot control the arrangement positions of the items of menu information in accordance with the selection of an object. The technique of Patent Document 8 has the problem that operation becomes very complicated: there are two types of menu list, and the way an item is selected differs for each type of menu, so there are two patterns of pen movement and item-selection flow and the pen operation must be used differently depending on the type. With the technique of Patent Document 9, the touch panel must be operated with a plurality of fingers, so the operation is cumbersome, and the technique cannot be applied to devices operated with a pen.
 As a result, there is the problem that the series of processes of selecting an object, confirming it, displaying the results (related items), selecting a result, and so on up to displaying the final result cannot be made to be performed by the device through a simple contact operation, a small number of operations, and intuitive contact actions.
 The operability problem described above arises not only in small tablet terminals with excellent portability but also in information processing apparatuses of any size provided with a touch-panel display and input unit, and, more generally, in information processing apparatuses provided with a display unit and an input unit of any form, not limited to touch panels.
 The present invention has been made in view of the above problems, and an object thereof is to realize excellent operability in an information processing apparatus including an input unit and a display unit.
 In order to solve the above problems, an information processing apparatus of the present invention includes: trajectory acquisition means for acquiring the trajectory along which an indicator pointing to a position on the screen of a display unit has moved; object specifying means for specifying, as the selected object, an object at least partially overlapping a selection region specified based on the trajectory acquired by the trajectory acquisition means; related item extraction means for extracting, as related items, the items associated with the object specified by the object specifying means, with reference to a related information storage unit that stores objects and items related to those objects in association with each other; and operation screen processing means for arranging the icons of the related items extracted by the related item extraction means at specific positions and displaying them on the display unit, wherein the operation screen processing means arranges, among the extracted related items, the icons of related items with a higher priority closer to the end point of the trajectory acquired by the trajectory acquisition means.
 According to the above configuration, the trajectory acquisition means acquires the trajectory of the action (movement of the indicator) performed by the user to select an object, and based on this trajectory the object specifying means specifies the object selected by the user through that action. Subsequently, the related item extraction means extracts items related to the specified object. Since the related information storage unit stores objects and the items related to them in association with each other, the related items extracted by the related item extraction means are all items related to the object selected by the user. Finally, the operation screen processing means arranges the icons of the extracted related items at specific arrangement positions on the display unit.
 Here, in more detail, the operation screen processing means determines which related item's icon is to be placed at which arrangement position as follows.
 The operation screen processing means refers to the distances between the arrangement positions specified in advance for placing icons and the position of the end point of the trajectory acquired by the trajectory acquisition means, and determines that, among the extracted related items, icons of related items with a higher priority (for example, a higher possibility of being selected by the user) are placed at arrangement positions closer to the end point.
 Thereby, in response to a single action by the user to select an object, the tablet terminal 100 can output the result of displaying, in a selectable manner, the icons of the related items related to the selected object. Moreover, the arrangement of the icons displayed here takes the user's action into account: icons with a higher priority are displayed nearer the position where the user finished the preceding action (the end point of the trajectory).
 The user is expected to perform the action of selecting an icon of a related item of the object immediately after performing the action of selecting that object, and this is the natural flow of operation. In this case, the user moves the indicator (a finger, a pen, or a cursor operated with a mouse, for example) from the position where the preceding action was completed to the position where the target icon is displayed.
 Here, the longer the distance from the position where the preceding action was completed to the position where the target icon is displayed, the farther the indicator must be moved, and the more troublesome the selection operation becomes for the user. Such annoyance becomes more pronounced the larger the screen of the display unit, and becomes a particularly serious problem when the user is operating with one hand and the reachable area is limited. The annoyance also increases when objects are managed in a hierarchy and selection operations are performed repeatedly.
 Therefore, by arranging icons that are more likely to be selected closer to the completion position of the preceding action, as in the present invention, the user will, with high probability, select a desired icon displayed near the movement completion position of the preceding action. As a result, the troublesome selection operation described above can be avoided with high probability.
 Accordingly, the user does not need to move the indicator unnaturally from place to place on the screen and can reach the target final result with a simple action. In addition, the shorter the moving distance of the indicator, the more the induction of erroneous operations can be suppressed.
 In addition, the information processing apparatus of the present invention can anticipate the related items that the user will select next after selecting an object, and can display them so that they are selectable by the user. Specifically, according to the above configuration of the present invention, the icons arranged on the operation screen by the operation screen processing means are all icons of related items extracted as items related to the object selected by the user. In other words, after an object is selected, the related items likely to be selected next are displayed as icons at positions nearer the completion position of the action in descending order of the possibility of being selected, so the user can immediately specify the next target icon without moving the indicator far.
 From the above, the information processing apparatus of the present invention can display the final result desired by the user in a natural flow that does not contradict the user's intuition, with a simple action and a small number of operations. As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
 In the information processing apparatus of the present invention, the related information storage unit may store, for each related item, the number of times the icon of the related item has been selected by the user, and the operation screen processing means may arrange the icons of related items that have been selected more times closer to the end point of the trajectory.
 上記構成によれば、操作画面処理手段によって、抽出されたアイコンが特定の配置位置に配置されるとき、操作画面処理手段は、選択された回数が多い関連項目のアイコンほど、上記軌跡の終点の近くに配置する。 According to the above configuration, when the extracted icon is arranged at a specific arrangement position by the operation screen processing means, the operation screen processing means indicates that the icon of the related item that is selected more frequently becomes the end point of the trajectory. Place it close.
 選択された回数が多い関連項目は、そのアイコンがユーザによってよく選択されているということを示す。ユーザによって最も多く選択されているアイコン(関連項目)は、優先順位が高い、すなわち、この次に選択される可能性が最も高いアイコン(関連項目)であると考えられる。 -A related item that is selected many times indicates that the icon is often selected by the user. The icon (related item) most frequently selected by the user is considered to be the icon (related item) having the highest priority, that is, the most likely to be selected next.
 上述のように過去の選択回数に基づいて、操作画面処理手段がアイコンを配置することにより、ユーザに選択される可能性が高いアイコンほど、軌跡の終点近くに配置されるように、操作画面(結果物)をユーザに提供することが可能となる。 As described above, the operation screen processing means arranges the icons based on the past number of selections, so that the icons that are more likely to be selected by the user are arranged closer to the end point of the locus. (Result) can be provided to the user.
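 For instance, assuming the related information storage unit can be read as a simple mapping from each related item to its past selection count (a hypothetical simplification for illustration only), the priority order used for the placement above could be derived as follows:

    # Hypothetical contents of the related information storage unit: each related
    # item of the selected object carries the number of times its icon was chosen.
    selection_counts = {"print": 12, "attach to mail": 3, "rename": 7, "delete": 1}

    # Higher count -> higher priority -> placed closer to the trajectory end point.
    priorities = sorted(selection_counts, key=selection_counts.get, reverse=True)
    print(priorities)  # ['print', 'rename', 'attach to mail', 'delete']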
 Consequently, the final result desired by the user can be displayed in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 Alternatively, in the information processing apparatus of the present invention, the related information storage unit may store an attribute for each related item, and the operation screen processing means may place an icon of a related item closer to the end point of the trajectory the higher the similarity between that item's attribute and the attribute of the selected object.
 With this configuration, when the extracted icons are placed at the specific arrangement positions, the operation screen processing means places icons of related items whose attributes are more similar to those of the selected object closer to the end point of the trajectory.
 A related item whose attribute is highly similar to that of the selected object, that is, one of similar nature or classification, can be regarded as more closely related to, or more similar in nature to, the selected object.
 Users tend to select, view, and edit objects (icons) with similar (or identical) attributes one after another, rather than selecting objects (icons) whose attributes (such as classification or nature) have nothing in common.
 In other words, after an object has been selected by the preceding action, a related item (icon) whose attributes are similar to those of the object has a higher priority, that is, is more likely to be selected.
 Therefore, by having the operation screen processing means arrange the icons on the basis of attribute similarity to the object as described above, an operation screen (result) can be provided in which the icons more likely to be selected by the user are placed closer to the end point of the trajectory.
 Consequently, the final result desired by the user can be displayed in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 Furthermore, in the information processing apparatus of the present invention, the selected object and the related items may be photographs, the related information storage unit may store the shooting date and time of each photograph as an attribute, and the operation screen processing means may place an icon of a photograph closer to the end point of the trajectory the closer its shooting date and time is to that of the photograph that is the selected object.
 With this configuration, when the extracted photograph icons are placed at the specific arrangement positions, the operation screen processing means places icons of photographs whose shooting dates and times are closer to that of the selected photograph closer to the end point of the trajectory.
 In other words, after a photograph has been selected by the preceding action, photographs with similar attributes (here, shooting dates and times close to that of the selected photograph) have a higher priority, that is, are more likely to be selected.
 Therefore, by having the operation screen processing means arrange the photograph icons on the basis of the similarity of shooting dates and times as described above, an operation screen (result) can be provided in which the photographs more likely to be selected by the user are placed closer to the end point of the trajectory.
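 A minimal sketch of this ordering, assuming each candidate photograph simply carries a shooting timestamp (the file names and dates below are illustrative only):

    from datetime import datetime

    def order_by_shooting_time(selected_shot_at, candidates):
        # Candidates whose shooting date/time is closest to the selected photo
        # come first, i.e. are placed nearest the trajectory end point.
        return sorted(candidates,
                      key=lambda c: abs((c[1] - selected_shot_at).total_seconds()))

    selected = datetime(2012, 7, 7, 10, 30)
    candidates = [
        ("IMG_0012", datetime(2012, 7, 7, 10, 45)),
        ("IMG_0003", datetime(2012, 5, 1, 9, 0)),
        ("IMG_0010", datetime(2012, 7, 6, 18, 20)),
    ]
    print([name for name, _ in order_by_shooting_time(selected, candidates)])
    # ['IMG_0012', 'IMG_0010', 'IMG_0003']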
 Consequently, the final result desired by the user can be displayed in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 Alternatively, in the information processing apparatus of the present invention, the related items may be video content, the related information storage unit may store, as an attribute, a recommendation degree indicating how strongly each piece of video content is recommended to the user, and the operation screen processing means may place an icon of video content closer to the end point of the trajectory the higher its recommendation degree.
 With this configuration, when the extracted video content icons are placed at the specific arrangement positions, the operation screen processing means places icons of video content whose associated recommendation degree is higher closer to the end point of the trajectory.
 It is natural for users to tend to select recommended video content, and video content with a higher recommendation degree can be regarded as having a higher priority, that is, as being more likely to be selected by the user.
 Therefore, by having the operation screen processing means arrange the icons on the basis of the recommendation degree of the video content as described above, an operation screen (result) can be provided in which the icons more likely to be selected by the user are placed closer to the end point of the trajectory.
 Consequently, the final result desired by the user can be displayed in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 Alternatively, in the information processing apparatus of the present invention, it is preferable that the trajectory acquisition means acquires the trajectory of the indicator moved across the screen of the display unit so as to enclose an object displayed on the display unit, that the object specifying means specifies, as the selected object, an object at least partially contained in the region enclosed by the trajectory, and that the operation screen processing means arranges the icons of the extracted related items side by side on the contour of a ring.
 With this configuration, the trajectory acquisition means acquires the trajectory of the enclosing action, and on the basis of this trajectory the object specifying means identifies the object the user selected by enclosing it. The related item extraction means then extracts the items related to the identified object. Because the related information storage unit stores each object in association with its related items, every related item extracted by the related item extraction means is relevant to the object selected by the user. Finally, the operation screen processing means arranges the icons of the extracted related items in the shape of a ring, which is readily associated with the enclosing action. Even when arranging the icons on the contour of the ring, the operation screen processing means refers to the distances between the specific arrangement positions on that contour and the end point of the trajectory, and decides to place icons with higher priority closer to the end point.
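 One possible way to realize such a ring layout, sketched here under the assumption that the ring is a circle centred on the selected object (the patent also allows other ring shapes), is to generate evenly spaced points on the circle and hand the point nearest the trajectory end point to the highest-priority icon:

    import math

    def ring_slots(center, radius, n):
        # n arrangement positions evenly spaced on the contour of a ring (circle)
        # centred on the selected object.
        return [(center[0] + radius * math.cos(2 * math.pi * k / n),
                 center[1] + radius * math.sin(2 * math.pi * k / n))
                for k in range(n)]

    def assign_to_ring(items_by_priority, center, radius, end_point):
        # Highest-priority item goes to the ring point nearest the trajectory end
        # point, the next item to the next-nearest point, and so on.
        slots = ring_slots(center, radius, len(items_by_priority))
        slots.sort(key=lambda p: math.hypot(p[0] - end_point[0], p[1] - end_point[1]))
        return list(zip(items_by_priority, slots))

    # Selected object at (160, 240); the enclosing stroke ended at (150, 300).
    for item, (x, y) in assign_to_ring(["A", "B", "C", "D"], (160, 240), 80, (150, 300)):
        print(item, round(x), round(y))
    # "A" is placed at (160, 320), the ring point nearest the end of the stroke.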
 The icons are thus arranged as described above, and the generated operation screen is presented to the user. In this way, in response to the extremely natural and simple user action of "enclosing" an object with an indicator (a pen, a finger, a cursor controlled with a mouse, or the like) to designate it, the information processing apparatus of the present invention can provide the user with an operation screen on which the icons of the related items of the selected object are arranged in the shape of a ring.
 Since the trajectory of an "enclosing" action has a shape that surrounds something, the shape of the ring can be said to resemble the trajectory of the indicator's movement obtained by the enclosing action. The result (icons arranged in a ring) is therefore readily associated with the user's preceding enclosing action.
 In other words, the transition from the event "the user performs an action of enclosing an object" to the event "an operation screen with icons arranged in a ring is displayed" is a natural flow that does not run counter to the user's intuition.
 A menu list in which the icons of related items are arranged in a ring also has the following advantage over a linear, one-dimensional menu list. In a one-dimensional menu list, the icons are arranged, for example, from top to bottom or from left to right, so an unintended order of precedence is imposed on the icons according to their positions. With a ring-shaped menu list, by contrast, all the icons arranged on the ring can be treated equally.
 For these reasons, the information processing apparatus of the present invention can display the final result desired by the user in an even more natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 In the information processing apparatus of the present invention, it is preferable that the operation screen processing means determines the position and size of the ring so that the icons are arranged around the selected object.
 With this configuration, in response to the extremely natural and simple user action of "enclosing" an object to designate it, the information processing apparatus can output, as its result, icons arranged around that object.
 The user obtains, as the result, an operation screen on which the icons of the related items are arranged so as to surround the object that he or she previously enclosed and selected. The positional relationship between these icons and the object matches the positional relationship between the object and the trajectory of the indicator's movement produced by the user's preceding action. Moreover, the trajectory of the indicator's movement obtained by enclosing the object resembles the shape of the ring on which the icons are arranged.
 In other words, the transition from the event "the user performs an action of enclosing an object" to the event "an operation screen with icons arranged around the object is displayed" is an even more natural flow that does not run counter to the user's intuition.
 Furthermore, a menu list in which the icons of related items are arranged in a ring around the object has the following advantages over a linear, one-dimensional menu list. In a one-dimensional menu list, the icons are arranged, for example, from top to bottom or from left to right, so an unintended order of precedence is imposed on the icons according to their positions. With a ring-shaped menu list, by contrast, all the icons arranged on the ring can be treated equally. In addition, even if a one-dimensional menu list is displayed near the previously selected object, it is difficult to express the relationship between the object and each icon. If a ring-shaped menu list is displayed around the previously selected object, on the other hand, the user can naturally recognize that the previously selected (enclosed) object and the icons surrounding it are related.
 For these reasons, the information processing apparatus of the present invention can display the final result desired by the user in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 In the information processing apparatus of the present invention, it is preferable that the operation screen processing means determines, as the shape of the ring, the trajectory acquired by the trajectory acquisition means, or a shape similar or approximate to it.
 With this configuration, the user encloses an object freehand in an arbitrary shape, and the trajectory of that action is held by the trajectory acquisition means. When creating the operation screen, the operation screen processing means then arranges the icons so as to surround a predetermined region (or the object itself) along the contour of a ring that is identical or similar to the trajectory obtained in this way.
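 A sketch of one way to obtain such a ring from the user's freehand stroke (an assumption about the implementation, not a requirement of the invention) is to resample the recorded trajectory at equal arc-length intervals and use the resulting points as the icon arrangement positions:

    import math

    def resample_closed_path(points, n):
        # Resample a closed freehand trajectory into n points spaced evenly along
        # its length, so that icons can be placed "as the user drew".
        pts = points + [points[0]]                       # close the path
        seg = [math.hypot(pts[i + 1][0] - pts[i][0],
                          pts[i + 1][1] - pts[i][1]) for i in range(len(pts) - 1)]
        total = sum(seg)
        out, acc, i = [], 0.0, 0
        for k in range(n):
            target = total * k / n                       # arc length of the k-th slot
            while i < len(seg) - 1 and acc + seg[i] < target:
                acc += seg[i]
                i += 1
            t = 0.0 if seg[i] == 0 else (target - acc) / seg[i]
            out.append((pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                        pts[i][1] + t * (pts[i + 1][1] - pts[i][1])))
        return out

    # A roughly square "enclosing" stroke; place eight icons along the same shape.
    stroke = [(100, 100), (220, 100), (220, 220), (100, 220)]
    print([(round(x), round(y)) for x, y in resample_closed_path(stroke, 8)])
    # [(100, 100), (160, 100), (220, 100), (220, 160),
    #  (220, 220), (160, 220), (100, 220), (100, 160)]

 The slots produced this way can then be ordered by distance from the stroke's end point exactly as in the earlier sketches.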
 Thus, in response to the extremely natural and simple user action of "enclosing" an object to designate it, the apparatus can output icons arranged in a ring. That is, the user obtains, as the result, an operation screen on which the icons of the related items are arranged so as to surround the predetermined region (or the object itself).
 Furthermore, the ring on which the icons are arranged is displayed on the operation screen in a shape that matches, or is similar to, the trajectory of the operating body's movement obtained by enclosing the object.
 In other words, when the user encloses an object, an operation screen is obtained on which the icons are arranged "just as the user enclosed the object". This transition of events can be said to be an even more natural flow that does not run counter to the user's intuition.
 Moreover, since the icons are arranged in the very shape the user drew, the user can obtain an operation screen with the icons arranged in a desired shape by enclosing the object in that shape. This adds an element of playfulness to displaying the operation screen and operating the information processing apparatus.
 In addition, the user can predict the arrangement of the icons, enclose the object exactly as he or she wishes, and have the icons of the related items displayed accordingly, which further improves operability.
 For these reasons, the information processing apparatus of the present invention can display the final result desired by the user in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 Furthermore, in the information processing apparatus of the present invention, the operation screen processing means may place the icon of the related item with the highest priority at the point on the contour of the ring closest to the end point of the trajectory, and arrange the icons of the remaining related items on the contour of the ring with reference to the position of that highest-priority icon.
 With this configuration, the operation screen processing means can always place the icon ranked first in priority at the point on the contour of the ring at the shortest distance from the end point of the trajectory. That is, the icon most likely to be selected by the user can be displayed closest to the position of the user's indicator (such as a finger), so that, with high probability, the user can select the desired icon without moving the indicator very far.
 Alternatively, in the information processing apparatus of the present invention, the trajectory acquisition means may acquire the trajectory of the indicator produced during a predetermined period before the movement of the indicator that selects an object displayed on the display unit, and the operation screen processing means may, when it determines that the trajectory acquired during the predetermined period is biased toward a specific region of the screen of the display unit, determine the positions of the icons so that the icons are arranged in that specific region.
 With this configuration, the trajectory acquisition means acquires trajectories not only for the action of selecting an object but also for actions that occurred during a past predetermined period. From the acquired trajectories, the operation screen processing means can then determine at which positions on the display unit actions (movements of the indicator) occurred during that period; if the trajectories of the indicator are concentrated in a specific region of the display unit, it can thereby detect a bias in the positions to which the indicator is moved.
 Such a bias in the movement positions suggests that the apparatus is being used under particular circumstances in which only that region can be touched (or in which other regions are hard to touch, or the indicator cannot be moved there).
 The operation screen processing means therefore determines the position of the ring so that the icons are arranged in the region where the bias was detected (that is, in the limited region within which the indicator can be moved).
 As a result, the icons are displayed in a region the user can reach, so that when the user next performs an action of selecting an icon, the desired icon can be selected immediately from within that reachable region.
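 The bias check itself could be as simple as the following sketch, which counts recent contact points per screen quadrant and reports a quadrant only when most of the points fall inside it (the quadrant split and the 70% threshold are illustrative assumptions, not part of the invention):

    def biased_region(recent_points, screen_w, screen_h, threshold=0.7):
        # Return the screen quadrant containing at least `threshold` of the recent
        # touch points, or None if no such bias is detected.
        quadrants = {"upper-left": 0, "upper-right": 0, "lower-left": 0, "lower-right": 0}
        for x, y in recent_points:
            vert = "upper" if y < screen_h / 2 else "lower"
            horiz = "left" if x < screen_w / 2 else "right"
            quadrants[f"{vert}-{horiz}"] += 1
        name, count = max(quadrants.items(), key=lambda kv: kv[1])
        return name if recent_points and count / len(recent_points) >= threshold else None

    # Touch points gathered over the last few minutes of one-handed use.
    touches = [(60, 700), (80, 650), (55, 720), (90, 690), (300, 200), (70, 710)]
    print(biased_region(touches, screen_w=480, screen_h=800))  # 'lower-left'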
 A more detailed description using a specific example is as follows. When the device is operated with one hand, for example, the contact positions tend to be biased toward the lower-left region of the touch panel screen (when operated with the left hand) or the lower-right region (when operated with the right hand). If, while the user is using the information processing apparatus in such a situation, an object or icon that requires a contact action is displayed at the top of the screen or at the bottom of the screen on the side opposite the operating hand, the operation becomes cumbersome: the user cannot touch the target object immediately and must either perform the extra action of dragging it into the reachable area or switch to two-handed operation.
 The information processing apparatus of the present invention can solve this problem by inferring the user's usage situation with the above configuration. That is, the information processing apparatus of the present invention can be configured to detect the bias in the contact positions of the operating body and to arrange the icons within the region that the user's indicator is presumed to reach immediately.
 Accordingly, when the user operates the information processing apparatus with one hand, the icons are displayed so as to fit within the lower-left (or lower-right) region of the touch panel screen, so the user can select the desired icon immediately, without having to drag it within reach during one-handed operation.
 Alternatively, in the information processing apparatus of the present invention, the operation screen processing means may determine the positions of the icons so that the icons are arranged in a specific region of the screen of the display unit that overlaps the trajectory of the movement of the indicator that selects the object.
 With this configuration, the icons are arranged at or near the position the user selected with an indicator such as a finger. The position where the user selected the object can be regarded as a region reachable by the user, so the icons can reliably be displayed within the reachable region.
 In the information processing apparatus of the present invention, the input unit and the display unit of the apparatus may constitute a touch panel, and the trajectory acquisition means may acquire the trajectory of the movement of the indicator across the touch panel.
 With this configuration, the relevance between the contact action the user performs to select an object and the result obtained in response to that action can be increased, and the operation screen can be provided in a natural flow that does not run counter to the user's intuition. As a result, excellent operability can be achieved in an information processing apparatus provided with a touch panel.
 In the information processing apparatus of the present invention, the input unit of the apparatus may be one that inputs to the apparatus an instruction to move a cursor displayed on the display unit, and the trajectory acquisition means may acquire the trajectory of the movement of the cursor serving as the indicator.
 With this configuration, the relevance between the input action the user performs by operating the input unit to select an object and the result obtained in response to that action can be increased, and the operation screen can be provided in a natural flow that does not run counter to the user's intuition. As a result, excellent operability can be achieved in an information processing apparatus provided with a display unit and an input unit.
 In order to solve the above problems, an operation screen display method of the present invention is an operation screen display method for an information processing apparatus, including: a trajectory acquisition step of acquiring a trajectory along which an indicator pointing at a position on a screen of a display unit of the information processing apparatus has moved; an object specifying step of specifying, as a selected object, an object at least partially overlapping a selection region identified on the basis of the trajectory acquired in the trajectory acquisition step; a related item extraction step of extracting, as related items, the items associated with the object specified in the object specifying step, by referring to a related information storage unit that stores objects in association with items related to those objects; and an operation screen processing step of arranging icons of the related items extracted in the related item extraction step at specific positions and displaying them on the display unit, wherein, in the operation screen processing step, among the extracted related items, an icon of a related item with a higher priority is placed closer to the end point of the trajectory acquired in the trajectory acquisition step.
 With this method, the final result desired by the user can be displayed in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
 The information processing apparatus may be realized by a computer. In that case, a control program for the information processing apparatus that realizes the information processing apparatus on a computer by causing the computer to operate as each of the above means, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
 In order to solve the above problems, an information processing apparatus according to the present invention is an information processing apparatus provided with a touch panel, including: trajectory acquisition means for acquiring a trajectory along which an indicator pointing at a position on a screen of a display unit has moved; object specifying means for specifying, as a selected object, an object at least partially overlapping a selection region identified on the basis of the trajectory acquired by the trajectory acquisition means; related item extraction means for extracting, as related items, the items associated with the object specified by the object specifying means, by referring to a related information storage unit that stores objects in association with items related to those objects; and operation screen processing means for arranging icons of the related items extracted by the related item extraction means at specific positions and displaying them on the display unit, wherein the operation screen processing means arranges the icons of the extracted related items such that an icon of a related item with a higher priority is placed closer to the end point of the trajectory acquired by the trajectory acquisition means.
 In order to solve the above problems, an operation screen display method of the present invention is an operation screen display method for an information processing apparatus, including: a trajectory acquisition step of acquiring a trajectory along which an indicator pointing at a position on a screen of a display unit of the information processing apparatus has moved; an object specifying step of specifying, as a selected object, an object at least partially overlapping a selection region identified on the basis of the trajectory acquired in the trajectory acquisition step; a related item extraction step of extracting, as related items, the items associated with the object specified in the object specifying step, by referring to a related information storage unit that stores objects in association with items related to those objects; and an operation screen processing step of arranging icons of the related items extracted in the related item extraction step at specific positions and displaying them on the display unit, wherein, in the operation screen processing step, among the extracted related items, an icon of a related item with a higher priority is placed closer to the end point of the trajectory acquired in the trajectory acquisition step.
 Consequently, the final result desired by the user can be displayed in a natural flow that does not run counter to the user's intuition, with simple actions and a small number of them. As a result, excellent operability can be achieved in an information processing apparatus provided with an input unit and a display unit.
A functional block diagram showing the configuration of the main parts of a tablet terminal according to one embodiment of the present invention.
A block diagram showing the hardware configuration of the tablet terminal according to one embodiment of the present invention.
A plan view showing the external appearance of the tablet terminal according to one embodiment of the present invention.
Diagrams illustrating how a user holds and operates the tablet terminal: (a) shows the tablet terminal held and operated with one hand, and (b) shows it held with one hand and operated with the other.
Diagrams illustrating the operation of the object specifying unit of the tablet terminal: (a) shows the user performing a contact action of "enclosing" an object in order to select it, (b) shows an example of the contact information generated by the contact information generation unit in response to the contact action in (a), and (c) shows an example of the map information of the video frame displayed on the display unit during the period t0 to tn in which contact was detected.
A diagram showing an example of the related information stored in the related information storage unit of the tablet terminal.
A diagram showing specific examples of icon images stored in the icon storage unit of the tablet terminal.
A diagram showing an example of the priorities assigned to the icons by the icon rank determination unit of the tablet terminal.
Diagrams illustrating a specific example of the operation of the icon arrangement determination unit of the tablet terminal: (a) shows a specific example of an icon arrangement pattern acquired by the icon arrangement determination unit, and (b) shows a specific example of the icon arrangement determined by the icon arrangement determination unit.
A diagram showing a specific example of the operation screen obtained as a result of the operation screen generation processing executed by the operation screen processing unit of the tablet terminal.
A flowchart showing the flow of operation screen display processing by the tablet terminal.
Diagrams illustrating the operation of the contact information generation unit and the object specifying unit of the tablet terminal: (a) shows the user performing a contact action of "checking" an object with a check mark in order to select it, and (b) shows an example of the contact information generated by the contact information generation unit in response to the contact action in (a).
A diagram showing another specific example of an icon arrangement pattern acquired by the icon arrangement determination unit of the tablet terminal.
A diagram showing another specific example of the operation screen obtained as a result of the operation screen generation processing executed by the operation screen processing unit of the tablet terminal.
A functional block diagram showing the configuration of the main parts of a tablet terminal according to another embodiment of the present invention.
Diagrams illustrating the processing performed by each part of the operation screen processing unit of the tablet terminal: (a) illustrates an example of object display processing by the operation screen processing unit, and (b) shows an example of an icon arrangement pattern specialized for a ring shape, determined by the ring shape determination unit of the operation screen processing unit.
Diagrams showing a specific example of the contact information stored in the contact information storage unit of the tablet terminal: (a) shows the user performing a contact action of "enclosing" an object in an arbitrary shape in order to select it, and (b) shows an example of the contact information generated by the contact information generation unit in response to the contact action in (a).
A diagram showing another example of an icon arrangement pattern determined by the ring shape determination unit of the tablet terminal.
A diagram showing another example of the arrangement of an object and a ring-shaped icon group on an operation screen generated by the operation screen processing unit of the tablet terminal.
A diagram showing another example of the arrangement of an object and a ring-shaped icon group on an operation screen generated by the operation screen processing unit of the tablet terminal.
Diagrams showing a variation of the method of displaying related item icons: (a) shows a ring of icons initially arranged small around the object, (b) shows the ring of icons in the middle of expanding, and (c) shows the ring of icons finally arranged large in the centre of the screen after (a) and (b).
Diagrams showing a variation of the method of displaying related item icons: (a) shows a specific example of contact information, (b) to (e) show a plurality of icons being displayed one after another at regular intervals, and (f) shows a specific example of the operation screen finally obtained.
Diagrams showing a variation of the method of displaying related item icons: (a) shows a specific example of contact information, and (b) to (f) show examples of the operation screen at times t0, ta, tb, tc, and tn, respectively.
A flowchart showing the flow of operation screen display processing by a tablet terminal according to another embodiment of the present invention.
A flowchart showing the flow of operation screen display processing by a tablet terminal according to another embodiment of the present invention.
Diagrams illustrating the operation of the tablet terminal of the present invention, which can present an operation screen suited to the user's usage situation: (a) illustrates an example of a situation in which the user is operating with the left hand, and (b) shows a specific example of the contact information generated in response to the contact action in (a).
A diagram showing an example of the operation screen when icons are arranged according to the ring shape determined by the ring shape determination unit of the operation screen processing unit.
(a) to (c) are diagrams showing specific examples of the association between icon arrangement positions and priorities performed by the icon arrangement determination unit of the tablet terminal.
(a) and (b) are diagrams illustrating how the user rotates the ring of icons by dragging within the reachable area.
A diagram showing a tablet terminal as the information processing apparatus of the present invention realized as a tablet PC.
(a) is a diagram showing a specific example of an icon arrangement pattern on which the trajectory of the indicator and the end point tn of the trajectory have been plotted by the icon arrangement determination unit of the tablet terminal, and (b) is a diagram showing the result of the icon arrangement determination unit associating a priority with each icon arrangement position.
A diagram showing a tablet terminal as the information processing apparatus of the present invention realized as a tablet PC.
(a) is a diagram showing a specific example of an icon arrangement pattern on which the trajectory of the indicator and the end point tn of the trajectory have been plotted by the icon arrangement determination unit of the tablet terminal, and (b) is a diagram showing the result of the icon arrangement determination unit associating a priority with each icon arrangement position.
A diagram showing an example of an icon arrangement pattern determined by the icon arrangement determination unit of the tablet terminal.
 Embodiment 1
 An embodiment of the present invention is described below with reference to FIGS. 1 to 14.
 In the embodiment described below, the information processing apparatus of the present invention is applied to a tablet terminal, as an example. In this embodiment, the tablet terminal is realized, for example, as a small, highly portable smartphone that can be operated with one hand.
 However, the information processing apparatus of the present invention is not limited to this example, and the present invention may be applied to information processing apparatuses of any size (for example, a notebook-sized tablet PC or an electronic whiteboard equipped with a large touch panel).
 [Hardware configuration of the tablet terminal]
 FIG. 2 is a block diagram showing the hardware configuration of the tablet terminal 100 according to this embodiment. As shown in FIG. 2, the tablet terminal 100 includes at least a control unit 10, an input unit 11, a display unit 12, and a storage unit 19. The tablet terminal 100 may further include an operation unit 13, an external interface 14, a communication unit 15, a wireless communication unit 16, an audio output unit 17, and an audio input unit 18 in order to realize its native functions.
 また、タブレット端末100がスマートフォンなどの多機能携帯通信端末である場合には、ここでは省略したが、タブレット端末100は、通話処理部、撮影を行う撮像部(レンズ・撮像素子など)、放送受像部(チューナ・復調部など)、GPS、および、センサ(加速度センサ、傾きセンサなど)他、スマートフォンが標準的に備えている各種部品を備えていてもよい。 When the tablet terminal 100 is a multi-function mobile communication terminal such as a smartphone, the tablet terminal 100 is omitted here. However, the tablet terminal 100 includes a call processing unit, an imaging unit (such as a lens / image sensor) that performs imaging, and a broadcast image. Other parts (such as a tuner / demodulation unit), GPS, and sensors (such as an acceleration sensor and an inclination sensor) may be included as well as various components that are typically included in a smartphone.
 入力部11は、ユーザがタブレット端末100を操作するための指示信号を、タッチパネルを介して入力するためのものである。入力部11は、指示体(表示部12の画面位置を指示するもの、ここでは、例えば、指またはペンなど)の接触を受け付けるタッチ面と、指示体とタッチ面との間の接触/非接触(接近/非接近)、および、その接触(接近)位置を検知するためのタッチセンサとで構成されている。タッチセンサは、指示体とタッチ面との接触/非接触を検知できればどのようなセンサで実現されていてもかまわない。例えば、圧力センサ、静電容量センサ、光センサなどで実現される。 The input unit 11 is for inputting an instruction signal for the user to operate the tablet terminal 100 via the touch panel. The input unit 11 is a touch surface that accepts contact with an indicator (indicating the screen position of the display unit 12, here, for example, a finger or a pen), and contact / non-contact between the indicator and the touch surface. (Approach / non-approach) and a touch sensor for detecting the contact (approach) position. The touch sensor may be realized by any sensor as long as it can detect contact / non-contact between the indicator and the touch surface. For example, it is realized by a pressure sensor, a capacitance sensor, an optical sensor, or the like.
The display unit 12 displays objects processed by the tablet terminal 100 (any display target, such as icons) and the results of that processing, and displays operation screens for the user to operate the tablet terminal 100 as GUI (Graphical User Interface) screens. The display unit 12 is realized by a display device such as an LCD (Liquid Crystal Display).
In the present embodiment, the input unit 11 and the display unit 12 are formed integrally, and together they constitute the touch panel. Therefore, in such an embodiment, the object that the user moves (operates) to point at a screen position, that is, the operation body (here, a finger or a pen), is at the same time the indicator that points to a position on the screen of the display unit 12.
For example, when the touch panel of the tablet terminal 100 of the present invention is realized by a projected capacitive touch panel, the touch sensor is, specifically, a matrix-shaped pattern of transparent electrodes made of ITO (Indium Tin Oxide) or the like formed on a transparent substrate such as glass or plastic. When an indicator (such as the user's finger or a pen) touches or approaches the touch sensor, the capacitance of the nearby transparent electrode patterns changes. The control unit 10 can therefore detect the position that the indicator touched or approached by detecting the change in the current or voltage of the transparent electrode patterns.
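To make this detection step concrete, the following is a minimal sketch, not part of the original disclosure, of how a touch position might be estimated from a matrix of capacitance changes; the names (estimate_touch_position, threshold) and the 3x3 centroid refinement are assumptions for illustration only.

```python
# Minimal sketch (not from the patent): estimating a touch position from a
# matrix of capacitance changes measured on the transparent electrode grid.
from typing import List, Optional, Tuple

def estimate_touch_position(
    baseline: List[List[float]],   # capacitance with no touch, per grid cell
    measured: List[List[float]],   # current capacitance, per grid cell
    threshold: float = 5.0,        # minimum change treated as a touch (assumed unit)
) -> Optional[Tuple[float, float]]:
    """Return an (x, y) position in grid coordinates, or None if no cell exceeds the threshold."""
    best, best_delta = None, threshold
    for y, row in enumerate(measured):
        for x, value in enumerate(row):
            delta = abs(value - baseline[y][x])
            if delta > best_delta:
                best, best_delta = (x, y), delta
    if best is None:
        return None
    # Refine with a weighted average over the 3x3 neighborhood (centroid),
    # which gives sub-cell resolution.
    bx, by = best
    sx = sy = total = 0.0
    for y in range(max(0, by - 1), min(len(measured), by + 2)):
        for x in range(max(0, bx - 1), min(len(measured[0]), bx + 2)):
            d = abs(measured[y][x] - baseline[y][x])
            sx, sy, total = sx + d * x, sy + d * y, total + d
    return (sx / total, sy / total)
```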
In the following, the term "contact" in expressions such as "detecting contact", "contact operation", and "contact position" includes not only the state in which the indicator and the touch surface are actually touching, but also the state in which the indicator and the touch surface are close enough to each other for the touch sensor to detect.
The operation unit 13 is for the user to input instruction signals directly to the tablet terminal 100. The operation unit 13 is realized by an appropriate input mechanism such as a button, a switch, a key, or a jog dial; for example, the operation unit 13 is a switch for turning the power of the tablet terminal 100 on and off.
The external interface 14 is an interface for connecting external devices to the tablet terminal 100. The external interface 14 is realized by, for example, but not limited to, a socket for inserting an external recording medium (such as a memory card), an HDMI (High Definition Multimedia Interface) terminal, a USB (Universal Serial Bus) terminal, or the like. The control unit 10 of the tablet terminal 100 can exchange data with external devices via the external interface 14.
The communication unit 15 communicates with external devices via a communication network. The communication unit 15 connects to various communication terminals via the communication network and realizes transmission and reception of data between the tablet terminal 100 and those communication terminals. Furthermore, when the tablet terminal 100 is a mobile communication terminal such as a smartphone, the communication unit 15 transmits and receives voice call data, e-mail data, and the like to and from other devices via a mobile phone network.
The wireless communication unit 16 communicates wirelessly with external devices. The wireless communication unit 16 is not particularly limited, and may realize any one of infrared communication such as IrDA or IrSS, Bluetooth communication, WiFi communication, and non-contact IC card communication, or may realize several of these means. The control unit 10 of the tablet terminal 100 can communicate with devices in the vicinity of the tablet terminal 100 via the wireless communication unit 16 and exchange data with those devices.
The audio output unit 17 outputs audio data processed by the tablet terminal 100 as sound, and is realized by a speaker, a headphone terminal and headphones, or the like.
The audio input unit 18 accepts input of sound generated outside the tablet terminal 100, and is realized by a microphone or the like.
The storage unit 19 stores (1) a control program executed by the control unit 10 of the tablet terminal 100, (2) an OS program, (3) application programs with which the control unit 10 executes the various functions of the tablet terminal 100, and (4) various data read out when those application programs are executed. It also stores (5) data used in computation and the results of computation while the control unit 10 executes the various functions. For example, the data of (1) to (4) above are stored in a non-volatile storage device such as a ROM (Read Only Memory), flash memory, EPROM, EEPROM, or HDD (Hard Disc Drive), whereas the data of (5) is stored in a volatile storage device such as a RAM (Random Access Memory). Which data is stored in which storage device is determined as appropriate from the purpose of use of the tablet terminal 100, convenience, cost, physical constraints, and the like.
The control unit 10 performs overall control of the units included in the tablet terminal 100. The control unit 10 is realized by, for example, a CPU (Central Processing Unit), and the functions of the tablet terminal 100 are realized by the CPU serving as the control unit 10 reading programs stored in the ROM or the like into the RAM or the like and executing them. The various functions realized by the control unit 10 (in particular, the operation screen display function of the present invention) will be described later with reference to other drawings.
[Appearance of the tablet terminal]
FIG. 3 is a plan view showing the appearance of the tablet terminal 100. As illustrated in FIG. 3, the tablet terminal 100 includes the input unit 11 and the display unit 12 as a touch panel. Although they are not essential components, the tablet terminal 100 is also provided with the operation unit 13, the external interface 14, the wireless communication unit 16, the audio output unit 17, the audio input unit 18, and so on. For example, when the wireless communication unit 16 is realized by infrared communication means, an infrared transmitting/receiving unit is provided as the wireless communication unit 16 on a side surface of the tablet terminal 100.
FIG. 4 is a diagram illustrating how the user holds and operates the tablet terminal 100. More specifically, (a) of FIG. 4 illustrates the tablet terminal 100 being held with one hand and operated with that same hand, and (b) of FIG. 4 illustrates the tablet terminal 100 being held with one hand and operated with the other hand.
In the present embodiment, the tablet terminal 100 is a palm-sized information processing apparatus that can be held with one hand. As shown in (a) of FIG. 4, the user can operate the touch surface of the input unit 11 with the thumb of the hand holding the tablet terminal 100. For example, when the icon to be operated is located where the thumb cannot reach, the user can draw the icon toward the thumb with a flick operation and then select the icon by surrounding it or tapping it with the thumb.
Alternatively, as shown in (b) of FIG. 4, the user may hold the tablet terminal 100 with one hand and operate the touch surface of the input unit 11 with a finger of the other hand. Or, although not illustrated, the user may hold the tablet terminal 100 sideways in landscape orientation with both hands and operate the touch surface of the input unit 11 with both thumbs.
[Functions of the tablet terminal]
Next, the functional configuration of the tablet terminal 100 will be described. FIG. 1 is a functional block diagram illustrating the main configuration of the tablet terminal 100 according to the present embodiment.
As shown in FIG. 1, the control unit 10 of the tablet terminal 100 according to the present embodiment includes, as functional blocks for realizing the operation screen display function of the present invention, at least a contact information generation unit 21, an object specifying unit 22, a related item extraction unit 23, and an operation screen processing unit 24. In more detail, the operation screen processing unit 24 includes an icon rank determination unit 31 and an icon arrangement determination unit 33.
Each of the functional blocks of the control unit 10 described above can be realized by a CPU (Central Processing Unit) reading a program stored in a non-volatile storage device realized by a ROM (Read Only Memory) or the like into a RAM (Random Access Memory) or the like (not shown) and executing it.
The storage unit 19 includes, as storage units from which the above units of the control unit 10 read data and to which they write data when executing the operation screen display function, specifically, a frame map storage unit 41, a related information storage unit 42, an icon storage unit 43, and a contact information storage unit 44.
The contact information generation unit 21 processes the signals output from the touch sensor of the input unit 11 and generates contact information. The contact information includes at least contact coordinate information indicating the coordinates of the contact position of the indicator (for example, a finger). From the contact information, each unit of the control unit 10 can therefore obtain the trajectory of the movement of the indicator. In the present embodiment, contact time information indicating the time at which the contact occurred (movement time information indicating the movement time of the indicator) may further be associated with each point constituting the trajectory as necessary.
The contact information storage unit 44 stores the contact information generated by the contact information generation unit 21. The contact information may be temporarily stored in a storage unit (such as a cache, not shown) so that the object specifying unit 22 can use it immediately. Furthermore, in the present embodiment, the contact information is stored in the contact information storage unit 44 so that it can be used when the operation screen processing unit 24 and its sub-units execute the operation screen generation processing (including the processing for displaying icons). Whether the contact information storage unit 44 is realized by a non-volatile storage device, that is, whether this contact information is stored in a non-volatile manner, is determined as appropriate from the purpose and assumed usage environment of the operation screen display function executed by the operation screen processing unit 24, or from the purpose of use of the tablet terminal 100 itself, convenience, cost, physical constraints, and the like.
Describing in more detail the procedure by which the contact information generation unit 21 generates contact information: from when the touch sensor of the input unit 11 detects contact between the touch surface and the indicator (in the present embodiment, a finger) until it detects non-contact, the contact information generation unit 21 acquires the signals output from the touch sensor. These signals include an indication that "contact" has been detected and information indicating the contact position, and based on these signals the contact information generation unit 21 generates contact coordinate information indicating the contact position as coordinates. Furthermore, the contact information generation unit 21 measures the time from when contact is detected until it becomes non-contact, and associates the contact time information with the contact coordinate information. The contact information generation unit 21 may acquire and use the absolute time held by a clock unit mounted on the tablet terminal 100, but in the present embodiment the contact information generation unit 21 starts timing when contact is detected and obtains relative contact time information. For example, the contact information generation unit 21 may measure elapsed time with the moment the contact is first detected (t0) taken as 0.00 seconds, continue the measurement until the moment the contact is last detected (tn), and thereby acquire relative contact time information corresponding to each contact position. The contact information generation unit 21 associates the obtained contact time information with the contact coordinate information to generate the contact information. In the present embodiment, the generated contact information is supplied to and used by the object specifying unit 22. Also, in the present embodiment, the contact information generation unit 21 acquires at least the coordinate information of the position where the contact operation ended, and generates the contact information including this information. The contact information generation unit 21 may further acquire the coordinate information of the position where the contact operation started, and may associate contact time information with each of the start and end positions of the contact operation. The contact information generated in this way is supplied to the operation screen processing unit 24 and used by the operation screen processing unit 24 and the units included in it. For example, from contact information having the above data structure, the operation screen processing unit 24 can recognize at least the position of the end point of the trajectory.
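As an illustration of the data structure described above, the following minimal sketch (not part of the original disclosure; the names ContactInfo and build_contact_info are assumptions) pairs contact coordinate information with relative contact time information and exposes the start and end points of the trajectory.

```python
# Minimal sketch (names are assumptions, not the patent's API): building contact
# information as a trajectory of (x, y) points with relative timestamps.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ContactInfo:
    points: List[Tuple[float, float]] = field(default_factory=list)  # contact coordinates
    times: List[float] = field(default_factory=list)                 # seconds since t0

    @property
    def start(self) -> Tuple[float, float]:   # position at t0
        return self.points[0]

    @property
    def end(self) -> Tuple[float, float]:     # position at tn (end point of the trajectory)
        return self.points[-1]

def build_contact_info(samples: List[Tuple[float, float, float]]) -> ContactInfo:
    """samples: (time, x, y) tuples reported while contact is detected."""
    info = ContactInfo()
    if not samples:
        return info
    t0 = samples[0][0]                      # timing starts when contact is first detected
    for t, x, y in samples:
        info.points.append((x, y))
        info.times.append(t - t0)           # relative contact time information
    return info
```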
The object specifying unit 22 identifies the object selected by the user's contact operation. The object specifying unit 22 compares the contact information generated by the contact information generation unit 21 with the map information of the video frame that was displayed on the display unit 12 while the contact was occurring. In this way, the object specifying unit 22 can identify, among the objects being displayed on the display unit 12, the object pointed to by the contact operation.
The frame map storage unit 41 stores the map information of the video frame that was being output to the display unit 12 at the time of contact. The map information is information indicating the layout of the video frame displayed on the touch panel. Specifically, the map information includes, for each displayed object, information that individually identifies it and information on its shape, size, and display position. In other words, the map information is each object plotted in correspondence with the coordinate system of the touch panel.
FIG. 5 is a diagram explaining the operation of the object specifying unit 22. More specifically, (a) of FIG. 5 shows the user performing the contact operation of "surrounding" an object in order to select the target object, (b) of FIG. 5 shows an example of the contact information generated by the contact information generation unit 21 in response to the contact operation shown in (a), and (c) of FIG. 5 shows an example of the map information of the video frame displayed on the display unit 12 during the period from t0 to tn in which the contact was detected.
As shown in (a) of FIG. 5, assume that the user performs the contact operation of "surrounding" one of the objects 80 (here, photographs) displayed on the touch panel of the tablet terminal 100, and that the contact operation is performed during the period from t0 to tn such that the contact point passes along the position of the broken line in the figure.
The object specifying unit 22 acquires contact information such as that shown in (b) of FIG. 5 from the contact information generation unit 21. In the present embodiment, the coordinate system of the contact information corresponds to the coordinate system of the touch panel of the tablet terminal 100, with the upper-left corner of the panel as the origin. In (b) of FIG. 5, only the start point t0 and the end point tn of the trajectory of the user's "surrounding" contact operation are shown, but contact time information may also be associated with each point in between.
The object specifying unit 22 acquires from the frame map storage unit 41 the map information shown in (c) of FIG. 5 (that is, the layout of the video frame displayed on the display unit 12 during the period from t0 to tn). The object specifying unit 22 then compares the contact information with the map information and identifies, as the selected object, the object 80 that completely or mostly overlaps the region (selection region) defined by the trajectory of the user's finger obtained from the contact information. In the example shown in FIG. 5, the object specifying unit 22 identifies "Photo 1" in (c) of FIG. 5 as the selected object, and supplies information on the identified object to the related item extraction unit 23.
Regarding the selection region defined by the trajectory, when the trajectory is a closed curve, the object specifying unit 22 may take the region enclosed by the trajectory (the hatched region inside the thick frame in (c) of FIG. 5) as the selection region, or it may take the interior of the bounding rectangle of the trajectory as the selection region whether or not the trajectory is a closed curve. Based on the trajectory, the object specifying unit 22 obtains a particular selection region according to a predetermined rule, which is not limited to the above examples.
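A minimal sketch of this comparison step follows. It is not the patent's implementation; it assumes the bounding-rectangle rule mentioned above, a frame_map of object rectangles standing in for the map information, and an arbitrary 80% coverage threshold for "completely or mostly overlaps".

```python
# Minimal sketch (assumed data shapes, not the patent's implementation): identifying
# the selected object by overlapping the trajectory's bounding rectangle with the
# object layout (map information) of the displayed frame.
from typing import Dict, List, Optional, Tuple

Rect = Tuple[float, float, float, float]  # (left, top, right, bottom)

def bounding_rect(points: List[Tuple[float, float]]) -> Rect:
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def overlap_area(a: Rect, b: Rect) -> float:
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

def identify_selected_object(
    trajectory: List[Tuple[float, float]],
    frame_map: Dict[str, Rect],          # object id -> display rectangle
    min_coverage: float = 0.8,           # "completely or mostly overlaps" (assumed value)
) -> Optional[str]:
    selection = bounding_rect(trajectory)
    best_id, best_cov = None, 0.0
    for obj_id, rect in frame_map.items():
        obj_area = (rect[2] - rect[0]) * (rect[3] - rect[1])
        cov = overlap_area(selection, rect) / obj_area if obj_area else 0.0
        if cov > best_cov:
            best_id, best_cov = obj_id, cov
    return best_id if best_cov >= min_coverage else None
```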
(a) to (c) of FIG. 5 show, as one example, how the object is identified when the contact operation for selecting an object is "surrounding", but the configuration of the tablet terminal 100 of the present invention is not limited to this. For example, when pointing at a desired object, the user may select the object simply by a "tap" contact operation (either a single tap or a double tap), by a "check" contact operation that marks the object with a check mark (a tick, a cross, or the like), by a "flick" contact operation that sweeps across the object diagonally downward, or by a "long press" contact operation that keeps pressing the object for a predetermined period. The tablet terminal 100 may assign any of the above contact operations as the "operation for selecting an object", and the object specifying unit 22 is configured to appropriately identify the object pointed to by the user in accordance with the assigned contact operation.
The related item extraction unit 23 extracts related items that are related to the object identified by the object specifying unit 22, that is, to the object selected by the user. When an object is selected, items closely related to the selected object are extracted by the related item extraction unit 23.
For example, when the object is data such as a "photograph", operations such as "display", "edit", "attach to e-mail and send", "transfer to a peripheral device (a television or the like)", and "print" are expected to be performed on the photograph. Accordingly, items that stand in the relationship of an "operation" performed on the object serving as the "operation target" may be extracted as related items of the object.
Alternatively, when the object is data such as a "photograph", it is expected that the photograph will be sent to a particular person. Accordingly, items that stand in the relationship of the "operation partner" when an operation is performed on the object serving as the operation target may be extracted as related items.
Alternatively, when the object is an "album" or a "folder" containing a plurality of photographs or other data, the user presumably wants the photographs or data contained in that object. In this way, items belonging to a layer below the object may be extracted as related items.
The related information storage unit 42 stores related information indicating the relationships between objects and items. FIG. 6 shows an example of the related information stored in the related information storage unit 42.
As shown in FIG. 6, the related information associates at least "related items" with each "object". Through this association, the related information indicates the relationship between objects and items.
When an object is identified by the object specifying unit 22, the related item extraction unit 23 refers to the related information stored in the related information storage unit 42 and extracts, as related items, the items associated with the identified object.
For example, as shown in (a) to (c) of FIG. 5, assume that the object specifying unit 22 has identified the selected object as "Photo 1". In this case, since "Photo 1" is classified as the object type "photograph", the related item extraction unit 23 extracts from the related information the related item group 60 associated with the object "photograph".
The information on the related items extracted by the related item extraction unit 23 is supplied to the operation screen processing unit 24. The extracted related items are then displayed in a selectable form (for example, as icons) as items related to the previously selected object.
Although the configuration is not limited to this, in the present embodiment an icon may further be assigned to each "related item", as shown in FIG. 6. For example, the related item "display on the television (transfer to the television)" associated with the object "photograph" is associated with the icon "1: television". The icon "1: television" is, for example, an icon depicting an illustration of a television, and is preferably a picture that calls to mind "sending the photograph to the television to display it". The other related items are likewise each assigned an icon with a picture appropriate for calling its content to mind.
Based on such related information, the related item extraction unit 23 may supply the operation screen processing unit 24 with the icons (or the identification information of the icons) corresponding to each of the extracted related items. The operation screen processing unit 24 can thereby proceed with processing to display the icons designated by the related item extraction unit 23.
Note that in the present embodiment the related information holds, for each related item, information indicating the nature or classification of that related item, that is, an "attribute". In the present embodiment, the attribute preferably serves, in particular, as an index for estimating how likely the related item is to be selected by the user. As an example of such an attribute, "selection count" information is stored in the related information storage unit 42 for each related item. The attribute "selection count" indicates how many times that related item has been selected so far by past user operations. The selection count may be a cumulative count since the tablet terminal 100 first came into use, or it may be reset each time an event such as power-off, the passage of a predetermined period, or history deletion occurs, and counted from scratch each time. The attributes of related items are not limited to the above; the related information may hold attributes of any kind as long as they indicate the nature or classification of the related item. The attributes of the related items are read out by the operation screen processing unit 24 and its sub-units as necessary.
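The related information table can be pictured as a mapping from an object type to its related items, each carrying an icon identifier and a "selection count" attribute. The sketch below is illustrative only; the entries and counts are invented and do not reproduce FIG. 6.

```python
# Minimal sketch (illustrative data, not the patent's actual table): related
# information mapping an object type to its related items, each with an icon id
# and a "selection count" attribute.
from typing import Dict, List, TypedDict

class RelatedItem(TypedDict):
    name: str
    icon_id: str
    selection_count: int

RELATED_INFO: Dict[str, List[RelatedItem]] = {
    "photo": [
        {"name": "display on television", "icon_id": "1:television", "selection_count": 12},
        {"name": "print",                 "icon_id": "2:printer",    "selection_count": 20},
        {"name": "attach to e-mail",      "icon_id": "3:mail",       "selection_count": 5},
        # ... remaining related items of the object type "photo"
    ],
}

def extract_related_items(object_type: str) -> List[RelatedItem]:
    """Return the related item group associated with the identified object type."""
    return RELATED_INFO.get(object_type, [])
```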
The operation screen processing unit 24 performs processing (operation screen generation processing) for generating an operation screen that displays the object and (the icons of) the related items related to the selected object so that the user can select them.
FIG. 7 shows a concrete example of the icon images stored in the icon storage unit 43. As shown in FIG. 7, in the present embodiment each icon image can be identified by icon identification information. For example, the icon identification information "1: television" is associated with an icon image depicting a television. Also, although not shown, a portrait or avatar image of a person may be used as an icon representing personal information, such as an acquaintance with whom the user often talks.
The operation screen processing unit 24 reads from the icon storage unit 43 the icon images assigned to the related items extracted by the related item extraction unit 23, generates an operation screen so that these are displayed at appropriate positions and at appropriate timing, and outputs it to the display unit 12 via a display control unit (not shown).
Specifically, in the present embodiment, the operation screen processing unit 24 has a function of displaying the icons of the related items related to the object selected by the contact operation along a predetermined icon arrangement pattern.
In the present embodiment, the operation screen processing unit 24 includes at least the icon rank determination unit 31 and the icon arrangement determination unit 33, which carry out part of the operation screen generation processing executed by the operation screen processing unit 24.
The icon rank determination unit 31 assigns priorities to the related items extracted by the related item extraction unit 23 based on the attributes of the related items. Alternatively, the icon rank determination unit 31 may assign the priorities to the icons associated with the extracted related items.
In the present embodiment, as described above, the related information includes a "selection count" field as one of the attributes of each related item. The icon rank determination unit 31 therefore assigns priorities to the related items or their icons in descending order of the "selection count" of the extracted related items. Here, a related item with a larger selection count is considered more likely to have its icon selected by the user, and is therefore given a higher priority.
FIG. 8 shows an example of the priorities assigned to the icons by the icon rank determination unit 31.
For example, as shown in (a) or (c) of FIG. 5, assume that the object 80 (that is, Photo 1) has been selected and identified by the contact operation, and that the related item extraction unit 23 has accordingly extracted the related item group 60 from the related items shown in FIG. 6.
The icon rank determination unit 31 reads, from the related information (FIG. 6) stored in the related information storage unit 42, the attribute "selection count" and the icon identification information associated with each of the extracted related items. The icon rank determination unit 31 then assigns priorities to the icons in order of the selection counts indicated by the corresponding "selection count" values. In the example shown in FIG. 6 and FIG. 8, sorting the icons in descending order of "selection count" gives "2: printer", "4: photo display", "1: television", "8: memory card", "3: mail", "6: palette", "7: trash", and "5: information display", so these are given priorities from first to eighth in that order. The result of this priority assignment is returned to the operation screen processing unit 24.
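A minimal sketch of this ranking step, assuming the illustrative RelatedItem records from the previous sketch:

```python
# Minimal sketch (continues the illustrative RelatedItem structure above, which is
# an assumption): assigning priorities in descending order of selection count.
from typing import Dict, List

def assign_priorities(related_items: List[dict]) -> Dict[str, int]:
    """Return icon_id -> priority (1 = highest); larger selection counts come first."""
    ordered = sorted(related_items, key=lambda item: item["selection_count"], reverse=True)
    return {item["icon_id"]: rank for rank, item in enumerate(ordered, start=1)}
```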
The icon arrangement determination unit 33 determines the arrangement of the icons of the related items. Its function will be described with reference to FIG. 9, which explains a concrete example of the operation of the icon arrangement determination unit 33; more specifically, (a) of FIG. 9 shows a concrete example of the icon arrangement pattern acquired by the icon arrangement determination unit 33, and (b) of FIG. 9 shows a concrete example of the icon arrangement determined by the icon arrangement determination unit 33.
In the present embodiment, the icon arrangement determination unit 33 first acquires an icon arrangement pattern and associates the above-described priorities with the icon arrangement positions defined in the icon arrangement pattern. An icon arrangement pattern is a pattern that defines how many icons are arranged and at which positions on the touch panel they are arranged. In the present embodiment, as one example, the tablet terminal 100 holds one fixed icon arrangement pattern in some area (not shown) of the storage unit 19, and the icon arrangement determination unit 33 acquires that icon arrangement pattern. However, the tablet terminal 100 is not limited to this configuration; the storage unit 19 may hold a plurality of kinds of icon arrangement patterns, and the icon arrangement determination unit 33 may select one of them as necessary. Alternatively, an arrangement pattern determination unit (not shown) may dynamically determine the icon arrangement pattern according to the trajectory of movement of the input contact operation and supply it to the icon arrangement determination unit 33.
In the present embodiment, as shown in (a) of FIG. 9, an icon arrangement pattern that defines arranging eight icons at equal intervals on the outline of a ring (a vertically long ellipse) laid out to fill the screen of the touch panel is held in the storage unit 19 as one example. However, (a) of FIG. 9 is not intended to limit the icon arrangement pattern of the present invention to this specific example.
In the present embodiment, the icon arrangement determination unit 33 associates priorities with the arrangement positions of the eight icons defined in the icon arrangement pattern shown in (a) of FIG. 9.
Here, the icon arrangement determination unit 33 associates the icon arrangement positions with the priorities based on the contact information stored in the contact information storage unit 44.
In detail, the icon arrangement determination unit 33 acquires the previously generated contact information from the contact information storage unit 44. Here, the icon arrangement determination unit 33 may acquire only the contact coordinate information of the end point tn of the trajectory out of the contact information. Next, the icon arrangement determination unit 33 plots the end point tn of the trajectory on the acquired icon arrangement pattern. The icon arrangement determination unit 33 then associates the higher priorities with the arrangement positions in order of increasing distance from the end point tn.
Specifically, as shown in (a) of FIG. 9, if arranging the eight arrangement positions A to H in order of increasing distance from the end point tn gives H, A, G, B, C, F, D, E, the icon arrangement determination unit 33 associates the priorities from first to eighth in that order.
As a result, the icon arrangement determination unit 33 can generate the rank association result shown in (b) of FIG. 9, according to which second place is associated with position A, fourth place with position B, fifth place with position C, seventh place with position D, eighth place with position E, sixth place with position F, third place with position G, and first place with position H.
Next, the icon arrangement determination unit 33 arranges the icons according to the priority of each icon determined by the icon rank determination unit 31 and the arrangement position it has determined for each priority. For example, the icon arrangement determination unit 33 decides to arrange the icon "1: television", which has been given priority "third place", at the third-place position (originally G) in (b) of FIG. 9.
The icon arrangement result determined by the icon arrangement determination unit 33 in this way is returned to the operation screen processing unit 24.
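The following minimal sketch illustrates this arrangement step under the stated assumptions (eight slots at equal intervals on an ellipse, priorities assigned to slots in order of distance from the end point tn); the geometry helpers and names are illustrative and not taken from the patent.

```python
# Minimal sketch (geometry and names are assumptions): eight slots spaced equally on
# an elliptical arrangement pattern, with higher-priority icons placed in the slots
# closest to the end point tn of the user's trajectory.
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def ellipse_slots(center: Point, rx: float, ry: float, n: int = 8) -> List[Point]:
    """Arrangement positions at equal angular intervals on the ellipse outline."""
    return [(center[0] + rx * math.cos(2 * math.pi * k / n),
             center[1] + ry * math.sin(2 * math.pi * k / n)) for k in range(n)]

def place_icons(
    icon_priority: Dict[str, int],   # icon_id -> priority (1 = highest), e.g. from assign_priorities()
    slots: List[Point],
    tn: Point,                       # end point of the contact trajectory
) -> Dict[str, Point]:
    # Slots sorted so that the slot nearest tn receives priority 1, and so on.
    by_distance = sorted(slots, key=lambda s: math.hypot(s[0] - tn[0], s[1] - tn[1]))
    priority_to_slot = {rank: slot for rank, slot in enumerate(by_distance, start=1)}
    return {icon_id: priority_to_slot[rank] for icon_id, rank in icon_priority.items()}
```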
The operation screen processing unit 24 generates an operation screen in which each of the extracted icons is arranged at the position decided according to the determinations of the icon rank determination unit 31 and the icon arrangement determination unit 33, and outputs it to the display unit 12 via the display control unit.
Note that the broken lines showing the outline of the ring in (a) and (b) of FIG. 9 represent the shape of the ring that the tablet terminal 100 holds internally as information, and need not actually be displayed on the display unit 12. The same applies to the broken lines showing the outline of the ring in the subsequent figures.
FIG. 10 shows a concrete example of the operation screen obtained as a result of the operation screen generation processing executed by the operation screen processing unit 24. The example shown in FIG. 10 is, as in FIGS. 5 to 9, a concrete example of the operation screen obtained when the object 80 (the object "Photo 1") is selected.
The operation screen processing unit 24 reads from the icon storage unit 43 (FIG. 7) the icon images "1: television", "2: printer", "3: mail", "4: photo display", "5: information display", "6: palette", "7: trash", and "8: memory card" extracted through the procedure described above, and generates the operation screen according to the priorities determined as in FIG. 8 and the arrangement positions determined as in (b) of FIG. 9. The generated operation screen is output to the display unit 12 and displayed on the display unit 12 as shown in FIG. 10.
As shown in FIG. 10, the operation screen processing unit 24 may place the selected object 80 at the center of the screen. In the description above, the operation screen processing unit 24 is configured to place the object 80 at the center, but this is not an essential configuration. In the present embodiment, however, since the operation screen processing unit 24 arranges the icons in a ring along the icon arrangement pattern, it is preferable to place the object 80 at the center so that the icons do not overlap the object and remain easy to see.
[Operation screen display flow]
Next, the flow of processing when the tablet terminal 100 executes the operation screen display function will be described. FIG. 11 is a flowchart showing the flow of the operation screen display processing performed by the tablet terminal 100.
When the input unit 11 detects that an indicator (such as the user's finger) has touched the touch surface of the touch panel (YES in S101), the contact information generation unit 21 starts acquiring, from that moment (t = t0), contact coordinate information indicating the contact position of the finger, and continues acquiring it over time (S102). This tracking of the contact position continues until contact between the touch surface and the finger is no longer detected (NO in S103). When contact is no longer detected by the input unit 11 (YES in S103), the contact information generation unit 21 generates contact information by associating the contact coordinate information acquired from t = t0 up to that moment (t = tn) with the contact time information (S104). Note that when the contact operation for selecting an object is a double tap or the like, that is, when a single contact operation temporarily includes a very short period of non-contact, the contact information generation unit 21 may be configured so that tracking continues, treating the contact operation as not yet finished even if contact is temporarily not detected for a short time.
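A minimal sketch of this tracking step is shown below; the polling interface read_touch(), the 0.15-second tolerance for short lift-offs, and the 10 ms polling interval are all assumptions for illustration, not values from the patent.

```python
# Minimal sketch (assumed event interface): tracking one contact operation and treating
# a very short lift-off (as in a double tap) as part of the same operation.
import time
from typing import Callable, List, Optional, Tuple

NON_CONTACT_TOLERANCE_S = 0.15   # assumed: lift-offs shorter than this keep the operation alive

def track_contact_operation(
    read_touch: Callable[[], Optional[Tuple[float, float]]],
) -> List[Tuple[float, float, float]]:
    """read_touch() returns the current (x, y) contact position, or None when not touching.
    Returns (relative_time, x, y) samples for one contact operation (S101 to S104)."""
    samples: List[Tuple[float, float, float]] = []
    t0: Optional[float] = None
    last_contact = 0.0
    while True:
        pos = read_touch()
        now = time.monotonic()
        if pos is not None:
            if t0 is None:
                t0 = now                        # S101: contact detected for the first time
            samples.append((now - t0, pos[0], pos[1]))
            last_contact = now
        elif t0 is not None and now - last_contact > NON_CONTACT_TOLERANCE_S:
            return samples                      # S103 YES: the contact operation has ended
        time.sleep(0.01)                        # polling interval (assumed)
```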
The object specifying unit 22 compares the contact information generated in S104 (for example, (b) of FIG. 5) with the map information stored in the frame map storage unit 41 (for example, (c) of FIG. 5), and identifies, as the selected object, the object that overlaps the region in which the trajectory touched by the user lies (S105). In the example shown in (c) of FIG. 5, the object 80 "Photo 1" is identified.
Based on the object identified in S105, the related item extraction unit 23 refers to the related information in the related information storage unit 42 (for example, FIG. 6) and extracts the related items of the identified object (S106). Alternatively, the identification information of the icons assigned to the related items may be extracted.
Subsequently, the operation screen processing unit 24 executes the operation screen generation processing. First, the icon rank determination unit 31 determines a priority for each of the related items extracted in S106 (S107), and associates the determined priorities with the related items or with their icons (for example, FIG. 8).
Meanwhile, the icon arrangement determination unit 33 reads a predetermined icon arrangement pattern (for example, (a) of FIG. 9) from the storage unit 19 and associates priorities with the icon arrangement positions defined in the icon arrangement pattern (S108). The icon arrangement determination unit 33 performs this association based on the contact information obtained in S104; more specifically, it associates higher ranks with arrangement positions closer to the end point of the finger's trajectory (for example, (a) and (b) of FIG. 9). Note that S107 and S108 may be executed in parallel, or sequentially in an arbitrary order.
Next, the icon arrangement determination unit 33 determines the arrangement of each of the extracted icons according to the priorities determined by the icon rank determination unit 31 and the arrangement positions it has determined itself (S109), and returns the icon arrangement result to the operation screen processing unit 24.
Finally, the operation screen processing unit 24 acquires the icon images of the related items extracted in S106 from the icon storage unit 43 (for example, FIG. 7), and generates an operation screen in which the acquired icon images are arranged in accordance with the determination of S109 (S110). In the example described above, the operation screen processing unit 24 generates an operation screen in which the selected object is placed at the center and the icons are arranged in a ring shape around it.
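Tying the steps together, the following sketch reuses the illustrative helpers from the earlier sketches (build_contact_info, identify_selected_object, extract_related_items, assign_priorities, place_icons), all of which are assumed names rather than the patent's code, and covers only the layout decisions of S105 to S110.

```python
# Minimal sketch (depends on the illustrative helpers defined in the earlier sketches;
# all names are assumptions): one pass of the operation screen display flow S105-S110.
def build_operation_screen(samples, frame_map, object_types, slots):
    """samples: output of track_contact_operation(); frame_map: object id -> rectangle;
    object_types: object id -> object type (e.g. "photo"); slots: ellipse_slots(...)."""
    info = build_contact_info(samples)                           # contact information (S104 data)
    obj_id = identify_selected_object(info.points, frame_map)    # S105
    if obj_id is None:
        return None
    items = extract_related_items(object_types[obj_id])          # S106
    priorities = assign_priorities(items)                        # S107
    placement = place_icons(priorities, slots, info.end)         # S108-S109
    return {"selected_object": obj_id, "icon_positions": placement}   # S110 (layout only)
```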
The video signal of the operation screen generated as described above is output to the display unit 12, and the operation screen is displayed on the display unit 12 of the tablet terminal 100 as shown in FIG. 10. According to the operation screen generated by the tablet terminal 100 of the present embodiment, the higher an icon's priority, that is, the larger its "selection count" and thus the more likely it is to be selected by the user, the closer it is placed to the end point tn of the trajectory.
According to the above configuration and method of the present invention, in response to the user's single contact operation for selecting an object, the tablet terminal 100 can output the result of displaying, in a selectable form, the icons of the related items related to the selected object. Moreover, the arrangement of the icons displayed here for selection takes the user's contact operation into account: the more likely an icon is to be selected by the user, the closer it is displayed to the position where the user finished the preceding contact operation (the end point tn of the trajectory).
The user is expected to perform the contact operation of selecting the icon of a related item of the object immediately after the contact operation of selecting that object, and this is the natural flow of operation. In this case, the user moves the indicator (a finger or the like) from the position where the preceding contact operation was completed to the position where the intended icon is displayed.
Here, the longer the distance from the position where the preceding contact operation was completed to the position where the intended icon is displayed, the farther the indicator must be moved, and the more troublesome the selection operation becomes for the user. This troublesomeness becomes more pronounced the larger the screen size of the touch panel, and becomes a particularly serious problem when, for example, the user is operating with one hand as shown in (a) of FIG. 4 and the reachable area is limited. It also grows when the objects are managed in a hierarchy and selection operations are performed many times in succession.
Therefore, by arranging the icons so that the more likely an icon is to be selected, the closer it is displayed to the completion position of the preceding contact operation, as in the present invention, the user in most cases only has to select the desired icon displayed near the completion position of the preceding contact operation. As a result, the troublesomeness of the selection operation described above can be avoided with high probability.
As described above, according to the tablet terminal 100 of the present invention, the user no longer needs to move the indicator unnaturally from place to place on the screen, and can reach the intended final result with a simple contact operation. In addition, the shorter the movement distance of the indicator, the better erroneous operations can be suppressed.
From the above, the tablet terminal 100 can display the final result desired by the user in a natural flow that does not run counter to the user's intuition, with a simple contact operation and a small number of operations. As a result, excellent operability can be realized in the tablet terminal 100 equipped with a touch panel.
(Other examples of the contact operation and the icon arrangement pattern)
In the present embodiment, the case where the user's contact operation for selecting an object is the operation of "surrounding" has been described, and, as a specific example of a fixed icon arrangement pattern, the case of using an icon arrangement pattern that defines arranging eight icons at equal intervals on the outline of a vertically long ellipse has been described.
However, the configuration of the tablet terminal 100 of the present invention is not limited to the above. The tablet terminal 100 can recognize contact operations other than "surrounding" as gestures for selecting an object, and may hold icon arrangement patterns other than the icon arrangement pattern shown in (a) of FIG. 9.
FIG. 12 is a diagram for explaining the operation of the contact information generation unit 21 and the object specifying unit 22. More specifically, FIG. 12(a) shows the user performing a contact operation of "checking" an object with a check mark in order to select the target object, and FIG. 12(b) shows an example of the contact information generated by the contact information generation unit 21 in response to the contact operation shown in FIG. 12(a).
As shown in FIG. 12(a), assume that the user performs a contact operation of "checking" the object 80 (here, photo 2) displayed on the touch panel of the tablet terminal 100 with a check mark. Assume that the contact operation is performed during the period from t0 to tn so that the contact point passes along the broken line in the figure.
The contact information generation unit 21 generates the contact information shown in FIG. 12(b). Of the trajectory of the user's "checking" contact operation, the start point is shown as t0 and the end point as tn, but the contact information generation unit 21 may also associate contact time information with each point in between.
The object specifying unit 22 acquires the contact information shown in FIG. 12(b) from the contact information generation unit 21 and acquires the map information shown in FIG. 5(c) from the frame map storage unit 41. The object specifying unit 22 then compares the contact information with the map information and identifies, as the selected object, the object 80 (here, photo 2) whose region entirely or largely overlaps the region in which the trajectory of the user's finger obtained from the contact information lies. Alternatively, in this modification, the object that overlaps the point with the largest Y coordinate among the points constituting the check-mark trajectory (that is, the turning point of the check mark) may be identified as the object selected by the user.
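As a rough illustration of this identification step, the following is a minimal sketch, not the claimed configuration: it assumes the contact information is a list of timestamped points and that the map information maps object names to rectangular regions in screen coordinates with the Y axis growing downward; all names and data shapes are hypothetical.

```python
# Minimal sketch of object identification from a check-mark trajectory.
# Assumptions (not from the embodiment): contact info is a list of
# (t, x, y) tuples; map info maps object names to rectangles
# (left, top, right, bottom) in screen coordinates, y growing downward.

def identify_checked_object(trajectory, object_map):
    """Return the name of the object hit by the check-mark gesture."""
    # Turning point of the check mark = sample with the largest y coordinate
    # (the lowest point on screen), as described in this modification.
    _, apex_x, apex_y = max(trajectory, key=lambda p: p[2])

    def contains(rect, x, y):
        left, top, right, bottom = rect
        return left <= x <= right and top <= y <= bottom

    # Prefer the object whose region contains the turning point.
    for name, rect in object_map.items():
        if contains(rect, apex_x, apex_y):
            return name

    # Fall back to the object whose region overlaps the most trajectory points.
    def hits(rect):
        return sum(1 for _, x, y in trajectory if contains(rect, x, y))

    best = max(object_map.items(), key=lambda kv: hits(kv[1]), default=None)
    return best[0] if best and hits(best[1]) > 0 else None


if __name__ == "__main__":
    # Hypothetical example: a small check mark drawn over "photo2".
    trajectory = [(0, 110, 100), (1, 120, 130), (2, 140, 90)]
    object_map = {"photo1": (0, 0, 90, 200), "photo2": (100, 0, 200, 200)}
    print(identify_checked_object(trajectory, object_map))  # -> photo2
```

Under these assumptions, the turning point of the check mark is simply the sample with the largest Y coordinate, which corresponds to the rule described above.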
Next, the related item extraction unit 23 extracts the related items of the identified object 80 (photo 2). Here, it is assumed that the related item group 60 (FIG. 6) is extracted as in Embodiment 1, and that the icon order determination unit 31 assigns a priority to each related item as in Embodiment 1 (FIG. 8).
Meanwhile, the icon arrangement determination unit 33 may acquire the icon arrangement pattern shown in FIG. 13 from the storage unit 19 instead of the icon arrangement pattern shown in FIG. 9(a). FIG. 13 shows another specific example of the icon arrangement pattern acquired by the icon arrangement determination unit 33.
In the example shown in FIG. 13, the icon arrangement pattern divides the touch panel screen evenly into 4 × 3 (vertical × horizontal) blocks and places one icon in each of the ten surrounding blocks, excluding the blocks near the center.
Subsequently, the icon arrangement determination unit 33 obtains the distance between the end point tn of the trajectory and the icon arrangement position defined at the center of each block. It then associates the arrangement positions with priorities so that a higher priority is assigned to an arrangement position whose distance from the end point tn is shorter. The numbers shown at the respective icon arrangement positions in FIG. 13 indicate the priorities associated by the icon arrangement determination unit 33 in this modification. As shown in FIG. 13, the arrangement position of the upper right block, which is closest to the end point tn, is naturally associated with priority 1.
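A minimal sketch of this fixed arrangement pattern follows; the screen size, the block indexing, and the function names are assumptions made for illustration.

```python
# Minimal sketch: 4 x 3 grid pattern, ten peripheral blocks, priorities
# assigned by distance from the trajectory end point tn (shorter = higher).
# Screen size, block indexing and function names are assumptions.
import math

ROWS, COLS = 4, 3  # vertical x horizontal = 4 x 3 blocks

def peripheral_block_centers(width, height):
    """Centers of the ten blocks surrounding the two blocks near the center."""
    bw, bh = width / COLS, height / ROWS
    centers = []
    for r in range(ROWS):
        for c in range(COLS):
            if c == 1 and r in (1, 2):   # skip the two center blocks
                continue
            centers.append((c * bw + bw / 2, r * bh + bh / 2))
    return centers

def assign_icons_to_blocks(width, height, tn, icons_by_priority):
    """Pair icons (already sorted, highest priority first) with block centers,
    closest block to tn first; surplus positions are left without an icon."""
    centers = peripheral_block_centers(width, height)
    centers.sort(key=lambda p: math.hypot(p[0] - tn[0], p[1] - tn[1]))
    return {pos: icon for pos, icon in zip(centers, icons_by_priority)}

if __name__ == "__main__":
    icons = ["TV", "Printer", "Mail", "Photo display", "Info display",
             "Palette", "Trash", "Memory card"]   # 8 icons for 10 positions
    # tn assumed near the upper-right corner, as in the figure.
    for pos, icon in assign_icons_to_blocks(1200, 1600, (1150, 150), icons).items():
        print(pos, icon)
```

Because the ten block centers are sorted by distance from tn and paired with the icons in priority order, positions farther from tn simply remain empty when fewer than ten related items are extracted.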
FIG. 14 shows another specific example of the operation screen obtained as a result of the operation screen generation processing executed by the operation screen processing unit 24. The example shown in FIG. 14 is a specific example of the operation screen obtained when the object 80 (here, photo 2) is selected, based on the contact information shown in FIG. 12(b) and the icon arrangement pattern shown in FIG. 13.
The operation screen processing unit 24 generates the operation screen in accordance with the eight icons extracted by the above-described procedure, namely "1: TV", "2: Printer", "3: Mail", "4: Photo display", "5: Information display", "6: Palette", "7: Trash", and "8: Memory card", the priorities determined by the icon order determination unit 31, and the arrangement positions determined by the icon arrangement determination unit 33 as shown in FIG. 13. The generated operation screen is output to the display unit 12 and displayed on the display unit 12 as shown in FIG. 14.
Although the following is not an essential configuration, the operation screen processing unit 24 may place the selected object 80 at the center of the screen, as shown in FIG. 14. In the present embodiment, since the operation screen processing unit 24 arranges the icons around the periphery of the touch panel screen in accordance with the icon arrangement pattern shown in FIG. 13, placing the object 80 at the center is preferable from the viewpoint of keeping the icons from overlapping the object and making them easy to see.
Note that, even when the icon arrangement pattern defines arrangement positions for ten icons as shown in FIG. 13, the number of related items actually extracted by the related item extraction unit 23 may be less than ten. In the example shown in FIG. 14, the icon arrangement determination unit 33 determines that no icon is placed in the blocks associated with priorities 9 and 10.
According to the tablet terminal 100 of the present invention in the above example, the user no longer needs to move the indicator unnaturally from place to place on the screen and can reach the desired final result with a simple contact operation. In addition, the shorter the movement distance of the indicator, the more effectively erroneous operations can be suppressed.
From the above, the tablet terminal 100 can display the final result desired by the user in a natural flow that does not run counter to the user's intuition, with a simple contact operation and a small number of operations. As a result, excellent operability can be realized in the tablet terminal 100 provided with a touch panel.
<<Embodiment 2>>
Another embodiment of the information processing device according to the present invention is described below with reference to FIGS. 15 to 24B. For convenience of explanation, members having the same functions as those in the drawings described in Embodiment 1 are given the same reference numerals, and descriptions overlapping with Embodiment 1 are omitted.
In the present embodiment, the tablet terminal 100 is configured to accept an "enclosing" contact operation as the operation for selecting an object, and the operation screen processing unit 24 is configured to arrange the icons along the shape of a ring evoked by that "enclosing" contact operation. This makes it possible to output a result that is more intuitively associated with the contact operation, in line with the user's intuition.
[Functions of the tablet terminal]
FIG. 15 is a functional block diagram showing the main configuration of the tablet terminal 100 in the present embodiment.
Compared with the tablet terminal 100 of Embodiment 1 (FIG. 1), the tablet terminal 100 according to the present embodiment has a configuration in which the control unit 10 further includes a ring shape determination unit 30 as a functional block.
Here, in the present embodiment, the control unit 10 of the tablet terminal 100 may further include, although not essential, a gesture determination unit 25 and an animation determination unit 32 as functional blocks, as needed.
The ring shape determination unit 30 determines the icon arrangement positions used when the operation screen processing unit 24 arranges icons; in particular, when the icons are arranged in the shape of a ring, it determines the shape of that ring.
In Embodiment 1, the operation screen processing unit 24 arranges icons in accordance with an arbitrary fixed icon arrangement pattern. In the present embodiment, when an object is selected by the "enclosing" contact operation, the ring shape determination unit 30 determines an icon arrangement pattern that defines arranging the icons in the shape of a ring, and the operation screen processing unit 24 arranges and displays the icons in a ring in accordance with the determined icon arrangement pattern.
In other words, the ring shape determination unit 30 is a functional block in which the arrangement pattern determination unit (not shown) described in Embodiment 1 is configured to determine an icon arrangement pattern specialized to the shape of a ring.
In accordance with the icon arrangement pattern based on the ring shape determined by the ring shape determination unit 30, the operation screen processing unit 24 arranges the extracted icons. The ring shape information determined by the ring shape determination unit 30 may further include information on the size of the ring and/or information on the position of the ring.
As described above, the control unit 10 of the tablet terminal 100 preferably further includes the gesture determination unit 25. This is because, when various kinds of contact operations (gestures) other than "enclosing" are expected to be performed on the input unit 11, it is necessary to determine whether a given operation is the "enclosing" gesture or some other gesture.
The gesture determination unit 25 determines which gesture a contact operation performed on the input unit 11 corresponds to. For example, the gesture determination unit 25 can distinguish gestures such as "tap", "flick", "pinch", "drag", and "enclose". A known technique can be adopted as appropriate for the gesture recognition algorithm.
Depending on the determination result, the gesture determination unit 25 instructs each unit of the control unit 10 to execute the processing corresponding to the determined gesture.
In the present embodiment, when the gesture determination unit 25 determines that the detected contact operation is the "enclosing" gesture, it preferably instructs the contact information generation unit 21 to store the generated contact information in the contact information storage unit 44. This allows the operation screen processing unit 24 to refer to all the information about the "enclosing" gesture (the position and size of the region, the trajectory, the contact time, the movement timing of the contact point, and so on), while avoiding unnecessary writes to the contact information storage unit 44 when the contact operation is not the "enclosing" gesture. However, the contact information generation unit 21 may also be configured to write all contact information to the contact information storage unit 44 regardless of the determination result of the gesture determination unit 25.
As described above, the control unit 10 of the tablet terminal 100 may further include the animation determination unit 32. The animation determination unit 32 determines the animation to be applied to everything placed on the operation screen, that is, the object, the icons, the ring, and so on. This makes it possible to add a visual effect (that is, an animation) to the way the object and the icons are displayed.
Note that the animation determination unit 32 may apply not only movement to the object, the icons, or the ring but also other visual effects such as fade-in (a change in transparency).
The animation determination unit 32 may also give the icons motion such that, instead of appearing on the outline of the ring from the beginning, the icons start from a different location and finally come to rest on the outline. For example, the animation determination unit 32 may first gather the icons around the center of the ring and then give each icon a spreading motion so that the icons are finally arranged on the outline of the ring.
(Arrangement example 1 of the object and the ring shape)
FIG. 16 is a diagram for explaining the processing of each part of the operation screen processing unit 24 in the present embodiment. More specifically, FIG. 16(a) illustrates an example of the object display processing executed by the operation screen processing unit 24, and FIG. 16(b) shows a specific example of an icon arrangement pattern specialized to a ring shape, determined by the ring shape determination unit 30 of the operation screen processing unit 24.
In the present embodiment, as shown in FIG. 16(a), the operation screen processing unit 24 determines to reposition the object 80, selected by the preceding "enclosing" contact operation, at the center. At this time, the animation determination unit 32 may apply to the object 80 an animation in which the object 80 gradually moves from its original position to the center.
Next, in the present embodiment, the ring shape determination unit 30 of the operation screen processing unit 24 determines an icon arrangement pattern based on a ring shape. Specifically, by acquiring the icon arrangement pattern stored in the storage unit 19, the ring shape determination unit 30 can determine the shape, position, and size of the ring on which the icons are arranged, the number of icons, the icon arrangement positions, and so on.
FIG. 16(b) shows an example in which the ring shape determination unit 30 has determined an icon arrangement pattern that defines eight icons arranged evenly along the outline of an elliptical ring surrounding the object 80.
Here, the example of FIG. 16(b), in which the reference "ring" defined in the icon arrangement pattern for specifying the icon arrangement positions is an ellipse, is merely an example, and there is no intention to limit the shape of the ring of the present invention. Furthermore, a "ring" does not necessarily mean a shape formed of curves. For example, the ring shape determination unit 30 may define the shape of the ring as a circle, a square, a rectangle, or another polygon, and even a complex, irregular, or non-geometric figure may be defined as a ring as long as it has an outline that separates the inside from the outside. A "ring" also does not necessarily mean a closed curve. Even when the start point and the end point of the outline of the ring do not coincide exactly, an outline that largely separates the inside from the outside may be defined as a ring. The operation screen processing unit 24 can arrange icons on the outline of a ring of any shape defined as described above, in accordance with the determination by the ring shape determination unit 30.
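As one concrete instance of such a pattern, the sketch below computes eight positions spaced at equal angles on an elliptical ring centered on the object; the center, radii, icon count, and starting angle are illustrative assumptions.

```python
# Minimal sketch: evenly spaced icon positions on an elliptical ring
# around the object. Center, radii, icon count and start angle are
# illustrative assumptions, not values from the embodiment.
import math

def ellipse_ring_positions(center, rx, ry, count=8, start_deg=90.0):
    """Return `count` (x, y) positions spaced at equal angles on an
    ellipse with center `center` and radii rx (horizontal), ry (vertical).
    start_deg=90 puts the first position directly below the center when
    y grows downward, matching a screen coordinate system."""
    cx, cy = center
    positions = []
    for i in range(count):
        theta = math.radians(start_deg) + 2.0 * math.pi * i / count
        positions.append((cx + rx * math.cos(theta), cy + ry * math.sin(theta)))
    return positions

if __name__ == "__main__":
    for p in ellipse_ring_positions((600, 800), rx=300, ry=450):
        print(f"({p[0]:.0f}, {p[1]:.0f})")
```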
(Arrangement example 2 of the object and the ring shape)
In the present embodiment, in the example shown in FIGS. 16(a) and 16(b), the ring shape determination unit 30 determines the icon arrangement pattern by acquiring from the storage unit 19 an icon arrangement pattern in which a predetermined shape, a predetermined position, and a predetermined size are defined for the ring on which the icons are arranged.
However, the configuration is not limited to this; the ring shape determination unit 30 according to the present embodiment may dynamically determine the shape of the ring, and hence the icon arrangement pattern, based on the movement trajectory of the input "enclosing" contact operation (the contact information stored in the contact information storage unit 44). More specifically, the ring shape determination unit 30 can use the shape of the trajectory of the "enclosing" operation, as it is, as the ring shape on which the icons are arranged. The size of the ring can also be determined based on the size of the enclosed region, and the position of the ring can be determined based on the position of the enclosed region. The ring shape determination unit 30 may further determine the ring shape based on the map information stored in the frame map storage unit 41; that is, the size and position of the ring may be determined according to the display position, size, and so on of the enclosed object. The operation of the operation screen processing unit 24 in such a case is described with reference to FIGS. 17 and 18.
FIG. 17 shows a specific example of the contact information stored in the contact information storage unit 44. More specifically, FIG. 17(a) shows the user performing a contact operation of "enclosing" an object in an arbitrary shape in order to select the target object, and FIG. 17(b) shows an example of the contact information generated by the contact information generation unit 21 in response to the contact operation shown in FIG. 17(a).
As shown in FIG. 17(a), assume that the user performs a contact operation of "enclosing" an object displayed on the touch panel of the tablet terminal 100 (here, photo 1) in an arbitrary shape (for example, a heart shape). Assume that the contact operation is performed during the period from t0 to tn so that the contact point passes along the broken line in the figure.
The gesture determination unit 25 acquires contact information such as that shown in FIG. 17(b) from the contact information generation unit 21. In FIG. 17(b), of the trajectory traced by the user, the start point is shown as t0 and the end point as tn, but contact time information may also be associated with each point in between.
Based on the contact information shown in FIG. 17(b), the gesture determination unit 25 determines that this contact operation is the "enclosing" gesture and instructs the contact information generation unit 21 to store that contact information in the contact information storage unit 44. Each part of the operation screen processing unit 24 can thereby refer to the contact information shown in FIG. 17(b), stored in the contact information storage unit 44, when executing the processing for displaying the icons.
FIG. 18 shows another example of the icon arrangement pattern determined by the ring shape determination unit 30.
First, the operation screen processing unit 24 can place the selected object 80 (here, photo 1) at the center. Next, in the present embodiment, the ring shape determination unit 30 acquires the contact information stored in the contact information storage unit 44. Based on the movement trajectory of the fingertip (contact point) obtained from the contact information, the ring shape determination unit 30 determines a shape identical or similar to that trajectory as the ring shape on which the icons are arranged. In the present embodiment, as an example, the ring shape determination unit 30 decides to place the ring at the center and to make it as large as possible on the touch screen. As shown in FIGS. 17(a) and 17(b), the object 80 is enclosed in a heart shape. The ring shape determination unit 30 therefore determines the ring shape 81 so that a figure similar to the heart-shaped trajectory fills the screen at the center, as indicated by the broken line in FIG. 18. At this time, for the icon arrangement positions on the ring shape 81, the ring shape determination unit 30 may decide to arrange the icons at equal intervals, or may decide to arrange them at arbitrary positions on the outline according to some other rule.
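One way such a trajectory-derived ring could be scaled and centered is sketched below, under the assumption that the trajectory is available as a list of (x, y) points; the margin value is also an assumption.

```python
# Minimal sketch: take the enclosing trajectory as the ring shape and
# place a similar figure as large as possible at the screen center.
# The margin and the (x, y) point format are illustrative assumptions.

def fit_trajectory_to_screen(points, screen_w, screen_h, margin=40.0):
    """Uniformly scale and translate the trajectory so that its bounding
    box fills the screen (minus a margin) and is centered."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    w = max(max_x - min_x, 1e-6)
    h = max(max_y - min_y, 1e-6)
    # Uniform scale so the placed figure stays similar to the drawn trajectory.
    scale = min((screen_w - 2 * margin) / w, (screen_h - 2 * margin) / h)
    src_cx, src_cy = (min_x + max_x) / 2, (min_y + max_y) / 2
    dst_cx, dst_cy = screen_w / 2, screen_h / 2
    return [((x - src_cx) * scale + dst_cx, (y - src_cy) * scale + dst_cy)
            for x, y in points]

if __name__ == "__main__":
    heart_like = [(100, 120), (140, 90), (180, 120), (140, 180)]  # toy trajectory
    print(fit_trajectory_to_screen(heart_like, 1200, 1600))
```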
The icon arrangement determination unit 33 of the operation screen processing unit 24 then associates a priority with each icon arrangement position on the ring shape 81 determined by the ring shape determination unit 30, according to its distance from the end point tn of the trajectory shown in FIG. 17(b).
As shown in FIG. 18, the operation screen processing unit 24 arranges the icons on the outline of the ring shape 81 determined by the ring shape determination unit 30. At this time, the icon arrangement determination unit 33 determines the icon arrangement positions so that the priority assigned to each icon by the icon order determination unit 31 matches the priority associated with the arrangement position.
Note that, when the trajectory has an extremely complicated shape, the ring shape determination unit 30 of the operation screen processing unit 24 may determine an approximation of the trajectory as the shape of the ring. By rounding finely distorted portions of the trajectory with straight lines or curves, the amount of information defining the shape of the ring can be reduced, and the processing load for arranging the icons can be lowered.
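The embodiment does not specify how the trajectory is rounded; as one hedged possibility, the sketch below simplifies the polyline with the Ramer-Douglas-Peucker method, dropping points that deviate from a chord by less than a tolerance.

```python
# Minimal sketch: approximate a finely distorted trajectory with fewer
# points (Ramer-Douglas-Peucker simplification). Using this particular
# algorithm is an assumption; the embodiment only says the trajectory
# may be rounded with straight lines or curves.
import math

def _point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(points, tolerance):
    """Recursively drop points closer than `tolerance` to the chord."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    idx, dmax = 0, -1.0
    for i in range(1, len(points) - 1):
        d = _point_segment_distance(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tolerance:
        return [a, b]
    left = simplify(points[:idx + 1], tolerance)
    right = simplify(points[idx:], tolerance)
    return left[:-1] + right

if __name__ == "__main__":
    wobbly = [(0, 0), (1, 0.1), (2, -0.1), (3, 0.05), (4, 0)]
    print(simplify(wobbly, tolerance=0.2))  # -> [(0, 0), (4, 0)]
```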
(Arrangement example 3 of the object and the ring shape)
In each of the arrangement examples described above, the operation screen processing unit 24 places the selected object (for example, the object 80 in FIGS. 16(a) and (b) or FIG. 18) at the center of the touch panel screen when arranging the icons, and the ring shape determination unit 30 determines the position of the ring so that the icons are arranged around the object 80 at the center. However, the configuration of the present invention is not limited to this. The operation screen processing unit 24 may maintain the display position of the selected object 80 as it is; even in this case, the ring shape determination unit 30 may determine the position and size of the ring so that the ring shape is displayed large at the center of the screen, as shown in FIG. 19. FIG. 19 shows another arrangement example of the object and the ring of icons.
(Arrangement example 4 of the object and the ring shape)
Alternatively, in a configuration in which the display position of the selected object 80 is maintained as it is, the ring shape determination unit 30 may determine the position and size of the ring shape so that the object 80 at its original position lies at the center of the ring, as shown in FIG. 20. FIG. 20 shows another arrangement example of the object and the ring of icons.
(Arrangement example 5 of the object and the ring shape, with animation)
In a configuration in which the display position of the selected object 80 is maintained as it is, if the ring shape is placed large at the center of the screen without regard to the object 80, there is a problem in that the connection between the event "the icons were displayed by enclosing the object 80" and the final result (for example, FIG. 19) is weakened. On the other hand, in a configuration in which the display position of the object 80 is maintained and the icons are displayed around it in relation to the display position of the object 80, depending on where the object 80 was originally displayed, the ring shape may not be placed on the screen at a sufficient size, and as a result the visibility of the icons decreases (for example, FIG. 20).
Therefore, when the tablet terminal 100 includes the animation determination unit 32, the animation determination unit 32 may solve the above problem by applying an animation to the ring on which the icons are arranged, as shown in FIG. 21. Specifically, the animation determination unit 32 applies an animation to the ring so that the ring shape determined by the ring shape determination unit 30 based on the original display position and size of the object 80 comes to be placed large at the center of the screen over a certain period of time. As a result, the ring of icons, initially placed small around the object 80, gradually changes its shape with the passage of time and is finally placed large at the center of the screen. FIG. 21(a) shows the ring of icons initially placed small around the object 80, FIG. 21(b) shows the ring of icons in the middle of expanding, and FIG. 21(c) shows the ring of icons finally placed large at the center of the screen after passing from FIG. 21(a) through FIG. 21(b).
According to the above configuration, the icons can be displayed at a sufficient size without impairing the connection between the user's contact operation and the displayed result.
Note that the animation determination unit 32 may apply an animation in which the size of each individual icon also gradually increases in accordance with the size of the ring, or may keep the icon size constant, independently of the size of the ring, and apply an animation in which the spacing between the icons gradually widens.
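A rough illustration of such an animation follows: the ring is interpolated linearly from a small ring around the object's original position to a large ring at the screen center over a fixed duration; the duration, the linear easing, and the function names are assumptions.

```python
# Minimal sketch: interpolate the icon ring from a small ring around the
# object's original position to a large ring at the screen center.
# Duration, linear easing and the point format are assumptions.

def interpolate_ring(start_ring, end_ring, progress):
    """Linearly blend two rings given as equal-length lists of (x, y)
    points; progress runs from 0.0 (start) to 1.0 (end)."""
    p = max(0.0, min(1.0, progress))
    return [(sx + (ex - sx) * p, sy + (ey - sy) * p)
            for (sx, sy), (ex, ey) in zip(start_ring, end_ring)]

def ring_animation_frames(start_ring, end_ring, duration_s, fps=30):
    """Yield one ring per frame over `duration_s` seconds."""
    frames = max(1, int(duration_s * fps))
    for i in range(frames + 1):
        yield interpolate_ring(start_ring, end_ring, i / frames)

if __name__ == "__main__":
    small = [(300, 300), (340, 300), (340, 340), (300, 340)]       # around the object
    large = [(200, 200), (1000, 200), (1000, 1400), (200, 1400)]   # screen-filling
    for ring in ring_animation_frames(small, large, duration_s=0.3, fps=10):
        print(ring[0])  # print the first icon position of each frame
```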
(Icon display timing)
In each of the arrangement examples described above, the operation screen processing unit 24 arranges a plurality of icons around the selected object simultaneously. However, the configuration is not limited to this; when the operation screen processing unit 24 includes the animation determination unit 32, the animation determination unit 32 may determine the display timing of each icon.
FIG. 22 shows a modification of the method of displaying the icons of the related items.
For example, assume that contact information such as that shown in FIG. 22(a) is obtained from the user's "enclosing" contact operation. The animation determination unit 32 refers to the contact information stored in the contact information storage unit 44 and recognizes that this "enclosing" gesture was made clockwise from t0 to tn. To match this movement, the animation determination unit 32 decides to make the first through eighth icons appear one by one, clockwise, at regular intervals. For example, the animation determination unit 32 makes the icons appear in order at regular intervals as in FIGS. 22(b), (c), (d), (e), and so on, and the display timing of the icons is controlled so that the operation screen finally reaches the state shown in FIG. 22(f).
According to the above configuration, the result is accompanied by a movement that is substantially the same as the user's movement when enclosing the object (clockwise), so the connection between the user's contact operation and the displayed result can be further strengthened; as a result, the operation screen can be provided in a natural flow that does not run counter to the user's intuition.
Here, the icon arrangement determination unit 33 preferably makes the display position of the first icon roughly coincide with the start point of the finger trajectory (the contact position at t0) and the display position of the last icon roughly coincide with the end point (the contact position at tn).
This links the contact operation and the result even more closely, so that the operation screen can be provided in an even more natural flow.
Note that, here, the order in which the icons appear does not have to match the priorities assigned by the icon order determination unit 31. For example, in the example shown in FIG. 22, the icon with priority 1 is placed at arrangement position H, which is closest to the end point tn of the trajectory (FIG. 22(f)); in this arrangement example it appears not first but fourth (FIG. 22(e)).
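One possible way to schedule such a sequential, clockwise appearance is sketched below; it assumes the ring positions are already listed in clockwise order and that the interval between appearances is fixed, both of which are illustrative choices.

```python
# Minimal sketch: make icons appear one by one, clockwise, at a fixed
# interval, starting at the ring position nearest the trajectory start
# point and ending near its end point. The interval value is an assumption.
import math

def appearance_schedule(ring_positions_clockwise, start_point, interval_s=0.1):
    """Return a list of (delay_seconds, position) pairs. The ring positions
    are assumed to be ordered clockwise; the sequence is rotated so that
    the first icon to appear is the position closest to `start_point`."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    first = min(range(len(ring_positions_clockwise)),
                key=lambda i: dist(ring_positions_clockwise[i], start_point))
    ordered = (ring_positions_clockwise[first:] +
               ring_positions_clockwise[:first])
    return [(i * interval_s, pos) for i, pos in enumerate(ordered)]

if __name__ == "__main__":
    ring = [(600, 1250), (388, 1118), (300, 800), (388, 482),
            (600, 350), (812, 482), (900, 800), (812, 1118)]  # toy positions
    for delay, pos in appearance_schedule(ring, start_point=(610, 1240)):
        print(f"t = {delay:.1f} s -> icon at {pos}")
```

Note that, consistent with the text above, this schedule orders icons along the sweep direction and is independent of the priorities assigned by the icon order determination unit 31.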
(Icon display timing 2)
Furthermore, the animation determination unit 32 preferably makes the icons appear one after another in accordance not only with the movement of the finger (clockwise or counterclockwise) but also with the speed at which the finger moved when the object was enclosed.
Specifically, assume that contact information such as that shown in FIG. 23(a) is obtained. This contact information shows a trajectory enclosing the object clockwise from time t0 to time tn. In more detail, it can be seen from this contact information that the contact position (the tip of the finger) was to the left of the object at time ta, at the upper left of the object at time tb, and at the upper right of the object at time tc.
Therefore, when the eight icons are arranged at equal intervals on the ellipse, the animation determination unit 32 decides to make the first icon appear directly below the object at time t0 and then, matching the speed of the finger, to make the icons appear up to the one at the left of the object (the third) at time ta, up to the one at the upper left of the object (the fourth) at time tb, up to the one at the upper right of the object (the sixth) at time tc, and finally to make all the icons appear at time tn. FIG. 23(b) shows the operation screen at time t0, FIG. 23(c) the operation screen at time ta, FIG. 23(d) the operation screen at time tb, FIG. 23(e) the operation screen at time tc, and FIG. 23(f) the operation screen at time tn.
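A hedged sketch of this speed matching follows; it assumes the icons are evenly spaced along the ring in the sweep direction, so that the i-th icon should appear once the finger has covered the corresponding fraction of the trajectory length, with appearance times interpolated from the recorded timestamps. These assumptions are not stated in the embodiment, which only requires the appearance to follow the finger's speed.

```python
# Minimal sketch: derive per-icon appearance times from the speed of the
# enclosing gesture. Assumption (not from the embodiment): icons are
# evenly spaced along the ring in the sweep direction, so the i-th icon
# appears once the finger has covered fraction i/(N-1) of the total
# trajectory length; times are interpolated from the trajectory timestamps.
import math

def speed_matched_times(trajectory, icon_count):
    """trajectory: list of (t, x, y) samples from t0 to tn.
    Returns one appearance time per icon."""
    # Cumulative path length at each sample.
    cum = [0.0]
    for (_, x0, y0), (_, x1, y1) in zip(trajectory, trajectory[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1] or 1.0

    def time_at_fraction(f):
        target = f * total
        for i in range(1, len(cum)):
            if cum[i] >= target:
                t0, t1 = trajectory[i - 1][0], trajectory[i][0]
                seg = (cum[i] - cum[i - 1]) or 1.0
                return t0 + (t1 - t0) * (target - cum[i - 1]) / seg
        return trajectory[-1][0]

    denom = max(icon_count - 1, 1)
    return [time_at_fraction(i / denom) for i in range(icon_count)]

if __name__ == "__main__":
    # Toy gesture: fast at first, then slower (uneven timestamps).
    traj = [(0.0, 0, 0), (0.1, 100, 0), (0.2, 200, 0), (0.6, 300, 0), (1.2, 400, 0)]
    print(speed_matched_times(traj, icon_count=8))
```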
According to the above configuration, the result is accompanied by substantially the same movement as the user's enclosing movement (clockwise) and its speed, so the connection between the user's contact operation and the displayed result can be strengthened even further; as a result, the operation screen can be provided in a natural flow that does not run counter to the user's intuition.
Here too, the order in which the icons appear does not have to match the priorities assigned by the icon order determination unit 31. In the example shown in FIG. 23, the icon with priority 1 is placed at arrangement position H, which is closest to the end point tn of the trajectory (FIG. 23(f)); in this arrangement example it appears not first but fourth (FIG. 23(d)).
[Operation screen display flow]
FIGS. 24A and 24B are flowcharts showing the flow of the operation screen display processing performed by the tablet terminal 100 in the present embodiment.
When the input unit 11 detects that an indicator (such as the user's finger) has touched the touch surface of the touch panel (YES in S201), the contact information generation unit 21 starts acquiring contact coordinate information indicating the contact position of the finger from that time (t = t0) and continues acquiring it over time (S202). This tracking of the contact position continues until contact between the touch surface and the finger is no longer detected (NO in S203). When the input unit 11 no longer detects contact (YES in S203), the contact information generation unit 21 generates contact information by associating the contact coordinate information acquired from t = t0 up to this time (t = tn) with the contact time information (S204).
 ここで、ジェスチャ判定部25が、接触情報に基づいてこの接触動作のジェスチャを判定してもよい(S205)。本実施形態では、ジェスチャ判定部25は、判定したジェスチャが「囲う」でなければ(S206においてNO)、判定したそれ以外のジェスチャに応じた処理の実行を、制御部10の各部に指示する。その各部によって、判定されたジェスチャに応じた処理が実行される(S207)。 Here, the gesture determination unit 25 may determine the gesture of the contact operation based on the contact information (S205). In the present embodiment, if the determined gesture is not “enclose” (NO in S206), the gesture determination unit 25 instructs each unit of the control unit 10 to execute a process according to the determined other gesture. Each unit performs processing according to the determined gesture (S207).
 一方、ジェスチャ判定部25は、判定したジェスチャが「囲う」であった場合には(S206においてYES)、接触情報生成部21に対して接触情報を接触情報記憶部44に格納するように指示する。接触情報生成部21は、S204にて生成した接触情報を、接触情報記憶部44に記憶する(S208)。 On the other hand, when the determined gesture is “enclose” (YES in S206), the gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information in the contact information storage unit 44. . The contact information generation unit 21 stores the contact information generated in S204 in the contact information storage unit 44 (S208).
 オブジェクト特定部22は、接触情報記憶部44に記憶されている接触情報(例えば、図5の(b)または図17の(b))と、フレームマップ記憶部41に記憶されているマップ情報(例えば、図5の(c))とを比較して、ユーザによって囲われた領域に重なるオブジェクトを選択されたオブジェクトとして特定する(S209)。図5の(c)に示す例では、「写真1」というオブジェクト80を特定する。 The object specifying unit 22 includes contact information stored in the contact information storage unit 44 (for example, FIG. 5B or FIG. 17B) and map information stored in the frame map storage unit 41 ( For example, by comparing with (c) of FIG. 5, an object that overlaps the area surrounded by the user is specified as the selected object (S209). In the example shown in FIG. 5C, the object 80 “Photo 1” is specified.
Based on the object identified in S209, the related item extraction unit 23 refers to the related information in the related information storage unit 42 (for example, FIG. 6) and extracts the related items of the identified object (S210). Alternatively, it may extract the identification information of the icons assigned to the related items.
Subsequently, the operation screen processing unit 24 may acquire the contact information generated in S204 from the contact information storage unit 44 as needed (S211). The ring shape determination unit 30 of the operation screen processing unit 24 then dynamically determines an icon arrangement pattern based on a ring shape, using the contact information (S212). Alternatively, the operation screen processing unit 24 may determine the icon arrangement pattern by acquiring an icon arrangement pattern held in the storage unit 19. Here, by determining the icon arrangement pattern based on the ring shape, the ring shape determination unit 30 determines at what position and size the ring is placed relative to the object, how many icons are placed, on the outline of what shape of ring each icon is placed, where on the outline the icons are placed, and so on (for example, FIG. 9(a), FIG. 16(b), and FIGS. 18 to 23).
Next, the operation screen processing unit 24 executes the operation screen generation processing. Specifically, the icon order determination unit 31 of the operation screen processing unit 24 assigns a priority to each of the related items extracted in S210 (S213). Specifically, the icon order determination unit 31 reads the attributes of the extracted related items (for example, the "number of selections" in FIG. 6) from the related information storage unit 42 and determines the rank of each related item in descending order of the likelihood of being selected by the user, judged on the basis of those attributes (for example, FIG. 8). The icon order determination unit 31 may associate the priorities with the icon images or icon identification information associated with the related items.
Meanwhile, the icon arrangement determination unit 33 of the operation screen processing unit 24 associates priorities with the icon arrangement positions defined in the icon arrangement pattern determined in S212 (S214). Specifically, the icon arrangement determination unit 33 acquires the coordinates of the end point of the trajectory from the contact information obtained in S211 and associates the priorities with the arrangement positions in ascending order of distance from that end point. Note that S213 and S214 may be executed in parallel, or sequentially in an arbitrary order. Furthermore, S213 may be executed before S211 or S212, as long as it comes after S210.
Subsequently, the icon arrangement determination unit 33 determines the arrangement of the icons so that the priority assigned to each icon by the icon order determination unit 31 in S213 matches the priority associated with each icon arrangement position in S214 (S215).
Here, when the tablet terminal 100 further includes the animation determination unit 32, the animation determination unit 32 may apply animations, as needed, to the items whose arrangement positions were determined in the upstream steps (the object, the ring, the icons, and so on) (S216). For example, the animation determination unit 32 may determine the appearance timing of each placed item, decide to gradually change the position, shape, or size of a placed item, or decide to apply other visual effects such as fade-in (a change in transparency).
Finally, the operation screen processing unit 24 acquires the icon images of the related items extracted in S210 from the icon storage unit 43 (for example, FIG. 7). Then, in accordance with what was determined in the upstream steps, it places the object identified in S209 and generates the operation screen by placing the acquired icon images on the outline of the ring arranged around the object or at a predetermined position relative to the object (S217). For example, the operation screen processing unit 24 places the object at the center and places each icon on the ring shape arranged around it, that is, on the outline of the ring shape determined in S212. Moreover, in this final result, the more likely an icon is to be selected, the closer it is placed to the end point of the trajectory of the "enclosing" contact operation. The video signal of the operation screen generated as described above is output to the display unit 12.
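To make the ordering of these steps easier to follow, the sketch below strings a reduced version of S209 through S217 together as a single function; every helper, data shape, and simplification (including the crude bounding-box test for "enclosed") is an assumption made for illustration, and the gesture branching of S205 to S207 and the animation of S216 are omitted.

```python
# Condensed sketch of the S209-S217 flow (object identification through
# operation screen generation). All helpers, data shapes and simplifications
# are assumptions; the actual flowchart also covers gesture branching and
# animation, omitted here.
import math

def build_operation_screen(trajectory, object_map, related_info,
                           screen_w, screen_h, icon_count_on_ring=8):
    """trajectory: [(t, x, y)] samples of the enclosing gesture.
    object_map: {object_name: (left, top, right, bottom)}.
    related_info: {object_name: [(item_name, selection_count), ...]}.
    Returns (selected_object, [(position, item_name), ...])."""
    xs = [x for _, x, _ in trajectory]
    ys = [y for _, _, y in trajectory]

    # S209: pick the object whose center lies inside the enclosed bounding box.
    def enclosed(rect):
        cx, cy = (rect[0] + rect[2]) / 2, (rect[1] + rect[3]) / 2
        return min(xs) <= cx <= max(xs) and min(ys) <= cy <= max(ys)
    selected = next((n for n, r in object_map.items() if enclosed(r)), None)
    if selected is None:
        return None, []

    # S210 + S213: extract related items and rank them by selection count.
    items = sorted(related_info.get(selected, ()),
                   key=lambda it: it[1], reverse=True)

    # S212: a ring of evenly spaced positions centered on the screen.
    cx, cy, r = screen_w / 2, screen_h / 2, min(screen_w, screen_h) * 0.4
    ring = [(cx + r * math.cos(2 * math.pi * i / icon_count_on_ring),
             cy + r * math.sin(2 * math.pi * i / icon_count_on_ring))
            for i in range(icon_count_on_ring)]

    # S214 + S215: positions closest to the trajectory end point tn receive
    # the highest-priority icons.
    tn = trajectory[-1][1:]
    ring.sort(key=lambda p: math.hypot(p[0] - tn[0], p[1] - tn[1]))
    placement = [(pos, name) for pos, (name, _) in zip(ring, items)]

    # S217: the caller would now draw `selected` at the center and each
    # icon image at its assigned position.
    return selected, placement

if __name__ == "__main__":
    traj = [(0.0, 100, 100), (0.5, 400, 120), (1.0, 390, 380), (1.5, 110, 360)]
    objects = {"photo1": (150, 150, 350, 330)}
    related = {"photo1": [("TV", 10), ("Printer", 4), ("Mail", 7)]}
    print(build_operation_screen(traj, objects, related, 1200, 1600))
```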
According to the above configuration and method of the present invention, in response to the extremely natural and simple user contact operation of "enclosing" the object, which is a natural way to designate it, the tablet terminal 100 can output an operation screen (final result) in which icons are arranged around the object.
The user thus obtains, as the result, an operation screen in which the icons of the related items are arranged so as to surround the object. The positional relationship between these icons and the object matches the positional relationship between the object and the finger trajectory of the contact operation that the user performed earlier, and the finger trajectory obtained by enclosing the object resembles the ring shape on which the icons are arranged.
In other words, the transition from the event "a contact operation that 'encloses' the object is performed" to the event "an operation screen with icons arranged around the object is obtained" can be said to be a natural transition that does not run counter to the user's intuition.
In addition, the tablet terminal 100 can anticipate the related items that the user is likely to select after selecting the object and display them to the user in a selectable manner. Specifically, according to the above configuration of the present invention, the icons displayed around the object are all icons of related items extracted as items related to the object. That is, after enclosing and selecting an object, the user can immediately designate the "action", "action target", "action partner", and so on related to that object from the surrounding icons.
Moreover, on the operation screen that is the final result, the more likely an icon is to be selected, the closer it is placed to the position where the movement of the preceding contact operation was completed. The user will therefore, with high probability, select (touch) an icon displayed near that completion position as the desired icon. As a result, the burden of a selection operation that moves the indicator unnaturally from place to place on the screen can be avoided with high probability.
As described above, according to the tablet terminal 100 of the present invention, the user no longer needs to move the indicator unnaturally from place to place on the screen and can reach the desired final result with a simple contact operation. In addition, the shorter the movement distance of the indicator, the more effectively erroneous operations can be suppressed.
From the above, the tablet terminal 100 can display the final result desired by the user in a natural flow that does not run counter to the user's intuition, with a simple contact operation and a small number of operations. As a result, excellent operability can be realized in the tablet terminal 100 provided with a touch panel.
Furthermore, the configuration in which the ring shape determination unit 30 dynamically determines the icon arrangement pattern (specifically, the shape of the ring) based on the movement trajectory of the indicator has the following effects.
According to the above configuration and method of the present invention, in response to the extremely natural and simple user contact operation of "enclosing" the object in order to designate it, the tablet terminal 100 can output, as the final result, an operation screen in which icons are arranged around the object. That is, the user obtains, as the result, an operation screen in which the icons of the related items are arranged so as to surround the object.
Specifically, the user performs an operation of enclosing the object freehand in an arbitrary shape, and the trajectory at this time is held in the tablet terminal 100. Then, on the operation screen created by the tablet terminal 100, each icon is arranged so as to surround the object on the outline of a ring shape identical or similar to the obtained trajectory.
The positional relationship between these icons and the object matches the positional relationship between the object and the finger trajectory of the contact operation performed earlier by the user, and the finger trajectory obtained by enclosing the object coincides with the ring shape on which the icons are arranged.
In other words, when the user encloses an object, an operation screen is obtained in which icons are arranged around the object "just as it was enclosed". This transition of events can be said to be an even more natural flow that does not run counter to the user's intuition. In addition, since the icons are arranged in the shape traced by the user, operating the tablet terminal 100 with this operation screen displayed becomes more enjoyable. Moreover, the user can anticipate the arrangement of the icons and enclose the object exactly as desired to display the icons of the related items, so operability is further improved.
The icons displayed around the object after it is selected indicate related items that are closely related to that object and are likely to be selected next.
From the above, the tablet terminal 100 can display the final result desired by the user in an even more natural flow that does not run counter to the user's intuition, with a simple contact operation and a small number of operations. As a result, excellent operability can be realized in the tablet terminal 100 provided with a touch panel.
≪Modifications≫
 A tablet terminal 100 obtained by appropriately combining Embodiments 1 and 2 described above also falls within the scope of the present invention. That is, the control unit 10 of the tablet terminal 100 according to each of Embodiments 1 and 2 may include some or all of the gesture determination unit 25, the ring shape determination unit 30, and the animation determination unit 32, even where these are not essential in the respective embodiment.
 (Method of determining the priority order)
 In each of the embodiments described above, the attribute associated with a related item in the related information is the "selection count", and the icon rank determination unit 31 refers to the attribute "selection count" and assigns priorities according to how likely each item is to be selected by the user. However, the configuration of the tablet terminal 100 of the present invention is not limited to this. The icon rank determination unit 31 may assign priorities according to various attributes other than the attribute "selection count".
 For example, as described below, a related item whose attributes correlate more strongly with those of the selected object may be determined to be a related item that is more likely to be selected by the user.
 More specifically, assume that the selected object is a single "photograph" and that, in the related information, a plurality of other photographs related to that photograph are associated with it as related items. In this case, information such as "shooting date and time", "photographer", "camera model", and "photograph title" is further associated with each related item as attributes and stored in the related information storage unit 42.
 The icon rank determination unit 31 may then, for example, rank the related items (the other photographs) so that items shot at a date and time closer to the shooting date and time of the selected object "photograph" receive a higher priority. Accordingly, for the "plurality of other photographs" extracted as related items, the icon arrangement determination unit 33 decides to place the icons of photographs whose shooting date and time is closer to that of the selected "photograph" nearer to the end point of the trajectory.
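 As an illustration of this ranking rule only, the following Python sketch orders related photographs by the absolute difference between their shooting date and time and that of the selected photograph, so that index 0 corresponds to priority 1. The names `RelatedPhoto` and `rank_by_date_proximity` are hypothetical and are not taken from the embodiments; the patent does not prescribe any particular implementation.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class RelatedPhoto:
    name: str
    shot_at: datetime  # attribute "shooting date and time" from the related information


def rank_by_date_proximity(selected_shot_at: datetime,
                           related: List[RelatedPhoto]) -> List[RelatedPhoto]:
    # Smaller time difference -> higher priority (earlier in the returned list).
    return sorted(related,
                  key=lambda p: abs((p.shot_at - selected_shot_at).total_seconds()))


if __name__ == "__main__":
    selected = datetime(2012, 7, 1, 10, 0)
    photos = [
        RelatedPhoto("IMG_0101", datetime(2012, 6, 30, 18, 30)),
        RelatedPhoto("IMG_0102", datetime(2012, 7, 1, 10, 5)),
        RelatedPhoto("IMG_0103", datetime(2011, 12, 24, 9, 0)),
    ]
    for priority, photo in enumerate(rank_by_date_proximity(selected, photos), start=1):
        print(priority, photo.name)
```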
 It can be presumed that the user wishes to select related photographs one after another. Photographs with close shooting dates and times are considered to be more strongly related, so after the previously selected "photograph", an "other photograph" related to it, that is, one with a close shooting date and time, is likely to be selected next.
 Therefore, by arranging the icons so that those with closer shooting dates and times, that is, those more likely to be selected, are displayed nearer to the completion position of the preceding contact operation, as in the present invention, the user will, with high probability, select a desired icon displayed near that completion position. As a result, the annoyance of a selection operation in which the indicator must be moved unnaturally from place to place on the screen can, with high probability, be avoided.
 Note that the icon rank determination unit 31 may determine the likelihood of selection not from a single attribute but by comprehensively using a plurality of attributes (for example, "shooting date and time", "photographer", "camera model", and "photograph title"). For example, the icon rank determination unit 31 may compare a plurality of attributes of the selected object with a plurality of attributes of each related item to obtain an overall similarity, and judge that items with higher similarity are more likely to be selected. Here, a related item whose attributes are more similar to those of the selected object is considered more likely to have its icon selected by the user, and is therefore given a higher priority.
 Alternatively, assume, for example, that the selected object is a tool that "displays a list of video contents" and that a plurality of "video contents" are arranged as icons as related items. In this case, information such as "recommendation degree", "genre", and "performer name" is further associated with each related item as attributes and stored in the related information storage unit 42. Here, the "recommendation degree" is information indicating how strongly viewing is recommended to the user, and is determined in advance based on the user's preference information, viewing history, and the like, or by the provider of the video content.
 The icon rank determination unit 31 may therefore rank the related items (video contents) so that those with a higher "recommendation degree" receive a higher priority. Accordingly, for the "video contents" extracted as related items, the icon arrangement determination unit 33 decides to place icons with a higher "recommendation degree" nearer to the end point of the trajectory. Here, a related item with a higher recommendation degree is considered more likely to have its icon selected by the user, and is therefore given a higher priority.
 By thus arranging the icons so that those with a higher recommendation degree, that is, those more likely to be selected, are displayed nearer to the completion position of the preceding contact operation, the user will, with high probability, select a desired icon displayed near that completion position. As a result, the annoyance of a selection operation in which the indicator must be moved unnaturally from place to place on the screen can, with high probability, be avoided.
 The icon rank determination unit 31 may also compare preset user preference information with the attributes of each related item (such as "genre" and "performer name") to obtain an overall similarity to the user's preferences, and judge that items with higher similarity are more likely to be selected. Here, a related item whose attributes are closer to the user's preferences is considered more likely to have its icon selected by the user, and is therefore given a higher priority.
 Alternatively, a "selection date and time" indicating when the item was most recently selected may be stored in the related information as one of the attributes of a related item. The icon rank determination unit 31 may refer to the attribute "selection date and time" and give a higher priority to items whose icon was last selected more recently. Here, a related item that the user has recently selected with interest is considered more likely to have its icon selected by the user, and is therefore given a higher priority. Conversely, a higher priority may be given to items whose icon was last selected longer ago. With this configuration, related items that have not been selected recently can be placed closer to the user's fingertip, which can raise the probability of their being selected by the user.
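 The following is a minimal sketch of how several such signals might be combined into a single likelihood score. The weights, field names, and the linear recency decay are assumptions introduced for illustration; the embodiments only state that multiple attributes may be used comprehensively, not how they are weighted.

```python
from datetime import datetime, timedelta


def selection_score(item: dict, selected: dict, now: datetime,
                    w_genre: float = 1.0, w_performer: float = 1.0,
                    w_recency: float = 0.5) -> float:
    """Higher score = assumed higher likelihood of being selected next."""
    score = 0.0
    if item.get("genre") == selected.get("genre"):
        score += w_genre
    if item.get("performer") == selected.get("performer"):
        score += w_performer
    # Recency term: a very recently selected item scores highest,
    # decaying linearly to zero over one week.
    last = item.get("last_selected")
    if last is not None:
        age_days = (now - last).total_seconds() / 86400.0
        score += w_recency * max(0.0, 1.0 - age_days / 7.0)
    return score


if __name__ == "__main__":
    now = datetime(2012, 7, 20)
    selected = {"genre": "drama", "performer": "A"}
    items = [
        {"title": "clip1", "genre": "drama", "performer": "B",
         "last_selected": now - timedelta(days=1)},
        {"title": "clip2", "genre": "comedy", "performer": "A", "last_selected": None},
    ]
    ranked = sorted(items, key=lambda it: selection_score(it, selected, now), reverse=True)
    for priority, it in enumerate(ranked, start=1):
        print(priority, it["title"])
```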
 (Inferring the user's usage situation)
 As shown in (a) and (b) of FIG. 4, when the tablet terminal 100 is a small portable terminal that can be operated with either one hand or both hands, the region of the screen that the user's finger can reach is expected to differ between one-handed and two-handed operation. As shown in (b) of FIG. 4, when operating with both hands, the user can touch any region of the touch panel. On the other hand, as shown in (a) of FIG. 4, when operating with one hand, the contact position tends to be biased toward the lower-left region of the screen (when operating with the left hand) or the lower-right region of the screen (when operating with the right hand). When the user is using the tablet terminal 100 in such a situation, displaying an object or icon that requires a contact operation at the top of the screen, or at the bottom of the screen on the side opposite the operating hand, makes the operation cumbersome. This is because the user cannot touch the target object immediately and must either perform the extra action of dragging it into the reachable region or switch to two-handed operation.
 The tablet terminal 100 of the present invention therefore solves this problem by inferring the user's usage situation. Specifically, the tablet terminal 100 of the present invention can be configured to detect a bias in the finger contact positions and to place the icons within the region that the user's finger is presumed to reach immediately.
 In this modification, the contact information generation unit 21 of the tablet terminal 100 generates contact information indicating the user's contact operations that occurred during a predetermined period (for example, several seconds to several minutes), regardless of contact/non-contact switching and regardless of whether the contact operation is an "enclosing" gesture, and stores it in the contact information storage unit 44.
 FIG. 25 is a diagram for explaining the operation of the tablet terminal 100 of the present invention, which can present an operation screen in accordance with the user's usage situation. More specifically, (a) of FIG. 25 illustrates an example of a situation in which the user is operating with the left hand, and (b) of FIG. 25 shows a specific example of the contact information generated in response to the contact operation of (a).
 As shown in (a) of FIG. 25, assume, for example, that the user performs a contact operation of dragging the target object into the reachable region by flicking with the thumb and then enclosing it.
 The contact information generation unit 21 generates contact information such as that shown in (b) of FIG. 25 for the above series of contact operations over a predetermined period (for example, the past several seconds to several minutes) and stores it in the contact information storage unit 44. If the memory capacity of the contact information storage unit 44 that stores the trajectories is limited, the contact information generation unit 21 may be configured to delete the oldest trajectory each time a new trajectory is stored.
 Here, the gesture determination unit 25 determines that a gesture for selecting an object (for example, "enclosing") has occurred. The operation screen processing unit 24 then refers back to the contact information for the past several seconds to several minutes stored in the contact information storage unit 44 and detects whether the finger contact positions are biased. In the example shown in (a) and (b) of FIG. 25, the finger trajectories are biased toward the region 82 at the lower left of the screen. The operation screen processing unit 24 detects this bias and identifies the region 82 at the lower left of the screen as the user's reachable region. The region 82 at the lower left of the screen and the region 83 at the lower right of the screen are assumed to be defined in advance.
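 A possible way to detect such a bias is sketched below, assuming the recent contact points are available as screen coordinates. The two rectangular regions, the 80% threshold, and the function names are assumptions for illustration; the description only requires that the biased region be identified against predefined regions such as 82 and 83.

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (x, y) in screen coordinates, origin at the top left

# Hypothetical predefined regions as fractions of the screen (x0, y0, x1, y1),
# analogous to region 82 (lower left) and region 83 (lower right).
LOWER_LEFT = (0.0, 0.5, 0.5, 1.0)
LOWER_RIGHT = (0.5, 0.5, 1.0, 1.0)


def _inside(p: Point, region: Tuple[float, float, float, float],
            width: float, height: float) -> bool:
    x0, y0, x1, y1 = region
    return x0 * width <= p[0] <= x1 * width and y0 * height <= p[1] <= y1 * height


def reachable_region(recent_points: List[Point], width: float, height: float,
                     threshold: float = 0.8) -> Optional[str]:
    """Return the region containing at least `threshold` of the recent contact points."""
    if not recent_points:
        return None
    for name, region in (("lower_left", LOWER_LEFT), ("lower_right", LOWER_RIGHT)):
        hits = sum(_inside(p, region, width, height) for p in recent_points)
        if hits / len(recent_points) >= threshold:
            return name
    return None


if __name__ == "__main__":
    pts = [(80, 900), (120, 950), (60, 870), (150, 990)]  # thumb activity near the lower left
    print(reachable_region(pts, width=720, height=1080))  # -> "lower_left"
```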
 The ring shape determination unit 30 of the operation screen processing unit 24 determines the shape, size, and placement position of the ring so that the ring on which the icons are to be arranged fits within the region 82 at the lower left of the screen.
 FIG. 26 is a diagram showing an example of the operation screen when the icons are arranged according to the icon arrangement pattern (or ring shape) determined by the arrangement pattern determination unit (or ring shape determination unit 30). As shown in FIG. 26, the related items of the selected object 80 are displayed so as to fit within the region 82 at the lower left of the screen, so the user does not need to drag the target icon closer with the thumb and can immediately select the next icon.
 Furthermore, the operation screen processing unit 24 places each extracted icon nearer to the end point of the trajectory of the contact operation used to select the object the more likely that icon is to be selected next by the user. The user can therefore, with high probability, avoid the operational annoyance of moving the finger unnaturally around the screen. Since one-handed operation offers poorer operability than two-handed operation, being able to avoid this annoyance in the manner described above is particularly effective in improving operability during one-handed operation.
 Note that, in order to determine that the user is operating with one hand, the tablet terminal 100 may judge from the line thickness of the finger trajectory whether the operation is being performed with the thumb; if it determines that the thumb is being used, it may conclude that the terminal is being operated with one hand and display the icons at the bottom of the screen. Alternatively, a sensor may be provided in the housing of the tablet terminal 100 so that the tablet terminal 100 can determine whether the housing is being gripped with four fingers or with five fingers and judge one-handed or two-handed operation accordingly.
 The operation screen processing unit 24 of the tablet terminal 100 may also refer to the contact coordinate information of the area enclosed by the finger trajectory to identify the trajectory region, identify that region and its vicinity as the user's reachable region, and decide to place the ring of icons there.
 (Adjusting the number of displayed icons)
 If the icon arrangement determination unit 33 judges, taking into account the size of the determined ring shape and the size of the icons, that not all of the icons extracted by the related item extraction unit 23 can be displayed, it may decide to reduce the number of icons to be displayed.
 The icon arrangement determination unit 33 may also refer to the contact information and determine the number of icons to display based on the absolute size of the finger trajectory (or of the enclosed area). This allows the user to intentionally adjust the number of icons displayed next by choosing whether to enclose the object with a smaller ring or a larger ring.
 When reducing the number of icons to be displayed, the icon arrangement determination unit 33 can remove icons in order starting from the one with the lowest priority determined by the icon rank determination unit 31.
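 A minimal sketch of this capacity check and truncation is shown below, assuming an elliptical ring with semi-axes a and b and a fixed icon size. The gap value and the use of Ramanujan's circumference approximation are choices made here for illustration, not details taken from the embodiments.

```python
import math
from typing import List


def max_icons_on_ring(a: float, b: float, icon_size: float, gap: float = 8.0) -> int:
    """Approximate how many icons of width `icon_size` fit on an ellipse with semi-axes a, b."""
    # Ramanujan's approximation of the ellipse circumference.
    h = ((a - b) ** 2) / ((a + b) ** 2)
    circumference = math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
    return max(1, int(circumference // (icon_size + gap)))


def trim_to_ring(icons_by_priority: List[str], a: float, b: float, icon_size: float) -> List[str]:
    """Keep the highest-priority icons that fit; drop the rest from the low-priority end."""
    limit = max_icons_on_ring(a, b, icon_size)
    return icons_by_priority[:limit]


if __name__ == "__main__":
    icons = [f"icon{i}" for i in range(1, 13)]  # already sorted, priority 1 first
    print(trim_to_ring(icons, a=160.0, b=100.0, icon_size=64.0))
```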
 (Method of associating icon placement positions with priorities)
 In each of the embodiments described above, when associating priorities with the several icon placement positions defined in the predetermined icon arrangement pattern, the icon arrangement determination unit 33 associates higher priorities with placement positions in order of increasing distance from the end point tn of the trajectory.
 However, the configuration of the information processing apparatus (tablet terminal 100) of the present invention is not limited to this. The icon arrangement determination unit 33 may associate icon placement positions with priorities by the methods described below.
 (a) to (c) of FIG. 27 are diagrams showing specific examples of the association between icon placement positions and priorities performed by the icon arrangement determination unit 33.
 Assume that the end point of the finger trajectory when the object was selected is at the position of the end point tn shown in FIG. 27, and that the predetermined icon arrangement pattern referred to by the icon arrangement determination unit 33 places eight icons at equal intervals on the contour of an elliptical ring, as shown in (a) of FIG. 9.
 In the example shown in (a) of FIG. 27, the icon arrangement determination unit 33 identifies the point P on the contour of the ring that is closest to the end point tn of the trajectory. The icon arrangement determination unit 33 then associates higher priorities with placement positions for which the distance traced along the contour of the ring from the point P to the icon placement positions A to H ((a) of FIG. 9) is shorter.
 That is, as shown in (a) of FIG. 27, the icon arrangement determination unit 33 associates priority 8 with icon placement position A, 6 with B, 4 with C, 2 with D, 1 with E, 3 with F, 5 with G, and 7 with H.
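 The following sketch illustrates one way to order placement positions by their distance along an elliptical contour from the point P, using a simple polygonal approximation of the arc length. The slot angles, step count, and function names are assumptions made for this example only.

```python
import math
from typing import Dict, List, Tuple


def ellipse_point(a: float, b: float, theta: float) -> Tuple[float, float]:
    return (a * math.cos(theta), b * math.sin(theta))


def arc_distance(a: float, b: float, theta_from: float, theta_to: float,
                 steps: int = 2000) -> float:
    """Shortest distance along the ellipse between two parameter angles (numeric approximation)."""
    def one_way(t0: float, t1: float) -> float:
        span = (t1 - t0) % (2 * math.pi)
        length, prev = 0.0, ellipse_point(a, b, t0)
        for i in range(1, steps + 1):
            cur = ellipse_point(a, b, t0 + span * i / steps)
            length += math.dist(prev, cur)
            prev = cur
        return length
    return min(one_way(theta_from, theta_to), one_way(theta_to, theta_from))


def priorities_by_contour_distance(a: float, b: float,
                                   slots: Dict[str, float], theta_p: float) -> List[str]:
    """Order slot names so that index 0 (priority 1) is nearest to P along the contour."""
    return sorted(slots, key=lambda name: arc_distance(a, b, theta_p, slots[name]))


if __name__ == "__main__":
    # Eight equally spaced slots A..H, and P taken at the angle nearest the trajectory end point.
    slots = {name: i * math.pi / 4 for i, name in enumerate("ABCDEFGH")}
    print(priorities_by_contour_distance(a=160.0, b=100.0, slots=slots, theta_p=4.2))
```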
 This configuration is advantageous when the tablet terminal 100 displays an operation screen on which the ring of icons can be rotated clockwise and counterclockwise by the user's drag operation, as shown in (a) and (b) of FIG. 28.
 Specifically, assume that after selecting an object at the end point tn, the user rotates the ring of icons displayed in response by a drag operation to pull the target icon toward the position of his or her fingertip (that is, near the end point tn). In this case, an icon at a position whose distance traced along the contour from the end point tn is shorter can be pulled in with fewer drag (rotation) operations.
 In other words, if icons that are more likely to be selected by the user are placed at positions whose distance along the contour is shorter, the user can, with high probability, pull in the target icon with fewer drag operations.
 The user can therefore reach the desired final result with fewer operations. That is, the annoyance of the selection operation can be avoided.
 Alternatively, when the icons are arranged in an ellipse, it can be expected that, when searching for and selecting the target icon, the user will naturally run a finger along the elliptical contour. The icon arrangement determination unit 33 therefore decides to place icons with higher priority at icon placement positions whose distance traced along the contour from the end point tn is shorter.
 In this way, icons that are more likely to be selected by the user can be placed at icon placement positions whose distance along the contour from the end point tn is shorter. The user can therefore run a finger from the position where the object selection ended and, with high probability, reach the target icon over a shorter distance.
 The user thus no longer needs to move the operating body unnaturally from place to place on the screen and can reach the desired final result with a simple contact operation. That is, the annoyance of the selection operation can be avoided. In addition, the shorter the movement distance of the operating body, the more the inducement of erroneous operations can be suppressed.
 In the example shown in (b) of FIG. 27, the icon arrangement determination unit 33 associates the highest priority with the placement position closest to the end point tn of the trajectory among the predetermined icon placement positions A to H. As shown in the figure, priority 1 is associated with icon placement position E. Then, proceeding counterclockwise from E (priority 1), priorities 2 and below are associated with the placement positions in order. That is, as shown in (b) of FIG. 27, priority 2 is associated with D, 3 with C, 4 with B, 5 with A, 6 with H, 7 with G, and 8 with F.
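 A short sketch of this "nearest slot first, then walk around the ring in one fixed direction" assignment is given below. The helper name, the slot labels, and the mapping of step=-1/step=+1 onto the counterclockwise case of (b) and the clockwise case of (c) are assumptions for illustration; only the resulting priority tables match the figures.

```python
from typing import Dict, List


def assign_from_nearest(slot_names: List[str], nearest: str, step: int = -1) -> Dict[str, int]:
    """Give priority 1 to the slot nearest the trajectory end point, then assign 2, 3, ...
    by walking through the slots in one fixed direction around the ring."""
    start = slot_names.index(nearest)
    n = len(slot_names)
    return {slot_names[(start + step * i) % n]: i + 1 for i in range(n)}


if __name__ == "__main__":
    slots = list("ABCDEFGH")
    print(assign_from_nearest(slots, "E", step=-1))  # E=1, D=2, ..., F=8 (cf. (b) of FIG. 27)
    print(assign_from_nearest(slots, "E", step=+1))  # E=1, F=2, ..., D=8 (cf. (c) of FIG. 27)
```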
 This configuration is advantageous when the tablet terminal 100 displays an operation screen on which the ring of icons can be rotated clockwise by the user's drag operation, as shown in (a) of FIG. 28.
 Specifically, assume that after selecting an object at the end point tn, the user rotates the ring of icons displayed in response clockwise by a drag operation to pull the target icon toward the position of his or her fingertip (that is, near the end point tn). In this case, as the ring rotates clockwise, the icons shown in (b) of FIG. 27 are pulled toward the user's fingertip (near the end point tn) in counterclockwise order, starting from the icon with priority 1.
 That is, the order in which the icons are pulled in matches the priority order shown in (b) of FIG. 27 (the order of likelihood of being selected by the user). The user can therefore, with high probability, pull in the target icon with fewer drag operations.
 In the example shown in (c) of FIG. 27, the icon arrangement determination unit 33 associates priority 1 with icon placement position E, as in the example shown in (b) of FIG. 27. Then, proceeding clockwise from E (priority 1), priorities 2 and below are associated with the placement positions in order. That is, as shown in (c) of FIG. 27, priority 2 is associated with F, 3 with G, 4 with H, 5 with A, 6 with B, 7 with C, and 8 with D.
 This configuration is advantageous when the tablet terminal 100 displays an operation screen on which the ring of icons can be rotated counterclockwise by the user's drag operation, as shown in (b) of FIG. 28.
 Specifically, assume that after selecting an object at the end point tn, the user rotates the ring of icons displayed in response counterclockwise by a drag operation to pull the target icon toward the position of his or her fingertip (that is, near the end point tn). In this case, as the ring rotates counterclockwise, the icons shown in (c) of FIG. 27 are pulled toward the user's fingertip (near the end point tn) in clockwise order, starting from the icon with priority 1.
 That is, the order in which the icons are pulled in matches the priority order shown in (c) of FIG. 27 (the order of likelihood of being selected by the user). The user can therefore, with high probability, pull in the target icon with fewer drag operations.
 (Application to a tablet PC)
 In each of the embodiments described above, the tablet terminal 100 as the information processing apparatus of the present invention has been assumed to be a small, highly portable smartphone that can be operated with one hand. However, the information processing apparatus of the present invention is not limited to a small smartphone and can also be applied to a tablet PC having a notebook-sized screen, an electronic blackboard having an even larger screen, and the like.
 Tablet PCs, electronic blackboards, and the like have larger display screens than smartphones. To select a desired object or icon, the user must therefore move the indicator (a finger, a pen, or the like) over a greater distance than when operating a smartphone.
 According to the information processing apparatus of the present invention, the movement of the indicator required to select a desired object or icon can be made smaller. In other words, using the information processing apparatus of the present invention in a tablet PC with a large display screen, an electronic blackboard, or the like allows the advantage of the present invention, namely realizing excellent operability, to be enjoyed to an even greater extent.
 FIG. 29 and FIG. 31 are diagrams showing the tablet terminal 100a as the information processing apparatus of the present invention implemented as a tablet PC.
 In FIG. 30 and FIG. 32, (a) is a diagram showing a specific example of the icon arrangement pattern on which the trajectory of the indicator and the end point tn of the trajectory have been plotted by the icon arrangement determination unit 33, and (b) is a diagram showing the result of the icon arrangement determination unit 33 associating a priority with each icon placement position.
 As shown in FIG. 29 and FIG. 31, assume, for example, that two photograph objects are displayed filling the screen of the display unit 12 of the tablet terminal 100a, and that the user selects one of them by moving the indicator. Specifically, assume that the user performs the contact operation by moving a finger (the indicator) so as to enclose the target photograph object.
 First, in the example shown in FIG. 29, the user draws a circle enclosing the target photograph object, starting from around its lower right and finishing the circle around the lower right as well.
 The icon arrangement determination unit 33 then acquires the finger trajectory and its end point tn obtained by the contact information generation unit 21, and plots them on the predetermined icon arrangement pattern as shown in (a) of FIG. 30.
 Next, the icon arrangement determination unit 33 associates priorities 1 to 8 with the icon placement positions A to H defined in the pattern, based on the end point tn and in accordance with a predetermined rule (for example, the rule that higher priorities are associated in order of increasing straight-line distance from the end point tn). When the icon arrangement determination unit 33 associates priorities with the icon placement positions according to this rule, the result is as shown in (b) of FIG. 30. The number assigned to each icon placement position represents the associated priority.
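 A minimal sketch of this straight-line-distance rule follows. The slot coordinates and the end point used in the example are invented for illustration and do not reproduce the actual geometry of FIG. 30.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def assign_by_euclidean_distance(slots: Dict[str, Point],
                                 end_point: Point) -> List[Tuple[str, int]]:
    """Associate priorities 1, 2, ... with placement positions in order of
    increasing straight-line distance from the trajectory end point tn."""
    ordered = sorted(slots, key=lambda name: math.dist(slots[name], end_point))
    return [(name, priority) for priority, name in enumerate(ordered, start=1)]


if __name__ == "__main__":
    # Hypothetical coordinates for slots A..H on an elliptical ring, with the end point
    # near the lower right of the enclosed object (cf. FIG. 29 / FIG. 30).
    slots = {
        "A": (400, 100), "B": (540, 160), "C": (600, 300), "D": (540, 440),
        "E": (400, 500), "F": (260, 440), "G": (200, 300), "H": (260, 160),
    }
    tn = (430, 470)
    print(assign_by_euclidean_distance(slots, tn))
```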
 The operation screen processing unit 24 then, in accordance with the determination by the icon arrangement determination unit 33, places each icon at the placement position whose priority matches the priority assigned to that icon in advance by the icon rank determination unit 31.
 As a result, icons with higher priority (those more likely to be selected by the user) are displayed near the end point tn, where the fingertip is presumed to be located. Even on the relatively large screen of the tablet terminal 100a, the user can therefore, with high probability, move on to the operation of selecting the target icon without moving the finger a great distance.
 On the other hand, the way the contact operation is performed varies from user to user. For example, as shown in FIG. 31, a user may draw a circle enclosing the target photograph object starting from around its upper left and finish the circle around the upper left as well.
 In this case, the icon arrangement determination unit 33 plots the finger trajectory and its end point tn on the icon arrangement pattern as shown in (a) of FIG. 32.
 Next, in accordance with the above rule, the icon arrangement determination unit 33 associates higher priorities with icon placement positions whose straight-line distance from the end point tn is shorter. When the icon arrangement determination unit 33 associates priorities with the icon placement positions according to this rule, the result is as shown in (b) of FIG. 32.
 As a result, icons with higher priority (those more likely to be selected by the user) are again displayed near the end point tn, where the fingertip is presumed to be located. Even on the large screen of the tablet terminal 100a, the user can therefore, with high probability, move on to the operation of selecting the target icon without moving the finger a great distance.
 As described above, in the tablet terminal (information processing apparatus) 100a with a relatively large screen, the movement distance of the indicator when selecting an object is also long, so the position of the end point (fingertip) can end up anywhere on the screen. The icon the user wants to select next may then be displayed far from the position of the fingertip. In that case, the user would have to move the indicator unnaturally from place to place on the screen, and operability would suffer. The larger the screen of the information processing apparatus, the more cumbersome the operation of moving the indicator far away becomes, and the more serious this problem is.
 According to the tablet terminal 100a of the present invention, however, as described above, wherever on the screen the end point (fingertip) ends up, icons that are more likely to be selected by the user are displayed at positions closer to that end point, so the user can, with high probability, select the target icon without moving the finger very far. For example, as in the examples above, even when enclosing the same photograph object, the way of enclosing it varies from user to user, and correspondingly different results are obtained: for the contact operation shown in FIG. 29, the highest priority is associated with icon placement position E, as shown in FIG. 30, whereas for the contact operation shown in FIG. 31, the highest priority is associated with icon placement position G, as shown in FIG. 32. Thus, the larger the screen of the information processing apparatus, the more the final result differs depending on how the user encloses the object, and the more effectively the serious problem described above can be resolved.
 As described above, according to the tablet terminal 100a of the present invention, the user no longer needs to move the indicator unnaturally from place to place on the screen and can reach the desired final result with a simple contact operation. In addition, the shorter the movement distance of the indicator, the more the inducement of erroneous operations can be suppressed. As a result, excellent operability is realized. As noted above, the effect obtained is particularly large when the information processing apparatus of the present invention is used in a tablet PC with a large display screen, an electronic blackboard, or the like.
 (Modification of the icon arrangement pattern)
 In each of the embodiments described above, the icon arrangement determination unit 33 recognizes the icon placement positions based on a predetermined icon arrangement pattern (for example, FIG. 9, and (a) of FIG. 30 and FIG. 32). However, the configuration is not limited to this. The icon arrangement determination unit 33 may itself determine the icon arrangement pattern based on the finger trajectory acquired by the contact information generation unit 21 when the user performed the contact operation.
 FIG. 33 is a diagram showing an example of an icon arrangement pattern determined by the icon arrangement determination unit 33.
 In this modification, no predetermined icon arrangement pattern exists; only the shape and placement position of the ring have been determined by the ring shape determination unit 30.
 As shown in FIG. 33, the icon arrangement determination unit 33 first acquires the information on the ring shape and placement position determined by the ring shape determination unit 30 and plots the end point tn of the trajectory against it. Next, the icon arrangement determination unit 33 identifies the point P on the contour of the ring that is closest to the end point of the trajectory and determines that point to be the icon placement position with the highest priority (priority 1). The exact placement of the priority-1 icon is not particularly limited; for example, the position at which the center of the icon coincides with the point P may be determined to be the priority-1 icon placement position.
 The icon arrangement determination unit 33 then determines the remaining icon placement positions with reference to the previously determined priority-1 icon placement position. For example, when there are seven more icons to be placed, the remaining icon placement positions A to G are determined, as shown in FIG. 33, so that all eight positions are spaced at equal intervals along the contour, taking the priority-1 icon placement position as the reference. The association of the remaining icon placement positions A to G with priorities may then be performed in accordance with any of the rules described above.
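 One possible sketch of this placement is shown below for an elliptical ring. For simplicity the slots are spaced at equal parameter angles starting from the point nearest the end point, which is exactly equal spacing only for a circle; a real implementation might space them by arc length instead. The function name, the coarse angular search, and the example numbers are assumptions.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]


def place_icons_from_p(center: Point, a: float, b: float,
                       end_point: Point, n_icons: int) -> List[Point]:
    """Place the priority-1 icon at the ring point nearest the trajectory end point,
    then distribute the remaining icons around the ring starting from that point."""
    cx, cy = center
    # Parameter angle whose ellipse point is nearest to the end point (coarse search).
    theta_p = min((i * 2 * math.pi / 3600 for i in range(3600)),
                  key=lambda t: math.dist((cx + a * math.cos(t), cy + b * math.sin(t)),
                                          end_point))
    return [(cx + a * math.cos(theta_p + k * 2 * math.pi / n_icons),
             cy + b * math.sin(theta_p + k * 2 * math.pi / n_icons))
            for k in range(n_icons)]


if __name__ == "__main__":
    positions = place_icons_from_p(center=(360, 300), a=160, b=100,
                                   end_point=(520, 350), n_icons=8)
    print(positions[0])  # priority-1 slot, nearest the end point
```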
 With this configuration, the operation screen processing unit 24 can always place the priority-1 icon at the point on the contour of the ring that is at the shortest distance from the end point tn.
 Alternatively, the icon arrangement determination unit 33 may determine the placement position of the icon with the highest priority (priority 1) as follows.
 Specifically, as shown in (a) of FIG. 30, the icon arrangement determination unit 33 first acquires, from the trajectory, the point tn-1 in addition to the end point tn. The icon arrangement determination unit 33 then identifies the position at which the extension of the straight line extending from the point tn-1 to the end point tn first intersects the contour of the ring (here, the dashed ellipse), that is, the intersection point Q. The icon arrangement determination unit 33 may then determine the intersection point Q to be the icon placement position with the highest priority (priority 1).
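 For an axis-aligned elliptical ring, the intersection point Q can be found by substituting the ray through tn-1 and tn into the ellipse equation and solving the resulting quadratic, as in the sketch below. The axis-aligned assumption, the function name, and the example coordinates are choices made here for illustration.

```python
import math
from typing import Optional, Tuple

Point = Tuple[float, float]


def ray_ellipse_intersection(tn_prev: Point, tn: Point, center: Point,
                             a: float, b: float) -> Optional[Point]:
    """First point Q where the ray from tn_prev through tn (extended beyond tn)
    crosses an axis-aligned ellipse with semi-axes a and b; None if it never does."""
    dx, dy = tn[0] - tn_prev[0], tn[1] - tn_prev[1]
    px, py = tn[0] - center[0], tn[1] - center[1]
    # Substitute p + t*d into (x/a)^2 + (y/b)^2 = 1 and solve the quadratic in t.
    A = (dx / a) ** 2 + (dy / b) ** 2
    B = 2 * (px * dx / a ** 2 + py * dy / b ** 2)
    C = (px / a) ** 2 + (py / b) ** 2 - 1
    disc = B * B - 4 * A * C
    if A == 0 or disc < 0:
        return None
    roots = [(-B - math.sqrt(disc)) / (2 * A), (-B + math.sqrt(disc)) / (2 * A)]
    t = min((r for r in roots if r >= 0), default=None)  # smallest non-negative t = first crossing
    if t is None:
        return None
    return (tn[0] + t * dx, tn[1] + t * dy)


if __name__ == "__main__":
    # End point inside the ring, finger moving to the right: Q lies on the right of the ellipse.
    print(ray_ellipse_intersection(tn_prev=(330, 300), tn=(350, 300),
                                   center=(360, 300), a=160, b=100))  # -> (520.0, 300.0)
```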
 This configuration is effective when the end point tn lies inside the ring. In that case, the intersection point Q can be regarded as the point on the contour of the ring that the operating body (a finger or pen) reaches most naturally and earliest when the user moves it in a circle so as to enclose the object.
 By placing the priority-1 icon at the position of the intersection point Q in this way, the priority-1 icon can be displayed at the position that the operating body reaches most naturally and earliest. As a result, the user can more easily select the priority-1 icon with the operating body in the next operation.
 (Application to an information processing apparatus in which the display unit and the input unit are separate)
 The configuration of the tablet terminal 100 as the information processing apparatus of the present invention, that is, the configuration in which icons with higher priority are arranged and displayed near the end point of the movement trajectory of the indicator, may also be applied to an information processing apparatus in which the input unit 11 and the display unit 12 are provided as separate bodies. For example, an information processing apparatus is conceivable in which the input unit 11 is an input device such as a mouse and the display unit 12 is a display device such as an LCD. Here, the cursor displayed on the display unit 12 is what indicates a position on the screen of the display unit 12, and the cursor moves when the user operates the mouse to perform an input operation. In such an embodiment, therefore, the mouse is the operating body, the cursor is the indicator, and the indicator displayed on the display unit 12 moves as the operating body moves. When the user selects an object by operating the mouse to move the cursor on the screen of the display unit 12, the information processing apparatus links the internally held cursor position to the movement of the mouse and retains the trajectory of that movement. Then, when arranging the icons at the predetermined positions, it places icons with higher priority (those more likely to be selected by the user) at placement positions close to the end point of that trajectory. With this configuration, an operation screen is obtained in which frequently selected icons are arranged near the position at which the user finished moving the mouse, so the user can move on to the next action of selecting an icon without a large mouse movement.
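 A minimal sketch of retaining such a cursor trajectory from mouse-move events is shown below; it is not tied to any particular GUI toolkit, and the class name, buffer size, and method names are assumptions made for this example.

```python
from collections import deque
from typing import Deque, Optional, Tuple

Point = Tuple[int, int]


class CursorTrajectory:
    """Keeps the recent cursor positions driven by mouse movement so that, once an
    object is selected, icons can be placed near the end point of the trajectory."""

    def __init__(self, max_points: int = 256) -> None:
        self._points: Deque[Point] = deque(maxlen=max_points)  # oldest points fall off first

    def on_mouse_move(self, x: int, y: int) -> None:
        self._points.append((x, y))

    def end_point(self) -> Optional[Point]:
        return self._points[-1] if self._points else None

    def clear(self) -> None:
        self._points.clear()


if __name__ == "__main__":
    traj = CursorTrajectory()
    for pos in [(100, 100), (180, 140), (260, 200)]:
        traj.on_mouse_move(*pos)
    print(traj.end_point())  # the icons would be arranged near this position
```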
 As the input unit 11 configured separately from the display unit 12, various input devices other than the mouse described above, such as a keyboard, a joystick, a digitizer, or a tablet with a stylus pen, may also be employed.
 The present invention is not limited to the embodiments described above, and various modifications are possible within the scope indicated in the claims. Embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention.
 The information processing apparatus and operation screen display method of the present invention can also be expressed as follows.
 An information processing apparatus provided with a touch panel, comprising: contact operation acquisition means for acquiring the trajectory of movement of an indicator that has moved over the touch panel; object specifying means for specifying, as the selected object, an object at least part of which overlaps a selection region identified based on the trajectory acquired by the contact operation acquisition means; related item extraction means for extracting, as related items, items associated with the object specified by the object specifying means, by referring to a related information storage unit that stores objects and items related to those objects in association with each other; and operation screen processing means for arranging the icons of the related items extracted by the related item extraction means at specific positions and displaying them on the touch panel, wherein the operation screen processing means arranges, among the extracted related items, the icons of related items with higher priority (for example, in descending order of likelihood of being selected by the user) nearer to the end point of the trajectory acquired by the contact operation acquisition means.
 An operation screen display method for an information processing apparatus provided with a touch panel, comprising: a contact operation acquisition step of acquiring the trajectory of movement of an indicator that has moved over the touch panel; an object specifying step of specifying, as the selected object, an object at least part of which overlaps a selection region identified based on the trajectory acquired in the contact operation acquisition step; a related item extraction step of extracting, as related items, items associated with the object specified in the object specifying step, by referring to a related information storage unit that stores objects and items related to those objects in association with each other; and an operation screen processing step of arranging the icons of the related items extracted in the related item extraction step at specific positions and displaying them on the touch panel, wherein, in the operation screen processing step, among the extracted related items, the icons of related items with higher priority (for example, in descending order of likelihood of being selected by the user) are arranged nearer to the end point of the trajectory acquired in the contact operation acquisition step.
 [Example of implementation by software]
 Finally, each block of the tablet terminal 100, in particular the contact information generation unit 21, the object specifying unit 22, the related item extraction unit 23, the operation screen processing unit 24, the gesture determination unit 25, the ring shape determination unit 30, the icon rank determination unit 31, the animation determination unit 32, and the icon arrangement determination unit 33, may be implemented with hardware logic, or may be realized by software using a CPU as follows.
 That is, the tablet terminal 100 includes a CPU (central processing unit) that executes the instructions of a control program realizing each function, a ROM (read-only memory) storing the program, a RAM (random access memory) into which the program is loaded, and a storage device (recording medium) such as a memory that stores the program and various data. The object of the present invention can also be achieved by supplying the tablet terminal 100 with a recording medium on which the program code (executable program, intermediate code program, or source program) of the control program of the tablet terminal 100, which is software realizing the functions described above, is recorded in a computer-readable manner, and by having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
 上記記録媒体としては、例えば、磁気テープやカセットテープ等のテープ系、フロッピー(登録商標)ディスク/ハードディスク等の磁気ディスクやCD-ROM/MO/MD/DVD/CD-R等の光ディスクを含むディスク系、ICカード(メモリカードを含む)/光カード等のカード系、あるいはマスクROM/EPROM/EEPROM/フラッシュROM等の半導体メモリ系などを用いることができる。 Examples of the recording medium include tapes such as magnetic tapes and cassette tapes, magnetic disks such as floppy (registered trademark) disks / hard disks, and disks including optical disks such as CD-ROM / MO / MD / DVD / CD-R. Card system such as IC card, IC card (including memory card) / optical card, or semiconductor memory system such as mask ROM / EPROM / EEPROM / flash ROM.
 また、タブレット端末100を通信ネットワークと接続可能に構成し、上記プログラムコードを、通信ネットワークを介して供給してもよい。この通信ネットワークとしては、特に限定されず、例えば、インターネット、イントラネット、エキストラネット、LAN、ISDN、VAN、CATV通信網、仮想専用網(virtual private network)、電話回線網、移動体通信網、衛星通信網等が利用可能である。また、通信ネットワークを構成する伝送媒体としては、特に限定されず、例えば、IEEE1394、USB、電力線搬送、ケーブルTV回線、電話線、ADSL回線等の有線でも、IrDAやリモコンのような赤外線、Bluetooth(登録商標)、802.11無線、HDR、携帯電話網、衛星回線、地上波デジタル網等の無線でも利用可能である。なお、本発明は、上記プログラムコードが電子的な伝送で具現化された、搬送波に埋め込まれたコンピュータデータ信号の形態でも実現され得る。 Further, the tablet terminal 100 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network. The communication network is not particularly limited. For example, the Internet, intranet, extranet, LAN, ISDN, VAN, CATV communication network, virtual private network, telephone line network, mobile communication network, satellite communication. A net or the like is available. Further, the transmission medium constituting the communication network is not particularly limited. For example, even in the case of wired such as IEEE 1394, USB, power line carrier, cable TV line, telephone line, ADSL line, etc., infrared rays such as IrDA and remote control, Bluetooth ( (Registered trademark), 802.11 wireless, HDR, mobile phone network, satellite line, terrestrial digital network, and the like can also be used. The present invention can also be realized in the form of a computer data signal embedded in a carrier wave in which the program code is embodied by electronic transmission.
 The present invention can be widely applied to information processing apparatuses that include an input unit and a display unit. For example, it can be suitably used for, but is not limited to, digital televisions, personal computers, smartphones, tablet PCs, notebook computers, mobile phones, PDAs (Personal Digital Assistants), electronic book readers, electronic dictionaries, portable and home game machines, and electronic blackboards that include an input unit and a display unit. Furthermore, applying the present invention to an information processing apparatus equipped with a touch panel achieves even better operability.
10 Control unit
11 Input unit (touch panel)
12 Display unit (touch panel)
13 Operation unit
14 External interface
15 Communication unit
16 Wireless communication unit
17 Audio output unit
18 Audio input unit
19 Storage unit
21 Contact information generation unit (trajectory acquisition means / contact operation acquisition means)
22 Object specifying unit (object specifying means)
23 Related item extraction unit (related item extraction means)
24 Operation screen processing unit (operation screen processing means)
25 Gesture determination unit (gesture determination means)
30 Ring shape determination unit (ring shape determination means / arrangement pattern determination means)
31 Icon rank determination unit (icon rank determination means)
32 Animation determination unit (animation determination means)
33 Icon arrangement determination unit (icon arrangement determination means)
41 Frame map storage unit
42 Related information storage unit
43 Icon storage unit
44 Contact information storage unit
100 Tablet terminal (information processing apparatus)

Claims (16)

  1.  An information processing apparatus comprising:
     trajectory acquisition means for acquiring a trajectory along which an indicator pointing to a position on a screen of a display unit has moved;
     object specifying means for specifying, as a selected object, an object at least a part of which overlaps a selection area specified on the basis of the trajectory acquired by the trajectory acquisition means;
     related item extraction means for extracting, as related items, items associated with the object specified by the object specifying means, by referring to a related information storage unit that stores objects and items related to those objects in association with each other; and
     operation screen processing means for arranging icons of the related items extracted by the related item extraction means at specific positions and displaying the icons on the display unit,
     wherein the operation screen processing means arranges, among the extracted related items, an icon of a related item having a higher priority closer to the end point of the trajectory acquired by the trajectory acquisition means.
  2.  The information processing apparatus according to claim 1, wherein
     the related information storage unit stores, for each related item, the number of times the icon of the related item has been selected by the user, and
     the operation screen processing means arranges an icon of a related item that has been selected a larger number of times closer to the end point of the trajectory.
  3.  The information processing apparatus according to claim 1, wherein
     the related information storage unit stores an attribute of each related item, and
     the operation screen processing means arranges an icon of a related item having an attribute more similar to an attribute of the selected object closer to the end point of the trajectory.
  4.  The information processing apparatus according to claim 3, wherein
     the selected object and the related items are photographs, the related information storage unit stores the shooting date and time of each photograph as an attribute, and
     the operation screen processing means arranges an icon of a photograph whose shooting date and time is closer to that of the photograph that is the selected object closer to the end point of the trajectory.
  5.  The information processing apparatus according to claim 1, wherein
     the related items are video contents, the related information storage unit stores, as an attribute, a recommendation degree indicating how strongly each video content is recommended to the user, and
     the operation screen processing means arranges an icon of a video content having a higher recommendation degree closer to the end point of the trajectory.
  6.  The information processing apparatus according to any one of claims 1 to 5, wherein
     the trajectory acquisition means acquires a trajectory of movement of the indicator that has moved across the screen of the display unit so as to surround an object displayed on the display unit,
     the object specifying means specifies, as the selected object, an object at least a part of which is included in the area surrounded by the trajectory, and
     the operation screen processing means arranges the icons of the extracted related items side by side on the outline of a ring.
  7.  The information processing apparatus according to claim 6, wherein
     the operation screen processing means determines the position and size of the ring so that the icons are arranged around the selected object.
  8.  The information processing apparatus according to claim 6 or 7, wherein
     the operation screen processing means determines, as the shape of the ring, the trajectory acquired by the trajectory acquisition means, or a shape similar or approximate to that trajectory.
  9.  The information processing apparatus according to any one of claims 6 to 8, wherein
     the operation screen processing means arranges the icon of the related item having the highest priority at the point on the outline of the ring that is closest to the end point of the trajectory, and
     arranges the icons of the remaining related items on the outline of the ring with the position of the highest-priority icon as a reference.
  10.  The information processing apparatus according to any one of claims 1 to 9, wherein
     the trajectory acquisition means acquires a trajectory of the indicator produced during a predetermined period before the movement of the indicator that selects the object displayed on the display unit occurs, and
     the operation screen processing means, when determining that the trajectory acquired during the predetermined period is biased toward a specific area of the screen of the display unit, determines the positions of the icons so that the icons are arranged in that specific area.
  11.  The information processing apparatus according to any one of claims 1 to 9, wherein
     the operation screen processing means determines the positions of the icons so that the icons are arranged in a specific area of the screen of the display unit that overlaps the trajectory of movement of the indicator that selects the object.
  12.  The information processing apparatus according to any one of claims 1 to 11, wherein
     an input unit of the information processing apparatus and the display unit constitute a touch panel, and
     the trajectory acquisition means acquires a trajectory of movement of the indicator that has moved on the touch panel.
  13.  The information processing apparatus according to any one of claims 1 to 11, wherein
     an input unit of the information processing apparatus is used to input, to the information processing apparatus, an instruction to move a cursor displayed on the display unit, and
     the trajectory acquisition means acquires a trajectory of movement of the cursor serving as the indicator.
  14.  An operation screen display method for an information processing apparatus, comprising:
     a trajectory acquisition step of acquiring a trajectory along which an indicator pointing to a position on a screen of a display unit of the information processing apparatus has moved;
     an object specifying step of specifying, as a selected object, an object at least a part of which overlaps a selection area specified on the basis of the trajectory acquired in the trajectory acquisition step;
     a related item extraction step of extracting, as related items, items associated with the object specified in the object specifying step, by referring to a related information storage unit that stores objects and items related to those objects in association with each other; and
     an operation screen processing step of arranging icons of the related items extracted in the related item extraction step at specific positions and displaying the icons on the display unit,
     wherein, in the operation screen processing step, among the extracted related items, an icon of a related item having a higher priority is arranged closer to the end point of the trajectory acquired in the trajectory acquisition step.
  15.  A control program for causing a computer to function as each of the means of the information processing apparatus according to any one of claims 1 to 13.
  16.  A computer-readable recording medium on which the control program according to claim 15 is recorded.
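
The claims above describe the icon arrangement only in functional terms. As a purely illustrative aid, not part of the application text, the placement recited in claims 1 to 3 (higher-priority related items nearer the end point of the selection trajectory) might be sketched in Python as follows; all names, data structures, and the idea of pre-computed candidate positions are hypothetical.

    import math
    from dataclasses import dataclass

    @dataclass
    class RelatedItem:
        name: str
        priority: float  # e.g. past selection count or attribute similarity

    def assign_icon_positions(items, candidate_positions, trajectory_end):
        """Map each related item to a candidate position so that higher-priority
        items receive positions nearer to the trajectory end point."""
        # Order candidate positions by distance to the trajectory end point.
        positions = sorted(
            candidate_positions,
            key=lambda p: math.hypot(p[0] - trajectory_end[0],
                                     p[1] - trajectory_end[1]),
        )
        # Order items from highest to lowest priority.
        ranked = sorted(items, key=lambda item: item.priority, reverse=True)
        # Highest priority -> nearest position, next highest -> next nearest, ...
        # (extra items are simply left unplaced in this sketch).
        return {item.name: pos for item, pos in zip(ranked, positions)}

For example, calling assign_icon_positions with candidate slots laid out around the selected object and the last touch coordinate as trajectory_end yields a mapping in which the most frequently selected related item ends up closest to where the user's finger stopped.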
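
For the ring arrangement of claims 6 to 9, a minimal sketch could space the icons evenly along a ring placed around the selected object, starting at the point of the ring nearest the trajectory end point so that the highest-priority icon lands there and the remaining icons follow from that reference position. A circle is used here for simplicity, although claim 8 also allows a shape similar or approximate to the acquired trajectory; the function name and parameters are assumptions.

    import math

    def ring_icon_positions(center, radius, trajectory_end, icon_count):
        """Return icon_count (x, y) points on a circle of the given center and
        radius, starting at the point of the circle nearest the trajectory end."""
        # The point of a circle closest to an outside point lies on the line
        # from the circle's centre towards that point.
        start = math.atan2(trajectory_end[1] - center[1],
                           trajectory_end[0] - center[0])
        step = 2.0 * math.pi / icon_count
        return [
            (center[0] + radius * math.cos(start + k * step),
             center[1] + radius * math.sin(start + k * step))
            for k in range(icon_count)
        ]

The first position returned is the ring point nearest the trajectory end point, so feeding these positions, in order, to the priority-ranked items reproduces the behaviour of claim 9.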
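
The region bias of claims 10 and 11 can likewise be illustrated with a short, non-authoritative check: the trajectory points sampled before the selecting movement are examined, and if most of them fall in one region of the screen, the icons are placed in that region. The quadrant split and the 0.6 threshold below are assumptions, not values taken from the application.

    def dominant_region(points, screen_width, screen_height, threshold=0.6):
        """Return the screen quadrant containing at least `threshold` of the
        sampled trajectory points, or None if no quadrant dominates."""
        if not points:
            return None
        counts = {}
        for x, y in points:
            region = ("top" if y < screen_height / 2 else "bottom",
                      "left" if x < screen_width / 2 else "right")
            counts[region] = counts.get(region, 0) + 1
        region, count = max(counts.items(), key=lambda kv: kv[1])
        return region if count / len(points) >= threshold else None
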
PCT/JP2012/067526 2011-07-15 2012-07-10 Information processing device, operation screen display method, control program, and recording medium WO2013011863A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-157168 2011-07-15
JP2011157168A JP2013025410A (en) 2011-07-15 2011-07-15 Information processor, operation screen display method, control program, and recording medium

Publications (1)

Publication Number Publication Date
WO2013011863A1 true WO2013011863A1 (en) 2013-01-24

Family

ID=47558041

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/067526 WO2013011863A1 (en) 2011-07-15 2012-07-10 Information processing device, operation screen display method, control program, and recording medium

Country Status (2)

Country Link
JP (1) JP2013025410A (en)
WO (1) WO2013011863A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6003227B2 (en) * 2012-05-24 2016-10-05 株式会社ニコン Display device
JP6223755B2 (en) * 2013-09-06 2017-11-01 株式会社東芝 Method, electronic device, and program
JP6244879B2 (en) * 2013-12-19 2017-12-13 キヤノンマーケティングジャパン株式会社 Information processing system, information processing system program, information processing apparatus, information processing apparatus program, and operation screen generation method
US10852838B2 (en) 2014-06-14 2020-12-01 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
JP6365268B2 (en) * 2014-11-28 2018-08-01 コニカミノルタ株式会社 Display device, image forming apparatus, display method, and display program
JP2016193177A (en) * 2015-03-31 2016-11-17 富士フイルム株式会社 Radiation irradiation device
JP6755031B2 (en) * 2019-04-26 2020-09-16 富士通株式会社 Input support program, input support method and information processing device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006244353A (en) * 2005-03-07 2006-09-14 Konami Digital Entertainment:Kk Information processor, image movement instructing method and program
JP2008092585A (en) * 2007-10-16 2008-04-17 Fujifilm Corp Electronic camera
JP2009181501A (en) * 2008-01-31 2009-08-13 Toshiba Corp Mobile communication equipment
JP2010026710A (en) * 2008-07-17 2010-02-04 Sony Corp Information processor, information processing method, and information processing program
WO2010056483A1 (en) * 2008-11-13 2010-05-20 Qualcomm Incorporated Method and system for context dependent pop-up menus
JP2010267079A (en) * 2009-05-14 2010-11-25 Canon Inc Information processor, control method, and program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015153389A (en) * 2014-02-19 2015-08-24 富士ゼロックス株式会社 Information processing apparatus and information processing program
US10445511B2 (en) 2014-02-19 2019-10-15 Fuji Xerox Co., Ltd. Information processing apparatus, information processing method, and non-transitory computer readable medium
CN111831180A (en) * 2020-07-08 2020-10-27 维沃移动通信有限公司 Icon sorting method and device and electronic equipment

Also Published As

Publication number Publication date
JP2013025410A (en) 2013-02-04

Similar Documents

Publication Publication Date Title
JP5172997B2 (en) Information processing apparatus, operation screen display method, control program, and recording medium
US20230289008A1 (en) Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator
US20210191602A1 (en) Device, Method, and Graphical User Interface for Selecting User Interface Objects
JP5107453B1 (en) Information processing apparatus, operation screen display method, control program, and recording medium
US11675476B2 (en) User interfaces for widgets
WO2013011863A1 (en) Information processing device, operation screen display method, control program, and recording medium
US9626098B2 (en) Device, method, and graphical user interface for copying formatting attributes
US20190258373A1 (en) Scrollable set of content items with locking feature
US8525839B2 (en) Device, method, and graphical user interface for providing digital content products
CN110837329B (en) Method and electronic device for managing user interface
JP5669939B2 (en) Device, method and graphical user interface for user interface screen navigation
US8914743B2 (en) Device, method, and graphical user interface for navigating a list of identifiers
US20160004432A1 (en) Device, Method, and Graphical User Interface for Switching Between User Interfaces
US20110175826A1 (en) Automatically Displaying and Hiding an On-screen Keyboard
KR20140128208A (en) user terminal device and control method thereof
US11934640B2 (en) User interfaces for record labels
US20220174145A1 (en) Device, Method, and Graphical User Interface for Updating a Background for Home and Wake Screen User Interfaces
JP5173001B2 (en) Information processing apparatus, screen display method, control program, and recording medium
US11777881B2 (en) User interfaces and associated systems and processes for sharing portions of content items
US20240004532A1 (en) Interactions between an input device and an electronic device
US20220365632A1 (en) Interacting with notes user interfaces
JP2013190876A (en) Operation support device, operation support method, control program, data structure, and recording medium
US11379113B2 (en) Techniques for selecting text
CN110851068B (en) Method and electronic device for managing user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12815559

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12815559

Country of ref document: EP

Kind code of ref document: A1