WO2013011862A1 - Information processing device, operation screen display method, control program, and recording medium - Google Patents

Information processing device, operation screen display method, control program, and recording medium Download PDF

Info

Publication number
WO2013011862A1
WO2013011862A1 (PCT/JP2012/067525)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
icon
operation screen
information
processing apparatus
Prior art date
Application number
PCT/JP2012/067525
Other languages
French (fr)
Japanese (ja)
Inventor
正義 神原
信子 三浦
敬一 長谷川
晋吾 山下
英樹 西村
陽 柯
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社
Publication of WO2013011862A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to a user interface technology for an information processing apparatus that includes an input unit and a display unit.
  • A tablet terminal has a flat outer shape and includes a touch panel that serves as both display unit and input unit. By touching an object displayed on the touch panel with a finger, a pen, or the like, the user can perform various operations on the tablet terminal body.
  • Through the touch panel, the tablet terminal can discriminate among the various contact operations the user performs on the screen and can change the object display accordingly. Touch actions include, for example, tapping (striking lightly), flicking, pinching (closing or spreading two fingers), and dragging an object displayed on the screen with a finger or pen.
  • The tablet terminal discriminates among these contact actions and, according to the result, selects or moves objects, scrolls a list, enlarges or reduces an image, and so on.
  • By realizing such intuitive operation through the touch panel, the tablet terminal has won the support of many users.
  • Patent Document 1 discloses a mobile communication terminal including a touch-panel display unit. When the user traces around an object (a URL, e-mail address, character string, image, etc.) with a finger or pen so as to enclose it, the mobile communication terminal extracts a keyword from the selected object and accesses a related site.
  • Patent Document 2 discloses a portable device having a touch panel display.
  • The portable device of Patent Document 2 displays a through image (the live image seen by its camera) on the touch-panel display, detects a specific target in the through image when the user selects it by touching the area around it, and can display a reduced image of that target at the edge of the touch-panel display as a release button.
  • Patent Document 3 discloses a website search system using a touch panel.
  • When the website search system of Patent Document 3 receives input, it accepts the input as a search keyword and displays a first mother icon corresponding to the accepted keyword. The system then searches websites with a search engine according to the keyword and displays thumbnail images of the retrieved websites around the first mother icon.
  • Patent Document 4 discloses an information processing apparatus including a display panel having a contact sensor.
  • the information processing apparatus disclosed in Patent Document 4 detects rotation of the operating body (finger) by a predetermined angle or more while an object is selected, and displays operation items related to the object around the object.
  • Patent Document 5 discloses an information processing apparatus including a touch panel unit.
  • the information processing apparatus disclosed in Patent Document 5 acquires a trajectory of a user's touch position, specifies an object image selected by the trajectory, and moves the selected object image to a position corresponding to an end point of the trajectory.
  • Patent Documents 6 to 9 disclose information processing apparatuses that realize menu display for the purpose of improving user convenience and operability.
  • JP 2010-218322 A (published September 30, 2010); JP 2010-182023 A (published August 19, 2010); JP 2009-134738 A (published June 18, 2009); JP 2011-13980 A (published January 20, 2011); JP 2006-244353 A (published September 14, 2006); JP 8-305535 A (published November 22, 1996); JP 10-307664 A (published November 17, 1998); JP 11-507455 A (published June 29, 1999); JP 2001-265475 A (published September 28, 2001)
  • Good operability in a tablet terminal depends on whether the final result the user is aiming for can be displayed with simple contact operations and a small number of operations, and whether that result can be displayed from the contact operations in a natural flow that does not contradict the user's intuition.
  • Such an improvement in operability is realized by appropriately grasping the user's purpose, state, and tendencies.
  • In other words, the tablet terminal is required to "read" the user's intention from every point of view: what the user wants to do now, what the user wants to do next, how the user is currently operating, where the user's hand is, and whether the user's movement is natural.
  • The techniques of Patent Documents 1 to 9 are not necessarily sufficient for detecting the user's intention.
  • Patent Document 1 discloses that an object is selected by an operation of enclosing it, but it does not disclose extracting and displaying items related to the object in response to that operation.
  • Patent Document 2 discloses that an icon corresponding to an object is displayed by an operation of enclosing the object, but it does not disclose selecting the object by that operation and, along with the selection, extracting and displaying items related to the object.
  • Patent Document 3 discloses that, when an object is selected, thumbnails related to the object are displayed around it, but the selecting operation and the resulting display are not connected in the manner described above.
  • Patent Document 4 discloses that an object is touched to select it and that icons related to the object are displayed around it. However, in order to display the icons around the object, a complicated operation separate from the selection (such as pressing a finger against the touch surface and twisting it) must be performed, so the number of operations until the desired result (the icons) is displayed increases and the operation becomes very cumbersome.
  • The above operability problem is not limited to small tablet terminals with excellent portability. It occurs commonly in information processing apparatuses of any size that include a touch-panel display unit and input unit, and more generally in information processing apparatuses of every form, not limited to touch panels, that include a display unit and an input unit.
  • the present invention has been made in view of the above-described problems, and an object thereof is to realize excellent operability in an information processing apparatus including an input unit and a display unit.
  • In order to solve the above problems, the information processing apparatus of the present invention comprises: trajectory acquisition means for acquiring a trajectory along which an indicator indicating positions on the screen of the display unit has moved; object specifying means for specifying, as the selected object, an object at least a part of which is included in the region enclosed by the trajectory acquired by the trajectory acquisition means; related item extracting means for extracting, as related items, the items associated with the object specified by the object specifying means, with reference to a related information storage unit that stores objects and items related to those objects in association with each other; and operation screen processing means for arranging the icons of the related items extracted by the related item extracting means side by side on the outline of a ring and displaying them on the display unit.
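The enclosure-based selection described above can be sketched in code. This is a minimal illustration, not the patent's implementation: objects are assumed to be axis-aligned rectangles, the trajectory a list of (x, y) touch samples, and an object counts as selected when its centre or any corner falls inside the closed trajectory (a ray-casting point-in-polygon test).

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point pt inside the closed polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray at y
            xcross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xcross:
                inside = not inside
    return inside

def select_objects(trajectory, objects):
    """Return the objects whose centre or any corner lies inside the
    enclosing trajectory ('at least a part in the enclosed region')."""
    selected = []
    for obj in objects:
        x, y, w, h = obj["rect"]
        probes = [(x + w / 2, y + h / 2), (x, y), (x + w, y),
                  (x, y + h), (x + w, y + h)]
        if any(point_in_polygon(p, trajectory) for p in probes):
            selected.append(obj)
    return selected
```

For example, a roughly square stroke around a photo thumbnail would select it while leaving a distant memo object unselected.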
  • According to the above configuration, the trajectory acquisition means acquires the trajectory of the enclosing movement of the indicator, and based on this trajectory the object specifying means specifies the object that the user selected by the enclosing operation.
  • The related item extracting means then extracts the items related to the specified object. Since the related information storage unit stores each object in association with the items related to it, the related items extracted by the related item extracting means are items related to the object the user selected.
  • Finally, the operation screen processing means arranges the icons of the extracted related items in a ring shape, which is easily associated with the enclosing operation.
  • The operation screen generated with the icons arranged in this way is presented to the user.
  • In short, the information processing apparatus uses a very natural and simple user action, "enclosing" an object with an indicator (such as a pen or a finger), as the trigger for specifying the object.
  • An operation screen in which the icons of the items related to the selected object are arranged in the shape of a ring can then be provided to the user.
  • Because the trajectory of an enclosing gesture has a shape that wraps around something, the shape of the ring resembles the trajectory of the indicator's movement obtained from the enclosing operation. The result (icons arranged in a ring) is therefore readily associated with the enclosing operation the user just performed.
  • In this way, the information processing apparatus of the present invention can anticipate the related items that the user is likely to select next after selecting an object, and can display them so that the user can select one.
  • Moreover, every icon arranged on the operation screen by the operation screen processing means is the icon of a related item extracted as an item related to the object the user selected. In other words, because the items likely to be selected next are displayed as a ring of icons immediately after the object is selected, the user can immediately pick the desired icon out of the icon ring.
  • A menu list whose related item icons are arranged in a circle has the following advantage over a linear, one-dimensional menu list.
  • In a one-dimensional menu list, icons are arranged from top to bottom or from left to right, so priorities are unintentionally assigned to the icons according to their positions.
  • In a circular menu list, all of the icons arranged on the circle can be treated equally.
  • As a result, the information processing apparatus can display the final result the user desires in a natural flow that does not contradict the user's intuition, with simple operations and a small number of them. This has the effect of realizing excellent operability in an information processing apparatus including an input unit and a display unit.
  • Furthermore, it is preferable that the operation screen processing means determines the position and size of the ring so that the icons are arranged around the selected object.
  • According to the above configuration, the information processing apparatus can output an operation screen in which the icons are placed around the object, in response to the extremely natural and simple user action of "enclosing" the object.
  • That is, the user obtains an operation screen in which the icons of the related items are arranged so as to surround the object that he or she just enclosed and selected.
  • The positional relationship between these icons and the object matches the positional relationship between the object and the trajectory of the indicator's movement in the action the user just performed.
  • In addition, the movement trajectory of the indicator obtained by enclosing the object is similar in shape to the ring on which the icons are arranged.
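One hedged way to realize "position and size of the ring so that the icons surround the selected object" is to centre the ring on the object and derive the radius from the user's stroke, then space the icon slots evenly. The 1.2 margin factor and the 12 o'clock starting slot are assumptions for illustration only.

```python
import math

def ring_layout(obj_center, trajectory, n_icons, margin=1.2):
    """Place n_icons evenly on a circle around obj_center. The radius is
    taken from the farthest trajectory point (times a small margin) so
    that the ring roughly matches the stroke the user drew."""
    cx, cy = obj_center
    radius = margin * max(math.hypot(x - cx, y - cy) for x, y in trajectory)
    positions = []
    for i in range(n_icons):
        theta = 2 * math.pi * i / n_icons - math.pi / 2  # slot 0 at the top
        positions.append((cx + radius * math.cos(theta),
                          cy + radius * math.sin(theta)))
    return radius, positions
```

A caller would clamp the radius to the screen bounds before drawing; that step is omitted here.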
  • A menu list in which the icons of related items are arranged in a circle around an object has the following advantage over a linear, one-dimensional menu list.
  • In a one-dimensional menu list, icons are arranged from top to bottom or from left to right, so priorities are unintentionally assigned to the icons according to their positions.
  • In a circular menu list, all of the icons arranged on the circle can be treated equally.
  • Furthermore, even if a one-dimensional menu list is displayed near the previously selected object, it is difficult to express the relationship between the object and each icon.
  • When a circular menu list is displayed around the previously selected object, however, the user can naturally recognize that there is a relationship between the previously selected (enclosed) object and the icons surrounding it.
  • As a result, the information processing apparatus can display the final result the user desires in a natural flow that does not contradict the user's intuition, with simple operations and a small number of them. This has the effect of realizing excellent operability in an information processing apparatus including a touch panel.
  • Furthermore, it is preferable that the operation screen processing means determines, as the shape of the ring, the trajectory acquired by the trajectory acquisition means or a shape similar or approximate to it.
  • The user freely encloses the object with an arbitrary shape, and the trajectory at that time is held by the trajectory acquisition means. When creating the operation screen, the operation screen processing means arranges each icon on the outline of a ring that is identical or similar to the acquired trajectory, so as to surround a predetermined region (or the object itself).
  • As a result, the ring of icons is displayed on the operation screen in a shape that matches or resembles the movement trajectory of the indicator obtained when the object was enclosed.
  • Because the icons are arranged in the shape the user drew, the user can obtain an operation screen in which the icons are arranged in a desired shape simply by enclosing the object in that shape. This increases the playfulness of displaying the operation screen and operating the information processing apparatus.
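Arranging the icons on a ring "identical or similar to the acquired trajectory" amounts to resampling the user's closed stroke at equal arc-length intervals and using the sample points as icon positions. A sketch, assuming the trajectory is a closed polyline of (x, y) points:

```python
import math

def resample_closed(points, n):
    """Resample the closed polyline `points` into n points spaced evenly
    by arc length, so the icon slots follow the shape the user drew."""
    segs, total = [], 0.0
    m = len(points)
    for i in range(m):
        p, q = points[i], points[(i + 1) % m]
        d = math.hypot(q[0] - p[0], q[1] - p[1])
        if d > 0:                    # skip repeated touch samples
            segs.append((total, d, p, q))
            total += d
    out = []
    for k in range(n):
        target = total * k / n       # arc length of the k-th icon slot
        for start, d, p, q in segs:
            if target < start + d:
                t = (target - start) / d
                out.append((p[0] + t * (q[0] - p[0]),
                            p[1] + t * (q[1] - p[1])))
                break
        else:                        # floating-point guard at the loop's end
            out.append(segs[-1][3])
    return out
```

Smoothing the raw stroke (e.g. with a moving average) before resampling would give tidier icon placement, but is omitted for brevity.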
  • As a result, the information processing apparatus can display the final result the user desires in a natural flow that does not contradict the user's intuition, with simple operations and a small number of them. This has the effect of realizing excellent operability in an information processing apparatus including an input unit and a display unit.
  • The operation screen processing means may individually determine the timing at which each icon is displayed.
  • For example, it is preferable that the trajectory acquisition means measures the elapsed time from the start of the indicator's movement and stores movement time information, indicating the elapsed time, in association with at least some of the points constituting the trajectory, and that the operation screen processing means displays the icons one by one, clockwise or counterclockwise, according to the moving direction of the indicator determined from the trajectory and the movement time information.
  • Because movement time information is associated with the points of the trajectory, it is possible to determine the direction of the enclosing operation, that is, whether the object was enclosed clockwise or counterclockwise.
  • The operation screen processing means then makes the icons appear one after another around the ring in the same direction as the user's actual enclosing movement (the moving direction of the indicator). Specifically, when the object was enclosed clockwise, the operation screen processing means places the icons on the ring one by one in the clockwise direction, and when the object was enclosed counterclockwise, it places them one by one in the counterclockwise direction.
  • Furthermore, it is preferable that the operation screen processing means determines, as the display position of the first icon, the position on the ring that corresponds relatively to the position of the start point of the trajectory.
  • That is, when the icons appear one after another in the same direction as the indicator moved, the operation screen processing means determines the appearance start position on the outline of the ring, where the first icon appears, based on the start point of the movement trajectory.
  • The position on the ring, which is the reference line on which the icons are placed, that relatively corresponds to the start point of the trajectory is determined as the display position of the first icon, and from there the remaining icons appear one after another in the same direction as the indicator moved (clockwise or counterclockwise).
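The moving direction can be recovered from the trajectory alone via the shoelace (signed area) formula; note that screen coordinates usually have the y axis pointing down, which flips the usual orientation convention. The slot numbering below (slot 0 at the top, increasing clockwise) is an illustrative assumption, not the patent's rule.

```python
import math

def winding_is_clockwise(trajectory, y_down=True):
    """Shoelace (signed area) test for the stroke's direction. A loop
    that is counterclockwise in y-up maths coordinates appears
    clockwise on a y-down screen."""
    area2 = 0.0
    n = len(trajectory)
    for i in range(n):
        x1, y1 = trajectory[i]
        x2, y2 = trajectory[(i + 1) % n]
        area2 += x1 * y2 - x2 * y1
    ccw_math = area2 > 0
    return ccw_math if y_down else not ccw_math

def start_slot(start_point, center, n_icons):
    """Ring slot nearest the trajectory's start point; slot 0 sits at
    the top of the ring, slot numbers increase clockwise on screen."""
    dx = start_point[0] - center[0]
    dy = start_point[1] - center[1]
    ang = math.degrees(math.atan2(dx, -dy)) % 360
    return round(ang / (360 / n_icons)) % n_icons

def appearance_order(n_icons, first_slot, clockwise):
    """Slot indices in the order the icons should pop in."""
    step = 1 if clockwise else -1
    return [(first_slot + step * i) % n_icons for i in range(n_icons)]
```

For a stroke that starts above the object and runs clockwise, the first icon appears at the top of the ring and the rest follow clockwise.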
  • The operation screen processing means may also determine the timing of the icons' successive appearance so as to correspond to the moving speed of the indicator that drew the trajectory.
  • That is, when the icons appear one after another in the same direction as the indicator moved, starting from the position corresponding to the start of the indicator's movement, the operation screen processing means makes the timing at which each icon appears at its position on the ring correspond to the moving speed of the indicator that drew the trajectory.
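To make the build-up of the ring mirror the speed of the stroke, the per-point timestamps recorded with the trajectory can be reused directly as appearance delays. The even point-count split below is an assumed simplification; an arc-length split would track the text even more closely.

```python
def icon_delays(timed_trajectory, n_icons):
    """timed_trajectory: list of (x, y, t) with t in seconds from the
    stroke's start. Splits the stroke into n_icons stretches by point
    count and returns the elapsed time at each stretch boundary as that
    icon's appearance delay, so a slow stroke yields a slow build-up."""
    times = [t for _, _, t in timed_trajectory]
    n = len(times)
    delays = []
    for k in range(n_icons):
        idx = min(n - 1, round(k * (n - 1) / n_icons))
        delays.append(times[idx])
    return delays
```

A stroke that slows toward its end therefore produces icons whose later appearances are spaced further apart in time.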
  • The related information storage unit may also store, for each related item, cooperation information indicating whether that item can be linked with other related items, that is, whether a plurality of related items can be processed continuously or simultaneously. The operation screen processing means may then arrange the icons of mutually linked related items, among those extracted by the related item extracting means, next to each other.
  • When a plurality of linked related items are selected at the same time, their processing is executed continuously or simultaneously. To keep the number of operations as small as possible, it is therefore preferable that the icons of the linked related items can be selected all at once.
  • If the enclosing operation is used, all of those icons can be enclosed and selected simultaneously in a single gesture. To make it easy to enclose them all at once, it is preferable that the icons of linked related items be placed adjacent to each other (side by side in sequence when there are three or more), even though the icons are arranged in a ring.
  • According to the above configuration, the icons of linked related items are arranged next to each other: when two or more icons are linked, they are placed adjacently on the outline of the ring, in a continuous run when there are three or more.
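The adjacency constraint can be sketched as a reordering pass over the extracted related items. The item names and the representation of the cooperation information as sets of names are assumptions for illustration; the patent only requires that linked items end up side by side on the ring.

```python
def order_with_linked_adjacent(items, linked_groups):
    """Return `items` reordered so the members of each linked group sit
    next to each other on the ring; unlinked items keep their relative
    order. `linked_groups` is a list of sets of item names, standing in
    for the cooperation flags in the related information storage unit."""
    group_of = {}
    for g in linked_groups:
        for name in g:
            group_of[name] = frozenset(g)
    out, done = [], set()
    for it in items:
        if it in done:
            continue
        g = group_of.get(it)
        if g:
            members = [x for x in items if x in g]
            out.extend(members)   # pull the whole group in at once
            done.update(members)
        else:
            out.append(it)
            done.add(it)
    return out
```

With the linked items adjacent, a single enclosing stroke over that arc of the ring selects the whole group.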
  • The information processing apparatus of the present invention may further include: a position detection unit that acquires position information indicating the position of the apparatus itself; a direction detection unit that acquires direction information indicating the orientation of the apparatus; a communication unit that communicates with peripheral devices of the apparatus and acquires their position information; and device direction specifying means that specifies the direction in which each peripheral device lies relative to the apparatus, by determining the positional relationship between the apparatus and the peripheral devices from the apparatus's position information acquired by the position detection unit and the peripheral devices' position information acquired by the communication unit, and by determining the orientation of the apparatus from the direction information acquired by the direction detection unit. The operation screen processing means may then determine the placement of the icon corresponding to a related item of a peripheral device so as to correspond to the direction, specified by the device direction specifying means, in which that peripheral device lies.
  • According to the above configuration, the device direction specifying means can specify the direction in which each peripheral device lies relative to the apparatus on the basis of three pieces of information: (1) the position information indicating the apparatus's own position, acquired by the position detection unit; (2) the direction information indicating the apparatus's orientation, acquired by the direction detection unit; and (3) the position information of each peripheral device, acquired by the communication unit.
  • The operation screen processing means determines the placement of the icon corresponding to a related item of a peripheral device so that it corresponds to the direction in which that peripheral device lies, as specified by the device direction specifying means.
  • As a result, the positional relationship between the central object and each icon corresponds to the positional relationship between the tablet terminal 100 and each peripheral device.
  • The user can therefore intuitively grasp the connection between his or her own operation and the information processing that results from it. For example, a user who wants to transfer a photograph to the digital television 1 need only drag the photo object in the direction in which the digital television 1 actually stands. It is thus possible to provide an operation screen that can be operated in a natural flow that does not contradict the user's intuition, which has the effect of realizing excellent operability in an information processing apparatus including an input unit and a display unit.
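The direction computation from the three pieces of information above reduces to a compass bearing corrected by the apparatus's own heading. The flat local (east, north) coordinate frame is an assumption; real position information (e.g. from GPS) would first be projected into such a frame.

```python
import math

def bearing_to(own_pos, peer_pos):
    """Compass bearing in degrees (0 = north, increasing clockwise)
    from own_pos to peer_pos, both as (x_east, y_north) pairs."""
    dx = peer_pos[0] - own_pos[0]
    dy = peer_pos[1] - own_pos[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def icon_angle_on_screen(own_pos, own_heading_deg, peer_pos):
    """Screen angle for the peer device's icon: 0 degrees at the top of
    the screen, increasing clockwise, after subtracting the apparatus's
    own heading so the icon points toward the real device."""
    return (bearing_to(own_pos, peer_pos) - own_heading_deg) % 360
```

A device due east of a north-facing tablet thus gets its icon at 90 degrees, i.e. on the right edge of the ring.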
  • The operation screen processing means may also arrange, above the selected object, the icons of those related items, among the ones extracted by the related item extracting means, that relate to operations acting on devices other than the apparatus itself, based on the operation attributes stored in the related information storage unit.
  • By referring to the related information storage unit, the operation screen processing means can grasp the operation attribute of each extracted related item. Specifically, from the operation attribute it can identify the related items that concern operations acting on devices other than the apparatus itself.
  • The operation screen processing means then decides to place the icons of such related items above the selected object.
  • As a result, icons relating to operations that act on other devices are displayed higher on the screen than the previously selected object.
  • Furthermore, the trajectory acquisition means may acquire the trajectories of the indicator produced during a predetermined period before the enclosing movement around the object displayed on the display unit.
  • When the operation screen processing means determines that the trajectories acquired during the predetermined period are biased toward a specific region of the display unit's screen, it may determine the position of the ring so that the icons are placed in that specific region.
  • That is, the trajectory acquisition means acquires trajectories not only for the enclosing motion but also for the motions that occurred during the preceding predetermined period.
  • From the acquired trajectories, the operation screen processing means can grasp where on the display unit's screen the operations of the past predetermined period occurred, and can detect when they concentrate in a specific region.
  • A likely reason for such a bias in contact position is that the information processing apparatus is being used in a special situation in which only that region can be touched (or the other regions are hard to touch).
  • In that case, the operation screen processing means determines the position of the ring so that the icons are placed in the region where the bias was detected.
  • The icons are thus displayed in the region the user can touch, and when the user next performs an operation of selecting an icon, the desired icon can be selected immediately within the touchable region.
  • For example, when the user operates a large touch panel with one hand, the touch position tends to be biased toward the lower-left region of the screen (when operating with the left hand) or the lower-right region (when operating with the right hand).
  • In such a usage situation, if an object or icon requiring a contact operation is displayed at the top of the screen, or at the bottom on the side opposite the operating hand, the operation becomes cumbersome: the user cannot touch the target immediately and must perform extra actions such as dragging it into a reachable region or switching to two-handed operation.
  • The information processing apparatus can solve this problem by observing the user's usage situation as described above.
  • That is, the information processing apparatus of the present invention can be configured to detect the bias in the indicator's contact positions and to place the icons in the region that the user's indicator is estimated to reach immediately.
  • In the above example, the icons are displayed so as to fit within the lower-left (or lower-right) region of the touch panel's screen, so there is no need to drag icons and a desired icon can be selected immediately.
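The bias detection can be sketched as a simple histogram over screen regions. Dividing the screen into quadrants and the 60% threshold are illustrative assumptions; the patent only requires detecting that recent contact positions concentrate in a specific region.

```python
def touch_bias_region(touches, screen_w, screen_h, threshold=0.6):
    """Inspect recent touch points and report which screen quadrant
    holds at least `threshold` of them, or None when the touches are
    spread out. A minimal stand-in for 'trajectories biased toward a
    specific region of the screen'."""
    if not touches:
        return None
    counts = {}
    for x, y in touches:
        quad = ("left" if x < screen_w / 2 else "right",
                "top" if y < screen_h / 2 else "bottom")
        counts[quad] = counts.get(quad, 0) + 1
    quad, best = max(counts.items(), key=lambda kv: kv[1])
    return quad if best / len(touches) >= threshold else None
```

When a biased quadrant is reported, the ring's centre would be chosen inside that quadrant so every icon lands within the user's reach.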
  • Alternatively, the operation screen processing means may determine the position of the ring so that the icons are placed in the region of the display unit's screen enclosed by the indicator's trajectory, or in a specific region containing it.
  • According to the above configuration, the icons are placed at or near the position the user enclosed with the indicator, such as a finger. The position the user enclosed can be regarded as a region the user can touch, so the icons can reliably be displayed within a touchable region.
  • The input unit and the display unit included in the information processing apparatus may constitute a touch panel, and the trajectory acquisition means may acquire the trajectory of the movement of an indicator that has moved over the touch panel.
  • Alternatively, the input unit included in the information processing apparatus may input, to the information processing apparatus, instructions for moving a cursor displayed on the display unit, and the trajectory acquisition means may acquire the trajectory of the movement of the cursor as the indicator.
  • In order to solve the above problems, the operation screen display method of the present invention is an operation screen display method in an information processing apparatus, comprising: a trajectory acquisition step of acquiring a trajectory along which an indicator indicating positions on the screen of the display unit included in the information processing apparatus has moved; an object specifying step of specifying, as the selected object, an object at least a part of which is included in the region enclosed by the acquired trajectory; a related item extracting step of extracting, as related items, the items associated with the object specified in the object specifying step, with reference to a related information storage unit that stores objects and their related items in association with each other; and an operation screen processing step of arranging the icons of the related items extracted in the related item extracting step side by side on the outline of a ring and displaying them on the display unit.
  • Therefore, the final result the user desires can be displayed in a natural flow that does not contradict the user's intuition, with simple operations and a small number of them, which has the effect of realizing excellent operability in an information processing apparatus including an input unit and a display unit.
  • The information processing apparatus may be realized by a computer. In that case, a control program for the information processing apparatus that realizes the apparatus on the computer by causing the computer to operate as each of the above means, and a computer-readable recording medium on which that program is recorded, also fall within the scope of the present invention.
  • an information processing apparatus of the present invention, including an input unit and a display unit, comprises: trajectory acquisition means for acquiring the trajectory along which an indicator that points to a position on the screen of the display unit has moved; object specifying means for specifying, as the selected object, an object at least partially included in the region enclosed by the trajectory acquired by the trajectory acquisition means; related item extracting means for extracting, as related items, the items associated with the object specified by the object specifying means, by referring to a related information storage unit that stores objects in association with related items; and operation screen processing means for arranging the icons of the related items extracted by the related item extracting means on the outline of a ring and displaying them on the display unit.
  • likewise, the operation screen display method of the present invention, performed in an information processing apparatus including an input unit and a display unit, includes: a trajectory acquisition step of acquiring the trajectory along which an indicator that points to a position on the screen of the display unit has moved; an object specifying step of specifying, as the selected object, an object at least partially included in the region enclosed by the trajectory acquired in the trajectory acquisition step; a related item extracting step of extracting, as related items, the items associated with the specified object, by referring to a related information storage unit that stores objects in association with related items; and an operation screen processing step of arranging the icons of the extracted related items on the outline of a ring and displaying them on the display unit.
  • the final result desired by the user can be displayed, with a simple operation and few operation steps, in a natural flow that does not contradict the user's intuition.
  • excellent operability can thus be realized in the information processing apparatus including the input unit and the display unit.
  • (a) is a diagram showing how the user performs the contact operation of "enclosing" an object in order to select the target object
  • (b) is a diagram showing an example of the contact information generated by the contact information generation unit in accordance with the contact operation shown in (a)
  • (c) is a diagram showing an example of the map information of the video frame displayed on the display unit during the period from t0 to tn in which contact is detected. A further diagram shows an example of the related information stored in the related information storage unit.
  • a functional block diagram showing the main configuration of the tablet terminal in still another embodiment of the present invention, and a diagram showing an example of the related information stored in the related information storage unit.
  • FIG. 25 is a diagram showing a specific example of an operation screen obtained, following the operation screen shown in FIG. 24, as a result of the operation screen generation process executed by the operation screen processing unit. Two further diagrams each show a modification of the icon display method for related items.
  • (c) is a diagram showing an example of the operation screen at the time point ta
  • (d) is a diagram showing an example of the operation screen at the time point tb
  • (e) is a diagram showing an example of the operation screen at the time point tc
  • (f) is a diagram showing an example of the operation screen at the time point tn.
  • (a) is a diagram illustrating an example of a situation in which the user is operating with the left hand.
  • (b) is a diagram showing a specific example of the contact information generated in that situation.
  • (a) is a diagram illustrating how the user rotates the ring of icons by dragging within the contactable area
  • (a) is a diagram showing how the user performs the contact operation of "enclosing" in order to select multiple target objects, and (b) is a diagram showing a specific example of the operation screen generated in accordance with the contact operation shown in (a).
  • (a) is a diagram showing how the user performs the contact operation of "enclosing" in order to select multiple objects of different types, and (b) is a diagram showing a specific example of the operation screen generated in accordance with the contact operation shown in (a).
  • Embodiment 1: An embodiment of the present invention will be described below with reference to FIGS. 1 to 10.
  • in the present embodiment, the tablet terminal is realized as a small, highly portable smartphone that can be operated with one hand.
  • the information processing apparatus of the present invention is not limited to the above example; it may be applied to an information processing apparatus of any size (for example, an electronic blackboard equipped with a large touch panel).
  • FIG. 2 is a block diagram illustrating a hardware configuration of the tablet terminal 100 according to the present embodiment.
  • the tablet terminal 100 includes at least a control unit 10, an input unit 11, a display unit 12, and a storage unit 19. Furthermore, to realize its native functions, the tablet terminal 100 may include an operation unit 13, an external interface 14, a communication unit 15, a wireless communication unit 16, an audio output unit 17, and an audio input unit 18.
  • when the tablet terminal 100 is a multi-function mobile communication terminal such as a smartphone, it may further include, although omitted here, a call processing unit, an imaging unit (lens / image sensor) for capturing images, a broadcast image reception unit (tuner / demodulation unit), a GPS, and sensors (such as an acceleration sensor and an inclination sensor), as well as the various other components typically included in a smartphone.
  • the input unit 11 is for inputting an instruction signal for the user to operate the tablet terminal 100 via the touch panel.
  • specifically, the input unit 11 includes a touch surface that accepts contact from an indicator (something that points to a position on the screen of the display unit 12; here, for example, a finger or a pen), and a touch sensor that detects contact/non-contact (approach/non-approach) between the indicator and the touch surface as well as the contact (approach) position.
  • the touch sensor may be any sensor capable of detecting contact/non-contact between the indicator and the touch surface; for example, it may be realized by a pressure sensor, a capacitance sensor, an optical sensor, or the like.
  • the display unit 12 displays objects to be processed by the tablet terminal 100 (any display objects, such as icons) and processing results, and displays the operation screen with which the user operates the tablet terminal 100 as a GUI (Graphical User Interface) screen.
  • the display unit 12 is realized by a display device such as an LCD (Liquid Crystal Display).
  • in the present embodiment, the input unit 11 and the display unit 12 are integrally formed and together constitute a touch panel. In such an embodiment, therefore, the body that is moved (operated) to indicate a screen position, that is, the operation body (here, a finger or a pen), simultaneously serves as the indicator that points to a position on the screen of the display unit 12.
  • the touch panel of the tablet terminal 100 of the present invention is realized by a projected capacitive touch panel
  • specifically, the touch sensor is a matrix-shaped transparent electrode pattern made of ITO (Indium Tin Oxide) or the like, formed on a transparent substrate such as glass or plastic.
  • the control unit 10 can detect the position where the indicator is in contact or approached by detecting a change in the current or voltage of the transparent electrode pattern.
  • "contact" in terms such as "contact detection", "contact operation", and "contact position" refers not only to the state in which the indicator and the touch surface are in complete contact, but also to the state in which the indicator and the touch surface are close enough (approaching) for the touch sensor to detect.
  • the operation unit 13 is for the user to directly input an instruction signal to the tablet terminal 100.
  • the operation unit 13 is realized by an appropriate input mechanism such as a button, switch, key, or jog dial.
  • the operation unit 13 includes, for example, a switch for turning the power of the tablet terminal 100 on and off.
  • the external interface 14 is an interface for connecting an external device to the tablet terminal 100.
  • the external interface 14 is realized by, for example, but not limited to, a socket for inserting an external recording medium (memory card or the like), an HDMI (High Definition Multimedia Interface) terminal, a USB (Universal Serial Bus) terminal, or the like.
  • the control unit 10 of the tablet terminal 100 can exchange data with an external device via the external interface 14.
  • the communication unit 15 communicates with an external device via a communication network.
  • the communication unit 15 is connected to various communication terminals via a communication network and realizes data transmission and reception between the tablet terminal 100 and those communication terminals. Further, when the tablet terminal 100 is a mobile communication terminal such as a smartphone, the communication unit 15 transmits and receives voice call data, e-mail data, and the like to and from other devices via the mobile phone network.
  • the wireless communication unit 16 communicates with an external device wirelessly.
  • the wireless communication unit 16 is not particularly limited; it may realize any wireless communication means, such as infrared communication (IrDA, IrSS), Bluetooth communication, WiFi communication, or a contactless IC card function, and it may realize a plurality of such means.
  • the control unit 10 of the tablet terminal 100 can communicate with devices in the vicinity of the tablet terminal 100 via the wireless communication unit 16, and can exchange data with these devices.
  • the sound output unit 17 outputs sound data processed by the tablet terminal 100 as sound, and is realized by a speaker, a headphone terminal, headphones, and the like.
  • the voice input unit 18 receives voice input generated outside the tablet terminal 100, and is realized by a microphone or the like.
  • the storage unit 19 stores (1) a control program executed by the control unit 10 of the tablet terminal 100, (2) an OS program, (3) application programs with which the control unit 10 executes the various functions of the tablet terminal 100, and (4) various data read when those application programs are executed. It also stores (5) data used for computation and computation results produced while the control unit 10 executes the various functions.
  • the above data (1) to (4) are stored in a non-volatile storage device such as a ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), or HDD (Hard Disc Drive).
  • the data (5) are stored in a volatile storage device such as a RAM (Random Access Memory). Which data are stored in which storage device is determined appropriately according to the purpose of use, convenience, cost, physical constraints, and the like of the tablet terminal 100.
  • the control unit 10 performs overall control of each unit included in the tablet terminal 100.
  • the control unit 10 is realized by, for example, a CPU (central processing unit).
  • the functions of the tablet terminal 100 are realized by the CPU serving as the control unit 10 reading a program stored in a ROM or the like into a RAM or the like and executing it.
  • Various functions (particularly, the operation screen display function of the present invention) realized by the control unit 10 will be described later with reference to other drawings.
  • FIG. 3 is a plan view showing the appearance of the tablet terminal 100.
  • the tablet terminal 100 includes an input unit 11 and a display unit 12 as a touch panel.
  • the tablet terminal 100 includes an operation unit 13, an external interface 14, a wireless communication unit 16, an audio output unit 17, an audio input unit 18, and the like, although these are not essential components.
  • when the wireless communication unit 16 is realized by infrared communication means, an infrared transmission/reception unit is provided as the wireless communication unit 16 on a side surface of the tablet terminal 100.
  • FIG. 4 is a diagram illustrating how the user holds and operates the tablet terminal 100. More specifically, FIG. 4A illustrates the tablet terminal 100 being held with one hand and operated with that same hand, and FIG. 4B illustrates it being held with one hand and operated with the other hand.
  • the tablet terminal 100 is a palm-sized information processing apparatus that can be held with one hand. As shown in FIG. 4A, the user can hold the tablet terminal 100 with one hand and operate the touch surface of the input unit 11 with the thumb of that same hand. For example, when an icon to be operated is at a position the thumb cannot reach, the user can draw the icon near the thumb by flicking, and then select it by enclosing or tapping it with the thumb.
  • the user may hold the tablet terminal 100 with one hand and operate the touch surface of the input unit 11 with the finger of the other hand.
  • alternatively, when the tablet terminal 100 is horizontally oriented, the user may hold both sides with both hands and operate the touch surface of the input unit 11 with the thumbs of both hands.
  • FIG. 1 is a functional block diagram illustrating a main configuration of the tablet terminal 100 according to the present embodiment.
  • the control unit 10 of the tablet terminal 100 includes, as functional blocks for realizing the operation screen display function of the present invention, at least a contact information generation unit 21, an object specifying unit 22, a related item extraction unit 23, and an operation screen processing unit 24.
  • each of the functional blocks of the control unit 10 described above can be realized by a CPU (Central Processing Unit) reading a program stored in a non-volatile storage device realized by a ROM (Read Only Memory) or the like into a RAM (Random Access Memory) or the like and executing it.
  • specifically, the storage unit 19 includes a frame map storage unit 41, a related information storage unit 42, and an icon storage unit 43 as storage units from which the above-described units of the control unit 10 read, or to which they write, data when executing the operation screen display function.
  • the contact information generation unit 21 processes the signals output from the touch sensor of the input unit 11 and generates contact information. The contact information includes at least contact coordinate information indicating the coordinates of the contact position of the indicator (for example, a finger); this allows each unit of the control unit 10 to obtain the trajectory along which the indicator has moved. As necessary, contact time information indicating the time at which the contact occurred may further be associated with each point constituting the trajectory.
  • more specifically, the contact information generation unit 21 acquires the signals output from the touch sensor from the time the touch sensor of the input unit 11 detects contact between the touch surface and the indicator (in this embodiment, a finger) until it detects non-contact. These signals include information indicating that "contact" was detected and information indicating the contact position; based on them, the contact information generation unit 21 generates contact coordinate information expressing the contact position as coordinates. The contact information generation unit 21 also measures the time from when contact is detected until it becomes non-contact and associates the contact time information with the contact coordinate information. The contact information generation unit 21 may acquire and use absolute time information held by a clock unit mounted on the tablet terminal 100. In the present embodiment, however, the contact information generation unit 21 measures elapsed time, taking the moment contact is first detected (t0) as 0.00 seconds and continuing the measurement until the moment contact is last detected (tn), and may thus acquire relative contact time information corresponding to each contact position.
  • the contact information generation unit 21 generates contact information by associating the obtained contact time information with the contact coordinate information. In the present embodiment, the generated contact information is supplied to the object specifying unit 22 and used by the object specifying unit 22.
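As an illustrative sketch (not the patent's actual implementation), the behavior described above — accumulating relative-time contact samples from t0 to tn — might look like the following in Python; the class and method names are hypothetical:

```python
import time

class ContactInfoGenerator:
    """Sketch of a contact information generation unit: collects
    (elapsed_time, x, y) samples from the moment contact is first
    detected (t0 = 0.00 s) until non-contact is detected (tn)."""

    def __init__(self):
        self._t0 = None
        self.contact_info = []  # list of (elapsed seconds, x, y)

    def on_touch_event(self, x, y):
        # Called for each contact signal reported by the touch sensor.
        now = time.monotonic()
        if self._t0 is None:
            self._t0 = now          # first contact defines t0 = 0.00 s
        elapsed = round(now - self._t0, 2)
        self.contact_info.append((elapsed, x, y))

    def on_release(self):
        # Non-contact detected: hand the accumulated trajectory over
        # (in the patent, to the object specifying unit) and reset.
        info, self.contact_info, self._t0 = self.contact_info, [], None
        return info
```

The relative timestamps make the trajectory self-contained, so downstream units need no access to an absolute clock.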
  • the object specifying unit 22 specifies an object selected by the user's contact operation.
  • the object specifying unit 22 compares the contact information generated by the contact information generation unit 21 with the map information of the video frame displayed on the display unit 12 while the contact was occurring. The object specifying unit 22 can thereby identify, among the objects displayed on the display unit 12, the object enclosed by the contact operation.
  • the frame map storage unit 41 stores map information of the video frame output to the display unit 12 at the time of contact.
  • the map information is information indicating the layout of the video frame displayed on the touch panel.
  • the map information includes information for individually identifying each displayed object, together with information on each object's shape, size, and display position. That is, the map information plots each object in the coordinate system of the touch panel.
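For illustration only, map information of this kind could be modeled as a lookup table of object rectangles in the touch panel's coordinate system; the object names and the `objects_at` helper below are hypothetical, not part of the patent:

```python
# Hypothetical map information for one video frame: each displayed
# object's id mapped to its position and size as a rectangle
# (x1, y1, x2, y2) in the touch panel's coordinate system,
# with the origin at the upper-left corner of the panel.
frame_map = {
    "Photo 1": (40, 120, 200, 240),
    "Photo 2": (220, 120, 380, 240),
    "Icon A":  (40, 300, 104, 364),
}

def objects_at(point, frame_map):
    """Return the ids of the objects whose plotted area contains the point."""
    px, py = point
    return [obj_id
            for obj_id, (x1, y1, x2, y2) in frame_map.items()
            if x1 <= px <= x2 and y1 <= py <= y2]
```

Because the map shares the touch panel's coordinate system, contact coordinates can be tested against it directly, with no conversion step.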
  • FIG. 5 is a diagram for explaining the operation of the object specifying unit 22. More specifically, FIG. 5A is a diagram showing how the user performs the contact operation of "enclosing" an object in order to select the target object.
  • FIG. 5B is a diagram illustrating an example of contact information generated by the contact information generation unit 21 in accordance with the contact operation illustrated in FIG.
  • FIG. 5C is a diagram illustrating an example of map information of a video frame displayed on the display unit 12 during a period from t0 to tn in which contact is detected.
  • the object specifying unit 22 acquires contact information as shown in FIG. 5B from the contact information generating unit 21.
  • the coordinate system of the contact information corresponds to the coordinate system of the touch panel of the tablet terminal 100, with the upper-left corner of the panel as the origin.
  • in the contact information, the start point is indicated as t0 and the end point as tn; contact time information may also be associated with each point in between.
  • next, the object specifying unit 22 acquires from the frame map storage unit 41 the map information shown in FIG. 5C (that is, the layout of the video frame displayed on the display unit 12 during the period from t0 to tn). The object specifying unit 22 then compares the contact information with the map information and identifies as the selected object the object 80 that completely or substantially overlaps the area enclosed by the trajectory of the user's finger obtained from the contact information, or the circumscribed rectangle of that area. In the example shown in FIG. 5, the object specifying unit 22 identifies "Picture 1" in FIG. 5C as the selected object. The object specifying unit 22 supplies information on the identified object to the related item extraction unit 23.
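The comparison of contact information with map information can be sketched as follows. This is an illustrative approximation only — it uses the circumscribed rectangle of the trajectory and a hypothetical overlap threshold for "substantially overlaps" — not the patent's actual algorithm:

```python
def bounding_box(points):
    """Circumscribed rectangle of a set of (x, y) points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def overlap_ratio(obj_rect, region_rect):
    """Fraction of the object's area covered by the enclosed region."""
    ox1, oy1, ox2, oy2 = obj_rect
    rx1, ry1, rx2, ry2 = region_rect
    w = max(0, min(ox2, rx2) - max(ox1, rx1))
    h = max(0, min(oy2, ry2) - max(oy1, ry1))
    obj_area = (ox2 - ox1) * (oy2 - oy1)
    return (w * h) / obj_area if obj_area else 0.0

def specify_object(trajectory, map_info, threshold=0.8):
    """Return the id of the object that completely or substantially
    overlaps the circumscribed rectangle of the finger's trajectory.
    trajectory: list of (t, x, y); map_info: {id: (x1, y1, x2, y2)}."""
    region = bounding_box([(x, y) for _, x, y in trajectory])
    best_id, best_ratio = None, 0.0
    for obj_id, obj_rect in map_info.items():
        r = overlap_ratio(obj_rect, region)
        if r >= threshold and r > best_ratio:
            best_id, best_ratio = obj_id, r
    return best_id
```

A more faithful variant could use the enclosed polygon itself rather than its bounding box; the rectangle test is the simplest form of the comparison the text describes.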
  • the related item extraction unit 23 extracts related items for the object specified by the object specifying unit 22, that is, the object selected by the user. When an object is selected, items deeply related to the selected object are extracted by the related item extraction unit 23.
  • for example, when the object is data such as a "photo", operations such as "display", "edit", "send as an e-mail", "transfer to a peripheral device (a television or the like)", and "print" are executed on the photo. Accordingly, items corresponding to "actions" executed on an object, which is an "action target", may be extracted as related items of that object.
  • alternatively, items corresponding to the "action partner" when an action is executed on an object serving as the action target may be extracted as related items.
  • the user may also want the photos or data contained in the object; in this way, items belonging to a layer below the object may likewise be extracted as related items.
  • the related information storage unit 42 stores related information indicating the relationship between objects and items.
  • FIG. 6 is a diagram illustrating an example of related information stored in the related information storage unit 42.
  • the related information is information in which at least “related items” are associated with each “object”.
  • the related information thus indicates, through this association, the relationship between objects and items.
  • the related item extraction unit 23 refers to the related information stored in the related information storage unit 42 and extracts the items associated with the specified object as related items.
  • the object specifying unit 22 specifies that the selected object is “Photo 1”.
  • the related item extraction unit 23 extracts a related item group 60 associated with the object “Photo” from the related information.
  • Information on the related items extracted by the related item extracting unit 23 is supplied to the operation screen processing unit 24.
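The extraction step amounts to a table lookup. The sketch below is illustrative only; the "photo" entry mirrors the icon identification information of the FIG. 6 example as described later in the text, while the "song" entry and the function name are assumptions for illustration:

```python
# Hypothetical related-information table, analogous to FIG. 6: each
# object type maps to the icon identification info of its related items.
RELATED_INFO = {
    "photo": ["1: TV", "2: Printer", "3: Mail", "4: Photo display",
              "5: Information display", "6: Palette", "7: Trash",
              "8: Memory card"],
    "song":  ["3: Mail", "7: Trash"],   # assumed entry, for illustration
}

def extract_related_items(object_type):
    """Sketch of the related item extraction step: look up the items
    associated with the selected object's type; unknown types yield
    an empty list rather than an error."""
    return RELATED_INFO.get(object_type, [])
```

Keeping the table in a storage unit separate from the extraction logic means new object types or items can be added without changing the lookup code.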
  • the extracted related items are displayed so as to be selectable (for example, as icons) as items related to the previously selected object.
  • an icon may be assigned to each “related item”.
  • the icon “1: TV” is associated with the related item “display on television (transfer to television)” associated with the object “photo”.
  • the icon “1: TV” is, for example, an icon on which an illustration of a TV or the like is drawn, and is preferably a picture reminiscent of “sending a photograph to the TV for display”.
  • Other related items are also assigned icons with appropriate patterns that recall the contents of the related items.
  • the related item extracting unit 23 may supply the operation screen processing unit 24 with icons (or icon identification information) corresponding to the extracted related items. Thereby, the operation screen processing unit 24 can proceed to display the icon specified by the related item extraction unit 23.
  • in the present embodiment, the related information need not hold the "operation attribute" and "condition" information shown in FIG. 6 for each related item.
  • Related items “operation attribute” and “condition” will be described in another embodiment or modification of the present invention described later.
  • the operation screen processing unit 24 performs processing (operation screen generation processing) to generate an operation screen that displays the object and the related items (their icons) associated with the selected object so that the user can select them.
  • FIG. 7 is a diagram showing specific examples of the icon images stored in the icon storage unit 43. As shown in FIG. 7, in the present embodiment each icon image can be identified by icon identification information; for example, the icon identification information "1: TV" is associated with an icon image depicting a television. Although not shown, a portrait or avatar image of a person may be used as an icon representing personal information, such as an acquaintance who is often called.
  • the operation screen processing unit 24 reads from the icon storage unit 43 the icon images assigned to the related items extracted by the related item extraction unit 23, generates an operation screen so that these are displayed at appropriate positions and at appropriate times, and outputs it to the display unit 12 via a display control unit (not shown).
  • in the present embodiment, the operation screen processing unit 24 has a function of displaying the icons of the related items of the object selected by the "enclose" contact operation around that selected object.
  • FIG. 8 is a diagram for explaining the processing performed by the operation screen processing unit 24. More specifically, FIG. 8A shows an example of the object display processing executed by the operation screen processing unit 24, and FIG. 8B shows an example of the icon arrangement processing executed by the operation screen processing unit 24.
  • the operation screen processing unit 24 first places the object 80 selected by the preceding "enclose" contact operation at the center, as shown in FIG. 8A.
  • the operation screen processing unit 24 arranges the icons of the extracted related items evenly around the object 80 in a ring shape.
  • FIG. 8B shows an example in which, when eight related items are extracted, the operation screen processing unit 24 uniformly arranges eight icons along the outline of the oval ring.
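The even arrangement of icons along an oval ring can be sketched as follows; equal angular spacing is used here as a simple stand-in for "evenly" (on an ellipse this is not exactly equal arc length), and all names are hypothetical:

```python
import math

def ring_positions(center, rx, ry, n):
    """Distribute n icon positions along the outline of an oval ring
    with semi-axes (rx, ry) around the selected object at `center`.
    Position 0 is directly above the object; placement proceeds
    clockwise, matching the arrangement described for FIG. 8B."""
    cx, cy = center
    positions = []
    for i in range(n):
        theta = 2 * math.pi * i / n          # equal angular spacing
        # Screen coordinates: y grows downward, so "above" is cy - ry.
        x = cx + rx * math.sin(theta)
        y = cy - ry * math.cos(theta)
        positions.append((round(x, 1), round(y, 1)))
    return positions
```

For eight related items, this yields eight positions on the oval's outline, starting above the object and proceeding clockwise around it.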
  • the oval shape of the reference "ring" indicating the icon arrangement positions is merely an example; there is no intention to limit the ring shape of the present invention.
  • the “ring” does not necessarily mean a shape formed by a curve.
  • the operation screen processing unit 24 may define the shape of the ring as a circle, square, rectangle, or other polygon, or may include a complicated shape, irregular shape, or non-geometric shape.
  • in the present invention, a figure having an outline that separates the inside from the outside is defined as a ring.
  • “ring” also does not necessarily mean a closed curve: even if the start point and end point of the ring's contour line do not completely coincide, a contour line that largely separates the inside from the outside may be treated as a ring.
  • the operation screen processing unit 24 arranges icons on the contour lines of the rings having any shapes defined as described above.
  • the broken line indicating the outline of the ring in FIG. 8B represents the ring shape held internally by the tablet terminal 100 and is not actually displayed on the display unit 12. Likewise, the broken lines indicating ring outlines in the subsequent drawings are not actually displayed on the display unit 12.
  • the order in which the icons are arranged is not particularly limited in the present embodiment; for example, the icons may be arranged clockwise, starting from above the object 80, in the order extracted by the related item extraction unit 23.
  • FIG. 9 is a diagram illustrating a specific example of the operation screen obtained as a result of the operation screen generation process executed by the operation screen processing unit 24.
  • the example shown in FIG. 9 is a specific example of the operation screen obtained when the object 80 (object “photo 1”) is enclosed, as in FIGS. 5 and 8.
  • the related item extraction unit 23 refers to the related information shown in FIG. 6 and extracts the icon identification information associated with the photo object, namely "1: TV", "2: Printer", "3: Mail", "4: Photo display", "5: Information display", "6: Palette", "7: Trash", and "8: Memory card".
  • the operation screen processing unit 24 reads a corresponding icon image from the icon storage unit 43 as shown in FIG. 7 based on the icon identification information extracted by the related item extraction unit 23. Then, the selected object 80 is arranged at the center, and the read icon image is arranged around it.
  • in the present embodiment, the operation screen processing unit 24 places the object 80 at the center, but this is not an essential configuration. Still, since the operation screen processing unit 24 arranges the icons of the object 80's related items around the object 80, placing the object 80 at the center is preferable, as it secures a space for arranging the icons around it as widely and evenly as possible.
  • FIG. 10 is a flowchart showing a flow of operation screen display processing by the tablet terminal 100.
  • the object specifying unit 22 compares the contact information generated in S104 (for example, (b) in FIG. 5) with the map information (for example, (c) in FIG. 5) stored in the frame map storage unit 41. Then, the object overlapping the area surrounded by the user is specified as the selected object (S105). In the example shown in FIG. 5C, the object 80 “Photo 1” is specified.
  • the related item extraction unit 23 refers to the related information (for example, FIG. 6) in the related information storage unit 42 based on the object specified in S105, and extracts the related item of the specified object (S106). Alternatively, identification information of icons assigned to related items may be extracted.
  • the operation screen processing unit 24 acquires the icon image of the related item extracted in S106 from the icon storage unit 43 (for example, FIG. 7). Then, the acquired icon image is arranged around the object specified in S105 to generate an operation screen (S107). At this time, the operation screen processing unit 24 arranges the object in the center and arranges each icon in a ring shape around the object.
  • the video signal of the operation screen generated as described above is output to the display unit 12. As shown in FIG. 9, the operation screen is displayed on the display unit 12 of the tablet terminal 100.
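Putting steps S105 to S107 together, a minimal end-to-end sketch might look like this. It assumes a simplified enclosure test (the object's rectangle lies inside the trajectory's bounding box) and a naming convention for object types, neither of which is specified by the patent:

```python
def build_operation_screen(trajectory, map_info, related_info, icon_store):
    """Illustrative pipeline for S105-S107: specify the enclosed
    object, look up its related items, and pair each item with its
    icon image for placement around the object."""
    xs = [x for _, x, y in trajectory]
    ys = [y for _, x, y in trajectory]
    x1, y1, x2, y2 = min(xs), min(ys), max(xs), max(ys)

    # S105: specify the selected object (simplified: fully enclosed
    # by the bounding box of the finger's trajectory).
    selected = None
    for obj_id, (ox1, oy1, ox2, oy2) in map_info.items():
        if x1 <= ox1 and y1 <= oy1 and ox2 <= x2 and oy2 <= y2:
            selected = obj_id
            break
    if selected is None:
        return None

    # S106: extract related items. Object type is taken as the id
    # minus any trailing number -- an assumed convention for this sketch.
    obj_type = selected.rstrip(" 0123456789")
    items = related_info.get(obj_type, [])

    # S107: pair each related item with its icon image for the screen.
    return {"object": selected,
            "icons": [(item, icon_store.get(item)) for item in items]}
```

The returned structure is what a renderer would consume: the centered object plus the icon images to arrange on the ring around it.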
  • as described above, in response to the extremely natural and simple contact operation of "enclosing" an object, the tablet terminal 100 can output an operation screen in which icons are displayed around that object.
  • the user can obtain an operation screen in which icons of related items are arranged so as to surround the object as a result.
  • the positional relationship between these icons and the object matches the positional relationship between the finger trajectory and the object by the contact operation previously performed by the user.
  • the finger trajectory produced by the enclosing operation resembles the ring shape along which the icons are arranged.
  • it can therefore be said that the transition from the act of touching the screen to "enclose" an object to the result of obtaining an operation screen with icons arranged around that object is a natural one that does not contradict the user's intuition.
  • the tablet terminal 100 can thus anticipate the related items the user is likely to select after selecting an object and display them in a selectable manner.
  • the icons displayed around the object are all icons of related items extracted as items related to that object. That is, after enclosing and selecting an object, the user can immediately designate the "action", "action target", "action partner", and so on relating to the object from the surrounding icons.
  • as described above, the tablet terminal 100 can display the final result desired by the user, with a simple contact operation and few operation steps, in a natural flow that does not contradict the user's intuition. As a result, the tablet terminal 100 equipped with the touch panel achieves excellent operability.
  • Embodiment 2 Another embodiment of the information processing apparatus according to the present invention will be described below with reference to FIGS.
  • members having the same functions as those in the drawings described in the first embodiment are denoted by the same reference numerals, and description of the same contents as those in the first embodiment is omitted.
  • each icon is arranged along a predetermined ring shape.
  • in the present embodiment, the operation screen processing unit 24 determines the ring shape dynamically according to the user's contact operation. As a result, an operation screen that is more intuitively recalled from the contact operation can be presented, in accordance with the user's intuition.
  • FIG. 11 is a functional block diagram illustrating a main configuration of the tablet terminal 100 according to the present embodiment.
  • the tablet terminal 100 has a configuration in which the control unit 10 further includes a ring shape determining unit 30 as a functional block, as compared with the tablet terminal 100 (FIG. 1) of the first embodiment.
  • the storage unit 19 further includes a contact information storage unit 44.
  • although not essential, the control unit 10 of the tablet terminal 100 may further include, as functional blocks as necessary, a gesture determination unit 25, an icon ranking determination unit 31, an animation determination unit 32, and an icon arrangement determining unit 33.
  • the icon rank determination unit 31, the animation determination unit 32, and the icon arrangement determination unit 33 will be described in a later embodiment or modification, and the description thereof is omitted in this embodiment.
  • the contact information storage unit 44 stores the contact information generated by the contact information generation unit 21.
  • in Embodiment 1, the contact information is temporarily stored in a storage unit (such as a cache) (not shown) so that the object specifying unit 22 can use it.
  • in the present embodiment, the contact information is stored in the contact information storage unit 44 so that the operation screen processing unit 24 and each of its parts can use it for operation screen generation processing (including the processing for displaying icons).
  • whether the contact information storage unit 44 is realized by a nonvolatile storage device, that is, whether the contact information is stored in a nonvolatile manner, is determined as appropriate from the purpose of the operation screen display function executed by the operation screen processing unit 24, the assumed use environment, the purpose of use of the tablet terminal 100 itself, convenience, cost, physical restrictions, and the like.
  • the ring shape determination unit 30 determines the ring shape when the operation screen processing unit 24 arranges icons around the object.
  • the operation screen processing unit 24 is configured to arrange icons in a predetermined ring shape.
  • for example, the icons are arranged on an elliptical contour line at a predetermined position and of a predetermined size.
  • the ring shape determination unit 30 determines a ring shape for arranging icons based on the contact information stored in the contact information storage unit 44. That is, the ring shape determination unit 30 determines the ring shape according to the finger trajectory obtained from the user's contact operation “enclose”.
  • for example, the ring shape determination unit 30 can directly adopt the shape of the trajectory of the “enclose” operation as the ring shape for arranging the icons.
  • the size of the ring can be determined based on the size of the area enclosed by the “enclose” operation. Further, the position of the ring can be determined based on the position of the enclosed region.
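A minimal sketch of deriving the ring's position and size from the enclosed region, assuming the trajectory is given as (x, y) contact samples. The function name and the bounding-box heuristic are illustrative choices, not taken from the patent:

```python
def ring_from_trajectory(points):
    """Derive the ring's center and radius from the bounding box of the
    trajectory produced by an "enclose" operation."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    # the ring is centered on the enclosed region...
    center = ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)
    # ...and sized from the larger extent of that region
    radius = max(max(xs) - min(xs), max(ys) - min(ys)) / 2
    return center, radius
```

As described above, the center and radius could also be further adjusted using the display position and size of the enclosed object from the frame map.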
  • the ring shape determination unit 30 may further determine the ring shape based on the map information stored in the frame map storage unit 41. That is, the size and position of the ring may be determined according to the display position and size of the enclosed object.
  • the operation screen processing unit 24 arranges the extracted icons on the ring shape determined in this way.
  • the ring shape information determined by the ring shape determination unit 30 may further include ring size information and / or ring position information.
  • the control unit 10 of the tablet terminal 100 further includes the gesture determination unit 25. If contact operations (gestures) other than “enclose” can be performed on the input unit 11, it is necessary to determine whether a given gesture is “enclose” or another gesture.
  • the gesture determination unit 25 determines what gesture it is for the contact operation performed on the input unit 11. For example, the gesture determination unit 25 can determine gestures such as “tap”, “flick”, “pinch”, “drag”, and “enclose”. A known technique can be appropriately employed as an algorithm for determining a gesture.
  • the gesture determination unit 25 instructs each unit of the control unit 10 to execute processing corresponding to the determined gesture according to the determination result.
  • when the gesture determination unit 25 determines that the detected contact operation is an “enclose” gesture, it preferably instructs the contact information generation unit 21 to store the generated contact information in the contact information storage unit 44. As a result, the operation screen processing unit 24 can refer to all information about the “enclose” gesture (such as its position, size, trajectory, contact time, and contact point movement timing), and unnecessary writing to the contact information storage unit 44 can be avoided when the contact operation is a gesture other than “enclose”. However, the contact information generation unit 21 may be configured to write all contact information to the contact information storage unit 44 regardless of the determination result of the gesture determination unit 25.
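One plausible way for the gesture determination unit to distinguish “enclose” from other gestures is to test whether the trajectory closes on itself and bounds a non-trivial area. The patent defers to known techniques, so the following sketch — with its shoelace-area test and threshold values — is an assumption for illustration only:

```python
def polygon_area(points):
    """Area of the polygon formed by the trajectory (shoelace formula)."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the polygon
        s += x1 * y2 - x2 * y1
    return abs(s) / 2

def is_enclose(points, close_tol=30.0, min_area=100.0):
    """Heuristic: a trajectory is an "enclose" gesture if its end point
    returns near its start point and it bounds a non-trivial area."""
    if len(points) < 3:
        return False
    (x0, y0), (xn, yn) = points[0], points[-1]
    closed = ((x0 - xn) ** 2 + (y0 - yn) ** 2) ** 0.5 <= close_tol
    return closed and polygon_area(points) >= min_area
```

A roughly closed square of contact points would be classified as “enclose”, while a straight drag would not, since it bounds no area.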
  • FIG. 12 is a diagram illustrating a specific example of contact information stored in the contact information storage unit 44. More specifically, FIG. 12A is a diagram showing that the user has performed a contact operation of “enclosing” an object in an arbitrary shape in order to select a target object. FIG. 12B is a diagram illustrating an example of contact information generated by the contact information generation unit 21 in accordance with the contact operation illustrated in FIG. 12A.
  • assume that the user “encloses” one of the objects (here, photographs) displayed on the touch panel of the tablet terminal 100 in an arbitrary shape (for example, a heart shape).
  • assume that the contact operation is executed over the period from t0 to tn, for example, so that the contact point passes along the broken line in FIG. 12A.
  • the gesture determination unit 25 acquires contact information as illustrated in FIG. 12B from the contact information generation unit 21.
  • the start point is indicated as t0 and the end point is indicated as tn.
  • contact time information may also be associated with each point in between.
  • the gesture determination unit 25 determines that this contact operation is an “enclose” gesture based on the contact information shown in FIG. 12B.
  • the gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information illustrated in FIG. 12B in the contact information storage unit 44.
  • each part of the operation screen processing unit 24 can refer to the contact information shown in FIG. 12B stored in the contact information storage unit 44.
  • FIG. 13 is a diagram illustrating an example of icon display of related items executed by the operation screen processing unit 24 including the ring shape determining unit 30.
  • the operation screen processing unit 24 can place the selected object 80 (photo 1) in the center.
  • the ring shape determination unit 30 acquires the contact information stored in the contact information storage unit 44. Based on the movement trajectory of the fingertip (contact point) obtained from the contact information, the ring shape determination unit 30 determines, as the ring shape for arranging the icons, a shape that is the same as or similar to that trajectory. In the present embodiment, as an example, the ring shape determination unit 30 decides to place the ring in the center of the screen and to make it as large as possible on the touch screen. As shown in FIGS. 12A and 12B, the object 80 is surrounded by a heart shape. Therefore, the ring shape determination unit 30 determines the ring shape 81 so that a similar figure of the heart-shaped trajectory is placed in the center of the screen, as shown by the broken line in FIG. 13.
  • the operation screen processing unit 24 arranges icons on the outline of the ring shape 81 determined by the ring shape determination unit 30 as shown in FIG.
  • the operation screen processing unit 24 may arrange the icons at equal intervals, or may arrange the icons at arbitrary positions on the contour line according to another rule.
  • the operation screen processing unit 24 may determine an approximate shape of the trajectory as the ring shape when the trajectory has an extremely complicated shape. By rounding the fine, distorted line of the trajectory with straight lines or curves, the amount of information defining the shape of the ring can be reduced, and the processing load for arranging icons can be reduced.
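The “rounding” of a fine, distorted trajectory into an approximate ring shape could be done with standard line simplification. The patent does not name a specific method, so the following Ramer-Douglas-Peucker sketch is one possible choice, shown for illustration:

```python
def _point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def simplify(points, epsilon):
    """Ramer-Douglas-Peucker: drop points deviating less than epsilon from
    the chord, recursing on the point of maximum deviation."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]  # everything is close to the chord
    left = simplify(points[:idx + 1], epsilon)
    right = simplify(points[idx:], epsilon)
    return left[:-1] + right  # avoid duplicating the split point
```

The simplified polyline then needs far fewer points to define the ring contour on which the icons are placed.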
  • FIG. 14 is a flowchart showing a flow of operation screen display processing by the tablet terminal 100 in the present embodiment.
  • when contact is detected, acquisition of the contact coordinate information indicating the contact position is started and continued over time (S202). This tracking of the contact position is continued until contact between the touch surface and the finger is no longer detected (NO in S203).
  • the gesture determination unit 25 determines the gesture of the contact operation based on the contact information (S205). In the present embodiment, if the determined gesture is not “enclose” (NO in S206), the gesture determination unit 25 instructs each unit of the control unit 10 to execute processing according to the determined other gesture, and each unit performs processing according to that gesture (S207).
  • on the other hand, if the determined gesture is “enclose”, the gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information in the contact information storage unit 44.
  • the contact information generation unit 21 stores the contact information generated in S204 in the contact information storage unit 44 (S208).
  • the object specifying unit 22 compares the contact information stored in the contact information storage unit 44 (for example, (b) of FIG. 12) with the map information stored in the frame map storage unit 41 (for example, (c) of FIG. ), and specifies the object overlapping the area surrounded by the user as the selected object (S209).
  • the object 80 “Photo 1” is specified.
  • the related item extraction unit 23 refers to the related information (for example, FIG. 6) in the related information storage unit 42 based on the object specified in S209, and extracts the related item of the specified object (S210). Alternatively, identification information of icons assigned to related items may be extracted.
  • the operation screen processing unit 24 first acquires the contact information generated in S204 from the contact information storage unit 44 (S211).
  • the ring shape determination unit 30 of the operation screen processing unit 24 determines a ring shape for arranging icons from the acquired contact information (S212). For example, based on the heart-shaped locus shown in FIG. 12B, the heart-shaped similarity is determined as a ring shape (for example, the ring shape 81 in FIG. 13).
  • the operation screen processing unit 24 acquires the icon image of the related item extracted in S210 from the icon storage unit 43 (for example, FIG. 7). Then, the acquired icon image is arranged around the object specified in S209 to generate an operation screen (S213). At this time, the operation screen processing unit 24 places the object in the center and places each icon on the ring shape arranged around the object, that is, on the ring-shaped outline determined in S212 (for example, FIG. 13). The video signal of the operation screen generated as described above is output to the display unit 12.
  • the tablet terminal 100 can output an operation screen in which icons are displayed around the object, in response to an extremely natural and simple user contact operation of “enclosing” the object. That is, the user can obtain, as a result, an operation screen in which icons of related items are arranged so as to surround the object.
  • each icon is arranged so as to surround the object on the same or similar ring-shaped contour line as the obtained trajectory.
  • the positional relationship between these icons and the object matches the positional relationship between the object and the locus of the finger by the contact operation previously performed by the user. Further, the trajectory of the finger obtained by enclosing it matches the ring shape where the icon is arranged.
  • the icons displayed in the surroundings after selecting an object indicate related items that are deeply related to the object and are likely to be selected next.
  • the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition while having a simple contact operation and a small number of operations.
  • the tablet terminal 100 including the touch panel has an effect that it is possible to realize excellent operability.
  • the configuration of the present embodiment, which uses the trajectory of a pointing body's movement, may also be applied to an information processing apparatus in which the input unit 11 and the display unit 12 are provided separately.
  • an information processing apparatus in which the input unit 11 is configured by an input device such as a mouse and the display unit 12 is configured by a display device such as an LCD can be considered.
  • the cursor displayed on the display unit 12 indicates a position on the screen of the display unit 12. When the user operates the mouse to perform an input operation, the cursor moves.
  • in this case, the mouse is the operating body, the cursor is the pointing body, and the pointing body displayed on the display unit 12 moves as the operating body moves.
  • the information processing apparatus tracks the position of the cursor in conjunction with the movement of the mouse, also holds the trajectory of that movement, and displays icons in a ring around the selected object.
  • the information processing apparatus can make the shape of the ring a similar shape or an approximate shape of the acquired trajectory. According to the above configuration, an operation screen on which icons are arranged can be obtained in a shape as if the user moved the mouse, so that it is possible to improve playability when operating the information processing apparatus.
  • various input devices such as a keyboard, a joystick, a digitizer, a tablet, and a stylus pen can be employed in addition to the mouse.
  • Embodiment 3 Another embodiment of the information processing apparatus according to the present invention will be described below with reference to FIGS.
  • members having the same functions as those in the drawings described in the first embodiment or the second embodiment are denoted by the same reference numerals, and the description overlapping with those in the first or second embodiment is omitted.
  • the extracted icons are arranged according to a predetermined rule, such as being arranged clockwise from the top of the object in the order of extraction.
  • in the present embodiment, the operation screen processing unit 24 determines the arrangement of icons so that icons of mutually cooperating related items are arranged near (next to) each other.
  • FIG. 15 is a functional block diagram illustrating a main configuration of the tablet terminal 100 according to the present embodiment.
  • compared with the tablet terminal 100 (FIG. 1) of the first embodiment, the tablet terminal 100 according to the present embodiment has a configuration in which the control unit 10 further includes the cooperation processing execution unit 26 and the icon arrangement determination unit 33 as functional blocks.
  • although not essential, the storage unit 19 may further include the contact information storage unit 44, and the control unit 10 may further include, as functional blocks as necessary, the icon ranking determination unit 31 and the animation determination unit 32.
  • the icon arrangement determining unit 33 determines the icon arrangement.
  • the icon arrangement determining unit 33 determines which icon is arranged at which position based on the attribute of the extracted related item.
  • in the present embodiment, each icon is determined to be arranged on a predetermined or dynamically determined ring-shaped contour line, but other aspects of the arrangement may be designed arbitrarily.
  • the icon arrangement determination unit 33 can determine where, how many, in what arrangement, or at what interval the icons are arranged on the contour line, based on the attributes of each related item.
  • the icon arrangement determining unit 33 determines the icon arrangement by paying attention to the cooperation between the related items among the attributes of the related items.
  • FIG. 16 is a diagram illustrating an example of related information stored in the related information storage unit 42 of the present embodiment.
  • the related information of the present embodiment may further include “operation attribute” and “condition” information shown in FIG. 6 as an example of the attribute of the related item.
  • the related item extracting unit 23 extracts items A to F shown in FIG. 16 as related items based on the related information.
  • Attribute “cooperation” is one of the attributes of the related items, and is information indicating the cooperation between the related items.
  • the related items include those representing “operation”, those representing “operation partner”, and items representing “operation target”.
  • an “operation” may require an “operation target”, that is, a target on which the operation is performed.
  • an “operation” may also require an “operation partner”, that is, a partner with whom the operation is performed. In this case, it can be said that the “operation”, the “operation target”, and the “operation partner” are linked.
  • the icon arrangement determination unit 33 grasps the cooperation of the related items based on the “cooperation” of each related item stored in the related information storage unit 42. Then, the icon arrangement determining unit 33 determines the icon arrangement by arranging icons of related items that cooperate with each other, or by bringing together icons of related items that cooperate with each other.
  • one related item may belong to one cooperation group (one cooperation number), or may belong to a plurality of cooperation groups.
  • One linkage group may consist of a pair of (that is, two) related items, or may consist of three or more related items.
  • the related item “E: photo” can be an “operation target” of both the operations “A: send by e-mail” and “B: upload to blog”. Therefore, it belongs to both the cooperation group “1” to which the related item A belongs and the cooperation group “2” to which the related item B belongs.
  • the related item “C: Call” does not have an “operation target”, and it only needs to be linked to the other party to call, that is, the “operation partner”.
  • the cooperation group “3” is composed of a pair of related items C and D.
  • the “action” of the related item “A: send by e-mail” requires “operation target” and “operation partner”.
  • the related items “E: Photo” and “F: Movie” are “operation targets” that can be transmitted by e-mail. Therefore, the cooperation group “1” to which the related item A belongs includes the related items A, D, E, and F.
  • the related item “E: photo” is associated with the cooperation number in the order of “1” and “2”.
  • that is, for “E: photo”, the cooperation with the “operation” “A: send by e-mail” is stronger than the cooperation with the “operation” “B: upload to blog”.
  • similarly, “F: moving image” is associated with cooperation numbers in the order of “2” and “1”. That is, for “F: moving image”, the cooperation with the “operation” “B: upload to blog” is stronger than the cooperation with the “operation” “A: send by e-mail”.
  • FIG. 17 is a diagram illustrating a procedure in which the icon arrangement determining unit 33 determines the icon arrangement.
  • assume that the object “tool 1” has already been specified by the object specifying unit 22, and that the six related items “A” to “F” shown in FIG. 16 have been extracted by the related item extracting unit 23. When a rule that the six related items are arranged at equal intervals is determined in advance, the icon arrangement determining unit 33 determines only the order of the icons according to “cooperation”.
  • the icon arrangement determining unit 33 refers to the related information shown in FIG. 16 and classifies each related item according to the association with the cooperation number. For example, as shown in FIG. 17, each related item is classified into “linkage number 1” to “linkage number 3”. The related item may belong to a plurality of linkages. Further, the icon arrangement determining unit 33 classifies each related item into one of “operation”, “operation target”, and “operation partner”. The result of classifying each related item by the icon arrangement determining unit 33 is shown in FIG.
  • the icon arrangement determining unit 33 classifies related items according to a predetermined priority order when a plurality of “operation targets” belong to one linkage group. For example, in the example illustrated in FIG. 17A, the related item “E” has priority over the related item “F” in the cooperation group “1”.
  • the icon arrangement determining unit 33 determines the arrangement of the related items “A” to “F” in accordance with the cooperation of the related items.
  • the icon arrangement determining unit 33 recognizes the cooperation of the related items A, D, and E under cooperation number 1, and determines that these are to be arranged adjacent to each other in the order of “operation”, “operation target”, and “operation partner”. Specifically, when the icon arrangement determining unit 33 places the related item “A”, it next places “D” on one side of “A” and “E” on the other side of “A”.
  • the icon arrangement determining unit 33 then determines that “D”, which has already been placed, is linked to “C” by cooperation number 3, and places “C” on the free side next to “D”. Subsequently, the icon arrangement determination unit 33 determines that “E”, which has already been placed, is linked (though as a secondary cooperation) to “B” by cooperation number 2, and places “B” on the free side next to “E”. Finally, the remaining “F”, linked to “B”, is placed on the free side next to “B”.
  • depending on the cooperation relationships, a plurality of related items may be assigned to one arrangement position and conflict with each other. In such a case, the related item with the stronger cooperation may simply be given precedence based on the above-described priority order. Conversely, one related item may be assigned to a plurality of arrangement positions. In preparation for such a case, a priority order indicating which cooperation is preferentially preserved may be determined for each cooperation group (for example, the cooperation number may directly indicate the priority order), and the arrangement positions may be determined so that the related items of a high-priority cooperation group are adjacent to each other as much as possible.
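The adjacency procedure described above can be sketched as a greedy chain-building pass. The function and data structure below are illustrative assumptions; the partner map mirrors the A-F example (A-D, A-E, D-C, E-B, B-F), with each item's partners listed in priority order:

```python
from collections import deque

def arrange_by_cooperation(start, partners):
    """Build a left-to-right ordering in which cooperating related items end
    up adjacent: starting from one item, repeatedly attach an unplaced
    partner of either end item to the free side of that end."""
    ring = deque([start])
    placed = {start}
    changed = True
    while changed:
        changed = False
        for end, side in ((ring[0], "left"), (ring[-1], "right")):
            for p in partners.get(end, []):
                if p not in placed:
                    (ring.appendleft if side == "left" else ring.append)(p)
                    placed.add(p)
                    changed = True
                    break  # one placement per end per pass
    return list(ring)
```

With the A-F example this reproduces the order described above (D and E flanking A, then C, B, and F attached to the free sides); any items with no cooperation links would still need to be appended by a fallback rule.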
  • the operation screen processing unit 24 arranges icons corresponding to each related item according to the arrangement determined by the icon arrangement determination unit 33, and generates an operation screen.
  • FIG. 18 is a diagram showing a specific example of the operation screen obtained as a result of the operation screen processing unit 24 arranging icons according to the determination of the icon arrangement determining unit 33.
  • the icon 90 corresponds to the related item “A: send by e-mail” and is created with an icon image of “3: e-mail” associated with the related item A.
  • the icon 91 corresponds to the related item “D: best friend information”, and is created with an icon image of “15: avatar” associated with the related item D.
  • This icon image is preferably an avatar of a specific person registered as a best friend. Accordingly, the user can understand at a glance which person the icon indicates.
  • the icon 92 corresponds to the related item “E: photo” and is created with an icon image of “16: thumbnail” associated with the related item E. This icon image is preferably an icon of the thumbnail image of the photo. Thereby, the user can understand what the photograph is at a glance.
  • the icons 90, 91, and 92 of related items A, D, and E that are linked to each other are arranged next to each other.
  • since mutually related icons are arranged close to one another, the user can easily grasp the relationships among the icons displayed around the object.
  • in addition, the three icons are gathered close together so that they can be easily enclosed at once, which improves operability.
  • the icon arrangement determining unit 33 may determine the icon arrangement interval according to “cooperation”.
  • FIG. 19 is a diagram showing another specific example of the operation screen obtained as a result of the operation screen processing unit 24 arranging icons according to the determination of the icon arrangement determining unit 33.
  • the icon arrangement determining unit 33 classifies each related item into “cooperation number 1” to “cooperation number 3”, and then, for each cooperation class, the related item belonging to that class. It is decided to arrange icons with a close interval between icons.
  • the icon arrangement determining unit 33 determines an arrangement in which a cluster of icons is formed for each category of cooperation.
  • a plurality of items may be arranged so that one related item belongs to several clusters.
  • when a plurality of icons are enclosed at once, the cooperation processing execution unit 26 executes cooperative processing in consideration of the cooperation of the related items corresponding to those icons.
  • the cooperation processing execution unit 26 may realize the cooperative processing by controlling each execution processing unit so that a plurality of execution processing units operate in conjunction with each other, or by controlling one or more execution processing units so that a plurality of processes are executed in a meaningful order.
  • for example, when the icon 90 of the related item “A: send by e-mail”, the icon 91 of the related item “D: best friend information”, and the icon 92 of the related item “E: photo” are enclosed at once, the cooperation processing execution unit 26 controls an execution processing unit (not shown) to execute a series of processes: starting a mail application, attaching the photograph indicated by the icon 92 to the mail, and sending the mail to the corresponding e-mail address.
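The chaining of enclosed items into one composite action could be planned as in the following sketch. The role labels, step strings, and function name are assumptions for illustration; the patent only describes the behavior, not an API:

```python
def plan_cooperative_action(enclosed, roles):
    """Classify the enclosed related items into operation / operation target /
    operation partner and return the steps of a single chained action."""
    op = [i for i in enclosed if roles[i] == "operation"]
    targets = [i for i in enclosed if roles[i] == "operation target"]
    partners = [i for i in enclosed if roles[i] == "operation partner"]
    if len(op) != 1:
        return None  # exactly one operation is needed to anchor the chain
    steps = ["start: " + op[0]]
    steps += ["attach: " + t for t in targets]      # e.g. the photo
    steps += ["address to: " + p for p in partners]  # e.g. the best friend
    steps.append("execute")
    return steps
```

For the mail example, enclosing the operation, photo, and best-friend icons yields a start-attach-address-execute chain that a real execution processing unit would then carry out.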
  • the tablet terminal 100 can present icons that operate in cooperation with each other in an easily understandable manner, and when a plurality of icons are enclosed, can execute a plurality of processes together according to the cooperation information.
  • the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition while having a simple contact operation and a small number of operations.
  • the tablet terminal 100 including the touch panel has an effect that it is possible to realize excellent operability.
  • Embodiment 4 Another embodiment of the information processing apparatus according to the present invention will be described below with reference to FIGS.
  • members having the same functions as those in the drawings described in the first, second, or third embodiment are denoted by the same reference numerals, and description of the same contents as those in the first, second, or third embodiment is omitted.
  • the icon arrangement determining unit 33 is configured to determine the arrangement or interval of icons in consideration of the cooperation of related items.
  • in the present embodiment, the icon arrangement determination unit 33 determines the icon arrangement in consideration of the positional relationship between the tablet terminal 100 and peripheral devices.
  • FIG. 20 is a functional block diagram illustrating a main configuration of the tablet terminal 100 according to the present embodiment.
  • compared with the tablet terminal 100 (FIG. 1) of the first embodiment, the tablet terminal 100 according to the present embodiment has a configuration in which the control unit 10 further includes the device direction specifying unit 27 and the icon arrangement determination unit 33 as functional blocks.
  • the storage unit 19 further includes a device direction storage unit 45.
  • although not essential, the storage unit 19 may further include the contact information storage unit 44, and the control unit 10 may further include, as functional blocks as necessary, the gesture determination unit 25, the cooperation processing execution unit 26, the icon ranking determination unit 31, and the animation determination unit 32.
  • the tablet terminal 100 is further provided with a position detection unit and a direction detection unit, although not shown in FIG. 2 as a hardware configuration.
  • the position detection unit is for detecting the current position of the tablet terminal 100. For example, it includes an antenna for measuring a distance from a satellite or a base station, and a signal processing unit that processes the signal received by the antenna and generates position information indicating the current position of the tablet terminal 100.
  • the position information acquired by the position detection unit is supplied to the device direction specifying unit 27 of the control unit 10.
  • the position detection unit is realized by, for example, GPS (Global Positioning System) or an existing indoor position information acquisition system. For example, when an indoor position information acquisition system is employed, the position detection unit communicates with a plurality of base stations via a WLAN (Wireless Local Area Network) or the like, measures the distances to them, and detects the current position of the tablet terminal 100.
  • the antenna of the position detection unit may be shared with the antenna of the wireless communication unit 16 or may be a separate body.
  • the direction detection unit detects which direction the tablet terminal 100 faces in the horizontal plane. For example, the direction detection unit includes a geomagnetic sensor that identifies the north direction by detecting the geomagnetism in the front-rear and left-right directions, and a signal processing unit that processes the signal detected by the geomagnetic sensor and generates direction information indicating the direction in which the tablet terminal 100 is facing.
  • the direction information acquired by the direction detection unit is supplied to the device direction identification unit 27 of the control unit 10.
  • geomagnetic sensors come in two-axis and three-axis types, and either type may be adopted in the tablet terminal 100 of the present invention. However, when a two-axis geomagnetic sensor is employed, at least while the tablet terminal 100 is performing the process of specifying the direction of its own device, the user needs to hold the tablet terminal 100 horizontally with the surface of the touch panel (display unit 12) facing upward.
  • the wireless communication unit 16 performs wireless communication with an external peripheral device and receives position information of each peripheral device.
  • the position information of each peripheral device received by the wireless communication unit 16 is supplied to the device direction specifying unit 27.
  • the device direction specifying unit 27 specifies the direction of the tablet terminal 100 and specifies in which direction each peripheral device exists as viewed from the tablet terminal 100. Specifically, the device direction specifying unit 27 acquires position information from the position detection unit described above and specifies the current position of the tablet terminal 100 based on that information. The device direction specifying unit 27 also acquires direction information from the direction detection unit and specifies the direction in which the tablet terminal 100 is facing.
  • FIG. 21 is a diagram illustrating an example of a usage environment when the user U of the tablet terminal 100 is using the tablet terminal 100 in a room indoors.
  • the direction in FIG. 21 is not particularly limited, it is assumed that the upper direction in FIG. 21 is north for easy understanding.
  • the side of the tablet terminal 100 main body shown in FIG. 3 on which the wireless communication unit 16 is provided is called the top (or upper side) of the tablet terminal 100, and the side on which the voice input unit 18 is provided is called the bottom (or lower side).
  • the direction of the tablet terminal 100 refers to the direction in which the upper side of the tablet terminal 100 is facing. That is, in the example shown in FIG. 21, since the upper side of the tablet terminal 100 faces north, the direction in which the tablet terminal 100 currently faces is “north”.
  • each of the peripheral devices (the digital television 1, the printer 2, the digital photo frame 3, and the personal computer 4) can wirelessly communicate with the tablet terminal 100, and has a function of detecting the current position of its own device and transmitting that position information to the tablet terminal 100.
  • FIG. 22 is a diagram illustrating a specific example of the device direction information generated by the device direction specifying unit 27.
  • the device direction specifying unit 27 communicates with each peripheral device via the wireless communication unit 16 and acquires position information of each peripheral device.
  • the device direction specifying unit 27 plots each peripheral device in a predetermined coordinate system based on the acquired position information.
  • the device direction specifying unit 27 defines the coordinate system (Xr, Yr) corresponding to the room shown in FIG. 21 and plots the current position of each peripheral device in this coordinate system.
  • each peripheral device is shown as a block in FIG. 22 for explanation, but it is sufficient for the device direction specifying unit 27 to grasp the position of each peripheral device as a single point plotted in the coordinate system.
  • the device direction specifying unit 27 further acquires the position information of the tablet terminal 100 from the position detection unit included in its own device and specifies the current position of the tablet terminal 100, and acquires the direction information of the tablet terminal 100 from the direction detection unit and specifies the direction of the tablet terminal 100.
  • the position of the tablet terminal 100 may be grasped by one point plotted on the coordinate system.
  • the device direction specifying unit 27 plots the tablet terminal 100 at the specified position in the specified direction.
  • the coordinate system (X, Y) corresponding to the touch panel of the tablet terminal 100 (with the upper left corner of the touch panel screen as the origin), or the ring displayed on the display unit 12 of the touch panel, can also be defined in the room coordinate system.
  • the device direction information generated by the device direction specifying unit 27 in this way is stored in the device direction storage unit 45 and read by the operation screen processing unit 24.
  • the operation screen processing unit 24 can specify the direction of each peripheral device by reading out the device direction information.
  • the icon arrangement determining unit 33 of the operation screen processing unit 24 can determine the arrangement positions of icons related to each peripheral device so as to correspond to the direction in which the peripheral device exists.
  • an elliptical ring (thick broken line in FIG. 22) is defined based on the specified position and direction of the tablet terminal 100, with the tablet terminal 100 as a central object.
  • the icon arrangement determination unit 33 can determine, as the arrangement position of the icon related to a peripheral device, the position where the straight line connecting the point representing the position of the tablet terminal 100 and the point representing the position of the peripheral device intersects the outline of the ring.
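The intersection rule above can be sketched as follows: with the terminal position as the ring center and an elliptical ring with semi-axes a and b, the icon lands where the ray toward the peripheral device crosses the ellipse. The function name and coordinates are hypothetical.

```python
import math

def icon_position_on_ring(center, device, a, b):
    """Place a peripheral-device icon where the line from the terminal
    (ring center) toward the device crosses an elliptical ring with
    semi-axes a (horizontal) and b (vertical). Assumes the device does
    not coincide with the terminal position."""
    ux, uy = device[0] - center[0], device[1] - center[1]
    # Solve (t*ux/a)^2 + (t*uy/b)^2 = 1 for the positive scale t.
    t = 1.0 / math.sqrt((ux / a) ** 2 + (uy / b) ** 2)
    return center[0] + t * ux, center[1] + t * uy
```

Because only the direction of the ray matters, the icon's ring position depends solely on the bearing of the peripheral device from the terminal, which is exactly the correspondence shown in FIG. 22.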
  • FIG. 23 is a diagram illustrating a specific example of the related information related to the peripheral device stored in the related information storage unit 42.
  • related information indicating the correspondence between the device ID of each peripheral device and its icon is further stored in the related information storage unit 42.
  • when the related items related to the peripheral devices “digital television 1”, “printer 2”, “digital photo frame 3”, and “personal computer 4” are extracted, the operation screen processing unit 24 refers to the related information shown in FIG. 23 and reads the icon image associated with each peripheral device from the icon storage unit 43.
  • the operation screen processing unit 24 generates an operation screen by arranging the read icon images at the arrangement positions determined by the icon arrangement determining unit 33.
  • FIG. 24 is a diagram illustrating a specific example of the operation screen obtained as a result of the operation screen generation process executed by the operation screen processing unit 24 in the present embodiment.
  • the operation screen processing unit 24 generates the operation screen by arranging the icons of the peripheral devices in accordance with the icon arrangement positions determined by the icon arrangement determining unit 33 based on the device direction information shown in FIG. 22.
  • the positions of the icons of the peripheral devices displayed on the display unit 12 correspond to the actual directions of the peripheral devices shown in FIG.
  • in the example shown in FIG. 24, the top of the touch panel screen corresponds to north. The icon of the digital television 1 is displayed in the approximately north direction, the icon of the digital photo frame 3 in the approximately northeast direction, the icon of the printer 2 in the approximately east direction, and the icon of the personal computer 4 in the approximately southeast direction, each matching the direction in which the device actually exists.
  • the icons of the related items (peripheral devices) can be displayed in a ring shape corresponding to the actual directions of the peripheral devices. Therefore, it becomes possible to provide the user with an operation screen that is easier to understand and does not contradict the user's intuition, and it is possible to realize a user interface in which the next operation can be continued in a more natural flow.
  • FIG. 25 is a diagram showing a specific example of the operation screen obtained following the operation screen shown in FIG. 24 as a result of the operation screen generation process executed by the operation screen processing unit 24.
  • each peripheral-device icon shown in FIG. 24 indicates the related item “transfer the central object (here, the object “Photo 1”) to that peripheral device” when the icon is selected.
  • the contact operation for selecting an icon is an operation of selecting the object 80 with a finger, dragging it to the desired icon (transfer destination peripheral device), and releasing it.
  • the contact operation as described above is linked to the operation of transmitting data (object 80) to the peripheral device, and is an operation that is easy for the user to understand intuitively.
  • icons are displayed in correspondence with the actual positional relationship between the tablet terminal 100 and each peripheral device (direction in which the peripheral device actually exists). For this reason, the user can intuitively grasp the relevance between his / her contact operation and information processing generated as a result. For example, if the user wants to transfer a photo to the digital television 1, the user may drag the photo object in the direction in which the digital television 1 is actually located.
  • a tablet terminal 100 in which the above-described first to fourth embodiments are appropriately combined also falls within the scope of the present invention.
  • the control unit 10 of the tablet terminal 100 according to each of the first to fourth embodiments may be provided with any or all of the gesture determination unit 25, the cooperation processing execution unit 26, and the device direction specifying unit 27, even when they are not essential in that embodiment.
  • similarly, the operation screen processing unit 24 of the control unit 10 may be provided with any or all of the ring shape determination unit 30, the icon order determination unit 31, the animation determination unit 32, and the icon arrangement determination unit 33.
  • when arranging icons, the operation screen processing unit 24 may place the selected object (for example, the object 80 in FIGS. 8A and 8B) in the center of the touch panel screen.
  • in this case, the ring shape determining unit 30 determines the position of the ring so as to arrange the icons around the centered object 80.
  • alternatively, the operation screen processing unit 24 may be configured to maintain the display position of the selected object 80 as it is. Even in this case, the ring shape determination unit 30 may determine the position and size of the ring so that the ring is displayed large in the center of the screen, as shown in FIG. 26.
  • FIG. 26 shows a modification of the related item icon display method.
  • the ring shape determination unit 30 may determine the position and size of the ring so that the object 80 remaining at its original position is at the center of the ring, as shown in FIG. 27.
  • FIG. 27 shows a variation of the related item icon display method.
  • the operation screen processing unit 24 includes an animation determination unit 32.
  • the animation determination unit 32 determines an animation to be given to all objects to be arranged on the operation screen, that is, objects, icons, rings, and the like. Thereby, when objects and icons are displayed, a visual effect (that is, an animation) can be given to the way they appear.
  • the animation determination unit 32 may give a visual effect such as fade-in (change of transparency) in addition to the movement of the object, icon, or ring.
  • the animation determination unit 32 may move the icons from different places so that they finally settle on the contour line of the ring, instead of causing the icons to appear on the contour line of the ring from the beginning.
  • the animation determination unit 32 may first give each icon a movement that diffuses it outward, and finally move each icon so that it is arranged on the outline of the ring.
  • the animation determination unit 32 may solve the above problem by giving an animation to the ring for arranging icons. Specifically, the animation determination unit 32 gives the ring an animation such that the ring shape determined by the ring shape determination unit 30 based on the original display position and size of the object 80 comes to be arranged large in the center of the screen over a certain period of time. As a result, the ring of icons, once arranged small around the object 80, gradually changes its shape with the passage of time and is finally arranged large in the center of the screen.
  • FIG. 28A shows the state of the icon ring initially arranged around the object 80, FIG. 28B shows a state in which the icon ring is being expanded, and FIG. 28C shows the state in which the ring of icons is finally arranged large at the center of the screen.
  • the icon can be displayed in a sufficient size without impairing the relevance between the user's contact operation and the displayed result.
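The ring-expansion animation of FIG. 28 amounts to interpolating the ring geometry between two keyframes over time. A minimal sketch; the keyframe values and function name are hypothetical.

```python
def ring_at(t, start, end):
    """Linearly interpolate the ring geometry (center x, center y,
    radius) from its initial small layout around the object to the
    final large layout in the screen center, for t in [0, 1]."""
    return tuple(s + (e - s) * t for s, e in zip(start, end))

# Hypothetical keyframes: small ring around the touched object
# -> large ring centered on the screen.
start = (120.0, 500.0, 60.0)   # (cx, cy, radius) around the object 80
end = (400.0, 300.0, 220.0)    # large, screen-centered
frames = [ring_at(i / 10, start, end) for i in range(11)]
```

An easing function (for example, ease-out) could replace the linear factor t to make the expansion feel more natural; the linear form is shown only for clarity.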
  • the animation determination unit 32 may give an animation that gradually increases the size of each icon according to the size of the ring, or may give an animation in which the size of each icon is fixed independently of the size of the ring and the interval between the icons is gradually increased instead.
  • the operation screen processing unit 24 is configured to simultaneously arrange a plurality of icons around the selected object.
  • the present invention is not limited to this, and when the operation screen processing unit 24 includes the animation determination unit 32, the animation determination unit 32 may determine the display timing of each icon.
  • FIG. 29 is a diagram showing a modification of the related item icon display method.
  • the animation determination unit 32 refers to the contact information stored in the contact information storage unit 44, and recognizes that the “enclose” gesture has occurred clockwise from t0 to tn.
  • the animation determination unit 32 determines that the first to eighth icons appear one by one at regular intervals in a clockwise direction so as to match this movement.
  • the animation determining unit 32 causes the icons to appear in order at regular intervals, as in (b), (c), (d), (e), ... of FIG. 29, and controls the display timing of the icons so that the complete operation screen is finally obtained.
  • the operation screen can be provided in a natural flow that does not contradict the user's intuition.
  • the icon arrangement determining unit 33 preferably makes the display position of the first icon roughly coincide with the start point of the finger trajectory (the contact position at time t0), and the display position of the last icon roughly coincide with the end point (the contact position at time tn).
  • the animation determination unit 32 may cause each icon to appear sequentially in accordance with not only the direction of the finger movement (clockwise or counterclockwise) but also the speed at which the finger moves when enclosing the object.
  • the contact information indicates a trajectory enclosing the object in the clockwise direction from time t0 to time tn. More specifically, it can be seen from this contact information that the contact position (the tip of the finger) is at the left of the object at time ta, at the upper left of the object at time tb, and at the upper right of the object at time tc.
  • the animation determination unit 32 causes the first icon to appear immediately below the object at time t0, and then, matching the finger speed, determines that the icons appear up to the left of the object (the third icon) at time ta, up to the upper left of the object (the fourth icon) at time tb, and up to the upper right of the object (the sixth icon) at time tc, and that finally all icons appear at time tn.
  • FIGS. 30B to 30F show the operation screens at the time points t0, ta, tb, tc, and tn, respectively.
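The timing determination described above can be sketched as follows: each icon becomes visible once the enclosing stroke has swept past that icon's share of the full circle, so the icons "chase" the finger at the finger's own speed. The sample format and function name are hypothetical.

```python
def icon_appearance_times(samples, n_icons):
    """Schedule when each of n_icons appears. `samples` is a list of
    (time, swept_angle_deg) pairs taken from the contact information;
    icon k appears at the first sample whose swept angle reaches
    k/n_icons of 360 degrees. If the stroke never reaches a
    threshold, the icon appears at the final sample time."""
    times = []
    for k in range(n_icons):
        threshold = k * 360.0 / n_icons
        t = next((t for t, ang in samples if ang >= threshold),
                 samples[-1][0])
        times.append(t)
    return times
```

With a fast stroke the swept angle grows quickly and the icons appear in rapid succession; with a slow stroke they appear slowly, which is the speed-matching behavior described above.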
  • the icon arrangement determining unit 33 determines which icon is arranged at which position based on the attribute of the extracted related item.
  • the icon arrangement determining unit 33 may determine the icon arrangement position of each related item in consideration of “operation attribute” which is one of the attributes of the related item.
  • the related item attribute “operation attribute” is information indicating whether the operation is an operation that acts on another device or an operation that is completed within the own device.
  • An example of the former is an operation of transmitting data to another device; an example of the latter is an operation of displaying data on the display unit of the own device.
  • Fig. 6 shows a specific example of "motion attribute”.
  • the related information table shown in FIG. 6 has a field of “operation attribute”.
  • when the operation of a related item is an “operation of transmitting data to another device”, identification information of “other-device transmission operation” indicating that fact is stored in association with the related item; when the operation is completed within the own device, identification information of “in-device operation” indicating that fact is stored in association with the related item.
  • FIG. 31 shows a variation of the icon arrangement pattern of related items.
  • the icon arrangement determining unit 33 first refers to the “operation attribute” of each related item extracted by the related item extracting unit 23 from the related information storage unit 42. Then, as shown in FIG. 31, the icon arrangement determining unit 33 determines to arrange the icons (icons 70 to 72) of related items whose “operation attribute” is “other-device transmission operation” in the upper half (or upper third) of the ring, that is, above the object 80. The icon arrangement determination unit 33 also determines to arrange the icons (icons 74 to 76) of related items whose “operation attribute” is “in-device operation” in the lower half (or lower third) of the ring.
  • when there is a related item that does not belong to either “other-device transmission operation” or “in-device operation”, the icon arrangement determining unit 33 determines to arrange the icons (icons 73 and 77) of such related items in the remaining empty space (or the middle third) of the ring.
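The attribute-based partitioning above can be sketched as follows; the attribute strings and item names are hypothetical English stand-ins for the identification information stored in the related information table of FIG. 6.

```python
def assign_ring_region(related_items):
    """Split related-item icons into ring regions by their 'operation
    attribute': other-device transmission operations go to the upper
    part of the ring, in-device operations to the lower part, and
    anything else to the remaining middle region."""
    regions = {"upper": [], "middle": [], "lower": []}
    for name, attr in related_items:
        if attr == "other-device transmission":
            regions["upper"].append(name)
        elif attr == "in-device operation":
            regions["lower"].append(name)
        else:
            regions["middle"].append(name)
    return regions
```

Icons within each region would then be spaced along that arc of the ring outline, so that "send away" actions sit above the object and "local" actions below it.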
  • when transmitting the data indicated by the central object 80 to another device, the user next performs a contact operation of touching the object 80 and dragging it to the icon indicating transmission to the other device.
  • that is, when causing the tablet terminal 100 to perform the “operation of transmitting to another device (information processing)”, the user makes a gesture of moving an object away from his or her own place. Intuitively, transmitting to another device is more strongly associated with the gesture of moving an object away from oneself than with the gesture of bringing an object toward oneself.
  • the relevance between the user's contact operation and the obtained result can be further increased, and as a result, the operation screen can be provided in a natural flow that does not contradict the user's intuition.
  • the “operation attribute” information may be a flag indicating whether or not the operation is an “other-device transmission operation”.
  • in this case, the icon placement determination unit 33 may determine that related items whose “operation attribute” is “TRUE”, that is, “other-device transmission operation”, are arranged in the upper half of the ring.
  • the icon arrangement determining unit 33 may determine to arrange the icon of each related item in time series.
  • for example, when the selected object is an album of photographs, the icon placement determining unit 33 may determine to place the icon of the first photo (here, a thumbnail image of the photo is preferable) at the center of the top of the ring and, starting from there, to arrange the remaining icons evenly in the order of shooting date and time.
  • alternatively, the icon arrangement determining unit 33 may determine to arrange the icons in accordance with the time indicated by the “temporal element” information of each related item, regarding the ring for arranging the icons as a clock face.
  • FIG. 32 shows a variation of the icon arrangement pattern of related items.
  • the icon arrangement determining unit 33 arranges each photo included in the album on the screen. At this time, the icon arrangement determining unit 33 arranges, for example, a photograph (group) 78 taken around 1 o'clock on the outline around the 1 o'clock position when the ring is regarded as a clock face, and a photograph (group) 79 taken around 3 o'clock on the outline around the 3 o'clock position.
  • in this way, the shooting date and time of each photo in the album can be expressed by the arrangement position of the photo, and the final result that is the user's purpose can be displayed with a simple contact operation and a small number of operations.
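The clock-face arrangement of FIG. 32 can be sketched by mapping a shooting time to an angle measured clockwise from the 12 o'clock position. Screen y is assumed to grow downward, and the function name and geometry are hypothetical.

```python
import math

def clock_position(hour, minute, center, radius):
    """Place a photo (group) on the ring as if the ring were a 12-hour
    clock face: photos taken around 1 o'clock sit near the 1 o'clock
    mark, around 3 o'clock near the 3 o'clock mark, and so on.
    Assumes screen coordinates with y increasing downward."""
    # 30 degrees per hour, measured clockwise from the 12 o'clock mark.
    angle = math.radians((hour % 12 + minute / 60.0) * 30.0)
    x = center[0] + radius * math.sin(angle)
    y = center[1] - radius * math.cos(angle)
    return x, y
```

For an elliptical ring, the same clock angle would be applied with separate horizontal and vertical radii.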
  • as shown in FIGS. 4A and 4B, when the tablet terminal 100 is a small portable terminal that can be operated with one hand or both hands, the area where the user can touch the screen with a finger is assumed to differ between one-handed and two-handed operation. As shown in FIG. 4B, when operating with both hands, any area of the touch panel can be touched. On the other hand, as shown in FIG. 4A, when operating with one hand, the contact position tends to be biased toward the area on the lower left side of the screen (when operated with the left hand) or the area on the lower right side of the screen (when operated with the right hand).
  • the tablet terminal 100 of the present invention solves the above problem by observing the usage status of the user.
  • the tablet terminal 100 of the present invention can be configured to detect the bias of the contact position of the finger and place the icon in an area where the user's finger is expected to reach immediately.
  • the contact information generation unit 21 of the tablet terminal 100 is configured to generate contact information indicating the user's contact operations that occurred within a predetermined period (for example, the past few seconds to several minutes), regardless of contact/non-contact switching and regardless of whether the contact operation is an “enclose” gesture, and to store it in the contact information storage unit 44.
  • FIG. 33 is a diagram for explaining the operation of the tablet terminal 100 of the present invention capable of presenting an operation screen in accordance with the usage status of the user. More specifically, FIG. 33A is a diagram illustrating an example of a situation where the user is operating with the left hand. FIG. 33B is a diagram showing a specific example of the contact information generated in accordance with the contact operation of FIG.
  • the contact information generation unit 21 generates contact information as shown in FIG. 33B for the above-described series of contact operations for a predetermined period (for example, the past few seconds to several minutes), The information is stored in the contact information storage unit 44.
  • the contact information generation unit 21 may be configured to delete the oldest locus every time a new locus is stored.
  • the operation screen processing unit 24 refers back to the contact information of the past several seconds to several minutes stored in the contact information storage unit 44, and detects whether or not the finger contact positions are biased. In the example shown in FIGS. 33A and 33B, the locus of the finger is biased toward the area 82 on the lower left side of the screen. The operation screen processing unit 24 detects this bias and identifies the user-accessible area as the area 82 on the lower left side of the screen. Note that the area 82 on the lower left side of the screen and the area 83 on the lower right side of the screen are defined in advance.
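The bias detection above can be sketched as follows, assuming touch coordinates with the origin at the upper left (y grows downward) and a hypothetical 80% threshold for deciding that contacts are concentrated in one predefined region.

```python
def accessible_region(contact_points, width, height):
    """Detect which predefined screen region the recent finger
    contacts are biased toward: 'lower-left' or 'lower-right'
    (suggesting one-handed use), or None when contacts are spread
    out (suggesting two-handed use). The 80% threshold and the
    half-screen region boundaries are assumptions."""
    lower_left = [p for p in contact_points
                  if p[0] < width / 2 and p[1] > height / 2]
    lower_right = [p for p in contact_points
                   if p[0] >= width / 2 and p[1] > height / 2]
    n = len(contact_points)
    if n and len(lower_left) / n >= 0.8:
        return "lower-left"
    if n and len(lower_right) / n >= 0.8:
        return "lower-right"
    return None
```

When a region is returned, the ring shape determining unit would then fit the icon ring inside that region, as described for the area 82.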
  • the ring shape determining unit 30 of the operation screen processing unit 24 determines the shape, size, and arrangement position of the ring so that the ring for arranging the icons fits in the area 82 on the lower left side of the screen.
  • FIG. 34 is a diagram illustrating an example of an operation screen when icons are arranged according to the ring shape determined by the ring shape determination unit 30. As shown in FIG. 34, since the related items of the selected object 80 are displayed so as to fit within the area 82 on the lower left side of the screen, the user can select the target icon with the thumb without having to stretch it.
  • the tablet terminal 100 may determine whether or not the user is operating with the thumb based on the thickness of the line of the finger trajectory, in order to determine the usage situation in which the user is operating with one hand. If it is determined that the operation is performed with the thumb, the tablet terminal 100 may determine that the operation is one-handed and display the icons at the bottom of the screen.
  • a sensor is provided in the casing of the tablet terminal 100, and the tablet terminal 100 determines whether the casing is gripped by four fingers or five fingers, and accordingly One-handed operation or two-handed operation may be determined.
  • the operation screen processing unit 24 of the tablet terminal 100 refers to the contact coordinate information of the area surrounded by the locus of the finger, identifies the locus area, and the locus area and the vicinity thereof are user accessible areas. It may be determined that an icon ring is arranged there.
  • the tablet terminal 100 may assume the usage situation in which the user operates with one hand and, when an icon is not arranged within the area the user can touch, allow the user to bring that icon into the contactable area.
  • FIG. 35A is a diagram illustrating a state in which the user rotates the ring of icons by dragging within the contactable area, and FIG. 35B is a diagram illustrating an example of the operation screen after the arrangement of icons has been changed by the rotation of the ring caused by the above contact operation (“drag”).
  • the tablet terminal 100 is configured to present the icons arranged in a ring shape so that they can be rotated along the outline of the ring.
  • suppose that an icon ring is displayed on the touch panel (display unit 12) in the arrangement shown in FIG. 35A. Since the user operates the tablet terminal 100 with only the left hand, the contactable area is limited (for example, to the area 82 shown in FIG. 33B).
  • the gesture determination unit 25 determines the direction in which the finger moves by the drag, and recognizes that an instruction to “rotate the icon ring in the direction corresponding to that movement” has been input.
  • the animation determination unit 32 of the operation screen processing unit 24 outputs an operation screen to which an animation for rotating an icon is given to the display unit 12 according to the direction determined by the gesture determination unit 25.
  • each icon rotates clockwise on the outline of the ring.
  • the user can thereby bring the icon of the television, which was arranged at the top of the touch panel, into the contactable area (for example, the area 82) as shown in FIG. 35 (b), and can then select the TV icon.
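The ring rotation above can be sketched by shifting every icon's angular position along the ring contour in the dragged direction; the step size, direction labels, and function name are hypothetical.

```python
def rotate_ring(icon_angles, direction, step_deg):
    """Rotate all icon positions along the ring outline in response
    to a drag: 'cw' advances every icon's angle clockwise by
    step_deg, 'ccw' rotates the other way. Angles are degrees
    measured along the ring contour."""
    delta = step_deg if direction == "cw" else -step_deg
    return [(a + delta) % 360.0 for a in icon_angles]
```

Repeatedly applying small steps per animation frame yields the smooth rotation that the animation determination unit 32 outputs to the display unit 12.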
  • when the icon arrangement determining unit 33 determines that not all icons extracted by the related item extracting unit 23 can be displayed in consideration of the determined ring shape size and icon size, the icon arrangement determining unit 33 may decide to reduce the number of icons to be displayed.
  • the icon arrangement determining unit 33 may determine the number of icons to be displayed based on the absolute size of the finger trajectory (or the enclosed area) with reference to the contact information. Thus, the user can intentionally adjust the number of icons to be displayed next by changing whether the object is surrounded by a smaller ring or a larger ring.
  • the icon arrangement determining unit 33 displays a specific related item as an icon based on the information of “condition” which is one of the attributes of the related item. You may decide not to.
  • Fig. 6 shows a specific example of "condition”.
  • the related information table shown in FIG. 6 has a “condition” field.
  • the attribute “condition” of the related item is information indicating a condition for displaying the icon of the related item.
  • condition “if there is attached information” is associated with each of the related items “display attached information of photo” and “display attached information of album”. This condition prescribes that if there is attached information in the photo, an icon is displayed, and if there is no attached information in the photo, no icon is displayed.
  • when a photo object is selected and the eight related items (related item group 60) shown in FIG. 6 are extracted, if the selected photo does not include attached information, the icon placement determination unit 33 determines that only the seven related items excluding the related item “display attached information of photo” are arranged in the ring.
  • the operation screen processing unit 24 may include an icon rank determining unit 31.
  • the icon ranking determining unit 31 gives priority to the related items extracted by the related item extracting unit 23 based on the attributes of the related items.
  • the related information includes a “selection frequency” field as one of the attributes of the related item.
  • “Selection frequency” is information indicating the number of times that the related item has been selected by the user in the past.
  • the icon order determination unit 31 assigns priorities to each related item in the descending order of the “selection frequency” of the extracted related items.
  • the icon arrangement determining unit 33 can determine the icon arrangement according to the priority order determined by the icon order determining unit 31.
  • icons can be arranged clockwise from the highest priority, or related item icons with higher priority can be arranged in the upper half of the ring.
  • when reducing the number of icons, the icon arrangement determining unit 33 can remove icons starting from those with the lowest priority.
  • the tablet terminal 100 of the present invention may accept selection of a plurality of objects on the object list screen. In this case, the tablet terminal 100 can extract related items according to the combination of the selected objects.
  • FIG. 36A is a diagram illustrating a state in which the user performs a contact operation of “enclosing” an object in order to select a plurality of target objects.
  • FIG. 36B is a diagram illustrating a specific example of the operation screen generated by the operation screen processing unit in accordance with the contact operation illustrated in FIG.
  • the object specifying unit 22 specifies that three photographs of the object 80, the object 84, and the object 85 are selected.
  • the related item extraction unit 23 refers to the related information shown in FIG. 6 and extracts the related item group 60 associated with the object “photograph”.
  • the operation screen processing unit 24 reads icon images corresponding to the related item group 60 from the icon storage unit 43 and arranges them in a ring around the object. However, since there are three selected objects here, the operation screen processing unit 24 displays the three selected photos (objects 80, 84, 85) in the center of the ring. Thus, the user can obtain an operation screen in which what he / she actually encloses is displayed in the center, and icons related to these are displayed in a ring shape around them. This makes it possible to easily grasp the relationship between what is actually enclosed and what is obtained as a result. In the present modification, three photographs (objects 80, 84, 85) arranged together in the center can be selected together by one touch operation.
  • by dragging the three photos displayed at the center to the TV icon at once, the user can cause the tablet terminal 100 to execute a process of transferring all three photos to the TV. Since there is no need to drag each photo individually, the number of user actions can be reduced.
  • FIG. 37A is a diagram illustrating a state in which the user performs a contact operation of “enclosing” objects in order to select a plurality of different types of objects.
  • FIG. 37B is a diagram illustrating a specific example of the operation screen generated by the operation screen processing unit in accordance with the contact operation illustrated in FIG.
  • FIG. 38 is a diagram illustrating another example of the related information stored in the related information storage unit 42 in the present modification.
  • on the object list screen, the user surrounds, in addition to the three photographs (objects 80, 84, 85), an object of a different type from the photographs, namely the music file 86.
  • the object specifying unit 22 specifies that a total of four objects of three photos 80, 84, and 85 and a music file 86 have been selected.
  • the related item extraction unit 23 refers to the related information shown in FIG. 38 and extracts the related item group 66 associated with the object combination “photo + music file”.
  • the operation screen processing unit 24 reads out icon images corresponding to the related item group 66 from the icon storage unit 43, and arranges them in a ring around the object.
  • the operation screen processing unit 24 gathers the three selected photos and the music file together and displays them in the center of the ring.
  • the user can obtain an operation screen in which what he / she actually encloses is displayed in the center, and icons related to these are displayed in a ring shape around them. This makes it possible to easily grasp the relationship between what is actually enclosed and what is obtained as a result.
  • even when the enclosed objects consist of a plurality of types of data, related items suitable for the combination of those objects are extracted and their icons are displayed.
  • for example, photo-only items such as “print (photo)” or “edit photo” are not extracted; instead, related items that use both “photos” and “music files”, such as “display slide show”, are extracted.
  • Icon 87 is displayed around the center object.
  • when the three photos are dragged to the icon 87, the tablet terminal 100 reproduces the music file 86 and displays the three photos in a slide show.
  • the tablet terminal 100 can accept a contact operation in which a plurality of objects, including objects of different types, are selected by enclosing them all at once, and can extract appropriate related items according to the combination of the selected objects.
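The combination-dependent extraction described above can be illustrated with a minimal sketch (all names and item labels here are hypothetical, chosen only to mirror the examples of FIG. 6 and FIG. 38): the related information storage unit is modeled as a table keyed by the set of selected object types, so that enclosing photographs together with a music file yields combined items such as “display slide show” rather than photo-only items.

```python
# Sketch of a related-information table keyed by the *combination* of
# selected object types (hypothetical names and items).
RELATED_ITEMS = {
    frozenset(["photo"]): ["print (photo)", "edit photo", "send to TV"],
    frozenset(["music"]): ["play", "add to playlist"],
    frozenset(["photo", "music"]): ["display slide show"],
}

def extract_related_items(selected_objects):
    """Return the related items for the combination of selected object types.

    `selected_objects` is a list of (object_id, type) tuples; several
    objects of the same type (e.g. three photos) collapse to a single
    type in the lookup key.
    """
    type_combination = frozenset(obj_type for _, obj_type in selected_objects)
    return RELATED_ITEMS.get(type_combination, [])
```

With this table, enclosing three photos (objects 80, 84, 85) yields the photo-only items, while additionally enclosing the music file 86 switches the result to the combined item “display slide show”.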
  • the tablet terminal 100 can display the final result desired by the user with a simple contact operation and a small number of operations, in a natural flow that does not contradict the user's intuition.
  • the tablet terminal 100 including the touch panel can thus realize excellent operability.
  • an information processing apparatus including a touch panel, comprising: contact motion acquisition means for acquiring a trajectory along which an indicator has moved on the touch panel; object specifying means for specifying, as a selected object, an object at least a part of which is included in a region surrounded by the trajectory acquired by the contact motion acquisition means; related item extraction means for extracting, as related items, items associated with the object specified by the object specifying means, with reference to a related information storage unit that stores objects and items related to the objects in association with each other; and operation screen processing means for displaying, on the touch panel, icons of the related items extracted by the related item extraction means arranged on the outline of a ring.
  • an operation screen display method in an information processing apparatus including a touch panel, comprising: a contact motion acquisition step of acquiring a trajectory along which an indicator has moved on the touch panel; an object specifying step of specifying, as a selected object, an object at least a part of which is included in a region surrounded by the trajectory acquired in the contact motion acquisition step; a related item extraction step of extracting, as related items, items associated with the object specified in the object specifying step, with reference to a related information storage unit that stores objects and items related to the objects in association with each other; and an operation screen processing step of displaying, on the touch panel, icons of the related items extracted in the related item extraction step arranged side by side on the outline of a ring.
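The four steps of the operation screen display method above can be sketched end to end as follows. This is a non-authoritative illustration: object specification is simplified to testing whether an object's center point lies inside the enclosed region (the claims use an "at least a part" criterion), and the ring is rendered as a circle centered on the trajectory's centroid; all field names are hypothetical.

```python
import math

def build_operation_screen(trajectory, objects, related_table):
    """Sketch of the method: (1) the trajectory is assumed already acquired,
    (2) objects whose center lies inside the enclosed region are selected,
    (3) related items are looked up by the combination of object types,
    (4) their icons are placed at equal angles on a ring."""
    def inside(pt, poly):  # ray-casting point-in-polygon test
        x, y = pt
        hit = False
        for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
            if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                hit = not hit
        return hit

    selected = [o for o in objects if inside(o["center"], trajectory)]
    items = related_table.get(frozenset(o["type"] for o in selected), [])

    # Ring centered on the enclosed region; radius reaches the trajectory.
    cx = sum(x for x, _ in trajectory) / len(trajectory)
    cy = sum(y for _, y in trajectory) / len(trajectory)
    radius = max(math.hypot(x - cx, y - cy) for x, y in trajectory)
    icons = []
    for i, item in enumerate(items):
        a = 2 * math.pi * i / len(items)  # equal angular spacing
        icons.append((item, (cx + radius * math.cos(a), cy + radius * math.sin(a))))
    return selected, icons
```

The returned list of (item, position) pairs corresponds to the icons the operation screen processing step would draw on the ring's outline.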
  • each block of the tablet terminal 100, in particular the contact information generation unit 21, the object identification unit 22, the related item extraction unit 23, the operation screen processing unit 24, the gesture determination unit 25, the cooperation processing execution unit 26, the device direction identification unit 27, the ring shape determining unit 30, the icon order determining unit 31, the animation determining unit 32, and the icon arrangement determining unit 33, may be configured by hardware logic, or may be realized by software using a CPU as follows.
  • the tablet terminal 100 includes a CPU (central processing unit) that executes instructions of a control program realizing each function, a ROM (read only memory) that stores the program, a RAM (random access memory) into which the program is loaded, and a storage device (recording medium) such as a memory that stores the program and various data.
  • the object of the present invention can also be achieved by supplying the tablet terminal 100 with a recording medium on which program code (an executable program, an intermediate code program, or a source program) of the control program of the tablet terminal 100, which is software realizing the functions described above, is recorded in a computer-readable manner, and by having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
  • examples of the recording medium include tape systems such as magnetic tapes and cassette tapes; disk systems including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical disks such as CD-ROM / MO / MD / DVD / CD-R; card systems such as IC cards (including memory cards) and optical cards; and semiconductor memory systems such as mask ROM / EPROM / EEPROM / flash ROM.
  • the tablet terminal 100 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network.
  • the communication network is not particularly limited.
  • the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, a satellite communication network, and the like are available.
  • the transmission medium constituting the communication network is not particularly limited.
  • wired lines such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL lines can be used.
  • wireless media such as infrared (IrDA, remote control), Bluetooth (registered trademark), IEEE 802.11 wireless, HDR, mobile phone networks, satellite lines, and terrestrial digital networks can also be used.
  • the present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.
  • the present invention can be widely applied to information processing apparatuses including an input unit and a display unit.
  • it can be suitably used for digital TVs, personal computers, smartphones, tablet PCs, notebook computers, mobile phones, PDAs (Personal Digital Assistants), electronic book readers, electronic dictionaries, portable and home game machines, electronic blackboards, and the like, each provided with an input unit and a display unit.

Abstract

This information processing device, which is provided with an input unit and a display unit, achieves superior operability. A tablet terminal (100) is characterized by being provided with: a touch information generation unit (21) that acquires the trajectory along which an indicator pointing out a position on the screen of the display unit (12) has moved; an object specification unit (22) that specifies, as a selected object, an object of which at least a portion is included in a region encircled by the trajectory; a related item extraction unit (23) that extracts the item associated with the specified object as a related item by referring to a related information recording unit (42), which associates and records objects and items related to the objects; and an operation screen processing unit (24) that disposes the icons of the extracted related items along the contour line of a ring, and displays the icons at the display unit.

Description

Information processing apparatus, operation screen display method, control program, and recording medium

 The present invention relates to a user interface technology of an information processing apparatus including an input unit and a display unit.
 In recent years, so-called tablet terminals such as smartphones and tablet PCs have spread rapidly. A tablet terminal has a flat outer shape and includes a touch panel serving as both a display unit and an input unit. By touching an object displayed on the touch panel with a finger, a pen, or the like, the user can perform various operations on the tablet terminal body.
 Through the touch panel, the tablet terminal can discriminate the user's various contact actions on the screen and can display objects in accordance with those contact actions. Contact actions include, for example, tapping (lightly striking), flicking, pinching, and dragging an object displayed on the screen with a finger (or pen). The tablet terminal discriminates these various contact actions and, according to the result, selects or moves objects, scrolls lists, and enlarges or reduces images. By realizing more intuitive operation through the touch panel as described above, tablet terminals have won the support of many people.
 For example, Patent Document 1 discloses a mobile communication terminal including a touch panel display unit. In the mobile communication terminal of Patent Document 1, an object (a URL, an e-mail address, a character string, an image, or the like) can be selected by touching it so as to trace over it with a finger (or pen), or by touching it so as to surround it. When an object is selected by such an action, the mobile communication terminal extracts a keyword from the selected object and accesses a related site.
 Patent Document 2 discloses a portable device including a touch panel display. The portable device of Patent Document 2 displays a through image (such as an image captured by a camera) on the touch panel display, detects a specific target in the through image selected by touching so as to surround it, and can display a reduced image of the specific target at the edge of the touch panel display as a release button.
 Patent Document 3 discloses a website search system using a touch panel. When a keyword displayed in a keyword display area is touched by hand, the website search system of Patent Document 3 accepts it as a search keyword and displays a first mother icon corresponding to the accepted keyword. The website search system then searches for websites with a search engine according to the keyword, and displays thumbnail images of the retrieved websites around the first mother icon.
 Patent Document 4 discloses an information processing apparatus including a display panel having a contact sensor. The information processing apparatus of Patent Document 4 detects a rotation of an operating body (finger) by a predetermined angle or more while an object is selected, and displays operation items related to the object around the object.
 Patent Document 5 discloses an information processing apparatus including a touch panel unit. The information processing apparatus of Patent Document 5 acquires the trajectory of the user's touch position, specifies an object image selected by the trajectory, and moves the selected object image to a position corresponding to an end point of the trajectory.
 Meanwhile, not only in tablet terminals but in information processing apparatuses in general, regardless of the size of the apparatus, the user interface technique of displaying a menu and accepting a selection is very commonly used. According to this technique, the information processing apparatus displays a menu so that the user can select a desired item, and accepts the user's selection. The user can thus operate the apparatus by selecting a desired item. For example, Patent Documents 6 to 9 disclose information processing apparatuses that realize menu display for the purpose of improving user convenience and operability.
JP 2010-218322 A (published September 30, 2010)
JP 2010-182023 A (published August 19, 2010)
JP 2009-134738 A (published June 18, 2009)
JP 2011-13980 A (published January 20, 2011)
JP 2006-244353 A (published September 14, 2006)
JP H8-305535 A (published November 22, 1996)
JP H10-307674 A (published November 17, 1998)
JP H11-507455 A (published June 29, 1999)
JP 2001-265475 A (published September 28, 2001)
 The operability of a tablet terminal depends on how simple the contact action is and how few actions are needed to display the final result the user is aiming for, and on whether the result based on that contact action is displayed in a natural flow that does not contradict the user's intuition.
 Such improvement in operability is realized by properly grasping the user's purpose, state, and tendencies. The tablet terminal is required to "sense" the user's intention from every point of view: for example, what does the user want to do now, and what next; how is the user operating the terminal now, and where is the user; what display would feel natural in response to the user's movement; and so on.
 The configurations of the devices of Patent Documents 1 to 9 described above are not necessarily sufficient to sense the user's intention.
 More specifically, Patent Document 1 discloses selecting an object by the action of surrounding it, but does not disclose extracting and displaying items related to the object in response to that action. Patent Document 2 discloses displaying an icon corresponding to an object in response to the action of surrounding the object, but does not disclose selecting the object by that action and, along with the selection, extracting and displaying items related to the object. Patent Document 3 discloses that when an object is selected, the object is displayed and thumbnails related to the object are displayed around it, but the action of surrounding an object and the display of the thumbnails are not linked. Patent Document 4 discloses touching an object to select it and displaying icons related to the object around it; however, in order to display the icons around the object, a cumbersome action (such as pressing a finger against the touch surface and twisting it to change its angle) must be performed separately from the selection of the object, so the number of operations increases and the operation required to display the desired result (the icons) becomes very complicated.
 As a result, there is the problem that the apparatus cannot be made to perform the series of processes of selecting an object, confirming it, displaying the results (related items), selecting a result, ..., and displaying the final result, with a simple contact action, a small number of actions, and intuitive contact actions.
 The above operability problem arises not only in small tablet terminals excellent in portability, but commonly in information processing apparatuses of any size provided with a touch-panel display/input unit, and further in information processing apparatuses provided with a display unit and an input unit of any form, not limited to touch panels.
 The present invention has been made in view of the above problems, and an object thereof is to realize excellent operability in an information processing apparatus including an input unit and a display unit.
 In order to solve the above problem, an information processing apparatus of the present invention includes: trajectory acquisition means for acquiring a trajectory along which an indicator pointing at a position on a screen of a display unit has moved; object specifying means for specifying, as a selected object, an object at least a part of which is included in a region surrounded by the trajectory acquired by the trajectory acquisition means; related item extraction means for extracting, as related items, items associated with the object specified by the object specifying means, with reference to a related information storage unit that stores objects and items related to the objects in association with each other; and operation screen processing means for arranging icons of the related items extracted by the related item extraction means side by side on the outline of a ring and displaying them on the display unit.
 According to the above configuration, the trajectory acquisition means acquires the trajectory of the surrounding action (the movement of the indicator), and based on this trajectory the object specifying means specifies the object the user selected by the surrounding action. Subsequently, the related item extraction means extracts items related to the specified object. Since the related information storage unit stores objects and items related to those objects in association with each other, the related items extracted by the related item extraction means are all items related to the object selected by the user. Finally, the operation screen processing means arranges the icons of the extracted related items in the shape of a ring, which is easily associated with the surrounding action.
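The "at least a part is included" criterion used by the object specifying means can be sketched as follows (a hedged illustration, not the claimed implementation itself): an object, represented by its bounding box, counts as partially enclosed if any of its corners lies inside the polygon formed by the closed trajectory, or if any trajectory point falls inside the box.

```python
def partially_enclosed(box, trajectory):
    """Return True if the object's bounding box is at least partly inside
    the region enclosed by the trajectory. `box` is (left, top, right,
    bottom) in screen coordinates; `trajectory` is a list of (x, y) points
    treated as a closed polygon."""
    left, top, right, bottom = box

    def inside(pt, poly):  # ray-casting point-in-polygon test
        x, y = pt
        hit = False
        for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
            if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                hit = not hit
        return hit

    corners = [(left, top), (right, top), (right, bottom), (left, bottom)]
    if any(inside(c, trajectory) for c in corners):
        return True
    # Covers the case where the stroke passes through the object itself.
    return any(left <= x <= right and top <= y <= bottom for x, y in trajectory)
```

A full implementation would also handle the degenerate case of an edge-only intersection, but the two tests above already accept objects that merely poke into the enclosed region.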
 Thus, the icons are arranged as described above and the generated operation screen is presented to the user. In this way, triggered by the extremely natural and simple user action of "surrounding" an object with an indicator (such as a pen or a finger) in order to specify it, the information processing apparatus of the present invention can provide the user with an operation screen in which the icons of the related items of the selected object are arranged in the shape of a ring.
 Since the trajectory of "surrounding" has a shape that encloses something, the shape of the ring and the trajectory of the indicator obtained by the surrounding action can be said to be similar. Therefore, the result (icons arranged in a ring) is easily associated with the user's preceding surrounding action.
 In other words, the transition from the event "the user performs an action of 'surrounding' an object" to the event "an operation screen with icons arranged in a ring is displayed" is a natural flow that does not contradict the user's intuition.
 In addition, the information processing apparatus of the present invention can anticipate the related items the user is likely to select next after selecting an object, and display them to the user in a selectable manner. Specifically, according to the above configuration, the icons arranged on the operation screen by the operation screen processing means are all icons of related items extracted as items related to the object the user selected. That is, after the object is surrounded and selected, the related items likely to be selected next are displayed as a ring of icons, so the user can immediately specify the next desired icon from within the ring.
 Moreover, a menu list in which the icons of related items are arranged in a ring has the following advantage over a linear, one-dimensional menu list. In a one-dimensional menu list, icons are arranged, for example, from top to bottom or from left to right, so each icon is unintentionally given a priority according to its position. In contrast, with a ring-shaped menu list, all the icons arranged on the ring can be treated equally.
 From the above, the information processing apparatus of the present invention can display the final result desired by the user in a natural flow that does not contradict the user's intuition, with a simple action and a small number of actions. As a result, excellent operability can be realized in an information processing apparatus including an input unit and a display unit.
 In the information processing apparatus of the present invention, the operation screen processing means preferably determines the position and size of the ring so that the icons are arranged around the selected object.
 According to the above configuration, in response to the extremely natural and simple user action of "surrounding" an object in order to specify it, the information processing apparatus can output the result of arranging the icons around that object.
 The user obtains, as a result, an operation screen in which the icons of the related items are arranged so as to surround the object he or she previously surrounded and selected. The positional relationship between these icons and the object matches the positional relationship between the object and the trajectory of the indicator's movement produced by the action the user performed earlier. Moreover, the trajectory of the indicator's movement obtained by surrounding the object is similar to the shape of the ring on which the icons are arranged.
 In other words, the transition from the event "the user performs an action of 'surrounding' an object" to the event "an operation screen with icons arranged around the object is displayed" is an even more natural flow that does not contradict the user's intuition.
 Furthermore, a menu list in which the icons of related items are arranged in a ring around the object has the following advantages over a linear, one-dimensional menu list. In a one-dimensional menu list, icons are arranged, for example, from top to bottom or from left to right, so each icon is unintentionally given a priority according to its position. In contrast, with a ring-shaped menu list, all the icons arranged on the ring can be treated equally. Furthermore, even if a one-dimensional menu list is displayed near the previously selected object, it is difficult to express the relationship between the object and each icon. In contrast, if a ring-shaped menu list is displayed around the previously selected object, the user can naturally recognize that there is a relationship between the previously selected (surrounded) object and each of the icons around it.
 From the above, the information processing apparatus of the present invention can display the final result desired by the user in a natural flow that does not contradict the user's intuition, with a simple action and a small number of actions. As a result, excellent operability can be realized in an information processing apparatus including a touch panel.
 本発明の情報処理装置では、上記操作画面処理手段は、上記軌跡取得手段によって取得された軌跡、もしくは、その相似形または近似形を、上記環の形状として決定することが好ましい。 In the information processing apparatus of the present invention, it is preferable that the operation screen processing means determines the trajectory acquired by the trajectory acquisition means, or a similar shape or an approximate shape thereof, as the shape of the ring.
 上記構成によれば、ユーザがフリーハンドで任意の形状でオブジェクトを囲うという動作を行うが、このときの軌跡が、軌跡取得手段によって保持される。そして、操作画面処理手段は、操作画面を作成する際、上記のようにして得られた軌跡と同じまたは相似形の環の輪郭線上に、所定の領域(あるいはオブジェクトそのもの)を囲むようにして各アイコンを配置する。 According to the above configuration, the user performs an operation of freely surrounding the object with an arbitrary shape, and the trajectory at this time is held by the trajectory acquisition means. Then, when creating the operation screen, the operation screen processing means sets each icon so as to surround a predetermined area (or the object itself) on the contour line of the ring that is the same or similar to the locus obtained as described above. Deploy.
 これにより、オブジェクトの周囲を「囲う」という、オブジェクトを指定する上で、極めて自然で簡易なユーザの動作を契機として、アイコンを環状に配置するという結果を出力することができる。つまり、ユーザは、所定の領域(あるいはオブジェクトそのもの)を囲むように関連項目のアイコンが配置された操作画面を結果物として得ることができる。 This makes it possible to output a result that icons are arranged in a ring shape in response to a very natural and simple user action in specifying the object “enclose” around the object. That is, the user can obtain, as a result, an operation screen in which icons of related items are arranged so as to surround a predetermined area (or the object itself).
 さらに、アイコンが配置される環の形状は、オブジェクトを囲うようにして得られた操作体の移動の軌跡に一致するか、それと相似の関係にある状態で操作画面に表示される。 Furthermore, the shape of the ring in which the icons are arranged is displayed on the operation screen in a state that matches or is similar to the movement trajectory of the operation body obtained by surrounding the object.
 つまり、ユーザがオブジェクトを囲うと、「ユーザがオブジェクトを囲ったとおりに」アイコンが配置された操作画面が得られる。この事象の遷移は、ユーザの直感に反しないより自然な流れであると言える。 That is, when the user surrounds the object, an operation screen on which icons are arranged “as the user surrounded the object” is obtained. It can be said that the transition of this event is a more natural flow that does not contradict the user's intuition.
 また、ユーザが囲ったとおりの形状でアイコンが配置されるので、ユーザは思い通りの形状でオブジェクトを囲うことにより、思い通りの形状にアイコンが配置された操作画面を得ることができる。これにより、操作画面を表示して情報処理装置を操作する際の遊戯性が高まる。 Further, since the icons are arranged in the shape as enclosed by the user, the user can obtain the operation screen in which the icons are arranged in the desired shape by surrounding the object in the desired shape. Thereby, the playability at the time of displaying an operation screen and operating information processing apparatus increases.
 その上、ユーザは、アイコンの配置を予測して、自分が希望するとおりにオブジェクトを囲い、関連項目のアイコンを表示させることができるため、操作性はさらに向上する。 Moreover, since the user can predict the icon arrangement, enclose the object exactly as desired, and thereby display the icons of the related items, operability is improved further.
 以上のことから、本発明の情報処理装置は、簡易な動作且つ少ない動作数でありながら、ユーザの直感に反しない自然な流れで、ユーザが所望する最終結果物を表示させることができる。結果として、入力部および表示部を備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 From the above, the information processing apparatus of the present invention can display the final result the user desires through a simple action, with few operations, in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with an input unit and a display unit.
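 As an illustrative sketch only (this is not part of the patented embodiments, and every function name below is hypothetical), the ring placement described above, in which icons are spaced along a contour similar in shape to the enclosing trajectory, could be realized along these lines:

```python
import math

def _path_length(points):
    # Total perimeter of the closed trajectory.
    n = len(points)
    return sum(math.dist(points[i], points[(i + 1) % n]) for i in range(n))

def place_icons(trajectory, n_icons, scale=1.2):
    """Return n_icons positions evenly spaced (by arc length) along a ring
    that is similar to `trajectory`, enlarged by `scale` about its centroid."""
    cx = sum(p[0] for p in trajectory) / len(trajectory)
    cy = sum(p[1] for p in trajectory) / len(trajectory)
    # Enlarging the trajectory about its centroid yields a similar shape.
    ring = [(cx + (x - cx) * scale, cy + (y - cy) * scale) for x, y in trajectory]
    step = _path_length(ring) / n_icons
    positions, walked, target, i = [], 0.0, 0.0, 0
    while len(positions) < n_icons:
        a, b = ring[i % len(ring)], ring[(i + 1) % len(ring)]
        seg = math.dist(a, b)
        # Emit every icon position that falls within this segment.
        while walked + seg >= target and len(positions) < n_icons:
            t = 0.0 if seg == 0 else (target - walked) / seg
            positions.append((a[0] + (b[0] - a[0]) * t,
                              a[1] + (b[1] - a[1]) * t))
            target += step
        walked += seg
        i += 1
    return positions
```

 With `scale=1.0` the icons land exactly on the user's stroke; a scale slightly above 1 keeps them just outside the enclosed object, as in the described screens.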
 本発明の情報処理装置では、上記操作画面処理手段は、各アイコンが表示されるタイミングを個々に決定してもよい。 In the information processing apparatus of the present invention, the operation screen processing means may individually determine the timing at which each icon is displayed.
 上記構成によれば、抽出された関連項目のアイコンをすべて同時に表示するだけでなく、個々のアイコンを別々のタイミングで表示させるようなアニメーションが付加された操作画面を実現することができる。 According to the above configuration, it is possible to realize not only an operation screen that displays all the icons of the extracted related items simultaneously, but also one with an added animation in which the individual icons are displayed at different timings.
 さらに、本発明の情報処理装置では、上記軌跡取得手段は、上記指示体の移動の開始からの経過時間を計測し、上記軌跡を構成する点の少なくともいくつかに、上記経過時間を示す移動時間情報を対応付けて保持し、上記操作画面処理手段は、上記軌跡および上記移動時間情報から判明した上記指示体の移動方向に合わせて、時計回り、または、反時計回りに各アイコンを順次表示させることが好ましい。 Furthermore, in the information processing apparatus of the present invention, it is preferable that the trajectory acquisition means measures the elapsed time from the start of the indicator's movement and holds, in association with at least some of the points constituting the trajectory, movement time information indicating the elapsed time, and that the operation screen processing means displays the icons one after another, clockwise or counterclockwise, in accordance with the indicator's movement direction determined from the trajectory and the movement time information.
 上記構成によれば、上記軌跡の各点に移動時間情報が関連付けられているので、これを参照すれば、囲う動作について、移動方向、すなわち、時計回りでオブジェクトが囲まれたのか、反時計回りでオブジェクトが囲まれたのかが判明する。 According to the above configuration, movement time information is associated with the points of the trajectory. By referring to it, the movement direction of the enclosing action, that is, whether the object was enclosed clockwise or counterclockwise, can be determined.
 操作画面処理手段は、このユーザのオブジェクトを囲う実際の動き(指示体の移動方向)に一致させて、同じ方向に各アイコンを順次環状に出現させる。具体的には、操作画面処理手段は、時計回りでオブジェクトが囲まれた場合には、アイコンを1つ1つ時計回りに環状に配置し、反時計回りでオブジェクトが囲まれた場合には、アイコンを1つ1つ反時計回りに環状に配置する。 The operation screen processing means makes the icons appear one after another around the ring in the same direction as the user's actual enclosing movement (the movement direction of the indicator). Specifically, when the object was enclosed clockwise, the operation screen processing means places the icons on the ring one by one in the clockwise direction, and when the object was enclosed counterclockwise, one by one in the counterclockwise direction.
 これにより、ユーザが囲ったときの動きとほぼ同じ動き(指示体の移動方向と同じ動き)を伴った結果物(アイコンの出現パターン)が得られるので、ユーザの動作と表示された結果物との関連性をより高めることが可能となり、結果として、ユーザの直感に反しない自然な流れで操作画面を提供することができる。 As a result, the outcome (the pattern in which the icons appear) involves substantially the same movement as the user's enclosing action (the same direction as the indicator's movement), so the connection between the user's action and the displayed result can be strengthened further; consequently, the operation screen can be provided in a natural flow that does not contradict the user's intuition.
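 As a purely illustrative sketch (not the patent's implementation; the function name is hypothetical), the clockwise/counterclockwise decision from the time-stamped trajectory can be made with the shoelace (signed area) formula:

```python
def winding_direction(stamped_points):
    """Given [(t, x, y), ...] samples of the enclosing stroke, ordered by
    the elapsed-time value t, return 'clockwise' or 'counterclockwise'.
    In screen coordinates (y grows downward) a positive shoelace sum
    corresponds to a clockwise stroke."""
    pts = [(x, y) for _, x, y in sorted(stamped_points)]
    area2 = sum(x1 * y2 - x2 * y1
                for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]))
    return "clockwise" if area2 > 0 else "counterclockwise"
```

 The movement time information is what guarantees the samples can be put back into stroke order before the signed area is evaluated.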
 さらに、本発明の情報処理装置では、上記操作画面処理手段は、上記環における、上記軌跡の始点の位置と相対的に同じ位置を、最初のアイコンの表示位置に決定することが好ましい。 Furthermore, in the information processing apparatus of the present invention, it is preferable that the operation screen processing means determines, as the display position of the first icon, the position on the ring relatively corresponding to the position of the start point of the trajectory.
 上記構成によれば、操作画面処理手段は、指示体と同じ移動方向にて、アイコンを順次出現させる場合に、1番目のアイコンを出現させる、環の輪郭線上の出現開始位置を、指示体の移動の軌跡の始点に基づいて決定する。 According to the above configuration, when making the icons appear one after another in the same movement direction as the indicator, the operation screen processing means determines the appearance start position on the contour of the ring, where the first icon appears, based on the start point of the indicator's movement trajectory.
 つまり、アイコンを配置するための基準線となる環における、上記軌跡の始点の位置と相対的に同じ位置を最初のアイコンの表示位置として決定し、そこから、指示体と同じ移動方向(時計回り、または、反時計回り)にて、残りのアイコンを順次出現させる。 That is, on the ring serving as the reference line for icon placement, the position relatively corresponding to the start point of the trajectory is determined as the display position of the first icon, and from there the remaining icons appear one after another in the same movement direction as the indicator (clockwise or counterclockwise).
 これにより、動作から結果物がより一層連想され、ユーザの直感に反しないより自然な流れで操作画面を提供することができる。 This makes the result still more strongly evoked by the action, so the operation screen can be provided in an even more natural flow that does not contradict the user's intuition.
 さらに、本発明の情報処理装置では、上記操作画面処理手段は、上記軌跡を形成した上記指示体の移動速度に対応するように、各アイコンを順次表示させるタイミングを決定してもよい。 Furthermore, in the information processing apparatus of the present invention, the operation screen processing means may determine the timing for sequentially displaying the icons so as to correspond to the moving speed of the indicator that formed the trajectory.
 上記構成によれば、操作画面処理手段は、指示体と同じ移動方向にて、また、操作体の移動の開始位置を一致させてアイコンを順次出現させる場合に、各アイコンを環の所定位置に出現させるタイミングを、上記軌跡を形成した指示体の移動速度に対応させる。 According to the above configuration, when making the icons appear one after another in the same movement direction as the indicator, and from a start position matching that of the indicator's movement, the operation screen processing means makes the timing at which each icon appears at its predetermined position on the ring correspond to the moving speed of the indicator that formed the trajectory.
 これにより、ユーザが囲ったときの動きおよび動きの速度と相対的に同じ速度の同じ動きを伴った結果物(アイコンの出現パターン)が得られるので、ユーザの動作と表示された結果物との関連性をさらにより一層高めることが可能となり、結果として、ユーザの直感に反しない自然な流れで操作画面を提供することができる。 As a result, the outcome (the pattern in which the icons appear) involves the same movement, at relatively the same speed, as the user's enclosing action, so the connection between the user's action and the displayed result can be strengthened still further; consequently, the operation screen can be provided in a natural flow that does not contradict the user's intuition.
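 One possible way to derive per-icon appearance times from the stroke's own speed, again as an illustrative sketch with hypothetical names rather than the patented implementation, is to let icon k appear at the moment the indicator had covered fraction k/N of the stroke:

```python
import math

def icon_schedule(stamped_points, n_icons):
    """For samples [(t, x, y), ...] of the enclosing stroke, return one
    appearance time per icon: icon k appears at the time the indicator had
    covered fraction k/n_icons of the stroke's arc length, so a fast stroke
    yields a fast animation and a slow stroke a slow one."""
    pts = sorted(stamped_points)                       # order by elapsed time
    dists = [0.0]
    for (_, x1, y1), (_, x2, y2) in zip(pts, pts[1:]):
        dists.append(dists[-1] + math.dist((x1, y1), (x2, y2)))
    total = dists[-1]
    times, k = [], 0
    for (t, _, _), d in zip(pts, dists):
        # Emit an appearance time each time a k/n_icons threshold is crossed.
        while k < n_icons and d >= total * k / n_icons:
            times.append(t)
            k += 1
    return times
```

 Replaying these times when drawing the ring makes the animation mirror the tempo of the user's own gesture.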
 あるいは、本発明の情報処理装置では、上記関連情報記憶部には、複数の関連項目が連続してまたは同時に処理可能な関係であればそれらの複数の関連項目を連携有りとする場合に、上記連携の有無を示す連携情報が関連項目ごとに記憶されており、上記操作画面処理手段は、上記関連項目抽出手段によって抽出された関連項目のうち、連携を有する関連項目同士のアイコンを隣に並べて配置してもよい。 Alternatively, in the information processing apparatus of the present invention, the related information storage unit may store, for each related item, link information indicating the presence or absence of linkage, where a plurality of related items are regarded as linked if they are in a relationship that allows them to be processed consecutively or simultaneously, and the operation screen processing means may arrange next to each other the icons of those related items, among the related items extracted by the related item extraction means, that have linkage.
 連携を有する複数の関連項目は、同時に選択されれば、連続または同時に処理が実行される。したがって、連携する複数の関連項目のアイコンは、可能な限り少ない動作数とするために、同時に選択されることが好ましい。ここで、囲う動作を利用すれば、1回の囲う動作で、すべてのアイコンをまとめて囲い、同時に選択することができる。そして、1回の囲う動作で、すべてのアイコンをまとめて囲いやすくするために、各アイコンが環状に配置される場合であっても、連携を有する複数の関連項目のアイコンは、互いに隣り合って(3個以上の場合は連続して)並べて配置されることが好ましい。 If a plurality of linked related items are selected at the same time, their processing is executed consecutively or simultaneously. Accordingly, to keep the number of operations as small as possible, the icons of a plurality of linked related items are preferably selected at once. Here, by using the enclosing action, all of the icons can be enclosed together and selected simultaneously in a single enclosing action. And to make it easy to enclose all of them together in one action, the icons of a plurality of linked related items are preferably placed adjacent to each other (consecutively, in the case of three or more), even when the icons are arranged in a ring.
 上記構成によれば、連携を有する関連項目同士のアイコンは、隣に並ぶように配置される。つまり、連携するアイコンが2個の場合は互いに隣り合うように、3個以上の場合は連続して、環の輪郭線上に並べて配置される。 According to the above configuration, the icons of linked related items are placed side by side: two linked icons are placed adjacent to each other, and three or more are placed consecutively along the contour of the ring.
 これにより、ユーザは、連携を有する関連項目を、1回の囲う動作で、すべてのアイコンをまとめて囲いやすくなる。つまり、簡易な動作且つ少ない動作数でありながら、ユーザの直感に反しない自然な流れで、ユーザが所望する最終結果物を表示させることができる。結果として、入力部および表示部を備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 This makes it easy for the user to enclose all the icons of the linked related items together in a single enclosing action. That is, the final result the user desires can be displayed through a simple action, with few operations, in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with an input unit and a display unit.
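 Keeping linked items contiguous on the ring can be sketched as a simple reordering step. This is only an illustrative example (all names are hypothetical, and the linkage data here stands in for the link information in the related information storage unit):

```python
def order_icons(items, linked_groups):
    """Reorder the icon labels in `items` so that the members of each set in
    `linked_groups` end up contiguous, preserving first-appearance order."""
    group_of = {}
    for g in linked_groups:
        for name in g:
            group_of[name] = g
    placed, order = set(), []
    for name in items:
        if name in placed:
            continue
        # Pull in the whole linked group as soon as its first member appears.
        for member in sorted(group_of.get(name, {name}), key=items.index):
            if member not in placed:
                order.append(member)
                placed.add(member)
    return order
```

 The reordered list is then laid out along the ring contour as before, so one enclosing stroke can sweep up an entire linked group.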
 あるいは、本発明の情報処理装置は、さらに、自装置の位置を示す位置情報を取得する位置検知部と、自装置の向きを示す方向情報を取得する方向検知部と、自装置の周辺機器と通信して周辺機器の位置情報を取得する通信部と、上記位置検知部によって取得された自装置の位置情報と、上記通信部によって取得された周辺機器の位置情報とに基づいて、自装置と周辺機器との位置関係を特定するとともに、上記方向検知部によって取得された方向情報によって、自装置の向きを特定することにより、自装置に対して各周辺機器がどの方向に存在するのかを特定する機器方向特定手段とを備え、上記操作画面処理手段は、上記機器方向特定手段によって特定された、上記周辺機器が存在する方向に対応するように、該周辺機器の関連項目に対応するアイコンの配置位置を決定してもよい。 Alternatively, the information processing apparatus of the present invention may further include: a position detection unit that acquires position information indicating the position of the apparatus itself; a direction detection unit that acquires direction information indicating the orientation of the apparatus; a communication unit that communicates with peripheral devices of the apparatus to acquire their position information; and device direction identification means that identifies the positional relationship between the apparatus and each peripheral device based on the apparatus's position information acquired by the position detection unit and the peripheral devices' position information acquired by the communication unit, and that, by further determining the apparatus's orientation from the direction information acquired by the direction detection unit, identifies in which direction each peripheral device lies relative to the apparatus. The operation screen processing means may then determine the placement position of the icon corresponding to a peripheral device's related item so as to correspond to the direction, identified by the device direction identification means, in which that peripheral device lies.
 上記構成によれば、機器方向特定手段は、以下の3つの情報に基づいて、自装置に対して各周辺機器がどの方向に存在するのかを特定することができる。すなわち、(1)位置検知部が取得する、自装置の位置を示す位置情報、(2)方向検知部が取得する、自装置の向きを示す方向情報、および、(3)通信部が取得する、各周辺機器の位置情報である。 According to the above configuration, the device direction identification means can identify in which direction each peripheral device lies relative to the apparatus based on the following three pieces of information: (1) the position information indicating the apparatus's own position, acquired by the position detection unit; (2) the direction information indicating the apparatus's orientation, acquired by the direction detection unit; and (3) the position information of each peripheral device, acquired by the communication unit.
 そして、操作画面処理手段は、上記機器方向特定手段によって特定された、上記周辺機器が存在する方向に対応するように、該周辺機器の関連項目に対応するアイコンの配置位置を決定する。 Then, the operation screen processing means determines the placement position of the icon corresponding to the peripheral device's related item so as to correspond to the direction, identified by the device direction identification means, in which that peripheral device lies.
 この結果、周辺機器のアイコンがオブジェクトの周囲に環状に表示されたとき、中央のオブジェクトと各アイコンとの位置関係は、タブレット端末100と各周辺機器との位置関係に対応している。このため、ユーザは、より一層、自分の動作と、その結果発生する情報処理との間の関連性を直感的に把握することが可能となる。例えば、ユーザは、デジタルテレビ1に写真を転送したければ、実際にデジタルテレビ1がある方向に向かって写真のオブジェクトをドラッグすればよい。したがって、ユーザの直感に反しない自然な流れで操作が行える操作画面を提供することができる。結果として、入力部および表示部を備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 As a result, when the peripheral devices' icons are displayed in a ring around the object, the positional relationship between the central object and each icon corresponds to the positional relationship between the tablet terminal 100 and each peripheral device. The user can therefore grasp all the more intuitively the connection between his or her own action and the information processing that results from it. For example, to transfer a photograph to the digital television 1, the user simply drags the photograph object toward the direction in which the digital television 1 actually stands. An operation screen can thus be provided on which operations follow a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with an input unit and a display unit.
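 The direction computation itself reduces to a bearing calculation. The following is a minimal sketch under stated assumptions (positions given as planar east/north offsets, heading as a compass angle; the function name and the coordinate convention are illustrative, not taken from the patent):

```python
import math

def device_screen_angle(self_pos, self_heading_deg, peer_pos):
    """Return the angle in degrees, measured clockwise from the top of the
    screen, at which the icon for a peripheral at `peer_pos` should be
    placed so that it points toward the real device. Positions are
    (x_east, y_north); `self_heading_deg` is the compass direction the top
    of the terminal faces."""
    dx = peer_pos[0] - self_pos[0]
    dy = peer_pos[1] - self_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # compass bearing to peer
    return (bearing - self_heading_deg) % 360          # relative to screen top
```

 Placing each peripheral's icon at the returned angle on the ring keeps the on-screen layout aligned with the room, so dragging toward the real device and dragging toward its icon are the same motion.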
 あるいは、本発明の情報処理装置では、上記操作画面処理手段は、上記関連項目抽出手段によって抽出された関連項目のうち、上記関連情報記憶部において、自装置以外の他機に対して働きかける動作に関する項目であることを示す動作属性が関連付けられている関連項目のアイコンを、上記選択されたオブジェクトよりも上に配置してもよい。 Alternatively, in the information processing apparatus of the present invention, the operation screen processing means may place, above the selected object, the icons of those related items, among the related items extracted by the related item extraction means, that are associated in the related information storage unit with an action attribute indicating an item concerning an action directed at a device other than the apparatus itself.
 上記構成によれば、操作画面処理手段は、抽出された関連項目それぞれの動作属性を、関連情報記憶部を参照することにより把握することができる。具体的には、操作画面処理は、動作属性の情報に基づいて、自装置以外の他機に対して働きかける動作に関する項目である関連項目を識別することが可能である。 According to the above configuration, the operation screen processing means can ascertain the action attribute of each extracted related item by referring to the related information storage unit. Specifically, based on the action attribute information, it can identify the related items that concern actions directed at devices other than the apparatus itself.
 そして、操作画面処理手段は、抽出された関連項目が、自装置以外の他機に対して働きかける動作に関する項目である場合に、そのような関連項目のアイコンを、上記選択されたオブジェクトよりも上に配置することを決定する。 Then, when an extracted related item concerns an action directed at a device other than the apparatus itself, the operation screen processing means decides to place the icon of that related item above the selected object.
 これにより、他機に対して働きかける動作に関するアイコンは、先に選択されたオブジェクトよりも画面上方に表示されることになる。 As a result, the icons concerning actions directed at other devices are displayed higher on the screen than the previously selected object.
 このように、他機に働きかける動作に関連するアイコンを、オブジェクトよりも上側に配置すると、次に、ユーザが、「他機に働きかける動作(情報処理)」を、当該情報処理装置にさせたい場合には、上記オブジェクトをその目的のアイコンのところへドラッグさせる動作、つまり、自分の所から向こうへ物を移動させるジェスチャを行うことになる。 With the icons related to actions on other devices placed above the object in this way, when the user next wants the information processing apparatus to perform an "action on another device (information processing)," the user drags the object to the intended icon, that is, performs the gesture of moving a thing away from oneself.
 「他機に働きかける動作(情報処理)」は、自分の手前に物を持ってくるジェスチャとよりも、自分の所から向こうへ物を移動させるというジェスチャとの方が直感的にもより関連が深く、より連想されやすいと考えられる。 An "action on another device (information processing)" is intuitively more deeply connected with, and more readily evoked by, the gesture of moving a thing away from oneself than by the gesture of bringing a thing toward oneself.
 上記構成によれば、ユーザの動作と得られる結果物との関連性をより高めることが可能となり、結果として、ユーザの直感に反しない自然な流れで操作画面を提供することができる。 According to the above configuration, it is possible to further increase the relevance between the user's action and the resultant product, and as a result, it is possible to provide the operation screen in a natural flow that does not contradict the user's intuition.
 あるいは、本発明の情報処理装置では、上記軌跡取得手段は、上記表示部に表示されたオブジェクトを囲う上記指示体の移動が生じるまでの所定期間に生じた上記指示体の軌跡を取得し、上記操作画面処理手段は、上記所定期間に取得された軌跡が、上記表示部の画面における特定の領域に偏っていると判断した場合に、上記特定の領域にアイコンが配置されるように上記環の位置を決定してもよい。 Alternatively, in the information processing apparatus of the present invention, the trajectory acquisition means may acquire the trajectories of the indicator that occurred during a predetermined period leading up to the movement of the indicator enclosing the object displayed on the display unit, and the operation screen processing means, when it judges that the trajectories acquired during the predetermined period are biased toward a specific region of the display unit's screen, may determine the position of the ring so that the icons are placed in that specific region.
 上記構成によれば、軌跡取得手段は、囲う動作だけでなく、過去の所定期間に生じた動作について軌跡を取得する。続いて、操作画面処理手段は、取得された軌跡によって、過去の所定期間において、表示部の画面のどの位置に動作が生じたのかを把握することが可能となり、指示体の移動の軌跡が、表示部の画面の特定の領域に集中しているのであれば、それによって接触位置の偏りを検知することができる。 According to the above configuration, the trajectory acquisition means acquires trajectories not only of the enclosing action but also of the actions that occurred during a past predetermined period. The operation screen processing means can then ascertain from the acquired trajectories where on the display unit's screen the actions occurred during that period, and if the indicator's movement trajectories are concentrated in a specific region of the screen, it can thereby detect a bias in the contact positions.
 このように接触位置が偏っているということは、その領域しかタッチできない(あるいは、それ以外の領域はタッチし難い)という特殊な使用状況下で当該情報処理装置を利用していると推測できる。 That the contact positions are biased in this way suggests that the apparatus is being used under a special usage condition in which only that region can be touched (or the other regions are difficult to touch).
 そこで、操作画面処理手段は、偏りが検出された領域にアイコンが配置されるように環の位置を決定する。 Therefore, the operation screen processing means determines the position of the ring so that the icon is arranged in the area where the bias is detected.
 これにより、ユーザが接触可能な領域にアイコンが表示されることになり、次にユーザがアイコンを選択する動作を行う場合には、接触可能な領域からすぐさま所望のアイコンを選択することができる。 As a result, the icons are displayed in the region the user can reach, so when the user next performs the action of selecting an icon, the desired icon can be selected immediately within the reachable region.
 具体例を用いてより詳細に説明すると以下のとおりである。例えば、片手で操作する場合には、接触位置は、タッチパネルの画面下部左側の領域(左手で操作する場合)、または、画面下部右側の領域(右手で操作する場合)に偏る傾向がある。ユーザがこのような状況で情報処理装置を使用しているときに、接触動作が必要なオブジェクトまたはアイコンを、画面上部や手と反対側の画面下部に表示すると操作が煩雑になるという問題がある。なぜなら、ユーザは、目的のオブジェクトをすぐさまタッチできず、接触可能な領域にたぐり寄せるという余計な動作を行わなければならないか、両手操作に切り替えなければならないからである。 A more detailed explanation using a concrete example is as follows. In one-handed operation, for instance, the contact positions tend to be biased toward the lower-left region of the touch panel screen (when operating with the left hand) or the lower-right region (when operating with the right hand). When the user is using the information processing apparatus in such a situation, displaying an object or icon that requires a contact action at the top of the screen, or at the bottom on the side opposite the hand, makes the operation cumbersome: the user cannot touch the target object immediately and must either perform the extra action of dragging it into the reachable region or switch to two-handed operation.
 そこで、本発明の情報処理装置は、上記構成によってユーザの使用状況を察することにより、上記の問題を解決することができる。つまり、本発明の情報処理装置を、指示体の接触位置の偏りを検出し、ユーザの指示体がすぐさま届くと推測される領域内に、アイコンを配置する構成とすることができる。 The information processing apparatus of the present invention can therefore solve the above problem by inferring the user's usage situation with the above configuration. That is, the information processing apparatus of the present invention can be configured to detect the bias in the indicator's contact positions and place the icons within the region the user's indicator is presumed to reach immediately.
 これにより、ユーザが片手で上記情報処理装置を操作する場合には、タッチパネルの画面下部左側(あるいは右側)の領域内に収まるように、アイコンが表示されるので、ユーザは、片手操作で目的のアイコンをたぐり寄せる必要はなく、すぐさま所望のアイコンを選択することができる。 As a result, when the user operates the information processing apparatus with one hand, the icons are displayed so as to fit within the lower-left (or lower-right) region of the touch panel screen, so the user need not drag the intended icon closer with one hand and can select the desired icon immediately.
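 The bias detection described above can be sketched, purely as an illustrative example (the quadrant split, the threshold value, and all names are assumptions, not taken from the patent), as a tally of recent contact points per screen quadrant:

```python
def biased_region(touch_points, screen_w, screen_h, threshold=0.8):
    """Classify recent (x, y) contact points into screen quadrants and
    return the quadrant name if it holds at least `threshold` of all
    touches, else None. Quadrants: 'upper-left', 'upper-right',
    'lower-left', 'lower-right' (y grows downward)."""
    counts = {}
    for x, y in touch_points:
        q = ("lower" if y >= screen_h / 2 else "upper") + "-" + \
            ("right" if x >= screen_w / 2 else "left")
        counts[q] = counts.get(q, 0) + 1
    q, n = max(counts.items(), key=lambda kv: kv[1])
    return q if n >= threshold * len(touch_points) else None
```

 When a biased quadrant is reported, the ring position can be chosen so that the icons fall inside it; when None is returned, the default placement around the enclosed object applies.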
 あるいは、本発明の情報処理装置では、上記操作画面処理手段は、上記表示部の画面における、上記指示体の軌跡により囲われた領域、または、該囲われた領域を含む特定の領域にアイコンが配置されるように上記環の位置を決定してもよい。 Alternatively, in the information processing apparatus of the present invention, the operation screen processing means may determine the position of the ring so that the icons are placed in the region of the display unit's screen enclosed by the indicator's trajectory, or in a specific region containing that enclosed region.
 上記構成によれば、ユーザが指などの指示体で囲った位置、あるいは、その近辺にアイコンが配置されることになる。ユーザが囲った位置は、ユーザにとって接触可能領域であると言える。よって、確実に、接触可能領域にアイコンを表示させることが可能となる。 According to the above configuration, the icons are placed at, or near, the position the user enclosed with an indicator such as a finger. The position the user enclosed can be regarded as a region the user can reach, so the icons can reliably be displayed within the reachable region.
 本発明の情報処理装置において、当該情報処理装置が備える入力部および上記表示部はタッチパネルを構成するものであり、上記軌跡取得手段は、上記タッチパネル上を移動した上記指示体の移動の軌跡を取得してもよい。 In the information processing apparatus of the present invention, the input unit and the display unit of the apparatus may constitute a touch panel, and the trajectory acquisition means may acquire the trajectory of the movement of the indicator that moved on the touch panel.
 上記構成によれば、ユーザがオブジェクトを選択するために行う接触動作と、その動作に応じて得られる結果物との関連性を高めることが可能となり、ユーザの直感に反しない自然な流れで操作画面を提供することができる。結果として、タッチパネルを備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 According to the above configuration, the connection between the contact action the user performs to select an object and the result obtained from that action can be strengthened, and the operation screen can be provided in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with a touch panel.
 あるいは、本発明の情報処理装置において、当該情報処理装置が備える入力部は、上記表示部に表示されるカーソルを移動させる指示を当該情報処理装置に入力するものであり、上記軌跡取得手段は、上記指示体としてのカーソルの移動の軌跡を取得してもよい。 Alternatively, in the information processing apparatus of the present invention, the input unit of the apparatus may input to the apparatus an instruction to move a cursor displayed on the display unit, and the trajectory acquisition means may acquire the trajectory of the movement of the cursor serving as the indicator.
 上記構成によれば、ユーザが、入力部を操作して、オブジェクトを選択するために行う入力動作と、その動作に応じて得られる結果物との関連性を高めることが可能となり、ユーザの直感に反しない自然な流れで操作画面を提供することができる。結果として、表示部と入力部とを備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 According to the above configuration, the connection between the input action the user performs by operating the input unit to select an object and the result obtained from that action can be strengthened, and the operation screen can be provided in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with a display unit and an input unit.
 本発明の操作画面表示方法は、上記課題を解決するために、情報処理装置における操作画面表示方法において、上記情報処理装置が備える表示部の画面の位置を指し示す指示体が移動した軌跡を取得する軌跡取得ステップと、上記軌跡取得ステップにて取得された軌跡により囲われた領域に、少なくとも一部が含まれるオブジェクトを、選択されたオブジェクトとして特定するオブジェクト特定ステップと、オブジェクトと該オブジェクトに関連する項目とを対応付けて記憶する関連情報記憶部を参照して、上記オブジェクト特定ステップにて特定されたオブジェクトに対応付けられた項目を関連項目として抽出する関連項目抽出ステップと、上記関連項目抽出ステップにて抽出された関連項目のアイコンを、環の輪郭線上に並べて配置して、上記表示部に表示する操作画面処理ステップとを含むことを特徴としている。 To solve the above problem, an operation screen display method of the present invention is an operation screen display method in an information processing apparatus, and includes: a trajectory acquisition step of acquiring the trajectory along which an indicator pointing at positions on the screen of a display unit of the information processing apparatus has moved; an object identification step of identifying, as the selected object, an object at least part of which is contained in the region enclosed by the trajectory acquired in the trajectory acquisition step; a related item extraction step of extracting, as related items, the items associated with the object identified in the object identification step, by referring to a related information storage unit that stores objects in association with items related to them; and an operation screen processing step of arranging the icons of the related items extracted in the related item extraction step along the contour of a ring and displaying them on the display unit.
 上記方法によれば、簡易な動作且つ少ない動作数でありながら、ユーザの直感に反しない自然な流れで、ユーザが所望する最終結果物を表示させることができる。結果として、入力部および上記表示部を備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 According to the above method, the final result the user desires can be displayed through a simple action, with few operations, in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with an input unit and the display unit.
 なお、上記情報処理装置は、コンピュータによって実現してもよく、この場合には、コンピュータを上記各手段として動作させることにより上記情報処理装置をコンピュータにて実現させる情報処理装置の制御プログラム、およびそれを記録したコンピュータ読み取り可能な記録媒体も、本発明の範疇に入る。 The information processing apparatus may be realized by a computer. In that case, a control program for the information processing apparatus that realizes the information processing apparatus on a computer by operating the computer as each of the above means, and a computer-readable recording medium on which it is recorded, also fall within the scope of the present invention.
 本発明の情報処理装置は、上記課題を解決するために、入力部および表示部を備えた情報処理装置において、上記表示部の画面の位置を指し示す指示体が移動した軌跡を取得する軌跡取得手段と、上記軌跡取得手段によって取得された軌跡により囲われた領域に、少なくとも一部が含まれるオブジェクトを、選択されたオブジェクトとして特定するオブジェクト特定手段と、オブジェクトと該オブジェクトに関連する項目とを対応付けて記憶する関連情報記憶部を参照して、上記オブジェクト特定手段によって特定されたオブジェクトに対応付けられた項目を関連項目として抽出する関連項目抽出手段と、上記関連項目抽出手段によって抽出された関連項目のアイコンを、環の輪郭線上に並べて配置して、上記表示部に表示する操作画面処理手段とを備えていることを特徴としている。 To solve the above problem, an information processing apparatus of the present invention is an information processing apparatus provided with an input unit and a display unit, and includes: trajectory acquisition means for acquiring the trajectory along which an indicator pointing at positions on the screen of the display unit has moved; object identification means for identifying, as the selected object, an object at least part of which is contained in the region enclosed by the trajectory acquired by the trajectory acquisition means; related item extraction means for extracting, as related items, the items associated with the object identified by the object identification means, by referring to a related information storage unit that stores objects in association with items related to them; and operation screen processing means for arranging the icons of the related items extracted by the related item extraction means along the contour of a ring and displaying them on the display unit.
 本発明の操作画面表示方法は、上記課題を解決するために、入力部および表示部を備えた情報処理装置における操作画面表示方法において、上記表示部の画面の位置を指し示す指示体が移動した軌跡を取得する軌跡取得ステップと、上記軌跡取得ステップにて取得された軌跡により囲われた領域に、少なくとも一部が含まれるオブジェクトを、選択されたオブジェクトとして特定するオブジェクト特定ステップと、オブジェクトと該オブジェクトに関連する項目とを対応付けて記憶する関連情報記憶部を参照して、上記オブジェクト特定ステップにて特定されたオブジェクトに対応付けられた項目を関連項目として抽出する関連項目抽出ステップと、上記関連項目抽出ステップによって抽出された関連項目のアイコンを、環の輪郭線上に並べて配置して、上記表示部に表示する操作画面処理ステップとを含むことを特徴としている。 To solve the above problem, an operation screen display method of the present invention is an operation screen display method in an information processing apparatus provided with an input unit and a display unit, and includes: a trajectory acquisition step of acquiring the trajectory along which an indicator pointing at positions on the screen of the display unit has moved; an object identification step of identifying, as the selected object, an object at least part of which is contained in the region enclosed by the trajectory acquired in the trajectory acquisition step; a related item extraction step of extracting, as related items, the items associated with the object identified in the object identification step, by referring to a related information storage unit that stores objects in association with items related to them; and an operation screen processing step of arranging the icons of the related items extracted in the related item extraction step along the contour of a ring and displaying them on the display unit.
 したがって、簡易な動作且つ少ない動作数でありながら、ユーザの直感に反しない自然な流れで、ユーザが所望する最終結果物を表示させることができる。結果として、入力部および表示部を備えた情報処理装置において、優れた操作性を実現することが可能になるという効果を奏する。 Therefore, the final result the user desires can be displayed through a simple action, with few operations, in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in an information processing apparatus provided with an input unit and a display unit.
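 The object identification step described in the method, selecting any object at least partly contained in the enclosed region, can be sketched with a standard ray-casting point-in-polygon test. This is an illustrative example only (function names and the corner-point object representation are assumptions, not the patent's implementation):

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: True if pt lies inside the closed polygon given
    as a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                 # edge crosses the scanline
            xin = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xin:
                inside = not inside
    return inside

def select_objects(objects, trajectory):
    """Return the names of objects at least one corner point of which falls
    inside the region enclosed by the trajectory. `objects` maps each name
    to a list of its corner points."""
    return [name for name, corners in objects.items()
            if any(point_in_polygon(c, trajectory) for c in corners)]
```

 The selected names are then passed to the related item extraction step, and the trajectory itself goes on to determine the ring on which the extracted items' icons are arranged.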
本発明の一実施形態におけるタブレット端末の要部構成を示す機能ブロック図である。A functional block diagram showing the configuration of the main parts of a tablet terminal in one embodiment of the present invention. 本発明の一実施形態におけるタブレット端末のハードウェア構成を示すブロック図である。A block diagram showing the hardware configuration of the tablet terminal in one embodiment of the present invention. 本発明の一実施形態におけるタブレット端末の外観を示す平面図である。A plan view showing the appearance of the tablet terminal in one embodiment of the present invention. タブレット端末をユーザが把持および操作するときの様子を説明する図であり、(a)は、タブレット端末が片手で把持され、その手で操作される様子を説明する図であり、(b)は、タブレット端末が一方の手で把持され、もう一方の手で操作される様子を説明する図である。Diagrams illustrating how a user holds and operates the tablet terminal: (a) illustrates the tablet terminal being held in one hand and operated with that hand, and (b) illustrates the tablet terminal being held in one hand and operated with the other hand. タブレット端末のオブジェクト特定部の動作を説明する図であり、(a)は、ユーザが目的のオブジェクトを選択するためにオブジェクトを「囲う」という接触動作を行った様子を示す図であり、(b)は、(a)に示す接触動作に伴って、接触情報生成部が生成した接触情報の一例を示す図であり、(c)は、接触が検知されたt0~tnの期間に表示部に表示された映像フレームのマップ情報の一例を示す図である。Diagrams illustrating the operation of the object identification unit of the tablet terminal: (a) shows a user performing the contact action of "enclosing" an object in order to select the intended object, (b) shows an example of the contact information generated by the contact information generation unit in response to the contact action shown in (a), and (c) shows an example of the map information of the video frame displayed on the display unit during the period t0 to tn in which contact was detected. タブレット端末の関連情報記憶部に記憶される関連情報の一例を示す図である。A diagram showing an example of the related information stored in the related information storage unit of the tablet terminal.
タブレット端末のアイコン記憶部に記憶されるアイコン画像の具体例を示す図である。A diagram showing a specific example of the icon images stored in the icon storage unit of the tablet terminal. タブレット端末の操作画面処理部の処理内容を説明する図であり、(a)は、操作画面処理部によるオブジェクトの表示処理の一例を説明する図であり、(b)は、操作画面処理部による関連項目のアイコン配置の一例を示す図である。Diagrams illustrating the processing of the operation screen processing unit of the tablet terminal: (a) illustrates an example of object display processing by the operation screen processing unit, and (b) shows an example of the arrangement of related-item icons by the operation screen processing unit. 操作画面処理部によって実行された操作画面生成処理の結果、得られた操作画面の具体例を示す図である。A diagram showing a specific example of the operation screen obtained as a result of the operation screen generation processing executed by the operation screen processing unit. タブレット端末による操作画面表示処理の流れを示すフローチャートである。A flowchart showing the flow of the operation screen display processing by the tablet terminal. 本発明の他の実施形態におけるタブレット端末の要部構成を示す機能ブロック図である。A functional block diagram showing the configuration of the main parts of a tablet terminal in another embodiment of the present invention. タブレット端末の接触情報記憶部に記憶される接触情報の具体例を示す図であり、(a)は、ユーザが目的のオブジェクトを選択するために、オブジェクトを任意の形状で「囲う」接触動作を行った様子を示す図であり、(b)は、(a)に示す接触動作に伴って、接触情報生成部が生成した接触情報の一例を示す図である。Diagrams showing a specific example of the contact information stored in the contact information storage unit of the tablet terminal: (a) shows a user performing the contact action of "enclosing" an object in an arbitrary shape in order to select the intended object, and (b) shows an example of the contact information generated by the contact information generation unit in response to the contact action shown in (a).
A diagram showing an example of the arrangement of related-item icons produced by an operation screen processing unit that includes a ring shape determination unit. A flowchart showing the flow of the operation screen display processing performed by the tablet terminal in another embodiment of the present invention. A functional block diagram showing the configuration of the main parts of a tablet terminal in still another embodiment of the present invention. A diagram showing an example of the related information stored in the related information storage unit of the tablet terminal. Diagrams explaining the procedure by which the icon arrangement determination unit of the operation screen processing unit determines the arrangement of icons: (a) is a table showing the result of the icon arrangement determination unit classifying the related items (their icons) based on attributes, and (b) explains the procedure by which the icon arrangement determination unit arranges the icon of each related item according to the classification in (a). A diagram showing a concrete example of the operation screen obtained as a result of the operation screen processing unit arranging icons according to the determination of the icon arrangement determination unit.
A diagram showing another concrete example of the operation screen obtained as a result of the operation screen processing unit arranging icons according to the determination of the icon arrangement determination unit. A functional block diagram showing the configuration of the main parts of a tablet terminal in still another embodiment of the present invention. A diagram showing an example of the usage environment when the user of the tablet terminal is using it in an indoor room. A diagram showing a concrete example of the device direction information generated by the device direction specifying unit of the tablet terminal. A diagram showing a concrete example of the related information concerning peripheral devices stored in the related information storage unit of this embodiment. A diagram showing a concrete example of the operation screen obtained as a result of the operation screen generation processing executed by the operation screen processing unit. A diagram showing a concrete example of an operation screen obtained, following the operation screen shown in FIG. 24, as a result of the operation screen generation processing executed by the operation screen processing unit. A diagram showing a modification of the method of displaying related-item icons. A diagram showing another modification of the method of displaying related-item icons.
Diagrams showing a modification of the method of displaying related-item icons: (a) shows a ring of icons initially arranged small around the object, (b) shows the ring of icons in the middle of expanding, and (c) shows the ring of icons finally arranged large at the center of the screen, after passing through (a) and (b). Diagrams showing a modification of the method of displaying related-item icons: (a) shows a concrete example of contact information, (b), (c), (d), and (e) show a plurality of icons being displayed one after another at fixed intervals, and (f) shows a concrete example of the finally obtained operation screen. Diagrams showing a modification of the method of displaying related-item icons: (a) shows a concrete example of contact information, (b) shows an example of the operation screen at time t0, (c) at time ta, (d) at time tb, (e) at time tc, and (f) at time tn. A diagram showing a modification of the arrangement pattern of related-item icons.
A diagram showing another modification of the arrangement pattern of related-item icons. Diagrams explaining the operation of the tablet terminal of the present invention, which can present an operation screen according to the user's usage conditions: (a) explains an example of a situation in which the user is operating with the left hand, and (b) shows a concrete example of the contact information generated by the contact gesture in (a). A diagram showing an example of the operation screen when icons are arranged according to the ring shape determined by the ring shape determination unit of the operation screen processing unit. (a) is a diagram explaining how the user rotates the ring of icons by dragging within the reachable region, and (b) is a diagram showing an example of the operation screen after the arrangement of the icons has been changed by the ring being rotated through the "drag" contact gesture. (a) is a diagram showing the user performing the contact gesture of "enclosing" objects in order to select a plurality of target objects, and (b) is a diagram showing a concrete example of the operation screen generated by the operation screen processing unit in response to the contact gesture shown in (a).
(a) is a diagram showing the user performing the contact gesture of "enclosing" objects in order to select a plurality of objects of different types, and (b) is a diagram showing a concrete example of the operation screen generated by the operation screen processing unit in response to the contact gesture shown in (a). A diagram showing another example of the related information stored in the related information storage unit.
≪Embodiment 1≫
An embodiment of the present invention is described below with reference to FIGS. 1 to 10.
In the embodiment described below, the information processing apparatus of the present invention is applied to a tablet terminal, as one example. In this embodiment, as one example, the tablet terminal is realized as a small, highly portable device such as a smartphone that can be operated with one hand.
However, the information processing apparatus of the present invention is not limited to this example; it may be applied to an information processing apparatus of any size (for example, an electronic blackboard equipped with a large touch panel).
[Hardware configuration of the tablet terminal]
FIG. 2 is a block diagram showing the hardware configuration of the tablet terminal 100 according to this embodiment. As shown in FIG. 2, the tablet terminal 100 includes at least a control unit 10, an input unit 11, a display unit 12, and a storage unit 19. The tablet terminal 100 may further include an operation unit 13, an external interface 14, a communication unit 15, a wireless communication unit 16, an audio output unit 17, and an audio input unit 18 in order to realize its inherent functions.
Although omitted here, when the tablet terminal 100 is a multi-function mobile communication terminal such as a smartphone, the tablet terminal 100 may further include a call processing unit, an imaging unit for taking pictures (lens, image sensor, and the like), a broadcast reception unit (tuner, demodulation unit, and the like), a GPS, sensors (acceleration sensor, tilt sensor, and the like), and the various other components that a smartphone is typically equipped with.
The input unit 11 is for the user to input, via the touch panel, instruction signals for operating the tablet terminal 100. The input unit 11 is composed of a touch surface that accepts contact from an indicator (something that points to a position on the screen of the display unit 12; here, for example, a finger or a pen), and a touch sensor for detecting contact/non-contact (approach/non-approach) between the indicator and the touch surface as well as the position of that contact (approach). The touch sensor may be realized by any kind of sensor as long as it can detect contact/non-contact between the indicator and the touch surface; for example, it is realized by a pressure sensor, a capacitance sensor, an optical sensor, or the like.
The display unit 12 displays the objects that the tablet terminal 100 processes (any display target, such as icons) and the results of that processing, and displays an operation screen, as a GUI (Graphical User Interface) screen, for the user to operate the tablet terminal 100. The display unit 12 is realized by a display device such as an LCD (liquid crystal display).
In this embodiment, the input unit 11 and the display unit 12 are integrally formed, and together they constitute a touch panel. Accordingly, in such an embodiment, the object that the user moves (operates) to point to a screen position, that is, the operating body (here, a finger, a pen, or the like), is at the same time an indicator that points to a position on the screen of the display unit 12.
For example, when the touch panel of the tablet terminal 100 of the present invention is realized as a projected capacitive touch panel, the touch sensor is, specifically, a matrix of transparent electrode patterns made of ITO (Indium Tin Oxide) or the like formed on a transparent substrate such as glass or plastic. When an indicator (such as the user's finger or a pen) touches or approaches the touch sensor, the capacitance of the transparent electrode patterns in its vicinity changes. The control unit 10 can therefore detect the position that the indicator touched or approached by detecting the change in current or voltage in the transparent electrode patterns.
In the following, the term "contact" in expressions such as "detect contact", "contact gesture", and "contact position" includes not only the state in which the indicator and the touch surface are fully touching, but also the state in which the indicator and the touch surface are close enough for the touch sensor to detect.
The operation unit 13 is for the user to input instruction signals directly to the tablet terminal 100. The operation unit 13 is realized by a suitable input mechanism such as buttons, switches, keys, or a jog dial. For example, the operation unit 13 is a switch that turns the power of the tablet terminal 100 on and off.
The external interface 14 is an interface for connecting external devices to the tablet terminal 100. The external interface 14 is realized by, for example, but not limited to, a socket into which an external recording medium (such as a memory card) is inserted, an HDMI (High-Definition Multimedia Interface) terminal, a USB (Universal Serial Bus) terminal, or the like. The control unit 10 of the tablet terminal 100 can exchange data with external devices via the external interface 14.
The communication unit 15 communicates with external devices via a communication network. The communication unit 15 connects to various communication terminals via the communication network and realizes the exchange of data between the tablet terminal 100 and those communication terminals. Further, when the tablet terminal 100 is a mobile communication terminal such as a smartphone, the communication unit 15 sends and receives voice call data, e-mail data, and the like to and from other devices via a mobile phone network.
The wireless communication unit 16 communicates with external devices wirelessly. The wireless communication unit 16 is not particularly limited, and may realize any one of, or a plurality of, wireless communication means such as infrared communication (IrDA, IrSS, and the like), Bluetooth communication, WiFi communication, and contactless IC cards. The control unit 10 of the tablet terminal 100 can communicate with devices in the vicinity of the tablet terminal 100 via the wireless communication unit 16 and exchange data with those devices.
The audio output unit 17 outputs the audio data processed by the tablet terminal 100 as sound, and is realized by a speaker, a headphone terminal, headphones, and the like.
The audio input unit 18 accepts input of sound generated outside the tablet terminal 100, and is realized by a microphone or the like.
The storage unit 19 stores (1) the control program executed by the control unit 10 of the tablet terminal 100, (2) the OS program, (3) application programs with which the control unit 10 executes the various functions of the tablet terminal 100, and (4) the various data read when those application programs are executed. It also stores (5) the data used in computation, the computation results, and the like produced while the control unit 10 executes the various functions. For example, the data of (1) to (4) above are stored in a non-volatile storage device such as a ROM (read-only memory), flash memory, EPROM (Erasable ROM), EEPROM (Electrically EPROM), or HDD (hard disk drive). For example, the data of (5) are stored in a volatile storage device such as a RAM (random access memory). Which data is stored in which storage device is determined appropriately from the purpose of use, convenience, cost, physical constraints, and the like of the tablet terminal 100.
The control unit 10 performs overall control of each unit included in the tablet terminal 100. The control unit 10 is realized by, for example, a CPU (central processing unit), and the functions of the tablet terminal 100 are realized by the CPU serving as the control unit 10 reading programs stored in the ROM or the like into the RAM or the like and executing them. The various functions realized by the control unit 10 (in particular, the operation screen display function of the present invention) are described later with reference to separate drawings.
[External appearance of the tablet terminal]
FIG. 3 is a plan view showing the external appearance of the tablet terminal 100. As shown in FIG. 3, the tablet terminal 100 includes the input unit 11 and the display unit 12 as a touch panel. The tablet terminal 100 is also equipped with the operation unit 13, the external interface 14, the wireless communication unit 16, the audio output unit 17, the audio input unit 18, and the like, although these are not essential components. For example, when the wireless communication unit 16 is realized by infrared communication means, an infrared transmitting/receiving section is provided on a side surface of the tablet terminal 100 as the wireless communication unit 16.
FIG. 4 shows diagrams illustrating how a user holds and operates the tablet terminal 100. More specifically, (a) of FIG. 4 illustrates the tablet terminal 100 being held with one hand and operated with that hand, and (b) of FIG. 4 illustrates the tablet terminal 100 being held with one hand and operated with the other hand.
In this embodiment, the tablet terminal 100 is a palm-sized information processing apparatus that can be held with one hand. As shown in (a) of FIG. 4, the user can operate the touch surface of the input unit 11 with the thumb of the hand holding the tablet terminal 100. Then, for example, when the icon to be operated is in a position the thumb cannot reach, the user can draw the icon toward the thumb with a flick gesture and select the icon by enclosing it with the thumb or tapping it.
Alternatively, as shown in (b) of FIG. 4, the user may hold the tablet terminal 100 with one hand and operate the touch surface of the input unit 11 with a finger of the other hand. Or, although not shown, the user may hold the tablet terminal 100 in landscape orientation, grip both sides with both hands, and operate the touch surface of the input unit 11 with both thumbs.
[Functions of the tablet terminal]
Next, the functional configuration of the tablet terminal 100 is described. FIG. 1 is a functional block diagram showing the configuration of the main parts of the tablet terminal 100 according to this embodiment.
As shown in FIG. 1, the control unit 10 of the tablet terminal 100 according to this embodiment includes, as functional blocks for realizing the operation screen display function of the present invention, at least a contact information generation unit 21, an object specifying unit 22, a related item extraction unit 23, and an operation screen processing unit 24.
Each of the functional blocks of the control unit 10 described above can be realized by a CPU (central processing unit) reading a program, stored in a non-volatile storage device realized by a ROM (read-only memory) or the like, into a RAM (random access memory, not shown) or the like and executing it.
The storage unit 19 is composed, specifically, of a frame map storage unit 41, a related information storage unit 42, and an icon storage unit 43, which serve as storage areas from which the above-described units of the control unit 10 read data, and to which they write data, when executing the operation screen display function.
The contact information generation unit 21 processes the signals output from the touch sensor of the input unit 11 and generates contact information. The contact information includes at least contact coordinate information indicating the coordinates of the contact position of the indicator (for example, a finger). From this contact information, each unit of the control unit 10 can obtain the trajectory of the indicator's movement. In this embodiment, contact time information indicating the time at which contact occurred (movement time information indicating the movement time of the indicator) may further be associated, as needed, with each point constituting the trajectory.
More specifically, from when the touch sensor of the input unit 11 detects contact between the touch surface and the indicator (in this embodiment, a finger) until it detects non-contact, the contact information generation unit 21 acquires the signals output from the touch sensor. These signals include notification that "contact" has been detected and information indicating the contact position, and based on these signals the contact information generation unit 21 generates contact coordinate information expressing the contact position in coordinates. Furthermore, the contact information generation unit 21 measures the time from when contact is detected until it becomes non-contact, and associates the contact time information with the contact coordinate information. The contact information generation unit 21 may acquire and use the absolute time information held by a clock unit mounted in the tablet terminal 100, but in this embodiment the contact information generation unit 21 starts timing when contact is detected and obtains relative contact time information. For example, the contact information generation unit 21 may measure elapsed time with the point at which contact is first detected (t0) taken as 0.00 seconds, continue the measurement until the point at which contact is last detected (tn), and thereby acquire relative contact time information corresponding to each contact position.
The contact information generation unit 21 generates the contact information by associating the obtained contact time information with the contact coordinate information. In this embodiment, the generated contact information is supplied to the object specifying unit 22 and used by the object specifying unit 22.
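As a rough illustration of this timing scheme, the following Python sketch accumulates touch samples and tags each with a time relative to the first detected contact (t0 = 0.00 s). All class and method names here are hypothetical, chosen for the example; the patent does not prescribe any particular implementation.

```python
import time

class ContactInfoGenerator:
    """Sketch of the contact information generation unit (21): each
    touch-sensor sample is paired with a contact time measured relative
    to the first detected contact, not an absolute clock time."""

    def __init__(self):
        self._t0 = None
        self.contact_info = []  # list of (relative_time_s, x, y)

    def on_touch_sample(self, x, y):
        """Called for every contact sample reported by the touch sensor."""
        now = time.monotonic()
        if self._t0 is None:           # first contact detected -> this is t0
            self._t0 = now
        self.contact_info.append((round(now - self._t0, 2), x, y))

    def on_release(self):
        """Called when non-contact is detected; hands the finished
        contact information to the object specifying unit."""
        info, self._t0 = self.contact_info, None
        self.contact_info = []
        return info
```

The first sample always carries relative time 0.0; later samples record how far into the gesture they occurred, which is what lets downstream units reconstruct the trajectory over the period t0 to tn.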
The object specifying unit 22 specifies the object selected by the user's contact gesture. The object specifying unit 22 compares the contact information generated by the contact information generation unit 21 with the map information of the video frame that was displayed on the display unit 12 while the contact was occurring. The object specifying unit 22 can thereby specify, from among the objects being displayed on the display unit 12, the object enclosed by the contact gesture.
The frame map storage unit 41 stores the map information of the video frame that was being output to the display unit 12 at the time of contact. The map information is information indicating the layout of the video frame displayed on the touch panel. Specifically, the map information includes, for each displayed object, information identifying it individually and information on its shape, size, and display position. In other words, the map information plots each object against the coordinate system of the touch panel.
FIG. 5 shows diagrams explaining the operation of the object specifying unit 22. More specifically, (a) of FIG. 5 shows the user performing the contact gesture of "enclosing" an object in order to select the target object. (b) of FIG. 5 shows an example of the contact information generated by the contact information generation unit 21 in response to the contact gesture shown in (a). (c) of FIG. 5 shows an example of the map information of the video frame displayed on the display unit 12 during the period from t0 to tn in which contact was detected.
As shown in (a) of FIG. 5, suppose the user performs the contact gesture of "enclosing" one of the objects 80 (here, photographs) displayed on the touch panel of the tablet terminal 100. Suppose, for example, that the contact gesture is performed over the period from t0 to tn such that the contact point passes along the broken line in the figure.
The object specifying unit 22 acquires contact information such as that shown in (b) of FIG. 5 from the contact information generation unit 21. In this embodiment, the coordinate system of the contact information corresponds to the coordinate system of the touch panel of the tablet terminal 100, with the top-left corner of the panel as the origin. In (b) of FIG. 5, only the start point t0 and the end point tn of the trajectory of the user's "enclosing" gesture are labeled, but contact time information may also be associated with each point in between.
The object specifying unit 22 acquires, from the frame map storage unit 41, the map information shown in (c) of FIG. 5 (that is, the layout of the video frame displayed on the display unit 12 during the period from t0 to tn). The object specifying unit 22 then compares the contact information with the map information and specifies, as the selected object, an object 80 that lies entirely or almost entirely within the region enclosed by the trajectory of the user's finger obtained from the contact information, or within the circumscribed rectangle of that region. In the example shown in FIG. 5, the object specifying unit 22 specifies "Photo 1" in (c) of FIG. 5 as the selected object. The object specifying unit 22 supplies information on the specified object to the related item extraction unit 23.
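The comparison between the gesture trajectory and the map information can be sketched as follows in Python. The sketch uses the circumscribed-rectangle variant described above; the 90% coverage threshold and all function names are illustrative assumptions, not values taken from the patent.

```python
def bounding_box(points):
    """Axis-aligned circumscribed rectangle of a trajectory of (x, y) points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

def specify_enclosed_objects(trajectory, frame_map, coverage=0.9):
    """Return the IDs of objects whose displayed rectangle lies entirely or
    almost entirely inside the circumscribed rectangle of the "enclosing"
    gesture. frame_map maps object IDs to (x, y, w, h) rectangles in the
    touch panel's coordinate system, mirroring the map information; the
    coverage threshold of 90% is an assumed example value."""
    left, top, right, bottom = bounding_box(trajectory)
    selected = []
    for obj_id, (x, y, w, h) in frame_map.items():
        ix = max(0, min(right, x + w) - max(left, x))   # horizontal overlap
        iy = max(0, min(bottom, y + h) - max(top, y))   # vertical overlap
        if w * h and (ix * iy) / (w * h) >= coverage:
            selected.append(obj_id)
    return selected
```

For a gesture whose bounding box surrounds only "photo1", the function would report just that object, corresponding to "Photo 1" being specified in the FIG. 5 example.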
 関連項目抽出部23は、オブジェクト特定部22によって特定されたオブジェクト、すなわち、ユーザによって選択されたオブジェクトに関連する関連項目を抽出するものである。オブジェクトが選択されたときに、その選択されたオブジェクトに関連性が深い項目が、関連項目抽出部23によって抽出されることになっている。 The related item extracting unit 23 extracts related items related to the object specified by the object specifying unit 22, that is, the object selected by the user. When an object is selected, an item deeply related to the selected object is extracted by the related item extraction unit 23.
　例えば、オブジェクトが「写真」などのデータである場合、写真に対しては、「表示する」、「編集する」、「メールに添付して送信する」、「周辺機器(テレビなど)に転送する」、「印刷する」などの動作が実行されることが想定される。そこで、「動作対象」であるオブジェクトに対して実行される「動作」の関係にあたる項目がオブジェクトの関連項目として抽出されてもよい。 For example, when the object is data such as a "photo", operations such as "display", "edit", "attach to an email and send", "transfer to a peripheral device (such as a TV)", and "print" are expected to be performed on the photo. Therefore, items that stand in the relation of an "action" performed on the object serving as the "action target" may be extracted as related items of the object.
 あるいは、オブジェクトが「写真」などのデータである場合、その写真を特定の人に送ることが想定される。そこで、動作対象であるオブジェクトに対して動作が実行されるときの「動作相手」の関係にあたる項目が関連項目として抽出されてもよい。 Or, if the object is data such as “photograph”, it is assumed that the photo is sent to a specific person. Therefore, an item corresponding to the relationship of “action partner” when an action is executed on an object that is an action target may be extracted as a related item.
 あるいは、オブジェクトが複数の写真またはその他のデータを含む「アルバム」または「フォルダ」である場合、そのオブジェクトに含まれる写真またはデータをユーザは所望していると考えられる。このように、オブジェクトの下位層に属する項目が関連項目として抽出されてもよい。 Alternatively, if the object is an “album” or “folder” that includes multiple photos or other data, the user may want the photos or data contained in the object. In this way, items belonging to the lower layer of the object may be extracted as related items.
 関連情報記憶部42は、オブジェクトと項目との関連性を示す関連情報を記憶するものである。図6は、関連情報記憶部42に記憶される関連情報の一例を示す図である。 The related information storage unit 42 stores related information indicating the relationship between objects and items. FIG. 6 is a diagram illustrating an example of related information stored in the related information storage unit 42.
 関連情報は、図6に示すとおり、「オブジェクト」ごとに、少なくとも「関連項目」が対応付けられた情報である。関連情報は、この対応付けによって、オブジェクトと項目との関連性を示している。 As shown in FIG. 6, the related information is information in which at least “related items” are associated with each “object”. The association information indicates the association between the object and the item by this association.
　関連項目抽出部23は、オブジェクト特定部22によってオブジェクトが特定されると、関連情報記憶部42に記憶されている関連情報を参照し、特定されたオブジェクトに関連付けられている項目を関連項目として抽出する。 When an object is specified by the object specifying unit 22, the related item extracting unit 23 refers to the related information stored in the related information storage unit 42 and extracts the items associated with the specified object as related items.
 例えば、図5の(a)~(c)に示すとおり、オブジェクト特定部22が、選択されたオブジェクトは「写真1」であると特定したとする。この場合、関連項目抽出部23は、「写真1」はオブジェクトとしては「写真」に分類されるので、関連情報の中から、オブジェクト「写真」に関連付けられている関連項目群60を抽出する。 For example, as shown in FIGS. 5A to 5C, it is assumed that the object specifying unit 22 specifies that the selected object is “Photo 1”. In this case, since “Photo 1” is classified as “Photo” as an object, the related item extraction unit 23 extracts a related item group 60 associated with the object “Photo” from the related information.
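The extraction in this example amounts to a table lookup against the related information. A minimal sketch follows (illustrative only; the table contents and names are hypothetical stand-ins for the related information storage unit 42 of FIG. 6):

```python
# Hypothetical related-information table (cf. FIG. 6):
# object class -> related items ("actions", "action partners", etc.)
RELATED_INFO = {
    "photo": ["display on TV", "print", "attach to mail", "display photo",
              "display info", "palette", "trash", "memory card"],
    "album": ["open", "slideshow", "copy"],
}

# Hypothetical classification of concrete objects into object classes
# ("Photo 1" is classified as a "photo").
OBJECT_CLASS = {"Photo 1": "photo", "Photo 2": "photo", "Album A": "album"}

def extract_related_items(object_name):
    """Look up the class of the specified object and return the items
    associated with that class in the related information."""
    obj_class = OBJECT_CLASS.get(object_name)
    return RELATED_INFO.get(obj_class, [])
```

With such a table, specifying "Photo 1" yields the eight related items associated with the object class "photo".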
 関連項目抽出部23によって抽出された関連項目の情報は、操作画面処理部24に供給される。そして、抽出された関連項目は、先に選択されたオブジェクトに関連のある項目として、選択可能に(例えば、アイコンで)表示される。 Information on the related items extracted by the related item extracting unit 23 is supplied to the operation screen processing unit 24. The extracted related items are displayed so as to be selectable (for example, as icons) as items related to the previously selected object.
 この構成には限定されないが、本実施形態では、図6に示すとおり、さらに「関連項目」ごとに、アイコンが割り当てられていてもよい。例えば、オブジェクト「写真」に関連付けられた、関連項目「テレビで表示する(テレビに転送する)」には、アイコン「1:テレビ」が関連付けられている。アイコン「1:テレビ」は、例えば、テレビのイラストなどが描かれたアイコンであって、「写真をテレビに送って表示させること」を想起させる絵柄であることが好ましい。その他の関連項目についても、その関連項目の内容を想起させる相応しい絵柄のアイコンがそれぞれ割り当てられている。 Although not limited to this configuration, in this embodiment, as shown in FIG. 6, an icon may be assigned to each “related item”. For example, the icon “1: TV” is associated with the related item “display on television (transfer to television)” associated with the object “photo”. The icon “1: TV” is, for example, an icon on which an illustration of a TV or the like is drawn, and is preferably a picture reminiscent of “sending a photograph to the TV for display”. Other related items are also assigned icons with appropriate patterns that recall the contents of the related items.
 このような関連情報に基づいて、関連項目抽出部23は、抽出した関連項目のそれぞれに対応するアイコン(あるいは、アイコンの識別情報)を、操作画面処理部24に供給してもよい。これにより操作画面処理部24は、関連項目抽出部23によって指定されたアイコンを表示するべく処理を進めることができる。 Based on such related information, the related item extracting unit 23 may supply the operation screen processing unit 24 with icons (or icon identification information) corresponding to the extracted related items. Thereby, the operation screen processing unit 24 can proceed to display the icon specified by the related item extraction unit 23.
　なお、本実施形態では、関連情報は、関連項目のそれぞれについて、図6に示す「動作属性」および「条件」の情報を保持していなくても構わない。関連項目の「動作属性」および「条件」については、後述する本発明の他の実施形態または変形例において説明する。 In the present embodiment, the related information need not hold the "operation attribute" and "condition" information shown in FIG. 6 for each related item. The "operation attribute" and "condition" of related items will be described in other embodiments or modifications of the present invention described later.
　操作画面処理部24は、オブジェクト、および、選択されたオブジェクトに関連する関連項目(のアイコン)を、ユーザに選択可能に表示するための操作画面を生成する処理(操作画面生成処理)を行うものである。 The operation screen processing unit 24 performs processing (operation screen generation processing) for generating an operation screen that displays an object and the icons of the related items related to the selected object so that the user can select them.
　図7は、アイコン記憶部43に記憶されるアイコン画像の具体例を示す図である。図7に示すとおり、本実施形態では、各アイコン画像は、アイコン識別情報によって識別可能となっている。例えば、「1:テレビ」のアイコン識別情報には、テレビが描かれたアイコン画像が関連付けられている。また、図示していないが、よく通話する知人など、個人情報を表すアイコンとして、その人の似顔絵やアバターの画像が用いられてもよい。 FIG. 7 is a diagram showing specific examples of the icon images stored in the icon storage unit 43. As shown in FIG. 7, in the present embodiment, each icon image can be identified by icon identification information. For example, an icon image depicting a TV is associated with the icon identification information "1: TV". In addition, although not shown, a portrait or avatar image of a person may be used as an icon representing personal information, such as an acquaintance the user often calls.
　操作画面処理部24は、関連項目抽出部23によって抽出された関連項目に割り当てられたアイコン画像を、アイコン記憶部43から読み出して、これらが適切な位置および適切なタイミングで表示されるように操作画面を生成し、図示しない表示制御部を介して、表示部12に出力する。 The operation screen processing unit 24 reads the icon images assigned to the related items extracted by the related item extracting unit 23 from the icon storage unit 43, generates an operation screen so that these are displayed at appropriate positions and at an appropriate timing, and outputs it to the display unit 12 via a display control unit (not shown).
　具体的には、本実施形態では、操作画面処理部24は、接触動作「囲う」によって選択されたオブジェクトに関連する関連項目のアイコンを、その選択されたオブジェクトの周囲に表示させる機能を有している。 Specifically, in the present embodiment, the operation screen processing unit 24 has a function of displaying the icons of the related items related to the object selected by the "enclose" contact operation around that selected object.
　図8は、操作画面処理部24の処理内容を説明する図である。より詳細には、図8の(a)は、操作画面処理部24によって実行された、オブジェクトの表示処理の一例を説明する図であり、図8の(b)は、操作画面処理部24によって実行された、関連項目のアイコン配置の一例を示す図である。 FIG. 8 is a diagram for explaining the processing performed by the operation screen processing unit 24. More specifically, (a) of FIG. 8 illustrates an example of the object display processing executed by the operation screen processing unit 24, and (b) of FIG. 8 shows an example of the icon arrangement of related items produced by the operation screen processing unit 24.
　本実施形態では、操作画面処理部24は、まず、図8の(a)に示すとおり、先の「囲う」の接触動作によって選択されたオブジェクト80を、中央に配置しなおす。次に、図8の(b)に示すとおり、操作画面処理部24は、抽出された関連項目の各アイコンをオブジェクト80の周囲に環形状によって均等に配置する。図8の(b)は、関連項目が8個抽出された場合に、操作画面処理部24が、楕円の環の輪郭線に沿って、8個のアイコンを均等に配置した例を示している。ここで、アイコンの配置位置を示す基準の「環」の形状について、これを楕円としたことは単なる一例であって、本発明の環の形状を限定する意図は無い。また、「環」とは、必ずしも曲線からなる形状を意味しない。例えば、操作画面処理部24は、環の形状を、円形、正方形、長方形、その他多角形で定義してもよいし、複雑な形状、いびつな形状、幾何学的でない形状であっても、内と外とを分離するような輪郭線を有している図形が環として定義される。また、「環」は、必ずしも閉曲線を意味しない。環の輪郭線の始点と終点が完全に一致しない場合であっても、内と外とを大方分離するような輪郭線が環として定義されてもよい。操作画面処理部24は、そのようにして定義されたあらゆる形状の環の輪郭線上にアイコンを配置する。 In the present embodiment, the operation screen processing unit 24 first rearranges the object 80 selected by the preceding "enclose" contact operation at the center, as shown in (a) of FIG. 8. Next, as shown in (b) of FIG. 8, the operation screen processing unit 24 arranges the icons of the extracted related items evenly around the object 80 in a ring shape. (b) of FIG. 8 shows an example in which, eight related items having been extracted, the operation screen processing unit 24 arranges the eight icons evenly along the outline of an elliptical ring. Here, making the reference "ring" indicating the icon arrangement positions an ellipse is merely one example, and there is no intention to limit the ring shape of the present invention. Furthermore, a "ring" does not necessarily mean a shape formed by curves. For example, the operation screen processing unit 24 may define the ring shape as a circle, a square, a rectangle, or another polygon; even a complicated, irregular, or non-geometric shape is defined as a ring as long as the figure has an outline that separates the inside from the outside. Moreover, a "ring" does not necessarily mean a closed curve. Even when the start point and end point of the ring's outline do not completely coincide, an outline that largely separates the inside from the outside may be defined as a ring. The operation screen processing unit 24 arranges the icons on the outline of a ring of any shape defined in this way.
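For the concrete case of (b) of FIG. 8, in which eight icons are spaced evenly on an elliptical ring, the arrangement can be sketched as follows (an illustration only; the parameter names and the clockwise-from-top placement convention are assumptions consistent with the description):

```python
import math

def place_icons_on_ellipse(n, center, rx, ry, start_angle=-math.pi / 2):
    """Return n (x, y) positions spaced evenly along an elliptical ring.

    start_angle = -pi/2 places the first icon directly above the center;
    since screen y grows downward, increasing angles proceed clockwise.
    """
    positions = []
    for i in range(n):
        theta = start_angle + 2 * math.pi * i / n
        x = center[0] + rx * math.cos(theta)
        y = center[1] + ry * math.sin(theta)
        positions.append((round(x), round(y)))
    return positions
```

Calling `place_icons_on_ellipse(8, (200, 150), 120, 80)` produces eight evenly spaced slots, the first directly above the centered object.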
　なお、図8の(b)に示す、環の輪郭線を示す破線は、タブレット端末100が内部に情報として保持している環の形状であって、実際には、表示部12に表示されなくてもよい。これより以降に示す各図における環の輪郭線を示す破線も同様、実際には、表示部12に表示されなくてもよい。 Note that the broken line indicating the ring's outline in (b) of FIG. 8 represents the ring shape that the tablet terminal 100 holds internally as information; it need not actually be displayed on the display unit 12. The same applies to the broken lines indicating ring outlines in the subsequent drawings.
 なお、アイコンを配置する順番は、本実施形態では特に限定されないが、例えば、関連項目抽出部23によって抽出された順に、オブジェクト80の真上から時計回りに配置することなどが考えられる。 Note that the order in which the icons are arranged is not particularly limited in the present embodiment. For example, the icons may be arranged in a clockwise direction from the top of the object 80 in the order extracted by the related item extraction unit 23.
 図9は、操作画面処理部24が実行した操作画面生成処理の結果、得られた操作画面の具体例を示す図である。図9に示す例は、図5および図8と同様に、オブジェクト80(オブジェクト「写真1」)が囲われたことに対して得られた操作画面の具体例である。 FIG. 9 is a diagram illustrating a specific example of the operation screen obtained as a result of the operation screen generation process executed by the operation screen processing unit 24. The example shown in FIG. 9 is a specific example of the operation screen obtained when the object 80 (object “photo 1”) is enclosed, as in FIGS. 5 and 8.
　関連項目抽出部23は、図6に示す関連情報を参照し、写真のオブジェクトに関連付けられたアイコン識別情報、すなわち、「1:テレビ」、「2:プリンタ」、「3:メール」、「4:写真表示」、「5:情報表示」、「6:パレット」、「7:ゴミ箱」、および、「8:メモリカード」を抽出する。 The related item extraction unit 23 refers to the related information shown in FIG. 6 and extracts the icon identification information associated with the photo object, namely "1: TV", "2: Printer", "3: Mail", "4: Photo display", "5: Information display", "6: Palette", "7: Trash", and "8: Memory card".
 操作画面処理部24は、関連項目抽出部23によって抽出されたアイコン識別情報に基づいて、図7に示すとおり、アイコン記憶部43から、対応するアイコン画像を読み出す。そして、選択されたオブジェクト80を中央に配置するとともに、その周囲に読み出したアイコン画像を配置する。 The operation screen processing unit 24 reads a corresponding icon image from the icon storage unit 43 as shown in FIG. 7 based on the icon identification information extracted by the related item extraction unit 23. Then, the selected object 80 is arranged at the center, and the read icon image is arranged around it.
　なお、上述の説明では、操作画面処理部24が、オブジェクト80を中央に配置する処理を行う構成としたが、これは必須の構成ではない。しかし、本実施形態では、操作画面処理部24は、オブジェクト80の関連項目のアイコンを、オブジェクト80の周囲に配置するため、周囲にアイコンを配置するためのスペースをできるだけ広く均等に確保するという観点から、オブジェクト80を中央に配置することが好ましい。 In the above description, the operation screen processing unit 24 performs the processing of rearranging the object 80 at the center, but this is not an essential feature. In the present embodiment, however, since the operation screen processing unit 24 arranges the icons of the related items around the object 80, it is preferable to place the object 80 at the center from the viewpoint of securing as wide and even a space as possible for arranging the icons around it.
　〔操作画面表示フロー〕 [Operation screen display flow]
　次に、タブレット端末100が操作画面表示機能を実行したときの処理の流れについて説明する。図10は、タブレット端末100による操作画面表示処理の流れを示すフローチャートである。 Next, the flow of processing when the tablet terminal 100 executes the operation screen display function will be described. FIG. 10 is a flowchart showing the flow of the operation screen display processing performed by the tablet terminal 100.
　入力部11によって、タッチパネルのタッチ面に指示体(ユーザの指など)が接触したことが検知されると(S101においてYES)、接触情報生成部21は、そのとき(t=t0)から、指の接触位置を示す接触座標情報の取得を開始し、これを経時的に取得する(S102)。この接触位置の追尾は、タッチ面と指との間の接触が検知されなくなるまで継続される(S103においてNO)。入力部11において、接触が非検知になると(S103においてYES)、接触情報生成部21は、t=t0からこのとき(t=tn)までの間取得した接触座標情報と、接触時間情報とを対応付けて接触情報を生成する(S104)。 When the input unit 11 detects that an indicator (such as the user's finger) has touched the touch surface of the touch panel (YES in S101), the contact information generation unit 21 starts acquiring, from that time (t = t0), contact coordinate information indicating the contact position of the finger, and continues acquiring it over time (S102). This tracking of the contact position continues until contact between the touch surface and the finger is no longer detected (NO in S103). When the input unit 11 no longer detects contact (YES in S103), the contact information generation unit 21 generates contact information by associating the contact coordinate information acquired from t = t0 to that time (t = tn) with the contact time information (S104).
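The flow of S101 to S104 can be sketched as follows (a simplified illustration; the class and field names are hypothetical):

```python
class ContactInfoGenerator:
    """Accumulates (t, x, y) samples from touch-down until touch-up."""

    def __init__(self):
        self.samples = []

    def on_touch_down(self, t, x, y):   # S101: contact detected
        self.samples = [(t, x, y)]

    def on_touch_move(self, t, x, y):   # S102: track the contact position
        self.samples.append((t, x, y))

    def on_touch_up(self):              # S103/S104: contact lost -> build info
        return {
            "t0": self.samples[0][0],
            "tn": self.samples[-1][0],
            "trajectory": [(x, y) for _, x, y in self.samples],
        }
```

The resulting dictionary plays the role of the contact information: coordinate samples associated with contact time information from t0 to tn.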
 オブジェクト特定部22は、S104において生成された接触情報(例えば、図5の(b))と、フレームマップ記憶部41に記憶されているマップ情報(例えば、図5の(c))とを比較して、ユーザによって囲われた領域に重なるオブジェクトを選択されたオブジェクトとして特定する(S105)。図5の(c)に示す例では、「写真1」というオブジェクト80を特定する。 The object specifying unit 22 compares the contact information generated in S104 (for example, (b) in FIG. 5) with the map information (for example, (c) in FIG. 5) stored in the frame map storage unit 41. Then, the object overlapping the area surrounded by the user is specified as the selected object (S105). In the example shown in FIG. 5C, the object 80 “Photo 1” is specified.
 関連項目抽出部23は、S105において特定されたオブジェクトに基づいて、関連情報記憶部42の関連情報(例えば、図6)を参照して、特定されたオブジェクトの関連項目を抽出する(S106)。あるいは、関連項目に割り当てられているアイコンの識別情報を抽出してもよい。 The related item extraction unit 23 refers to the related information (for example, FIG. 6) in the related information storage unit 42 based on the object specified in S105, and extracts the related item of the specified object (S106). Alternatively, identification information of icons assigned to related items may be extracted.
 操作画面処理部24は、S106において抽出された関連項目のアイコン画像を、アイコン記憶部43(例えば、図7)から取得する。そして、取得したアイコン画像を、S105において特定されたオブジェクトの周囲に配置して操作画面を生成する(S107)。このとき、操作画面処理部24は、上記オブジェクトを中央に配置して、その周囲に各アイコンを環形状に配置する。 The operation screen processing unit 24 acquires the icon image of the related item extracted in S106 from the icon storage unit 43 (for example, FIG. 7). Then, the acquired icon image is arranged around the object specified in S105 to generate an operation screen (S107). At this time, the operation screen processing unit 24 arranges the object in the center and arranges each icon in a ring shape around the object.
 以上のようにして生成された操作画面の映像信号は、表示部12に出力される。図9に示すように、上記操作画面は、タブレット端末100の表示部12に表示される。 The video signal of the operation screen generated as described above is output to the display unit 12. As shown in FIG. 9, the operation screen is displayed on the display unit 12 of the tablet terminal 100.
　本発明の上記構成および方法によれば、オブジェクトの周囲を「囲う」という、オブジェクトを指定する上で、極めて自然で簡易なユーザの接触動作に対して、タブレット端末100は、オブジェクトの周囲にアイコンを配置するという結果を出力することができる。 According to the above configuration and method of the present invention, in response to the extremely natural and simple user contact operation of "enclosing" an object to designate it, the tablet terminal 100 can output the result of arranging icons around that object.
 ユーザは、オブジェクトを囲むように関連項目のアイコンが配置された操作画面を結果物として得ることができる。これらのアイコンと、オブジェクトとの位置関係は、先にユーザが実行した接触動作による指の軌跡とオブジェクトとの位置関係に合致する。また、囲ったことにより得られる指の軌跡は、アイコンが配置される環形状に類似する。 The user can obtain an operation screen in which icons of related items are arranged so as to surround the object as a result. The positional relationship between these icons and the object matches the positional relationship between the finger trajectory and the object by the contact operation previously performed by the user. In addition, the finger trajectory obtained by surrounding is similar to a ring shape in which icons are arranged.
　つまり、「オブジェクトを『囲う』接触動作を起こす」という事象から、「オブジェクトの周囲にアイコンが配置された操作画面を得られる」という事象への遷移は、ユーザの直感に反しない自然な遷移であると言える。 In other words, the transition from the event of "performing a contact operation that 'encloses' an object" to the event of "obtaining an operation screen with icons arranged around the object" can be said to be a natural transition that does not contradict the user's intuition.
 加えて、タブレット端末100は、オブジェクトを選択した次にユーザが選択するであろう関連項目を予め察して、ユーザに選択可能に表示することができる。具体的には、本発明の上記構成によれば、オブジェクトの周囲に表示されるアイコンは、いずれも、オブジェクトに関連がある項目として抽出された関連項目のアイコンである。つまり、ユーザは、オブジェクトを囲って選択したのち、そのオブジェクトに関連する「動作」、「動作対象」、「動作相手」などを周囲のアイコンから即座に指定することができる。 In addition, the tablet terminal 100 can preliminarily detect related items that the user will select after selecting an object, and can display the related items in a selectable manner for the user. Specifically, according to the above configuration of the present invention, the icons displayed around the object are all related item icons extracted as items related to the object. That is, after the user surrounds and selects an object, the user can immediately designate “motion”, “motion target”, “action partner”, and the like related to the object from surrounding icons.
 以上のことから、タブレット端末100は、簡易な接触動作且つ少ない動作数でありながら、ユーザの直感に反しない自然な流れで、ユーザが所望する最終結果物を表示させることができる。結果として、タッチパネルを備えたタブレット端末100において、優れた操作性を実現することが可能になるという効果を奏する。 From the above, the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition while having a simple contact operation and a small number of operations. As a result, the tablet terminal 100 including the touch panel has an effect that it is possible to realize excellent operability.
　≪実施形態2≫ << Embodiment 2 >>
　本発明の情報処理装置に関する他の実施形態について、図11~図14に基づいて説明すれば、以下のとおりである。なお、説明の便宜上、上述の実施形態1にて説明した図面と同じ機能を有する部材については、同じ符号を付記し、実施形態1と重複する内容については説明を省略する。 Another embodiment of the information processing apparatus according to the present invention will be described below with reference to FIGS. 11 to 14. For convenience of explanation, members having the same functions as those in the drawings described in Embodiment 1 above are denoted by the same reference numerals, and descriptions overlapping with Embodiment 1 are omitted.
　上述の実施形態1では、各アイコンは、所定の環形状に沿って配置される構成であったが、本実施形態では、操作画面処理部24は、ユーザの接触動作に応じて環形状を動的に決定する構成である。これにより、ユーザの直感に沿って、接触動作からより直感的に想起される結果物を出力することができる。 In Embodiment 1 described above, the icons are arranged along a predetermined ring shape; in the present embodiment, by contrast, the operation screen processing unit 24 dynamically determines the ring shape according to the user's contact operation. This makes it possible to output a result that is recalled more intuitively from the contact operation, in line with the user's intuition.
　〔タブレット端末の機能〕 [Functions of the tablet terminal]
　図11は、本実施形態におけるタブレット端末100の要部構成を示す機能ブロック図である。 FIG. 11 is a functional block diagram illustrating the main configuration of the tablet terminal 100 according to the present embodiment.
 本実施形態にかかるタブレット端末100は、実施形態1のタブレット端末100(図1)と比較して、さらに、制御部10が、機能ブロックとして、環形状決定部30を備えている構成である。また、記憶部19が、さらに、接触情報記憶部44を備えている構成である。 The tablet terminal 100 according to the present embodiment has a configuration in which the control unit 10 further includes a ring shape determining unit 30 as a functional block, as compared with the tablet terminal 100 (FIG. 1) of the first embodiment. The storage unit 19 further includes a contact information storage unit 44.
　ここで、本実施形態では、タブレット端末100の制御部10は、必須ではないが、さらに、必要に応じて、機能ブロックとして、ジェスチャ判定部25、アイコン順位決定部31、アニメーション決定部32、および、アイコン配置決定部33を備えていてもよい。なお、アイコン順位決定部31、アニメーション決定部32、および、アイコン配置決定部33については、後の実施形態または変形例にて説明することとし、本実施形態では説明を省略する。 Here, in the present embodiment, the control unit 10 of the tablet terminal 100 may further include, as functional blocks, a gesture determination unit 25, an icon order determination unit 31, an animation determination unit 32, and an icon arrangement determination unit 33 as necessary, although these are not essential. The icon order determination unit 31, the animation determination unit 32, and the icon arrangement determination unit 33 will be described in later embodiments or modifications, and their description is omitted in the present embodiment.
　接触情報記憶部44は、接触情報生成部21によって生成された接触情報を記憶するものである。実施形態1では、接触情報は、オブジェクト特定部22が利用可能なように図示しない記憶部(キャッシュなど)に一時的に記憶されるものであった。本実施形態では、接触情報は、操作画面処理部24および操作画面処理部24の各部が、操作画面生成処理(アイコンを表示する処理を含む)を実行するときに利用可能なように、接触情報記憶部44に記憶される。接触情報記憶部44を不揮発性記憶装置で実現するか否か、すなわち、この接触情報を不揮発的に記憶するか否かは、操作画面処理部24が実行する操作画面表示機能の目的、想定利用環境、あるいは、タブレット端末100自体の使用目的、利便性、コスト、物理的な制約などから適宜決定される。 The contact information storage unit 44 stores the contact information generated by the contact information generation unit 21. In Embodiment 1, the contact information was temporarily stored in a storage unit (such as a cache, not shown) so that the object specifying unit 22 could use it. In the present embodiment, the contact information is stored in the contact information storage unit 44 so that the operation screen processing unit 24 and its respective parts can use it when executing the operation screen generation processing (including the processing for displaying icons). Whether the contact information storage unit 44 is realized by a non-volatile storage device, that is, whether the contact information is stored in a non-volatile manner, is determined as appropriate in view of the purpose and assumed usage environment of the operation screen display function executed by the operation screen processing unit 24, as well as the purpose of use, convenience, cost, and physical constraints of the tablet terminal 100 itself.
　環形状決定部30は、操作画面処理部24がオブジェクトの周囲にアイコンを配置するときの、環形状を決定するものである。実施形態1では、操作画面処理部24は、所定の環形状にてアイコンを配置する構成であった。例えば、所定位置および所定サイズの楕円形の輪郭線上にアイコンを配置する構成であった。本実施形態では、環形状決定部30は、接触情報記憶部44に記憶されている接触情報に基づいて、アイコンを配置するための環形状を決定する。つまり、環形状決定部30は、ユーザの接触動作「囲う」から得られた指の軌跡に応じて、環形状を決定する。環形状決定部30は、「囲う」動作の軌跡の形状を、そのまま、アイコンを配置するための環形状とすることができる。また、「囲う」動作によって囲われた領域の大きさに基づいて環のサイズを決定することもできる。また、囲われた領域の位置に基づいて環の位置を決定することもできる。環形状決定部30は、さらに、フレームマップ記憶部41に記憶されているマップ情報に基づいて環形状を決定してもよい。すなわち、囲われたオブジェクトの表示位置、サイズなどに応じて、環のサイズおよび位置を決定してもよい。 The ring shape determination unit 30 determines the ring shape used when the operation screen processing unit 24 arranges icons around the object. In Embodiment 1, the operation screen processing unit 24 arranged the icons in a predetermined ring shape, for example, on the outline of an ellipse of a predetermined position and size. In the present embodiment, the ring shape determination unit 30 determines the ring shape for arranging the icons based on the contact information stored in the contact information storage unit 44. That is, the ring shape determination unit 30 determines the ring shape according to the finger trajectory obtained from the user's "enclose" contact operation. The ring shape determination unit 30 may adopt the shape of the trajectory of the "enclose" operation as-is as the ring shape for arranging the icons. It may also determine the size of the ring based on the size of the region enclosed by the "enclose" operation, and the position of the ring based on the position of the enclosed region. The ring shape determination unit 30 may further determine the ring shape based on the map information stored in the frame map storage unit 41; that is, it may determine the size and position of the ring according to the display position, size, and so on of the enclosed object.
 環形状決定部30によって決定された環形状にしたがって、操作画面処理部24は、抽出されたアイコンを配置する。なお、環形状決定部30によって決定される環形状の情報は、環のサイズの情報、および/または、環の位置の情報をさらに含んでいてもよい。 In accordance with the ring shape determined by the ring shape determination unit 30, the operation screen processing unit 24 arranges the extracted icons. The ring shape information determined by the ring shape determination unit 30 may further include ring size information and / or ring position information.
　上述したとおり、タブレット端末100の制御部10は、さらに、ジェスチャ判定部25を備えていることが好ましい。入力部11に対して行われる接触動作(ジェスチャ)が「囲う」以外にも想定されている場合には、「囲う」のジェスチャなのか、あるいは、別のどのジェスチャなのかを判別する必要がある。 As described above, the control unit 10 of the tablet terminal 100 preferably further includes the gesture determination unit 25. When contact operations (gestures) other than "enclose" may be performed on the input unit 11, it is necessary to determine whether a given operation is the "enclose" gesture or some other gesture.
　ジェスチャ判定部25は、入力部11に対して行われた接触動作について、それが何のジェスチャであるのかを判定するものである。例えば、ジェスチャ判定部25は、「タップ」、「フリック」、「ピンチ」、「ドラッグ」、「囲う」などのジェスチャを判別することができる。ジェスチャを判別するアルゴリズムは、公知の技術を適宜採用することができる。 The gesture determination unit 25 determines which gesture a contact operation performed on the input unit 11 corresponds to. For example, the gesture determination unit 25 can distinguish gestures such as "tap", "flick", "pinch", "drag", and "enclose". A known technique can be employed as appropriate for the gesture determination algorithm.
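As one example of such a known technique (an assumption for illustration; the patent does not prescribe a specific algorithm), a trajectory may be classified as "enclose" when it travels a sufficient distance and its end point returns near its start point:

```python
import math

def classify_gesture(trajectory, close_dist=30.0, min_length=100.0):
    """Very rough classifier: 'tap', 'enclose', or 'drag' (illustrative only)."""
    if len(trajectory) < 2:
        return "tap"
    length = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    if length < close_dist:
        return "tap"
    start, end = trajectory[0], trajectory[-1]
    if math.dist(start, end) <= close_dist and length >= min_length:
        return "enclose"   # the path loops back near its starting point
    return "drag"
```

The thresholds `close_dist` and `min_length` are hypothetical tuning parameters; a production implementation would also account for pinch (two contact points) and flick (velocity), which are omitted here.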
 ジェスチャ判定部25は、判定結果に応じて、その判定されたジェスチャに対応する処理を実行するように制御部10の各部に対して指示する。 The gesture determination unit 25 instructs each unit of the control unit 10 to execute processing corresponding to the determined gesture according to the determination result.
　本実施形態では、ジェスチャ判定部25は、検知された接触動作が、「囲う」というジェスチャであると判定した場合に、接触情報生成部21に対して、生成した接触情報を接触情報記憶部44に格納するように指示することが好ましい。これにより、「囲う」というジェスチャについての全ての情報(領域の位置、サイズ、軌跡、接触時間、接触点の移動タイミングなど)を、操作画面処理部24が参照できるようになるとともに、接触動作が「囲う」というジェスチャ以外であった場合には、不必要に接触情報記憶部44への書き込みが発生することを回避することができる。しかしながら、接触情報生成部21は、ジェスチャ判定部25の判定結果によらず、すべての接触情報を接触情報記憶部44に書き込む構成であってもよい。 In the present embodiment, when the gesture determination unit 25 determines that the detected contact operation is the "enclose" gesture, it preferably instructs the contact information generation unit 21 to store the generated contact information in the contact information storage unit 44. This allows the operation screen processing unit 24 to refer to all the information about the "enclose" gesture (the position and size of the enclosed region, the trajectory, the contact time, the movement timing of the contact point, and so on), while avoiding unnecessary writes to the contact information storage unit 44 when the contact operation is a gesture other than "enclose". However, the contact information generation unit 21 may be configured to write all contact information to the contact information storage unit 44 regardless of the determination result of the gesture determination unit 25.
 図12は、接触情報記憶部44に記憶される接触情報の具体例を示す図である。より詳細には、図12の(a)は、ユーザが目的のオブジェクトを選択するために、オブジェクトを任意の形状で「囲う」という接触動作を行ったことを示す図である。図12の(b)は、同図の(a)に示す接触動作に伴って、接触情報生成部21が生成した接触情報の一例を示す図である。 FIG. 12 is a diagram illustrating a specific example of contact information stored in the contact information storage unit 44. More specifically, FIG. 12A is a diagram showing that the user has performed a contact operation of “enclosing” an object in an arbitrary shape in order to select a target object. FIG. 12B is a diagram illustrating an example of contact information generated by the contact information generation unit 21 in accordance with the contact operation illustrated in FIG.
　図12の(a)に示すとおり、ユーザが、タブレット端末100のタッチパネルに表示されているオブジェクト(ここでは、写真)のうちの1つを、任意の形状(例えば、ハート型)に「囲う」という接触動作を実行したとする。接触動作は、例えば、接触点が同図の破線の位置を通過するようにt0~tnの期間に実行されたとする。 As shown in (a) of FIG. 12, assume that the user performs a contact operation of "enclosing" one of the objects (here, photographs) displayed on the touch panel of the tablet terminal 100 in an arbitrary shape (for example, a heart shape). Assume also that the contact operation is performed, for example, over the period from t0 to tn, with the contact point passing through the positions indicated by the broken line in the figure.
 ジェスチャ判定部25は、接触情報生成部21から、図12の(b)に示すような接触情報を取得する。図12の(b)において、ユーザがなぞった軌跡のうち、始点をt0、終点をtnとして示しているが、その間の各点にも接触時間情報が関連付けられていてもよい。 The gesture determination unit 25 acquires contact information as illustrated in FIG. 12B from the contact information generation unit 21. In FIG. 12B, among the traces traced by the user, the start point is indicated as t0 and the end point is indicated as tn. However, contact time information may also be associated with each point in between.
　ジェスチャ判定部25は、図12の(b)に示される接触情報に基づいて、この接触動作を、「囲う」というジェスチャであると判定する。ジェスチャ判定部25は、図12の(b)に示される上記接触情報を接触情報記憶部44に記憶するように、接触情報生成部21に対して指示する。これにより、操作画面処理部24の各部は、アイコンを表示する処理を実行するときに、接触情報記憶部44に記憶された、図12の(b)に示される接触情報を参照することができる。 Based on the contact information shown in (b) of FIG. 12, the gesture determination unit 25 determines that this contact operation is the "enclose" gesture. The gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information shown in (b) of FIG. 12 in the contact information storage unit 44. This allows each part of the operation screen processing unit 24 to refer to the contact information shown in (b) of FIG. 12 stored in the contact information storage unit 44 when executing the processing for displaying icons.
 図13は、環形状決定部30を含む操作画面処理部24によって実行された、関連項目のアイコン表示の一例を示す図である。 FIG. 13 is a diagram illustrating an example of icon display of related items executed by the operation screen processing unit 24 including the ring shape determining unit 30.
　まず、操作画面処理部24は、選択されたオブジェクト80(写真1)を、中央に配置することができる。次に、本実施形態では、環形状決定部30が、接触情報記憶部44に記憶された接触情報を取得する。環形状決定部30は、接触情報から得られた、指の先端(接触点)の移動の軌跡に基づいて、その軌跡と同一のまたは相似する形状を、アイコンを配置するための環形状として決定する。本実施形態では、一例として、環形状決定部30は、環を中心に配置し、タッチ画面に可能な限り大きく配置することを決定する。図12の(a)および(b)に示すとおり、オブジェクト80は、ハート型に囲われている。そこで、環形状決定部30は、図13の破線に示すとおり、そのハート型の軌跡の相似形が中央に画面いっぱい配置されるように環形状81を決定する。 First, the operation screen processing unit 24 can place the selected object 80 (Photo 1) at the center. Next, in the present embodiment, the ring shape determination unit 30 acquires the contact information stored in the contact information storage unit 44. Based on the movement trajectory of the finger tip (contact point) obtained from the contact information, the ring shape determination unit 30 determines a shape identical or similar to that trajectory as the ring shape for arranging the icons. In the present embodiment, as an example, the ring shape determination unit 30 decides to place the ring at the center and to make it as large as the touch screen allows. As shown in (a) and (b) of FIG. 12, the object 80 is enclosed in a heart shape. The ring shape determination unit 30 therefore determines the ring shape 81 so that a figure similar to the heart-shaped trajectory fills the screen at the center, as indicated by the broken line in FIG. 13.
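The determination of a similar, screen-filling ring shape from the trajectory can be sketched as follows (illustrative only; the function name and margin parameter are assumptions):

```python
def fit_trajectory_to_screen(trajectory, screen_w, screen_h, margin=20):
    """Scale and translate a finger trajectory into a similar shape that is
    centered on the screen and as large as the screen (minus a margin) allows."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    # A uniform scale factor keeps the ring geometrically similar
    # to the drawn trajectory.
    scale = min((screen_w - 2 * margin) / w, (screen_h - 2 * margin) / h)
    cx, cy = (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
    return [((x - cx) * scale + screen_w / 2,
             (y - cy) * scale + screen_h / 2) for x, y in trajectory]
```

Applying this to the heart-shaped trajectory of FIG. 12 would yield the enlarged, centered ring shape 81 of FIG. 13, on whose outline the icons are then placed.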
 操作画面処理部24は、図13に示すように、環形状決定部30によって決定された環形状81の輪郭線上にアイコンを配置する。操作画面処理部24は、アイコンを等間隔に配置してもよいし、別の規則にしたがって輪郭線上の任意の位置にアイコンを配置してもよい。 The operation screen processing unit 24 arranges icons on the outline of the ring shape 81 determined by the ring shape determination unit 30 as shown in FIG. The operation screen processing unit 24 may arrange the icons at equal intervals, or may arrange the icons at arbitrary positions on the contour line according to another rule.
 When the trajectory has an extremely complicated shape, the operation screen processing unit 24 may determine an approximation of the trajectory as the ring shape. By rounding the fine, jagged portions of the trajectory into straight lines or curves, the amount of information defining the ring shape can be reduced, which lowers the processing load of arranging the icons.
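 One common way to realize such an approximation — not specified in the text, so offered purely as an assumption — is the Ramer-Douglas-Peucker algorithm, which drops vertices that deviate little from the overall shape:

```python
import math

def simplify(points, epsilon):
    """Ramer-Douglas-Peucker: drop vertices deviating from the chord by
    less than epsilon, reducing the data that defines the ring shape."""
    if len(points) < 3:
        return points[:]
    (x1, y1), (x2, y2) = points[0], points[-1]
    chord = math.hypot(x2 - x1, y2 - y1) or 1e-12

    def dist(p):
        # Perpendicular distance of p from the chord (first-to-last line).
        return abs((x2 - x1) * (y1 - p[1]) - (x1 - p[0]) * (y2 - y1)) / chord

    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax <= epsilon:
        return [points[0], points[-1]]
    # Keep the farthest vertex and recurse on both halves.
    return simplify(points[:idx + 1], epsilon)[:-1] + simplify(points[idx:], epsilon)
```

 Small wiggles below the tolerance epsilon are removed, while genuine features of the enclosing stroke are kept.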
 [Operation screen display flow]
 FIG. 14 is a flowchart showing the flow of the operation screen display processing performed by the tablet terminal 100 in the present embodiment.
 When the input unit 11 detects that an indicator (such as a user's finger) has touched the touch surface of the touch panel (YES in S201), the contact information generation unit 21 starts acquiring, from that time (t = t0), contact coordinate information indicating the contact position of the finger, and continues acquiring it over time (S202). This tracking of the contact position continues until contact between the touch surface and the finger is no longer detected (NO in S203). When the input unit 11 no longer detects contact (YES in S203), the contact information generation unit 21 generates contact information by associating the contact coordinate information acquired from t = t0 until that time (t = tn) with contact time information (S204).
 Here, the gesture determination unit 25 may determine the gesture of this contact operation based on the contact information (S205). In the present embodiment, if the determined gesture is not "enclose" (NO in S206), the gesture determination unit 25 instructs each unit of the control unit 10 to execute processing corresponding to the gesture that was actually determined, and those units execute that processing (S207).
 On the other hand, when the determined gesture is "enclose" (YES in S206), the gesture determination unit 25 instructs the contact information generation unit 21 to store the contact information in the contact information storage unit 44. The contact information generation unit 21 stores the contact information generated in S204 in the contact information storage unit 44 (S208).
 The object specifying unit 22 compares the contact information stored in the contact information storage unit 44 (for example, (b) of FIG. 12) with the map information stored in the frame map storage unit 41 (for example, (c) of FIG. 5), and specifies the object overlapping the region enclosed by the user as the selected object (S209). In the example shown in (c) of FIG. 5, the object 80 "photo 1" is specified.
 Based on the object specified in S209, the related item extraction unit 23 refers to the related information in the related information storage unit 42 (for example, FIG. 6) and extracts the related items of the specified object (S210). Alternatively, it may extract the identification information of the icons assigned to the related items.
 Subsequently, the operation screen processing unit 24 first acquires the contact information generated in S204 from the contact information storage unit 44 (S211). The ring shape determination unit 30 of the operation screen processing unit 24 determines, from the acquired contact information, the ring shape along which the icons are to be arranged (S212). For example, based on the heart-shaped trajectory shown in (b) of FIG. 12, a figure similar to the heart shape is determined as the ring shape (for example, the ring shape 81 in FIG. 13).
 The operation screen processing unit 24 acquires the icon images of the related items extracted in S210 from the icon storage unit 43 (for example, FIG. 7). It then generates an operation screen by arranging the acquired icon images around the object specified in S209 (S213). At this time, the operation screen processing unit 24 places the object in the center and arranges each icon on the surrounding ring shape, that is, on the contour of the ring shape determined in S212 (for example, FIG. 13). The video signal of the operation screen generated in this way is output to the display unit 12.
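 The steps S201 to S213 can be condensed into a short sketch. All helper names, the gesture test, and the data shapes below are hypothetical stand-ins for the units described above, not the claimed implementation:

```python
def is_enclosing_gesture(traj):
    # Crude "enclose" test: the stroke ends near where it began.
    return len(traj) > 2 and (abs(traj[0][0] - traj[-1][0])
                              + abs(traj[0][1] - traj[-1][1])) < 5

def find_enclosed_object(traj, frame_map):
    # Pick the object whose center lies inside the trajectory's bounding box.
    xs = [x for x, _ in traj]
    ys = [y for _, y in traj]
    for name, (cx, cy) in frame_map.items():
        if min(xs) <= cx <= max(xs) and min(ys) <= cy <= max(ys):
            return name
    return None

def display_operation_screen(touch_events, frame_map, related_info):
    """Sketch of the flow S201-S213 with illustrative helpers."""
    trajectory = [(e["x"], e["y"]) for e in touch_events]   # S201-S204
    if not is_enclosing_gesture(trajectory):                # S205-S206
        return None                                         # S207: other gestures
    obj = find_enclosed_object(trajectory, frame_map)       # S208-S209
    items = related_info.get(obj, [])                       # S210
    ring = trajectory                                       # S211-S212 (similar shape)
    return {"object": obj, "ring": ring, "icons": items}    # S213
```

 A production implementation would replace the bounding-box test with true point-in-polygon containment against the stored frame map.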
 According to the above configuration and method of the present invention, in response to the extremely natural and simple contact operation of "enclosing" an object in order to designate it, the tablet terminal 100 can output the result of arranging icons around that object. That is, the user obtains, as the result, an operation screen in which the icons of the related items are arranged so as to surround the object.
 Specifically, the user performs the operation of enclosing the object freehand in an arbitrary shape, and the trajectory of this operation is held in the tablet terminal 100. Then, on the operation screen created by the tablet terminal 100, each icon is arranged so as to surround the object along a ring-shaped contour identical or similar to the obtained trajectory.
 The positional relationship between these icons and the object matches the positional relationship between the object and the finger trajectory of the contact operation the user performed earlier. In addition, the finger trajectory produced by the enclosing operation matches the ring shape along which the icons are arranged.
 In other words, when the user encloses an object, an operation screen is obtained in which icons are arranged around the object "just as enclosed". This transition of events can be said to be a natural flow that does not contradict the user's intuition. Moreover, since the icons are arranged in the very shape the user drew, the playfulness of operating the tablet terminal 100 through the operation screen is enhanced. Furthermore, the user can predict the arrangement of the icons, enclose the object exactly as desired, and have the icons of the related items displayed accordingly, so the operability is further improved.
 The icons displayed around an object after it is selected indicate related items that are deeply related to that object and are likely to be selected next.
 From the above, the tablet terminal 100 can display the final result desired by the user with a simple contact operation and a small number of operations, in a natural flow that does not contradict the user's intuition. As a result, excellent operability can be realized in the tablet terminal 100 including the touch panel.
 Note that the configuration of the tablet terminal 100 according to the present embodiment, that is, the configuration of displaying icons along a ring that is a similar or approximate figure of the trajectory of the pointer's movement, may also be applied to an information processing apparatus in which the input unit 11 and the display unit 12 are provided as separate bodies. For example, an information processing apparatus is conceivable in which the input unit 11 is an input device such as a mouse and the display unit 12 is a display device such as an LCD. Here, the cursor displayed on the display unit 12 indicates a position on the screen of the display unit 12, and the cursor moves when the user operates the mouse to perform an input operation. In such an embodiment, therefore, the mouse is the operating body, the cursor is the pointer, and the pointer displayed on the display unit 12 moves as the operating body moves. When the user selects an object by operating the mouse to move the cursor on the screen of the display unit 12, the information processing apparatus, which internally holds the cursor position in synchronization with the movement of the mouse, also holds the trajectory of that movement, and displays icons in a ring around the selected object. At this time, the information processing apparatus can make the shape of the ring a similar or approximate figure of the acquired trajectory. According to this configuration, an operation screen is obtained in which the icons are arranged in the very shape the user traced with the mouse, so the playfulness of operating the information processing apparatus can be enhanced.
 As the input unit 11 configured separately from the display unit 12, various input devices other than the mouse described above, such as a keyboard, a joystick, a digitizer, or a tablet and stylus pen, can also be employed.
 << Embodiment 3 >>
 Another embodiment of the information processing apparatus according to the present invention is described below with reference to FIGS. 15 to 19. For convenience of explanation, members having the same functions as those in the drawings described in Embodiment 1 or 2 above are given the same reference numerals, and descriptions overlapping Embodiment 1 or 2 are omitted.
 In each of the embodiments described above, the extracted icons were arranged according to a predetermined rule, for example, clockwise from directly above the object in the order of extraction. In the present embodiment, however, assuming that a plurality of the related items displayed around an object may be selected (enclosed) at once, the operation screen processing unit 24 determines the icon arrangement so that related items that cooperate particularly closely among the extracted related items are placed near (next to) each other.
 [Functions of the tablet terminal]
 FIG. 15 is a functional block diagram showing the main configuration of the tablet terminal 100 in the present embodiment.
 Compared with the tablet terminal 100 of Embodiment 1 (FIG. 1), the tablet terminal 100 according to the present embodiment further includes, in the control unit 10, a cooperation processing execution unit 26 and an icon arrangement determination unit 33 as functional blocks.
 Here, although not indispensable in the present embodiment, the storage unit 19 may further include the contact information storage unit 44, and the control unit 10 may further include, as functional blocks, the gesture determination unit 25, the ring shape determination unit 30, the icon order determination unit 31, and the animation determination unit 32, as necessary.
 The icon arrangement determination unit 33 determines the arrangement of the icons. Based on the attributes of the extracted related items, it determines which icon is placed at which position.
 In the tablet terminal 100 of the present invention, each icon is always arranged on the contour of a predetermined or determined ring shape, but the arrangement may otherwise be designed arbitrarily.
 Therefore, based on the attributes of each related item, the icon arrangement determination unit 33 can determine where on the contour, how many, in what order, and at what intervals the icons are arranged.
 In particular, in the present embodiment, the icon arrangement determination unit 33 determines the icon arrangement by focusing, among the attributes of the related items, on the cooperation between related items.
 FIG. 16 is a diagram showing an example of the related information stored in the related information storage unit 42 of the present embodiment.
 The related information shown in FIG. 16 differs from that shown in FIG. 6 in that "cooperation" information is further associated with each related item. Although not shown in FIG. 16, the related information of the present embodiment may further include the "operation attribute" and "condition" information shown in FIG. 6 as examples of attributes of the related items.
 図16に示す関連情報の例によれば、あるオブジェクト「ツール1」には、AからFまでの6個の関連項目が対応付けられている。したがって、オブジェクト「ツール1」がユーザによって選択された場合、関連項目抽出部23は、この関連情報に基づいて、図16に示すA~Fの項目を関連項目として抽出する。 According to the related information example shown in FIG. 16, six related items from A to F are associated with a certain object “tool 1”. Therefore, when the object “tool 1” is selected by the user, the related item extracting unit 23 extracts items A to F shown in FIG. 16 as related items based on the related information.
 The attribute "cooperation" is one of the attributes of a related item and is information indicating the cooperation between related items. As described above, the related items include those representing an "operation", those representing an "operation partner", and those representing an "operation target". For example, performing an "operation" may require an "operation target" on which that operation is performed; in this case, the "operation" and the "operation target" can be said to cooperate. Furthermore, depending on the content of the "operation", an "operation partner", that is, the other party with whom the operation is performed, may be required in addition to the "operation target". In this case, the "operation", the "operation target", and the "operation partner" can be said to cooperate.
 In the example shown in FIG. 16, information indicating that the operation "A: send by e-mail", the operation partner "D: best friend information", and the operation target "E: photo" cooperate with one another is stored in the attribute "cooperation". In the present embodiment, the related items A, D, E, and F are tied together as mutually cooperating by the cooperation number "1". The icon arrangement determination unit 33 grasps the cooperation between related items based on the "cooperation" of each related item stored in the related information storage unit 42. The icon arrangement determination unit 33 then determines the icon arrangement so that the icons of mutually cooperating related items are placed next to each other, or are grouped close together.
 Note that related items and cooperation numbers do not necessarily have a one-to-one correspondence. That is, one related item may belong to one cooperation group (one cooperation number) or to a plurality of cooperation groups. Also, one cooperation group may consist of a pair of (that is, two) related items, or of three or more related items.
 For example, the related item "E: photo" can be the "operation target" of both the operation "A: send by e-mail" and the operation "B: upload to blog", and therefore belongs both to the cooperation group "1" to which the related item A belongs and to the cooperation group "2" to which the related item B belongs.
 Also, for example, the "operation" of the related item "C: make a call" has no "operation target" and only needs to cooperate with the party to call, that is, the "operation partner"; the cooperation group "3" to which the related item C belongs therefore consists of the pair of related items C and D. On the other hand, the "operation" of the related item "A: send by e-mail" requires both an "operation target" and an "operation partner". Moreover, the related items "E: photo" and "F: movie" are both "operation targets" that can be sent by e-mail. Therefore, the cooperation group "1" to which the related item A belongs consists of the related items A, D, E, and F.
 When one related item belongs to a plurality of cooperation groups, it is preferable that priorities be set in advance for the groups it belongs to, according to the strength of the cooperation. For example, in the example shown in FIG. 16, the related item "E: photo" is associated with the cooperation numbers in the order "1", "2". That is, "E: photo" cooperates more strongly with the operation "A: send by e-mail" than with the operation "B: upload to blog". On the other hand, "F: movie" is associated with the cooperation numbers in the order "2", "1". That is, "F: movie" cooperates more strongly with the operation "B: upload to blog" than with the operation "A: send by e-mail".
 (a) and (b) of FIG. 17 are diagrams illustrating the procedure by which the icon arrangement determination unit 33 determines the icon arrangement.
 Here, it is assumed that the object "tool 1" has already been specified by the object specifying unit 22 and that the six related items "A" to "F" shown in FIG. 16 have been extracted by the related item extraction unit 23. When a rule that the six related items are arranged at equal intervals has been determined in advance, the icon arrangement determination unit 33 determines only the order of the icons according to "cooperation".
 Specifically, the icon arrangement determination unit 33 first refers to the related information shown in FIG. 16 and classifies each related item according to the ties given by the cooperation numbers. For example, as shown in FIG. 17, each related item is classified into "cooperation number 1" to "cooperation number 3". A related item may belong to a plurality of cooperations. Furthermore, the icon arrangement determination unit 33 classifies each related item as an "operation", an "operation target", or an "operation partner". The result of this classification is shown in (a) of FIG. 17.
 When a plurality of "operation targets" belong to one cooperation group, the icon arrangement determination unit 33 classifies the related items according to the predetermined priorities. For example, in the example shown in (a) of FIG. 17, the related item "E" has priority over the related item "F" in the cooperation group "1".
 Next, as shown in (b) of FIG. 17, the icon arrangement determination unit 33 determines the arrangement of the related items "A" to "F" according to their cooperation. First, the icon arrangement determination unit 33 recognizes from the cooperation number 1 that the related items A, D, and E cooperate, and decides to arrange them adjacent to one another in the order "operation", "operation target", "operation partner". Specifically, after placing the related item "A", the icon arrangement determination unit 33 places "D" next to "A" and "E" on the other side of "A". Judging that the already placed "D" cooperates with "C" via the cooperation number 3, it then places "C" on the free side of "D". Next, judging that the already placed "E", although as a secondary member, cooperates with "B" via the cooperation number 2, it places "B" on the free side of "E". Finally, the remaining "F", which cooperates with "B", is placed on the free side of "B".
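 The chain-building step described here can be sketched as a greedy placement: starting from one item, repeatedly append a not-yet-placed item that shares a cooperation group with the item at the current end. This is an illustrative reading of the procedure, using the FIG. 16 data as an assumed input, not the literal claimed algorithm:

```python
def arrange_by_cooperation(items, groups):
    """Greedy adjacency ordering. 'groups' maps each related item to its
    cooperation numbers, strongest first (cf. the 'cooperation' attribute)."""
    order = [items[0]]
    remaining = list(items[1:])
    while remaining:
        tail = order[-1]
        # Prefer a remaining item that shares a cooperation group with the tail.
        for cand in remaining:
            if set(groups.get(tail, ())) & set(groups.get(cand, ())):
                nxt = cand
                break
        else:
            nxt = remaining[0]  # no cooperating item left: start a new run
        order.append(nxt)
        remaining.remove(nxt)
    return order
```

 With the FIG. 16 groupings, this yields a ring order in which A sits between its cooperating items and the pairs C-D and B-E-F stay contiguous; a fuller implementation would also resolve the position conflicts and priorities discussed below.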
 Note that a plurality of related items may conflict by being assigned to one arrangement position. In such a case, the related item with the stronger cooperation may be given priority based on the priorities described above. Conversely, one related item may be assigned to a plurality of arrangement positions and thus duplicated. In preparation for such cases, priorities may also be determined separately for each cooperation group to indicate which cooperation is preferentially preserved (for example, the cooperation number itself may serve as the priority), and the arrangement positions may be determined so that the related items of higher-priority cooperation groups are adjacent to one another as far as possible.
 The operation screen processing unit 24 generates the operation screen by arranging the icons corresponding to the related items according to the arrangement determined by the icon arrangement determination unit 33.
 FIG. 18 is a diagram showing a specific example of the operation screen obtained as a result of the operation screen processing unit 24 arranging the icons according to the determination of the icon arrangement determination unit 33.
 The icon 90 corresponds to the related item "A: send by e-mail" and is created from the icon image "3: mail" associated with the related item A. The icon 91 corresponds to the related item "D: best friend information" and is created from the icon image "15: avatar" associated with the related item D. This icon image is preferably an icon of the avatar of the specific person registered as a best friend, so that the user can understand at a glance which person the icon indicates. The icon 92 corresponds to the related item "E: photo" and is created from the icon image "16: thumbnail" associated with the related item E. This icon image is preferably the thumbnail image of the photo itself made into an icon, so that the user can understand at a glance which photo it is.
 As shown in FIG. 18, the icons 90, 91, and 92 of the mutually cooperating related items A, D, and E are arranged next to one another. Because related icons are grouped close together in this way, the user can easily grasp the relationships among the icons even though each icon is on an equal footing with respect to the object. Moreover, when it is possible to enclose these three cooperating icons to trigger a further operation, the three icons, being gathered close together, can be enclosed easily, which improves the operability.
 Alternatively, the icon arrangement determination unit 33 may determine the spacing between icons according to "cooperation".
 FIG. 19 is a diagram showing another specific example of the operation screen obtained as a result of the operation screen processing unit 24 arranging the icons according to the determination of the icon arrangement determination unit 33.
 In this example, after classifying each related item into "cooperation number 1" to "cooperation number 3" as shown in FIG. 17, the icon arrangement determination unit 33 decides, for each cooperation class, to arrange the icons of the related items belonging to that class with reduced spacing between them.
 That is, as shown in FIG. 19, the icon arrangement determination unit 33 determines an arrangement in which a cluster of icons is formed for each cooperation class. When one related item belongs to a plurality of classes, multiple copies of it may be arranged so that it belongs to several clusters.
 Although such an arrangement makes each individual icon harder to see, it makes it possible to visualize the cooperation between icons (related items), which the user can then grasp easily. Moreover, when it is possible to enclose a plurality of icons to trigger a further operation, this arrangement makes it easier to remind the user that "a plurality of icons can be enclosed together".
 When the object specifying unit 22 has specified a plurality of objects (or icons), the cooperation processing execution unit 26 executes cooperation processing in consideration of the cooperation between the related items corresponding to those icons. The cooperation processing execution unit 26 may realize the cooperation processing by controlling each execution processing unit so that a plurality of execution processing units operate in conjunction with one another, or by controlling one or more execution processing units so that a plurality of processes are executed in a meaningful order.
 For example, as shown in FIG. 18 or FIG. 19, when the icon 90 of the related item "A: send by e-mail", the icon 91 of the related item "D: best friend information", and the icon 92 of the related item "E: photo" are enclosed at once, the cooperation processing execution unit 26 controls an execution processing unit (not shown) to carry out a series of processes: starting the mail application, attaching the photo indicated by the icon 92 to a mail, and sending it to the e-mail address of the person indicated by the icon 91.
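 This combined execution can be thought of as sorting the enclosed items by their roles (operation, target, partner) and then dispatching one composite action. The role names and the handler below are illustrative assumptions, not the claimed implementation of the execution processing units:

```python
def run_cooperation(selected, roles, handlers):
    """Order enclosed related items by role and dispatch one composite
    action. 'roles' and 'handlers' are assumed lookup tables."""
    plan = {roles[item]: item for item in selected}
    op = plan.get("operation")
    if op is None:
        return None  # nothing to execute without an operation item
    return handlers[op](target=plan.get("target"),
                        partner=plan.get("partner"))

def send_mail(target, partner):
    # Stand-in for: start the mail app, attach the photo, send to the friend.
    return f"mail sent to {partner} with {target} attached"
```

 Enclosing the three icons of the example above would then invoke the mail handler once with the photo as the target and the best friend as the partner.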
 According to the above configuration, the tablet terminal 100 can present cooperating icons to the user in a way that makes them easy to enclose together, and when a plurality of icons are enclosed and they do cooperate, it can execute the plurality of processes together according to the cooperation information.
 Thereby, the tablet terminal 100 can display the final result desired by the user in a natural flow that does not contradict the user's intuition, with a simple contact operation and a small number of actions. As a result, the tablet terminal 100 equipped with a touch panel achieves the effect of realizing excellent operability.
 << Embodiment 4 >>
 Another embodiment of the information processing apparatus according to the present invention is described below with reference to FIGS. 20 to 25. For convenience of explanation, members having the same functions as those in the drawings described in Embodiment 1, 2, or 3 are given the same reference numerals, and descriptions overlapping with Embodiment 1, 2, or 3 are omitted.
 In Embodiment 3 described above, the icon arrangement determining unit 33 determines the order or spacing of the icons in consideration of the interoperability of the related items. In the present embodiment, when the related items correspond to devices in the vicinity of the tablet terminal 100, the icon arrangement determining unit 33 determines the arrangement of the icons in consideration of the positional relationship between the tablet terminal 100 and the peripheral devices.
 [Functions of the tablet terminal]
 FIG. 20 is a functional block diagram illustrating the main configuration of the tablet terminal 100 according to the present embodiment.
 Compared with the tablet terminal 100 of Embodiment 1 (FIG. 1), the tablet terminal 100 according to the present embodiment further includes, in the control unit 10, a device direction specifying unit 27 and an icon arrangement determining unit 33 as functional blocks. The storage unit 19 further includes a device direction storage unit 45.
 Here, although not indispensable in the present embodiment, the storage unit 19 may further include the contact information storage unit 44, and the control unit 10 may further include, as functional blocks, the gesture determination unit 25, the cooperation processing execution unit 26, the ring shape determining unit 30, the icon order determining unit 31, and the animation determining unit 32, as necessary.
 In the present embodiment, the tablet terminal 100 further includes, as hardware components not shown in FIG. 2, a position detection unit and a direction detection unit.
 The position detection unit detects the current position of the tablet terminal 100 and is composed of, for example, an antenna for measuring distances to satellites or base stations, and a signal processing unit that processes the signals received by the antenna to generate position information indicating the current position of the tablet terminal 100. The position information acquired by the position detection unit is supplied to the device direction specifying unit 27 of the control unit 10. The position detection unit is realized by, for example, GPS (Global Positioning System) or an existing indoor positioning system. When an indoor positioning system is employed, the position detection unit communicates with a plurality of base stations via a WLAN (Wireless Local Area Network) or the like and measures distances to detect the current position within a specific indoor space. The antenna of the position detection unit may be shared with the antenna of the wireless communication unit 16 or may be provided separately.
 The direction detection unit detects which compass direction the tablet terminal 100 faces in the horizontal plane. For example, the direction detection unit is composed of a geomagnetic sensor that identifies the direction of north by detecting the geomagnetism in the front-rear and left-right directions, and a signal processing unit that processes the signals detected by the geomagnetic sensor to generate direction information indicating the direction the tablet terminal 100 is facing. The direction information acquired by the direction detection unit is supplied to the device direction specifying unit 27 of the control unit 10. Geomagnetic sensors come in two-axis and three-axis types, and either type may be adopted in the tablet terminal 100 of the present invention. However, when a two-axis geomagnetic sensor is employed, the user needs to hold the device horizontally with the touch panel (display unit 12) facing up, at least while the tablet terminal 100 is performing the process of specifying its own direction.
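 As a rough sketch of how such a two-axis geomagnetic sensor can yield a heading, consider the following; the axis and sign conventions here are assumptions for illustration and are not taken from the embodiment:

```python
import math

def heading_from_magnetometer(mag_x, mag_y):
    """Estimate the compass heading (degrees clockwise from north)
    from the front-rear (mag_x) and left-right (mag_y) geomagnetic
    components of a two-axis sensor held horizontally."""
    # atan2 gives the angle of the horizontal field vector; negating
    # mag_y converts the mathematical (counterclockwise) angle into
    # a clockwise compass bearing.
    heading = math.degrees(math.atan2(-mag_y, mag_x))
    return heading % 360.0
```

 A three-axis sensor removes the requirement of holding the device horizontally, since the horizontal field components can be recovered by tilt compensation.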
 In the present embodiment, the wireless communication unit 16 performs wireless communication with external peripheral devices and receives position information from each peripheral device. The position information of each peripheral device received by the wireless communication unit 16 is supplied to the device direction specifying unit 27.
 The device direction specifying unit 27 specifies the direction of the tablet terminal 100 and specifies in which direction each peripheral device lies as viewed from the tablet terminal 100. Specifically, the device direction specifying unit 27 acquires position information from the position detection unit described above and specifies the current position of the tablet terminal 100 based on that position information. The device direction specifying unit 27 then acquires position information from each peripheral device via the wireless communication unit 16 and specifies the current position of each peripheral device. The device direction specifying unit 27 also acquires direction information from the direction detection unit described above and specifies the direction the tablet terminal 100 is currently facing. From these, the device direction specifying unit 27 generates device direction information indicating in which direction each peripheral device lies as viewed from the tablet terminal 100. The device direction information generated by the device direction specifying unit 27 is stored in the device direction storage unit 45 and is read out as necessary by the units of the operation screen processing unit 24.
 Next, the operation of the device direction specifying unit 27 is described in detail with reference to the specific examples shown in FIGS. 21 and 22. FIG. 21 is a diagram illustrating an example of the usage environment when the user U of the tablet terminal 100 is using the tablet terminal 100 in an indoor room. The compass directions in FIG. 21 are not particularly limited, but for ease of understanding it is assumed that the upward direction in the figure is north. In the following, the side of the tablet terminal 100 body shown in FIG. 3 on which the wireless communication unit 16 is provided is referred to as the top (or upper side) of the tablet terminal 100, and the side on which the voice input unit 18 is provided is referred to as the bottom (or lower side). When expressing the horizontal orientation of the tablet terminal 100, the direction of the tablet terminal 100 (the direction the tablet terminal 100 is facing) refers to the direction its upper side faces. That is, in the example shown in FIG. 21, since the upper side of the tablet terminal 100 faces north, the direction the tablet terminal 100 is currently facing is "north".
 In the usage environment shown in FIG. 21, the user U holds and uses the tablet terminal 100 so that the surface of the display unit 12 is horizontal. The tablet terminal 100 faces north. Around the tablet terminal 100, the digital television 1 is located roughly to the north, the digital photo frame 3 roughly to the northeast, the printer 2 roughly to the east, and the personal computer 4 roughly to the southeast. In the present embodiment, each of the peripheral devices (the digital television 1, the printer 2, the digital photo frame 3, and the personal computer 4) can communicate wirelessly with the tablet terminal 100, and has the function of detecting its own current position and transmitting its position information to the tablet terminal 100.
 FIG. 22 is a diagram illustrating a specific example of the device direction information generated by the device direction specifying unit 27. The device direction specifying unit 27 communicates with each peripheral device via the wireless communication unit 16 and acquires the position information of each peripheral device. Based on the acquired position information, the device direction specifying unit 27 plots each peripheral device in a predetermined coordinate system. In the present embodiment, as shown in FIG. 22, the device direction specifying unit 27 holds, in the device direction storage unit 45, a coordinate system (Xr, Yr) corresponding to the room shown in FIG. 21 (with the northwest corner of the room as the origin), and plots the current position of each peripheral device in this coordinate system. In the example shown in FIG. 22, each peripheral device is drawn as a block for explanation, but the device direction specifying unit 27 only needs to grasp the position of each peripheral device as a single point plotted in the coordinate system.
 The device direction specifying unit 27 further acquires the position information of the tablet terminal 100 from the position detection unit of its own device to specify the current position of the tablet terminal 100, and acquires the direction information of the tablet terminal 100 from the direction detection unit to specify the direction of the tablet terminal 100. The position of the tablet terminal 100 likewise only needs to be grasped as a single point plotted in the coordinate system. As shown in FIG. 22, the device direction specifying unit 27 plots the tablet terminal 100 at the specified position in the specified orientation. From the position and direction of the tablet terminal 100, the coordinate system (X, Y) corresponding to the touch panel of the tablet terminal 100 (with the upper left corner of the touch panel screen as the origin), or the ring displayed on the display unit 12 of the touch panel, can be defined within the room coordinate system.
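 As a rough illustration, the device direction information can be thought of as the bearing of each peripheral device relative to the tablet's own heading, computed from points in the room coordinate system (Xr, Yr). The function name and the axis convention below are assumptions for the sketch, not taken from the embodiment:

```python
import math

def relative_bearing(tablet_xy, tablet_heading_deg, device_xy):
    """Angle of a peripheral device as seen from the tablet, in degrees
    clockwise from the tablet's own upper side.

    tablet_heading_deg is the compass heading of the tablet's upper
    side (0 = north). Coordinates are (east, north) pairs in the room
    frame; this axis convention is an assumption for the sketch."""
    dx = device_xy[0] - tablet_xy[0]   # east offset
    dy = device_xy[1] - tablet_xy[1]   # north offset
    absolute = math.degrees(math.atan2(dx, dy)) % 360.0  # bearing from north
    return (absolute - tablet_heading_deg) % 360.0

# Tablet facing north; a device due northeast appears at 45 degrees
# clockwise from the top of the screen.
angle = relative_bearing((0.0, 0.0), 0.0, (1.0, 1.0))
```

 Subtracting the tablet's own heading is what makes the on-screen direction track the real-world direction even when the user rotates the device.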
 The device direction information thus generated by the device direction specifying unit 27 is stored in the device direction storage unit 45 and read out by the operation screen processing unit 24.
 By reading out this device direction information, the operation screen processing unit 24 can specify the direction of each peripheral device. In particular, the icon arrangement determining unit 33 of the operation screen processing unit 24 can determine the arrangement position of the icon related to each peripheral device so as to correspond to the direction in which that peripheral device exists.
 For example, as shown in FIG. 22, assume that an elliptical ring (the thick broken line in FIG. 22) is defined based on the specified position and direction of the tablet terminal 100, with the tablet terminal 100 regarded as the central object. In this case, the icon arrangement determining unit 33 can determine, as the arrangement position of the icon for a peripheral device, the position where the straight line connecting the point representing the position of the tablet terminal 100 and the point representing the position of that peripheral device intersects the outline of the ring.
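 This line-outline intersection can be sketched minimally, assuming an axis-aligned ellipse centered on the tablet (all names here are illustrative):

```python
import math

def icon_position_on_ellipse(center, semi_x, semi_y, device_xy):
    """Point where the ray from the ellipse center toward a peripheral
    device crosses the ellipse outline (x/a)^2 + (y/b)^2 = 1."""
    dx = device_xy[0] - center[0]
    dy = device_xy[1] - center[1]
    # Scale factor t such that (t*dx, t*dy) lies on the ellipse:
    # (t*dx/a)^2 + (t*dy/b)^2 = 1
    t = 1.0 / math.sqrt((dx / semi_x) ** 2 + (dy / semi_y) ** 2)
    return (center[0] + t * dx, center[1] + t * dy)

# A device due east of the center lands on the ellipse's right edge.
pos = icon_position_on_ellipse((0.0, 0.0), 200.0, 100.0, (500.0, 0.0))
```

 Because the icon is placed on the ray from the center toward the device, its on-screen direction matches the device's real direction regardless of how far away the device actually is.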
 FIG. 23 is a diagram illustrating a specific example of the related information on peripheral devices stored in the related information storage unit 42. In the present embodiment, for related items concerning peripheral devices, related information indicating the correspondence between the device ID of each peripheral device and its icon is further stored, as shown in FIG. 23.
 When related items concerning the peripheral devices "digital television 1", "printer 2", "digital photo frame 3", and "personal computer 4" are extracted, the operation screen processing unit 24 refers to the related information shown in FIG. 23 and reads the icon image associated with each peripheral device from the icon storage unit 43. The operation screen processing unit 24 then generates the operation screen by arranging the read icon images at the arrangement positions determined by the icon arrangement determining unit 33.
 FIG. 24 is a diagram illustrating a specific example of the operation screen obtained as a result of the operation screen generation process executed by the operation screen processing unit 24 in the present embodiment.
 As shown in FIG. 24, the operation screen processing unit 24 arranges the icon of each peripheral device according to the icon arrangement positions that the icon arrangement determining unit 33 determined based on the device direction information shown in FIG. 22, and generates the operation screen. As a result, the position of each peripheral device's icon displayed on the display unit 12 corresponds to the actual direction in which that peripheral device exists, as shown in FIG. 21. Specifically, the top of the touch panel screen corresponds to north in the example shown in FIG. 21. The icon of the digital television 1 is then displayed roughly to the north of the central object, matching its actual direction; the icon of the digital photo frame 3 roughly to the northeast; the icon of the printer 2 roughly to the east; and the icon of the personal computer 4 roughly to the southeast.
 According to the above configuration, the icons of the related items (peripheral devices) can be displayed in a ring corresponding to the actual directions in which the peripheral devices exist. Therefore, it becomes possible to provide the user with an operation screen that is even easier to understand and does not contradict the user's intuition, and to realize a user interface in which the next operation can be continued in an even more natural flow.
 When the operation that follows is a contact operation such as described below, the above configuration is particularly effective.
 FIG. 25 is a diagram illustrating a specific example of the operation screen obtained, following the operation screen shown in FIG. 24, as a result of the operation screen generation process executed by the operation screen processing unit 24.
 Each peripheral device icon shown in FIG. 24 represents the related item of transferring the central object (here, the object "Photo 1") to that peripheral device when the icon is selected.
 In the present embodiment, the contact operation for selecting an icon as the following operation is an operation of selecting the object 80 with a finger, dragging it to the desired icon (the transfer destination peripheral device), and releasing it.
 The contact operation described above is linked with the operation of transmitting data (the object 80) to a peripheral device, and is easy for the user to understand intuitively. In addition, according to the tablet terminal 100 of the present embodiment, the icons are displayed in correspondence with the actual positional relationship between the tablet terminal 100 and each peripheral device (the direction in which each peripheral device actually exists). Therefore, the user can even more intuitively grasp the relationship between the user's own contact operation and the information processing that results from it. For example, if the user wants to transfer a photograph to the digital television 1, the user only has to drag the photograph object toward the direction in which the digital television 1 actually is.
 As a result, it is possible to provide an operation screen that can be operated in a natural flow that does not contradict the user's intuition.
 << Modifications >>
 A tablet terminal 100 that combines Embodiments 1 to 4 described above as appropriate also falls within the scope of the present invention. That is, the control unit 10 of the tablet terminal 100 according to each of Embodiments 1 to 4 may include some or all of the gesture determination unit 25, the cooperation processing execution unit 26, and the device direction specifying unit 27 even where they are not essential in a given embodiment. Likewise, the operation screen processing unit 24 of the control unit 10 may include some or all of the ring shape determining unit 30, the icon order determining unit 31, the animation determining unit 32, and the icon arrangement determining unit 33.
 (Object and ring arrangement 1)
 In each of the embodiments described above, when arranging icons, the operation screen processing unit 24 places the selected object (for example, the object 80 in FIGS. 8(a) and 8(b)) at the center of the touch panel screen, and the ring shape determining unit 30 determines the position of the ring so that the icons are arranged around the central object 80. However, the configuration of the present invention is not limited to this. The operation screen processing unit 24 may maintain the display position of the selected object 80 as it is; even in this case, the ring shape determining unit 30 may determine the position and size of the ring so that the ring shape is displayed large at the center of the screen, as shown in FIG. 26. FIG. 26 shows one modification of the method for displaying icons of related items.
 (Object and ring arrangement 2)
 Alternatively, in a configuration in which the display position of the selected object 80 is maintained as it is, the ring shape determining unit 30 may determine the position and size of the ring shape so that the object 80 at its original position is at the center of the ring, as shown in FIG. 27. FIG. 27 shows one modification of the method for displaying icons of related items.
 (Animation of the ring shape)
 Alternatively, an animation may be applied to the ring shape on which the icons are arranged. In this case, the operation screen processing unit 24 includes the animation determining unit 32.
 The animation determining unit 32 determines the animation to be applied to all items arranged on the operation screen, that is, objects, icons, rings, and so on. This makes it possible to add a visual effect (that is, an animation) to the way objects and icons are displayed.
 Note that the animation determining unit 32 may apply not only motion to an object, icon, or ring, but also visual effects such as fade-in (a change in transparency).
 The animation determining unit 32 may also animate the icons so that, instead of appearing on the outline of the ring from the beginning, they start from different places and finally come to rest on the outline. For example, the animation determining unit 32 may gather the icons around the center of the ring and then apply a motion in which the icons spread outward, so that each icon finally ends up arranged on the outline of the ring.
 (Object and ring arrangement 3: animation of the ring shape)
 In a configuration in which the display position of the selected object 80 is maintained as it is, if the ring shape is placed large at the center of the screen regardless of the object 80, there is the problem that the relationship between the event of "enclosing the object 80 to display the icons" and the finally obtained result (for example, FIG. 26) becomes weak. On the other hand, in a configuration in which the display position of the object 80 is maintained as it is and the icons are displayed around the display position of the object 80, depending on where the object 80 was originally displayed, the ring shape may not fit on the screen at a sufficient size, and as a result the visibility of the icons decreases.
 Therefore, as shown in FIG. 28, the animation determining unit 32 may solve these problems by applying an animation to the ring on which the icons are arranged. Specifically, the animation determining unit 32 applies an animation to the ring so that the ring shape determined by the ring shape determining unit 30 based on the original display position and size of the object 80 is, over a certain period of time, enlarged and placed at the center of the screen. As a result, the ring of icons, once arranged small around the object 80, gradually changes its shape over time and is finally placed large at the center of the screen. FIG. 28(a) shows the ring of icons initially arranged small around the object 80, FIG. 28(b) shows the ring of icons in the middle of expanding, and FIG. 28(c) shows the ring of icons finally placed large at the center of the screen, after passing from (a) through (b).
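 Such a transition can be sketched as interpolation between the initial ring around the object and the final ring at the screen center; the names and the linear easing below are assumptions for illustration, not taken from the embodiment:

```python
def interpolate_ring(start, end, progress):
    """Ring state (center_x, center_y, semi_x, semi_y) at a given
    progress in [0, 1], between the small ring around the object
    (progress 0) and the large centered ring (progress 1)."""
    p = max(0.0, min(1.0, progress))  # clamp so the animation settles
    return tuple(s + (e - s) * p for s, e in zip(start, end))

small_ring = (320.0, 400.0, 60.0, 40.0)    # around the object's original position
large_ring = (400.0, 300.0, 300.0, 200.0)  # centered and enlarged
halfway = interpolate_ring(small_ring, large_ring, 0.5)
```

 A non-linear easing curve could be substituted for the linear `p` to make the expansion start slowly and settle smoothly, without changing the endpoints.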
 According to the above configuration, the icons can be displayed at a sufficient size without impairing the relationship between the user's contact operation and the displayed result.
 Note that the animation determining unit 32 may apply an animation in which the size of each individual icon is also gradually enlarged in accordance with the size of the ring, or may keep the icon size constant, independently of the ring size, and apply an animation in which the spacing between the icons is gradually widened.
 (Icon display timing)
 In each of the embodiments described above, the operation screen processing unit 24 arranges a plurality of icons around the selected object simultaneously. However, the present invention is not limited to this; when the operation screen processing unit 24 includes the animation determining unit 32, the animation determining unit 32 may determine the display timing of each icon.
 FIG. 29 is a diagram showing one modification of the method for displaying icons of related items.
 For example, assume that contact information as shown in FIG. 29(a) is obtained from the user's "enclosing" contact operation. The animation determining unit 32 refers to the contact information stored in the contact information storage unit 44 and recognizes that this "enclosing" gesture occurred clockwise from t0 to tn. To match this motion, the animation determining unit 32 decides to make the first through eighth icons appear one by one, clockwise, at regular intervals. For example, the animation determining unit 32 controls the icon display timing so that the icons appear in order at regular intervals as in FIGS. 29(b), (c), (d), (e), and so on, finally reaching the operation screen shown in FIG. 29(f).
 According to the above configuration, the result appears with roughly the same motion as the user's enclosing motion (clockwise), so the relationship between the user's contact operation and the displayed result can be further strengthened; as a result, the operation screen can be provided in a natural flow that does not contradict the user's intuition.
 Here, it is preferable that the icon arrangement determining unit 33 roughly aligns the display position of the first icon with the start point of the finger's trajectory (the contact position at t0), and the display position of the last icon with its end point (the contact position at tn).
 This links the contact operation and the result even more closely, so that the operation screen can be provided in an even more natural flow.
 (Icon display timing 2)
 Furthermore, the animation determination unit 32 preferably makes the icons appear one after another in accordance not only with the direction of the finger's movement (clockwise or counterclockwise) but also with the speed at which the finger moved while enclosing the object.
 Specifically, suppose that contact information as shown in FIG. 30(a) is obtained. This contact information describes a trajectory that encircles the object clockwise from time t0 to time tn. In more detail, the contact information shows that the contact position (the fingertip) was to the left of the object at time ta, at the upper left of the object at time tb, and at the upper right of the object at time tc.
 Accordingly, when eight icons are to be arranged at equal intervals on an ellipse, the animation determination unit 32 decides to make the first icon appear directly below the object at time t0 and then, matching the finger's speed, to have the icons appear up to the position at the left of the object (the third icon) by time ta, up to the upper left of the object (the fourth icon) by time tb, up to the upper right of the object (the sixth icon) by time tc, and finally to have all of the icons appear by time tn. FIG. 30(b) shows the operation screen at time t0, FIG. 30(c) the screen at time ta, FIG. 30(d) the screen at time tb, FIG. 30(e) the screen at time tc, and FIG. 30(f) the screen at time tn.
 With the above configuration, the resulting screen is built up with roughly the same motion (clockwise) and the same speed as the user's enclosing gesture, which strengthens the association between the user's contact operation and the displayed result even further. As a consequence, the operation screen can be presented in a natural flow that does not run counter to the user's intuition.
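One way to realize the speed matching above is to record, from the stored contact information, how far around the loop the finger had travelled at each timestamp, and then assign each equally spaced icon the time at which the finger passed the corresponding fraction of the loop. The sketch below is illustrative only; the sampling format and the linear interpolation are assumptions, not part of the disclosure.

```python
def speed_matched_schedule(num_icons, samples):
    """Compute per-icon appearance times from the gesture's own timing.

    samples: list of (t_ms, fraction_of_loop_completed), monotonically
    increasing in both fields, e.g. derived from the contact information.
    Icon k (of num_icons placed at equal angular intervals) appears at
    the time the finger reached fraction k/(num_icons-1) of the loop,
    linearly interpolated between samples."""
    times = []
    for k in range(num_icons):
        frac = k / (num_icons - 1) if num_icons > 1 else 0.0
        for (t0, f0), (t1, f1) in zip(samples, samples[1:]):
            if f0 <= frac <= f1:
                if f1 == f0:
                    times.append(t0)
                else:
                    times.append(t0 + (t1 - t0) * (frac - f0) / (f1 - f0))
                break
        else:
            times.append(samples[-1][0])  # past the last sample
    return times
```

With samples showing a slow start and fast finish, the early icons are spaced far apart in time and the later ones appear in quick succession, mirroring the finger's acceleration.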
 (Icon arrangement 1)
 The icon arrangement determination unit 33 determines which icon is placed at which position, based on the attributes of the extracted related items. Here, the icon arrangement determination unit 33 may determine the arrangement position of each related item's icon by taking into account the "operation attribute", which is one of the attributes of a related item.
 The "operation attribute" of a related item is information indicating whether the item's operation acts on another device or completes entirely within the device itself. An example of the former is an operation that transmits data to another device; an example of the latter is an operation that displays data on the device's own display unit.
 FIG. 6 shows a concrete example of the "operation attribute". The related-information table shown in FIG. 6 has an "operation attribute" field. In the example of FIG. 6, when a related item's operation is "an operation that transmits data to another device", identification information "transmit-to-other-device operation" indicating this is stored in association with that related item. Likewise, when a related item's operation is "an operation that completes within the device itself", identification information "in-device operation" indicating this is stored in association with that related item.
 FIG. 31 shows a variation of the icon arrangement pattern for related items.
 The icon arrangement determination unit 33 first looks up, in the related information storage unit 42, the "operation attribute" of each related item extracted by the related item extraction unit 23. As shown in FIG. 31, the icon arrangement determination unit 33 then decides to place the icons of related items whose "operation attribute" is "transmit-to-other-device operation" (icons 70 to 72) in the upper half (or upper third) of the ring, that is, above the object 80. It decides to place the icons of related items whose "operation attribute" is "in-device operation" (icons 74 to 76) in the lower half (or lower third) of the ring, that is, below the object 80. If there are related items that belong to neither "transmit-to-other-device operation" nor "in-device operation", the icon arrangement determination unit 33 decides to place their icons (icons 73 and 77) in the remaining free space (or the middle third).
 As shown in FIG. 25, to transmit the data represented by the central object 80 to another device, the user next performs the contact operation of touching the object 80 and dragging it onto the icon that represents transmission to the other device.
 Therefore, if the icons related to transmitting to another device are placed above the central object, the user performs a gesture of pushing something away from themselves in order to make the tablet terminal 100 carry out the "transmit to another device" operation (information processing).
 Intuitively, the operation of "transmitting to another device" is more strongly associated with a gesture of moving something away from oneself than with a gesture of pulling something toward oneself.
 With the above configuration, the association between the user's contact operation and the obtained result can be strengthened further, and as a result the operation screen can be presented in a natural flow that does not run counter to the user's intuition.
 Note that the "operation attribute" information may be a flag indicating whether or not the item is a "transmit-to-other-device operation"; for example, the icon arrangement determination unit 33 may decide to place related items whose "operation attribute" is "TRUE", i.e. "transmit-to-other-device operation", in the upper half of the ring.
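The attribute-based split above can be sketched as follows. This is an illustrative Python sketch under assumed names ("send_to_other", "local"); in particular, reusing the seam angles for unclassified icons is a simplification of "the remaining free space", not the disclosed layout.

```python
import math

def place_by_operation_attribute(items, cx, cy, r):
    """Place icons on a ring of radius r around (cx, cy).

    items: list of (name, attr) where attr is 'send_to_other', 'local',
    or None.  'send_to_other' icons go on the upper half of the ring
    (above the object), 'local' icons on the lower half; the rest land
    near the seams.  Returns {name: (x, y)} with screen y growing down."""
    send  = [n for n, a in items if a == "send_to_other"]
    local = [n for n, a in items if a == "local"]
    other = [n for n, a in items if a not in ("send_to_other", "local")]

    def arc(names, start_deg, end_deg):
        pos = {}
        for i, n in enumerate(names):
            ang = math.radians(
                start_deg + (end_deg - start_deg) * (i + 1) / (len(names) + 1))
            # y decreases upward on screen, hence the minus sign
            pos[n] = (cx + r * math.cos(ang), cy - r * math.sin(ang))
        return pos

    placed = {}
    placed.update(arc(send, 0, 180))      # upper half
    placed.update(arc(local, 180, 360))   # lower half
    placed.update(arc(other, 160, 200))   # simplification: near the seam
    return placed
```

A "transmit" icon thus always ends up above the object's centre and an "in-device" icon below it, matching the push-away / pull-toward intuition described above.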
 (Icon arrangement 2)
 In addition, when each of the extracted related items includes "temporal element" information as one of its attributes, the icon arrangement determination unit 33 may decide to arrange the icons of the related items in chronological order.
 For example, when the related items are photo data and each photo is associated with "shooting date and time" information, the icon arrangement determination unit 33 may decide to place the icon of the first photo (here, a thumbnail image of the photo is preferable) at the top center of the ring and, starting from there, arrange the remaining icons clockwise at even intervals in order of shooting date and time.
 Alternatively, the icon arrangement determination unit 33 may treat the ring on which the icons are arranged as a clock face and decide to place each icon at the position corresponding to the time indicated by that related item's "temporal element" information.
 FIG. 32 shows a variation of the icon arrangement pattern for related items.
 For example, when an album is selected as the object 80, the icon arrangement determination unit 33 arranges the photos contained in that album on the screen. In doing so, the icon arrangement determination unit 33 places, for example, a photo (or group of photos) 78 taken around one o'clock on the ring's outline at the one-o'clock position when the ring is treated as a clock face, and a photo (or group) 79 taken around three o'clock on the outline at the three-o'clock position.
 With the above configuration, the shooting time of each photo in the album can be expressed by the photo's position on the ring, and the final result the user is after can be displayed with a simple contact operation and a small number of actions.
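The clock-face mapping above amounts to converting a timestamp to an angle on a 12-hour dial. A minimal sketch (illustrative Python; the coordinate convention of 12 o'clock at the top with screen y growing downward is an assumption):

```python
import math

def clock_position(hour, minute, cx, cy, r):
    """Map a shooting time to a point on a ring treated as a clock face.

    12 o'clock is at the top of the ring and angles advance clockwise,
    as on a real dial.  Screen y grows downward, hence cy - r*cos."""
    frac = ((hour % 12) + minute / 60.0) / 12.0  # fraction of the dial
    ang = frac * 2 * math.pi                      # 0 at 12 o'clock
    x = cx + r * math.sin(ang)
    y = cy - r * math.cos(ang)
    return (x, y)
```

A photo taken at 3:00 lands exactly to the right of the centre, one taken around 1:00 in the upper-right region, matching photos 78 and 79 in FIG. 32.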
 (Inferring the user's usage situation)
 As shown in FIG. 4(a) and (b), when the tablet terminal 100 is a small portable terminal that can be operated with either one hand or both hands, the area of the screen that the user's fingers can reach is expected to differ between one-handed and two-handed operation. As shown in FIG. 4(b), with two-handed operation any area of the touch panel can be touched. With one-handed operation, on the other hand, as shown in FIG. 4(a), the contact positions tend to be concentrated in the lower-left area of the screen (when operating with the left hand) or the lower-right area (when operating with the right hand). When the user is using the tablet terminal 100 in such a situation, displaying an object or icon that requires a contact operation at the top of the screen, or at the bottom of the screen on the side opposite the operating hand, makes operation cumbersome: the user cannot immediately touch the target object and must either perform the extra action of dragging it into the reachable area or switch to two-handed operation.
 The tablet terminal 100 of the present invention therefore solves this problem by inferring the user's usage situation. Specifically, the tablet terminal 100 of the present invention can be configured to detect a bias in the finger's contact positions and to place the icons within the area that the user's finger is presumed to reach immediately.
 In this variation, the contact information generation unit 21 of the tablet terminal 100 generates contact information describing the user's contact operations over a predetermined period (for example, several seconds to several minutes) and stores it in the contact information storage unit 44, regardless of contact/non-contact transitions and regardless of whether the contact operation is an "enclose" gesture.
 FIG. 33 illustrates the operation of the tablet terminal 100 of the present invention when it presents an operation screen adapted to the user's usage situation. In more detail, FIG. 33(a) illustrates an example of a situation in which the user is operating with the left hand, and FIG. 33(b) shows a concrete example of the contact information generated from the contact operations in FIG. 33(a).
 As shown in FIG. 33(a), suppose, for example, that the user flicks with the thumb to pull the target object into the reachable area and then performs the contact operation of enclosing it.
 In this variation, the contact information generation unit 21 generates contact information as shown in FIG. 33(b) for this series of contact operations over the predetermined period (for example, the past several seconds to several minutes) and stores it in the contact information storage unit 44. When the memory capacity of the contact information storage unit 44 that stores the trajectories is limited, the contact information generation unit 21 may be configured to delete the oldest trajectory each time a new one is stored.
 Now suppose the gesture determination unit 25 determines that an "enclose" gesture has occurred. The operation screen processing unit 24 goes back through the contact information for the past several seconds to several minutes stored in the contact information storage unit 44 and checks whether the finger's contact positions are biased. In the example shown in FIG. 33(a) and (b), the finger's trajectories are concentrated in the area 82 at the lower left of the screen. The operation screen processing unit 24 detects this bias and identifies the user's reachable area as the lower-left area 82. The lower-left area 82 and the lower-right area 83 are assumed to be defined in advance.
 The ring shape determination unit 30 of the operation screen processing unit 24 determines the shape, size, and position of the ring on which the icons are arranged so that the ring fits within the lower-left area 82.
 FIG. 34 shows an example of the operation screen when the icons are arranged according to the ring shape determined by the ring shape determination unit 30. As shown in FIG. 34, the related items of the selected object 80 are displayed so as to fit within the lower-left area 82, so the user can select the next icon immediately, without having to drag the target icon closer with the thumb.
 To determine that the user is operating with one hand, the tablet terminal 100 may judge from the thickness of the finger's trajectory line whether the operation is being performed with a thumb; if it determines that a thumb is being used, it concludes that the terminal is being operated one-handed and displays the icons at the bottom of the screen. Alternatively, a sensor may be provided in the housing of the tablet terminal 100 so that the terminal can determine whether the housing is being gripped with four fingers or with five, and judge one-handed or two-handed operation accordingly.
 The operation screen processing unit 24 of the tablet terminal 100 may also refer to the contact coordinate information of the region enclosed by the finger's trajectory, identify that trajectory region, treat the region and its vicinity as the user's reachable area, and decide to place the ring of icons there.
 Alternatively, even when the ring of icons is not laid out to fit within area 82 or area 83, the tablet terminal 100 can, anticipating one-handed use, provide the user with an operation screen on which the target icon can easily be selected within the limited reachable area.
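The bias detection described above can be sketched as a simple majority count over the recent contact trace. Illustrative Python only: the region rectangles and the 80% dominance threshold are assumptions, not values from the disclosure.

```python
def detect_reachable_region(trace, regions):
    """Identify the user's reachable area from recent contact points.

    trace: recent contact points [(x, y), ...] from the contact
    information store.  regions: {name: (x0, y0, x1, y1)} predefined
    rectangles, e.g. lower-left / lower-right of the screen.  Returns
    the region containing the clear majority of recent contacts, or
    None when no region dominates (e.g. two-handed use)."""
    counts = {name: 0 for name in regions}
    for x, y in trace:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    best = max(counts, key=counts.get) if counts else None
    if best is not None and counts[best] > len(trace) * 0.8:
        return best
    return None
```

When a region is returned, the ring shape determination unit would size and position the ring to fit inside it; when None is returned, the ring can be placed freely.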
 FIG. 35(a) illustrates the user rotating the ring of icons by dragging within the reachable area, and FIG. 35(b) shows an example of the operation screen after the ring has been rotated by this "drag" contact operation and the icon arrangement has changed.
 In this variation, assuming a situation in which the reachable area is limited because the user is operating with one hand, the tablet terminal 100 presents the icons arranged in a ring so that they can be rotated along the ring's outline.
 For example, suppose that after the object 80 is selected, the ring of icons is displayed on the touch panel (display unit 12) in the arrangement shown in FIG. 35(a). Because the user is operating the tablet terminal 100 with the left hand only, the reachable area is limited (for example, to the area 82 shown in FIG. 33(b)).
 In this situation, if the user wanted to select the television icon placed near the top of the touch panel, they would have to switch to holding the terminal with both hands, or fix the tablet terminal 100 on a stand to free the left hand, which is inconvenient.
 In this variation, therefore, when the input unit 11 receives the user's "drag" contact operation, the gesture determination unit 25 determines the direction in which the finger moves during the drag and recognizes that an instruction to "rotate the ring of icons in the corresponding direction" has been input. The animation determination unit 32 of the operation screen processing unit 24 outputs to the display unit 12 an operation screen animated so that the icons rotate in the direction determined by the gesture determination unit 25.
 For example, on the operation screen shown in FIG. 35(a), the user keeps dragging near the lower right of the ring (here, from upper right toward lower left). With this contact operation, each icon rotates clockwise along the ring's outline. In this way, the user can pull the television icon, originally placed near the top of the touch panel, into the reachable area (for example, area 82), as shown in FIG. 35(b), and can then select the target television icon within that area.
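The rotation itself reduces to adding the angle swept by the drag (measured around the ring's centre) to every icon's angular position. A minimal sketch under assumed names; illustrative only:

```python
import math

def rotate_ring(angles_deg, drag_start, drag_end, center):
    """Rotate every icon's angular position by the angle swept by a drag.

    angles_deg: current icon angles on the ring, in degrees.
    drag_start, drag_end: (x, y) contact points of the drag.
    center: (x, y) centre of the ring.  Returns the new angles, so an
    icon placed out of reach can be pulled into the reachable area."""
    def angle_of(p):
        return math.degrees(math.atan2(p[1] - center[1], p[0] - center[0]))
    delta = angle_of(drag_end) - angle_of(drag_start)
    return [(a + delta) % 360 for a in angles_deg]
```

Calling this repeatedly while the drag continues yields the continuous rotation animation; the icon at 270° wraps around to a reachable angle rather than leaving the ring.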
 (Adjusting the number of displayed icons)
 When the icon arrangement determination unit 33 judges, taking into account the size of the determined ring shape and the size of the icons, that not all of the icons extracted by the related item extraction unit 23 can be displayed, it may decide to reduce the number of icons to be displayed.
 The icon arrangement determination unit 33 may also refer to the contact information and determine the number of icons to display based on the absolute size of the finger's trajectory (or of the enclosed region). This lets the user deliberately adjust the number of icons displayed next by choosing whether to enclose the object with a small loop or a large one.
 Alternatively, the icon arrangement determination unit 33 may decide not to display an icon for a particular related item, even one extracted by the related item extraction unit 23, based on the "condition" information that is one of the related item's attributes.
 FIG. 6 shows a concrete example of the "condition". The related-information table shown in FIG. 6 has a "condition" field. The "condition" attribute of a related item is information indicating the condition under which that related item's icon is displayed.
 In the example shown in FIG. 6, the "condition" "if attached information exists" is associated with each of the related items "display the photo's attached information" and "display the album's attached information". This condition prescribes that the icon is displayed if the photo has attached information and is not displayed if it has none.
 Accordingly, when a photo object is selected and the eight related items shown in FIG. 6 (related item group 60) are extracted, if the selected photo has no attached information, the icon arrangement determination unit 33 decides to arrange only the seven remaining related items on the ring, excluding the related item "display the photo's attached information".
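Combining the two filters above (the "condition" attribute and the gesture-size cap) could look like the following. Illustrative Python only: the condition key, the area threshold, and the halve-the-count mapping are assumptions for the sketch, not disclosed values.

```python
def items_to_display(related_items, photo_has_metadata, enclosed_area, max_full=8):
    """Filter related items by their 'condition' attribute, then cap the
    count according to the size of the enclosing gesture.

    related_items: list of (name, condition) rows as in the related-
    information table; condition None means 'always show'.  A small
    enclosing loop yields half as many icons as a large one (an
    illustrative mapping)."""
    shown = [n for n, cond in related_items
             if cond != "has_attached_info" or photo_has_metadata]
    limit = max_full if enclosed_area >= 10000 else max_full // 2
    return shown[:limit]
```

So a photo without attached information loses its "display attached information" icon, and a small loop around the object shows only the first few surviving items.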
 (Icon display priority)
 The operation screen processing unit 24 may further include an icon rank determination unit 31.
 The icon rank determination unit 31 assigns priorities to the related items extracted by the related item extraction unit 23, based on the attributes of the related items.
 For example, the related information includes a "selection frequency" field as one of the attributes of a related item. The "selection frequency" is information indicating how many times that related item has been selected by the user in the past.
 The icon rank determination unit 31 assigns priorities to the extracted related items in descending order of "selection frequency". The icon arrangement determination unit 33 can then determine the icon arrangement according to the priorities determined by the icon rank determination unit 31.
 For example, the icons can be arranged clockwise starting from the highest priority, or the icons of high-priority related items can be placed in the upper half of the ring.
 Alternatively, when reducing the number of icons to be displayed, the icon arrangement determination unit 33 can remove them starting from the lowest priority.
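The frequency-based ranking and low-priority-first trimming above can be sketched in a few lines (illustrative Python; the data shape is an assumption):

```python
def order_by_frequency(selection_counts):
    """Rank related items by past selection frequency.

    selection_counts: {item_name: times_selected}.  Returns names with
    the most frequently selected first; this is the display priority."""
    return sorted(selection_counts, key=selection_counts.get, reverse=True)


def trim_to(ranked_names, n):
    """Keep the n highest-priority items; icons dropped when ring space
    runs short are taken from the end (lowest priority first)."""
    return ranked_names[:n]
```

For example, with past counts tv=12, mail=7, print=3, the ring shows tv first, and if only two icons fit, print is the one dropped.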
 (Selecting multiple objects)
 The embodiments and variations above describe the case in which the user selects one object from the object list screen (for example, FIG. 5(a)). However, the tablet terminal 100 of the present invention may accept the selection of multiple objects on the object list screen. In this case, the tablet terminal 100 can extract related items according to the combination of the selected objects.
 FIG. 36(a) shows the user performing the "enclose" contact operation to select several target objects, and FIG. 36(b) shows a concrete example of the operation screen generated by the operation screen processing unit in response to the contact operation in FIG. 36(a).
 As shown in FIG. 36(a), suppose that the user encloses and thereby selects three photos, namely the objects 80, 84, and 85, on the object list screen. The object identification unit 22 identifies that the three photos, objects 80, 84, and 85, have been selected.
 Here, although multiple objects are selected, all of them are "photos". The related item extraction unit 23 therefore refers to the related information shown in FIG. 6 and extracts the related item group 60 associated with the object type "photo".
 As shown in FIG. 36(b), the operation screen processing unit 24 reads the icon images corresponding to the related item group 60 from the icon storage unit 43 and arranges them in a ring around the objects. Since three objects are selected here, the operation screen processing unit 24 displays the three selected photos (objects 80, 84, 85) together at the center of the ring. The user thus obtains an operation screen on which the items they actually enclosed appear in the center, with the icons related to them arranged in a ring around them; this makes it easy to grasp the relationship between what was enclosed and what was obtained as a result. In this variation, moreover, the three photos gathered in the center (objects 80, 84, 85) can be selected together with a single touch action. For example, if the user drags the group of three photos displayed in the center once (for example, onto the television icon), the tablet terminal 100 executes the process of transferring all three photos to the television. Because there is no need to drag the photos one by one, the number of user actions can be reduced.
 図37の(a)は、ユーザが種類の異なる複数のオブジェクトを選択するために、オブジェクトを「囲う」という接触動作を実施した様子を示す図である。図37の(b)は、図37の(a)に示す接触動作に応じて、操作画面処理部によって生成された操作画面の具体例を示す図である。図38は、本変形例における関連情報記憶部42に記憶されている関連情報の他の例を示す図である。 (A) of FIG. 37 is a diagram illustrating a state in which the user performs a contact operation of “enclosing” an object in order to select a plurality of different types of objects. FIG. 37B is a diagram illustrating a specific example of the operation screen generated by the operation screen processing unit in accordance with the contact operation illustrated in FIG. FIG. 38 is a diagram illustrating another example of the related information stored in the related information storage unit 42 in the present modification.
 図37の(a)に示すとおり、ユーザが、オブジェクト一覧画面から、3枚の写真(オブジェクト80、84、85)に加えて、写真とは異なる種類のオブジェクト、すなわち、音楽ファイル86を囲って選択したとする。オブジェクト特定部22は、オブジェクト80、84、85の3枚の写真と音楽ファイル86との合計4つのオブジェクトが併せて選択されたと特定する。 As shown in FIG. 37A, the user surrounds an object of a different type from the photograph, that is, the music file 86 in addition to the three photographs ( objects 80, 84, 85) from the object list screen. Suppose you select it. The object specifying unit 22 specifies that a total of four objects of three photos 80, 84, and 85 and a music file 86 have been selected.
 ここでは、選択されたオブジェクトの組合せは、「写真」と「音楽ファイル」である。そこで、関連項目抽出部23は、図38に示す関連情報を参照し、オブジェクト「写真+音楽ファイル」に関連付けられている関連項目群66を選択する。 Here, the selected object combination is “photo” and “music file”. Therefore, the related item extraction unit 23 refers to the related information shown in FIG. 38 and selects the related item group 66 associated with the object “photo + music file”.
 図37の(b)に示すとおり、操作画面処理部24は、関連項目群66に対応するアイコン画像をアイコン記憶部43から読み出し、オブジェクトの周囲に環状に配置する。ただし、ここでは、選択されたオブジェクトが、写真3枚+音楽ファイル1個であるので、操作画面処理部24は、環の中央に選択された3枚の写真および音楽ファイルのアイコンをひとまとめにして中央に表示する。これにより、ユーザは、自分が実際に囲ったものが中央に表示され、これらに対して関連のあるアイコンが周囲に環状に表示されている操作画面を得ることができる。これにより、実際に自身が囲ったものと、その結果得られたものとの関係性を容易に把握することが可能となる。 37 (b), the operation screen processing unit 24 reads out icon images corresponding to the related item group 66 from the icon storage unit 43, and arranges them in a ring around the object. However, here, since the selected object is three photos + one music file, the operation screen processing unit 24 brings together the icons of the three photos and music files selected in the center of the ring. Display in the center. Thus, the user can obtain an operation screen in which what he / she actually encloses is displayed in the center, and icons related to these are displayed in a ring shape around them. This makes it possible to easily grasp the relationship between what is actually enclosed and what is obtained as a result.
 Furthermore, when the selection consists of multiple types of data, related items suited to that combination of objects are extracted and their icons are displayed. For example, as in this modification, for the combination of "photo" and "music file", photo-specific related items such as "print (the photos)" or "edit the photos" are not extracted; instead, related items that use both the photos and the music file, such as "display a slideshow", are extracted.
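This combination-aware extraction can be sketched as a lookup keyed by the set of selected object types. The table contents and the fallback behavior below are hypothetical; the patent only specifies that the related information storage unit associates object (combinations) with related items.

```python
# Hypothetical related-information table: keys are frozensets of object
# types; values are the related items offered for that combination.
RELATED_INFO = {
    frozenset({"photo"}): ["print", "edit photo", "mail"],
    frozenset({"music"}): ["play", "add to playlist"],
    frozenset({"photo", "music"}): ["slideshow with music", "mail"],
}

def extract_related_items(selected_objects):
    """Return the related items registered for the exact combination of
    selected object types; as an assumed fallback, merge the per-type
    items when the combination itself is not registered."""
    types = frozenset(obj["type"] for obj in selected_objects)
    if types in RELATED_INFO:
        return RELATED_INFO[types]
    items = []
    for t in sorted(types):
        for item in RELATED_INFO.get(frozenset({t}), []):
            if item not in items:
                items.append(item)
    return items
```

Selecting three photos and one music file resolves to the "photo + music" entry, so photo-only items such as "print" are not offered.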
 In the example shown in FIG. 37(b), instead of the "2: Printer" icon representing the related item "print (the photos)", the "9: Slideshow playback" icon 87 representing the related item "display a slideshow" is displayed around the central object. On the operation screen of FIG. 37(b), when the user drags the grouped object (three photographs plus one music file) onto icon 87, the tablet terminal 100 plays the music file 86 while displaying the three dragged photographs as a slideshow.
 In this way, the tablet terminal 100 of the present invention can accept a contact gesture that encloses and selects multiple objects at once, even objects of different types, and can extract appropriate related items according to the combination of selected objects.
 Consequently, the tablet terminal 100 can display the final result the user desires through a simple contact gesture and a small number of operations, in a natural flow that matches the user's intuition. As a result, excellent operability is achieved in the tablet terminal 100 equipped with a touch panel.
 The present invention is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention.
 The information processing apparatus and operation screen display method of the present invention can also be expressed as follows.
 An information processing apparatus provided with a touch panel, comprising: contact motion acquisition means for acquiring the trajectory of the movement of an indicator moved on the touch panel; object specifying means for specifying, as selected objects, objects at least partly contained in the region enclosed by the trajectory acquired by the contact motion acquisition means; related item extraction means for extracting, as related items, the items associated with the objects specified by the object specifying means, by referring to a related information storage unit that stores objects in association with items related to those objects; and operation screen processing means for arranging icons of the related items extracted by the related item extraction means along the outline of a ring and displaying them on the touch panel.
 An operation screen display method for an information processing apparatus provided with a touch panel, comprising: a contact motion acquisition step of acquiring the trajectory of the movement of an indicator moved on the touch panel; an object specifying step of specifying, as selected objects, objects at least partly contained in the region enclosed by the trajectory acquired in the contact motion acquisition step; a related item extraction step of extracting, as related items, the items associated with the objects specified in the object specifying step, by referring to a related information storage unit that stores objects in association with items related to those objects; and an operation screen processing step of arranging icons of the related items extracted in the related item extraction step along the outline of a ring and displaying them on the touch panel.
 [Example of Software Implementation]
 Finally, each block of the tablet terminal 100, in particular the contact information generation unit 21, the object specifying unit 22, the related item extraction unit 23, the operation screen processing unit 24, the gesture determination unit 25, the cooperation processing execution unit 26, the device direction specifying unit 27, the ring shape determination unit 30, the icon order determination unit 31, the animation determination unit 32, and the icon arrangement determination unit 33, may be implemented in hardware logic, or may be realized in software using a CPU as follows.
 That is, the tablet terminal 100 includes a CPU (central processing unit) that executes the instructions of the control program realizing each function, a ROM (read-only memory) that stores the program, a RAM (random access memory) into which the program is loaded, and a storage device (recording medium) such as a memory that stores the program and various data. The object of the present invention can also be achieved by supplying the tablet terminal 100 with a recording medium on which the program code (an executable program, an intermediate code program, or a source program) of the control program of the tablet terminal 100, that is, software realizing the functions described above, is recorded in a computer-readable manner, and having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
 Examples of the recording medium include tape media such as magnetic tape and cassette tape; disk media including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical discs such as CD-ROM, MO, MD, DVD, and CD-R; card media such as IC cards (including memory cards) and optical cards; and semiconductor memory media such as mask ROM, EPROM, EEPROM, and flash ROM.
 Alternatively, the tablet terminal 100 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network. The communication network is not particularly limited; for example, the Internet, an intranet, an extranet, a LAN, ISDN, a VAN, a CATV network, a virtual private network, a telephone network, a mobile communication network, or a satellite communication network can be used. The transmission medium constituting the communication network is likewise not particularly limited: wired media such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL lines can be used, as can wireless media such as infrared (IrDA or remote control), Bluetooth (registered trademark), 802.11 wireless, HDR, mobile phone networks, satellite links, and terrestrial digital networks. The present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.
 The present invention is widely applicable to information processing apparatuses that include an input unit and a display unit. For example, without limitation, it can suitably be used in digital televisions, personal computers, smartphones, tablet PCs, notebook PCs, mobile phones, PDAs (Personal Digital Assistants), e-book readers, electronic dictionaries, portable and home game consoles, electronic blackboards, and the like, provided they include an input unit and a display unit. Furthermore, applying the present invention to an information processing apparatus equipped with a touch panel realizes even better operability.
1 Digital television
2 Printer
3 Digital photo frame
4 Personal computer
10 Control unit
11 Input unit (touch panel)
12 Display unit (touch panel)
13 Operation unit
14 External interface
15 Communication unit (communication unit)
16 Wireless communication unit (communication unit)
17 Audio output unit
18 Audio input unit
19 Storage unit
21 Contact information generation unit (trajectory acquisition means / contact motion acquisition means)
22 Object specifying unit (object specifying means)
23 Related item extraction unit (related item extraction means)
24 Operation screen processing unit (operation screen processing means)
25 Gesture determination unit (gesture determination means)
26 Cooperation processing execution unit (cooperation processing execution means)
27 Device direction specifying unit (device direction specifying means)
30 Ring shape determination unit (ring shape determination means)
31 Icon order determination unit (icon order determination means)
32 Animation determination unit (animation determination means)
33 Icon arrangement determination unit (icon arrangement determination means)
41 Frame map storage unit
42 Related information storage unit
43 Icon storage unit
44 Contact information storage unit
45 Device direction storage unit
100 Tablet terminal (information processing apparatus)

Claims (17)

  1.  An information processing apparatus comprising:
     trajectory acquisition means for acquiring a trajectory along which an indicator pointing at positions on a screen of a display unit has moved;
     object specifying means for specifying, as selected objects, objects at least partly contained in the region enclosed by the trajectory acquired by the trajectory acquisition means;
     related item extraction means for extracting, as related items, the items associated with the objects specified by the object specifying means, by referring to a related information storage unit that stores objects in association with items related to those objects; and
     operation screen processing means for arranging icons of the related items extracted by the related item extraction means along the outline of a ring and displaying them on the display unit.
  2.  The information processing apparatus according to claim 1, wherein the operation screen processing means determines the position and size of the ring so that the icons are arranged around the selected objects.
  3.  The information processing apparatus according to claim 1 or 2, wherein the operation screen processing means determines, as the shape of the ring, the trajectory acquired by the trajectory acquisition means, or a similar or approximate shape thereof.
  4.  The information processing apparatus according to any one of claims 1 to 3, wherein the operation screen processing means individually determines the timing at which each icon is displayed.
  5.  The information processing apparatus according to claim 4, wherein the trajectory acquisition means measures the elapsed time from the start of the movement of the indicator and holds movement time information indicating the elapsed time in association with at least some of the points constituting the trajectory, and
     the operation screen processing means sequentially displays the icons clockwise or counterclockwise in accordance with the movement direction of the indicator determined from the trajectory and the movement time information.
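The movement direction referred to in the claim above (clockwise versus counterclockwise) can be recovered from the enclosing stroke alone using the signed (shoelace) area of the traced polygon. This is an illustrative sketch, not the claimed implementation; the sign convention below assumes screen coordinates where y grows downward.

```python
def stroke_winding(points):
    """Twice the signed (shoelace) area of the stroke polygon.
    In screen coordinates (y grows downward), a positive value means
    the user drew clockwise; negative means counterclockwise."""
    area2 = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area2 += x1 * y2 - x2 * y1
    return "clockwise" if area2 > 0 else "counterclockwise"
```

The result could then drive the direction in which the icons are revealed one after another along the ring.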
  6.  The information processing apparatus according to claim 5, wherein the operation screen processing means determines, as the display position of the first icon, the position on the ring relatively corresponding to the position of the start point of the trajectory.
  7.  The information processing apparatus according to claim 6, wherein the operation screen processing means determines the timing at which the icons are sequentially displayed so as to correspond to the movement speed of the indicator that formed the trajectory.
  8.  The information processing apparatus according to any one of claims 1 to 7, wherein the related information storage unit stores, for each related item, cooperation information indicating the presence or absence of cooperation, a plurality of related items being regarded as cooperating when they can be processed consecutively or simultaneously, and
     the operation screen processing means arranges the icons of cooperating related items next to each other among the related items extracted by the related item extraction means.
  9.  The information processing apparatus according to any one of claims 1 to 8, further comprising:
     a position detection unit that acquires position information indicating the position of the apparatus itself;
     a direction detection unit that acquires direction information indicating the orientation of the apparatus itself;
     a communication unit that communicates with peripheral devices of the apparatus to acquire position information of the peripheral devices; and
     device direction specifying means for specifying the positional relationship between the apparatus and the peripheral devices on the basis of the position information of the apparatus acquired by the position detection unit and the position information of the peripheral devices acquired by the communication unit, and for specifying, by determining the orientation of the apparatus from the direction information acquired by the direction detection unit, in which direction each peripheral device lies relative to the apparatus,
     wherein the operation screen processing means determines the arrangement position of the icon corresponding to a related item of a peripheral device so as to correspond to the direction, specified by the device direction specifying means, in which that peripheral device lies.
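The direction computation in the claim above can be sketched as a planar bearing calculation corrected by the device's own heading. This is an assumption-laden illustration (planar x-east/y-north coordinates, compass heading of the tablet's top edge); a real device direction specifying unit 27 would work from its sensors' geodetic data.

```python
import math

def peripheral_screen_angle(own_pos, own_heading_deg, peripheral_pos):
    """Angle (degrees; 0 = top of screen, increasing clockwise) at which
    the icon for a peripheral device should be placed so that it points
    toward the device's physical direction. Positions are planar
    (x east, y north); own_heading_deg is the compass heading that the
    top edge of the tablet is facing."""
    dx = peripheral_pos[0] - own_pos[0]
    dy = peripheral_pos[1] - own_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # compass bearing
    return (bearing - own_heading_deg) % 360.0
```

For example, a printer due east of a north-facing tablet would get its icon on the right edge of the ring (90 degrees); if the tablet is rotated to face east, the same printer's icon moves to the top.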
  10.  The information processing apparatus according to any one of claims 1 to 9, wherein, among the related items extracted by the related item extraction means, the operation screen processing means arranges above the selected objects the icons of related items that are associated in the related information storage unit with an action attribute indicating an action that works on a device other than the apparatus itself.
  11.  The information processing apparatus according to any one of claims 1 to 10, wherein the trajectory acquisition means acquires the trajectory of the indicator during a predetermined period before the movement of the indicator enclosing the objects displayed on the display unit occurs, and
     the operation screen processing means, when it determines that the trajectory acquired during the predetermined period is biased toward a specific region of the screen of the display unit, determines the position of the ring so that the icons are arranged in that specific region.
  12.  The information processing apparatus according to any one of claims 1 to 10, wherein the operation screen processing means determines the position of the ring so that the icons are arranged in the region of the screen of the display unit enclosed by the trajectory of the indicator, or in a specific region containing that enclosed region.
  13.  The information processing apparatus according to any one of claims 1 to 12, wherein an input unit of the information processing apparatus and the display unit constitute a touch panel, and
     the trajectory acquisition means acquires the trajectory of the movement of the indicator on the touch panel.
  14.  The information processing apparatus according to any one of claims 1 to 12, wherein an input unit of the information processing apparatus inputs to the information processing apparatus an instruction to move a cursor displayed on the display unit, and
     the trajectory acquisition means acquires the trajectory of the movement of the cursor as the indicator.
  15.  An operation screen display method for an information processing apparatus, comprising:
     a trajectory acquisition step of acquiring a trajectory along which an indicator pointing at positions on a screen of a display unit of the information processing apparatus has moved;
     an object specifying step of specifying, as selected objects, objects at least partly contained in the region enclosed by the trajectory acquired in the trajectory acquisition step;
     a related item extraction step of extracting, as related items, the items associated with the objects specified in the object specifying step, by referring to a related information storage unit that stores objects in association with items related to those objects; and
     an operation screen processing step of arranging icons of the related items extracted in the related item extraction step along the outline of a ring and displaying them on the display unit.
  16.  A control program for causing a computer to function as each of the means of the information processing apparatus according to any one of claims 1 to 14.
  17.  A computer-readable recording medium on which the control program according to claim 16 is recorded.
PCT/JP2012/067525 2011-07-15 2012-07-10 Information processing device, operation screen display method, control program, and recording medium WO2013011862A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-157167 2011-07-15
JP2011157167A JP5172997B2 (en) 2011-07-15 2011-07-15 Information processing apparatus, operation screen display method, control program, and recording medium

Publications (1)

Publication Number Publication Date
WO2013011862A1 true WO2013011862A1 (en) 2013-01-24

Family

ID=47558040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/067525 WO2013011862A1 (en) 2011-07-15 2012-07-10 Information processing device, operation screen display method, control program, and recording medium

Country Status (2)

Country Link
JP (1) JP5172997B2 (en)
WO (1) WO2013011862A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018168114A1 (en) * 2017-03-16 2018-09-20 Ricoh Company, Ltd. Information processing system, information processing apparatus, information processing program and information processing method

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8519964B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
US8423911B2 (en) 2010-04-07 2013-04-16 Apple Inc. Device, method, and graphical user interface for managing folders
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
KR102255988B1 (en) * 2013-06-07 2021-05-25 삼성전자주식회사 Method for displaying visual effect of a portable terminal and fortable terminal therefor
JP6040873B2 (en) * 2013-06-17 2016-12-07 ソニー株式会社 Information processing apparatus, information processing method, and computer-readable recording medium
KR102109406B1 (en) * 2013-08-13 2020-05-28 엘지전자 주식회사 Display device connected to photo printer and method for controlling the same
CN110687969B (en) 2013-10-30 2023-05-02 苹果公司 Displaying related user interface objects
JP6054892B2 (en) * 2014-01-14 2016-12-27 レノボ・シンガポール・プライベート・リミテッド Application image display method, electronic apparatus, and computer program for multiple displays
JP6287435B2 (en) 2014-03-26 2018-03-07 日本電気株式会社 Information processing apparatus, information processing method, and program
US10482461B2 (en) 2014-05-29 2019-11-19 Apple Inc. User interface for payments
WO2016036552A1 (en) 2014-09-02 2016-03-10 Apple Inc. User interactions for a mapping application
WO2016104191A1 (en) 2014-12-26 2016-06-30 日立マクセル株式会社 Illumination device
US9940637B2 (en) 2015-06-05 2018-04-10 Apple Inc. User interface for loyalty accounts and private label accounts
US20160358133A1 (en) 2015-06-05 2016-12-08 Apple Inc. User interface for loyalty accounts and private label accounts for a wearable device
US10353540B2 (en) * 2016-03-30 2019-07-16 Kyocera Document Solutions Inc. Display control device
JP6465056B2 (en) * 2016-03-30 2019-02-06 京セラドキュメントソリューションズ株式会社 Display control device
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
JP6471730B2 (en) * 2016-06-29 2019-02-20 京セラドキュメントソリューションズ株式会社 Display input device
US10860199B2 (en) 2016-09-23 2020-12-08 Apple Inc. Dynamically adjusting touch hysteresis based on contextual data
JP6489253B2 (en) * 2018-02-27 2019-03-27 日本精機株式会社 Display device and in-vehicle device operation system
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
CN117544614A (en) * 2019-05-20 2024-02-09 北京小米移动软件有限公司 File transmission method, device and computer readable storage medium
JP2021022182A (en) * 2019-07-29 2021-02-18 株式会社電通グループ Display control method, display control apparatus, display control program, and display control system
CN111327769B (en) 2020-02-25 2022-04-08 北京小米移动软件有限公司 Multi-screen interaction method and device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006244353A (en) * 2005-03-07 2006-09-14 Konami Digital Entertainment:Kk Information processor, image movement instructing method and program
JP2006344217A (en) * 2005-06-09 2006-12-21 Fuji Xerox Co Ltd Information communicable portable device and program of the portable device
JP2008226049A (en) * 2007-03-14 2008-09-25 Ricoh Co Ltd Display processor, display processing method and display processing program
JP2010026710A (en) * 2008-07-17 2010-02-04 Sony Corp Information processor, information processing method, and information processing program
WO2010056483A1 (en) * 2008-11-13 2010-05-20 Qualcomm Incorporated Method and system for context dependent pop-up menus
JP2010267079A (en) * 2009-05-14 2010-11-25 Canon Inc Information processor, control method, and program


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018168114A1 (en) * 2017-03-16 2018-09-20 Ricoh Company, Ltd. Information processing system, information processing apparatus, information processing program and information processing method
JP2018155831A (en) * 2017-03-16 2018-10-04 株式会社リコー Information processing system, information processing apparatus, information processing program, and information processing method
CN110431524A (en) * 2017-03-16 2019-11-08 株式会社理光 Information processing system, information processing unit, message handling program and information processing method
US11163835B2 (en) 2017-03-16 2021-11-02 Ricoh Company, Ltd. Information processing system, information processing apparatus, information processing program and information processing method
CN110431524B (en) * 2017-03-16 2022-10-21 株式会社理光 Information processing system, information processing apparatus, information processing program, and information processing method

Also Published As

Publication number Publication date
JP2013025409A (en) 2013-02-04
JP5172997B2 (en) 2013-03-27

Similar Documents

Publication Publication Date Title
JP5172997B2 (en) Information processing apparatus, operation screen display method, control program, and recording medium
US20230049771A1 (en) Reduced size user interface
US11340757B2 (en) Clock faces for an electronic device
US20210357169A1 (en) User interfaces for devices with multiple displays
US11755273B2 (en) User interfaces for audio media control
US20220365671A1 (en) Device, Method, and Graphical User Interface for Switching Between User Interfaces
US11675476B2 (en) User interfaces for widgets
US20210191602A1 (en) Device, Method, and Graphical User Interface for Selecting User Interface Objects
JP5107453B1 (en) Information processing apparatus, operation screen display method, control program, and recording medium
US9753639B2 (en) Device, method, and graphical user interface for displaying content associated with a corresponding affordance
CN105335001B (en) Electronic device having curved display and method for controlling the same
US20190258373A1 (en) Scrollable set of content items with locking feature
WO2013011863A1 (en) Information processing device, operation screen display method, control program, and recording medium
JP5173001B2 (en) Information processing apparatus, screen display method, control program, and recording medium
KR102255087B1 (en) Electronic device and method for displaying object
US20230393714A1 (en) User interfaces for managing accessories
US11379113B2 (en) Techniques for selecting text
TW201610823A (en) Reduced size user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12814971

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12814971

Country of ref document: EP

Kind code of ref document: A1