US20130036387A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
US20130036387A1
Authority
US
United States
Prior art keywords
list
items
lists
coupling
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/554,397
Other languages
English (en)
Inventor
Yu MURATA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Murata, Yu
Publication of US20130036387A1 publication Critical patent/US20130036387A1/en
Priority to US15/697,188 priority Critical patent/US11042287B2/en
Priority to US17/335,489 priority patent/US20210286512A1/en
Priority to US18/462,967 priority patent/US20230418463A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program.
  • a UI that creates and displays thumbnail images from scenes of every predetermined time interval among scenes that constitute moving image data.
  • Such a UI can reproduce moving image data from a scene desired by a user by making the user select a given thumbnail image.
  • the number of thumbnail images displayed on the UI increases or decreases in accordance with the time intervals of scenes taken out of moving image data. Accordingly, when a user performs an operation of increasing the number of thumbnail images, for example, the UI displays thumbnail images of scenes of shorter time intervals. As a user operation of increasing or decreasing the number of thumbnail images, the following technology is known.
  • a display device disclosed in JP 2011-003977A displays, when a pinch-out operation is performed on a thumbnail image of moving image data, thumbnail images of shorter time intervals. Accordingly, a user can easily check the details of each scene.
  • an information processing device including a display control unit configured to display a plurality of lists including a first list and a second list each having list items; and perform display of coupling to the first list a list item of the second list that is a sub-list of the first list.
  • an information processing method including displaying a plurality of lists including a first list and a second list each having list items, and performing display of coupling to the first list a list item of the second list that is a sub-list of the first list.
  • a program causing a computer to perform processes of displaying a plurality of lists including a first list and a second list each having list items; and performing display of coupling to the first list a list item of the second list that is a sub-list of the first list.
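  • The coupling and separation summarized above can be pictured with a short sketch. The following Python fragment is illustrative only (none of these names appear in the disclosure): a list is modeled as a plain Python list of item labels, coupling interleaves the sub-list items between the first-list items, and separation splits every other item off into a new sub-list.

```python
# Illustrative sketch only; the patent does not define these functions.
# A "list" is modelled as a plain Python list of item labels.

def couple_lists(first_list, sub_list):
    """Couple the sub-list to the first list by inserting each sub-list
    item between the first-list items it interpolates (cf. FIG. 4)."""
    coupled = []
    for i, item in enumerate(first_list):
        coupled.append(item)
        if i < len(sub_list):
            coupled.append(sub_list[i])
    return coupled


def separate_list(first_list):
    """Separate every other item from the first list and arrange the
    separated items as a new sub-list (cf. FIG. 9)."""
    new_sub_list = first_list[0::2]   # e.g. A1, A3, A5 are separated
    remaining = first_list[1::2]      # e.g. A2, A4 remain in the target list
    return remaining, new_sub_list


if __name__ == "__main__":
    print(couple_lists(["A1", "A2", "A3"], ["B1", "B2"]))
    # -> ['A1', 'B1', 'A2', 'B2', 'A3']
    print(separate_list(["A1", "A2", "A3", "A4", "A5"]))
    # -> (['A2', 'A4'], ['A1', 'A3', 'A5'])
```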
  • FIG. 1 is a diagram illustrating a summary of an embodiment of the present disclosure
  • FIG. 2 is a block configuration diagram showing the configuration of an information processing device in accordance with this embodiment
  • FIG. 3 is a diagram showing an exemplary display of a plurality of lists in accordance with this embodiment
  • FIG. 4 is a screen transition diagram illustrating an animation of coupling a plurality of lists in accordance with this embodiment
  • FIG. 5 is a diagram illustrating a pinch-out operation in accordance with this embodiment
  • FIG. 9 is a screen transition diagram illustrating an animation when lists are separated in conjunction with a pinch-in operation in accordance with this embodiment.
  • FIG. 11 is a screen transition diagram illustrating an animation when lists are coupled in conjunction with a drag operation in accordance with this embodiment
  • FIG. 12 is a diagram illustrating a scale-out button and a scale-in button in accordance with this embodiment
  • FIG. 13 is a diagram showing a slider bar in accordance with this embodiment.
  • FIG. 14 is a diagram showing an exemplary display in which a plurality of lists are arranged vertically in accordance with this embodiment
  • FIG. 15 is a diagram showing an exemplary display in which a plurality of lists overlap in accordance with this embodiment
  • FIG. 16 is a diagram showing a screen on which a plurality of lists each including a plurality of thumbnail images are displayed in accordance with this embodiment
  • FIG. 17 is a diagram showing a result obtained when the plurality of lists shown in FIG. 16 are coupled;
  • FIG. 18 is a diagram showing a result obtained when the plurality of lists shown in FIG. 16 are separated;
  • FIG. 20 is a diagram showing a result obtained when the plurality of lists shown in FIG. 19 are coupled;
  • FIG. 21 is a diagram showing a result obtained when two sub-lists shown in FIG. 19 are coupled to a target list;
  • FIG. 22 is a diagram showing a screen that displays a target list in accordance with this embodiment.
  • FIG. 23 is a screen transition diagram illustrating an animation when lists are coupled in conjunction with a pinch-out operation in accordance with this embodiment
  • FIG. 24 is a diagram showing a screen that list-displays a single list in accordance with this embodiment.
  • FIG. 26 is a diagram showing a screen when coupling of lists is completed in accordance with this embodiment.
  • FIG. 27 is a flowchart showing a display control process of coupling/separating lists in accordance with pinch-in/out operations in accordance with this embodiment
  • FIG. 28 is a table exemplarily showing calculation of the movement amount and the movement speed of pinch-in/out operations in accordance with this embodiment.
  • FIG. 29 is a screen transition diagram illustrating an example of changing an animation when lists are coupled in conjunction with a pinch-out operation.
  • the information processing device 10 in accordance with an embodiment of the present disclosure includes (A) a display control unit (a GPU 112 ) that displays a plurality of lists each including list items and performs display of coupling to a first list list items of a second list, which is a sub-list of the first list, in response to a user operation.
  • FIG. 1 is a diagram illustrating a summary of an embodiment of the present disclosure.
  • an information processing device 10 in accordance with this embodiment includes a display 15 and a touch panel 16 integrated with the display 15 .
  • the information processing device 10 displays lists each having list items 21 on the display 15 .
  • the information processing device 10 displays a target list 23 and a plurality of sub-lists 25 each having list items 21 as shown in FIG. 1 .
  • the sub-lists 25 display more detailed information than does the target list 23 .
  • the target list 23 is displayed as a list to be operated by a user.
  • the sub-lists 25 may also be operated by a user.
  • the information processing device 10, when a user has performed some operation or when the internal state of an application has changed, changes the method of displaying lists. More specifically, the information processing device 10 performs display control so that lists are coupled/separated in response to a user operation detected by the touch panel 16.
  • the information processing device 10 couples the sub-list 25 to the target list 23 by gradually moving the sub-list 25 closer to the target list 23 , for example.
  • the information processing device 10 creates a new sub-list 25 by arranging list items 21 gradually separated from the target list 23 , for example.
  • the information processing device 10, when coupling/separating lists displayed on the display 15, performs control of gradually changing each list item on the display screen. Accordingly, the user is able to, before the completion of the coupling/separation of the lists, check in advance the list items to be coupled to the target list 23 or the list items 21 to be separated from the target list 23. In addition, the user is also able to cancel the display control of coupling/separation before the completion thereof while checking the list items 21 to be coupled/separated.
  • FIG. 2 is a block configuration diagram showing the configuration of the information processing device 10 in accordance with this embodiment.
  • the information processing device 10 includes a control unit 11 , RAM 12 , a storage medium 13 , a display processing unit 14 , a display 15 , a touch panel 16 , an operation unit 17 , and a communication unit 18 .
  • the control unit 11 functions as an arithmetic processing unit and a control unit, and controls each component of the information processing device 10 . More specifically, as shown in FIG. 2 , the control unit 11 in accordance with this embodiment includes a CPU (Central Processing Unit) 111 and a GPU (Graphics Processing Unit) 112 .
  • the CPU 111 controls each component of the information processing device 10 in accordance with various programs.
  • the CPU 111 may also be a microprocessor.
  • the GPU 112 is a display control unit that generates an image to be displayed on the display 15 or changes a display screen in response to a user operation. More specifically, the GPU 112 performs display control so that a plurality of lists each having list items 21 arranged therein are displayed on the display 15. In addition, the GPU 112 performs display control in response to a user operation so that list items 21 of a second list, which is a sub-list of a first list, are coupled to the first list.
  • the GPU 112 performs such display control of coupling/separating lists in response to a user operation detected by the touch panel 16 .
  • the GPU 112, when a pinch-out/in operation is detected, performs display control so that lists are coupled/separated in accordance with a change in the distance between the two fingers touching the touch panel 16.
  • the GPU 112 may perform display control so that lists are coupled/separated in accordance with the movement amount or the movement speed of the fingers touching the touch panel 16. Note that the display control of the GPU 112 performed in response to a user operation will be described in detail in <2. Coupling/Separation of Plurality of Lists> to <4. Operation Process>.
  • the RAM (Random Access Memory) 12 temporarily stores programs used in the execution of the control unit 11 , parameters that change as appropriate during the execution, and the like.
  • the storage medium 13 may be, for example, nonvolatile memory such as flash ROM (or flash memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), or EPROM (Erasable Programmable ROM); a magnetic disk such as a hard disk or a disc-shaped magnetic body; an optical disc such as CD (Compact Disc), DVD-R (Digital Versatile Disc Recordable), or BD (Blu-Ray Disc (registered trademark)); or a storage medium such as a MO (Magneto Optical) disk.
  • the display processing unit 14 causes the display 15 to output image data output from the GPU 112 in accordance with the display control of the GPU 112 .
  • the display 15 is a display device that outputs a display screen input from the display processing unit 14 .
  • This display 15 may be, for example, a display device such as a liquid crystal display (LCD) or an organic EL (Electroluminescence) display.
  • the display 15 in accordance with this embodiment displays a list including list items 21 , for example.
  • the communication unit 18 is a module that communicates with a communication terminal. Specifically, the communication unit 18 includes a receiving unit that receives data from a communication terminal, and a transmitting unit that transmits data to the communication terminal. In addition, the communication unit 18 may transmit and receive data via short-range wireless communication such as Wi-Fi or Bluetooth.
  • the GPU 112 displays a plurality of lists as described above and, in response to a user operation, couples the sub-list 25 to the target list 23 or separates the list items 21 from the target list 23 , thereby increasing/decreasing the number of displayed list items 21 (the number of list items).
  • coupling of a plurality of lists and separation of a list will be sequentially described with reference to the drawings.
  • When a plurality of lists are displayed as described above, if a user operation indicating coupling of lists is detected, the GPU 112 performs display control so that the list items 21 of the sub-list 25 are gradually moved closer to the target list 23 and thus are coupled thereto.
  • the coupling of the lists is performed with an animation such as the one shown in FIG. 4 , for example.
  • FIG. 4 is a screen transition diagram illustrating an animation of coupling a plurality of lists.
  • the GPU 112 displays a target list 23 and sub-lists 25 .
  • the GPU 112 performs display control so that the lists are coupled as shown in a screen 35 .
  • the GPU 112 widens the space between each list item of the target list 23 as shown in the screen 35 , and moves each list item of the sub-list 25 a to the space between each list item of the target list 23 .
  • the GPU 112 gradually lowers the position of the sub-list 25 b by one level and adjusts the size of the list items of the sub-list 25 b as shown in the screen 35 .
  • the GPU 112 adjusts the size and position of the list items of the sub-list 25 a so that they become uniform, and terminates the coupling.
  • a target list 23 x obtained after the coupling has an increased amount of information compared to the target list 23 before the coupling.
  • Coupling of a plurality of lists has been described above.
  • the aforementioned example shows a case in which the sub-list 25 a is coupled to the target list 23
  • coupling of a plurality of lists in accordance with the present disclosure is not limited thereto.
  • sub-lists at a plurality of levels, such as the sub-list 25 a and the sub-list 25 b, may be coupled to the target list 23.
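  • As a rough sketch of the coupling animation described above, assuming a one-dimensional layout and linear interpolation (neither of which is specified in the disclosure), the position and size of a sub-list item at a given animation progress could be computed as follows:

```python
# Illustrative animation sketch (linear interpolation); not from the patent.

def lerp(a, b, t):
    """Linear interpolation between a and b for progress t in [0, 1]."""
    return a + (b - a) * t


def animate_coupling(start_x, start_size, end_x, end_size, t):
    """Return the on-screen x position and size of one sub-list item at
    animation progress t, moving it into the target list while enlarging it."""
    return lerp(start_x, end_x, t), lerp(start_size, end_size, t)


if __name__ == "__main__":
    # A sub-list item starts at x=40 with size 32 px and ends inserted into
    # the target list at x=120 with the target-item size of 64 px.
    for t in (0.0, 0.5, 1.0):
        print(t, animate_coupling(40, 32, 120, 64, t))
```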
  • the GPU 112 first separates a predetermined number of list items from the target list 23 in response to a user operation to thereby reduce the amount of information of the target list 23 . Then, the GPU 112 creates a new sub-list 25 by arranging the list items 21 separated from the target list 23 , whereby separation of the list is completed.
  • sub-list(s) 25 to be newly created after being separated from the target list 23 may be arranged at either one level or more levels.
  • the GPU 112, when a pinch-out operation is detected, couples lists in accordance with the pinch-out operation.
  • a pinch-out operation refers to a touch operation of, while touching two points on a screen with two fingers, widening the gap between the two fingers.
  • Specific description will be made with reference to FIGS. 5 and 6.
  • FIG. 5 is a diagram illustrating a pinch-out operation of a user. As shown in a screen 39 in FIG. 5 , when a pinch-out operation on a target list 23 is detected, the GPU 112 couples lists.
  • FIG. 6 is a screen transition diagram illustrating an animation when lists are coupled in conjunction with a pinch-out operation. As shown in a screen 41 in FIG. 6 , with a list item A 2 and a list item A 3 being touched, an operation (a pinch-out operation) of widening the gap d 1 between the touch positions is performed. The GPU 112 moves the display positions of the list item A 2 and the list item A 3 in accordance with the touch positions of the user.
  • the GPU 112 enlarges the list item B 2 inserted into the space between the list item A 2 and the list item A 3 to a size that is similar to the size of each list item of the target list 23 , and thus completes the coupling of the lists.
  • the user is also able to cancel the coupling of the lists before the completion thereof while checking information that is increased by the coupling.
  • the aforementioned embodiment describes detecting a pinch-out operation of touching two list items of the target list 23 with two fingers and widening the gap between the fingers
  • the positions of the pinch-out operation in accordance with this embodiment are not limited thereto.
  • the GPU 112, no matter at which position on the screen a pinch-out operation is detected, couples lists in accordance with the width of the gap d between the touch positions and a movement of widening the gap d with the screen being touched.
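  • A hedged sketch of such position-independent pinch detection is given below; the class name, and the idea of classifying purely by comparing the current gap d with the initial gap, are assumptions made for illustration.

```python
import math

# Illustrative pinch tracking; class and attribute names are not from the patent.

class PinchTracker:
    def __init__(self, p1, p2):
        # Remember the initial gap d between the two touch positions.
        self.initial_gap = self._gap(p1, p2)

    @staticmethod
    def _gap(p1, p2):
        return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

    def classify(self, p1, p2):
        """Return 'pinch-out' if the gap widened, 'pinch-in' if it narrowed,
        otherwise 'none', regardless of where on the screen the touches are."""
        d = self._gap(p1, p2)
        if d > self.initial_gap:
            return "pinch-out"   # couple the lists
        if d < self.initial_gap:
            return "pinch-in"    # separate the list
        return "none"


if __name__ == "__main__":
    tracker = PinchTracker((100, 200), (160, 200))
    print(tracker.classify((80, 200), (200, 200)))   # pinch-out
```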
  • the GPU 112, when a pinch-in operation is detected, separates lists in accordance with the pinch-in operation.
  • a pinch-in operation refers to a touch operation of, while touching two points on a screen with two fingers, narrowing the gap between the two fingers.
  • description will be specifically made with reference to FIGS. 8 and 9 .
  • FIG. 9 is a screen transition diagram illustrating an animation when lists are separated in conjunction with a pinch-in operation.
  • As shown in a screen 51 in FIG. 9, with a list item A 2 and a list item A 4 being touched, an operation (a pinch-in operation) of narrowing the gap d 3 between the touch positions is performed.
  • the GPU 112 moves the display positions of the list item A 2 and the list item A 4 so that they move closer to each other in accordance with the touch positions of the user.
  • the GPU 112 displays the list item A 3 so that the list item A 3 gradually shrinks in accordance with the pinch-in operation.
  • the GPU 112 performs display control so that the list item A 3 is moved away from and separated from the target list 23 .
  • the other list items of the target list 23, for example, every other list item arranged in the target list 23, such as a list item A 1 and a list item A 5, automatically move and are thus separated. Note that when an operation of widening the gap between the touch positions of the two fingers performing the pinch-in operation is detected, the GPU 112 performs display control so that the separation of the list is canceled and the list item A 3 and the like are restored to their initial display positions.
  • the GPU 112 creates a new sub-list 25 x by arranging the list items A 1 , A 3 , and A 5 separated from the target list 23 .
  • the GPU 112 adjusts the display position of each list item of a target list 23 x whose number of list items has been decreased by the separation of the list.
  • the GPU 112 separates a predetermined number of list items from the target list 23 displayed on the screen in accordance with a pinch-in operation, and creates the new sub-list 25 x by arranging the predetermined number of the separated list items.
  • the user is able to check in advance which list items are to be separated before the completion of the separation.
  • the user is also able to cancel the separation of the list before the completion thereof while checking information of the target list 23 that is decreased by the separation.
  • FIG. 9 illustrates a view in which one new sub-list 25 x is created
  • separation of a list through a pinch-in operation in accordance with this embodiment is not limited thereto, and the GPU 112 may newly create a plurality of sub-lists 25 in accordance with a pinch-in operation. For example, if a pinch-in operation is performed when the gap d 3 between the touch positions is double the width of a list item, the GPU 112 newly creates two sub-lists 25 .
  • the aforementioned embodiment describes detecting a pinch-in operation of, while touching two list items of the target list 23 with two fingers, narrowing the gap between the fingers
  • the positions of the pinch-in operation in accordance with this embodiment are not limited thereto.
  • the GPU 112, no matter at which position on the screen a pinch-in operation is detected, separates a list in accordance with the width of the gap d between the touch positions and a movement of narrowing the gap d with the screen being touched.
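  • One plausible reading of the multiple-sub-list behavior above is that the number of newly created sub-lists grows with the initial gap between the touch positions measured in list-item widths; the disclosure does not spell out the exact rule, so the rounding used in the sketch below is an assumption.

```python
# Assumption: the number of sub-lists created by a pinch-in scales with the
# initial gap between the touch positions measured in list-item widths.

def sub_lists_to_create(gap_px, item_width_px):
    """E.g. a gap of roughly twice the item width yields two new sub-lists."""
    return max(1, round(gap_px / item_width_px))


if __name__ == "__main__":
    print(sub_lists_to_create(128, 64))  # 2
    print(sub_lists_to_create(70, 64))   # 1
```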
  • a drag operation refers to a touch operation of moving a finger while touching one point on a screen with the finger.
  • coupling of lists performed in conjunction with a drag operation will be described with reference to FIGS. 10 and 11 .
  • FIG. 10 is a diagram illustrating a drag operation of a user. As shown in a screen 57 in FIG. 10 , when a drag operation of moving a list item B 2 of a sub-list 25 a to a target list 23 is detected, the GPU 112 couples the lists.
  • FIG. 11 is a screen transition diagram illustrating an animation when lists are coupled in conjunction with a drag operation.
  • the GPU 112 moves a list item B 2 in a downward direction in accordance with a drag operation, and also moves the other list items B 1 and B 3 of the operation target sub-list 25 a in a downward direction.
  • the GPU 112 performs display control so that as the list item B 2 moves closer to the target list 23 , the space between each list item of the target list 23 widens.
  • a user drags the list item B 2 to the widened space between each list item of the target list 23 , and then lifts the touching finger off the screen.
  • the GPU 112 may accept cancelation of the coupling of the lists until the list item B 2 is dragged to a predetermined position in the target list 23 .
  • the GPU 112 performs display control so that the position of each list item is restored to the initial position.
  • the GPU 112 enlarges the list item B 2 dragged to a position in the target list 23 so that the size of the list item B 2 becomes similar to that of each list item of the target list 23 , and thus completes the coupling of the lists.
  • the GPU 112 in accordance with a drag operation, couples the sub-list 25 displayed on the screen to the target list 23 and increases the amount of information of the target list 23 .
  • the user is able to check in advance the list items 21 of the sub-list 25 to be coupled to the target list 23 .
  • the user is also able to cancel the coupling of the lists before the completion thereof while checking information that is increased by the coupling.
  • the GPU 112, when an operation of dragging a list item of the target list 23 in an upward direction is detected, separates the list. As an animation when a list is separated, for example, when the list item A 2 is dragged in an upward direction, the GPU 112 displays the list item A 2 so that it gradually becomes smaller. In addition, the GPU 112 simultaneously displays the other list items A to be separated from the target list 23 so that they gradually become smaller while moving them in an upward direction.
  • the GPU 112 arranges each of the remaining list items of the target list 23 whose number of list items has been decreased by the separation of the list so that the space between each list item becomes narrower.
  • the GPU 112 creates a new sub-list by arranging the list items separated from the target list 23 so that they are aligned above the target list 23 , whereby separation of the list is completed.
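  • The drag-based animations above widen the spacing of the target list as the dragged item approaches it. The sketch below shows one way such spacing could be driven by the drag distance; the easing formula and the pixel values are illustrative assumptions, not values from the disclosure.

```python
# Illustrative only: widen the spacing between target-list items as the
# dragged sub-list item approaches the target list (cf. FIG. 11).

def target_spacing(distance_px, max_distance_px=200.0,
                   base_gap_px=4.0, item_width_px=64.0):
    """Return the gap to leave between target-list items.  When the dragged
    item is far away the normal gap is used; as it approaches, the gap opens
    up to the full width of one list item so the item can be dropped in."""
    closeness = max(0.0, 1.0 - distance_px / max_distance_px)
    return base_gap_px + closeness * (item_width_px - base_gap_px)


if __name__ == "__main__":
    for d in (250, 150, 50, 0):
        print(d, round(target_spacing(d), 1))
```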
  • Coupling/separation of lists through a drag operation has been described above. Such coupling/separation of lists is performed by moving a single list item through a drag operation. However, it is also possible to move a single list item through a flick operation instead of a drag operation. Hereinafter, coupling/separation of lists performed in conjunction with a flick operation will be described.
  • a flick operation refers to a touch operation of, while touching one point on a screen with a finger, lightly sweeping the finger in one direction.
  • the GPU 112 couples/separates lists through a flick operation.
  • An animation when lists are coupled through a flick operation is roughly similar to the animation when lists are coupled through a drag operation described with reference to FIG. 11 .
  • the GPU 112 moves each list item of the sub-list 25 a in a downward direction.
  • the GPU 112 performs display control so that the space between each list item of the target list 23 widens.
  • each list item of the sub-list 25 a moves to the widened space between each list item of the target list 23 .
  • the GPU 112 enlarges each of the moved list items of the sub-list 25 a to a size that is similar to the size of each list item of the target list 23 , whereby coupling of the lists is completed.
  • the GPU 112 may perform display control so that the display position of each list item is restored to the initial position.
  • the GPU 112, when an operation of flicking a list item of the target list 23 in an upward direction is detected, separates the list. As an animation when a list is separated, for example, when the list item A 2 is flicked in an upward direction, the GPU 112 moves the list item A 2 in an upward direction while gradually displaying the list item A 2 in a smaller size. At the same time, the GPU 112 also gradually displays the other list items A separated from the target list 23 in a smaller size while moving them in an upward direction.
  • the GPU 112 arranges each of the remaining list items of the target list 23 whose number of list items has been decreased by the separation of the list so that the space between each list item becomes narrower.
  • the GPU 112 creates a new sub-list by aligning the list items separated from the target list 23 above the target list 23 , whereby separation of the list is completed.
  • the GPU 112 may couple/separate lists in conjunction with a tap/double-tap operation. For example, each of the tapped/double-tapped list items of the sub-list 25 a is moved to the space between each list item of the target list 23 as shown in FIG. 4 , whereby coupling of the lists is completed.
  • the GPU 112 separates a predetermined number of list items from the target list 23 to thereby reduce the amount of information of the target list 23 . Then, the GPU 112 creates a new sub-list by arranging the list items separated from the target list 23 , whereby separation of the list is completed.
  • the GPU 112 may also couple/separate lists in response to a user operation on a button or a bar.
  • description will be specifically made with reference to FIGS. 12 and 13 .
  • FIG. 12 is a diagram showing a scale-out button 27 and a scale-in button 28 .
  • the GPU 112 separates a predetermined number of list items from the target list 23 , and creates a new sub-list 25 by arranging the separated list items, whereby separation of the list is completed.
  • the GPU 112 moves each list item of the sub-list 25 to the space between each list item of the target list 23 , and adjusts the size of each moved list item, whereby coupling of the lists is completed.
  • FIG. 13 is a diagram showing a slider bar 29 .
  • the GPU 112 separates a predetermined number of list items from the target list 23 .
  • the GPU 112 creates a new sub-list 25 by arranging the separated list items, whereby separation of the list is completed.
  • the GPU 112 moves each list item of the sub-list 25 to the space between each list item of the target list 23 , and adjusts the size of each moved list item, whereby coupling of the lists is completed.
  • the GPU 112 reduces the amount of information of the target list 23 , which is changed by coupling/separation of lists, as the knob of the slider bar 29 moves closer to the “minus” sign, and increases the amount (increases the fragmentation level) of the information of the target list 23 as the knob of the slider bar 29 moves closer to the “plus” sign.
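  • The slider bar therefore maps the knob position onto the amount of information in the target list. A minimal sketch of that mapping, assuming a normalized knob value and a fixed number of sub-list levels (both assumptions not stated in the disclosure), might look like this:

```python
# Illustrative mapping from slider-bar knob position to the number of
# sub-list levels coupled into the target list.  Values are assumptions.

def coupled_levels(knob, max_levels=3):
    """knob is 0.0 at the 'minus' end and 1.0 at the 'plus' end; a higher
    value couples more sub-list levels (a finer-grained target list)."""
    knob = min(1.0, max(0.0, knob))
    return round(knob * max_levels)


if __name__ == "__main__":
    for knob in (0.0, 0.4, 1.0):
        print(knob, coupled_levels(knob))
```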
  • the GPU 112 displays a plurality of lists, each of which has list items 21 arranged in the horizontal direction, in the vertical direction in a stepwise manner.
  • the method of displaying a plurality of lists in accordance with this embodiment is not limited to the example shown in FIG. 1 .
  • Another example of displaying a plurality of lists will be described with reference to FIGS. 14 and 15.
  • FIG. 14 is a diagram showing an exemplary display in which a plurality of lists are arranged vertically.
  • the GPU 112 arranges a plurality of lists, each of which has list items 21 arranged in the vertical direction, in the horizontal direction in a stepwise manner.
  • the GPU 112 couples/separates lists.
  • FIG. 15 is a diagram showing an exemplary display in which a plurality of lists overlap.
  • the GPU 112 displays a target list 23 such that it overlaps a sub-list 25 .
  • a user is able to check each list item of the sub-list 25 between each list item of the target list 23 .
  • the GPU 112 moves each list item of the target list 23 in the horizontal direction in accordance with the pinch-out operation as shown in a screen 71 in FIG. 15 .
  • the GPU 112 also performs control such that each list item of the sub-list 25 is gradually enlarged, thereby displaying a target list 23 x with the sub-list 25 coupled thereto.
  • FIGS. 16 to 18 exemplarily illustrate thumbnail images, which are created from scenes of moving image data of every predetermined time period, as list items.
  • FIG. 16 is a diagram showing a screen 72 that displays a plurality of lists each including a plurality of thumbnail images.
  • a target list 73 is a list having arranged therein scenes of moving image data of every minute.
  • a sub-list 75 a is a list having arranged therein scenes, each of which is to be interpolated between the scenes of the target list 73 of every minute.
  • the sub-list 75 a has arranged therein a thumbnail image of 4:30 to be interpolated between a thumbnail image of 4:00 and a thumbnail image of 5:00 of the target list 73 , a thumbnail image of 5:30 to be interpolated between a thumbnail image of 5:00 and a thumbnail image of 6:00 of the target list 73 , a thumbnail image of 6:30 to be interpolated between a thumbnail image of 6:00 and a thumbnail image of 7:00 of the target list 73 , and the like.
  • a sub-list 75 b is a list having arranged therein scenes, every two of which are to be interpolated between the scenes arranged in the sub-list 75 a .
  • thumbnail images of 4:45 and 5:15 to be interpolated between the thumbnail image of 4:30 and the thumbnail image of 5:30 of the sub-list 75 a are arranged.
  • the GPU 112 couples the sub-list 75 a to the target list 73 , whereby the number of list items of the target list 73 increases.
  • A case in which the number of list items of the target list 73 shown in FIG. 16 increases will be described with reference to FIG. 17.
  • FIG. 17 is a diagram showing a result obtained when the plurality of lists shown in FIG. 16 are coupled.
  • a target list 73 x shown in FIG. 17 is obtained by coupling the sub-list 75 a to the target list 73 shown in FIG. 16 .
  • the number of list items of the target list 73 x increases, so that thumbnail images of every 30 seconds are arranged.
  • the granularity of information becomes finer.
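  • The moving-image example can be reproduced in a few lines: the target list holds one thumbnail per minute, and each deeper sub-list holds the timestamps that interpolate between the coarser level. In the sketch below only the 60-second base interval comes from the description; the halving rule and the function names are illustrative assumptions.

```python
# Illustrative sketch of the thumbnail timestamps in FIGS. 16-18.
# Level 0 is the target list (every 60 s); each deeper level halves the
# interval and keeps only the timestamps not already present above it.

def level_timestamps(level, duration_s=480, base_interval_s=60):
    interval = base_interval_s // (2 ** level)
    all_ts = set(range(0, duration_s + 1, interval))
    if level == 0:
        return sorted(all_ts)
    coarser = set(range(0, duration_s + 1, interval * 2))
    return sorted(all_ts - coarser)


def fmt(seconds):
    """Format seconds as m:ss, e.g. 270 -> '4:30'."""
    return f"{seconds // 60}:{seconds % 60:02d}"


if __name__ == "__main__":
    for lvl, name in enumerate(("target list 73", "sub-list 75a", "sub-list 75b")):
        print(name, [fmt(t) for t in level_timestamps(lvl)][:6])
```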
  • FIG. 18 is a diagram showing a result obtained when the plurality of lists shown in FIG. 16 are separated.
  • a target list 73 y shown in FIG. 18 is obtained by separating a list from the target list 73 shown in FIG. 16 .
  • a sub-list 75 x shown in FIG. 18 is a sub-list that is newly created by arranging the list items separated from the target list 73 shown in FIG. 16 .
  • By the separation of the list, the number of list items of the target list 73 y decreases, so that thumbnail images of every two minutes are arranged.
  • the granularity of information becomes coarser.
  • FIGS. 19 to 21 exemplarily illustrate the types of information related to music data, as list items.
  • FIG. 19 is a diagram showing a screen 81 that displays a plurality of lists created on the basis of information related to music data.
  • a target list 83 is a list having arranged therein jacket images of music albums.
  • a sub-list 85 a is a list having arranged therein a list of names of music pieces on each music album.
  • a sub-list 85 b is a list having arranged therein the lyrics of each music piece.
  • the GPU 112 couples the sub-list 85 a to the target list 83 , whereby the number of list items of the target list 83 increases.
  • a case in which the number of list items of the target list 83 shown in FIG. 19 increases will be described with reference to FIG. 20 .
  • FIG. 20 is a diagram showing a result obtained when the plurality of lists shown in FIG. 19 are coupled.
  • a target list 83 x shown in FIG. 20 is obtained by coupling the sub-list 85 a to the target list 83 shown in FIG. 19 .
  • the number of list items of the target list 83 x is increased, and images of music jackets and lists of music pieces are arranged.
  • the types of information increase.
  • the GPU 112 may change the background for each piece of related information (each list item) in the target list 83 x.
  • FIG. 21 is a diagram showing a result obtained when the two sub-lists 85 shown in FIG. 19 are coupled to the target list.
  • a target list 83 y shown in FIG. 21 is obtained by coupling the sub-list 85 a and the sub-list 85 b to the target list 83 shown in FIG. 19 .
  • the number of list items of the target list 83 y is increased, and images of music jackets, lists of music pieces, and the lyrics of each music piece are arranged.
  • the types of information increase.
  • the background may also be changed for each related information (list item) in the target list 83 y .
  • the GPU 112 may also couple each of the sub-list 85 a and the sub-list 85 b to the target list 83 without changing the display size of each list item of the sub-list 85 a and the sub-list 85 b.
  • the types of information are not limited to the types of information related to music data given as an example in FIGS. 19 and 21 .
  • the types of information may be classified in a stepwise manner in accordance with the degree of enthusiasm of each scene of moving image data.
  • the control unit 11 analyzes moving image data in advance, and the GPU 112 creates a first list by arranging thumbnail images of a scene with the highest degree of enthusiasm. Then, the GPU 112 adds a scene with the next highest degree of enthusiasm to the first list through coupling.
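  • One way to read the enthusiasm-based example is that scenes are ranked by a precomputed enthusiasm score and revealed one band at a time as lists are coupled. The sketch below expresses that reading; the data shapes, the scoring, and the banding rule are illustrative assumptions.

```python
# Illustrative: couple scenes into the displayed list in descending order of
# a pre-computed "degree of enthusiasm" score.  Not taken from the patent text.

def lists_by_enthusiasm(scenes, levels):
    """scenes: list of (timestamp_s, enthusiasm_score) pairs.  Returns up to
    `levels` bands; band 0 holds the most enthusiastic scenes, and each
    coupling step adds the next band to the displayed list."""
    ranked = sorted(scenes, key=lambda s: s[1], reverse=True)
    per_level = max(1, len(ranked) // levels)
    return [ranked[i:i + per_level] for i in range(0, len(ranked), per_level)]


if __name__ == "__main__":
    scenes = [(0, 0.2), (60, 0.9), (120, 0.5), (180, 0.7), (240, 0.1), (300, 0.6)]
    for lvl, band in enumerate(lists_by_enthusiasm(scenes, levels=3)):
        print("level", lvl, band)
```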
  • the GPU 112 in accordance with this embodiment may also display a single list and couple/separate list items of the list in response to a user operation.
  • the GPU 112 increases the number of list items of the displayed single list in response to a user operation indicating coupling of lists.
  • the GPU 112 in response to a user operation indicating separation of a list, decreases the number of list items of the displayed single list, and creates a new sub-list by arranging the list items separated from the single list.
  • examples of a user operation indicating coupling/separation of list items of a single list include a pinch-out/in operation, a tap/double-tap operation, and a button/bar operation described in [2-3. User Operation].
  • the GPU 112 in response to a user operation indicating coupling of lists, causes new list items to gradually appear in the list to thereby increase the number of list items.
  • coupling of list items of a single list will be described with reference to FIGS. 22 and 23 .
  • FIG. 22 is a diagram showing a screen 91 that displays a target list 93 .
  • a pinch-out operation of touching list items A 2 and A 3 of the target list 93 with two fingers and widening the gap between the two touch positions with the screen being touched is detected.
  • the GPU 112 couples the list items of the target list 93 in accordance with the pinch-out operation.
  • the GPU 112 enlarges the list item made to appear between each list item of the target list 93 so that the size of the list item becomes similar to the size of each list item of the target list 93 , whereby coupling of the list items is completed.
  • the GPU 112 may, when an operation of narrowing the gap between the touch positions of the two fingers performing the pinch-out operation is detected, cancel the coupling of the list items and hide the new list items.
  • Since the GPU 112 causes new list items to gradually appear in the target list 93 in accordance with a pinch-out operation, the user is able to check in advance the list items to be coupled to the target list 93.
  • the user is also able to cancel the coupling of the list items before the completion thereof while checking information that is increased by the coupling.
  • the GPU 112 may, when an operation of widening the gap between the touch positions of the two fingers performing the pinch-in operation is detected, cancel the separation of the list items, and cause the list items, which have once disappeared, to appear again.
  • the aforementioned example has described that the number of list items is increased/decreased by coupling/separation of list items of a single list.
  • When the number of list items is increased by coupling of list items, some list items may be pushed out of the screen.
  • the GPU 112 in accordance with this embodiment may, when the number of list items of a target list is increased or decreased by coupling of list items, enlarge/shrink the display size of the list items to thereby perform list-display of displaying all list items within the screen.
  • Since the GPU 112 performs list-display, the user is able to check, within the screen, all list items that are increased by the coupling of the list items.
  • List-display of the GPU 112 will be described with reference to FIGS. 24 to 26 .
  • thumbnail images created from scenes of moving image data of every predetermined time period are displayed in a grid-list form as list items.
  • FIG. 24 is a diagram showing a screen 201 that list-displays a single list. As shown in a screen 201 in FIG. 24 , the GPU 112 arranges all thumbnail images of moving image data of every two minutes. An animation when list items are coupled in response to a user operation in this case will be described with reference to FIGS. 25 and 26 .
  • FIG. 26 is a diagram showing a screen 205 when coupling of list items is completed. As shown in the screen 205 in FIG. 26 , even when the number of list items is increased by the coupling of the list items, the GPU 112 displays all thumbnail images on a single screen.
  • the GPU 112 may also control list-display when coupling a plurality of lists.
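  • Keeping every thumbnail on a single screen implies shrinking the grid cells as the item count grows. A minimal way to compute such a cell size is sketched below; the screen dimensions and the square-cell layout are assumptions, not values from the disclosure.

```python
import math

# Illustrative: choose a grid-cell size so that all list items remain visible
# on one screen after coupling (cf. FIGS. 24-26).  Dimensions are assumptions.

def fit_grid(item_count, screen_w=1280, screen_h=720):
    """Return (columns, rows, cell_size_px) for square cells that keep every
    item on the screen at once."""
    cols = math.ceil(math.sqrt(item_count * screen_w / screen_h))
    rows = math.ceil(item_count / cols)
    cell = min(screen_w // cols, screen_h // rows)
    return cols, rows, cell


if __name__ == "__main__":
    for n in (8, 16, 32):   # item counts before and after coupling
        print(n, fit_grid(n))
```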
  • Although FIGS. 6 and 9 illustrate examples in which the GPU 112 performs display control of coupling/separating lists in accordance with the gap d between two fingers performing pinch-out/in operations, respectively, this embodiment is not limited thereto.
  • the GPU 112 may also couple/separate lists in accordance with, for example, the movement amount of a finger performing a touch operation or the movement speed of the finger.
  • An operation process performed when the GPU 112 couples/separates lists in accordance with the movement amount or the movement speed of a pinch-in/out operation will be described with reference to FIG. 27.
  • FIG. 27 is a flowchart showing a display control process of coupling/separating lists in accordance with a pinch-in/out operation.
  • First, the touch panel 16 detects two touches in step S 102 .
  • the touch panel 16 further detects a pinch-in/out operation on the basis of movements of the operation positions of the two touches in the next step S 104 .
  • Upon detecting a pinch-in/out operation, the touch panel 16 outputs the detection result to the control unit 11 .
  • In step S 106 , the GPU 112 of the control unit 11 , in accordance with the pinch-in/out operation detected by the touch panel 16 , starts an animation of coupling/separating lists displayed on the display 15 .
  • the GPU 112 moves the touched list item in accordance with the pinch-in/out operation.
  • In step S 108 , when the touch panel 16 detects an operation of touching two points, the process proceeds to step S 110 .
  • In step S 110 , the GPU 112 determines whether the movement amount of the pinch-in/out operation is greater than a threshold.
  • the movement amount of the pinch-in/out operation used herein is calculated as shown in FIG. 28 , for example.
  • FIG. 28 is a table exemplarily showing calculation of the movement amount and the movement speed of pinch-in/out operations.
  • the movement amount of each of pinch-in/out operations is the sum of the distance of movement of a list item from the initial state in the pinch-in/out operation.
  • the movement amount of each of pinch-in/out operations may be the sum of the distance of movement of each of the two touch positions of the pinch-in/out operation.
  • If the movement amount of the pinch-in/out operation exceeds the threshold in step S 110 , the process proceeds to step S 112 .
  • In step S 112 , the GPU 112 couples/separates lists. Note that if the movement amount exceeds a first threshold in step S 110 , coupling/separation of lists at one level is performed in step S 112 . For example, a single sub-list is coupled or a single sub-list is created through separation.
  • If the movement amount further exceeds another threshold in step S 110 , the GPU 112 performs coupling/separation of lists at another level.
  • the GPU 112 couples one more sub-list, or creates one more sub-list through separation.
  • the GPU 112 performs coupling/separation of lists at multiple levels in accordance with a pinch-in/out operation.
  • Next, a case in which the two touches are no longer detected in step S 108 will be described.
  • In this case, the process proceeds to step S 114 , and the GPU 112 determines whether the speed of the pinch-in/out operation exceeds a threshold.
  • the speed of a pinch-in/out operation used herein is the sum of the movement speed of each list item in the pinch-in/out operation as shown in FIG. 28 , for example.
  • In step S 116 , the GPU 112 performs coupling/separation of lists. Note that if the movement speed exceeds a first threshold in step S 114 , coupling/separation of lists at one level is performed in step S 116 . For example, a single sub-list is coupled, or a single sub-list is created through separation.
  • After step S 116 , the process returns to step S 114 , and if the movement speed is still not zero, the GPU 112 performs coupling/separation of lists at another level in step S 116 again.
  • the GPU 112 couples one more sub-list, or creates one more sub-list through separation.
  • the GPU 112 performs coupling/separation of lists at multiple levels in accordance with the speed of a list item moved in accordance with a pinch-in/out operation.
  • In step S 118 , the GPU 112 determines whether the movement amount of the pinch-in/out operation exceeds the threshold. If the movement amount exceeds the threshold, the process proceeds to step S 120 .
  • In step S 120 , the GPU 112 performs coupling/separation of lists.
  • the GPU 112 may also have a plurality of thresholds and may control the levels of coupling/separation of lists in accordance with which threshold the movement amount of the pinch-in/out operation has exceeded. For example, if the movement amount of a pinch-in/out operation exceeds a threshold a, the GPU 112 couples a sub-list, or creates a sub-list through separation. Meanwhile, if the movement amount of a pinch-in/out operation exceeds a threshold b, the GPU 112 couples two sub-lists, or creates two sub-lists through separation.
  • In step S 122 , the GPU 112 terminates the animation of coupling/separating lists.
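  • The threshold logic of steps S 110 to S 120 can be summarized as: the farther (or faster) the touch positions move, the more levels of coupling or separation are applied. The sketch below assumes two thresholds, corresponding to the thresholds a and b mentioned above; the pixel values and helper names are illustrative.

```python
# Illustrative summary of the multi-threshold logic in FIG. 27 / step S110.
# Threshold values are assumptions; the patent only names thresholds a and b.

THRESHOLD_A = 80.0    # movement amount (px) for coupling/separating one level
THRESHOLD_B = 160.0   # movement amount (px) for two levels

def levels_for_movement(movement_amount_px):
    """Return how many sub-list levels to couple (pinch-out) or separate
    (pinch-in) for a given total movement amount."""
    if movement_amount_px > THRESHOLD_B:
        return 2
    if movement_amount_px > THRESHOLD_A:
        return 1
    return 0


def movement_amount(touch_paths):
    """Sum of the distances moved by each touch position since the start of
    the pinch operation (cf. FIG. 28)."""
    total = 0.0
    for start, end in touch_paths:
        total += ((end[0] - start[0]) ** 2 + (end[1] - start[1]) ** 2) ** 0.5
    return total


if __name__ == "__main__":
    paths = [((100, 300), (40, 300)), ((220, 300), (300, 300))]
    amount = movement_amount(paths)
    print(amount, "->", levels_for_movement(amount), "level(s)")
```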
  • When a plurality of lists are displayed on a screen, a user is able to check in advance the list items of the sub-list 25 to be coupled to the target list 23 before the completion of the coupling.
  • With the information processing device 10 in accordance with this embodiment, even when a single list is displayed on the screen, if new list items are caused to gradually appear in the list in response to a user operation, the user is able to check in advance the list items to be added before the completion of the coupling.
  • the user is able to check in advance the list items to be removed from the target list 23 by the separation of the list items.
  • the user is also able to, while checking the list items that are increased or decreased by an animation of coupling or separating lists, cancel the coupling or separation of the lists before the completion thereof.
  • the GPU 112 can, by controlling all list items such that they are displayed within the screen, display more detailed information while securing the list properties of the list.
  • each list item of the sub-list 25 a to be coupled may be gradually enlarged substantially at the same time as the start timing of the movement of the list item toward the target list 23 , and the enlargement of the list item may be terminated substantially at the same time as the termination of insertion of the list item between each list item of the target list 23 .
  • FIG. 29 shows an example of changing an animation when lists are coupled in conjunction with a pinch-out operation as described above. As shown in a screen 44 in FIG.
  • each list may be switched or all lists may be moved in accordance with a user operation.
  • the information processing device 10 shown in FIG. 1 is implemented by a mobile terminal such as a smartphone or a portable audio player
  • the information processing device 10 in accordance with the present disclosure is not limited thereto.
  • the information processing device 10 may be implemented by a personal computer (PC), and an animation of coupling or separating lists in response to a user operation may be displayed on the GUI screen of the PC.
  • It is also possible to create a computer program for exerting a function that is equivalent to each configuration of the information processing device 10 in accordance with the aforementioned embodiment.
  • a recording medium having the computer program recorded thereon is also provided. Examples of the recording medium include a magnetic disk, an optical disc, a magneto-optical disc, and flash memory.
  • the aforementioned computer program may be distributed over a network without using a recording medium, for example.
  • the present technology may also be configured as below.
  • the display control unit is configured to, when the detection unit detects an operation of the user to widen a distance between a plurality of operation positions, perform display of coupling the second list to the first list.
  • the display control unit is configured to, when the detection unit detects an operation of the user to narrow a distance between a plurality of operation positions, perform display of separating the list items from the first list to create a new sub-list.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Telephone Function (AREA)
  • Processing Or Creating Images (AREA)
US13/554,397 2011-08-01 2012-07-20 Information processing device, information processing method, and program Abandoned US20130036387A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/697,188 US11042287B2 (en) 2011-08-01 2017-09-06 Information processing device, information processing method, and program for displaying of coupling and decoupling of lists
US17/335,489 US20210286512A1 (en) 2011-08-01 2021-06-01 Information processing device, information processing method, and program
US18/462,967 US20230418463A1 (en) 2011-08-01 2023-09-07 Information processing device, information processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-168262 2011-08-01
JP2011168262A JP2013033330A (ja) Information processing device, information processing method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/697,188 Continuation US11042287B2 (en) 2011-08-01 2017-09-06 Information processing device, information processing method, and program for displaying of coupling and decoupling of lists

Publications (1)

Publication Number Publication Date
US20130036387A1 true US20130036387A1 (en) 2013-02-07

Family

ID=46717711

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/554,397 Abandoned US20130036387A1 (en) 2011-08-01 2012-07-20 Information processing device, information processing method, and program
US15/697,188 Active 2032-11-20 US11042287B2 (en) 2011-08-01 2017-09-06 Information processing device, information processing method, and program for displaying of coupling and decoupling of lists
US17/335,489 Abandoned US20210286512A1 (en) 2011-08-01 2021-06-01 Information processing device, information processing method, and program
US18/462,967 Pending US20230418463A1 (en) 2011-08-01 2023-09-07 Information processing device, information processing method, and program

Family Applications After (3)

Application Number Title Priority Date Filing Date
US15/697,188 Active 2032-11-20 US11042287B2 (en) 2011-08-01 2017-09-06 Information processing device, information processing method, and program for displaying of coupling and decoupling of lists
US17/335,489 Abandoned US20210286512A1 (en) 2011-08-01 2021-06-01 Information processing device, information processing method, and program
US18/462,967 Pending US20230418463A1 (en) 2011-08-01 2023-09-07 Information processing device, information processing method, and program

Country Status (4)

Country Link
US (4) US20130036387A1 (ja)
EP (1) EP2555104A3 (ja)
JP (1) JP2013033330A (ja)
CN (2) CN108334262B (ja)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140317555A1 (en) * 2013-04-22 2014-10-23 Samsung Electronics Co., Ltd. Apparatus, method, and computer-readable recording medium for displaying shortcut icon window
US20150212711A1 (en) * 2014-01-28 2015-07-30 Adobe Systems Incorporated Spread-to-Duplicate and Pinch-to-Delete Gestures
US20150220255A1 (en) * 2012-08-20 2015-08-06 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and related program
KR20150102261A (ko) * 2014-02-28 2015-09-07 염광윤 터치스크린을 이용한 컨텐츠 편집방법
USD749114S1 (en) * 2014-05-21 2016-02-09 Sharp Kabushiki Kaisha Display of mobile information terminal with transitional graphical user interface
USD749090S1 (en) * 2013-06-05 2016-02-09 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD757739S1 (en) * 2013-06-05 2016-05-31 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD760254S1 (en) * 2013-06-05 2016-06-28 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
US20160196017A1 (en) * 2015-01-05 2016-07-07 Samsung Electronics Co., Ltd. Display apparatus and display method
US20160320959A1 (en) * 2014-01-15 2016-11-03 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Terminal Operation Apparatus and Terminal Operation Method
USD771084S1 (en) * 2013-06-05 2016-11-08 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD773488S1 (en) * 2012-09-04 2016-12-06 Huawei Technologies Co., Ltd. Display screen with graphical user interface for viewing and installing applications in an electronic mall
US20170031580A1 (en) * 2015-07-28 2017-02-02 Kyocera Corporation Electronic apparatus, non-transitory computer-readable recording medium, and display control method of electronic apparatus
USD779513S1 (en) * 2014-07-07 2017-02-21 Microsoft Corporation Display screen with graphical user interface
USD789392S1 (en) * 2015-02-20 2017-06-13 Google Inc. Portion of a display panel with a graphical user interface
USD803230S1 (en) * 2015-02-20 2017-11-21 Google Inc. Portion of a display panel with a graphical user interface
US20180085188A1 (en) * 2016-09-28 2018-03-29 Biolase, Inc. Laser control gui system and method
US10466897B2 (en) * 2014-05-16 2019-11-05 Lg Electronics Inc. Mobile terminal for using multimodal virtual keyboard and controlling method thereof
US10831332B2 (en) * 2017-02-23 2020-11-10 The Florida International University Board Of Trustees User interface element for building interior previewing and navigation
US11079933B2 (en) 2008-01-09 2021-08-03 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US11120220B2 (en) * 2014-05-30 2021-09-14 Apple Inc. Device, method, and graphical user interface for a predictive keyboard
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
US11416136B2 (en) 2020-09-14 2022-08-16 Apple Inc. User interfaces for assigning and responding to user inputs

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013033330A (ja) * 2011-08-01 2013-02-14 Sony Corp Information processing device, information processing method, and program
CN103399691A (zh) * 2013-08-07 2013-11-20 Zhejiang Yutian Technology Co., Ltd. Method and device for loading and displaying data
CN104506725A (zh) * 2014-12-19 2015-04-08 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method and device for viewing call records
CN105843594A (zh) * 2015-01-13 2016-08-10 Alibaba Group Holding Limited Method and device for presenting application pages on a mobile terminal
CN107430477B (zh) * 2015-03-27 2021-01-05 Google LLC Techniques for displaying layouts and transitional layouts of sets of content items in response to user touch input

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5519606A (en) * 1992-01-21 1996-05-21 Starfish Software, Inc. System and methods for appointment reconciliation
US5850538A (en) * 1997-04-23 1998-12-15 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Priority queues for computer simulations
US6421072B1 (en) * 1999-06-03 2002-07-16 International Business Machines Corporation Displaying a complex tree structure among multiple windows
US20030018646A1 (en) * 2001-07-18 2003-01-23 Hitachi, Ltd. Production and preprocessing system for data mining
US20030056180A1 (en) * 2001-09-14 2003-03-20 Yasuo Mori Document processing method and system
US20030228141A1 (en) * 1999-03-31 2003-12-11 Microsoft Corporation Locating information on an optical media disc to maximize the rate of transfer
US20040064441A1 (en) * 2002-09-27 2004-04-01 Tow Daniel S. Systems and methods for providing structured query language optimization
US6816174B2 (en) * 2000-12-18 2004-11-09 International Business Machines Corporation Method and apparatus for variable density scroll area
US20050076309A1 (en) * 2003-10-03 2005-04-07 Kevin Goldsmith Hierarchical in-place menus
US20060242122A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Systems, methods, and user interfaces for storing, searching, navigating, and retrieving electronic information
US20080040665A1 (en) * 2006-07-06 2008-02-14 Carsten Waldeck Method and system for displaying, locating and browsing data files
US20080294274A1 (en) * 2007-05-22 2008-11-27 Honeywell International Inc. Special purpose controller interface with breadcrumb navigation support
US20090177959A1 (en) * 2008-01-08 2009-07-09 Deepayan Chakrabarti Automatic visual segmentation of webpages
US20100088641A1 (en) * 2008-10-06 2010-04-08 Samsung Electronics Co., Ltd. Method and apparatus for managing lists using multi-touch
US20100162179A1 (en) * 2008-12-19 2010-06-24 Nokia Corporation Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement
US20110072394A1 (en) * 2009-09-22 2011-03-24 Victor B Michael Device, Method, and Graphical User Interface for Manipulating User Interface Objects
US20120166987A1 (en) * 2010-12-28 2012-06-28 Samsung Electronics Co., Ltd. Method for moving object between pages and interface apparatus
US20120324357A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Hierarchical, zoomable presentations of media sets
US20130179837A1 (en) * 2011-10-17 2013-07-11 Marcus Eriksson Electronic device interface
US8610673B2 (en) * 2008-12-03 2013-12-17 Microsoft Corporation Manipulation of list on a multi-touch display
US8650502B2 (en) * 2011-11-23 2014-02-11 International Business Machines Corporation Method for precise navigation of data

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3166455B2 (ja) * 1993-10-25 2001-05-14 Yamaha Corporation Performance data creation device
EP1351121A3 (en) * 2002-03-26 2009-10-21 Polymatech Co., Ltd. Input Device
US20050047629A1 (en) * 2003-08-25 2005-03-03 International Business Machines Corporation System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking
EP1591876A1 (en) * 2004-04-30 2005-11-02 Matsushita Electric Industrial Co., Ltd. Method of sorting elements in a list of a graphical user interface
US7603362B2 (en) * 2004-08-20 2009-10-13 Microsoft Corporation Ordered list management
JP4437548B2 (ja) * 2005-12-09 2010-03-24 Sony Corporation Music content display device, music content display method, and music content display program
US8683362B2 (en) * 2008-05-23 2014-03-25 Qualcomm Incorporated Card metaphor for activities in a computing device
US9507778B2 (en) * 2006-05-19 2016-11-29 Yahoo! Inc. Summarization of media object collections
JP2008046353A (ja) * 2006-08-16 2008-02-28 Sony Corp Table display method, information processing device, and table display program
JP2008112217A (ja) * 2006-10-27 2008-05-15 Star Micronics Co Ltd CAM device and synchronization setting method
JP2008134866A (ja) * 2006-11-29 2008-06-12 Sony Corp Content browsing method, content browsing device, and content browsing program
AU2006252191B2 (en) * 2006-12-21 2009-03-26 Canon Kabushiki Kaisha Scrolling Interface
US8681104B2 (en) * 2007-06-13 2014-03-25 Apple Inc. Pinch-throw and translation gestures
KR101320919B1 (ko) * 2008-01-29 2013-10-21 Samsung Electronics Co., Ltd. Method for providing a GUI through a split screen and multimedia device applying the same
JP4171770B1 (ja) * 2008-04-24 2008-10-29 Nintendo Co., Ltd. Object display order changing program and device
JP4666012B2 (ja) * 2008-06-20 2011-04-06 Sony Corporation Image processing device, image processing method, and program
JP5324143B2 (ja) * 2008-07-01 2013-10-23 Canon Inc. Display control device and display control method
US8279241B2 (en) * 2008-09-09 2012-10-02 Microsoft Corporation Zooming graphical user interface
JP5461030B2 (ja) * 2009-03-02 2014-04-02 Alpine Electronics, Inc. Input device
KR20100115547A (ko) * 2009-04-20 2010-10-28 SK Telecom Co., Ltd. Content zooming method, touch screen terminal therefor, and computer-readable recording medium
US8669945B2 (en) * 2009-05-07 2014-03-11 Microsoft Corporation Changing of list views on mobile device
JP5206587B2 (ja) * 2009-05-26 2013-06-12 Sony Corporation Editing device, editing method, and editing program
US8294710B2 (en) * 2009-06-02 2012-10-23 Microsoft Corporation Extensible map with pluggable modes
JP5493490B2 (ja) * 2009-06-16 2014-05-14 Sony Corporation Display control device, display control method, and display control program
JP4952747B2 (ja) * 2009-06-26 2012-06-13 Sony Corporation Content processing device, content processing method, and content processing program
US20110087999A1 (en) * 2009-09-30 2011-04-14 International Business Machines Corporation Set definition in data processing systems
JP2011168262A (ja) 2010-02-19 2011-09-01 Isao Yuasa Putty surfacing tool
CN101819500A (zh) * 2010-03-08 2010-09-01 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method for adjusting list row display and browsing on a handheld device
US8957920B2 (en) * 2010-06-25 2015-02-17 Microsoft Corporation Alternative semantics for zoom operations in a zoomable scene
JP2013033330A (ja) * 2011-08-01 2013-02-14 Sony Corp Information processing device, information processing method, and program
JP5849502B2 (ja) * 2011-08-01 2016-01-27 Sony Corporation Information processing device, information processing method, and program
US20130067420A1 (en) * 2011-09-09 2013-03-14 Theresa B. Pittappilly Semantic Zoom Gestures

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5519606A (en) * 1992-01-21 1996-05-21 Starfish Software, Inc. System and methods for appointment reconciliation
US5850538A (en) * 1997-04-23 1998-12-15 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Priority queues for computer simulations
US20030228141A1 (en) * 1999-03-31 2003-12-11 Microsoft Corporation Locating information on an optical media disc to maximize the rate of transfer
US6421072B1 (en) * 1999-06-03 2002-07-16 International Business Machines Corporation Displaying a complex tree structure among multiple windows
US6816174B2 (en) * 2000-12-18 2004-11-09 International Business Machines Corporation Method and apparatus for variable density scroll area
US20030018646A1 (en) * 2001-07-18 2003-01-23 Hitachi, Ltd. Production and preprocessing system for data mining
US20030056180A1 (en) * 2001-09-14 2003-03-20 Yasuo Mori Document processing method and system
US20040064441A1 (en) * 2002-09-27 2004-04-01 Tow Daniel S. Systems and methods for providing structured query language optimization
US20050076309A1 (en) * 2003-10-03 2005-04-07 Kevin Goldsmith Hierarchical in-place menus
US20060242122A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Systems, methods, and user interfaces for storing, searching, navigating, and retrieving electronic information
US20080040665A1 (en) * 2006-07-06 2008-02-14 Carsten Waldeck Method and system for displaying, locating and browsing data files
US20080294274A1 (en) * 2007-05-22 2008-11-27 Honeywell International Inc. Special purpose controller interface with breadcrumb navigation support
US20090177959A1 (en) * 2008-01-08 2009-07-09 Deepayan Chakrabarti Automatic visual segmentation of webpages
US20100088641A1 (en) * 2008-10-06 2010-04-08 Samsung Electronics Co., Ltd. Method and apparatus for managing lists using multi-touch
US8610673B2 (en) * 2008-12-03 2013-12-17 Microsoft Corporation Manipulation of list on a multi-touch display
US20100162179A1 (en) * 2008-12-19 2010-06-24 Nokia Corporation Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement
US20110072394A1 (en) * 2009-09-22 2011-03-24 Victor B Michael Device, Method, and Graphical User Interface for Manipulating User Interface Objects
US20120166987A1 (en) * 2010-12-28 2012-06-28 Samsung Electronics Co., Ltd. Method for moving object between pages and interface apparatus
US20120324357A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Hierarchical, zoomable presentations of media sets
US20130179837A1 (en) * 2011-10-17 2013-07-11 Marcus Eriksson Electronic device interface
US8650502B2 (en) * 2011-11-23 2014-02-11 International Business Machines Corporation Method for precise navigation of data

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11474695B2 (en) 2008-01-09 2022-10-18 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US11079933B2 (en) 2008-01-09 2021-08-03 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US20150220255A1 (en) * 2012-08-20 2015-08-06 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and related program
USD773488S1 (en) * 2012-09-04 2016-12-06 Huawei Technologies Co., Ltd. Display screen with graphical user interface for viewing and installing applications in an electronic mall
US10254915B2 (en) * 2013-04-22 2019-04-09 Samsung Electronics Co., Ltd Apparatus, method, and computer-readable recording medium for displaying shortcut icon window
US20140317555A1 (en) * 2013-04-22 2014-10-23 Samsung Electronics Co., Ltd. Apparatus, method, and computer-readable recording medium for displaying shortcut icon window
USD749090S1 (en) * 2013-06-05 2016-02-09 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD757739S1 (en) * 2013-06-05 2016-05-31 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD760254S1 (en) * 2013-06-05 2016-06-28 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
USD771084S1 (en) * 2013-06-05 2016-11-08 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphic user interface
US20160320959A1 (en) * 2014-01-15 2016-11-03 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Terminal Operation Apparatus and Terminal Operation Method
US9959026B2 (en) * 2014-01-28 2018-05-01 Adobe Systems Incorporated Spread-to-duplicate and pinch-to-delete gestures
US20150212711A1 (en) * 2014-01-28 2015-07-30 Adobe Systems Incorporated Spread-to-Duplicate and Pinch-to-Delete Gestures
KR101713287B1 (ko) 2014-02-28 2017-03-07 염광윤 Method for editing content using a touch screen
KR20150102261A (ko) * 2014-02-28 2015-09-07 염광윤 Method for editing content using a touch screen
US10466897B2 (en) * 2014-05-16 2019-11-05 Lg Electronics Inc. Mobile terminal for using multimodal virtual keyboard and controlling method thereof
USD749114S1 (en) * 2014-05-21 2016-02-09 Sharp Kabushiki Kaisha Display of mobile information terminal with transitional graphical user interface
US11120220B2 (en) * 2014-05-30 2021-09-14 Apple Inc. Device, method, and graphical user interface for a predictive keyboard
USD779513S1 (en) * 2014-07-07 2017-02-21 Microsoft Corporation Display screen with graphical user interface
US20160196017A1 (en) * 2015-01-05 2016-07-07 Samsung Electronics Co., Ltd. Display apparatus and display method
US10152205B2 (en) * 2015-01-05 2018-12-11 Samsung Electronics Co., Ltd. Display apparatus and display method
US11169662B2 (en) 2015-01-05 2021-11-09 Samsung Electronics Co., Ltd. Display apparatus and display method
USD789392S1 (en) * 2015-02-20 2017-06-13 Google Inc. Portion of a display panel with a graphical user interface
USD880492S1 (en) 2015-02-20 2020-04-07 Google Llc Portion of a display panel with a graphical user interface
USD881204S1 (en) 2015-02-20 2020-04-14 Google Llc Portion of a display panel with a graphical user interface
USD882584S1 (en) 2015-02-20 2020-04-28 Google Llc Portion of a display panel with a graphical user interface
USD882586S1 (en) 2015-02-20 2020-04-28 Google Llc Portion of a display panel with a graphical user interface
USD882587S1 (en) 2015-02-20 2020-04-28 Google Llc Portion of a display panel with a graphical user interface
USD882585S1 (en) 2015-02-20 2020-04-28 Google Llc Portion of a display panel with a graphical user interface
USD803230S1 (en) * 2015-02-20 2017-11-21 Google Inc. Portion of a display panel with a graphical user interface
US20170031580A1 (en) * 2015-07-28 2017-02-02 Kyocera Corporation Electronic apparatus, non-transitory computer-readable recording medium, and display control method of electronic apparatus
US20180085188A1 (en) * 2016-09-28 2018-03-29 Biolase, Inc. Laser control GUI system and method
US10831332B2 (en) * 2017-02-23 2020-11-10 The Florida International University Board Of Trustees User interface element for building interior previewing and navigation
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
US11620046B2 (en) 2019-06-01 2023-04-04 Apple Inc. Keyboard management user interfaces
US11842044B2 (en) 2019-06-01 2023-12-12 Apple Inc. Keyboard management user interfaces
US11416136B2 (en) 2020-09-14 2022-08-16 Apple Inc. User interfaces for assigning and responding to user inputs

Also Published As

Publication number Publication date
US20170371536A1 (en) 2017-12-28
US11042287B2 (en) 2021-06-22
CN108334262A (zh) 2018-07-27
CN103176703B (zh) 2018-04-13
CN103176703A (zh) 2013-06-26
EP2555104A2 (en) 2013-02-06
CN108334262B (zh) 2021-10-29
EP2555104A3 (en) 2016-01-27
US20210286512A1 (en) 2021-09-16
JP2013033330A (ja) 2013-02-14
US20230418463A1 (en) 2023-12-28

Similar Documents

Publication Publication Date Title
US20210286512A1 (en) Information processing device, information processing method, and program
US20200356257A1 (en) Information processing device, information processing method, and program
US10387016B2 (en) Method and terminal for displaying a plurality of pages, method and terminal for displaying a plurality of applications being executed on the terminal, and method of executing a plurality of applications
JP5942978B2 (ja) Information processing device, information processing method, and program
JP6494917B2 (ja) Display method of display device and display device
JP5970086B2 (ja) Touch screen hover input processing
US9405463B2 (en) Device and method for gesturally changing object attributes
EP1956474A2 (en) Method of displaying information by using touch input in mobile terminal
US10739953B2 (en) Apparatus and method for providing user interface
CN103294392A (zh) Method and device for editing a content view in a mobile device
JP5945157B2 (ja) Information processing device, method for controlling an information processing device, control program, and recording medium
JP6096100B2 (ja) Electronic device, control method, and control program
US20170205967A1 (en) Display and interaction method in a user interface
CN105468254A (zh) Content search device and method for searching content
JP6172251B2 (ja) Information processing device, information processing method, and program
CN102982045A (zh) Method and browser for moving quick links
JP2016184418A (ja) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MURATA, YU;REEL/FRAME:028600/0131

Effective date: 20120626

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION