WO2015159498A1 - Method and apparatus for displaying additional objects on a graphical user interface based on pinch gesture


Info

Publication number
WO2015159498A1
Authority
WO
WIPO (PCT)
Prior art keywords
contents
image
adjacent
information processing
processing apparatus
Prior art date
Application number
PCT/JP2015/001916
Other languages
French (fr)
Inventor
Masahiro Takahashi
Ayaka Tamura
Satoshi Akagawa
Keiichi Yoshioka
Original Assignee
Sony Corporation
Priority date: 2014-04-14
Filing date: 2015-04-06
Publication date: 2015-10-22
Application filed by Sony Corporation
Publication of WO2015159498A1 publication Critical patent/WO2015159498A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

There is provided an information processing apparatus including a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape, an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other, and an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.

Description

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
The present disclosure relates to an information processing apparatus, an information processing method, and a program, and more particularly, to an information processing apparatus, an information processing method, and a program which are suitable for being used in a case in which a pinch operation can be detected.
<CROSS REFERENCE TO RELATED APPLICATIONS>
This application claims the benefit of Japanese Priority Patent Application JP 2014-082638 filed April 14, 2014, the entire contents of which are incorporated herein by reference.
In the related art, for example, in an information processing apparatus such as a smart phone or a tablet type computer, in which a touch panel is laminated on a display, a so-called pinch operation can be performed.
Here, the pinch operation indicates that the user touches the touch panel with two fingers and either widens the interval between the two fingers (hereinafter referred to as a pinch-out operation) or narrows the interval (hereinafter referred to as a pinch-in operation) while the fingers remain in contact with the touch panel (for example, refer to PTL 1).
For example, when the pinch-out operation is performed on a folder displayed on the display, the contents (lower hierarchy) of the folder are displayed. In addition, for example, when the pinch-out operation is performed on an image such as a photo displayed on the display, the image is enlarged by an amount corresponding to the widened interval between the fingers. In addition, when the pinch-in operation is performed on the image, the image is reduced by an amount corresponding to the narrowed interval between the fingers.
Japanese Unexamined Patent Application Publication No. 2012-203440
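For illustration only (the detection itself may be performed in any manner, for example as in PTL 1), the following minimal Python sketch, with assumed names and thresholds, shows how such a pinch operation can be classified and mapped to a zoom factor from the spacing of the two touch points.

```python
import math
from typing import Tuple

Point = Tuple[float, float]

def spacing(p: Point, q: Point) -> float:
    """Distance between the two touch points (the interval between the fingers)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_pinch(start: Tuple[Point, Point], end: Tuple[Point, Point],
                   threshold: float = 10.0) -> str:
    """Classify a two-finger gesture from the change in finger spacing.

    A widened interval is a pinch-out, a narrowed interval a pinch-in; the
    threshold (in pixels) is an arbitrary value chosen for this sketch.
    """
    delta = spacing(*end) - spacing(*start)
    if delta > threshold:
        return "pinch-out"
    if delta < -threshold:
        return "pinch-in"
    return "none"

def zoom_factor(start: Tuple[Point, Point], end: Tuple[Point, Point]) -> float:
    """Scale the displayed image in proportion to the widened or narrowed interval."""
    return spacing(*end) / max(spacing(*start), 1e-6)

# Fingers start 100 px apart and end 150 px apart: a pinch-out, enlarging 1.5 times.
start = ((100.0, 300.0), (200.0, 300.0))
end = ((75.0, 300.0), (225.0, 300.0))
print(classify_pinch(start, end), zoom_factor(start, end))
```

In practice the gesture events would come from the platform's touch-panel driver; the sketch only makes the relationship between finger spacing and enlargement or reduction explicit.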
As described above, in the related art, the pinch operation is used only to enlarge or reduce the displayed image.
The present disclosure has been made in consideration of such a situation, and it is desirable to propose new operations using a touch panel.
An information processing apparatus according to an embodiment of the present disclosure includes a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape, an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other, and an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
The operation detecting unit may detect a pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other.
The image addition unit may arrange icons as the new image between the contents images adjacent to each other in a case in which the operation is detected.
In a case in which the operation is detected, the image addition unit may display the icon so as to add a new image relating to at least one of the contents images which are adjacent to each other between the contents images adjacent to each other, and may add and arrange the new image according to an operation on the icon.
The icon may include at least one of a map icon instructing an addition and an arrangement of a map image, a searching icon instructing an addition and an arrangement of an image of a searching result of a searched word, an image insert icon instructing an addition and an arrangement of an existing image, and a photographing icon instructing taking a photo or a video.
When the pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, the operation detecting unit may detect on which one of the adjacent contents images a prior touch is performed, and in a case in which the pinch-out operation is detected, the image addition unit may add and arrange the new image relating to the primarily touched contents image between the adjacent contents images.
When the pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, the operation detecting unit may detect on which side of the adjacent contents images a flick operation is performed for a longer time, and in a case in which the pinch-out operation is detected, the image addition unit may add and arrange the new image relating to the contents image where the flick operation is performed for a longer time between the adjacent contents images.
The operation detecting unit may further detect the pinch-in operation performed between a plurality of the contents images which are arranged in a linear shape or a matrix shape, and in a case in which the pinch-in operation is detected, the image addition unit may further make an image displayed between the plurality of contents images on which the pinch-in operation is performed disappear.
The information processing apparatus according to the embodiment of the present disclosure may further include a display displaying the contents images, and a touch panel laminated on the display.
An information processing method of an information processing apparatus according to another embodiment of the present disclosure includes causing the information processing apparatus to arrange contents images corresponding to contents in a linear shape or a matrix shape, to detect an operation performed at the same time on contents images which are arranged in a linear shape or a matrix shape and adjacent to each other, and to add and arrange a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
A program according to still another embodiment of the present disclosure causes a computer to function as a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape, an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other, and an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
In the embodiments of the present disclosure, the contents images corresponding to the contents are arranged in a linear shape or a matrix shape, the operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, and a new image is added and arranged between the adjacent contents images in a case in which the operation is detected.
According to the embodiments of the present disclosure, it is possible to improve the operability for a user with respect to an information processing apparatus.
Figs. 1A to 1C are views illustrating display examples of a screen change according to a pinch operation of an information processing apparatus in the present embodiment.
Figs. 2A and 2B are views illustrating display examples of a state in which a camera application is started.
Figs. 3A to 3C are views illustrating display examples of a case in which the camera application is terminated.
Figs. 4A to 4C are views illustrating display examples of the screen change according to a flick operation with respect to a contents viewer application.
Figs. 5A to 5C are views illustrating display examples corresponding to a pinch operation with respect to the contents viewer application.
Figs. 6A to 6D are views illustrating display examples in a case in which an image insert icon is selected.
Figs. 7A and 7B are views describing a method for distinguishing a photo on which the pinch operation is performed.
Figs. 8A and 8B are views describing a method for distinguishing the photo on which the pinch operation is performed.
Figs. 9A to 9D are views illustrating other display examples of the screen change according to the pinch operation.
Fig. 10 is a block diagram illustrating a configuration example of the information processing apparatus.
Fig. 11 is a functional block diagram for realizing the series of operations.
Hereinafter, the best embodiment for carrying out the present disclosure (hereinafter, referred to as an embodiment) will be described.
<Outline of Embodiments>
An information processing apparatus in the present embodiment includes a display on which a touch panel is laminated; for example, it is assumed to be a smart phone. However, the application of the present disclosure is not limited to the smart phone.
Figs. 1A to 1C illustrate examples of a screen change according to the pinch operation on an information processing apparatus 10 of the present embodiment, an operation which is also used in the related art.
As illustrated in Fig. 1A, when a pinch-out operation by a user is detected in a state in which an image such as a photo is displayed on a display 11 of the information processing apparatus 10 on which the touch panel is laminated, the image is enlarged in accordance with the moving amount of the user's fingers, as illustrated in Fig. 1B. In addition, as illustrated in Fig. 1B, when a pinch-in operation performed on the image by the user is detected, the image is reduced in accordance with the moving amount of the user's fingers, as illustrated in Fig. 1C.
Next, Figs. 2A and 2B illustrate display examples of a state in which a camera application having a function for taking a photo or a video (hereinafter abbreviated as a camera app) is started in the information processing apparatus 10. A photo A, a video B, a photo C, and the like shown on the screen of the display are taken in alphabetical order; the same applies to the other drawings.
In Fig. 2A, a state in which the camera app is started is illustrated. A preview A is displayed on the display 11, and a shutter button 21 for instructing to take a photo and a video button 22 for instructing to start taking a video are displayed in the preview A.
In Fig. 2B, in a state in which a preview B is displayed, a pseudo arrangement of (a representative image of) the video B taken immediately before and the photo A taken before the video B is illustrated. However, the photo A and the video B illustrated in Fig. 2B are not visually recognized by the user in this state.
As illustrated in Fig. 2B, in the information processing apparatus 10, when photographing is performed by the camera app, the photos and the videos obtained as a result are pseudo-arranged in the long side direction of the information processing apparatus 10 in the order in which they were taken. In addition, a play button 23 for instructing to start reproduction is superimposed on the video.
In the state of Fig. 2B, when the user performs a flick operation on the display 11 downward in the long side direction of the information processing apparatus 10, the video B and the photo A are sequentially moved downward and displayed on the display 11.
Accordingly, when photos and videos are taken, the user can quickly confirm an earlier photographed result with a simple flick operation.
Next, Figs. 3A to 3C illustrate display examples of a case in which the camera app which is started in the information processing apparatus 10 is terminated.
As illustrated in Fig. 3A, when the flick operation is performed horizontally on the display 11 on which the preview is displayed, the displayed preview is moved in the direction of the flick operation and disappears, and the camera app is terminated, as illustrated in Fig. 3B. In addition, after the camera app is terminated, a photographing icon (described in detail with reference to Figs. 5A to 5C) or the like for restarting the camera app can be displayed by performing the flick operation upward from the lower end of the display 11, as illustrated in Fig. 3C.
Next, a state in which a contents viewer application (hereinafter abbreviated as a viewer app) for reproducing the contents stored in the information processing apparatus 10 is started and displays a list of the contents will be described with reference to Figs. 4A to 9D.
Here, the contents include still images, moving images, music data, game programs, and the like; the photos and videos taken by the camera app are also included in the contents. To each of the contents stored in the information processing apparatus 10, attribute information (photographing date and time, photographing region, photographer, copyright owner, or the like), such as Exif information in the case of still image data, is added. In addition, hereinafter, photos and videos are described as examples of the contents.
Figs. 4A to 4C illustrate, for the viewer app, a list of the photos and videos stored as contents in the information processing apparatus 10 and the screen change according to the flick operation on the list.
In the viewer app, among the contents stored in the information processing apparatus 10, related photos and (representative images of) videos are pseudo-arranged in a row in a predetermined order (the photographed order) in the long side direction of the information processing apparatus 10.
Here, as the relation among contents, for example, the same photographing date, the same photographing region, the same photographer, or the like can be designated by the user.
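As an illustration of how such a user-designated relation could be used to collect and order related contents from the attribute information, the following Python sketch operates on a small in-memory library; the Content fields and the sample data are assumptions made only for this example.

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable, List

@dataclass
class Content:
    """Illustrative stand-in for a stored photo or video and its attribute information."""
    path: str
    shot_date: date
    region: str
    photographer: str

def related_contents(library: List[Content],
                     relation: Callable[[Content], object],
                     reference: Content) -> List[Content]:
    """Collect contents sharing the user-designated attribute (photographing date,
    region, photographer, ...) with the reference item, in photographed order."""
    group = [c for c in library if relation(c) == relation(reference)]
    return sorted(group, key=lambda c: c.shot_date)

# Example: everything taken in the same region as the first item.
library = [
    Content("photo_A.jpg", date(2014, 4, 1), "Tokyo", "photographer_1"),
    Content("video_B.mp4", date(2014, 4, 1), "Tokyo", "photographer_2"),
    Content("photo_C.jpg", date(2014, 4, 2), "Osaka", "photographer_1"),
]
print([c.path for c in related_contents(library, lambda c: c.region, library[0])])
```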
In addition, hereinafter, a direction of the long side of the information processing apparatus 10 is referred to as a vertical direction, and a direction of the short side of the information processing apparatus 10 is referred to as a horizontal direction.
As illustrated in Fig. 4A, the photos and the video are arranged in a row in the vertical direction. When the user performs the flick operation toward the lower side of the vertical direction in a state in which the video B is displayed on the display 11, as illustrated in Fig. 4B, the photo A arranged on the upper side of the video B is moved downward and displayed on the display 11. In addition, when the user performs the flick operation toward the upper side of the vertical direction in the state illustrated in Fig. 4B, as illustrated in Fig. 4C, the photo C arranged on the lower side of the video B is moved upward and displayed on the display 11.
Accordingly, with a simple flick operation, the user can quickly confirm the list of related contents among the contents stored in the information processing apparatus 10.
Next, Figs. 5A to 5C illustrate, for the viewer app, the list of the contents stored in the information processing apparatus 10 and an example of the screen change corresponding to a pinch operation on the list.
As illustrated in Fig. 5A, in a state in which the photo A and the photo B are displayed on the display 11, when the user touches the photo A and the photo B and performs the pinch-out operation, the photo A is moved up and the photo B is moved down. As illustrated in Fig. 5B, a map icon 31, a search icon 32, an image insert icon 33, and a photographing icon 34 appear between the photo A and the photo B. The map icon 31 to the photographing icon 34 that have appeared can be made to disappear from the screen when the user touches the photo A and the photo B and performs the pinch-in operation.
On the map icon 31 to the photographing icon 34, characters may be written, or illustrations or signs may be drawn, as illustrated in the drawings.
Instead of the pinch-out operation, the user may perform a double tap operation (touching twice in quick succession) on the photo A and the photo B.
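Conceptually, the screen change of Figs. 5A and 5B amounts to opening a slot between two adjacent items in the list model and placing the icon row there, and the pinch-in closes the slot again. The following sketch, with placeholder item and icon names, illustrates that list edit only; it is not the disclosed implementation.

```python
from typing import List

ICON_ROW = ["map_icon_31", "search_icon_32", "image_insert_icon_33", "photographing_icon_34"]

def insert_icon_row(timeline: List[object], upper: object, lower: object) -> List[object]:
    """On a pinch-out (or double tap) spanning two adjacent items, open a slot
    between them and place the icon row there."""
    i = timeline.index(upper)
    if timeline[i + 1] != lower:
        raise ValueError("icons are only offered between adjacent contents images")
    return timeline[:i + 1] + [ICON_ROW] + timeline[i + 1:]

def remove_icon_row(timeline: List[object]) -> List[object]:
    """On a pinch-in over the same pair, make the icon row disappear again."""
    return [item for item in timeline if item is not ICON_ROW]

timeline = ["photo_A", "photo_B", "photo_C"]
timeline = insert_icon_row(timeline, "photo_A", "photo_B")
print(timeline)                 # photo_A, the icon row, photo_B, photo_C
print(remove_icon_row(timeline))
```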
In a case in which the map icon 31 that has appeared is selected by the user, the photographing region is specified with reference to the attribute information of the photo A (or the photo B), and a map image indicating the photographing region is disposed between the photo A and the photo B, as illustrated in Fig. 5C. In addition, the map image may be disposed so as to be superimposed on the photo A (or the photo B).
In addition, in a case in which the information processing apparatus 10 has a positional information acquiring function, such as reception of a GPS signal, a map image indicating the current region may be disposed between the photo A and the photo B, in addition to the map image of the photographing region of the photo A (or the photo B).
In a case in which the search icon 32 that has appeared is selected by the user, a noun included in the attribute information of the photo A (or the photo B), such as a place name, a store name, or a personal name, is extracted and designated as a search word, and a search is performed using a search engine on the Internet. The search result is then disposed between the photo A and the photo B. In addition, the user can also input an arbitrary search word.
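As an illustration of this search flow, the sketch below extracts noun-like fields from the attribute information and hands them to a search endpoint; the field names and the URL are placeholders and not part of the disclosure.

```python
from urllib.parse import urlencode

def search_words_from_attributes(attributes: dict) -> list:
    """Pick noun-like fields of the attribute information (place name, store name,
    personal name, ...) as candidate search words; the keys are placeholders."""
    keys = ("place_name", "store_name", "personal_name")
    return [attributes[k] for k in keys if attributes.get(k)]

def search_url(words: list) -> str:
    """Build a query for some web search engine; the endpoint is only an example."""
    return "https://www.example.com/search?" + urlencode({"q": " ".join(words)})

attributes = {"place_name": "Shibuya", "personal_name": "photographer_1"}
print(search_url(search_words_from_attributes(attributes)))
```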
In a case in which the image insert icon 33 that has appeared is selected by the user, a list of the photos related to the photo A among the contents stored in the information processing apparatus 10 is displayed in a row in the horizontal direction between the photo A and the photo B, with reference to the attribute information of the photo A (or the photo B). The photo selected by the user from the list is disposed between the photo A and the photo B.
Figs. 6A to 6D illustrate display examples in a case in which the image insert icon 33 is selected. In these drawings, an example in which the attribute information of the photo B is referred to is described.
When the image insert icon 33 is selected, as illustrated in Fig. 6A, a list (photo Ba, photo Bb, photo Bc, ...) of photos and the like related to the photo B among the contents stored in the information processing apparatus 10 (for example, a photographing time within a predetermined interval, a photographing region within a predetermined range, or the like) is displayed in a row in the horizontal direction between the photo A and the photo B. When the user performs the flick operation in the horizontal direction on the displayed list, the list of photos arranged in the horizontal direction is moved in the direction in which the flick operation is performed.
When the user selects one photo (for example, the photo Bd) from the list, as illustrated in Fig. 6B, the selected photo Bd is inserted between the photo A and the photo B at the same size as the photo A and the photo B.
By performing the pinch-out operation on the newly inserted photo Bd, the user can enlarge the photo Bd as illustrated in Fig. 6B, and by performing the pinch-in operation, can reduce it (not illustrated). Further, as illustrated in Fig. 6C, when the flick operation is performed in an arbitrary direction on the photo Bd, the displayed region of the photo Bd can be moved as illustrated in Fig. 6D. That is, by performing the pinch-out operation and the flick operation on the newly inserted photo Bd, an arbitrary region of the photo Bd can be trimmed.
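This trimming behaviour can be modelled as a scale set by pinch operations plus an offset set by flick operations, which together determine the visible region of the inserted photo. The following sketch is illustrative only, and the numbers are arbitrary.

```python
from dataclasses import dataclass

@dataclass
class TrimState:
    """Displayed region of the newly inserted photo: a scale set by pinch
    operations and an offset set by flick operations (illustrative model)."""
    scale: float = 1.0
    offset_x: float = 0.0
    offset_y: float = 0.0

    def pinch(self, factor: float) -> None:
        """Pinch-out (factor > 1) enlarges, pinch-in (factor < 1) reduces."""
        self.scale = max(self.scale * factor, 1.0)

    def flick(self, dx: float, dy: float) -> None:
        """A flick in an arbitrary direction moves the displayed region."""
        self.offset_x += dx
        self.offset_y += dy

    def trim_rect(self, frame_w: int, frame_h: int) -> tuple:
        """Rectangle of the source photo that remains visible in the fixed frame,
        that is, the region that is effectively trimmed."""
        w, h = frame_w / self.scale, frame_h / self.scale
        return (self.offset_x, self.offset_y, self.offset_x + w, self.offset_y + h)

state = TrimState()
state.pinch(1.5)        # enlarge with a pinch-out operation
state.flick(40.0, 0.0)  # move the visible area with a flick operation
print(state.trim_rect(1080, 720))
```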
Returning to Figs. 5A to 5C, in a case in which the photographing icon 34 that has appeared is selected by the user, the camera app is started, and the photos or (representative images of) the videos taken by the camera app are disposed between the photo A and the photo B.
Moreover, regarding which attribute information, that of the photo A or that of the photo B, is referred to in a case in which the map icon 31 to the image insert icon 33 are selected by the user, it is preferable to determine in advance whether the photo on the upper side or the photo on the lower side is used.
Otherwise, as illustrated in Figs. 7A and 7B, at the time of performing the pinch operation, the attribute information of the photo with which a finger first comes into contact may be referred to.
Moreover, as illustrated in Figs. 8A and 8B, at the time of performing the pinch operation, the attribute information of the photo on which the moving amount of the finger is larger may be referred to.
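The two ways of choosing the reference photo (Figs. 7A and 7B: the photo touched first; Figs. 8A and 8B: the photo on which the finger moves more) can be sketched as follows. The field and function names are illustrative assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class TouchTrack:
    """Per-finger data gathered while the pinch operation is in progress."""
    photo_id: str           # the adjacent photo this finger started on
    touch_down_time: float  # when the finger came into contact (seconds)
    travel: float           # how far the finger has moved (pixels)

def photo_touched_first(a: TouchTrack, b: TouchTrack) -> str:
    """Variant of Figs. 7A and 7B: refer to the photo contacted first."""
    return a.photo_id if a.touch_down_time <= b.touch_down_time else b.photo_id

def photo_moved_more(a: TouchTrack, b: TouchTrack) -> str:
    """Variant of Figs. 8A and 8B: refer to the photo whose finger moved more."""
    return a.photo_id if a.travel >= b.travel else b.photo_id

upper = TouchTrack("photo_A", touch_down_time=0.00, travel=35.0)
lower = TouchTrack("photo_B", touch_down_time=0.12, travel=80.0)
print(photo_touched_first(upper, lower), photo_moved_more(upper, lower))
```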
Next, Figs. 9A to 9D illustrate the list of the contents stored in the information processing apparatus 10 and another example of the screen change corresponding to the pinch operation by the viewer app.
In the example of Figs. 9A to 9D, as illustrated in Fig. 9A, when the pinch-out operation is performed while touching the photo A and the photo B, instead of the map icon 31 to the photographing icon 34 appearing as illustrated in Figs. 5A to 5C, the photo A is moved up and the photo B is moved down as illustrated in Fig. 9B, and (images of) the contents related to the photo A (or the photo B) are newly inserted therebetween. In a case in which the moving amount of the fingers during the pinch-out operation is large, as illustrated in Fig. 9C, the number of contents to be inserted may be increased according to the moving amount, as illustrated in Fig. 9D.
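One way to realize the variant of Figs. 9C and 9D is to map the moving amount of the fingers to the number of related contents inserted, for example as in the sketch below; the step size and the maximum are assumptions for illustration.

```python
def contents_to_insert(finger_travel_px: float,
                       per_item_px: float = 120.0,
                       maximum: int = 10) -> int:
    """Map the moving amount of the fingers during the pinch-out operation to the
    number of related contents inserted; the step size and cap are illustrative."""
    return max(1, min(maximum, int(finger_travel_px // per_item_px) + 1))

# A short pinch-out inserts a single related content, a wide one inserts several.
print(contents_to_insert(80.0), contents_to_insert(500.0))
```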
<Configuration Example of Information Processing Apparatus>
Next, Fig. 10 illustrates a configuration example of the hardware of the information processing apparatus 10 for realizing the above-described operations.
In the information processing apparatus 10, a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, and a RAM (Random Access Memory) 103 are mutually connected to each other by a bus 104.
Further, an input and output interface 105 is connected to the bus 104. An input unit 106, an output unit 107, a memory unit 108, a communication unit 109, and a drive 110 are connected to the input and output interface 105.
The input unit 106 is configured of the touch panel, press buttons, a microphone, and the like. The output unit 107 is configured of the display, a speaker, and the like. In addition, the touch panel serving as the input unit 106 is laminated on the display serving as the output unit 107. The memory unit 108 is configured of a nonvolatile memory or the like. The communication unit 109 is a network interface that communicates data through a cellular phone communication network, a LAN, or the like. The drive 110 drives removable media 111 such as a semiconductor memory.
In the information processing apparatus 10 described above, the CPU 101, for example, loads a program stored in the memory unit 108 into the RAM 103 through the input and output interface 105 and the bus 104 and executes the program, so that the series of processes described above is performed.
The programs of the various applications executed by the CPU 101 are, for example, downloaded through the Internet, or can be installed in a state of being stored in the removable media 111 such as package media.
Next, Fig. 11 illustrates a configuration example of the functional blocks, related to the screen change according to the operations on the touch panel described above, which are realized by the CPU 101 of the information processing apparatus 10 executing the camera app, the viewer app, or the like.
A contents display unit 121 displays the contents, such as photos and videos, and the list thereof. An icon display unit 122 displays the map icon 31 to the photographing icon 34 of Figs. 5A to 5C. An operation detecting unit 123 detects various operations (a touch operation, a flick operation, a pinch operation, or the like) on the touch panel laminated on the display.
That is, in the information processing apparatus 10, an operation on the touch panel is detected and determined by the operation detecting unit 123, and the screen change according to the above-described operation is realized by driving the contents display unit 121 and the icon display unit 122 accordingly.
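The cooperation of the functional blocks of Fig. 11 can be sketched as follows; the class and method names are illustrative stand-ins for the blocks 121 to 123 and do not reflect any actual implementation.

```python
class ContentsDisplayUnit:
    """Counterpart of block 121: displays contents and the list thereof."""
    def show_list(self, contents):
        print("displaying:", contents)

class IconDisplayUnit:
    """Counterpart of block 122: displays the map icon 31 to the photographing icon 34."""
    def show_icons(self, between):
        print("icons shown between", between)

class OperationDetectingUnit:
    """Counterpart of block 123: turns touch-panel events into gestures and drives
    the two display units accordingly."""
    def __init__(self, contents_display, icon_display):
        self.contents_display = contents_display
        self.icon_display = icon_display

    def on_gesture(self, gesture, target):
        if gesture == "pinch-out":
            self.icon_display.show_icons(target)      # open the icon row
        elif gesture == "flick":
            self.contents_display.show_list(target)   # scroll the list

detector = OperationDetectingUnit(ContentsDisplayUnit(), IconDisplayUnit())
detector.on_gesture("pinch-out", ("photo_A", "photo_B"))
```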
In addition, in the present embodiment, the photos and the video are arranged in a row in the vertical direction; however, they may be arranged in a row in the horizontal direction, or may be arranged in both the vertical and horizontal directions in a matrix shape.
In addition, in the present embodiment, the screen change according to operations on the touch panel has been described for the camera app and the viewer app; however, an arbitrary application, such as an application used at the time of purchasing contents, can employ a similar screen change according to similar operations.
In addition, the embodiment of the present disclosure is not limited to the above described embodiment but can be modified variously in a range in which a gist of the present disclosure is not changed.
The present disclosure can also adopt configurations as described hereinafter.
(1)
An information processing apparatus including: a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape; an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other; and an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
(2)
The information processing apparatus according to (1), in which the operation detecting unit detects a pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other.
(3)
The information processing apparatus according to (1) or (2), in which the image addition unit arranges icons as the new image between the contents images adjacent to each other in a case in which the operation is detected.
(4)
The information processing apparatus according to (3), in which, in a case in which the operation is detected, the image addition unit displays the icon so as to add a new image relating to at least one of the contents images which are adjacent to each other between the contents images adjacent to each other, and adds and arranges the new image according to an operation on the icon.
(5)
The information processing apparatus according to (3) or (4), in which the icon includes at least one of a map icon instructing an addition and an arrangement of a map image, a searching icon instructing an addition and an arrangement of an image of a searching result of a searched word, an image insert icon instructing an addition and an arrangement of an existing image, and a photographing icon instructing taking a photo or a video.
(6)
The information processing apparatus according to any one of (1) to (5), in which, when the pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, the operation detecting unit detects on which one of the adjacent contents images a prior touch is performed, and in a case in which the pinch-out operation is detected, the image addition unit adds and arranges the new image relating to the primarily touched contents image between the adjacent contents images.
(7)
The information processing apparatus according to any one of (1) to (5), in which, when the pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, the operation detecting unit detects on which side of the adjacent contents images a flick operation is performed for a longer time, and in a case in which the pinch-out operation is detected, the image addition unit adds and arranges the new image relating to the contents image where the flick operation is performed for a longer time between the adjacent contents images.
(8)
The information processing apparatus according to any one of (1) to (7), in which the operation detecting unit further detects the pinch-in operation performed between a plurality of the contents images which are arranged in a linear shape or a matrix shape, and in a case in which the pinch-in operation is detected, the image addition unit further makes an image displayed between the plurality of contents images on which the pinch-in operation is performed disappear.
(9)
The information processing apparatus according to any one of (1) to (8), further including a display displaying the contents images, and a touch panel laminated on the display.
(10)
An information processing method of an information processing apparatus including causing the information processing apparatus to arrange contents images corresponding to contents in a linear shape or a matrix shape, to detect an operation performed at the same time on contents images which are arranged in a linear shape or a matrix shape and adjacent to each other, and to add and arrange a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
(11)
A program causing a computer to function as a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape, an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other, and an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
10 Information processing apparatus
11 Display
101 CPU
121 Contents display unit
122 Icon display unit
123 Operation detecting unit

Claims (11)

  1. An information processing apparatus comprising:
    a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape;
    an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other; and
    an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
  2. The information processing apparatus according to Claim 1, wherein the operation detecting unit detects a pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other.
  3. The information processing apparatus according to Claim 2, wherein the image addition unit arranges icons as the new image between the contents images adjacent to each other in a case in which the operation is detected.
  4. The information processing apparatus according to Claim 3, wherein, in a case in which the operation is detected, the image addition unit displays the icon so as to add a new image relating to at least one of the contents images which are adjacent to each other between the contents images adjacent to each other, and adds and arranges the new image according to an operation on the icon.
  5. The information processing apparatus according to Claim 4, wherein the icon includes at least one of
    a map icon instructing an addition and an arrangement of a map image,
    a searching icon instructing an addition and an arrangement of an image of a searching result of a searched word,
    an image insert icon instructing an addition and an arrangement of an existing image, and
    a photographing icon instructing taking a photo or a video.
  6. The information processing apparatus according to Claim 2,
    wherein, when the pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, the operation detecting unit detects on which one of the adjacent contents images a prior touch is performed, and
    wherein, in a case in which the pinch-out operation is detected, the image addition unit adds and arranges the new image relating to the primarily touched contents image between the adjacent contents images.
  7. The information processing apparatus according to Claim 2,
    wherein, when the pinch-out operation performed between the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other is detected, the operation detecting unit detects on which side of the adjacent contents images a flick operation is performed for a longer time, and
    wherein, in a case in which the pinch-out operation is detected, the image addition unit adds and arranges the new image relating to the contents image where the flick operation is performed for a longer time between the adjacent contents images.
  8. The information processing apparatus according to Claim 2,
wherein the operation detecting unit further detects the pinch-in operation performed between a plurality of the contents images which are arranged in a linear shape or a matrix shape, and
    wherein, in a case in which the pinch-in operation is detected, the image addition unit further makes an image displayed between the plurality of contents images on which the pinch-in operation is performed disappear.
  9. The information processing apparatus according to Claim 2, further comprising: a display displaying the contents images; and
    a touch panel laminated on the display.
  10. An information processing method of an information processing apparatus comprising:
    causing the information processing apparatus,
    to arrange contents images corresponding to contents in a linear shape or a matrix shape;
    to detect an operation performed at the same time on contents images which are arranged in a linear shape or a matrix shape and adjacent to each other; and
    to add and arrange a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
  11. A program causing a computer to function as:
    a contents display unit that arranges contents images corresponding to contents in a linear shape or a matrix shape;
    an operation detecting unit that detects an operation performed at the same time on the contents images which are arranged in a linear shape or a matrix shape and adjacent to each other; and
    an image addition unit that adds and arranges a new image between the contents images which are adjacent to each other, in a case in which the operation is detected.
PCT/JP2015/001916 2014-04-14 2015-04-06 Method and apparatus for displaying additional objects on a graphical user interface based on pinch gesture WO2015159498A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014082638 2014-04-14
JP2014-082638 2014-04-14

Publications (1)

Publication Number Publication Date
WO2015159498A1 (en) 2015-10-22

Family

ID=53039545

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/001916 WO2015159498A1 (en) 2014-04-14 2015-04-06 Method and apparatus for displaying additional objects on a graphical user interface based on pinch gesture

Country Status (1)

Country Link
WO (1) WO2015159498A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100162179A1 (en) * 2008-12-19 2010-06-24 Nokia Corporation Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement
US20100299599A1 (en) * 2009-05-19 2010-11-25 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list
JP2012203440A (en) 2011-03-23 2012-10-22 Sony Corp Information processor, information processing method, and program
EP2708996A1 (en) * 2011-05-13 2014-03-19 NTT DoCoMo, Inc. Display device, user interface method, and program
US20130346882A1 (en) * 2012-06-26 2013-12-26 Google Inc. Prioritized management and presentation of notifications

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106648399A (en) * 2015-11-02 2017-05-10 广州市动景计算机科技有限公司 Interface display processing method, apparatus and device

Similar Documents

Publication Publication Date Title
US10725734B2 (en) Voice input apparatus
US9942486B2 (en) Identifying dominant and non-dominant images in a burst mode capture
EP3093755B1 (en) Mobile terminal and control method thereof
US10082943B2 (en) Scrolling method of mobile terminal and apparatus for performing the same
JP5925024B2 (en) Display control apparatus, display control method, and program
US8341543B2 (en) Method and apparatus of scrolling a document displayed in a browser window
KR20110071708A (en) Method and apparatus for searching contents in touch screen device
KR101335325B1 (en) Display control apparatus and display control method
US9582172B2 (en) Display control apparatus and method, image display apparatus, and non-transitory computer readable medium
CN110286977B (en) Display method and related product
JP6170241B2 (en) Character identification device and control program
EP3001294B1 (en) Mobile terminal and method for controlling the same
WO2019155853A1 (en) Electronic album device, and operation method and operation program thereof
US10497079B2 (en) Electronic device and method for managing image
US20140028720A1 (en) Display Controller, Display Control Method And Computer-Readable Medium
WO2015159498A1 (en) Method and apparatus for displaying additional objects on a graphical user interface based on pinch gesture
WO2015163140A1 (en) Display device and display control program
JP5813703B2 (en) Image display method and system
US10795537B2 (en) Display device and method therefor
KR101260662B1 (en) Apparatus and Method for displaying history of application
US10558356B2 (en) Display control device and non-transitory computer-readable storage medium having program recorded thereon
KR102464590B1 (en) Display apparauts and control method thereof
JP6409294B2 (en) Information processing apparatus, system, method, and program
CN111782113B (en) Display method, display device and computer-readable storage medium
US20160035117A1 (en) Image display apparatus, image display method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15719837; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
NENP Non-entry into the national phase (Ref country code: JP)
122 Ep: pct application non-entry in european phase (Ref document number: 15719837; Country of ref document: EP; Kind code of ref document: A1)