JP2006091294A - Image display method and apparatus - Google Patents

Image display method and apparatus

Info

Publication number
JP2006091294A
Authority
JP
Japan
Prior art keywords
image
user
sensor
unit
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2004275303A
Other languages
Japanese (ja)
Inventor
Yasuhiro Mori (康浩 森)
Ichiro Okabayashi (一郎 岡林)
Original Assignee
Matsushita Electric Ind Co Ltd (松下電器産業株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Ind Co Ltd (松下電器産業株式会社)
Priority to JP2004275303A
Publication of JP2006091294A
Application status: Pending


Abstract

PROBLEM TO BE SOLVED: To provide a method effective for browsing a large number of digital photographs through intuitive operation.

SOLUTION: In the portable device 8, digital photographs are in principle displayed one at a time from a designated image group. A transition to the next image is triggered by the user tapping the portable device 8. For example, tapping the device on the left side makes the display image 9 move to the right and disappear (slide out), and a new image appears. Tapping the device on the front makes the display image 9 shrink and disappear (zoom out), and a new image appears. The speed of the movement varies with how the device is tapped. In addition, the user can instruct the content of a slide show by shaking the device 8 vertically or horizontally: shaking it quickly produces playback with intense visual effects, and shaking it for a long time mixes old and new images. A large number of digital photographs can thus be browsed and viewed through intuitive, touch-based operation.

COPYRIGHT: (C)2006,JPO&NCIPI

Description

  The present invention relates to a method and apparatus for displaying images, particularly digital photographs. It mainly concerns a portable AV information device that operates according to physical, tactile instructions and realizes a digital-photograph browsing function and a slide show function.

  Conventional portable information devices that operate according to physical instructions include devices that have a touch-panel interface and display a map (see, for example, Patent Document 1). FIG. 9 shows the display state of the conventional portable information device described in Patent Document 1.

  In FIG. 9, a map is displayed on the display unit 101 of the portable information device 100. The map scrolls as follows: when the finger 102 is pressed on the display unit 101 and moved in a desired direction, the map is scrolled by the distance moved in that direction. For example, moving the finger 102 from bottom to top moves the map up.

  To enlarge, the user presses the thumb and forefinger together on the display unit 101 and then spreads the two fingers apart; the map is then displayed enlarged.

As described above, the map can be scrolled and enlarged by a physical method using tactile sensation.
JP 2000-163444 A (page 5-6, FIGS. 6 and 8)

  However, when this conventional method is applied to browsing digital photographs, it is effective for examining a single photograph carefully, but that is not how digital photographs are generally viewed. The conventional method does not disclose an effective way to browse many photographs.

  The present invention therefore provides an image display method and apparatus that are effective for browsing a large number of images, particularly digital photographs, and that can be operated by intuitive, tactile instructions. It provides both a way to browse images and a way to present them as a slide show.

  In order to solve the conventional problems, the present invention provides an image display device comprising a sensor unit that senses a user's physical operation, an image storage unit that stores images, and an image generation unit that reads an image from the image storage unit, processes it according to the operation sensed by the sensor unit, and displays the result.

  The present invention further provides an image display device comprising a sensor unit that senses a user's physical operation, a history recording unit that accumulates the results sensed by the sensor unit, an image storage unit that stores image data, and an image generation unit that writes an image processing method into a scenario file according to the records of the history recording unit, then reads images from the image storage unit according to the scenario file, processes them, and displays them.

  The image display method and apparatus according to the present invention switch images, or display a plurality of images with motion, in response to a user's physical operations such as tapping and shaking. The displayed images are also selected, and motion applied to them, based on the history of the user's movement and biometric information. In this way, a large number of images can be browsed by intuitive, tactile operation.

  Embodiments of the present invention will be described below with reference to the drawings.

(Embodiment 1)
A portable device will be described as an embodiment of an image display device. FIG. 1 is a configuration diagram of the portable device according to Embodiment 1 of the present invention. FIG. 2 and FIG. 3 show how the portable device is operated and what it displays in this embodiment. In FIG. 1, reference numeral 1 denotes an image input unit, 2 an image storage unit, 3 a display image generation unit, 4 a menu unit, 5 a sensor unit, 6 an analysis unit, and 7 a display unit; these constitute the portable device 8. Reference numeral 9 denotes a display image shown on the display unit 7. Here, still-image digital photographs are described as the images.

  In FIG. 1, a digital photograph is input from the image input unit 1 and stored in the image storage unit 2. Input to the image input unit 1 may come from a camera or from outside (a network or the like). The image storage unit 2 is physically an HDD, CD, DVD, SD card, or the like; a recording medium such as a CD, DVD, or SD card is physically inserted.

  Next, display (reproduction) is performed as follows. The user instructs a basic display method from the menu unit 4, and also designates an image group to be displayed (not shown). The image group is specified as, for example, a folder, recent images, or a specific event. The analysis unit 6 sends an instruction to the display image generation unit 3 in response to inputs from the menu unit 4 and the sensor unit 5; the display image generation unit 3 reads the image from the image storage unit 2, processes it into a form for display, and sends it to the display unit 7, which displays it.

  The user operates the portable device 8 by giving it a physical, tactile stimulus. Here, the instruction is given by tapping. The sensor unit 5 therefore detects the strength, direction, and interval of taps from outside.
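The patent does not specify how the sensor unit 5 turns raw sensor readings into tap events. As a hedged illustration only, a tap could be classified from the peak 3-axis acceleration of an impact; the axis names, sign conventions, and threshold below are assumptions, not from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tap:
    direction: str   # "left", "right", "top", "bottom", "front", "back"
    strength: float  # peak acceleration magnitude (arbitrary units)

def classify_tap(ax: float, ay: float, az: float,
                 threshold: float = 1.0) -> Optional[Tap]:
    """Classify a tap from the peak 3-axis acceleration of an impact.

    A tap on the left face pushes the device toward +x, and so on: the
    axis with the largest spike gives the direction, and its magnitude
    the strength. Returns None below the detection threshold.
    """
    axes = {"x": ax, "y": ay, "z": az}
    name, peak = max(axes.items(), key=lambda kv: abs(kv[1]))
    if abs(peak) < threshold:
        return None
    direction = {("x", True): "left", ("x", False): "right",
                 ("y", True): "bottom", ("y", False): "top",
                 ("z", True): "front", ("z", False): "back"}[(name, peak > 0)]
    return Tap(direction, abs(peak))
```

The strength field would then drive the movement speed described below, with harder taps producing faster motion.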

  As a specific operation example, FIG. 2 shows the case where "single-image display" is selected. In single-image display, digital photographs are in principle displayed one at a time from the designated image group. The transition to the next image (a "transition" in editing terms) is triggered by the user tapping the portable device 8. An example is shown in FIG. 2. As shown in FIG. 2(a), tapping from the left side makes the display image 9 move to the right and disappear (slide out), and a new image appears. As shown in FIG. 2(b), tapping from the front makes the display image 9 shrink and disappear (zoom out), and a new image appears. The speed of the movement changes with how the device is tapped: a hard or fast tap moves the image quickly. For example, the following pairings of tap and motion can be considered. In this way, an intuitive operation through physical, tactile action is realized.

Tap from the left = slide out to the right
Tap from the right = slide out to the left
Tap from the top = slide out to the bottom
Tap from the bottom = slide out to the top
Tap from the front = zoom out or fade out
Tap from the back = zoom in, then disappear

This mechanism can also be used for editing. That is, image display and action (the operation of tapping the portable device 8) are repeated, and the image transitions are recorded to create a scenario file. By playing back along the scenario file, the images can then be viewed as edited.
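The tap-to-transition pairing and the scenario file can be sketched as follows. This is a minimal illustration under assumptions: the transition names, the list-of-dicts scenario format, and JSON serialization are all hypothetical, since the patent does not define a file format.

```python
import json

# Tap direction -> transition used when switching to the next image.
TRANSITIONS = {
    "left": "slide_out_right",   # tapped on the left, image exits right
    "right": "slide_out_left",
    "top": "slide_out_bottom",
    "bottom": "slide_out_top",
    "front": "zoom_out",
}

def record_scenario(taps, image_ids):
    """Pair each displayed image with the transition the user's tap chose."""
    return [{"image": img, "transition": TRANSITIONS[tap]}
            for img, tap in zip(image_ids, taps)]

scenario = record_scenario(["left", "front"], ["IMG001", "IMG002"])
scenario_file = json.dumps(scenario)  # saved, then replayed at playback
```

Playing back simply iterates the stored entries in order, reproducing each image with the transition that was recorded when it was first shown.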

  Next, FIG. 3 shows a case where “image group display” is selected.

  FIG. 3A shows the case where "pond" is selected. This example uses the metaphor of attracting frogs in a pond. A plurality of display images 9 drift about the display unit 7. When the user taps from the left side, the display images 9 gather at the tapped position. In the process, the images overlap and separate, so the user can see many images. The movement changes with how the device is tapped: for example, the harder the tap, the faster the images gather. After tapping stops and time elapses, the display may remain as it is, or the display images 9 may begin to drift about the display unit 7 again.
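One way the "pond" gathering motion could work is a per-frame update that pulls each drifting image toward the tapped point, with a harder tap closing the gap faster. This is a hedged sketch; the update rule and all numeric constants are illustrative assumptions, not from the patent.

```python
def gather_step(positions, tap_point, strength, dt=0.1):
    """One update step: pull each image position toward the tap point.

    positions: list of (x, y) image centers; tap_point: (x, y) of the tap;
    strength: tap strength, so a harder tap gathers the images faster.
    """
    tx, ty = tap_point
    new_positions = []
    for (x, y) in positions:
        # Move a fraction of the remaining distance toward the tap point.
        x += (tx - x) * strength * dt
        y += (ty - y) * strength * dt
        new_positions.append((x, y))
    return new_positions
```

Repeating this step each frame makes all images converge on the tapped position; when taps stop, the same loop could switch back to random drift.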

  Next, FIG. 3B shows the case where "mountain" is selected. This example uses the metaphor of a collapsing mountain. A plurality of display images 9 are displayed stacked on top of each other in the display unit 7. If the user taps from the right side, the mountain collapses toward the left, the overlapping images spread apart, and the user can see many images. The movement changes with how the device is tapped: the harder the tap, the more the pile collapses and the smaller the overlap becomes. After tapping stops and time elapses, the display may remain as it is, or the display images 9 may be stacked in the display unit 7 again.

  In both the cases of FIGS. 3A and 3B, an intuitive operation through physical action, that is, browsing and viewing a large number of digital photographs with a tactile UI, is realized.

  A dedicated sensor may be provided for the above, but a gyro sensor used for position information can also serve. The display image is not limited to a photograph and may be a CG-synthesized image.

(Embodiment 2)
FIG. 4 is a configuration diagram of the portable device according to Embodiment 2 of the present invention. In FIG. 4, reference numeral 10 denotes a history recording unit, 11 a music input unit, 12 an image feature extraction unit, 13 a music feature extraction unit, 14 a music storage unit, 15 a slide show generation unit, 20 an image packet, and 30 a music packet. The other blocks are the same as in FIG. 1. FIG. 5 is a configuration diagram of the image packet 20, which consists of an image ID 21, image sensitivity information 22, and image data 23. FIG. 6 is a configuration diagram of the music packet 30, which consists of a music ID 31, music sensitivity information 32, and music data 33. FIG. 7 is a sensitivity map, and FIG. 8 shows a display state. Here, still-image digital photographs are described as the images.

  In FIG. 4, a digital photograph is input from the image input unit 1, and its image sensitivity information 22 is extracted by the image feature extraction unit 12. It is then packetized into an image packet 20 and stored in the image storage unit 2. Input to the image input unit 1 may come from a camera or from outside (a network or the like). Music is input from the music input unit 11, and its music sensitivity information 32 is extracted by the music feature extraction unit 13. It is then packetized into a music packet 30 and stored in the music storage unit 14. Input to the music input unit 11 may come from a music player output or from outside (a network or the like).

  FIG. 5 shows the format of the image packet 20. The image ID 21 is a unique number of the image. The image sensitivity information 22 indicates the sensitivity information of the image as a numerical value. The image data 23 is the image itself. FIG. 6 shows the format of the music packet 30. As in the case of the image, the music ID 31 indicates the music unique number, the music sensitivity information 32 indicates the music sensitivity information by numerical values, and the music data 33 is the music itself.

  Here, the sensitivity information (image sensitivity information 22, music sensitivity information 32) will be described. FIG. 7 is the sensitivity map; its axes are defined as follows. The x-axis is intensity: the right side is dynamic, the left side static. The y-axis is warmth: the upper side is warm, the lower side cold. Each axis is quantified, and the absolute value of the score indicates the degree. Here, the maximum score is 50.

  An example of image sensitivity mapping is shown in FIG. 7. "Intensity = 40, Warmth = 20" in the image sensitivity information 22 shown in FIG. 5 indicates "very dynamic and somewhat warm".

Dynamic, warm = “flame”
Static, warm = “smile”
Static, cold = “snow scene”
Dynamic, cold = “waterfall”
An example of music sensitivity mapping is shown in FIG. 7. "Intensity = -40, Warmth = -40" in the music sensitivity information 32 shown in FIG. 6 indicates "very static and cold".

Dynamic, warm = “pops”
Static, warm = “singing”
Static, cold = “Enka”
Dynamic, cold = “rock”
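Since images and music share the same (intensity, warmth) map, one plausible use, sketched here under assumptions, is selecting the track whose sensitivity point lies nearest an image's point. The nearest-neighbor matching rule, the library contents, and the genre labels are illustrative, not from the patent.

```python
import math

def nearest_music(image_kansei, music_library):
    """Return the music ID whose (intensity, warmth) point is closest
    to the image's point on the sensitivity map (scores range -50..50).

    music_library: dict of music_id -> (intensity, warmth).
    """
    ix, iy = image_kansei
    return min(music_library,
               key=lambda mid: math.hypot(music_library[mid][0] - ix,
                                          music_library[mid][1] - iy))

# Hypothetical library, one track per quadrant of the map.
library = {"pops": (40, 40), "ballad": (-40, 40),
           "enka": (-40, -40), "rock": (40, -40)}
choice = nearest_music((40, 20), library)  # a very dynamic, somewhat warm image
```

The same distance measure could equally rank candidate images against a chosen style point, which is how the shake-driven selection below could be grounded.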
Next, display (reproduction) will be described. The display here is based on sequentially presenting a plurality of still images, a so-called slide show. Rather than simply switching still images in sequence, a visual effect is added at each switch; this is called a transition in editing terms, and includes slide-out, fade-out, fade-in, and so on. Furthermore, motion is added to each still image: camerawork-like movements such as zooming, panning, and tilting. Zooming in on a specific object such as a person can leave a strong impression on the viewing user. Music is also added to the image playback. In this way, browsing still images can feel like watching moving-image content.
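The camerawork-like motion applied to a still image can be pictured as a per-frame schedule of scale and center. The sketch below is a hedged illustration: the frame count, easing, scales, and the idea of targeting a region of interest (e.g. a detected face) are assumptions for illustration.

```python
def camerawork(roi, frames=60, start_scale=1.0, end_scale=1.3):
    """Yield (scale, center) per frame: a slow zoom from the full frame
    toward a region of interest, one simple camerawork-like motion.

    roi: (x, y) center of the region to zoom toward.
    """
    cx, cy = roi
    for i in range(frames):
        t = i / (frames - 1)                         # 0.0 -> 1.0 linearly
        scale = start_scale + (end_scale - start_scale) * t
        yield scale, (cx, cy)

schedule = list(camerawork(roi=(320, 180)))
# first frame is the full image; the last frame is zoomed in on the ROI
```

Panning and tilting would follow the same pattern, interpolating the center point instead of (or as well as) the scale.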

  Next, a user operation will be described. The user selects a menu in the menu unit 4. First, the case where "shuffle" is selected in FIG. 4 will be described. As shown in FIG. 8, the user instructs the content of the slide show by shaking the portable device 8 up and down and left and right. The content instruction here is not a detailed, precise editing operation like editing software on a personal computer, but an instruction that selects the atmosphere of how the slide show moves and which images are chosen. This atmosphere of movement is called a style. The style corresponds to the sensitivity map of FIG. 7 and has, for example, the following correspondences. In an intense style, intense visual effects, images, and music are selected.

Dynamic, warm = “ukiuki” (cheerful)
Static, warm = “heartwarming”
Static, cold = “samuzamu” (chilly)
Dynamic, cold = “katsu” (喝, a rousing shout)
The portable device 8 senses the shaking motion with the sensor unit 5, the analysis unit 6 analyzes the motion, and the slide show generation unit 15 generates the final video-like content.

  How the shaking, that is, the shuffling motion, maps to content instructions will now be described. The way of shaking is decomposed into stroke size, speed, and duration, with the following meanings.

(1) Stroke size - style (warmth)
The larger the shaking stroke, the warmer the style, images, and music selected. Conversely, a small shake selects cold ones.

(2) Shaking speed - style (intensity)
The faster the shake, the more intense the style, images, and music selected. Conversely, a gentle shake selects quiet ones.

(3) Shaking time - image selection
If the shaking time is short, recent images are used; if it is long, old and new images are mixed. The longer the shake, the further back old images are dug up.
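The three mappings (1) to (3) above can be sketched as one function. This is a hedged illustration: the patent gives no formulas, so the linear scalings, the clamping to the 0-50 score range, and the "lookback days" interpretation of shaking time are all assumptions.

```python
def shake_to_style(stroke_cm, speed_hz, duration_s):
    """Map a shake (stroke size, speed, duration) to slide show parameters.

    Bigger stroke -> warmer style; faster shake -> more intense style;
    longer shake -> images dug up from further back in time.
    """
    warmth = min(50, int(stroke_cm * 5))       # (1) stroke size -> warmth
    intensity = min(50, int(speed_hz * 10))    # (2) speed -> intensity
    lookback_days = int(duration_s * 30)       # (3) duration -> image age mix
    return {"warmth": warmth, "intensity": intensity,
            "lookback_days": lookback_days}

style = shake_to_style(stroke_cm=8, speed_hz=4, duration_s=10)
# -> warmth 40, intensity 40, mixing images from roughly 300 days back
```

The resulting (intensity, warmth) pair is then a point on the FIG. 7 sensitivity map, against which styles, images, and music can be selected.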

  In addition, variations may be added to the instruction content by the direction of shaking, a turning motion, or combinations of movements.

  Next, the case where action recording is selected will be described with reference to FIG. 4. Here, the user's daily activity takes the place of the shaking operation described above. That is, while the user carries the portable device 8, various information is sensed by the sensor unit 5 and recorded in the history recording unit 10. At playback, the analysis unit 6 analyzes the information in the history recording unit 10, and the slide show generation unit 15 generates the final video-like content.

  The various information here is a history of, for example, steps per day and distance moved, or biometric information such as pulse, blood pressure, and body temperature; together these form an action history. The day's activity is evaluated from them. If the numbers are high, the day is judged busy, and video content with an intense style, images, and music is generated and displayed. If the numbers are low, the day is judged quiet, and content with a quiet style, images, and music is generated and displayed.
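Evaluating the action history could be as simple as an activity score with a busy/quiet threshold. The weights, the threshold, and the two-way classification below are illustrative assumptions; the patent only says that high numbers yield intense content and low numbers quiet content.

```python
def day_style(steps, distance_km, mean_pulse):
    """Judge a day's action history as 'intense' or 'quiet'.

    Higher step count, distance, and pulse all push the score up;
    the weights and the threshold of 15 are arbitrary assumptions.
    """
    score = steps / 1000 + distance_km + (mean_pulse - 60) / 5
    return "intense" if score > 15 else "quiet"

busy = day_style(steps=12000, distance_km=6, mean_pulse=85)   # "intense"
calm = day_style(steps=2000, distance_km=1, mean_pulse=65)    # "quiet"
```

As the text notes below, the mapping could equally be inverted, generating quiet, healing content after a busy day, by swapping the two return values.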

  The relationship between the action record and the content is not limited to this; if the day is judged busy, quiet content may instead be generated for the sake of healing. Also, while a daily action record is assumed here, envisioning a use scene of looking back on the day at its end, the period need not be one day; it may be one hour or one week. As described above, a large number of digital photographs can be browsed and viewed by intuitive operation through physical, tactile action.

  This embodiment has described a portable terminal, but the invention is not limited to one; in a stationary configuration, the device may be divided into a main body and a sensor. For example, the main body may be a stationary device such as a digital TV or DVD recorder, and the sensor unit a portable terminal device. The images handled need not be limited to still images; they may be moving images, and they need not be taken by the individual but may be downloaded from a broadcast or the web.

  The portable device according to the present invention realizes editing and browsing of still images by intuitive, tactile operation. It is therefore effective as an image browsing function for digital cameras, PDAs, camera phones, and car navigation systems, and can also be used in stationary devices such as digital TVs and DVD recorders.

FIG. 1: Configuration diagram of the terminal equipment in Embodiment 1 of the present invention
FIG. 2: Diagram showing operation and display of the portable device in the embodiment
FIG. 3: Diagram showing operation and display of the portable device in the embodiment
FIG. 4: Configuration diagram of the portable device in Embodiment 2 of the present invention
FIG. 5: Configuration diagram of an image packet
FIG. 6: Configuration diagram of a music packet
FIG. 7: Sensitivity map
FIG. 8: Diagram showing the display state
FIG. 9: Diagram showing the display state of a conventional portable information device

Explanation of symbols

1 Image input unit
2 Image storage unit
3 Display image generation unit
4 Menu unit
5 Sensor unit
6 Analysis unit
7 Display unit
8 Portable device
9 Display image
10 History recording unit
11 Music input unit
12 Image feature extraction unit
13 Music feature extraction unit
14 Music storage unit
15 Slide show generation unit
20 Image packet
21 Image ID
22 Image sensitivity information
23 Image data
30 Music packet
31 Music ID
32 Music sensitivity information
33 Music data
100 Portable information device
101 Display unit
102 Finger

Claims (14)

  1. An image display device comprising:
    a sensor unit that senses a user's physical operation;
    an image storage unit that stores images; and
    an image generation unit that reads an image from the image storage unit, processes it according to the operation sensed by the sensor unit, and displays the processed image.
  2. The image display device according to claim 1, wherein the sensor unit senses a user's tapping operation, and the image generation unit switches the displayed image in response.
  3. The image display device according to claim 2, wherein, when the device is tapped in an up, down, left, or right direction, the image generation unit switches the image by sliding it in that direction or a combination of those directions, and, when the device is tapped in the front-rear direction, switches the displayed image by enlarging or reducing it.
  4. The image display device according to claim 1, wherein the sensor unit senses a user's tapping operation; the image generation unit writes the image switching method into a scenario file in response; and, after a scenario file has been generated by repeating these two operations, the image generation unit displays the images while switching them according to the scenario file.
  5. The image display device according to claim 1, wherein the image generation unit reads a plurality of images from the image storage unit to generate a composite image; the sensor unit senses a user's tapping operation; and the image generation unit displays the plurality of images in motion.
  6. The image display device according to claim 5, wherein the image generation unit displays the plurality of images drifting randomly in arbitrary directions; the sensor unit senses a user's tapping operation; and the image generation unit then displays the plurality of images gathering toward the tapped position.
  7. The image display device according to claim 5, wherein the image generation unit displays the plurality of images stacked on one another; the sensor unit senses a user's tapping operation; and the image generation unit then displays the plurality of images collapsing in the direction opposite the tapped side.
  8. An image display device comprising:
    a sensor unit that senses a user's physical operation;
    a history recording unit that accumulates the results sensed by the sensor unit;
    an image storage unit that stores image data; and
    an image generation unit that reads an image from the image storage unit, processes it, and displays it according to the records of the history recording unit.
  9. The image display device according to claim 8, wherein the sensor unit senses a user's shaking operation, and the image generation unit changes how the images are moved according to the stroke size and speed of the shaking, and changes how the images are selected according to the duration of the shaking.
  10. The image display device according to claim 8, wherein the sensor unit senses a user's movement or biometric information, and the image generation unit changes how images are moved and selected according to the history of the movement or biometric information.
  11. The image display device according to claim 1, wherein the sensor unit uses a gyro sensor.
  12. The image display device according to claim 10, wherein the sensor unit uses a gyro sensor and a pedometer to sense the user's movement, and uses a pulse sensor, blood pressure sensor, or body temperature sensor to sense the biometric information.
  13. An image display method comprising:
    a sensor step of sensing a user's physical operation;
    an image storing step of storing images; and
    a step of reading an image stored in the image storing step, processing it according to the operation sensed in the sensor step, and displaying the processed image.
  14. An image display method comprising:
    a sensor step of sensing a user's physical operation;
    a history recording step of accumulating the results sensed in the sensor step;
    an image storing step of storing image data; and
    a step of reading an image stored in the image storing step, processing it, and displaying it according to the records of the history recording step.
JP2004275303A 2004-09-22 2004-09-22 Image display method and apparatus Pending JP2006091294A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004275303A JP2006091294A (en) 2004-09-22 2004-09-22 Image display method and apparatus


Publications (1)

Publication Number Publication Date
JP2006091294A true JP2006091294A (en) 2006-04-06

Family

ID=36232367

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004275303A Pending JP2006091294A (en) 2004-09-22 2004-09-22 Image display method and apparatus

Country Status (1)

Country Link
JP (1) JP2006091294A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007018586A (en) * 2005-07-06 2007-01-25 Sony Corp Device and method for reproducing content data
JP4696734B2 (en) * 2005-07-06 2011-06-08 ソニー株式会社 Content data reproducing apparatus and content data reproducing method
WO2008032364A1 (en) * 2006-09-12 2008-03-20 Pioneer Corporation Control apparatus, control method, control program and computer readable recording medium
US8839106B2 (en) 2007-09-17 2014-09-16 Samsung Electronics Co., Ltd. Method for providing GUI and multimedia device using the same
KR101482080B1 (en) * 2007-09-17 2015-01-14 삼성전자주식회사 Method for providing GUI and multimedia device using the same
US8699857B2 (en) 2007-12-06 2014-04-15 Olympus Imaging Corp. Reproducer, digital camera, slide show reproduction method, program, image display apparatus, image display method, image reproduction method, and image display program
CN101453607A (en) * 2007-12-06 2009-06-10 奥林巴斯映像株式会社 Reproducer, digital camera, slide show reproduction method and program
US8334920B2 (en) 2009-03-27 2012-12-18 Olympus Imaging Corp. Image playback apparatus and image display control method
CN101848354A (en) * 2009-03-27 2010-09-29 奥林巴斯映像株式会社 Image playback apparatus and image display control method
US9088765B2 (en) 2009-03-27 2015-07-21 Olympus Imaging Corp. Image playback apparatus and image display control method
JP2013089194A (en) * 2011-10-21 2013-05-13 Konami Digital Entertainment Co Ltd Information processing device, control method for information processing device, and program
JP2014052907A (en) * 2012-09-07 2014-03-20 Toshiba Corp Electronic apparatus, display control method, and program
WO2014038233A1 (en) * 2012-09-07 2014-03-13 Kabushiki Kaisha Toshiba Electronic apparatus and display control method
JP2014082771A (en) * 2013-12-03 2014-05-08 Olympus Imaging Corp Image data transmitter, image data transmission method, and program
JP2015135623A (en) * 2014-01-17 2015-07-27 オリンパス株式会社 Display device, display method, and program
JP2017163208A (en) * 2016-03-07 2017-09-14 株式会社東芝 Monitoring system
