AU2012268806A1 - Method, apparatus and system for displaying images - Google Patents

Method, apparatus and system for displaying images


Publication number
AU2012268806A1
Authority
AU
Australia
Prior art keywords
images
image
display
scrolling
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2012268806A
Inventor
Mark Ronald Tainsh
Ij Eric Wang
Jie Xu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2012268806A priority Critical patent/AU2012268806A1/en
Publication of AU2012268806A1 publication Critical patent/AU2012268806A1/en

Abstract

METHOD, APPARATUS AND SYSTEM FOR DISPLAYING IMAGES

A method of displaying a collection of images on a display is disclosed. One or more of the images are displayed on the display, the displayed images being ordered according to a predetermined attribute. The displayed images are scrolled on the display in response to a scrolling action. One or more groups of images are displayed, each group representing a subset of the images in the collection. The images of a subset are related based on a common value of the predetermined attribute. The common value is determined based on the scrolling action.

6979256v1 (P050158 Speci As Filed)

[Fig. 3A (from the drawings as filed, 6979582v1): flowchart of method 300 — display an image collection in a gallery view (step 310); wait for input sensory data; when input sensory data is received, check for a scrolling signal (step 320); adjust the image group view (step 340) if it is activated, activate it (step 350) otherwise, or deactivate the image group view on a deactivating signal.]

Description

METHOD, APPARATUS AND SYSTEM FOR DISPLAYING IMAGES

TECHNICAL FIELD

The present invention relates to a method for efficient navigation of images on mobile devices and, in particular, to a method, apparatus and system for displaying images on a display. The present invention also relates to a computer program product including a computer readable medium having recorded thereon a computer program for displaying images on a display.

BACKGROUND

The advent of digital photography has altered the behaviour of people in capturing images, in the sense that it is more and more convenient to capture images using mobile electronic devices, such as digital cameras and smart phones. Meanwhile, the development of massive storage media has made it possible for people to store a large number of images on storage devices. Many of the images are stored as one or more image collections on the mobile electronic devices.

After images are captured, the images may be transferred from a mobile electronic device, such as a camera, to external storage devices such as hard drives and Compact Disc Read-Only Memories (CD-ROMs). The images may be transferred using various communication technologies. Users may then browse the transferred images by connecting the storage devices to personal computers. Alternatively, users may also browse the images on Liquid Crystal Display (LCD) or OLED (Organic Light-Emitting Diode) display panels on the mobile devices. However, such mobile device display panels usually have limited screen size, and therefore images are often displayed in compact views for browsing. One method of displaying images of an image collection is known as "gallery view". In the gallery view method, reduced-size or thumbnail images of an image collection are displayed within an interface in a vertical or horizontal scrolling list.
The gallery view method allows a user to navigate through an underlying collection of images using gesture control, where the gesture control can be manifested either directly on the display panel or indirectly through a separate mechanical control. Displaying images in accordance with the gallery view method is becoming increasingly prevalent on mobile devices for image browsing. However, when navigating through a large image collection, a user is presented with a long list of thumbnail images. The user may become disoriented or confused when navigating through an extremely large image collection, particularly when he or she desires to locate specific images.

SUMMARY

It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements.

Disclosed are arrangements which augment mobile electronic devices, such as smart phones and digital cameras, with display controls that enable efficient navigation through large collections of images.

According to one aspect of the present disclosure there is provided a method of displaying a collection of images on a display, said method comprising: displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; scrolling the displayed images on the display in response to a scrolling action; and displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.
According to another aspect of the present disclosure there is provided a system for displaying a collection of images on a display, said system comprising: a memory for storing data and a computer program; a processor coupled to the memory for executing the computer program, the computer program comprising instructions for: displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; scrolling the displayed images on the display in response to a scrolling action; and displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.

According to still another aspect of the present disclosure there is provided an apparatus for displaying a collection of images on a display, said apparatus comprising: means for displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; means for scrolling the displayed images on the display in response to a scrolling action; and means for displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.
According to still another aspect of the present disclosure there is provided a computer readable medium having recorded thereon a computer program for displaying a collection of images on a display, said program comprising: code for displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; code for scrolling the displayed images on the display in response to a scrolling action; and code for displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.

Other aspects of the invention are also disclosed.

BRIEF DESCRIPTION OF THE DRAWINGS

One or more embodiments of the invention will now be described with reference to the following drawings, in which:

Figs. 1A and 1B collectively form a schematic block diagram representation of an electronic device on which described arrangements may be practised;

Fig. 2 shows a graphical user interface displaying images in accordance with the gallery view method;

Fig. 3A is a schematic flow diagram showing a method of displaying a collection of images;

Fig. 3B is a schematic flow diagram showing a method of adjusting an image group viewport;

Fig. 4A shows the graphical user interface of Fig. 2 comprising an image group viewport with image groups;

Fig. 4B shows the graphical user interface of Fig. 4A with the image group viewport displaying adjusted image groups; and

Fig. 4C shows the graphical user interface of Fig. 4A with an expanded image group viewport.
DETAILED DESCRIPTION INCLUDING BEST MODE

Where reference is made in any one or more of the accompanying drawings to steps and/or features which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.

Figs. 1A and 1B collectively form a schematic block diagram of a general purpose electronic device 101 including embedded components, upon which methods to be described are desirably practiced. The electronic device 101 may be, for example, a smartphone, a digital camera or a portable media player, in which processing resources are limited. Nevertheless, the methods to be described may also be performed on higher-level devices such as desktop computers, server computers, and other such devices with significantly larger processing resources.

As seen in Fig. 1A, the electronic device 101 is an apparatus comprising an embedded controller 102. Accordingly, the device 101 may be referred to as an "embedded device". In the present example, the controller 102 has a processing unit (or processor) 105 which is bi-directionally coupled to an internal storage module 109. The storage module 109 may be formed from non-volatile semiconductor read only memory (ROM) 160 and semiconductor random access memory (RAM) 170, as seen in Fig. 1B. The RAM 170 may be volatile, non-volatile or a combination of volatile and non-volatile memory.

The electronic device 101 includes a display controller 107, which is connected to a display 114. The display 114 is a liquid crystal display (LCD) or OLED (organic light-emitting diode) display panel or the like. The display controller 107 is configured for displaying graphical images (e.g., video images) on the display 114 in accordance with instructions received from the embedded controller 102, to which the display controller 107 is connected.
The display 114 may also be used for browsing captured images, as described below. The electronic device 101 also includes user input devices 113, which are typically formed by keys, a keypad, mouse, trackball or like controls. In some implementations, the user input devices 113 may include a touch sensitive panel physically associated with the display 114 to collectively form a multi-touch screen. Such a touch-screen may thus operate as one form of graphical user interface (GUI), as opposed to a prompt or menu driven GUI typically used with keypad-display combinations. Other forms of user input devices may also be used, such as a microphone (not illustrated) for voice commands or a joystick/thumb wheel (not illustrated) for ease of navigation about menus.

As seen in Fig. 1A, the electronic device 101 also comprises a portable memory interface 106, which is coupled to the processor 105 via a connection 119. The portable memory interface 106 allows a complementary portable memory device 125 to be coupled to the electronic device 101 to act as a source or destination of data or to supplement the internal storage module 109. The portable memory device 125 may be, for example, an external hard drive. Examples of such interfaces permit coupling with portable memory devices such as Universal Serial Bus (USB) memory devices, Secure Digital (SD) cards, Personal Computer Memory Card International Association (PCMCIA) cards, optical disks and magnetic disks.

The electronic device 101 also has a communications interface 108 to permit coupling of the device 101 to a computer or communications network 120 via a connection 121. The connection 121 may be wired or wireless. For example, the connection 121 may be radio frequency or optical. An example of a wired connection includes Ethernet.
Further, examples of wireless connections include Bluetooth™ type local interconnection, Wi-Fi (including protocols based on the standards of the IEEE 802.11 family), Infrared Data Association (IrDA) and the like.

Typically, the electronic device 101 is configured to perform some special function. The embedded controller 102, possibly in conjunction with further special function components 110, is provided to perform that special function. As described here, the device 101 is an image capturing device such as a digital camera. In this instance, the components 110 may represent a camera lens, a CCD/CMOS sensor that converts light signals to electronic signals, and focus control. The processor 105 may be used for manipulating captured digital images. Alternatively, the components 110 may comprise an imaging processor which is responsible for manipulating the captured digital images. The electronic device 101 may also comprise an input/output interface 190 for coupling to peripheral devices including, for example, a High Definition TV (HDTV) 191.

As another example, the device 101 may be a smartphone. In this instance, the components 110 may also represent those components required for communications in a cellular telephone or Wi-Fi environment.

As described here, the device 101 is a mobile (or portable) device and the special function components 110 may also represent a number of encoders and decoders of a type including Joint Photographic Experts Group (JPEG), Moving Picture Experts Group (MPEG), MPEG-1 Audio Layer 3 (MP3), and the like.

The methods described hereinafter may be implemented using the embedded controller 102, where the processes of Figs. 2 to 4C may be implemented as one or more software application programs 133 executable within the embedded controller 102. The electronic device 101 of Fig. 1A implements the described methods. In particular, with reference to Fig.
1B, the steps of the described methods are effected by instructions in the software 133 that are carried out within the controller 102. The software instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the described methods, and a second part and the corresponding code modules manage a user interface between the first part and the user.

The software 133 of the embedded controller 102 is typically stored in the non-volatile ROM 160 of the internal storage module 109. The software 133 stored in the ROM 160 can be updated when required from a computer readable medium. The software 133 can be loaded into and executed by the processor 105. In some instances, the processor 105 may execute software instructions that are located in RAM 170. Software instructions may be loaded into the RAM 170 by the processor 105 initiating a copy of one or more code modules from ROM 160 into RAM 170. Alternatively, the software instructions of one or more code modules may be pre-installed in a non-volatile region of RAM 170 by a manufacturer. After one or more code modules have been located in RAM 170, the processor 105 may execute software instructions of the one or more code modules.

The application program 133 is typically pre-installed and stored in the ROM 160 by a manufacturer, prior to distribution of the electronic device 101. However, in some instances, the application programs 133 may be supplied to the user encoded on one or more CD-ROMs (not shown) and read via the portable memory interface 106 of Fig. 1A prior to storage in the internal storage module 109 or in the portable memory 125.
In another alternative, the software application program 133 may be read by the processor 105 from the network 120, or loaded into the controller 102 or the portable storage medium 125 from other computer readable media. Computer readable storage media refers to any non-transitory tangible storage medium that participates in providing instructions and/or data to the controller 102 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, flash memory, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external to the device 101. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the device 101 include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like. A computer readable medium having such software or computer program recorded thereon is a computer program product.

The second part of the application programs 133 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 114 of Fig. 1A. Through manipulation of the user input device 113 (e.g., the keypad), a user of the device 101 and the application programs 133 may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s).
Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via loudspeakers (not illustrated) and user voice commands input via the microphone (not illustrated).

Fig. 1B illustrates in detail the embedded controller 102 having the processor 105 for executing the application programs 133 and the internal storage 109. The internal storage 109 comprises read only memory (ROM) 160 and random access memory (RAM) 170. The processor 105 is able to execute the application programs 133 stored in one or both of the connected memories 160 and 170. When the electronic device 101 is initially powered up, a system program resident in the ROM 160 is executed. The application program 133 permanently stored in the ROM 160 is sometimes referred to as "firmware". Execution of the firmware by the processor 105 may fulfil various functions, including processor management, memory management, device management, storage management and user interface.

The processor 105 typically includes a number of functional modules including a control unit (CU) 151, an arithmetic logic unit (ALU) 152 and a local or internal memory comprising a set of registers 154, which typically contain atomic data elements 156, 157, along with internal buffer or cache memory 155. One or more internal buses 159 interconnect these functional modules. The processor 105 typically also has one or more interfaces 158 for communicating with external devices via system bus 181, using a connection 161.

The application program 133 includes a sequence of instructions 162 through 163 that may include conditional branch and loop instructions. The program 133 may also include data, which is used in execution of the program 133. This data may be stored as part of the instruction or in a separate location 164 within the ROM 160 or RAM 170.

In general, the processor 105 is given a set of instructions, which are executed therein.
This set of instructions may be organised into blocks, which perform specific tasks or handle specific events that occur in the electronic device 101. Typically, the application program 133 waits for events and subsequently executes the block of code associated with that event. Events may be triggered in response to input from a user, via the user input devices 113 of Fig. 1A, as detected by the processor 105. Events may also be triggered in response to other sensors and interfaces in the electronic device 101.

The execution of a set of the instructions may require numeric variables to be read and modified. Such numeric variables are stored in the RAM 170. The disclosed method uses input variables 171 that are stored in known locations 172, 173 in the memory 170. The input variables 171 are processed to produce output variables 177 that are stored in known locations 178, 179 in the memory 170. Intermediate variables 174 may be stored in additional memory locations 175, 176 of the memory 170. Alternatively, some intermediate variables may only exist in the registers 154 of the processor 105.

The execution of a sequence of instructions is achieved in the processor 105 by repeated application of a fetch-execute cycle. The control unit 151 of the processor 105 maintains a register called the program counter, which contains the address in ROM 160 or RAM 170 of the next instruction to be executed. At the start of the fetch-execute cycle, the contents of the memory address indexed by the program counter are loaded into the control unit 151. The instruction thus loaded controls the subsequent operation of the processor 105, causing, for example, data to be loaded from ROM memory 160 into processor registers 154, the contents of a register to be arithmetically combined with the contents of another register, the contents of a register to be written to the location stored in another register, and so on.
At the end of the fetch-execute cycle the program counter is updated to point to the next instruction in the system program code. Depending on the instruction just executed, this may involve incrementing the address contained in the program counter or loading the program counter with a new address in order to achieve a branch operation.

Each step or sub-process in the processes of the methods described below is associated with one or more segments of the application program 133, and is performed by repeated execution of a fetch-execute cycle in the processor 105 or similar programmatic operation of other independent processor blocks in the electronic device 101.

As described above, the display 114 is a liquid crystal display (LCD) panel or OLED (organic light-emitting diode) display or the like. As with many similar electronic devices such as image capturing devices or smartphones, the display 114 of the device 101 has limited screen size. Accordingly, the device 101 may be configured to display images in compact views such as the gallery view method discussed above. The gallery view method displays images of an image collection in a vertical or horizontal scrolling list. The images displayed are typically reduced-size versions, commonly known as thumbnails. A user of the device 101 may navigate through the images by interacting with the gallery view using the user input devices 113.

Fig. 2 shows one example of a graphical user interface 235 which may be displayed on the display 114 in order to navigate images (e.g., 201) of an image collection referred to as a gallery layer 205. As seen in Fig. 2, the graphical user interface 235 comprises a rectangular viewing region or gallery viewport 220 where a number of images (e.g., 201) are visible to the user. In the example of Fig.
2, and in accordance with the gallery view method, the gallery layer 205 can be thought of as being ordered on a large canvas based on a predetermined attribute selected by either the application program 133 or the user, and some images 200 of the underlying gallery layer 205 become visible within the gallery viewport 220 at any point in time as a user navigates through the images of the gallery layer 205. The predetermined attribute refers to any information that can be used to describe an image (e.g., 201) in the image collection 205, such as capturing context (e.g., time, location, event), image type (e.g., landscape, portrait), technical settings (e.g., aperture, shutter speed), main subject (e.g., animals, children, parents, and nature), number of people, colours of the image, for example. The predetermined attribute may also refer to social data associated with the image, such as number of views, comments, or rating. For example, images may be ordered by date, week, or month if the predetermined attribute is capturing time. Alternatively, images may be ordered by city, state, or country if the predetermined attribute is capturing location.

As shown in Fig. 2, the image collection 205 may have more images than can be fitted into the gallery viewport 220. The visible images 200 of the gallery layer 205, displayed within the gallery viewport 220, are displayed as square-cropped thumbnail images and are typically arranged in a grid layout. Alternatively, the images may be displayed in some other form such as data items.

The graphical user interface 235 also typically displays a navigation control which may, for example, be a scrollbar 225 comprising an upward button 210, downward button 215 and a scroll box 230 to allow the user to navigate either upwards or downwards through the images of the gallery layer 205.
The size of the scroll box 230 is typically used to represent the proportion of the gallery layer 205 that the visible images 200 in the gallery viewport 220 represent.

Alternatively, the user may also navigate through the gallery layer 205 by interacting with the gallery viewport 220 using gestures, such as scrolling, flicking, panning, etc. For example, an upward scroll is equivalent to touching the upward button 210. Sensory input will be generated from one of the user input devices 113 for the gestures. Examples of the user input device may include, but are not limited to, the mouse, the trackball, the scrolling wheel, the dial or the multi-touch screen.

Although the gallery view method enables image browsing on display devices with limited screen size, for large collections of images the user needs to keep scrolling in order to locate his or her desired images. Although the scroll box 230 provides an indication of relative location information, the user can benefit from better navigation guidance.

A speed-dependent user interface control that enables navigation of collections of images is described here. In one arrangement, a sensory input is received from one of the user input devices 113. Scrolling speed information from the sensory input may be used to control the display of images in order to provide navigation guidance. The navigation guidance may be provided by presentation of groups of images (or "image groups"), with each group representing a subset of the images in the gallery layer 205. The images clustered within an image group are related based on a common value of a predetermined attribute. For example, an image group may contain photos captured on the same day. Such navigation guidance may be referred to as "group view".

As shown in Fig. 4A, in one arrangement, scrolling speed may be used to activate display of the image groups in order to provide the navigation guidance.
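The clustering of images by a common attribute value, at a granularity dependent on scrolling speed, can be sketched in code. The sketch below is an illustration only and is not part of the original disclosure: it assumes the predetermined attribute is capturing time, uses the concept-hierarchy levels (day, week, month) from the ordering example above, and the speed thresholds, image records and function names are invented for the example.

```python
from collections import OrderedDict
from datetime import date

def attribute_value(capture_date: date, level: str) -> str:
    """Common value of the capturing-time attribute at a hierarchy level."""
    if level == "day":
        return capture_date.isoformat()
    if level == "week":
        year, week, _ = capture_date.isocalendar()
        return f"{year}-W{week:02d}"
    return f"{capture_date.year}-{capture_date.month:02d}"  # "month"

def granularity(scroll_speed: float) -> str:
    # The faster the user scrolls, the coarser the image groups become.
    # Thresholds (pixels per second) are arbitrary, for illustration only.
    if scroll_speed < 500:
        return "day"
    if scroll_speed < 1500:
        return "week"
    return "month"

def group_images(images, scroll_speed):
    """Cluster images sharing a common attribute value, at the
    granularity implied by the scrolling speed."""
    level = granularity(scroll_speed)
    groups = OrderedDict()
    for name, captured in images:  # images already ordered by the attribute
        groups.setdefault(attribute_value(captured, level), []).append(name)
    return groups

images = [
    ("IMG_001", date(2012, 6, 1)),
    ("IMG_002", date(2012, 6, 1)),
    ("IMG_003", date(2012, 6, 4)),
    ("IMG_004", date(2012, 7, 9)),
]
print(group_images(images, scroll_speed=200))   # slow scroll: one group per day
print(group_images(images, scroll_speed=2000))  # fast scroll: one group per month
```

A slow scroll thus yields three fine-grained day groups from the sample data, while a fast scroll collapses the same images into two coarse month groups.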
For example, when there is a rapid scroll on the gallery viewport 220, an additional image group viewport 400 may appear superimposed on the gallery viewport 220. As described below, the image group viewport 400 displays a plurality of image groups formed by images in the gallery layer 205. The advantage of such dual-view display is that both the gallery viewport and the image group viewport are visible on the display 114 at the same time. Displaying both views in such a manner is advantageous especially on mobile devices, such as the device 101, as such devices have limited screen size for display. A user may use the image group viewport to efficiently locate desired images by navigating through the image groups. Once the desired images are located, the image group viewport 400 may be deactivated as described below.

In one arrangement, the scrolling speed may also be used to determine the level of navigation guidance. For example, the scrolling speed may be used to determine the granularity of the image groups displayed on the image group viewport 400. For content navigation, people tend to scroll slowly for details and scroll quickly for overviews. Accordingly, the granularity of the image groups may be determined by the scrolling speed. Generally, the faster a user scrolls, the coarser the image groups become.

Methods are described here which enable efficient navigation in large image collections. An "image" may refer to any digital image captured by digital image capturing devices such as the device 101. An image group viewport 400 in this context refers to display of image groups (or groups of images) and information related to each of the image groups. The image groups inside the image group viewport 400 are clusters of images from the gallery layer 205.
The information related to each image group includes, but is not limited to, time, location, number of images inside the image group, and related events' names and names of people appearing in or capturing the images using the device 101. "Activation" of an image group viewport 400 in this context refers to a state of the image group viewport 400 being displayed on the image capturing device 101. "Deactivation" of an image group viewport 400 in this context refers to a state of the image group viewport not being displayed on the device 101. "Concept hierarchy" of an attribute in this context refers to a hierarchical breakdown of the attribute, from a most fine-grained level to a most coarse-grained level.

Fig. 3A is a schematic flow diagram showing a method 300 of displaying a collection of images on a display. The method 300 may be implemented as one or more software code modules of the software application 133 resident on the ROM 160 and being controlled in its execution by the processor 105. The method 300 will be described by way of example with reference to the gallery layer 205 displayed using the graphical user interface 235. The images of the gallery layer 205 may be stored within the internal storage module 109 of the device 101. Alternatively, the images of the gallery layer 205 may be accessed by the software 133, under execution of the processor 105, from the other computer readable storage media described above.

The method 300 begins at displaying step 310, where the software 133, under execution of the processor 105, displays one or more images of the gallery layer 205 in a gallery viewport 220 on the display 114 as described above with reference to Fig. 2. The gallery viewport 220 enables the user to view the images (e.g., 201) of the gallery layer 205 in a tabular form.
As described above, in the gallery viewport 220, the images of the gallery layer 205 are displayed in the user interface 235 in a list which may be scrolled either horizontally or vertically as described above. The software 133 is configured for scrolling the images of the gallery layer 205 horizontally or vertically on the display 114 in response to a scrolling action as described below. When the software 133 detects that scrolling has stopped, the images (e.g., 201) displayed within the gallery viewport 220 may be snapped to the centre of the gallery viewport 220. As seen in Fig. 2, the interface 235 comprises the gallery viewport 220 and also the navigation control 225. The images (e.g., 200) of the gallery layer 205 are displayed in a tabular form in which the space per thumbnail image in each row (or column) is substantially similar at any time.

Where the display 114 is a multi-touch screen, the software 133, under execution of the processor 105, may be configured to detect the presence and location of a touch within the gallery viewport 220. Such a touch screen makes for smoother user interaction. For example, a scrolling action, such as an upward scroll gesture within the gallery viewport 220, may be used to move the images upwards in the gallery viewport 220. Any projected user interface may also be used to navigate throughout the images in the gallery layer 205. As described above, gallery views are widely used for displaying images on mobile devices such as smart phones and digital cameras. Although gallery views may display a large number of images in a limited display area, gallery views make it tedious for users to locate desired images in large image (or "photo") collections.
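The snap-to-centre behaviour described above for the gallery viewport 220 can be sketched as rounding the scroll offset to the nearest row boundary. The pixel geometry (a fixed row height, an offset in pixels) is a hypothetical simplification for illustration.

```python
def snap_to_centre(scroll_offset_px, row_height_px):
    """When scrolling stops, snap the gallery scroll offset so that the
    nearest image row sits at the centre of the viewport.

    A sketch of the behaviour only; the fixed-row-height geometry is an
    assumption, not taken from the described arrangements.
    """
    nearest_row = round(scroll_offset_px / row_height_px)
    return nearest_row * row_height_px

print(snap_to_centre(130, 100))  # 100 (snaps back to the nearer row)
print(snap_to_centre(170, 100))  # 200 (snaps forward)
```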
Once the images (e.g., images 200) of the gallery layer 205 are displayed on the display 114, the method 300 proceeds to waiting step 315 where the software 133 waits for input sensory data from one of the user input devices 113, such as the mouse, trackball or the multi-touch screen.

At decision step 320, the software 133 polls the user input devices 113 for any user input sensory data. If the software 133 detects user input sensory data at step 320, then the method 300 proceeds to step 325. Otherwise, the method 300 returns to step 315.

Then at decision step 325, if the software 133, under execution of the processor 105, determines that a scrolling signal has been received, then the method 300 proceeds to step 330. Otherwise, the method 300 proceeds to step 335.

The scrolling signal determination at step 325 may be implemented using any standard operating system functionality. A scrolling signal may be caused by a scrolling action detected by the software 133 tracking the presence and/or location of a touch from one of the user input devices 113 such as the touch screen, trackball, buttons, web camera or projected user interface.

At decision step 330, if the software 133, under execution of the processor 105, determines that the image group view has been activated, then the method 300 proceeds to step 350. Otherwise, the method 300 proceeds to step 345. In one arrangement, the state of the image group viewport is recorded as a Boolean value stored within RAM 170. Initially, the state of the image group viewport is assigned a value of "False". Once the image group viewport is activated, the Boolean value is changed to "True".

At activating step 345, the software 133 activates the image group viewport for displaying one or more image groups each representing a subset of the images in the gallery layer 205. In one arrangement, as shown in Fig.
4A, the image group viewport 400 may be implemented in the form of a foreground layer of the graphical user interface 235. The image group viewport 400 may be superimposed onto a background layer formed by the gallery viewport 220 of the interface 235 where the images 200 of the gallery viewport are displayed as shown in Fig. 2. The image group viewport 400 contains image groups (e.g., 440, 450) formed by clustering images from the gallery layer 205. The image groups displayed within the image group viewport 400 provide an overview of the gallery layer 205. Meanwhile, the images (e.g., 200) in the gallery viewport 220 of the graphical user interface 235 may be browsed since the image group viewport has a low opacity value.

Each image of a subset of images forming an image group displayed in the image group viewport 400 may be related according to a common value of a predetermined attribute. For example, as seen in Fig. 4A, the graphical user interface 235 comprises the image group viewport 400 superimposed onto the gallery viewport 220. In the image group viewport 400, each image group (e.g., 440) is summarized using representative images (e.g., images 420) and also information (e.g., 430) associated with a corresponding image group (e.g., 440).

In the example of Fig. 4A, the information 430 is associated with image group 440 and indicates that images in the group 440 are captured in December 2011. Further, information 435 is associated with the image group 450 and indicates that images in image group (or cluster) 450 are captured in January 2012. The number of images in the group 440 is twenty (20) as indicated by the information 430. Further, the number of images in the group 450 is forty (40) as indicated in information 435 associated with the image group 450. Such summarizing information (e.g., 430, 435) makes it easier for the user to locate desired images.
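The activation state used by method 300 — a Boolean value, initially "False", set at activating step 345 and cleared at deactivating step 340 — can be sketched as a small state holder. The class and method names here are hypothetical; only the Boolean state and the step semantics come from the description.

```python
class GroupViewportState:
    """Sketch of the image group viewport activation state of method 300."""

    def __init__(self):
        self.activated = False  # initial state: "False" (not displayed)

    def on_scrolling_signal(self):
        # Step 330: if already activated, adjust the viewport (step 350);
        # otherwise activate it (step 345).
        if self.activated:
            return "adjust"
        self.activated = True
        return "activate"

    def on_deactivating_signal(self):
        # Step 340: deactivate (close) the image group viewport.
        self.activated = False
        return "deactivate"

state = GroupViewportState()
print(state.on_scrolling_signal())     # activate
print(state.on_scrolling_signal())     # adjust
print(state.on_deactivating_signal())  # deactivate
```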
The summarizing information (e.g., 430, 435) may also include titles of subgroups which are included in each group.

The image groups (e.g., 440, 450) may be generated using clustering algorithms. The attributes useful for image clustering may include, but are not limited to, capturing time, image type (e.g., landscape, portrait), main subject (e.g., animals, children, parents, and nature), number of people, colours of images, etc. As described above, each image of the image group 440 is related by a common value of "December 2011" in relation to a "capturing time" attribute. The use of attributes may depend on the number of images and also user choice.

The image groups (e.g., 440, 450) displayed within the image group viewport 400 are scrollable. The image groups of the image group viewport 400 may be navigated by scrolling the image groups. The software application program 133 may be configured for scrolling the image groups (e.g., 440, 450) within the image group viewport 400 in response to a scrolling action. In one arrangement, once the image groups of the image group viewport 400 have been scrolled, the images (e.g., 200) of the gallery viewport 220 in the background are also scrolled. As such, the images (e.g., 200) displayed within the gallery viewport 220 are scrolled based on the image groups being scrolled. Therefore, both the image group viewport 400 and the gallery viewport 220 may be synchronized so that the gallery viewport displays images that belong to the image groups of the image group viewport 400.

As described above, the images (e.g., 200) displayed in the gallery layer 205 may be ordered according to a predetermined attribute, such as capturing time. In this instance, the images may be ordered within the gallery layer 205 from top to bottom and visible in the gallery viewport 220, as seen in Fig. 2, in order of capture time, where images captured in more recent times are shown at the bottom of the gallery viewport 220.
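Clustering images into groups that share a common value of the "capturing time" attribute — as in the December 2011 and January 2012 groups of Fig. 4A — can be sketched as follows. The `(image_id, capture_date)` pair format is a hypothetical simplification for illustration.

```python
from collections import defaultdict
from datetime import date

def group_by_month(images):
    """Group (image_id, capture_date) pairs by capture month, the common
    attribute value shared by each resulting image group."""
    groups = defaultdict(list)
    for image_id, captured in images:
        groups[(captured.year, captured.month)].append(image_id)
    return dict(groups)

images = [("img1", date(2011, 12, 3)), ("img2", date(2011, 12, 9)),
          ("img3", date(2012, 1, 14))]
groups = group_by_month(images)
# Summaries such as "December 2011 (2 photos)" can then be derived from
# each group key and the number of images in the group.
print(groups[(2011, 12)])  # ['img1', 'img2']
```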
Similarly, in the example of Fig. 4A, the image groups (e.g., 440 and 450) within the image group viewport 400 are ordered from top to bottom, as seen in Fig. 4A, in order of capture time. The images of the image group 440 were captured in "December 2011" and the images of the image group 450 were captured in "January 2012". Accordingly, in one arrangement, the ordering of the images (e.g., 200) within the gallery view corresponds to the ordering of the image groups (e.g., 440, 450) within the image group viewport 400. This enables a user to easily determine the relationship between the images (e.g., 200) within the gallery view and the image groups (e.g., 440, 450).

Once an image group has been selected by a user (e.g., using the user input devices 113), the software 133, under execution of the processor 105, may be configured to highlight images of the selected group as the images are displayed in the gallery viewport 220 in the background.

In an alternative arrangement, the image group viewport 400 and the gallery viewport 220 may be displayed side by side. In this instance, each of the image group viewport 400 and the gallery viewport 220 may occupy a portion (e.g., half) of the user interface 235. In one arrangement, if any image is selected in the gallery viewport 220, an image group corresponding to the selected image is highlighted within the image group viewport 400. The highlighted image group provides context of the gallery layer 205.

After the image group viewport is activated in step 345, the method 300 returns to step 315.

At decision step 335, if the software 133 determines that the user input sensory data detected at step 320 is a deactivating signal, the method 300 proceeds to step 340. Otherwise, the method 300 returns to step 315.

At step 340, the software 133, under execution of the processor 105, deactivates the image group viewport 400.
In one arrangement, the deactivating signal may be triggered if a closing button 460, as seen in Fig. 4A, is selected (e.g., by a gesture on the multi-touch screen). In an alternative arrangement, the deactivating signal may be triggered when scrolling of the images of the gallery viewport 220 slows down. Alternatively, the deactivating signal may be triggered by the software 133 based on detection of positional changes on the device 101, or detection of a change of scrolling actions (e.g., a change from scrolling to panning, or a change from big scrolling to small scrolling), etc. Deactivation of the image group viewport 400 occurs when a desired image has been located. Upon being deactivated, the image group viewport 400 may be closed and the gallery viewport may resume a full screen display where the images (e.g., 200) are displayed within the gallery viewport 220 as seen in Fig. 2.

As described above, if the software 133, under execution of the processor 105, determines at step 330 that the image group viewport 400 has been activated, then the method 300 proceeds to step 350. At step 350, the software 133 adjusts the image group viewport 400 based on scrolling action parameters. The scrolling action parameters may include, but are not limited to, speed of scrolling, acceleration of scrolling, intensity of scrolling and duration of scrolling. The term "intensity" in the context of scrolling images (e.g., the images 200) of the gallery viewport 220 refers to the frequency of scrolling actions detected by the software 133, under execution of the processor 105, for example, via one or more of the user input devices 113. Further, the term "duration" in the context of scrolling refers to the time period of one scrolling action detected by the software 133 in relation to the images (e.g., 200) of the gallery viewport.
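The scrolling action parameters named above (speed and duration, for example) could be derived from raw input samples roughly as follows. The `(time_s, position_px)` sampling format is an assumption for illustration; the description does not specify how the parameters are computed.

```python
def scrolling_parameters(samples):
    """Compute speed (px/s) and duration (s) of one scrolling action from
    a sequence of (time_s, position_px) samples taken during the action."""
    t_start, p_start = samples[0]
    t_end, p_end = samples[-1]
    duration = t_end - t_start  # time period of one scrolling action
    speed = abs(p_end - p_start) / duration if duration > 0 else 0.0
    return speed, duration

speed, duration = scrolling_parameters([(0.0, 0), (0.25, 150), (0.5, 600)])
print(speed, duration)  # 1200.0 0.5
```

Intensity, by contrast, would be a count of such actions per unit time, and acceleration the change in speed between consecutive samples; both are omitted here for brevity.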
In one arrangement, the common value associated with each image of an image group may be determined based on one or more of the scrolling action parameters. The scrolling action parameters used at step 350 may be calculated by the software 133, under execution of the processor 105, each time the image group viewport 400 has been activated. Alternatively, the scrolling action parameters used at step 350 may be calculated by the software 133 using previous information for an image group or an attribute associated with a particular image group. The scrolling action parameters may also be combined to adjust the image group viewport at step 350. For example, both the duration and speed of the scrolling action may be taken into account to adjust the image group viewport. A method 360 of adjusting an image group viewport, as executed at step 350, will be described below with reference to Fig. 3B.

The images in the gallery layer 205 may be ordered according to a predetermined attribute. The predetermined attribute used to order the images may be selected by a user. In one arrangement, the predetermined attribute may be stored within the storage module 109 of the device 101. In an alternative arrangement, the predetermined attribute may be selected by the software 133, under execution of the processor 105, as the attribute that results in a most consistent ordering of images in the gallery layer 205 and visible in the gallery viewport 220.

The method 360 may be implemented as one or more software code modules of the software application 133 resident on the ROM 160 and being controlled in its execution by the processor 105. The method 360 will be described by way of example with reference to the graphical user interface 235.
The method 360 begins at retrieving step 361, where the software 133, under execution of the processor 105, retrieves the predetermined attribute (e.g., stored within RAM 170) according to which the images (e.g., image 200) in the gallery layer 205 are ordered. As described above, examples of the predetermined attribute may include, for example, capturing time, image location, image type, colour, and corresponding event. For the time attribute, images in the gallery layer and visible in the gallery viewport may be in a chronological or reverse chronological order.

In one arrangement, to identify an attribute value for each image (e.g., 200) of the gallery layer 205, the software, under execution of the processor 105, may connect to social networks, via the network 120, to retrieve information corresponding to the image attribute. For example, the location of an image may be determined based on a check-in time as recorded in a social network and the time when the image was captured.

In an alternative arrangement, instead of using check-in time, planned events recorded in a social network, together with time, may be used to identify an image attribute value for an image. For example, a social network may indicate that a user planned to go to a "Seven Bridges Walk" event which took place on 28 October 2012 from 8:00 a.m. till 11:30 a.m. near the Harbour Bridge, Sydney, Australia. Therefore, based on determination of the planned event, the software 133, under execution of the processor 105, may determine that all images captured on 28 October between 8:00 a.m. and 11:30 a.m. (Sydney time) are related to a "Seven Bridges Walk" event and have been captured in Sydney, Australia near the Harbour Bridge. The location of where an image has been captured may be determined using any characteristics (e.g., geographic coordinates, a route with checkpoints and an approximate schedule of arriving at each checkpoint, street address, place name, etc.).
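The planned-event matching described above amounts to checking whether an image's capture time falls within an event's recorded time window. A minimal sketch follows; the event record structure (a dict with `name`, `location`, `start`, `end` keys) is a hypothetical simplification, while the "Seven Bridges Walk" data comes from the example above.

```python
from datetime import datetime

def attribute_from_events(capture_time, planned_events):
    """Return (event name, location) of the first planned event whose time
    window contains the capture time, or None if no event matches."""
    for event in planned_events:
        if event["start"] <= capture_time <= event["end"]:
            return event["name"], event["location"]
    return None

events = [{
    "name": "Seven Bridges Walk",
    "location": "Sydney, Australia",
    "start": datetime(2012, 10, 28, 8, 0),
    "end": datetime(2012, 10, 28, 11, 30),
}]
print(attribute_from_events(datetime(2012, 10, 28, 9, 15), events))
# ('Seven Bridges Walk', 'Sydney, Australia')
```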
In another alternative arrangement, the location of where an image was captured may be determined using GPS statistics for the device 101 used to capture the image. Such GPS statistics may be stored locally within RAM 170 of the device 101 or remotely on a server connected to the network 120.

In one arrangement, to avoid confusing a user, the image groups in the image group viewport 400 may be dynamically determined based on the predetermined attribute according to which images are ordered in the gallery layer 205 and visible in the gallery viewport 220. For example, if the predetermined attribute used to order the images 200 in the gallery layer is "time", then the image groups (e.g., 440) displayed within the image group viewport 400 may be determined based on the "time" attribute. Dynamically determining the image groups based on the predetermined attribute helps a user establish a mental connection between the image groups (e.g., 440) and the images (e.g., 200) in the gallery viewport 220, which makes it easier to navigate the image groups and also the images. If the image groups (e.g., 440) in the image group viewport 400 and the images (e.g., 200) in the gallery layer 205 are determined based on different attributes (e.g., "colour" for the image group viewport 400 and "time" for the gallery layer 205), the user may become disorientated as it is unnatural to relate the image groups and the images using different attributes.
In such an example, image groups may be determined based on time first, and then refined (i.e., images may be removed from one or more image groups) based on the travel events. As a result, image groups may be indicated by the combination of time and events, such as "October, Turkey", "October, Greece", and "November, Italy".

Image groups may be defined with regard to attribute values. Each image group contains a subset of images that are related based on a common value of the predetermined attribute retrieved at step 361. For example, an image group of images captured in Sydney, Australia share a common value of "Sydney, Australia" in regard to the "location" attribute. In such an example, the attribute value is a discrete value. However, there may be attribute values that define a range of values. An image group of images captured in January 2012 is one example. The common value of "January 2012" defines a range of dates with regard to the time attribute.

As described above, the common value of the predetermined attribute associated with each image of an image group may be determined based on the scrolling action. In the method 360, once the predetermined attribute is retrieved at step 361, the method 360 proceeds to determining step 362, where the software 133 determines a level of the image groups based on the scrolling speed. The level of an image group is related to the granularity of the image group. People scroll slowly when concentrating on details within images but quickly when focusing on overviews of images. As a result, as scrolling speed is increased, the image groups (e.g., 440) of the image group viewport 400 may be displayed in a coarser manner. In one arrangement, the granularity of an image group is indicated by a concept hierarchy of the predetermined attribute associated with the image group. A concept hierarchy starts from a primitive level and builds up to higher levels by concept summarization.
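A concept hierarchy for the "time" attribute, together with a lookup table mapping scrolling speed to a hierarchy level, can be sketched as follows. The hierarchy levels (day through year) follow the description; the numeric speed thresholds are illustrative assumptions only.

```python
# Concept hierarchy of "time", from the primitive level to coarser levels.
TIME_HIERARCHY = ["day", "week", "month", "quarter", "year"]

# Hypothetical lookup table: (maximum scrolling speed in px/s, hierarchy index).
SPEED_TO_LEVEL = [(400, 0), (1000, 1), (2500, 2), (6000, 3), (float("inf"), 4)]

def concept_for_speed(scroll_speed_px_per_s):
    """The faster the scrolling, the higher (coarser) the concept level."""
    for max_speed, level in SPEED_TO_LEVEL:
        if scroll_speed_px_per_s < max_speed:
            return TIME_HIERARCHY[level]

print(concept_for_speed(200))   # day
print(concept_for_speed(2000))  # month
print(concept_for_speed(9000))  # year
```

Such a table could be held in memory (the description mentions RAM 170) and consulted at step 362 each time the scrolling speed is sampled.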
The levels of the concept hierarchy may be stored in a lookup table configured within RAM 170. For example, for the predetermined attribute "time", the concept of a day may be considered as a primitive level. By concept summarization, coarser concepts, such as weeks, months, quarters, and years, may be produced. The primitive level of a day and the coarser concepts form the concept hierarchy of time. In the time attribute example, the coarser the concept is, the bigger the range the concept covers.

In one arrangement, a mapping between scrolling action parameters (e.g., speed) and different levels of a concept hierarchy may be generated and stored within a look-up table within RAM 170. The faster the speed of scrolling, the higher the level of the associated concept. In the example of the "time" attribute, the primitive concept level starts from the concept of "day", which may be shifted to the concept of "month" as scrolling becomes faster. In an alternative arrangement, a combination of the scrolling action parameters may be taken into account when determining the level of hierarchy for an image group.

In one arrangement, the level of an image group may be determined based on the concept hierarchy of the predetermined attribute associated with the image group. Shifting the concept changes the level of an image group and may be used to determine the common value of the predetermined attribute associated with each image in the image group. As described above, the concept may be shifted depending on the scrolling action. Accordingly, the level of each of the image groups may be determined depending on the scrolling action.

In an alternative arrangement, the level of an image group may also be determined based on the concept hierarchy as described above. However, any correlated concept level for the determined concept level may also be determined.
For example, for the predetermined "time" attribute, the primitive concept level starts from the concept of "day" as described above. As the user scrolls faster, the next concept level is the concept of "month". Since people usually store captured images based on events, image groups grouped by month correlate with image groups grouped by event. In such an example, the concept level for the image group may be altered to the concept of "event", since it is more natural to group images based on events.

In one arrangement, if there are a predetermined number (or threshold number) of image groups grouped by event from the gallery layer 205, such image groups may be grouped by the software application 133 into larger groups if the application 133 determines that scrolling has continued for a predetermined period of time. For example, a big image group of "birthday" events may be made at step 362 by grouping different people's birthday event images together.

Once the level of image groups is determined at step 362, the method 360 proceeds to clustering step 363. At step 363, the software application program 133, under execution of the processor 105, clusters images of the gallery layer 205 into groups of images. The images are clustered at step 363 based on the predetermined attribute retrieved at step 361, and the level of image groups determined at step 362. In one arrangement, k-means clustering may be used at step 363 to divide the gallery layer 205 into groups. The predetermined attribute used to cluster the images may be determined based on the concept level. As an example, if the predetermined attribute retrieved at step 361 is "time", then the images of the gallery layer 205 may be clustered, for example, by month, quarter or year. Further, as described above, the concept level of the image group may be based on a scrolling parameter such as scrolling speed. Fig.
4A shows image groups (e.g., 440, 450) of the image group viewport 400 before a change of scrolling speed. As shown in Fig. 4A, the predetermined attribute retrieved at step 361 is time and the image groups of the image group viewport 400 are grouped based on months (e.g., December 2011, January 2012).

Fig. 4B shows an adjusted image group viewport 400 comprising image groups (e.g., 471, 491). In comparison with the image groups (e.g., 440, 450) of Fig. 4A, the image groups (e.g., 471, 491) of Fig. 4B are bigger, as revealed by the number of representative images 470 and the number of images inside each group. For example, the image group 471 comprises forty (40) images as indicated by information 481 associated with the image group 471, and the image group 491 comprises sixty (60) images as indicated by information 480 associated with the image group 491. Further, in Fig. 4B, each image group of the image group viewport 400 covers images from two months. For example, the information 481 associated with the image group 471 indicates that the image group 471 comprises images captured between September 2011 and October 2011, whereas each image group (e.g., 440, 450) in Fig. 4A covers only a month. As such, in the example of Fig. 4B, each image of the image group 471 has a common value of "September 2011" or "October 2011" as determined based on the scrolling action. Further, the image group 471 of Fig. 4B is bigger compared to the image groups (e.g., 440, 450) of Fig. 4A.

The clustering of images at step 363 may be executed by the software application 133 resident locally within the ROM 160. Alternatively, the clustering of images may be performed on a remote server connected to the network 120. In an arrangement where images of the gallery layer 205 are stored on such a remote server, the clustering for the purposes of displaying the image groups in the image group viewport 400 may be executed on the remote server.
In this instance, the image groups in the image group viewport 400 may be displayed on the device 101 before the actual images of the gallery layer 205 are downloaded from the remote server, via the network 120, to the device 101. This allows scrolling of the image groups of the image group viewport 400 to be performed in order to navigate to the desired images in the gallery layer 205 prior to the remote server necessarily downloading images to the gallery layer 205. The remote server can selectively download images corresponding to the image groups scrolled in the image group viewport 400 prior to other images being downloaded, which improves the user experience. Executing clustering on a remote server and displaying representations of the clusters in such a manner, prior to the actual images being downloaded to the device 101, reduces clustering time.

Following the images of the gallery layer 205 being divided into groups at step 363, the method 360 proceeds to displaying step 364. At step 364, the image groups determined at step 363 are displayed by the software 133 as the image group viewport 400. In one arrangement, the size of the image group viewport 400 may be varied depending on the image groups determined at step 363. Bigger image groups contain more images and more information, requiring more space for description. As described above, an image group (e.g., 471) may be made bigger depending on the concept level determined for the image group, where the concept level may be determined based on the scrolling action. As such, in one arrangement, the graphical user interface 235 used to display the image groups changes depending on the scrolling action. For example, Fig. 4C shows the graphical user interface where the image group viewport 400 has been expanded to display bigger image groups 496 and 497 than the image groups 471 and 491 in Fig. 4B. As seen in Fig.
4C, a bigger image group, such as image group 491, requires more representative images 490 and more associated information 495 describing the images in the group 491. The information (e.g., 495) in Fig. 4C includes time, number of images (or "photos"), locations, people, and events for each image group, whereas the information (e.g., 481) in Fig. 4B contains only time and number of images (photos). More information describing the images in each image group provides the user with more information about bigger image groups. In an alternative arrangement, the image group viewport 400 may remain the same size as in Fig. 4B, even when displaying bigger image groups.

Once the method 360 concludes at step 364, the software 133 shifts context to the method 300 at step 350. Following step 350, the method 300 returns to step 315, waiting for the next input sensory data.

Industrial Applicability

The arrangements described are applicable to the computer and data processing industries and particularly to image processing.

The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.

In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (11)

1. A method of displaying a collection of images on a display, said method comprising: displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; scrolling the displayed images on the display in response to a scrolling action; and displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.
2. The method according to claim 1, further comprising determining the groups of images based on the predetermined attribute.
3. The method according to claim 1, wherein at least one of the groups of images is determined based on a combination of at least one predetermined event and the predetermined attribute.
4. The method according to claim 1, further comprising determining a level for each image group depending on scrolling action.
5. The method according to claim 1, wherein a graphical user interface used to display the image groups changes depending on scrolling action.
6. The method according to claim 1, further comprising highlighting an image group corresponding to a selected image.
7. The method according to claim 1, further comprising scrolling the groups of images in response to the scrolling action.
8. The method according to claim 1, wherein the displayed images are scrolled based on the groups of images being scrolled.
9. A system for displaying a collection of images on a display, said system comprising: a memory for storing data and a computer program; a processor coupled to the memory for executing the computer program, the computer program comprising instructions for: displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; scrolling the displayed images on the display in response to a scrolling action; displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.
10. An apparatus for displaying a collection of images on a display, said apparatus comprising: means for displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; means for scrolling the displayed images on the display in response to a scrolling action; means for displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action.
11. A computer readable medium having recorded thereon a computer program for displaying a collection of images on a display, said program comprising: code for displaying one or more of the images on the display, said displayed images being ordered according to a predetermined attribute; code for scrolling the displayed images on the display in response to a scrolling action; code for displaying one or more groups of images, each group representing a subset of the images in the collection, each image of a subset being related based on a common value of the predetermined attribute, wherein the common value is determined based on the scrolling action. Dated 19th day of December 2012 CANON KABUSHIKI KAISHA Patent Attorneys for the Applicant/Nominated Person SPRUSON & FERGUSON
AU2012268806A 2012-12-20 2012-12-20 Method, apparatus and system for displaying images Abandoned AU2012268806A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2012268806A AU2012268806A1 (en) 2012-12-20 2012-12-20 Method, apparatus and system for displaying images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2012268806A AU2012268806A1 (en) 2012-12-20 2012-12-20 Method, apparatus and system for displaying images

Publications (1)

Publication Number Publication Date
AU2012268806A1 true AU2012268806A1 (en) 2014-07-10

Family

ID=51228841

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2012268806A Abandoned AU2012268806A1 (en) 2012-12-20 2012-12-20 Method, apparatus and system for displaying images

Country Status (1)

Country Link
AU (1) AU2012268806A1 (en)

Similar Documents

Publication Publication Date Title
US11706521B2 (en) User interfaces for capturing and managing visual media
US11770601B2 (en) User interfaces for capturing and managing visual media
US9942486B2 (en) Identifying dominant and non-dominant images in a burst mode capture
AU2018324085B2 (en) Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
JP6170972B2 (en) Method and computer-readable recording medium for gallery application for content display
US20230388622A1 (en) User interfaces for electronic devices
KR102183448B1 (en) User terminal device and display method thereof
US9307153B2 (en) Method and apparatus for previewing a dual-shot image
EP3226537B1 (en) Mobile terminal and method for controlling the same
KR101636460B1 (en) Electronic device and method for controlling the same
WO2019046597A1 (en) Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
TWI466008B (en) Display control apparatus, display control method, and computer program product
CN102713812A (en) Variable rate browsing of an image collection
JP7302038B2 (en) USER PROFILE PICTURE GENERATION METHOD AND ELECTRONIC DEVICE
JP2010072749A (en) Image search device, digital camera, image search method, and image search program
KR20150095537A (en) User terminal device and method for displaying thereof
US9930256B2 (en) Control device, control method, and recording medium
WO2020247045A1 (en) Displaying assets in multiple zoom levels of a media library
US10939171B2 (en) Method, apparatus, and computer readable recording medium for automatic grouping and management of content in real-time
US20130229330A1 (en) Controlling images at hand-held devices
KR20150066129A (en) Display appratus and the method thereof
US20120066622A1 (en) Method, apparatus, and software for displaying data objects
US10497079B2 (en) Electronic device and method for managing image
WO2017094800A1 (en) Display device, display program, and display method
AU2012268806A1 (en) Method, apparatus and system for displaying images

Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application