WO2008058306A1 - Process for generating a user interface to display data associated with entities - Google Patents

Process for generating a user interface to display data associated with entities

Info

Publication number
WO2008058306A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
magnification
images
user
displayed
Prior art date
Application number
PCT/AU2006/001698
Other languages
English (en)
Inventor
Heiko Waechter
Original Assignee
Deloitte Investments Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deloitte Investments Pty Ltd filed Critical Deloitte Investments Pty Ltd
Priority to AU2006350947A priority Critical patent/AU2006350947A1/en
Priority to PCT/AU2006/001698 priority patent/WO2008058306A1/fr
Publication of WO2008058306A1 publication Critical patent/WO2008058306A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485: Scrolling or panning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus

Definitions

  • the present invention relates to a process for generating a user interface to display data associated with entities.
  • the Cover Flow interface which exists alongside the text-based iTunes interface, rather than replacing it, only allows the user to view a fraction of their collection at any time (13 albums), which makes it difficult to gain an overview of the entire collection, and it remains inconvenient to locate an individual album or track. It is desired to provide a system and process for generating a user interface to display data associated with entities that alleviate one or more difficulties of the prior art, or at least that provide a useful alternative.
  • a process for generating a user interface to display data associated with entities including: generating a display of images associated with respective ones of said entities at a first magnification; animating, in response to selection of one of said images by a user, said display to centre and display the selected image at a second magnification greater than said first magnification; and animating, in response to selection by a user of a region between displayed images, said display to display said images at a magnification less than said second magnification.
  • the present invention also provides a process for generating a user interface to display data associated with entities, the process including: generating a display of mutually spaced images associated with respective ones of said entities, said display being at a first magnification; animating, in response to selection of one of said images by a user, said display to a second magnification greater than said first magnification, wherein the selected image is substantially centred within the display; wherein the display is responsive to input from the user to display data associated with a corresponding entity; and wherein the display is responsive to selection of a region between images displayed at said second magnification to animate the display to said first magnification.
  • the present invention also provides a system for generating a user interface to display data associated with entities, the system being adapted to generate a display of images associated with respective ones of said entities at a first magnification, said images being associated with controls responsive to input from a user to animate said display to centre and display the selected image at a second magnification greater than said first magnification; and the system being further adapted to animate, in response to selection by a user of a region between displayed images, said display to display said images at a magnification less than said second magnification.
  • Figures 1 to 6 are schematic diagrams of a preferred embodiment of a user interface for exploring entities
  • Figures 7 to 10 are schematic diagrams of a preferred embodiment of a user interface for exploring albums of music tracks
  • Figure 11 is a block diagram of a preferred embodiment of a system for exploring entities
  • Figure 12 is a flow diagram of an initialisation process executed by the system
  • Figure 13 is a schematic diagram of an entity object generated by the system
  • Figure 14 is a flow diagram of an animation process executed by the system
  • Figure 15 is a flow diagram of an entity search process executed by the system
  • Figure 16 is a flow diagram of an entity sorting process executed by the system; and Figure 17 is a flow diagram of an entity filtering process executed by the system.
  • Figure 1 shows an easy to use, intuitive, and visually appealing user interface that allows a user to visualise, browse, and explore entities associated with or including image data representing a visual image or picture, and preferably also metadata representing properties of the corresponding entity, and data of the entity.
  • the entities may be members of a group, or may be a group of sub-entities, and the user interface can be used to rapidly locate a desired entity or entities, possibly with the assistance of one or more search or filtering criteria specified by the user, and to gain access to the data and/or metadata associated with that entity or entities.
  • the graphical user interface or GUI 102 displays the images 104 associated with the entities as a two-dimensional array occupying most of the displayed area of the interface 102, referred to herein as the canvas 108, but with a rectangular region 106 across the top of the canvas 108 displaying controls for filtering, sorting, and searching the entities corresponding to the displayed images 104.
  • a characteristic feature of the interface 102 is that, if possible, all of the displayed entities are made simultaneously visible to the user by being displayed on a single working space or canvas 108, thus providing the user with a visual overview of all the displayed images 104, rather than, for example, dividing the images 104 over multiple pages.
  • if the number of displayed entities becomes too large to allow the user to resolve or recognise the individual displayed images 104 of the entities, then the number of images 104 visible at any one time is limited to the number that allows the images 104 to be resolved or recognised by the user.
  • a navigation tool 204 allows the user to scroll the entire working space or canvas if the space taken up by all images 104 is larger than the screen viewing area.
  • use of the filtering functions accessed through controls in the top region 106 allows the number of images 104 displayed to be reduced to a number that can be simultaneously displayed to a user at a size and resolution sufficient for the user to resolve or recognise individual images 104 associated with those entities.
  • the interface 102 zooms in to display a portion or region of the canvas 108 at a higher magnification, centres the selected image 202 in the displayed region, as shown in Figure 2, and highlights the selected image 202, either by drawing a border around the image 202 or by modifying one or more display parameters (e.g., brightness) for that image 202 to provide a visual indication that the image 202 is selected.
  • a navigation tool 204 is displayed near the top right hand corner of the interface 102 to represent the size and position of the displayed region relative to the entire canvas 108.
  • the navigation tool 204 allows the user to click and drag the representation of the visualised region to view a different portion of the canvas 108.
  • the interface 102 can be made to zoom back out to view the entire canvas 108 (i.e., to provide the overview shown in Figure 1) by using the pointing device to click on a region between images 104, such as the location 206 shown in Figure 2, for example.
  • a characteristic of the interface 102 is that movements (including changes in magnification) are effected by animation, whereby a movement from a first location and/or magnification to a second location and/or magnification is perceived by the user as a smooth movement. Where appropriate, the speed of this movement decreases smoothly with remaining distance to the second location and/or magnification.
  • selection of an entity by single clicking a displayed image 104 associated with the entity causes animation of the interface 102 to display a zoomed view wherein the selected image 202 is centred within the field of view and highlighted. If the display was already zoomed when the selection was changed, then the displayed field of view is only translated to centre the newly selected image. Otherwise, a zoom operation to a higher magnification is also performed.
  • a second click on the selected image 202 causes information or metadata associated with the selected image 202 (and thus with the corresponding entity) to be displayed adjacent to the selected image 202, as shown in Figure 3.
  • this movement is achieved in a visually appealing manner by animating the images 202, 104 on paths from their initial to their final or 'target' positions in a smooth movement whereby the speed of movement is proportional to the remaining distance to the target position and therefore decreases as each image 202, 304 approaches its final position.
  • the information or metadata associated with the selected entity typically includes high level descriptive information 306 associated with the entity, such as the entity's name or title, a category for the entity, and a date associated with the entity.
  • the images 104 may be photographic images of the corresponding personnel or assets
  • the high level information 306 may be the person or asset's name, date of birth or purchase, role, location, etc.
  • selection of one of the links 310 may cause another window to be opened on the display, and further images or multimedia data, including video clips, animations, etc. may be displayed.
  • selection of one of the links may cause audio data associated with the selected entity to be played on an audio output device associated with the system. Clicking on any region between the bounding box 302 and the images 304 associated with unselected entities causes the bounding box 302 and the information/metadata displayed within it to be removed from the display, and the displayed view to be zoomed out in a smooth, animated movement.
  • a rectangular region 106 at the top of the canvas 108 provides controls for filtering, searching, and sorting the displayed images 104.
  • the filtering controls allow the user to reduce the number of displayed images 104 to those matching (or associated with information, metadata, or other data matching) one or more selection criteria.
  • the user can apply a searching function to display only those entities matching a search string entered into a textbox by the user.
  • the images 402 that do not match the selection or search criteria are removed from the display in an animated manner, and in a particular order; in this case, from left to right, top to bottom, as shown in Figure 4.
  • the 'deselected' images 402 are animated from their existing onscreen positions to offscreen positions, thereby appearing to 'drop off' the screen.
  • the remaining images are then rearranged in smooth animated movements to remove the resulting gaps, as shown in Figure 5, leaving an array of remaining images 104, as shown in Figure 6.
  • This animation maintains the look and feel of the interface 102 established when the interface 102 is first initialised by the user.
  • the displayed images 104 are placed on the screen in a sequential animated manner whereby the images are animated from an off screen location to their final displayed location in a left to right and top to bottom manner, with the speed of movement of each image being proportional to the remaining distance between the image's current location and its final location on the display.
  • a sorting function can be applied to the displayed entities to rearrange the images 104 based on the sort criteria selected by the user.
  • the sorting can be applied based on the information or metadata associated with each entity, or indeed on properties of the images 104 themselves. In any case, when the display order of the images 104 is changed, the images 104 are rearranged on the display in a visually pleasing animated fashion. After the animated rearrangement, the displayed images 104 reflect the selected sort order.
  • each image 104 represents a small collection of associated music tracks.
  • the association of a single image with a set or collection of music tracks is of course based on the traditional distribution of music on physical media such as compact discs (CDs), minidiscs, vinyl records, cassette tapes, and the like. In the case where the set of music tracks represents musical works that were in fact released as an aggregate work or collection on such a physical medium, the image associated with the collection of music tracks is preferably the image used on the cover or other form of packaging of the physical medium on which the music tracks were provided.
  • for convenience, this specification refers to each image 104 as a "cover image" or "cover art".
  • the image associated with a set of music tracks need not actually be an image reproduced on a physical medium on which the music has been distributed, but can be any image used to represent the set.
  • references to 'album' in this specification should be understood as referring to a group of tracks associated by any means. For example, a user may arbitrarily select tracks as members of a group and associate an arbitrary image with those tracks. Nevertheless, the resulting group of tracks is referred to herein as an album, and the associated image as cover art or a cover image.
  • a graphical user interface for browsing and accessing audio data in the form of music tracks
  • a graphical user interface is provided in the form shown in Figure 7, with cover images 702 arranged in a rectangular array.
  • cover images 702 are arranged in an eight across by five down array.
  • the user interface presents cover images to a user in a manner analogous to laying out physical cover images on a flat surface.
  • Part of the power of this arrangement is that people often have a strong visual memory for cover art and associate it strongly with the corresponding music, even though they may not be able to recall the name of the album, or even the performing artist.
  • Presenting a substantial number of cover images to the user allows the user to quickly scan those images and make possibly impulsive selections that may not be based on a conscious awareness of which album they are selecting.
  • the default display (sort) order is alphabetical by artist, from left to right, top to bottom.
  • sort controls 704 are provided to sort or arrange the displayed cover images 702 by artist, musical genre, album title, or year of release.
  • another control allows the user to sort the cover images by properties (e.g., colour properties) of the cover images themselves.
  • Drop down menus 706 and a search textbox 708 allow the user to filter the total collection of albums to those matching a search string entered by the user into the search textbox 708, where the search string is applied to a specific item of information or metadata relating to each cover image, for instance artist name and album name.
  • the selection of a displayed cover image 802 causes the display to zoom in on that image 802, and centre within the display a bounding box 804 encompassing the selected cover image 802 and associated information on the collection of music tracks associated with the cover image 802.
  • Figure 8 shows an example where the selected cover image 802 is that of Michael Jackson's vinyl/CD album entitled Thriller, released in 1994, whose musical genre is classified as pop.
  • a show tracks control 806 is provided that, when clicked, causes the canvas to zoom in further and the boundary box 804 to expand to accommodate the display of the list of associated music track titles and their playing times or durations, together with a hide tracks control 906 to allow the track information to be hidden and the canvas to be zoomed out again.
  • This process is similar to physically picking up a CD in a record store and bringing it closer to look at the track information on the back. Selecting the hide tracks control 906 is thus similar to putting the CD back into the shelf stack but still keeping it in focus.
  • the player controls include standard controls 1002 for playing, pausing, and skipping the music tracks associated with the selected cover image 802, with the corresponding artist and title displayed in a rectangular region 1004 just to the left of the player controls.
  • a series of numbered rectangular icons 1006 to the right of the player controls represent the respective tracks associated with the selected cover image 802, with the number in each icon representing the track number within the set.
  • the track currently playing is shown in an expanded box 1008, providing the current time position of the track, its total playing time, and its title, together with a progress bar 1010 representing the elapsed and remaining time for that track. If the system pointing device is hovered over one of the other track icons 1006, then the icon is expanded to reveal the corresponding track title and its playing time. Clicking the mouse on any one of the track icons 1006 causes the corresponding track to be played.
  • a system 1102 for generating the interfaces described above is based on Flash Studio 8, available from Adobe Systems Incorporated, at http://www.adobe.com/products/flash.
  • the system 1102 includes a database 1104 in which is stored a collection of music tracks, referred to herein as a music library, together with cover images associated with respective subsets of those tracks.
  • the system also includes one or more XML files 1106 containing references to the music tracks and cover images, metadata associated with each music track and set, and at least one flash SWF file 1108 containing an applet to generate the user interface.
  • the XML files 1106, SWF files 1108, and music tracks stored in the database 1104 are accessed by a remote computer system 1110 via a communications network such as the Internet 1112. Alternatively, these files can be accessed via a local network or locally on a computer system on which the application is installed.
  • the remote computer system 1110 includes an operating system 1114 such as Microsoft Windows XP, a web browser application such as Microsoft Internet Explorer 1116, and Adobe's Flash Player web browser plugin 1118, available from Adobe Systems Incorporated.
  • a user of the remote computer system 1110 can access the music tracks stored on the database 1104 by entering a universal resource locator (URL) referencing the system 1102 into the web browser application 1116.
  • this causes the web browser 1116 to access a web server application 1120 installed on the system 1102 in order to retrieve the XML files 1106 and SWF files 1108.
  • the SWF file 1108 is thus transferred to the remote system 1110, allowing the flash player 1118 plugin to generate the user interface.
  • the SWF file defines the user interface behaviour described above, and the XML files 1106 provide the metadata associated with each album and with each music track, together with file names or URLs referencing the cover images and music tracks stored in the database 1104. These are accessed by the Flash Player 1118 via middleware 1122 which allows the Flash Player 1118 to access the SQL database 1104. It will be apparent to those skilled in the art that a variety of different middleware applications can be used to provide this function, including PHP/MySQL, ColdFusion, and J2EE.
  • system 1102 and remote computer system 1110 are standard computer systems, such as Intel IA-32 based computers, and the user interface is implemented as program instructions in software.
  • the various components of the system 1102 could alternatively be distributed over a variety of alternative locations, and moreover that the user interface could be generated in part or in total by one or more dedicated hardware components, such as application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs), for example.
  • the album was released in 2001 and is categorised as belonging to the ambient music genre. It comprises 11 music tracks with a total playing time of 60 minutes and 42 seconds.
  • metadata for only four of the eleven tracks is shown, defining the track number, the track playing time, and the filename of the file whose content is compressed audio data in the MP3 format, representing the music track.
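The excerpt does not reproduce the XML schema itself, but the fields it describes (album name, genre, year, and a cover-image reference per album; track number, playing time, and MP3 filename per track) suggest a structure along the following lines. All element and attribute names here are assumptions for illustration only:

```xml
<!-- Hypothetical sketch of one album entry in the XML file 1106 -->
<albums>
  <album name="Example Album" artist="Example Artist" genre="ambient"
         year="2001" cover="covers/example_album.jpg">
    <track number="1" duration="5:32" file="tracks/example_01.mp3"/>
    <track number="2" duration="4:18" file="tracks/example_02.mp3"/>
    <!-- remaining tracks omitted -->
  </album>
</albums>
```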
  • the system executes an initialisation process, as shown in Figure 12.
  • the system reads into memory (RAM) all the functions used by the interface, the scripts for data handling, and the graphical user interface elements.
  • the XML file 1106 containing the file references and metadata for the music collection is retrieved from the system 1102 using HTTP.
  • a data object 1300 is created for each XML data entry which has properties referencing original metadata, in this case for each album (or set of tracks), as shown in Figure 13.
  • the name given to the object is the name of the corresponding album, as shown at 1304.
  • the properties 1306 given to the object 1300 are the metadata for the corresponding album, including a file reference to the image that will be used to represent the album in the interface.
  • at step 1208, data entries 1308 corresponding to the individual tracks of each album, as defined in the XML file 1106, are added to the object 1300. Steps 1206 and 1208 are repeated until each album referenced in the XML file 1106 has an associated data object. All data objects are added to an object container (an Array), which enables the application to examine all data objects and rearrange their order by data object properties. This functionality is used when searching, filtering, and sorting functions are applied to the data objects.
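The script that performs steps 1206 and 1208 is not published in this excerpt. The following is a minimal sketch in JavaScript (standing in for the original ActionScript), with all field names assumed for illustration:

```javascript
// Build one data object per album from parsed XML entries (steps 1206-1208),
// collecting them in the object container (an Array). Field names are
// hypothetical; the patent does not publish the actual script.
function buildAlbumObjects(xmlEntries) {
  const container = [];                 // the object container of step 1208
  for (const entry of xmlEntries) {
    container.push({
      name: entry.name,                 // object named after the album (1304)
      artist: entry.artist,             // metadata properties (1306)
      genre: entry.genre,
      year: entry.year,
      cover: entry.cover,               // file reference to the cover image
      tracks: entry.tracks.map(t => ({  // per-track data entries (1308)
        number: t.number,
        duration: t.duration,
        file: t.file,
      })),
    });
  }
  return container;
}
```

Because the container is a plain array, later search, filter, and sort functions can reorder or subset it by any data object property.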
  • an onscreen object is created for each data object 1300 by a "for-loop" control flow structure which cycles through all data objects in the object container (Array) and creates an onscreen object or graphical representation for each data object.
  • These graphical objects are arranged on the screen based on the default sort order and positioning functions, as described further below.
  • the onscreen objects are initialised for user interaction, and the graphical user interface is initialised.
  • the arrangement of objects on the interface is determined by a positioning function. In its most basic form, the positioning function arranges objects from left to right, top to bottom, as in the following function that is executed at startup to arrange all objects (i.e., cover images) on the interface:
  • This function accepts the parameter whatContainer, a placeholder for the data array of album objects.
  • a for-loop cycles through all the cover objects in the data array, attaches object instances to the stage (the Flash term for the display canvas), and assigns x / y coordinates based on the ordering logic. In this case, the instances are arranged from left to right, top to bottom, in rows of 8 items.
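The function referred to above is not reproduced in this excerpt. A minimal sketch of such a positioning function follows, in JavaScript standing in for ActionScript; the parameter name whatContainer and the row width of 8 come from the text, while the cover dimensions are assumptions:

```javascript
const COLS = 8;        // rows of 8 items, per the text
const COVER_W = 120;   // assumed cover width in pixels
const COVER_H = 120;   // assumed cover height in pixels

// Arrange cover objects left to right, top to bottom. In Flash the loop would
// attach clip instances to the stage; here it just returns target coordinates.
function positionCovers(whatContainer) {
  return whatContainer.map((cover, i) => ({
    cover,
    x: (i % COLS) * COVER_W,
    y: Math.floor(i / COLS) * COVER_H,
  }));
}
```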
  • All onscreen objects are animated by an animation process, as shown in Figure 14, that assigns new x and y coordinates to an object and animates the movement of the object from its current position to the new target position.
  • the target values come from a positioning script, as described above, that determines the logic by which objects are arranged in the interface, on startup as well as when using filter, sort and search functions (see Figures 14 to 16) that determine which objects are displayed, and their arrangement on the canvas.
  • This function accepts the following parameters: (i) clip: the object that the function is applied to (the word 'clip' is the terminology used by Flash by analogy with movie clips); (ii) x: the target x position; (iii) y: the target y position; and
  • (iv) s: a parameter that determines the speed at which the object will move to its target position.
  • the onEnterFrame function is executed every time the application refreshes the interface screen.
  • the framerate (how many times the interface screen is refreshed per second) can be set in the Flash authoring environment and determines the framerate of the published SWF file.
  • a typical framerate that guarantees smooth movement is 30 frames per second.
  • a lower frame rate is less processor intensive, but animated movements become choppy.
  • the onEnterFrame function determines the distance between the current position and the target position (step 1406), and divides that value by the speed parameter.
  • the object's current position is then changed by the resulting number (step 1408) to incrementally move the object across the interface, giving the illusion of smooth movement.
  • initially, the distance between the object and the target position is largest, so the object travels fastest to cover the distance.
  • the increment added to the object's position becomes smaller in proportion to the remaining distance, and therefore the speed of movement decreases exponentially over time. This 'easing out' or exponential decay effect further adds to the impression of smooth animation.
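The easing logic described above can be sketched as follows, with JavaScript standing in for the original ActionScript onEnterFrame handler; the speed value of 5 and the target of 100 are assumptions for illustration:

```javascript
// One frame of the easing animation: move by (remaining distance) / speed.
// Each frame multiplies the remaining distance by (1 - 1/speed), so the
// motion decays exponentially toward the target ('easing out').
function stepToward(current, target, speed) {
  return current + (target - current) / speed;
}

// Simulate a clip animating along x from 0 to 100 with speed 5, over one
// second at the typical 30 frames per second mentioned in the text.
let x = 0;
const frames = [];
for (let frame = 0; frame < 30; frame++) {
  x = stepToward(x, 100, 5);
  frames.push(x);
}
// With speed 5 the remaining distance shrinks by a factor of 0.8 per frame,
// so the object covers 20% of whatever distance remains on every refresh.
```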
  • the interface includes sort controls 704 for changing the sort order of displayed cover images. Selection of these sort controls 704 causes the system to execute a sorting process, as shown in Figure 15.
  • the process receives selection data representing selection by the user of a particular one of the sort controls 704 corresponding to a particular sort order (e.g., sort by musical genre).
  • the system sorts the data objects in the object container, in this case, based on the metadata representing each album's musical genre.
  • the system generates new coordinates for each displayed object, based on the sort order (using a left-to-right, top-to-bottom array order), and at step 1508, the system uses the animation process 1400 to animate the movement of displayed images to their new positions.
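The sorting steps above (sort the object container by a metadata property, then derive new left-to-right, top-to-bottom coordinates for the animation process to ease toward) can be sketched as follows; JavaScript stands in for ActionScript, and the field names and cover dimensions are assumptions:

```javascript
// Sort the data objects by a metadata property (e.g., "genre") and compute
// target coordinates in a left-to-right, top-to-bottom array order. Animation
// process 1400 would then animate each cover to its returned x/y target.
function sortAndPosition(container, property, cols = 8, w = 120, h = 120) {
  const sorted = [...container].sort((a, b) =>
    String(a[property]).localeCompare(String(b[property])));
  return sorted.map((obj, i) => ({
    obj,
    x: (i % cols) * w,
    y: Math.floor(i / cols) * h,
  }));
}
```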
  • the process receives filtering data representing selection by the user of a particular one of the filtering controls 706 corresponding to a particular filtering criterion (e.g., filter by musical genre, jazz to show only jazz albums).
  • the system cycles through the object container, flagging objects whose properties (in this case the musical genre metadata) match the filtering criterion.
  • any unflagged displayed objects are moved off the interface in a smooth animated movement.
  • the remaining flagged objects are copied to a temporary holding data container, and the positioning function is applied to these objects.
  • the displayed images for those objects are animated to close the gaps left by the removed images, using the animation process 1400 to animate the movement of displayed images to their new positions.
  • entering a search string into the search textbox 708 invokes an object searching process, as shown in Figure 17.
  • the process receives search data representing the (possibly partial) search string entered in the textbox 708 by the user.
  • the system cycles through the data container, flagging objects whose properties match the search string.
  • any unflagged displayed objects are moved off the interface in a smooth animated movement.
  • the remaining flagged objects are copied to a temporary holding data container, and the positioning function is applied to these objects.
  • the displayed images for those objects are animated to close the gaps left by the removed images, using the animation process 1400 to animate the movement of displayed images to their new positions.
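The filtering and searching processes described above share the same shape: flag the matching objects, animate unflagged objects off the interface, then reposition the flagged ones to close the gaps. A minimal combined sketch, with JavaScript standing in for ActionScript and the matching predicate, offscreen coordinate, and cover dimensions all assumed:

```javascript
const OFFSCREEN_Y = 10000;  // assumed offscreen target position

// Flag objects via the `matches` predicate; unflagged objects get an
// offscreen target, flagged objects get new gap-free grid positions.
// Animation process 1400 would ease each object to its target.
function filterAndPosition(container, matches, cols = 8, w = 120, h = 120) {
  const kept = [];       // temporary holding container for flagged objects
  const targets = [];
  for (const obj of container) {
    if (matches(obj)) kept.push(obj);
    else targets.push({ obj, x: 0, y: OFFSCREEN_Y });  // animate off canvas
  }
  kept.forEach((obj, i) => targets.push({
    obj,
    x: (i % cols) * w,
    y: Math.floor(i / cols) * h,
  }));
  return targets;
}
```

A filter control would call this with a metadata predicate such as `o => o.genre === "jazz"`, while the search textbox would pass a predicate testing object properties against the (possibly partial) search string.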
  • the system described above allows users of the Internet to explore and access music albums and tracks and can be used, for example, as part of an online music store, allowing users to preview tracks for evaluation purposes prior to purchase.
  • the user interface could alternatively be implemented as a component of a stand-alone computer system or device for exploring and accessing data stored on that system or device, or on a storage medium associated with or accessible by that system or device.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A system for generating a user interface to display data associated with entities, the system being adapted to generate a display of images associated with respective ones of said entities at a first magnification, said images being associated with controls responsive to input from a user to animate the display so as to centre and display the selected image at a second magnification greater than the first; and the system being further adapted to animate the display, in response to selection by a user of a region between displayed images, so as to display the images at a magnification less than the second.
PCT/AU2006/001698 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities WO2008058306A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2006350947A AU2006350947A1 (en) 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities
PCT/AU2006/001698 WO2008058306A1 (fr) 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/AU2006/001698 WO2008058306A1 (fr) 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities

Publications (1)

Publication Number Publication Date
WO2008058306A1 (fr) 2008-05-22

Family

ID=39401223

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2006/001698 WO2008058306A1 (fr) 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities

Country Status (2)

Country Link
AU (1) AU2006350947A1 (fr)
WO (1) WO2008058306A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050005241A1 (en) * 2003-05-08 2005-01-06 Hunleth Frank A. Methods and systems for generating a zoomable graphical user interface
US20060193538A1 (en) * 2005-02-28 2006-08-31 Microsoft Corporation Graphical user interface system and process for navigating a set of images

Also Published As

Publication number Publication date
AU2006350947A1 (en) 2008-05-22

Similar Documents

Publication Publication Date Title
Abowd et al. The Family Video Archive: an annotation and browsing environment for home movies
US20010056434A1 (en) Systems, methods and computer program products for managing multimedia content
US8805830B2 (en) Web application for accessing media streams
US7581186B2 (en) Media manager with integrated browsers
US7730423B2 (en) Method and system for organizing document information
Marchionini et al. Interfaces and tools for the Library of Congress national digital library program
US8327268B2 (en) System and method for dynamic visual presentation of digital audio content
US20080163056A1 (en) Method and apparatus for providing a graphical representation of content
US20080195970A1 (en) Smart genre display
US20100318939A1 (en) Method for providing list of contents and multimedia apparatus applying the same
WO2007126996A2 (fr) System and methods for facilitating the entry of metadata
WO2005106637A2 (fr) Navigating media items
JP2010532059A (ja) Centrally fixed lists
WO2007149405A2 (fr) Structured playlists and user interface
JP4802833B2 (ja) Content search device, search method, and search program
KR20130106812A (ko) Method and system for organizing and visualizing media items
US20180033078A1 Interface for enhanced continuity of browsing experience
KR20030019603A (ko) Item selection
US20140181088A1 Activity contextualization
EP1721265A1 (fr) Data management system
Miser Sams Teach Yourself iTunes 10 in 10 Minutes
WO2008058306A1 (fr) Process for generating a user interface to display data associated with entities
WO2008005174A2 (fr) Intelligent display
Lee User interface design for keyframe-based content browsing of digital video
US20030061235A1 (en) Display of a digital information content and method of selection

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 06804517

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (PCT application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2006350947

Country of ref document: AU

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2006350947

Country of ref document: AU

Date of ref document: 20061114

Kind code of ref document: A

122 EP: PCT application non-entry in European phase

Ref document number: 06804517

Country of ref document: EP

Kind code of ref document: A1