AU2006350947A1 - Process for generating a user interface to display data associated with entities - Google Patents

Process for generating a user interface to display data associated with entities

Info

Publication number
AU2006350947A1
Authority
AU
Australia
Prior art keywords
display
images
magnification
user
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2006350947A
Inventor
Heiko Waechter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DELOITTE INVESTMENTS Pty Ltd
Original Assignee
DELOITTE INVEST Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DELOITTE INVEST Pty Ltd filed Critical DELOITTE INVEST Pty Ltd
Publication of AU2006350947A1 publication Critical patent/AU2006350947A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus

Description

WO 2008/058306 PCT/AU2006/001698

PROCESS FOR GENERATING A USER INTERFACE TO DISPLAY DATA ASSOCIATED WITH ENTITIES

FIELD OF THE INVENTION

The present invention relates to a process for generating a user interface to display data associated with entities.

BACKGROUND

The visualisation and exploration of information continues to be an area of active research and development in information technology. The past decade has seen substantial increases in the processing power and storage capacity of the personal computer and similar electronic devices, accompanied by similar increases in their popularity and in broadband access to the Internet. These and other related changes have created a rising demand for technologies that allow people to quickly access and comprehend large sets of information, and to be able to visualise, explore, and access selected parts of those sets of information. Although a wide variety of methods have been developed to visualise extremely large and complex data sets, it has also become important to be able to visualise and explore (e.g., browse and navigate) relatively small data sets.

In particular, the development of compressed audio formats such as MPEG layer 3, known widely as MP3, and the availability of handheld audio devices capable of storing and playing large numbers of compressed audio files, has meant that a significant and growing proportion of the populations of many countries have substantial music collections electronically stored on such devices or on personal computers. Typically, these collections are created by converting individual tracks of compact discs from an uncompressed pulse code modulated (PCM) format to respective data files in a compressed format (typically MP3 format) and/or by downloading already compressed music data files directly from the Internet.
In either case, the result is a collection or 'library' of individual data files representing respective musical works (referred to herein for convenience as 'music tracks'), often comprising some hundreds or thousands of such files. Although groups of these files are often closely associated, in particular if they represent tracks originating from the same compact disc or collection, or are otherwise part of the same release by a particular artist, this may not be apparent in the user's library of data files.

A considerable number of software applications have been developed to assist users in managing and exploring their libraries of music files. The most popular of these software applications is probably the iTunes software application developed by Apple Computer Inc., freely available from http://www.apple.com/itunes. However, even though iTunes and other applications generate and maintain metadata for each music track, such as Track Title, Artist, Album Title, Release Year, etc., which allow users to identify tracks originating from the same 'Album' (e.g., compact disc (CD)), the library is nevertheless presented to the user as a scrollable table of track metadata. The table can be filtered to reduce the number of displayed tracks by characteristics such as musical genre, artist, or release date, but the interface only allows the user to view a fraction of their entire collection at any time. Moreover, representation of tracks as a scrollable table of textual information does little to stimulate a user browsing the collection or to present the collection in a visually appealing or stimulating manner.

The latest version of iTunes, version 7.0, released on September 12, 2006, incorporates a plug-in component named Cover Flow, previously available as shareware, which does allow users to associate tracks originating from the same album with a visual image of the album's cover art.
Users who obtain an Apple Store ID by entering their credit card details into an online form can now view their iTunes library as a 3D representation of albums that they can 'flick through' by scrubbing sideways, apparently mimicking the appearance and movement of a stack of physical CDs or vinyl record albums being browsed in a real-world music store. However, the Cover Flow interface, which exists alongside the text-based iTunes interface, rather than replacing it, only allows the user to view a fraction of their collection at any time (13 albums), which makes it difficult to gain an overview of the entire collection, and it remains inconvenient to locate an individual album or track.
It is desired to provide a system and process for generating a user interface to display data associated with entities that alleviate one or more difficulties of the prior art, or at least that provide a useful alternative.

SUMMARY OF THE INVENTION

In accordance with the present invention, there is provided a process for generating a user interface to display data associated with entities, the process including:

generating a display of images associated with respective ones of said entities at a first magnification;

animating, in response to selection of one of said images by a user, said display to centre and display the selected image at a second magnification greater than said first magnification; and

animating, in response to selection by a user of a region between displayed images, said display to display said images at a magnification less than said second magnification.

The present invention also provides a process for generating a user interface to display data associated with entities, the process including:

generating a display of mutually spaced images associated with respective ones of said entities, said display being at a first magnification;

animating, in response to selection of one of said images by a user, said display to a second magnification greater than said first magnification, wherein the selected image is substantially centred within the display;

wherein the display is responsive to input from the user to display data associated with a corresponding entity; and

wherein the display is responsive to selection of a region between images displayed at said second magnification to animate the display to said first magnification.
The present invention also provides a system for generating a user interface to display data associated with entities, the system being adapted to generate a display of images associated with respective ones of said entities at a first magnification, said images being associated with controls responsive to input from a user to animate said display to centre and display the selected image at a second magnification greater than said first magnification; and the system being further adapted to animate, in response to selection by a user of a region between displayed images, said display to display said images at a magnification less than said second magnification.

BRIEF DESCRIPTION OF THE DRAWINGS

Preferred embodiments of the present invention are hereinafter described, by way of example only, with reference to the accompanying drawings, wherein:

Figures 1 to 6 are schematic diagrams of a preferred embodiment of a user interface for exploring entities;
Figures 7 to 10 are schematic diagrams of a preferred embodiment of a user interface for exploring albums of music tracks;
Figure 11 is a block diagram of a preferred embodiment of a system for exploring entities;
Figure 12 is a flow diagram of an initialisation process executed by the system;
Figure 13 is a schematic diagram of an entity object generated by the system;
Figure 14 is a flow diagram of an animation process executed by the system;
Figure 15 is a flow diagram of an entity search process executed by the system;
Figure 16 is a flow diagram of an entity sorting process executed by the system; and
Figure 17 is a flow diagram of an entity filtering process executed by the system.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Figure 1 shows an easy to use, intuitive, and visually appealing user interface that allows a user to visualise, browse, and explore entities associated with or including image data representing a visual image or picture, and preferably also metadata representing properties of the corresponding entity, and data of the entity. The entities may be members of a group, or may be a group of sub-entities, and the user interface can be used to rapidly locate a desired entity or entities, possibly with the assistance of one or more search or filtering criteria specified by the user, and to gain access to the data and/or metadata associated with that entity or entities.

As shown in Figure 1, the graphical user interface or GUI 102 displays the images 104 associated with the entities as a two-dimensional array occupying most of the displayed area of the interface 102, referred to herein as the canvas 108, but with a rectangular region 106 across the top of the canvas 108 displaying controls for filtering, sorting, and searching the entities corresponding to the displayed images 104. A characteristic feature of the interface 102 is that, if possible, all of the displayed entities are made simultaneously visible to the user by being displayed on a single working space or canvas 108, thus providing the user with a visual overview of all the displayed images 104, rather than, for example, dividing the images 104 over multiple pages. However, if the number of displayed entities becomes too large to allow the user to resolve or recognise the individual displayed images 104 of the entities, then the number of images 104 visible at any one time is limited to the number that allows the images 104 to be resolved or recognised by the user.
A navigation tool 204 allows the user to scroll the entire working space or canvas if the space taken up by all images 104 is larger than the screen viewing area. However, in many cases use of the filtering functions accessed through controls in the top region 106 allows the number of images 104 displayed to be reduced to a number that can be simultaneously displayed to a user at a size and resolution sufficient for the user to resolve or recognise individual images 104 associated with those entities.
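The patent gives no code for this sizing rule; the following is a minimal JavaScript sketch of one way it could work (the function name, the minimum cell size, and the row-major grid are illustrative assumptions, not from the patent):

```javascript
// Hypothetical sketch of the overview sizing rule described above: images are
// laid out in a row-major grid, and if the grid would make cells too small to
// recognise, only as many images as fit at the minimum cell size are visible
// at once (the rest being reached by scrolling).
function layoutOverview(imageCount, canvasWidth, canvasHeight, minCell) {
  const cols = Math.max(1, Math.floor(canvasWidth / minCell)); // columns that fit
  const rows = Math.ceil(imageCount / cols);                   // rows required
  const fitsOnScreen = rows * minCell <= canvasHeight;
  const visible = fitsOnScreen
    ? imageCount
    : cols * Math.floor(canvasHeight / minCell);               // visible subset
  return { cols, rows, visible, fitsOnScreen };
}
```

With a 1280 by 800 canvas and an assumed 160-pixel minimum cell size, forty images yield an eight across by five down overview, matching the album example later in this description.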
When the user recognises or otherwise selects a particular one of the displayed images 104 as being of interest, that item can be selected by moving a pointing device such as a standard computer mouse cursor over that image and clicking the pointing device to select the image and hence also the corresponding entity. In response, the interface 102 zooms in to display a portion or region of the canvas 108 at a higher magnification, centres the selected image 202 in the displayed region, as shown in Figure 2, and highlights the selected image 202, either by drawing a border around the image 202, or by modifying one or more display parameters (e.g., brightness) for that image 202 to provide a visual indication that the image 202 is selected. Simultaneously, a navigation tool 204 is displayed near the top right hand corner of the interface 102 to represent the size and position of the displayed region relative to the entire canvas 108. The navigation tool 204 allows the user to click and drag the representation of the visualised region to view a different portion of the canvas 108. The interface 102 can be made to zoom back out to view the entire canvas 108 (i.e., to provide the overview shown in Figure 1) by using the pointing device to click on a region between images 104, such as the location 206 shown in Figure 2, for example.

A characteristic of the interface 102 is that movements (including changes in magnification) are effected by animation, whereby a movement from a first location and/or magnification to a second location and/or magnification is perceived by the user as a smooth movement. Where appropriate, the speed of this movement decreases smoothly with remaining distance to the second location and/or magnification. These features provide the user interface 102 with great visual appeal and elegance, providing great satisfaction and enjoyment to the user, and making the interface a pleasure to use.
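The decelerating movement described above corresponds to advancing a fixed fraction of the remaining distance on each animation frame, so that speed is proportional to distance and falls away smoothly near the target. A minimal sketch in JavaScript rather than the patent's ActionScript (the fraction of 0.2 and the one-unit snap threshold are illustrative assumptions):

```javascript
// Each frame, advance a fixed fraction of the remaining distance; snap to the
// target once within one unit so the animation terminates.
function stepToward(current, target, fraction) {
  const dist = target - current;
  if (Math.abs(dist) < 1) return target;
  return current + dist * fraction;
}

// Animating a zoom level from 100% toward 400%:
let zoom = 100;
const trail = [];
for (let frame = 0; frame < 5; frame++) {
  zoom = stepToward(zoom, 400, 0.2);
  trail.push(zoom);
}
// Per-frame steps shrink (roughly 60, 48, 38.4, ...) as the zoom nears 400.
```

The same helper applies unchanged to x and y coordinates, which is why translations and magnification changes share one visual character in the interface.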
As described above, selection of an entity by single clicking a displayed image 104 associated with the entity causes animation of the interface 102 to display a zoomed view wherein the selected image 202 is centred within the field of view and highlighted. If the display was already zoomed when the selection was changed, then the displayed field of view is only translated to centre the newly selected image. Otherwise, a zoom operation to a higher magnification is also performed. Once highlighted, a second click on the selected image 202 (or a double-click if the image is not yet selected) causes information or metadata associated with the selected image 202 (and thus with the corresponding entity) to be displayed adjacent to the selected image 202, as shown in Figure 3. This is achieved by animating the display so that the selected image 202 and the displayed information are contained within a bounding box 302 (which may or may not itself be displayed), centred within the field of view. This requires the selected image 202 and the adjacent images 304 to be moved from the positions at which they were displayed prior to the second click (i.e., arranged as shown in Figure 2). As described above, this movement is achieved in a visually appealing manner by animating the images 202, 304 on paths from their initial to their final or 'target' positions in a smooth movement whereby the speed of movement is proportional to the remaining distance to the target position and therefore decreases as each image 202, 304 approaches its final position.

As shown in Figure 3, the information or metadata associated with the selected entity typically includes high level descriptive information 306 associated with the entity, such as the entity's name or title, a category for the entity, and a date associated with the entity.
For example, if the entities are personnel or assets of an organisation, the images 104 may be photographic images of the corresponding personnel or assets, and the high level information 306 may be the person or asset's name, date of birth or purchase, role, location, etc. Also displayed is general descriptive or other information 308 for the entity and one or more controls or links 310 that can be used to access other (possibly multimedia) information or data associated with the entity. For example, selection of one of the links 310 may cause another window to be opened on the display, and further images or multimedia data, including video clips, animations, etc., may be displayed. Alternatively, selection of one of the links may cause audio data associated with the selected entity to be played on an audio output device associated with the system. Clicking on any region between the bounding box 302 and the images 304 associated with unselected entities causes the bounding box 302 and the information/metadata displayed within it to be removed from the display, and the displayed view to be zoomed out in a smooth, animated movement.

As described above, when the entire canvas 108 is displayed (i.e., in the zoomed out state as shown in Figure 1), a rectangular region 106 at the top of the canvas 108 provides controls for filtering, searching, and sorting the displayed images 104. The filtering controls allow the user to reduce the number of displayed images 104 to those matching (or associated with information, metadata, or other data matching) one or more selection criteria. Alternatively, the user can apply a searching function to display only those entities matching a search string entered into a textbox by the user.
In either case, the images 402 that do not match the selection or search criteria are removed from the display in an animated manner, and in a particular order; in this case, from left to right, top to bottom, as shown in Figure 4. In this example, the 'deselected' images 402 are animated from their existing onscreen positions to offscreen positions, thereby appearing to 'drop off' the screen. The remaining images are then rearranged in smooth animated movements to remove the resulting gaps, as shown in Figure 5, leaving an array of remaining images 104, as shown in Figure 6. This animation maintains the look and feel of the interface 102 established when the interface 102 is first initialised by the user. When the interface 102 is first generated, the displayed images 104 are placed on the screen in a sequential animated manner whereby the images are animated from an offscreen location to their final displayed location in a left to right and top to bottom manner, with the speed of movement of each image being proportional to the remaining distance between the image's current location and its final location on the display.

Similarly, a sorting function can be applied to the displayed entities to rearrange the images 104 based on the sort criteria selected by the user. The sorting can be applied based on the information or metadata associated with each entity, or indeed with properties of the images 104 themselves. In any case, when the display order of the images 104 is changed, the images 104 are rearranged on the display in a visually pleasing animated fashion. After the animated rearrangement, the displayed images 104 reflect the selected sort order.
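One way to obtain the left-to-right, top-to-bottom animation order described above is to sort the affected grid indices (which are already in row-major order) and assign each a successively later animation start time. A JavaScript sketch; the helper name and the delay step are illustrative assumptions, not from the patent:

```javascript
// Assign staggered start delays so deselected images animate offscreen in
// row-major (left-to-right, top-to-bottom) order; the same scheme works for
// the initial placement animation.
function removalDelays(deselectedIndices, stepMs) {
  const delays = {};
  [...deselectedIndices]
    .sort((a, b) => a - b)                        // row-major order = ascending index
    .forEach((idx, k) => { delays[idx] = k * stepMs; });
  return delays;
}
```

For example, removalDelays([10, 2, 5], 50) starts image 2 immediately, image 5 after 50 ms, and image 10 after 100 ms, producing the sequential 'drop off' effect.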
Although the user interface described above can be used to browse and explore information associated with a wide variety of different types of entities associated with respective images, it has been found to be particularly effective for managing and interacting with collections of audio data, specifically collections of individual music tracks, where each image 104 represents a small collection of associated music tracks. The association of a single image with a set or collection of music tracks is of course based on the traditional distribution of music on physical media such as compact discs (CDs), minidisks, vinyl records, cassette tapes, and the like. In cases where the set of music tracks represents musical works that were in fact released as an aggregate work or collection on such a physical medium, the image associated with the collection of music tracks is preferably the image used on the cover or other form of packaging of the physical medium on which the music tracks were provided. However, it will be appreciated that the distribution of music continues to become less associated with the traditional forms of distribution on physical media, in large part due to the growing popularity of network based distribution channels such as the iTunes music store. Accordingly, although the description below refers to each image 104 as a "cover image" or "cover art", it should be apparent that the image associated with a set of music tracks need not actually be an image reproduced on a physical medium on which the music has been distributed, but can be any image used to represent the set. Similarly, references to 'album' in this specification should be understood as referring to a group of tracks associated by any means. For example, a user may arbitrarily select tracks as members of a group and associate an arbitrary image with those tracks.
Nevertheless, the resulting group of tracks is referred to herein as an album and the associated image as cover art or a cover image.

In the context of a graphical user interface for browsing and accessing audio data in the form of music tracks, a graphical user interface is provided in the form shown in Figure 7, with cover images 702 arranged in a rectangular array. In the case of the interface being displayed on the full screen of a Macintosh notebook computer having a widescreen display, forty cover images 702 are arranged in an eight across by five down array. In this context, the user interface presents cover images to a user in a manner analogous to laying out physical cover images on a flat surface. Part of the power of this arrangement is that people often have a strong visual memory for cover art and associate it strongly with the corresponding music, even though they may not be able to recall the name of the album, or even the performing artist. Presenting a substantial number of cover images to the user allows the user to quickly scan those images and make possibly impulsive selections that may not be based on a conscious awareness of which album they are selecting.

The default display (sort) order is alphabetical by artist, from left to right, top to bottom. However, four sort controls 704 are provided to sort or arrange the displayed cover images 702 by artist, musical genre, album title, or year of release. In an alternative embodiment, another control allows the user to sort the cover images by properties (e.g., colour properties) of the cover images themselves. Drop down menus 706 and a search textbox 708 allow the user to filter the total collection of albums to those matching a search string entered by the user into the search textbox 708, where the search string is applied to a specific item of information or metadata relating to each cover image, for instance artist name and album name.
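The search behaviour just described amounts to a case-insensitive substring match of the search string against selected metadata fields. A JavaScript sketch (the function is illustrative; the property names merely echo the XML example given later in this description):

```javascript
// Keep only albums whose chosen metadata fields contain the query string,
// ignoring case; an empty query leaves the collection unfiltered.
function filterAlbums(albums, query, fields) {
  const q = query.trim().toLowerCase();
  if (q === "") return albums.slice();
  return albums.filter(album =>
    fields.some(f => String(album[f] ?? "").toLowerCase().includes(q))
  );
}

const library = [
  { myartist: "Air", mytitle: "10000 Hz" },
  { myartist: "Michael Jackson", mytitle: "Thriller" },
];
const hits = filterAlbums(library, "air", ["myartist", "mytitle"]);
```

The surviving album objects would then be handed to the animation process so that non-matching cover images drop off the screen in the order described above.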
As shown in Figure 8, the selection of a displayed cover image 802 causes the display to zoom in on that image 802, and centre within the display a bounding box 804 encompassing the selected cover image 802 and associated information on the collection of music tracks associated with the cover image 802. For example, Figure 8 shows an example where the selected cover image 802 is that of Michael Jackson's vinyl/CD album entitled Thriller, released in 1994, whose musical genre is classified as pop. A show tracks control 806 is provided that, when clicked, causes the canvas to zoom in further and the bounding box 804 to expand to accommodate the display of the list of associated music track titles and their playing times or durations, together with a hide tracks control 906 to allow the track information to be hidden and the canvas to be zoomed out again. This process is similar to physically picking up a CD in a record store and bringing it closer to look at the track information on the back. Selecting the hide tracks control 906 is thus similar to putting the CD back into the shelf stack but still keeping it in focus.

Selecting a track title by clicking on it using the pointing device causes that track to be played by an associated music player, whose controls are also displayed in the interface, as shown in Figure 10. The player controls include standard controls 1002 for playing, pausing, and skipping the music tracks associated with the selected cover image 802, with the corresponding artist and title displayed in a rectangular region 1004 just to the left of the player controls. A series of numbered rectangular icons 1006 to the right of the player controls represent the respective tracks associated with the selected cover image 802, with the number in each icon representing the track number within the set.
The track currently playing is shown in an expanded box 1008, providing the current time position of the track, its total playing time, and its title, together with a progress bar 1010 representing the elapsed and remaining time for that track. If the system pointing device is hovered over one of the other track icons 1006, then the icon is expanded to reveal the corresponding track title and its playing time. Clicking the mouse on any one of the track icons 1006 causes the corresponding track to be played.

As shown in Figure 11, a system 1102 for generating the interfaces described above is based on Flash Studio 8, available from Adobe Systems Incorporated, at http://www.adobe.com/products/flash. The system 1102 includes a database 1104 in which is stored a collection of music tracks, referred to herein as a music library, together with cover images associated with respective subsets of those tracks. The system also includes one or more XML files 1106 containing references to the music tracks and cover images, metadata associated with each music track and set, and at least one Flash SWF file 1108 containing an applet to generate the user interface. The XML files 1106, SWF files 1108, and music tracks stored in the database 1104 are accessed by a remote computer system 1110 via a communications network such as the Internet 1112. Alternatively, these files can be accessed via a local network or locally on a computer system on which the application is installed. The remote computer system 1110 includes an operating system 1114 such as Microsoft Windows XP, a web browser application such as Microsoft Internet Explorer 1116, and Adobe's Flash Player web browser plugin 1118, available from Adobe Systems Incorporated.
A user of the remote computer system 1110 can access the music tracks stored on the database 1104 by entering a universal resource locator (URL) referencing the system 1102 into the web browser application 1116. This causes the web browser 1116 to access a web server application 1120 installed on the system 1102 to access the XML files 1106 and SWF files 1108. The SWF file 1108 is thus transferred to the remote system 1110, allowing the Flash Player plugin 1118 to generate the user interface. The SWF file defines the user interface behaviour described above, and the XML files 1106 provide the metadata associated with each album and with each music track, and provide file names or URLs referencing the cover images and music tracks stored in the database 1104, which are accessed by the Flash Player 1118 via middleware 1122 which allows the Flash Player 1118 to access the SQL database 1104. It will be apparent to those skilled in the art that a variety of different middleware applications can be used to provide this function, including PHP/MySQL, Cold Fusion, and J2EE.

In the described embodiment, the system 1102 and remote computer system 1110 are standard computer systems, such as Intel IA-32 based computers, and the user interface is implemented as program instructions in software. However, it will be apparent to those skilled in the art that the various components of the system 1102 could alternatively be distributed over a variety of alternative locations, and moreover that the user interface could be generated in part or in total by one or more dedicated hardware components, such as application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs), for example.

For the purposes of illustration, the following is an XML file 1106 for a library consisting of only one album or set of music tracks, entitled '10000hz' by the French group Air. The album was released in 2001 and is categorised as belonging to the ambient music genre. It comprises 11 music tracks with a total playing time of 60 minutes and 42 seconds. For the sake of brevity, metadata for only four of the eleven tracks is shown, defining the track number, the track playing time, and the filename of the file whose content is compressed audio data in the MP3 format, representing the music track.

<library>
    <album id="10000hz">
        <mytitle>10000 Hz</mytitle>
        <artist>Air</artist>
        <genre>Ambient</genre>
        <year>2001</year>
        <image>10000hz.jpg</image>
        <mytracks>11</mytracks>
        <playtime>60:42</playtime>
        <songlist>
            <song name="Electronic Performers"
                track="1"
                duration="5:35"
                file="01 Electronic Performers.mp3" />
            <song name="How Does It Make You Feel"
                track="2"
                duration="4:37"
                file="02 How Does It Make You Feel.mp3" />
            <song name="Radio #1"
                track="3"
                duration="4:22"
                file="03 Radio #1.mp3" />
            <song name="The Vagabond"
                track="4"
                duration="5:37"
                file="04 The Vagabond.mp3" />
        </songlist>
    </album>
</library>

When the system is first initialised, it executes an initialisation process, as shown in Figure 12. At step 1202, the system reads into memory (RAM) all the functions used by the interface, the scripts for data handling, and the graphical user interface elements. At step 1204, the XML file 1106 containing the file references and metadata for the music collection is retrieved from the system 1102 using HTTP.

After the application has parsed the XML file 1106, a data object 1300 is created for each XML data entry, with properties referencing the original metadata, in this case for each album (or set of tracks), as shown in Figure 13. As each object 1300 represents an album, the name given to the object is the name of the corresponding album, as shown at 1304.
Similarly, the properties 1306 given to the object 1300 are the metadata for the corresponding album, including a file reference to the image that will be used to represent the album in the interface.

Having created the album object 1300 at step 1206, at step 1208 data entries 1308 corresponding to the individual tracks of the album, as defined in the XML file 1106, are added to the object 1300. Steps 1206 and 1208 are repeated until each album referenced in the XML file 1106 has an associated data object. All data objects are added to an object container (Array), which enables the application to examine all data objects and rearrange their order by data object properties. This functionality is used when searching, filtering and sorting functions are applied to the data objects.

At step 1212, an onscreen object is created for each data object 1300 by a "for-loop" control flow structure which cycles through all data objects in the object container (Array) and creates an onscreen object or graphical representation for each data object. These graphical objects are arranged on the screen based on the default sort order and positioning functions, as described further below. Finally, at step 1214 the onscreen objects are initialised for user interaction, and the graphical user interface is initialised.
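For illustration only, the construction of album data objects and their container can be sketched in JavaScript (the original implementation is ActionScript; the object shape and function names here are assumptions modelled on the sample XML file 1106):

```javascript
// Build an album data object (cf. object 1300) from a parsed XML entry.
// Properties reference the original metadata; per-track data entries
// (cf. 1308) are added to the object's songs array.
function makeAlbumObject(entry) {
  var album = {
    name: entry.id,        // object named after the album (cf. 1304)
    mytitle: entry.mytitle,
    artist: entry.artist,
    genre: entry.genre,
    year: entry.year,
    image: entry.image,    // cover image representing the album
    playtime: entry.playtime,
    songs: []
  };
  for (var i = 0; i < entry.songlist.length; i++) {
    var song = entry.songlist[i];
    album.songs.push({ name: song.name, track: song.track,
                       duration: song.duration, file: song.file });
  }
  return album;
}

// The object container: an array enabling later examination and
// reordering of all data objects by their properties.
var albumList = [];
var parsedEntry = {
  id: "10000hz", mytitle: "10000 Hz", artist: "Air", genre: "Ambient",
  year: "2001", image: "10000hz.jpg", playtime: "60:42",
  songlist: [{ name: "Radio #1", track: "3", duration: "4:22",
               file: "03 Radio #1.mp3" }]
};
albumList.push(makeAlbumObject(parsedEntry));
```

Keeping all objects in a single array is what later allows the sorting, filtering and searching functions to operate uniformly over the whole collection.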
The arrangement of objects on the interface is determined by a positioning function. In its most basic form, the positioning function arranges objects from left to right, top to bottom, as in the following function that is executed at startup to arrange all objects (i.e., cover images) on the interface:

function drawFiles(whatContainer) {
	for (var i = 0; i < whatContainer.length; i++) {
		var tx = i % 8;
		var ty = Math.floor(i / 8);
		_root.attachMovie("albumFile", "albumFile" + i, i);
		_root["albumFile" + i]._x = 240 * tx;
		_root["albumFile" + i]._y = 220 * ty;
	}
}

// _root.albumList.sortOn(["myartist", "mytitle"]);
drawFiles(_root.albumList);

This function accepts the parameter whatContainer, a placeholder for the data array of album objects. A for-loop cycles through all the cover objects in the data array, attaches object instances to the stage (the Flash term for the display canvas), and assigns x/y coordinates based on the ordering logic. In this case, the instances are arranged from left to right, top to bottom, in rows of 8 items.

All onscreen objects are animated by an animation process, as shown in Figure 14, that assigns new x and y coordinates to an object and animates the movement of the object from its current position to the new target position. The target values come from a positioning script, as described above, that determines the logic by which objects are arranged in the interface, on startup as well as when using the filter, sort and search functions (see Figures 14 to 16) that determine which objects are displayed, and their arrangement on the canvas. The animation steps 1406 to 1412 of the animation process are performed by the moveAlbums function:

function moveAlbums(clip, x, y, s) {
	var xTween = true;
	var yTween = true;
	var xTarget = x;
	var yTarget = y;
	var speed = s;
	clip.onEnterFrame = function() {
		// check x
		if (xTween) {
			var dist = (xTarget - this._x);
			if (Math.abs(dist) < 1) {
				this._x = xTarget;
				xTween = false;
			} else {
				this._x += dist / speed;
			}
		}
		// check y
		if (yTween) {
			var dist = (yTarget - this._y);
			if (Math.abs(dist) < 1) {
				this._y = yTarget;
				yTween = false;
			} else {
				this._y += dist / speed;
			}
		}
		if (xTween || yTween) {
		} else {
			delete this.onEnterFrame;
		}
	}
}

This function accepts the following parameters:

(i) clip: the object that the function is applied to (the word 'clip' is the terminology used by Flash by analogy with movie clips);
(ii) x: the target x position;
(iii) y: the target y position; and
(iv) s: a parameter that determines the speed at which the object will move to its target position.
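The easing logic of moveAlbums can be illustrated with a small JavaScript sketch (hypothetical names; the original is an ActionScript onEnterFrame handler operating on movie clips): each frame moves the object by the remaining distance divided by the speed parameter, and snaps to the target once within one pixel.

```javascript
// Per-frame easing step: move by a fixed fraction of the remaining
// distance, snapping to the target and stopping once the remaining
// distance is under 1 pixel.
function easeStep(pos, target, speed) {
  var dist = target - pos;
  if (Math.abs(dist) < 1) {
    return { pos: target, done: true };     // snap and stop
  }
  return { pos: pos + dist / speed, done: false };
}

// Drive the step function frame by frame until the movement completes,
// as onEnterFrame would on each screen refresh.
function animate(pos, target, speed) {
  var frames = 0;
  var state = { pos: pos, done: false };
  while (!state.done) {
    state = easeStep(state.pos, target, speed);
    frames++;
  }
  return { pos: state.pos, frames: frames };
}

var result = animate(0, 240, 4);
```

Because each frame's step is proportional to the remaining distance, the object moves quickly at first and then decelerates, producing the exponential 'easing out' decay described below.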
The onEnterFrame function is executed every time the application refreshes the interface screen. The framerate (how many times the interface screen is refreshed per second) can be set in the Flash authoring environment and determines the framerate of the published SWF file. A typical framerate that guarantees smooth movement is 30 frames per second. A lower frame rate is less processor intensive, but animated movements become choppy. The onEnterFrame function determines the distance between the current position and the target position (step 1406), and divides that value by the speed parameter. The object's current position is then changed by the resulting number (step 1408) to incrementally move the object across the interface, giving the illusion of smooth movement.

As the animation starts, the distance between the object and the target position is largest, so the object travels faster to cover the distance. As the object gets closer to its target position, the increment added to the object's position becomes smaller in proportion to the remaining distance, and therefore the speed of movement decreases exponentially over time. This 'easing out' or exponential decay effect further adds to the impression of smooth animation. On every screen refresh, the function checks the distance between the current and target positions (dist = (xTarget - this._x)). If the distance is less than 1 pixel for both the x and y positions, then the movement function is removed from the object (step 1412) to avoid unnecessary processing.

As described above, the interface includes sort controls 704 for changing the sort order of displayed cover images. Selection of these sort controls 704 causes the system to execute a sorting process, as shown in Figure 15.
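As a purely illustrative sketch, sorting the object container by metadata keys and assigning new grid coordinates might look like the following JavaScript (the original uses ActionScript's Array.sortOn; the album data values here are invented):

```javascript
// Sort the object container by one or more metadata properties, then
// assign new grid coordinates in the resulting order (240 x 220 px
// cells, rows of 8, as in the drawFiles positioning function).
function sortAndPosition(albums, keys) {
  var sorted = albums.slice().sort(function (a, b) {
    for (var i = 0; i < keys.length; i++) {
      if (a[keys[i]] < b[keys[i]]) return -1;
      if (a[keys[i]] > b[keys[i]]) return 1;
    }
    return 0;
  });
  return sorted.map(function (a, i) {
    return { name: a.mytitle, x: 240 * (i % 8), y: 220 * Math.floor(i / 8) };
  });
}

var albums = [
  { mytitle: "Talkie Walkie", myartist: "Air" },
  { mytitle: "10000 Hz", myartist: "Air" },
  { mytitle: "Homework", myartist: "Daft Punk" }
];
var placed = sortAndPosition(albums, ["myartist", "mytitle"]);
```

Sorting a copy of the container and then recomputing positions keeps the sort step independent of the animation step: the new coordinates are simply handed to the animation process as targets.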
At step 1502, the process receives selection data representing selection by the user of a particular one of the sort controls 704 corresponding to a particular sort order (e.g., sort by musical genre). At step 1504, the system sorts the data objects in the object container, in this case based on the metadata representing each album's musical genre. At step 1506, the system generates new coordinates for each displayed object, based on the sort order (using a left-to-right, top-to-bottom array order), and at step 1508, the system uses the animation process 1400 to animate the movement of displayed images to their new positions.

Similarly, use of the filtering controls 706 invokes an object filtering process, as shown in Figure 16. At step 1602, the process receives filtering data representing selection by the user of a particular one of the filtering controls 706 corresponding to a particular filtering criterion (e.g., filtering by the jazz musical genre to show only jazz albums). At step 1604, the system cycles through the object container, flagging objects whose properties (in this case the musical genre metadata) match the filtering criterion. At step 1606, any unflagged displayed objects are moved off the interface in a smooth animated movement. At step 1608, the remaining flagged objects are copied to a temporary holding data container, and the positioning function is applied to these objects. Finally, at step 1610, the displayed images for those objects are animated to close the gaps left by the removed images, using the animation process 1400 to animate the movement of displayed images to their new positions.

Finally, use of the search textbox 708 invokes an object searching process, as shown in Figure 17. On each keypress, at step 1704 the process receives search data representing the (possibly partial) search string entered in the textbox 708 by the user.
At step 1706, the system cycles through the data container, flagging objects whose properties match the search string. At step 1708, any unflagged displayed objects are moved off the interface in a smooth animated movement. At step 1710, the remaining flagged objects are copied to a temporary holding data container, and the positioning function is applied to these objects. Finally, at step 1712, the displayed images for those objects are animated to close the gaps left by the removed images, using the animation process 1400 to animate the movement of displayed images to their new positions.

The system described above allows users of the Internet to explore and access music albums and tracks, and can be used, for example, as part of an online music store, allowing users to preview tracks for evaluation purposes prior to purchase. However, it will be apparent that the user interface could alternatively be implemented as a component of a stand-alone computer system or device for exploring and accessing data stored on that system or device, or on a storage medium associated with or accessible by that system or device.

Many modifications will be apparent to those skilled in the art without departing from the scope of the present invention as hereinbefore described with reference to the accompanying drawings.
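The filtering and searching processes of Figures 16 and 17 share a common flag-and-reposition pattern, sketched below in JavaScript for illustration only (names and album data are assumptions; the original is ActionScript):

```javascript
// Flag objects whose properties satisfy a predicate, then return new
// grid positions for the flagged objects; the unflagged objects would
// be animated off-screen by the animation process.
function flagAndReposition(albums, matches) {
  var flagged = albums.filter(matches);
  return flagged.map(function (a, i) {
    return { album: a, x: 240 * (i % 8), y: 220 * Math.floor(i / 8) };
  });
}

var albums = [
  { mytitle: "10000 Hz", artist: "Air", genre: "Ambient" },
  { mytitle: "Kind of Blue", artist: "Miles Davis", genre: "Jazz" },
  { mytitle: "Time Out", artist: "Dave Brubeck", genre: "Jazz" }
];

// Filtering control: keep only albums whose genre metadata is "Jazz".
var jazz = flagAndReposition(albums, function (a) {
  return a.genre === "Jazz";
});

// Search textbox: partial, case-insensitive match against any property,
// re-evaluated on each keypress.
function matchesQuery(q) {
  return function (a) {
    return Object.values(a).some(function (v) {
      return String(v).toLowerCase().indexOf(q.toLowerCase()) !== -1;
    });
  };
}
var hits = flagAndReposition(albums, matchesQuery("mile"));
```

Reusing one repositioning routine for both operations is what lets the remaining images close ranks identically whether objects were removed by a filter control or by a partial search string.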

Claims (36)

1. A process for generating a user interface to display data associated with entities, the process including: generating a display of images associated with respective ones of said entities at a first magnification; animating, in response to selection of one of said images by a user, said display to centre and display the selected image at a second magnification greater than said first magnification; and animating, in response to selection by a user of a region between displayed images, said display to display said images at a magnification less than said second magnification.

2. The process of claim 1, wherein the magnification less than said second magnification is equal to said first magnification.

3. The process of claim 1 or 2, including displaying, in response to receipt of input from a user, data associated with a corresponding entity.
4. The process of claim 3, including displaying, in response to receipt of further input from the user, further data associated with the corresponding entity.
5. The process of claim 3 or 4, including, in response to receipt of said input from the user, animating the display to display the selected image and the displayed data in a substantially centred region of the display.

6. The process of claim 4, including, in response to receipt of said input from the user, animating the display to display the selected image and the displayed data in a substantially centred region of the display at a third magnification greater than said second magnification.

7. The process of claim 4 or 6, including removing said further data from said display in response to receipt of input from the user.
8. The process of claim 6, including, in response to receipt of input from said user, removing said further data from said display, and animating said display to said second magnification.
9. The process of any one of claims 1 to 8, wherein the animation is effected in a smooth manner such that each movement from a first location to a second location is perceived as a smooth movement whose speed decreases smoothly with remaining distance to said second location.

10. The process of any one of claims 3 to 9, wherein the input represents a further selection of a selected image.

11. The process of any one of claims 3 to 10, wherein at least part of the displayed data is displayed as text.
12. The process of any one of claims 1 to 11, wherein said images are displayed as an array of mutually spaced images.
13. The process of any one of claims 1 to 12, wherein the display of the selected image at the second magnification includes displaying a graphical representation of a size and location of the displayed region relative to the display of said images at the first magnification.

14. The process of claim 13, including receiving user input to modify the graphical representation, and correspondingly modifying the location of the displayed region.
15. The process of any one of claims 1 to 14, wherein the display includes controls for determining arrangements of said images in said display, each control being responsive to input from the user to cause at least some of the displayed images to move from their current locations to respective new locations, each movement being animated in a smooth manner such that each movement from a first location to a second location is perceived as a smooth movement whose speed decreases smoothly with remaining distance to said second location.

16. The process of claim 15, wherein at least one of said controls is responsive to input from the user to cause at least some of the displayed images to be removed from said display, each such removal being effected as a smooth animated movement.

17. The process of claim 15 or 16, wherein at least one of said controls is responsive to input from the user to cause the displayed images to be rearranged on said display.
18. The process of any one of claims 1 to 17, wherein the entities represent files of audio data.
19. The process of any one of claims 1 to 17, wherein the entities represent groups of related files of audio data.

20. The process of claim 18 or 19, wherein each file of audio data represents a musical work.
21. The process of claim 19, wherein the related files of audio data of each group represent tracks of an album.
22. The process of claim 21, wherein each of said images represents an album cover.
23. The process of claim 21 or 22, wherein the data associated with a corresponding album includes a list of corresponding tracks of said album.
24. The process of claim 21 or 22, wherein the data associated with a corresponding album includes textual information identifying titles of tracks of said album and respective playing times, an artist, a release year, and a musical genre.

25. The process of claim 23 or 24, wherein a displayed track title is responsive to selection by the user to cause a corresponding track to be played.
26. The process of any one of claims 21 to 25, wherein a selected image representing an album cover is responsive to input from a user to cause the corresponding group of tracks to be played.

27. The process of claim 26, including modifying the selected image to indicate that the corresponding group of tracks is playing.

28. The process of claim 27, wherein a brightness of the selected image is repeatedly cycled from a first brightness to a second brightness to indicate that the corresponding group of tracks is playing.

29. The process of any one of claims 25 to 28, wherein the playing of a track includes displaying player controls on said display, said player controls including track controls for respective tracks of said group, selection of a track control causing a corresponding track to play.

30. The process of claim 29, wherein each of said track controls is responsive to hovering of a pointing device to cause display of corresponding track information.
31. A process for generating a user interface to display data associated with entities, the process including: generating a display of mutually spaced images associated with respective ones of said entities, said display being at a first magnification; animating, in response to selection of one of said images by a user, said display to a second magnification greater than said first magnification, wherein the selected image is substantially centred within the display; wherein the display is responsive to input from the user to display data associated with a corresponding entity; and wherein the display is responsive to selection of a region between images displayed at said second magnification to animate the display to said first magnification.

32. A system having components for executing any one of the above claims 1 to 31.
33. A computer-readable storage medium having stored thereon program instructions for executing any one of the above claims 1 to 31.
34. A user interface generated by any one of the above claims 1 to 31.

35. A system for generating a user interface to display data associated with entities, the system being adapted to generate a display of images associated with respective ones of said entities at a first magnification, said images being associated with controls responsive to input from a user to animate said display to centre and display the selected image at a second magnification greater than said first magnification; and the system being further adapted to animate, in response to selection by a user of a region between displayed images, said display to display said images at a magnification less than said second magnification.

36. The system of claim 35, wherein the entities represent files of audio data or groups of files of audio data.
AU2006350947A 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities Abandoned AU2006350947A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/AU2006/001698 WO2008058306A1 (en) 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities

Publications (1)

Publication Number Publication Date
AU2006350947A1 true AU2006350947A1 (en) 2008-05-22

Family

ID=39401223

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2006350947A Abandoned AU2006350947A1 (en) 2006-11-14 2006-11-14 Process for generating a user interface to display data associated with entities

Country Status (2)

Country Link
AU (1) AU2006350947A1 (en)
WO (1) WO2008058306A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8555165B2 (en) * 2003-05-08 2013-10-08 Hillcrest Laboratories, Inc. Methods and systems for generating a zoomable graphical user interface
US7737995B2 (en) * 2005-02-28 2010-06-15 Microsoft Corporation Graphical user interface system and process for navigating a set of images

Also Published As

Publication number Publication date
WO2008058306A1 (en) 2008-05-22


Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application