MXPA05007087A - Multi-planar three-dimensional user interface - Google Patents

Multi-planar three-dimensional user interface

Info

Publication number
MXPA05007087A
Authority
MX
Mexico
Prior art keywords
plane
menu
user
user interface
data processing
Prior art date
Application number
MXPA/A/2005/007087A
Other languages
Spanish (es)
Inventor
Christopher Alan Glein
Bojana Ostojic
Jeffrey C Fong
Kort Danner Sands
Mark R Gibson
Original Assignee
Microsoft Corporation
Priority date
Filing date
Publication date
Application filed by Microsoft Corporation
Publication of MXPA05007087A

Abstract

A computer user interface for navigating through media content, designed primarily for use with a remote control device. The user interface contains two planes: a front plane (1001) connected to a first virtual hinge axis (1005) and a back plane (1003) connected to a second virtual hinge axis (1007). Both planes begin at a starting plane (901); when the user selects a menu item, the planes animate about their hinges so that the front plane is displayed more prominently to the user. The selected menu item is shown on the front plane while the remaining items are shown on the back plane.

Description

MULTI-PLANAR THREE-DIMENSIONAL USER INTERFACE A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the files or records of the patent and trademark office, but otherwise reserves all copyright rights whatsoever.
FIELD OF THE INVENTION The invention relates generally to user interfaces for computer systems. More specifically, the invention provides a three-dimensional space and improved usability animations for a multi-plane user interface of a data processing device primarily intended for interaction by a user through a remote control or other remote input device.
BACKGROUND OF THE INVENTION As technology progresses, prices fall, and computing power increases (for example, memory, storage, processor speed, graphics, and the like), computers are more often used for special purposes instead of being used as general purpose machines. For example, computers have replaced video cassette recorders (VCRs) in the form of the personal video recorder (PVR), capable of recording and pausing live TV, something a VCR could never do. However, the fact that a computer can replace such a device does not inherently mean that a user interacts with it in the same way the user would interact with a traditional PC. Traditional user interfaces have perceived disadvantages that make them inconvenient as user interfaces for these special-purpose computers, and thus new user interfaces are needed for a user to use the new devices efficiently. In a conventional scenario, a user interacts with a home or laptop PC through a keyboard and a mouse to provide the primary input to the PC, and through a display screen and speakers to receive the primary output of the PC (other input and output devices, such as a camcorder, printer, or scanner, may be used, but such devices are generally used less frequently, for secondary input and output). The keyboard, mouse, display screen and speakers are all typically placed very close to the user, for example on a desk. The user interface of the PC's operating system is likely designed with the expectation that the user will interact with the operating system using the keyboard, mouse, display device and speakers located nearby. This traditional computer input/output configuration is colloquially referred to as a "2-Foot" user interface, since the user is primarily intended to interact with the PC from a distance of approximately two feet (0.6096 meters) from the input or output devices, for example sitting in a chair in front of the desk where the keyboard, mouse, display and speakers are located. However, the 2-Foot user interface does not provide the same level of usability when deployed on a device that is not intended to be used with a 2-Foot interface, but rather is intended to be used or controlled through an infrared remote control or some other remote control device. Devices that are primarily intended to be used with a remote control device have a user interface colloquially referred to as a 10-Foot user interface, since the user is primarily intended to interact with the device from a distance greater than two feet, and generally sits approximately 10 feet (3.048 meters) away from the output display screen attached to the device. Examples of devices that benefit from a 10-Foot user interface include PVRs and media center PCs. A media center PC is a data processing device with features that allow a user to watch and record TV, manage music and listen to the radio, play DVDs, organize photos and perform other media-related activities, primarily through interaction with a remote control device, for example at a distance similar to that at which a user watches television at home. As will be appreciated, a 2-Foot user interface does not work as well when implemented on a device intended to have a 10-Foot user interface, since the text and graphics are usually too small to be viewed effectively across the larger distance between the user and the display device.
Although first generation 10-Foot user interfaces have been developed for existing devices, these first generation 10-Foot user interfaces have inherent usability deficiencies that hinder the user experience with the devices on which they are deployed. Thus, it would be an advance in the art to provide an improved user interface for devices whose primary interaction by a user is through a remote control device.
SUMMARY OF THE INVENTION The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to the more detailed description provided below. To overcome the limitations in the prior art described above, and to overcome other limitations that will become apparent upon reading and understanding the present specification, the present invention is generally directed to a 10-Foot user interface for a computer system that is controlled through a remote control device such as an infrared remote control. The user interface presents a menu that lists multiple menu items navigable and selectable by a user of the computer system using the remote control device. When the user selects one of the menu items, the user interface divides the content between two different planes in the three-dimensional space in which the user interface is drawn, and places the selected menu item in a prominent foreground plane and the non-selected menu items in a less prominent background plane. According to another aspect of the invention, a computer on which the user interface is running can animate transitions between single-plane and multiple-plane views of the user interface. A user of the data processing system can control the data processing system with a remote control device, for example an infrared remote control. The computer has a processor configured, by running software stored in memory, to provide the user interface as a three-dimensional user interface drawn on a display device connected to the computer system. The software stored in memory may include a user interface software module that provides the user interface in the three-dimensional space, wherein the user interface includes multiple menus navigable by the user using the remote control device. The software may also include an animation module which, under the control of the user interface software module, provides a sequence of frames for an animation when the user selects an item from a menu. The animation sequence divides the menu items of the menu from which the user selected an item between the planes in the three-dimensional space and animates the first and second planes moving away from each other in the three-dimensional space. According to another aspect of the invention, a computer-readable medium stores computer-executable instructions for performing a method of providing a user interface. The method includes generating a three-dimensional graphics space in which to provide a user interface of a data processing device, and presenting, on a display device connected to the data processing device, a first list of a plurality of menu items selectable by a user who navigates the user interface using a remote control device. When the user selects one of the menu items, the user interface presents the selected menu item in a first plane in the three-dimensional graphics space, and presents the other menu items in a second plane in the three-dimensional graphics space.
The user interface then animates the two planes moving away from each other in the three-dimensional space, so that when the animation is complete the first plane has a more prominent presentation position than the second plane in the three-dimensional space in which the user interface is drawn.
BRIEF DESCRIPTION OF THE DRAWINGS A more complete understanding of the present invention and its advantages may be acquired by referring to the following description in consideration of the accompanying drawings, in which like reference numerals indicate like features, and wherein:
Figure 1 illustrates a general operating environment suitable for implementation of a media user interface according to an illustrative embodiment of the invention.
Figure 2 illustrates a user interface infrastructure that can be used to support a media user interface according to an illustrative embodiment of the invention.
Figure 3 illustrates a start menu of a media user interface according to an illustrative embodiment of the invention.
Figure 4 illustrates control zones of the start menu illustrated in Figure 3 according to an illustrative embodiment of the invention.
Figure 5 illustrates the start menu shown in Figure 3 when a different menu item is highlighted by the selection cursor according to an illustrative embodiment of the invention.
Figure 6 illustrates a first frame of a most recently used (MRU) list reveal animation according to an illustrative embodiment of the invention.
Figure 7 illustrates an intermediate frame of an MRU list reveal animation according to an illustrative embodiment of the invention.
Figure 8 illustrates a final frame of an MRU list reveal animation according to an illustrative embodiment of the invention.
Figure 9a illustrates a top perspective view of a single plane menu according to an illustrative embodiment of the invention.
Figure 9b illustrates a top plan view of the single plane menu illustrated in Figure 9a according to an illustrative embodiment of the invention.
Figure 10a illustrates a top perspective view of a double-hinge double-plane menu according to an illustrative embodiment of the invention.
Figure 10b illustrates a top plan view of the double-hinge double-plane menu illustrated in Figure 10a according to an illustrative embodiment of the invention.
Figure 11 illustrates a first intermediate frame of an MRU list tilt animation according to an illustrative embodiment of the invention.
Figure 12 illustrates a second intermediate frame of an MRU list tilt animation according to an illustrative embodiment of the invention.
Figure 13 illustrates a final frame of an MRU list tilt animation according to an illustrative embodiment of the invention.
Figure 14a illustrates a top perspective view of a double-wall double-plane menu according to an embodiment of the invention.
Figure 14b illustrates a top plan view of the double-wall double-plane menu illustrated in Figure 14a according to an illustrative embodiment of the invention.
Figure 15 illustrates a first intermediate frame of a power menu reveal animation according to an illustrative embodiment of the invention.
Figure 16 illustrates a second intermediate frame of a power menu reveal animation according to an illustrative embodiment of the invention.
Figure 17 illustrates a third intermediate frame of a power menu reveal animation according to an illustrative embodiment of the invention.
Figure 18 illustrates a final frame of a power menu reveal animation according to an illustrative embodiment of the invention.
Figure 19 illustrates a My Music menu according to an illustrative embodiment of the invention.
Figure 20 illustrates a first intermediate frame of a context menu reveal animation according to an illustrative embodiment of the invention.
Figure 21 illustrates a second intermediate frame of a context menu reveal animation according to an illustrative embodiment of the invention.
Figure 22 illustrates a final frame of a context menu reveal animation according to an illustrative embodiment of the invention.
Figure 23 illustrates a top plan view of a single-hinge double-plane menu according to an illustrative embodiment of the invention.
Figure 24 illustrates a flow diagram of a method for performing alpha fading according to an illustrative embodiment of the invention.
Figure 25 illustrates a folder navigation menu according to an illustrative embodiment of the invention.
Figure 26 illustrates a volume window according to an illustrative embodiment of the invention.
Figure 27 illustrates a second view of the volume window according to an illustrative embodiment of the invention.
Figure 28 illustrates a view of the volume window when the volume is muted according to an illustrative embodiment of the invention.
Figure 29 illustrates a top plan view of an alternative multi-plane media user interface according to an embodiment of the invention.
Figure 30 illustrates a top plan view of an alternative multi-plane media user interface according to an illustrative embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration various embodiments in which the invention may be practiced. It should be understood that other embodiments may be used and structural and functional modifications may be made without departing from the scope of the present invention. Figure 1 illustrates an example of a suitable computing system environment 100 in which the invention may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. The computing environment 100 should also not be interpreted as having any dependency or requirement relating to any one component or combination of components illustrated in the illustrative operating environment 100. The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, cable TV set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices. With reference to Figure 1, an illustrative system for implementing the invention includes a general purpose computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components, including the system memory, to the processing unit 120. The system bus 121 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus, also known as the Mezzanine bus. Computer 110 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by computer 110 and includes both volatile and non-volatile media, removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by the computer 110. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
The system memory 130 includes computer storage media in the form of volatile and/or non-volatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within the computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by the processing unit 120. By way of example, and not limitation, Figure 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137. The computer 110 may also include other removable/non-removable, volatile/non-volatile computer storage media. By way of example only, Figure 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, non-volatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, non-volatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, non-volatile optical disk 156 such as a CD-ROM or other optical media. Other removable/non-removable, volatile/non-volatile computer storage media that can be used in the illustrative operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile discs, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and the magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150. The drives and their associated computer storage media discussed above and illustrated in Figure 1 provide storage of computer-readable instructions, data structures, program modules and other data for the computer 110. In Figure 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and a pointing device 161, commonly referred to as a mouse, trackball or touchpad. Other input devices (not shown) may include a microphone, joystick, game pad, antenna, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 191 or other type of display device (e.g., a TV) is also connected to the system bus 121 via an interface, such as a video interface 190.
In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 190. In some aspects, a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input. Although a direct connection between the pen digitizer 165 and the user input interface 160 is shown, in practice the pen digitizer 165 may be coupled to the processing unit 110 directly, via a parallel port or other interface and the system bus 130, by any technique, including wirelessly. The pen 166 may also have a camera associated with it and a transmitter for wirelessly transmitting image information captured by the camera to an interface interacting with bus 130. Further, the pen may have other sensing systems, in addition to or in place of the camera, for determining strokes of electronic ink, including accelerometers, magnetometers, and gyroscopes. Computer 110 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in Figure 1. The logical connections depicted in Figure 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. Further, the system may include wired and/or wireless capabilities. For example, the network interface 170 may include Bluetooth, SWLan, and/or IEEE 802.11 class of combination abilities. It is appreciated that other wireless communication protocols may be used in conjunction with these protocols or in place of these protocols. When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, Figure 1 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are illustrative and that other techniques for establishing a communications link between the computers can be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server. Any of various conventional web browsers can be used to display and manipulate data on web pages.
One or more aspects of the invention may be embodied in computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device. The computer-executable instructions may be stored on a computer-readable medium such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. As will be appreciated by one of skill in the art, the functionality of the program modules may be combined or distributed in various ways. In addition, the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), and the like.
Illustrative Embodiments of the Invention In addition to the above, the computer 110 may further be configured with a television tuner card, and the computer 110 may be controlled through a remote control device 163, such as an infrared remote control. The remote control device 163 may be configured with a plurality of inputs, for example buttons, keys, a touchpad, a thumb pointing device, scroll controls, etc., each configured to send a unique command to the computer 110 via an infrared control signal. The remote control 163 may be configured to provide navigation buttons (e.g., left, right, up, down, forward, back, etc.), selection buttons (e.g., primary select, secondary select, enter, exit, cancel, etc.), alphanumeric input buttons (e.g., 1, 2, ..., 9, 0, A, B, C, etc.), application buttons to launch certain applications or to navigate to a certain type of data (e.g., Internet Explorer, Music, TV, Photos, etc.), as well as conventional remote control inputs (e.g., channel up, channel down, volume up, volume down, etc.).
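As a rough illustration of the kind of command set such a remote control might expose to the controlling application, the following C++ sketch (not part of the original disclosure; all names and codes are assumed for illustration) maps decoded infrared button codes to the categories of commands listed above.

    #include <cstdint>
    #include <unordered_map>

    // Hypothetical command set; the description above only lists button categories.
    enum class RemoteCommand {
        NavLeft, NavRight, NavUp, NavDown,
        Select, Back, Cancel,
        ChannelUp, ChannelDown, VolumeUp, VolumeDown,
        LaunchMusic, LaunchTV, LaunchPhotos,
    };

    // Each physical button sends a unique IR code; the application translates
    // that code into a command it can route to the focused control.
    class RemoteInputDecoder {
    public:
        void Bind(std::uint32_t irCode, RemoteCommand cmd) { map_[irCode] = cmd; }

        bool Decode(std::uint32_t irCode, RemoteCommand& out) const {
            auto it = map_.find(irCode);
            if (it == map_.end()) return false;   // unknown button: ignore
            out = it->second;
            return true;
        }
    private:
        std::unordered_map<std::uint32_t, RemoteCommand> map_;
    };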
The computer 110 can be configured with a media mode of operation in which a user interacts with the computer 110 using the remote control device 163 and a so-called "10-Foot" user interface presented on the TV 191. The media operation mode may allow a user to watch or record television, watch a DVD, listen to music (from a digital music file or from a radio or optical disc), review and/or edit digital photographs, and perform other media-related operations. Because a user of the media operation mode will usually be sitting farther away from the display than a user would sit to interact with the computer 110 in its normal mode of operation, the user interface of the media operation mode must provide features that sufficiently convey acknowledgment of the remote user's input back to the user, and convey navigation of the user interface to the remotely located user. That is, the presentation of the user interface should not only be easily recognizable when a user is sitting directly in front of the computer monitor (for example, at a distance of about two feet, or 0.6096 meters, as with a conventional 2-Foot user interface), but should also be clearly recognizable and usable when the user is remotely controlling the user interface (for example, from approximately 10 feet, or approximately 3.048 meters, away) using the remote control device 163. For example, a 10-Foot user interface typically has less information on the screen at a time than a 2-Foot user interface, because of the distance over which the user interacts with the interface. This means that the information on the screen must be larger so that the user can see the user interface from farther away. Because information presented in a 10-Foot user interface is typically larger than the same information presented in a 2-Foot user interface, less information fits in the same amount of display screen real estate. The 10-Foot user interface of the media operation mode is referred to herein as the media user interface. According to one aspect of the invention, in order to convey a sense of depth to a user of the media user interface, the media user interface can be constructed in a three-dimensional space. That is, although the media user interface may be presented on a two-dimensional display device, such as a monitor or TV, the media user interface can be constructed in a three-dimensional graphics space having X, Y, and Z dimensions, as well as an alpha channel, a, to provide transparency for certain features of the media user interface (described below). The use of the Z dimension allows the media user interface to have more information on the screen while still providing the information at a size large enough to be visible from farther away than with a traditional 2-Foot user interface, since the information can be presented with varying values of Z as well as with varying values of X and Y. According to another aspect of the invention, in order to provide fluidity between the various presentations of the media user interface based on user inputs, the media user interface can be animated. Because the user of the media user interface will typically be located farther away from the screen than with a 2-Foot user interface, it is generally more difficult for the user to see smaller details in the user interface.
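Before turning to animation, the three-dimensional layout described above can be made concrete with a minimal C++ sketch (not taken from the disclosure; names, constants, and the projection formula are assumptions): an on-screen item carries X, Y and Z coordinates plus an alpha value, and a simple perspective divide maps it onto the two-dimensional display so that content pushed back in Z appears smaller and content pulled forward appears larger.

    #include <cstdio>

    // Illustrative only: a UI item positioned in a 3-D graphics space
    // with an alpha channel for transparency.
    struct Item3D {
        float x, y, z;   // position in the UI's 3-D space (negative z = toward the viewer)
        float alpha;     // 0.0 = fully transparent, 1.0 = fully opaque
    };

    struct Projected { float screenX, screenY, scale, alpha; };

    // Simple perspective projection: content pushed back in Z is drawn
    // smaller (less prominent), content pulled forward is drawn larger.
    Projected Project(const Item3D& item, float viewerDistance) {
        float scale = viewerDistance / (viewerDistance + item.z);
        return { item.x * scale, item.y * scale, scale, item.alpha };
    }

    int main() {
        Item3D front{ 100.0f, 50.0f, -200.0f, 1.0f };  // pulled toward the viewer
        Item3D back { 100.0f, 50.0f,  300.0f, 0.6f };  // pushed away, semi-transparent
        Projected f = Project(front, 1000.0f);
        Projected b = Project(back, 1000.0f);
        std::printf("front scale %.2f, back scale %.2f\n", f.scale, b.scale);
        return 0;
    }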
Instead of instantly switching from one menu to the next, or from one menu item selection to the next, which a user might miss unless paying careful attention, animation can be used to illustrate to the user the result of his or her input, or the change from one menu to another, making it easier for the user to conceptually follow his or her navigation through the media user interface. In addition, animation may be used to provide feedback that the user has performed some action on the user interface, such as (a) moving the focus from one menu selection to another, or (b) selecting an item from a menu. In order to provide three-dimensionality and animations, the media user interface can be developed using any software package that provides three-dimensionality and graphics acceleration, such as the DirectX® 9.0 software development kit with the DirectX 9.0b runtime, available from Microsoft Corporation of Redmond, Washington. The underlying software architecture is secondary to the services provided by the media user interface. Microsoft's DirectX® is a suite of multimedia application programming interfaces (APIs) for Microsoft Windows® operating systems, and provides a standard development platform for Windows-based PCs by enabling software developers to access specialized hardware features without having to write hardware-specific code. The APIs act as a bridge between the hardware and the software. DirectX® APIs give multimedia applications access to the advanced features of high-performance hardware such as three-dimensional (3-D) graphics acceleration chips and sound cards. The APIs also control low-level functions, including two-dimensional (2-D) graphics acceleration; support for input devices such as joysticks, keyboards, and mice; and control of sound mixing and sound output. Alternatively, versions of DirectX® earlier than version 9.0 may also be used. Although the specific software architecture will vary from system to system, an illustrative media user interface infrastructure 200 will now be described with reference to Figure 2. The reference to a specific media user interface infrastructure is not intended to limit the invention to the use of a specific infrastructure such as the infrastructure 200, or to a specific software package such as DirectX®. The illustrative infrastructure 200 of Figure 2 is provided merely as an example of how a media user interface infrastructure can be designed. The infrastructure used is a secondary consideration compared to the operation and actual characteristics of the resulting media user interface, described below with reference to Figure 3. Infrastructure 200 may include a top-level media user interface application 205, a controls interface 210, a UI framework 215, component model services 220, and a presenter 260. The UI application 205 is the top-level control application that manages the operation of the media user interface by calling control routines and the UI framework based on user interaction with the media user interface. The operation of application 205 is discussed further below. The remaining infrastructure will now be described, starting from the bottom. The presenter 260 maps the final resulting media user interface to video memory. The presenter may run on its own thread, and receives information from the UI framework 215 regarding what is to be drawn. The unit of drawing for the presenter may be referred to as a Visual.
Visuals can be arranged in a tree that describes painting order and containment. Visuals may also contain the content to be drawn, such as an image, text, a color, etc. There may be a Visual object in the UI framework 215 corresponding to each Visual of the presenter, so that the UI framework 215 can call the presenter 260 for drawing. The presenter 260 may include or communicate with presentation modules 261, 263, 265, depending on the graphics development application used for the media user interface: DirectX® 9, GDI, or DirectX® 7, respectively. The component model services 220 may include four primary service modules: visual services 221, common services 231, UI-framework-specific services (UIFW) 241, and messaging and state services 251. The messaging and state services are managed by the dispatcher 253 and the UI session 255. Similar to a standard Windows® message queue, the dispatcher 253 handles all requests for processing time from components in the shell for the media operation mode that is the platform of the whole 10-Foot user interface experience. UI infrastructure components run as part of the shell process. The dispatcher 253 may, however, be extensible to allow the creation and expression of new priority rules as needed, for example to allow a new rule that runs a particular task after all paint tasks but before any timer task. The UI session 255 is a state container that manages all the data related to a group of objects. The UI session 255 manages data, while the dispatcher 253 manages timing. Other services of the infrastructure 200, for example the presenter 260, layout 223, drawing 227, etc., can store their data as sub-objects of the session 255. The session 255 can create a port to communicate with each service, so that each service can refer to its portion of the data to manage its own tasks.
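The kind of extensible, priority-ordered scheduling described for the dispatcher can be sketched as follows in C++ (purely illustrative, not the patent's implementation): each queued task carries a priority class, paint tasks drain before timer tasks, and a new class can be slotted between existing ones.

    #include <functional>
    #include <queue>
    #include <vector>

    // Illustrative priority classes; lower value runs first. A new class
    // (e.g. AfterPaint) can be inserted between Paint and Timer, as in the
    // example rule described above.
    enum class TaskClass { Input = 0, Paint = 1, AfterPaint = 2, Timer = 3, Idle = 4 };

    struct Task {
        TaskClass cls;
        std::function<void()> work;
    };

    struct TaskOrder {
        bool operator()(const Task& a, const Task& b) const {
            return static_cast<int>(a.cls) > static_cast<int>(b.cls); // lowest class on top
        }
    };

    class Dispatcher {
    public:
        void Post(TaskClass cls, std::function<void()> work) {
            queue_.push(Task{ cls, std::move(work) });
        }
        // Runs all currently queued tasks in priority order.
        void Drain() {
            while (!queue_.empty()) {
                Task t = queue_.top();
                queue_.pop();
                t.work();
            }
        }
    private:
        std::priority_queue<Task, std::vector<Task>, TaskOrder> queue_;
    };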
The remote control 257 is responsible for presenting the user interface on a remote device at high fidelity, if desired. The remote control is optional, and is not required to present the user interface on a directly or indirectly connected monitor or television. The visual services 221 may include layout services 223, video memory management 225, drawing services 227, and animation services 229. Layout services 223 position the Visuals before they are presented by the UI framework 215 and the presenter 260. Video memory management 225 handles data and instructions that go to the video card, including, for example, surface management, vertex buffers, and pixel shaders. Drawing services 227 handle any non-animated visual component that is to be drawn in the user interface, including text. Animation services 229 include a portion used by the component model 220 and a portion used by the presenter 260. The component model portion builds an animation template that specifies an object, a destination, timing information, an animation method, stop points, and any other necessary animation data. The template may include keyframes that describe a value for some point in time and the manner in which to interpolate between that keyframe and the next defined keyframe. The presenter then plays the template, at which time the animation services build an active animation, which the presenter 260 runs per frame to move the Visuals on the screen. The common non-visual services 231 may include input services 233 and directional navigation services 235. Input services 233 manage a state machine that determines how to process input (remote control navigation, button down/up, mouse movement, etc.) for a particular view of the media user interface. Directional navigation services 235 identify a same-page movement destination based on the center point of the current screen selection, other on-screen targets, and the direction indicated by the user. UIFW services 241 may include data services 243, parsing services 245, and page navigation services 247. Data services 243 allow data sources to be bound to objects according to predetermined binding rules, and allow variables to reference data that will be defined as needed. For example, data services 243 can be used to associate the display name property of a photo item with the Content property of a thumbnail button's text view item, so that when a property of one of the objects is set or changed, the related property of the other object is set or changed as well. A relationship does not need to be one-to-one. When a value in a bound object changes, the binding is marked as "dirty" and, at a later time, the dispatcher 253 will call a procedure to re-evaluate dirty bindings, causing data services 243 to propagate new values to each dirty binding destination. Parsing services 245 parse XML descriptions of the media user interface. That is, XML may be used to create visual aspects of the media user interface, in addition to visual aspects of the media user interface authored by hand in C, C++, and/or C#. Page navigation services 247 identify navigations between pages based on a selected content item. The UI framework 215 provides an abstraction layer between the application 205 and the component model 220. The controls user interface 210 manages the operation of items presented on the display screen.
That is, simply drawing a button on a screen does not inherently allow the user to select that button and have an action result. The controls user interface 210 manages the actual operation of items such as buttons, radio lists, spinner controls, and the like, as well as views and view items. A Control is something in the media user interface with which the user can interact; it manages input, focus, and navigation. A View is the owner of the presentation of a Control. The View requests that a visual aspect of the control be drawn on the screen. That is, the View causes a visual representation of the control to be presented as part of the media user interface. A View can manage Visuals by creating a tree of view items. A view item stores content to draw (that is, a Visual), as well as logic for how that content is used (for example, as a control or as part of an animation). The above infrastructure provides a managed UI description layer on top of a presentation system whose basic unit is the Visual, as discussed above. The Visuals can be represented as tree nodes that establish containment for transformations. The managed layer (the component model) creates a higher-level programming interface for the presentation system. The infrastructure can use objects to describe images, animations, transformations, and the like, using XML and/or source code written in a language such as C, C++, or C#. Those skilled in the art will appreciate that the underlying UI infrastructure is secondary to the services it provides. Utilizing the aforementioned infrastructure and the services the infrastructure provides, the UI application 205 (i.e., the managed description layer) provides the routines and definitions that make up, define, and control the operation of the media user interface. An illustrative media user interface provided by the UI application 205 will now be described with reference to Figures 3-31. The home page 300 of the media user interface may include a plurality of high-level menu selections 301, a list (of text, icons, graphics, etc.) of most recently used (MRU) items 303, a power menu icon 305, and a clock. The high-level menu selections may include Online Spotlight, My Pictures, My Videos, My TV, My Music, My Radio, More Programs, My Tasks, and Settings. Other high-level selections may be included as well or instead. The MRU list 303 may, at any given time, correspond to the currently highlighted menu selection item 307, as indicated by the selection cursor 309. That is, the MRU list 303 can include up to three media items 303a, 303b, 303c most recently selected by the user that correspond to the currently highlighted menu selection item 307. For example, when the My TV menu item is highlighted, the MRU items may include media selections for DVD, TV, or movies; when the My Music menu item is highlighted, the MRU list may include the three most recent songs played by the user; when the My Radio menu item is highlighted, the MRU list may include the radio stations most recently listened to by the user; and so forth. As the user moves the selection cursor 309 over a new menu item, the UI application refreshes the MRU list 303 to correspond to the newly highlighted menu item. If the user has never selected three media items corresponding to the current item 307, the UI application 205 may alternatively cause the media user interface to present default items or actions, or no items at all in the MRU list 303.
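The MRU behavior described above, where the list always tracks the highlighted menu item and shows at most three recently used items (or none, for a new user), can be roughly sketched as follows in C++; the container and names are assumptions for illustration only, not the application's actual data structures.

    #include <map>
    #include <string>
    #include <vector>

    // Up to three most-recently-used items are kept per top-level menu item.
    class MruLists {
    public:
        void RecordUse(const std::string& menuItem, const std::string& mediaItem) {
            auto& list = lists_[menuItem];
            list.insert(list.begin(), mediaItem);     // newest first
            if (list.size() > 3) list.resize(3);      // keep at most three
        }

        // Called when the selection cursor highlights a new menu item;
        // returns the items to display (possibly empty, as for a new user).
        const std::vector<std::string>& ForMenuItem(const std::string& menuItem) const {
            static const std::vector<std::string> empty;
            auto it = lists_.find(menuItem);
            return it == lists_.end() ? empty : it->second;
        }
    private:
        std::map<std::string, std::vector<std::string>> lists_;
    };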
According to one aspect of the invention, the MRU list contains graphical icons, or text, or a combination of the two. Icons are preferably used, with or without text, since visual imagery is more easily perceived and recognized from a distance (as is typical in use with a 10-Foot user interface) than is text. For text to be perceived and recognized at the same distance, the text would necessarily have to be large and take up more display real estate than is needed for graphics or icons. In this way, a combination of text and graphics adapts the media user interface to be used as a 10-Foot user interface as well as a 2-Foot user interface.
The power icon 305 launches a power sub-menu, described below with respect to Figures 15-18. Figure 4 illustrates the zones 401-417 of the media user interface 300 selectable by a user using the remote control device 163. Using the up, down, left, and right navigation buttons on the remote control device 163, the user can navigate to each zone when selectable content is presented in that zone. Zone 401 includes the user-selectable menu items; zone 403 includes a first user-selectable MRU item; zone 405 includes a second user-selectable MRU item; zone 407 includes a third user-selectable MRU item; zone 409 includes action buttons corresponding to the currently highlighted selection item 307; zone 411 includes system controls, for example the power menu icon 305; and zone 413 may include a selectable content item indicating an action that is currently occurring, such as a song currently being played (see, for example, Figure 5). Each zone may or may not include selectable items, depending on the items actually presented and the actions actually occurring (such as a song or a radio station being played). When the MRU list contains one or more items, the user can navigate to and select the MRU item presented in zone 403, 405, or 407, depending on whether one, two, or three MRU items are available, respectively. With further reference to Figure 5, as a user scrolls through the menu items 301, the menu items animatedly move up or down, while the selection cursor 309 remains fixed in the same position. When the user presses the down navigation button on the remote control 163, the content shifts up; when the user presses the up navigation button on the remote control 163, the content shifts down. For example, to navigate from the media user interface shown in Figure 3 to the media user interface shown in Figure 5, the user only needs to press the down navigation button on the remote control device 163. When the user presses the down navigation button, the media user interface animates through a series of intermediate frames from the view shown in Figure 3 to the view shown in Figure 5, shifting the content as appropriate. While the media user interface is at rest, i.e., the user is not providing any input, the selection cursor 309 may be accented, for example appearing to pulse or throb, to indicate to the user the currently highlighted menu item, as well as to indicate that the computer has not frozen (i.e., crashed). This is especially useful in a 10-Foot user interface since, because of the distance from which a user interacts with the interface, the user cannot easily track the cursor unless the cursor is sufficiently large or prominent for the user to follow. When the user selects the highlighted item, the selection cursor 309 may flash or provide some other visual indication that the user's input has been received. An audio signal may also or alternatively be used to provide selection feedback to the user. As discussed above, when the user navigates from one menu item in the list 301 to another, the MRU list is also refreshed so that it contains the MRU items that correspond to the menu item to which the user has navigated. According to one aspect of the invention, the MRU list can be refreshed using a variety of animations.
In one embodiment, the MRU list 303 can be animated together with the menu list 301, except that the MRU list items corresponding to the menu item from which the user is navigating away fade out of view as they slide away, and the MRU list items corresponding to the menu item to which the user is navigating fade into view as they move into their final positions in the media user interface 300. Whereas the menu items remain visible as they move up or down past the selection cursor 309, the items in the MRU list do not. For example, suppose a user navigates from the My TV menu item, as shown in Figure 3, to the My Music menu item, as shown in Figure 5. In order to navigate from My TV to My Music, the user presses the down navigation key on the remote control or keyboard to send a down navigation command to the UI application. When the UI application receives the down navigation command in the state shown in Figure 3, the UI application 205 animates the menu items sliding so as to place the My Music menu item within the selection cursor 309, also fading the My Videos menu item partially out of view, and fading the More Programs menu item completely out of view as part of the animation. Also as part of the animation, the Online Spotlight menu item disappears completely from view, and the Settings menu item comes partially into view. Along with the animated sliding of the menu items, the My TV MRU list items move up with the My TV menu item. However, the My TV MRU list items fade from view from top to bottom as they move up from their original positions, thus graphically simulating the MRU list items passing under a cover whose opacity gradually increases from fully transparent to fully opaque. Similarly, as the My Music MRU list items become visible from below the My TV MRU list items, they fade into view as if emerging from under a cover whose opacity gradually decreases from fully opaque to fully transparent. The same effect can be used to bring the focused MRU list items into view, as shown in Figure 13 (described below). According to another embodiment, with reference to Figures 6-8, the MRU icons can animatedly slide, sweep, or fly into view, appearing graphically to originate from behind the list of menu selection items 301, moving from left to right. The slide into view provides a visual cue to the user that the change in menu item focus has caused a change in the secondary content based on the focused menu item. Figures 6-8 illustrate a start frame 601, an intermediate frame 701, and a final frame 801, respectively, of an MRU list reveal animation that may occur to present the MRU list items associated with the newly highlighted My Music item. Figure 6 illustrates a first frame 601 of the animation after a user, starting from the media user interface shown in Figure 3, presses the down navigation button on the remote control device 163 to highlight My Music. As shown in Figure 6, the MRU list of items that was previously presented for the previously highlighted My TV menu item has disappeared. Figure 7 illustrates an intermediate frame 701 of the animation as the MRU list 303 items are swept to the right, appearing to originate from behind the list 301 items. As shown in Figure 7, the MRU list items can have an alpha value during the animation, so that the items appear at least partially transparent. Alternatively, no alpha value is used.
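As a minimal sketch of the "cover" effect described above, assuming (for illustration only, not from the disclosure) that the fade region is defined by a start and end Y coordinate, an item's alpha can be ramped linearly from opaque to transparent as it slides through the region:

    #include <algorithm>

    // Returns the alpha (1.0 = opaque, 0.0 = transparent) for an MRU item at
    // vertical position y, as it slides under a virtual cover whose opacity
    // increases linearly between coverStartY and coverEndY. Illustrative only.
    float CoverAlpha(float y, float coverStartY, float coverEndY) {
        if (y <= coverStartY) return 1.0f;                  // not yet under the cover
        if (y >= coverEndY)   return 0.0f;                  // fully hidden
        float t = (y - coverStartY) / (coverEndY - coverStartY);
        return std::clamp(1.0f - t, 0.0f, 1.0f);            // linear fade
    }

Running the same calculation in reverse (increasing alpha as an item emerges from under the cover) would give the fade-in of the incoming MRU items.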
Figure 8 illustrates a final frame 801 of the MRU list animation, illustrating the final position and appearance (i.e., no transparency) of the MRU list 303 items. The appearance animation of the MRU list items draws the user's attention to the MRU list, so that the user can clearly see that the MRU items have changed as a result of the newly highlighted menu item which, in this example, is My Music. Those skilled in the art will appreciate that although three animation frames of the MRU list reveal animation are provided here, there are other animation frames in between those provided in Figures 6-8. Figures 6-8 provide examples of keyframes that can be used. The keyframes provide control points between which the animation can be interpolated to transition from one keyframe to the next. By using interpolation, the animation can be played back at any frame rate and remain correct (as opposed to frame-based animations). Alternatively, more or fewer keyframes can be used. According to another aspect of the invention, with reference to Figures 9-12, in order to draw the user's attention to the fact that the user has navigated away from the menu list 301, the media user interface can provide three-dimensional feedback to the user when the user moves the focus from the menu list 301 to an item in the MRU list 303. In an illustrative embodiment of the invention, the media user interface provides a graphically simulated double-hinge effect in the three-dimensional space as the user moves the navigation cursor to an MRU list item. Figures 9a and 9b illustrate a top perspective view and a top plan view, respectively, of the home page 300 of the media user interface while the user is scrolling through the menu items 301 with the corresponding MRU list 303. In Figures 9a and 9b, the user has not yet navigated the selection cursor to an MRU list item. Figures 9a and 9b illustrate that all the content presented on the home page 300 lies in a single X, Y plane 901. In other words, all the content on the home page 300 has the same Z value. Figure 3 illustrates the home page 300 corresponding to Figures 9a and 9b, before the user selects an MRU list item 303a, 303b, or 303c. Figures 10a and 10b illustrate a top perspective view and a top plan view, respectively, of the media user interface in a double-hinge MRU list item selection view 1301 (see corresponding Figure 13). Figures 10a and 10b illustrate that the content presented in the MRU list item selection view is divided between two planes 1001, 1003, extending from "hinge" axes 1005, 1007, respectively. The front plane 1001 may include the selected menu item 307 and its corresponding MRU list items 303a, 303b, 303c. The back plane 1003 may include the menu list 301 items other than the selected menu item 307. Because each plane 1001, 1003 is graphically hinged on a virtual hinge axis 1005, 1007, respectively, the Z values of the content in each respective plane gradually change as the content gets farther from the hinge axis of the plane. The content in the back plane 1003 may be visible behind the content in the front plane 1001, for example by using alpha shading of the front plane. Figure 13 illustrates an MRU list item selection view 1301 of the media user interface according to this illustrative embodiment.
By moving the selected content to the front plane 1001 and the non-selected content to the back plane 1003, the media user interface conceptually conveys to the user which menu item was selected, along with its corresponding MRU list 303, and which menu items were not selected but remain available if the user chooses to navigate to them. As shown in Figure 13, media user interface content that is neither in the menu selection list 301 nor in the MRU list 303 can be presented in a third plane located at the position of the starting plane 901. By keeping this secondary content in the original plane 901, the user of the media user interface can easily navigate to the content located in the plane 901, such as the power menu icon 305. Figures 8 and 11-13, sequentially, illustrate frames of an MRU list tilt animation as the user moves the navigation cursor from the My Music menu item to the first MRU item 303a corresponding to the My Music menu item. During the animation, the two planes graphically pivot, or tilt, forward and backward, as applicable, about their hinge axes, and the MRU list items sweep out, appearing to originate from the selected menu item 307. Figure 8 illustrates the single-plane start menu while the navigation cursor 309 is on the My Music menu item. Figure 11 illustrates a first intermediate frame 1101 as the media user interface divides the content between the two planes and begins to virtually pivot the front plane 1001 forward in the Z dimension about the hinge axis 1005, and begins to pivot the back plane 1003 backward in the Z dimension about the hinge axis 1007. Figure 11 also illustrates the MRU list 303 items beginning to sweep out, appearing to originate from behind the selected menu item 307. Figure 12 illustrates a second intermediate frame 1201 of the animation, illustrating the planes approaching their respective final positions. In Figure 12, the selected menu item "My Music" continues graphically moving forward and is beginning to take on a more prominent appearance compared to the menu items that were not selected. A caption, "recent music", corresponding to the selected menu item, which began to appear in frame 1101 illustrated in Figure 11, is becoming more clearly visible. Also in Figure 12, the MRU list items continue to sweep outward, approaching their final positions. Figure 13 illustrates a final frame 1301 of the animation, with the menu items and MRU list items in their final positions, selectable by the user as desired. The above illustration is provided as an illustrative use of the double-hinged planes to provide clear conceptual and visual feedback to a user of a 10-Foot user interface. The double-hinged planes can be used for any navigational feature of the media user interface, and should not be construed as limited to the selection of an MRU list item.
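The geometric effect of hinging each plane about a virtual vertical axis, with the Z value of content growing with its distance from the axis, can be sketched as follows in C++ (illustrative only; the disclosure does not specify the actual math): rotating a point about a hinge at x = hingeX by an angle gives it a Z offset proportional to its distance from the hinge.

    #include <cmath>
    #include <cstdio>

    struct Point3 { float x, y, z; };

    // Rotate a point (originally at z = 0) about a vertical hinge axis located
    // at x = hingeX, by angleRadians. With the convention that negative z is
    // toward the viewer, a positive angle tilts content toward the viewer
    // (front plane) and a negative angle tilts it away (back plane).
    Point3 RotateAboutHinge(Point3 p, float hingeX, float angleRadians) {
        float dx = p.x - hingeX;                       // distance from the hinge axis
        Point3 r;
        r.x = hingeX + dx * std::cos(angleRadians);
        r.y = p.y;
        r.z = p.z - dx * std::sin(angleRadians);       // Z offset grows with distance from hinge
        return r;
    }

    int main() {
        // The farther a menu item is from the hinge, the larger its Z offset,
        // which is what makes the tilted plane read as three-dimensional.
        Point3 nearHinge    = RotateAboutHinge({  10.0f, 0.0f, 0.0f }, 0.0f, 0.3f);
        Point3 farFromHinge = RotateAboutHinge({ 400.0f, 0.0f, 0.0f }, 0.0f, 0.3f);
        std::printf("near z = %.1f, far z = %.1f\n", nearHinge.z, farFromHinge.z);
        return 0;
    }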
With further reference to Figures 9 and 14-22, according to another aspect of the invention, the media user interface can again divide the content into two planes, a more prominent front plane and a less prominent back plane, but rather than hinging each plane as shown in Figures 9-13, the media user interface graphically pushes the back plane straight back from its original position and pulls the front plane straight forward from its original position. The resulting graphic effect is a double wall of content, where the selected or highlighted content is brought forward and emphasized, and the unselected content is taken back into the three-dimensional space, providing a clear conceptual and visual image for the user of the selected and unselected content, or of a new menu taking prominence over a previously displayed menu. Figure 9a illustrates a virtual top perspective view, and Figure 9b illustrates a top plan view, of the start page 300 of the media user interface as the user is scrolling through the menu items 301 with the corresponding MRU list 303. In Figures 9a and 9b, the user has not yet selected a new menu item to initiate the plane splitting. Figures 9a and 9b illustrate that all the content presented on the home page 300 is in the same plane 901. In other words, all the content on the home page 300 has the same Z dimension. Figure 8 illustrates the start page 300 corresponding to Figures 9a and 9b, before the user highlights or selects a menu item or other item. Figures 14a and 14b illustrate a top perspective view and a top plan view, respectively, of the media user interface dividing the content between two planes, a front plane 1401 and a rear plane 1403, wherein the front plane 1401 is graphically pulled straight forward and the back plane 1403 is graphically pushed straight back. All content in the front plane 1401 has substantially the same Z value, and all content in the back plane 1403 has substantially the same Z value, although different from the Z value of the content in the front plane 1401. The front plane 1401 may include a new menu (for example, a sub-menu) corresponding to a content item selected by the user from the previous plane 901. The back plane 1403 may include the previous menu from which the user selected the content item that caused the new menu to appear. It will be appreciated that the amount by which the front plane is pulled forward in Z space and the amount by which the rear plane is pushed back in Z space are secondary to the fact that the simulated planes move substantially straight forward and backward, respectively, in relation to each other. Alternatively, the back plane 1403 can be moved backward while the front plane 1401 remains fixed, and new content, for example a power menu, can open in the fixed position (where the start plane 901 was originally located). In yet another embodiment, the back plane 1403 can remain fixed while the front plane 1401 moves forward and presents the new content as it moves forward, e.g., a context menu. The graphically simulated appearance of plane 1401 moving forward, of plane 1403 moving backward, or both, can be achieved by enlarging the content in plane 1401 and/or reducing the content in plane 1403, by rendering the content in plane 1401 in focus while the content in plane 1403 is somewhat out of focus, and/or by making the content in plane 1401 lighter or brighter and the content in plane 1403 darker in appearance.
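A minimal sketch of this straight push/pull split is given below for illustration only; the particular Z offsets, scale factors, and brightness values are placeholder assumptions rather than values taken from the disclosure.

```cpp
#include <cstdio>

// Both planes start at the shared starting plane (z = 0); the selected content
// is pulled straight forward while the unselected content is pushed straight
// back, with scale and brightness reinforcing the depth cue.
struct PlaneState {
    float z;           // straight Z translation, the same for all content in the plane
    float scale;       // enlarging/reducing simulates forward/backward motion
    float brightness;  // 1.0 = normal; lower values darken the pushed-back plane
};

// Linearly blend between two plane states; u = 0 gives 'from', u = 1 gives 'to'.
PlaneState Blend(PlaneState from, PlaneState to, float u) {
    return { from.z + u * (to.z - from.z),
             from.scale + u * (to.scale - from.scale),
             from.brightness + u * (to.brightness - from.brightness) };
}

int main() {
    PlaneState start    = { 0.0f, 1.00f, 1.0f };   // shared starting plane
    PlaneState frontEnd = { +1.0f, 1.15f, 1.0f };  // selected content, pulled forward
    PlaneState backEnd  = { -2.0f, 0.85f, 0.6f };  // unselected content, pushed back
    for (float u = 0.0f; u <= 1.0f; u += 0.25f) {
        PlaneState f = Blend(start, frontEnd, u);
        PlaneState b = Blend(start, backEnd, u);
        std::printf("u=%.2f  front z=%+.2f scale=%.2f | back z=%+.2f brightness=%.2f\n",
                    u, f.z, f.scale, b.z, b.brightness);
    }
}
```

Interpolating from the shared starting state toward the two end states in this way produces the intermediate frames of the double wall animation, with scale and brightness changes making the back plane appear to recede even though the display itself is flat.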
Further with reference to Figure 18, the double wall effect can be used to illustrate to the user that the user has selected a power menu. Figure 18 illustrates the media user interface with a power menu 1801 in the front plane 1401 and the start menu content 301 in the back plane 1403. As is evident from a comparison of Figure 18 with Figure 3, the start menu content in Figure 18, behind the power menu 1801, is graphically simulated to appear smaller than the start menu content 301 in Figure 3, since the start menu content in Figure 18 is in a plane that has been pushed backward from the power menu 1801 in Figure 18. The power menu 1801 can be considered a secondary menu, since the user may quickly return from the power menu 1801 to the start menu 300, for example, if the user decides not to close the UI application 205 from the power menu 1801 or perform any other option available in the power menu. In this example, the power menu has buttons 1803-1811 for closing the media center application 205, logging off the current user, shutting down the computer, restarting the computer, and entering a sleep power mode, respectively, each button being selectable using the navigation and selection buttons on the remote control device 163. The use of the double-walled three-dimensional graphic effect can be advantageous for conveying to a user that the user has selected a secondary menu, such as a context menu or a power menu, from which the user can quickly return to the original menu from which the sub-menu was selected. However, the double wall effect can also be used for other purposes in a 10-Foot user interface, to conceptually indicate to the user, by pushing the content back, that the current navigation has been temporarily interrupted and that the new content in front of the interrupted content now has the focus. Figures 15-18 illustrate a power menu disclosure animation that can visually indicate to the user that the user has selected the power button 305 (Figure 3) in the start menu 300. Figure 15 illustrates a first intermediate frame 1501 of the animation after the user selects the power button 305. In frame 1501, the start plane 901, now considered to be the back plane 1403, has already been pushed back in the Z dimension, thus making all the content originally located on plane 901 (that is, all the start menu content) graphically appear smaller as it seems to move away from the user. Also in frame 1501, a window 1503 begins to appear. The power menu 1801 will be placed in window 1503 when window 1503 is fully formed. Figure 16 illustrates a second intermediate frame 1601 of the power menu disclosure animation. In Figure 16, the content in the back plane 1403 has not moved further, as the back plane quickly reaches its destination in Z space in this particular example (however, the speed of the animation and the speed with which any particular plane moves can be set as desired). Also in frame 1601, window 1503 continues to open, and is now approximately half its normal size. Figure 17 illustrates a third intermediate frame 1701 of the power menu disclosure animation. In frame 1701, the power menu window 1503 has almost reached its final size, and buttons 1803-1811 have begun to fade into view. Figure 18 illustrates the final frame of the power menu disclosure animation, where window 1503 is fully formed and the power menu 1801 is complete, including buttons 1803-1811. With reference to Figures 19-23, a variation of the double-walled effect can be used to bring up a context menu in the media user interface.
Illustrated by the top plan view of Figure 23, Figures 19-22 illustrate a context menu disclosure animation that can be used to provide a context menu to a user of the media user interface. Figure 23 illustrates the conceptual top plan view of the resulting two planes at the end of the animation, which begins from the conceptual top plan view shown in Figure 9b. Figure 23 illustrates a single hinge axis 2305 about which plane 2303 tilts back from the original position of plane 901. Figure 19 illustrates a My Music menu 1901, which results from the user selecting the highlighted My Music menu item of Figure 8. The My Music menu 1901 includes icons 1903-1911 representative of music stored on the computer 110 on which the media user interface is running, or on some other networked computer to which the computer 110 has access. In the example shown in Figure 19, since the My Music menu is currently in the Album view, as indicated by the view indicator 1925, the icons are representative of music albums. The My Music menu 1901 also has a plurality of menu buttons 1913-1923, through which a user can view music by album, artist, playlist, song, or genre, as well as search for music, respectively. The view indicator 1925 is placed next to whichever menu button represents the current My Music menu view. In Figure 19, the user selection cursor 309 indicates that icon 1905 is currently highlighted for selection by the user. A "right-press" on the icon 1905 starts the animation sequence shown in Figures 20-22. Since the user may be using a remote control device 163 to control the media user interface instead of a mouse, the remote control device may have a secondary selection button, similar to the right button of a computer mouse, selection of which brings up a context menu instead of playing the selected music as would result from using the primary selection button on the remote control device 163. Figure 20 illustrates a first intermediate frame 2001 just after the user right-presses on the icon 1905. In Figure 20, the icon 1905 has been slightly enlarged to draw the user's attention to the icon 1905 selected by the user (as opposed to any other icon), and the icon 1905 is also placed in plane 2301 (Figure 23). The remaining content, which originates from plane 901, is placed in plane 2303, and is presented so as to appear to have begun moving backward. In this example, the plane 2303 is hinged at the axis 2305, so that the content in plane 2303 appears to tilt backward instead of moving straight back. Figure 21 illustrates a second intermediate frame 2101 of the context menu disclosure animation. In frame 2101, window 2203 has begun to form, appearing to originate from a vertical center position 2105 of the final window 2205, and gradually enlarging in the upward and downward directions. The content in plane 2303 continues to swing backward about the hinge axis 2305. Figure 22 illustrates a final frame 2201 of the context menu disclosure animation sequence. In frame 2201, window 2203 is fully formed and the context menu 2205 is presented for navigation and selection by the user. Window 2203 (including its contents) and icon 1905 are in plane 2301, while the remaining content is fully tilted back in plane 2303. The single-hinge animation effect illustrated in Figures 19-23 can be modified in different ways.
For example, the hinge axis can be placed to the left of the corresponding hinged plane instead of to the right of the corresponding hinged plane as shown in Figure 23, or the hinge axis can be placed above or below the corresponding hinged plane. Those skilled in the art will appreciate that, by using a three-dimensional enabled application such as DirectX®, the hinge axis can be placed anywhere in the three-dimensional space. According to one aspect of the invention, the hinge axis can be placed conceptually away from the selected icon. That is, if the user selects an icon on the left side of the My Music menu 1901, the UI application can place the hinge axis 2305 to the right of the My Music menu, as shown in Figure 23, so that the content on the rear plane 2303 appears farther behind the selected icon than it would if the hinge axis were placed to the left of the menu. Similarly, if the user selects an icon on the right side of the My Music menu 1901, the UI application can place the hinge axis 2305 to the left of the My Music menu; if the user selects an icon at the top of the My Music menu 1901, the UI application can place the hinge axis 2305 below the My Music menu; and if the user selects an icon at the bottom of the My Music menu 1901, the UI application can place the hinge axis 2305 above the My Music menu. Similar hinge placements can be used in any direction, including diagonally, to tilt the unselected content as far as possible behind the selected content. According to one aspect of the invention, a user can specify where the hinge axis is located, for example, by changing a default hinge axis location under a Settings menu item. Other multi-plane effects and animations can be used to conceptually convey the navigation and selection of menus and items to a user of the media user interface; the specific multi-plane effect or animation used is secondary to the division of the menu content into two or more planes to conceptually convey to a user which item or items are relevant based on the user's menu and item navigation and selection in the media user interface. In some effects and animations, two planes can be used, as illustrated in Figures 10b (excluding plane 901), 14b (excluding plane 901), 23, or 33 (including planes 3301, 3303 and excluding plane 901). In other effects and animations, the UI application 205 can divide the content into three or more planes, as illustrated in Figures 10b (including plane 901), 14b (including plane 901), 29 (including planes 2901, 2903, 2905), and 30 (including planes 3001, 3003, 901). The most relevant content, or the content relating to the user's most recent selection, is preferably placed in the most prominent plane, typically the frontmost plane as compared to the other planes currently in use. Using a three-dimensional enabled development application such as DirectX® enables other novel features of the media user interface described herein. For example, the background of the media user interface may remain somewhat constant from one menu to the next, albeit changing slightly to indicate to the user that the application 205 is not frozen, and also to prevent burn-in of the display device 191. In this way, the media user interface can have an animated background, as illustrated in Figures 3-8, 11-13 and 15-22. In Figures 3-8, 11-13 and 15-22, the background appears similar in each figure.
However, upon close inspection, it will be observed that the backgrounds are actually slightly different, while remaining similar enough that they do not confuse the user. The animated background illustrated in Figures 3-8, 11-13 and 15-22 can be created using two or more rotating layers in the three-dimensional space, the front layer preferably being almost transparent, each layer having an alpha value and a rotational cycle length. Preferably, each cycle length is different, and the cycle lengths are not multiples of one another. The two layers can be connected as if by an invisible pole, and are separated in Z space (along the axis of the "invisible pole"). When the parent plane (the background) rotates, the child plane (the one above it) can also rotate, at the same or a different speed. The animated effect is thus achieved by rotating the parent plane; the depth between the two layers creates a sense of movement for the user. Another aspect enabled by the use of the three-dimensional space and alpha shading is alpha fading, as illustrated in Figures 13, 19 and 22. That is, conventionally, when a presentation does not have enough space allotted to present the entire name of an item, or enough space to present all the text associated with an item, the application will abruptly cut off the text or present an ellipsis ("...") to indicate to the user that more text is available than is presented. However, the media user interface can use the alpha (α) video channel of the text to gradually fade the text. With reference to Figure 24, in step 2401 the UI application 205 determines how much space, S, is available for the text T0 that will be written to the screen. In step 2403, the UI application 205 determines how much of the text T0 will fit in the allowed space S. The amount of text that will fit in the allowed space S is designated T1. The text measurement can be acquired from Win32 GDI APIs, such as DrawText. In step 2405, the UI application 205 determines whether T1 and T0 are equal, meaning that all of the text T0 will fit in the space S. If T1 and T0 are equal, then the UI application 205 proceeds in step 2407 to draw the text T0 in the allowed space without any alpha blending. If T1 and T0 are not equal, then the UI application 205 draws the text T1 in the allowed space, and alpha blends a final predetermined amount of the text T1, for example, the last 1-5 characters, gradually changing the alpha level from fully opaque to fully transparent. The alpha gradation can use Direct3D vertex color interpolation capabilities. The need for an ellipsis is thus avoided through the use of alpha blending, referred to herein as alpha fading. In yet another feature of the media user interface, the UI application 205 can provide additional aspects of the media user interface in addition to those described above. For example, Figure 25 illustrates a folder navigation screen 2501. In Figure 25, the folder navigation screen 2501 is being used to select folders in which to search for music to be added to a music collection. However, the folder navigation illustrated in Figure 25 can be used for any purpose for which folder navigation is useful. Figures 26-28 illustrate a volume window 2601 that appears when the user adjusts the sound volume, for example, using the remote control device 163.
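The alpha fading steps 2401-2407 described above can be sketched as follows. The sketch is illustrative only: real text measurement would come from an API such as DrawText, whereas a fixed per-character width stands in for it here, and the five-character fade length is likewise an assumption made for the example.

```cpp
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

// Returns one alpha value per drawable character: 1.0 = fully opaque,
// 0.0 = fully transparent.
std::vector<float> AlphaFade(const std::string& t0, float spaceS,
                             float charWidth, std::size_t fadeChars) {
    // Determine how many characters of T0 fit in the allowed space S; this is T1.
    // (Measurement is simulated here with a fixed per-character width.)
    std::size_t fit = std::min(t0.size(), static_cast<std::size_t>(spaceS / charWidth));
    std::vector<float> alpha(fit, 1.0f);
    // If T1 equals T0, everything stays opaque; otherwise fade the final
    // characters from opaque to transparent instead of appending an ellipsis.
    if (fit < t0.size() && fit > 0) {
        std::size_t n = std::min(fadeChars, fit);
        for (std::size_t i = 0; i < n; ++i)
            alpha[fit - 1 - i] = static_cast<float>(i) / static_cast<float>(n);  // 0.0 at the last char
    }
    return alpha;
}

int main() {
    std::string title = "A Very Long Album Title That Will Not Fit";
    std::vector<float> alpha = AlphaFade(title, 200.0f, 8.0f, 5);  // 25 characters fit
    for (std::size_t i = 0; i < alpha.size(); ++i)
        std::printf("%c:%.2f ", title[i], alpha[i]);
    std::printf("\n");
}
```

The per-character alpha values returned by such a routine would then be applied when the text is drawn, for example through vertex color interpolation as noted above.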
The volume window 2601 includes a numerical indicator 2603 of the volume level, as well as a visual indicator 2605 of the volume level in the form of a volume bar that fills from left to right as the volume moves from the minimum volume level to the maximum volume level. The volume window 2601 is beneficial because the user can easily determine, prior to unmuting, what volume level the sound will return to. That is, as shown in Figure 31, the numerical indicator can switch to "mute" when the mute feature is turned on, while the volume bar 2605 indicates the volume that will result when the mute is turned off. The volume window 2601 may appear in the single plane 901 (Figure 9), or may appear using any of the double-plane effects described above. The media user interface described above, although presented on a flat or slightly convex display device, such as a monitor or a TV, is graphically simulated to appear three-dimensional. Using the aspects described herein, the UI application 205 provides a media user interface that is suitable for use as a 10-Foot user interface by placing prominent icons and menu items on the display, using three-dimensional transformations to increase the effective presentation real estate in which content can be presented, and using animation to conceptually convey navigation between views to the user. The media user interface is especially useful when used with a media mode of operation of a data processing device, although a media mode is not required. Although the invention has been described with respect to specific examples, including the currently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and changes to the systems and techniques described above. Accordingly, the spirit and scope of the invention should be broadly construed as set forth in the appended claims.

Claims (36)

1. - A method for presenting content to a user through a user interface, comprising the steps of: presenting, on a display device connected to a data processing device on which the user interface is provided, a first plurality of selectable menu items, selectable by a user using a remote control device; and responsive to user selection of one of the plurality of menu items, presenting the selected menu item in a first simulated plane in a three-dimensional graphic space, and presenting the plurality of menu items other than the selected item in a second simulated plane in the three-dimensional graphic space, wherein the first plane has a more prominent presentation position than the second plane.
2. The method according to claim 1, wherein the first plane having a more prominent presentation position than the second plane comprises the first plane being in front of the second plane in the three-dimensional graphic space as seen by a user of the user interface.
3. The method according to claim 1, further comprising presenting in the first plane a second plurality of menu items corresponding to the selected item.
4. - The method according to claim 1, further comprising pivoting the first plane on a first hinge axis.
5. The method according to claim 4, further comprising pivoting the second plane on a second hinge axis.
6. The method according to claim 5, wherein the first hinge axis has a different location in the three-dimensional space than the second hinge axis.
7. The method according to claim 1, further comprising simulating pushing the second plane backward in a dimension Z of the three-dimensional space as compared to the first plane.
8. The method according to claim 7, further comprising pulling the first plane forward in the Z dimension.
9. The method according to claim 5, wherein the first and second hinge axes are on substantially opposite sides of the display device in a dimension X of the three-dimensional space.
10. The method according to claim 3, wherein the second plurality of menu items comprises items of a context menu corresponding to the selected item.
11. The method according to claim 3, wherein the selected item comprises a type of media, and the second plurality of menu items comprises a list of most recently used media of the selected type of media.
12. The method according to claim 3, wherein the second plurality of menu items comprises a sub-menu below the selected menu item.
13. A computer-readable medium that stores computer executable instructions to perform the method of claim 1.
14. A data processing system comprising: a remote control device for controlling the data processing system; a data processor configured to provide a three-dimensional user interface on a display device connected to the data processing system by executing computer executable software modules stored in a memory of the data processing system; and the memory storing the computer executable software modules, comprising: a user interface software module configured to provide the user interface in a three-dimensional space presented on the display device, said user interface including a plurality of menus navigable by a user using the remote control device; and an animation module that, under the control of the user interface software module, provides a sequence of frames of an animation when the user selects one of a plurality of menu items of a first menu of the plurality of menus, wherein the animation sequence divides the plurality of menu items between a first plane and a second plane and animates the first and second planes moving relative to each other in the three-dimensional space.
15. The data processing device according to claim 14, wherein the selected menu item is in the first plane and the plurality of menu items other than the selected menu item is in the second plane.
16. The data processing device according to claim 14, wherein the first plane has a more prominent display position than the second plane.
17. The data processing device according to claim 16, wherein the first plane having a more prominent presentation position than the second plane comprises placing the first plane in front of the second plane in the three-dimensional graphic space as seen by a user of the user interface.
18. The data processing device according to claim 14, wherein the animation module pivots the first plane on a first hinge axis.
19. The data processing device according to claim 18, wherein the animation module pivots the second plane on a second hinge axis.
20. - The data processing device according to claim 19, wherein the first and second hinge axes are on substantially opposite sides of the display device in a dimension X of the three-dimensional space.
21. The data processing device according to claim 14, wherein the animation module pivots the second plane on a first hinge axis.
22. The data processing device according to claim 14, wherein the first hinge axis has a different location in the three-dimensional space than the second hinge axis.
23. The data processing device according to claim 14, wherein the animation module pushes the second plane back in a dimension Z of the three-dimensional space as compared to the first plane.
24. The data processing device according to claim 14, wherein the animation module pushes the first plane forward in the dimension Z.
25. The data processing device according to claim 14, wherein the user interface software module causes a second plurality of menu items corresponding to the selected item to be presented in the first plane.
26. The data processing device according to claim 25, wherein the second plurality of menu items comprises items of a context menu corresponding to the selected item.
27. The data processing device according to claim 25, wherein the selected item comprises a type of media, and the second plurality of menu items comprises a list of most recently used media of the selected type of media.
28. The data processing device according to claim 27, wherein the animation module causes the second plurality of menu items to appear to slide out from behind the selected menu item.
28. The data processing system according to claim 25, wherein the second plurality of menu items comprises a sub-menu below the selected menu item.
29. A computer-readable medium storing computer executable instructions for a method for providing a three-dimensional user interface, comprising the steps of: generating a three-dimensional graphic space for providing a user interface of a data processing device; graphically presenting, on a display device connected to the data processing device, a first list of a plurality of menu items selectable by a user navigating the user interface using a remote control device; responsive to the user selecting one of the plurality of menu items, presenting the selected menu item in a first plane in the three-dimensional graphic space, and presenting the plurality of menu items other than the selected item in a second plane in the three-dimensional graphic space; and animating the first and second planes moving relative to one another in the three-dimensional space, wherein, when the animation is complete, the first plane has a more prominent presentation position than the second plane.
30. The computer readable medium according to claim 29, wherein the animation of the first and second planes further comprises pivoting the first plane around a first hinge axis in the three-dimensional space.
31. The computer-readable medium according to claim 29, wherein the animation of the first and second planes further comprises pivoting the second plane around the first hinge axis in the three-dimensional space.
32. The computer-readable medium according to claim 30, wherein the animation of the first and second planes further comprises pivoting the second plane around a second hinge axis in the three-dimensional space.
33. The computer-readable medium according to claim 32, wherein the first hinge axis is different from the second hinge axis.
34. - The computer readable medium according to claim 29, wherein the animation of the first and second planes comprises moving the first and second planes in a dimension Z of the three-dimensional space without altering the X and Y dimensions of the content of either the first or the second plane.
35.- A user interface stored as executable instructions in a memory of a computer system and presentable on a presentation device connected to the computer system, said user interface comprising: in a first state: a first plurality of selectable menu items, wherein a user can highlight one of the first plurality of selectable menu items at a time with a selection cursor, and a second plurality of menu items that remain in correspondence with the highlighted one of the first plurality of menu items, wherein the second plurality of menu items changes so as to remain in correspondence with the highlighted item of the first menu if the user moves the selection cursor from a first menu item to a second menu item; and in a second state: a graphically simulated first plane having the one selected by the user from the first plurality of menu items and the second plurality of menu items corresponding to the one selected from the first plurality of menu items; and a graphically simulated second plane having the remaining first plurality of menu items other than the one selected by the user from the first plurality of menu items, wherein the graphically simulated first plane has a more prominent appearance than the graphically simulated second plane, wherein the user interface animates from the first state to the second state when the user moves the selection cursor to highlight one of the second plurality of menu items.
36. The user interface according to claim 35, wherein the animated transition comprises pivoting at least one of the first plane and the second plane on a hinge axis.
MXPA/A/2005/007087A 2004-08-03 2005-06-28 Multi-planar three-dimensional user interface MXPA05007087A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10909838 2004-08-03

Publications (1)

Publication Number Publication Date
MXPA05007087A true MXPA05007087A (en) 2007-04-10
