WO2005045558A2 - Dynamic window anatomy - Google Patents

Dynamic window anatomy

Info

Publication number
WO2005045558A2
WO2005045558A2 (PCT/US2004/019109)
Authority
WO
WIPO (PCT)
Prior art keywords
window
content
properties
base
desktop
Prior art date
Application number
PCT/US2004/019109
Other languages
French (fr)
Other versions
WO2005045558A3 (en)
Inventor
Scott Hanggie
Victor Tan
Gerardo Bermudez
Gregory D. Swedberg
Original Assignee
Microsoft Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corporation filed Critical Microsoft Corporation
Priority to KR1020057007426A priority Critical patent/KR101086570B1/en
Priority to CN2004800013383A priority patent/CN101288104B/en
Priority to AU2004279204A priority patent/AU2004279204B8/en
Priority to EP04776616A priority patent/EP1682964A4/en
Priority to MXPA05007169A priority patent/MXPA05007169A/en
Priority to BR0406387-2A priority patent/BRPI0406387A/en
Priority to CA002501671A priority patent/CA2501671A1/en
Priority to JP2006536548A priority patent/JP4808158B2/en
Publication of WO2005045558A2 publication Critical patent/WO2005045558A2/en
Publication of WO2005045558A3 publication Critical patent/WO2005045558A3/en
Priority to AU2009217377A priority patent/AU2009217377B2/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/14: Display of multiple viewports

Definitions

  • the invention relates generally to a graphical user interface of a computer operating system. More specifically, the invention provides a mechanism that allows for the possibility of having, on a window-by-window basis, multiple and/or irregularly-shaped client and non-client content areas in each window.
  • Computer operating systems typically have a shell that provides a graphical user interface (GUI) to an end-user.
  • the shell consists of one or a combination of software components that provide direct communication between the user and the operating system.
  • the graphical user interface typically provides a graphical icon-oriented and/or menu driven environment for the user to interact with the operating system, and is often based on a desktop metaphor. More specifically, the graphical user interface is designed to model the real world activity of working at a desk.
  • the desktop environment typically occupies the entire surface of a single display device, or may span multiple display devices, and hosts subordinate user interface objects such as icons, menus, cursors and windows.
  • a window is typically dedicated to a unique user activity, and is created and managed by either a third party software application or a system application. Each window behaves and displays its content independently as if it were a virtual display device under control of its particular application program. Windows can typically be interactively resized, moved around the display, and arranged in stacked order so as to fully or partially overlap one another. In some windowing environments, a window can assume discrete visual or behavioral states, such as minimized in size to an icon or maximized in size to occupy the entire display surface.
  • desktop windows are commonly assigned a top to bottom order in which they are displayed, known in the art as the Z-order, whereby any window overlies all other windows lower than itself with respect to Z-order occupying the same projected position on the screen.
  • a single, selected window has the "focus" at any given time, and is receptive to the user's input.
  • the user can direct input focus to another window by clicking the window with a mouse or other pointer device, or by employing a system-defined keyboard shortcut or key combination. This allows the user to work efficiently with multiple application programs, files and documents in a manner similar to the real world scenario of managing paper documents and other items which can be arbitrarily stacked or arranged on a physical desktop.
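  • the Z-order and focus behavior described above can be sketched as follows; this is a minimal illustrative model with hypothetical names, not the actual window manager implementation. The window stack is kept bottom-to-top, and clicking a window raises it to the top and directs input focus to it.

```python
# Minimal sketch of a Z-ordered window stack with input focus.
# All class and method names here are illustrative assumptions.

class Window:
    def __init__(self, name):
        self.name = name

class Desktop:
    def __init__(self):
        self.z_order = []      # index 0 = bottom, last index = topmost
        self.focused = None    # a single window holds focus at any time

    def open(self, window):
        self.z_order.append(window)   # new windows open on top
        self.focused = window

    def click(self, window):
        # Clicking a window raises it to the top and gives it focus.
        self.z_order.remove(window)
        self.z_order.append(window)
        self.focused = window

desk = Desktop()
a, b = Window("editor"), Window("browser")
desk.open(a)
desk.open(b)        # "browser" is now topmost and focused
desk.click(a)       # "editor" is raised above "browser" and takes focus
```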
  • a drawback to many prior graphical user interface desktop implementations is their limited capacity to present visually rich content or exploit enhancements in graphical rendering technology.
  • Such enhancements include real-time rendering of physically modeled (lit, shaded, textured, transparent, reflecting, and refracting) two and three-dimensional content and smooth, high-performance animations.
  • visually rich content is possible within certain application programs running windowed or full screen within the graphical user interfaces of Windows® brand operating systems and like operating system shells.
  • the types of application programs that present such content are video games with real time 3D animation and effects, advanced graphical authoring tools such as ray tracers and advanced 2D and 3D publishing applications. Since the visual output of these programs is either restricted to the content area of its application window(s) or rendered full-screen to the exclusion of other windows and the desktop itself, the rich graphical output of the application program in no way contributes to the presentation of the desktop environment.
  • Computer operating systems employ a software layer responsible for managing user interface objects such as icons, menus, cursors, windows and desktops; arbitrating events from input devices such as the mouse and keyboard; and providing user interface services to software applications.
  • This software layer may be referred to as the Desktop Window Manager (DWM).
  • the rendering logic, input event routing, and application programming interfaces (APIs) of the Desktop Window Manager (DWM) collectively embody user interface policy, which in turn defines the overall user experience of the operating system.
  • Prior DWM implementations employ an "invalidation" model for rendering the desktop that evolved primarily from the need to conserve video and system memory resources as well as CPU and GPU bandwidth.
  • the affected portion of the display is "invalidated".
  • the DWM internally invalidates areas affected by a window size or move, whereas an application attempting to redraw all or a portion of its own window instructs the operating system, via an API, to invalidate the specified area of its window.
  • the DWM processes the invalidation request by determining the subset of the requested region that is in actual need of an on-screen update. The DWM typically accomplishes this by consulting a maintained list of intersecting regions associated with the target window, other windows overlying the target, clipping regions associated with the affected windows, and the visible boundaries of the display.
  • the DWM subsequently sends each affected application a paint message specifying the region in need of an update in a prescribed top-to-bottom order.
  • Applications can choose to either honor or ignore the specified region. Any painting performed by an application outside the local update region is automatically clipped by the DWM using services provided by a lower level graphical rendering engine such as the Graphics Device Interface (GDI).
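  • the invalidation step described above can be sketched as a region subtraction: the region actually repainted is the requested region clipped to the visible display, minus everything covered by windows higher in the Z-order. The helper names below are illustrative assumptions, not the Win32 API; regions are modeled as sets of pixel cells for simplicity.

```python
# Sketch of update-region computation under the invalidation model.

def rect_cells(left, top, right, bottom):
    """Represent an axis-aligned rectangle as a set of pixel cells."""
    return {(x, y) for x in range(left, right) for y in range(top, bottom)}

def compute_update_region(requested, overlying_windows, display):
    region = requested & display          # clip to the visible display bounds
    for w in overlying_windows:           # subtract windows above the target
        region -= w
    return region

display = rect_cells(0, 0, 100, 100)
requested = rect_cells(10, 10, 50, 50)    # area the app asked to redraw
topmost = rect_cells(30, 30, 120, 120)    # an overlying window clips part of it
update = compute_update_region(requested, [topmost], display)
```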
  • An advantage of the invalidation-messaging model is conservation of display memory. That is, an invalidation based DWM only needs to maintain enough buffer memory to draw a single desktop, without "remembering" what might be underneath presently displayed content.
  • features such as non-rectangular windows and rich 2D animations via GDI require CPU intensive calculations involving complex regions and/or extensive sampling of the display surface (thereby limiting the potential for graphics hardware-based acceleration), whereas other features such as transparency, shadows, 3D graphics and advanced lighting effects are extremely difficult and very resource intensive.
  • the Microsoft Windows® XP window manager, historically known as USER, has served as the dominant component of the graphical user interface subsystem (now known as Win32) since the advent of the Windows® brand operating system.
  • USER employs the 2-dimensional Graphics Device Interface (GDI) graphic rendering engine to render the display.
  • GDI is the other major subcomponent of Win32, and is based on rendering technology present in the original Windows® brand operating system. USER renders each window to the display using an invalidation-messaging model in concert with GDI clipping regions and 2D drawing primitives.
  • a primary activity of USER in rendering the desktop involves the identification of regions of the display in need of visual update, and informing applications of the need and location to draw, as per the invalidation model of desktop rendering.
  • the next development in desktop rendering is a bottom-to-top rendering approach referred to as desktop compositing.
  • a DWM that renders via desktop compositing is referred to herein as a compositing DWM, or CDWM.
  • the desktop is drawn from the bottom layer up to the top layer. That is, the desktop background is drawn first, followed by icons, folders, and content sitting directly on the desktop, followed by the folder(s) up one level, and so forth.
  • each iterative layer can base its content on the layer below it.
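  • the bottom-to-top compositing order described above can be sketched as follows (an illustrative model, not the CDWM itself): the background is painted first, then each window in turn, so every layer draws over, and can base its content on, the output of the layers beneath it.

```python
# Sketch of bottom-to-top desktop compositing into a framebuffer.
# Windows are modeled as sparse {(x, y): color} maps; names are illustrative.

def composite(background, windows_bottom_to_top, width, height):
    frame = [[background] * width for _ in range(height)]
    for win in windows_bottom_to_top:
        for (x, y), color in win.items():
            frame[y][x] = color       # later (higher) layers overwrite lower ones
    return frame

lower = {(1, 1): "red", (2, 1): "red"}
upper = {(2, 1): "blue"}              # overlaps one pixel of `lower`
frame = composite("gray", [lower, upper], 4, 3)
```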
  • desktop compositing is a memory intensive process because the CDWM maintains in memory a copy of each item drawn to the desktop.
  • UI elements residing in the non-client area cannot be modified by the application.
  • applications are limited to placing application content in a single rectangular region (the client area) unless the application wants to assume rendering and hit- testing responsibility for the entire non-client area (as well as the client area) of the window. Loosening any of these restrictions within the existing USER framework would render unusable many popular legacy applications that anticipate and depend on them.
  • the present invention is directed to a composited desktop providing advanced graphics and rendering capabilities.
  • a first illustrative aspect of the invention provides a data processing system that draws windows with dynamic anatomies.
  • the data processing system has a memory that stores window properties comprising, for each window for which properties are stored, properties for a base object and properties for at least one content object.
  • the data processing system also has a compositing desktop window manager software module that composes a desktop based on the window properties of each window for which properties are stored.
  • Another aspect of the invention provides a data structure for storing window information for windows having non-uniform, dynamic anatomies.
  • the data structure includes a first data field storing base object properties for a base object of a window, and a second data field storing content object properties for one or more content objects of the window.
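  • the data structure described above can be sketched as follows; the field names and property choices are assumptions for illustration, not the patent's actual definitions. Each window carries one base object plus any number of content objects.

```python
# Sketch of a per-window state block: base object properties plus a list
# of content object properties. All names here are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class BaseObject:
    shape: str = "rectangle"        # the base may be irregularly shaped
    material: str = "frosted-glass" # e.g. a themed frame material

@dataclass
class ContentObject:
    width: int
    height: int
    pixel_format: str = "ARGB32"    # format of the diffuse content texture

@dataclass
class CompositedWindow:
    base: BaseObject
    contents: List[ContentObject] = field(default_factory=list)

win = CompositedWindow(base=BaseObject())
win.contents.append(ContentObject(640, 480))   # attach a main content object
win.contents.append(ContentObject(200, 40))    # e.g. a separate toolbar surface
```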
  • Figure 1A illustrates an operating environment that may be used for one or more aspects of an illustrative embodiment of the invention.
  • Figure 1B illustrates the distribution of function and services among components in an illustrative embodiment of a composited desktop platform.
  • Figure 2 illustrates a compositing method according to an illustrative aspect of the invention.
  • Figure 3 illustrates a window according to an illustrative aspect of the invention.
  • Figure 4 illustrates a portion of a window compositing method according to an illustrative aspect of the invention.
  • Figure 5 illustrates a frosted glass framed window rendered according to an illustrative aspect of the invention.
  • Figure 6 illustrates a window with a dynamic window anatomy.
  • Figure 7 illustrates regions used during mesh resizing.
  • the present invention provides a desktop window manager (DWM) that uses desktop compositing as its preferred rendering model.
  • the inventive desktop window manager is referred to herein as a Compositing Desktop Window Manager (CDWM).
  • the CDWM together with the composition subsystem, referred to as the Unified Compositing Engine (UCE), provides 3D graphics and animation, shadows, transparency, advanced lighting techniques and other rich visual features on the desktop.
  • the compositing rendering model used herein intrinsically eliminates the invalidation step in rendering and minimizes or eliminates the need to transmit paint and other notification messages because the system retains sufficient state information to render each window as required.
  • FIG. 1 illustrates an example of a suitable computing system environment 100 in which the invention may be implemented.
  • the computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
  • the invention is operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers; server computers; portable and hand-held devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; distributed computing environments that include any of the above systems or devices; and the like.
  • the invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including memory storage devices.
  • an illustrative system for implementing the invention includes a general purpose computing device in the form of a computer 110.
  • Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120.
  • the system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Advanced Graphics Port (AGP) bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
  • Computer 110 typically includes a variety of computer readable media.
  • Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media.
  • Computer readable media may comprise computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110.
  • Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
  • the system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132.
  • a basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within computer 110, is typically stored in ROM 131.
  • RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120.
  • Figure 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
  • the computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
  • Figure 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media.
  • removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
  • the hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
  • hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
  • a user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • these and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
  • a monitor 184 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 183.
  • Computer 110 may also include a digitizer 185 for use in conjunction with monitor 184 to allow a user to provide input using a stylus input device 186.
  • computers may also include other peripheral output devices such as speakers 189 and printer 188, which may be connected through an output peripheral interface 187.
  • the computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180.
  • the remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in Figure 1.
  • the logical connections depicted in Figure 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks.
  • Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • when used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet.
  • the modem 172 which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism.
  • program modules depicted relative to the computer 110, or portions thereof may be stored in the remote memory storage device.
  • Figure 1 illustrates remote application programs 182 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • the invention may use a compositing desktop window manager (CDWM) to draw and maintain the desktop display using a composited desktop model, i.e., a bottom-to-top rendering methodology.
  • the CDWM may maintain content in a buffer memory area for future reference.
  • the CDWM composes the desktop by drawing it from the bottom up, beginning with the desktop background and proceeding through overlapping windows in reverse Z order. While composing the desktop, the CDWM may draw each window based in part on the content on top of which the window is being drawn, and based in part on other environmental factors (e.g., light source, reflective properties, etc.).
  • the CDWM may use the alpha channel of an ARGB format texture to provide transparency to a window, and may selectively emphasize portions of window content (e.g., the frame) based on a virtual light source.
  • the CDWM may reside as part of the operating system 134, 144, or may reside independently of the operating system, e.g., in other program modules 136, 146.
  • the CDWM may rely upon a lower level graphics compositing subsystem, referred to herein as a Unified Compositing Engine (UCE), further described below and in co-pending application serial number (attorney docket number 50037.201US01), filed October 23, 2003, entitled "System and Method for a Unified Composition Engine in a Graphics Processing System", incorporated herein by reference in its entirety for all purposes.
  • the UCE is based on or uses Direct3D® and DirectX® technology by Microsoft Corporation of Redmond, Washington.
  • other graphics compositing subsystems may be used, such as variations of the X Window platform based on the OpenGL® graphics engine by Silicon Graphics, Inc. of Mountain View, California, and the like.
  • the UCE enables 3D graphics and animation, transparency, shadows, lighting effects, bump mapping, environment mapping, and other rich visual features on the desktop.
  • FIG. 1B illustrates a component architecture according to an illustrative embodiment of a desktop composition platform.
  • a Compositing Desktop Window Manager (CDWM) 190 may include an Application Programming Interface 190a through which composition-aware Application Software 191 obtains CDWM window and content creation and management services; a Subsystem Programming Interface 190b, through which the Legacy Windowing Graphics Subsystem 192 sends update notifications for changes affecting the redirected graphics output of individual windows (window graphical output redirection is described in more detail below); and a UI Object Manager 190c which maintains a Z-ordered repository for desktop UI objects such as windows and their associated content.
  • the UI Object Manager may communicate with a Theme Manager 193 to retrieve resources, object behavioral attributes, and rendering metrics associated with an active desktop theme.
  • the Legacy Graphical User Interface Subsystem 192 may include a Legacy Window Manager 192a and Legacy Graphics Device Interface 192b.
  • the Legacy Window Manager 192a provides invalidation-model windowing and desktop services for software applications developed prior to the advent of the CDWM.
  • the Legacy Graphics Device Interface 192b provides 2D graphics services to both legacy applications as well as the Legacy Window Manager.
  • the Legacy Graphics Device Interface 192b, based on the invalidation model for rendering the desktop, may lack support for 3D, hardware-accelerated rendering primitives and transformations, and might not natively support per-pixel alpha channel transparency in bitmap copy and transfer operations.
  • the Legacy Window Manager 192a and Graphical Device Interface 192b continue to serve to decrease the cost of ownership for users who wish to upgrade their operating system without sacrificing the ability to run their favorite or critical software applications that use the invalidation model.
  • the Legacy Graphical User Interface Subsystem 192 may also be involved in the compositing process.
  • the perceived platform environment for legacy applications preferably does not change, in order to avoid compromising their robustness on the composited desktop, yet the fundamental manner in which legacy windows are rendered to the desktop is altered.
  • the invention describes how this is achieved through the addition of a feature described herein as window graphical output redirection.
  • a Unified Compositing Engine (UCE) 194 may service rendering instructions and coalesce resources emitted from the CDWM via a Programming Interface 194a.
  • the UCE Programming Interface 194a provides the CDWM, and ultimately, applications, an abstract interface to a broad range of graphics services.
  • these UCE services include resource management, encapsulation from multiple-display scenarios, and remote desktop support.
  • Graphics resource contention between CDWM write operations and rendering operations may be arbitrated by an internal Resource Manager 194b. Requests for resource updates and rendering services are placed on the UCE's Request Queue 194c by the Programming Interface subcomponent 194a. These requests may be processed asynchronously by the Rendering Module 194d at intervals coinciding with the refresh rate of the display devices installed on the system. Thus, the Rendering Module 194d of the UCE 194 may dequeue CDWM requests, access and manipulate resources stored in the Resource Manager 194b as necessary, and assemble and deliver display-specific rendering instructions to the 3D Graphics Interface 195.
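  • the request-queue flow described above can be sketched as follows; the names are illustrative assumptions, and the real UCE processes its queue asynchronously at display refresh intervals rather than on demand.

```python
# Sketch of a FIFO request queue between a programming interface (producer)
# and a rendering module (consumer). Names and request strings are hypothetical.

from collections import deque

class RequestQueue:
    def __init__(self):
        self._queue = deque()

    def submit(self, request):
        # Called by the programming interface to enqueue a request.
        self._queue.append(request)

    def drain(self):
        # Called by the rendering module once per display refresh:
        # dequeue and process all pending requests in FIFO order.
        processed = []
        while self._queue:
            processed.append(self._queue.popleft())
        return processed

q = RequestQueue()
q.submit("update-texture:window-7")   # resource update request
q.submit("render:window-7")           # rendering request
batch = q.drain()                     # both handled in submission order
```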
  • Rendering the desktop to multiple displays requires abstraction of the differences in refresh rate, pixel format support, and device coordinate mapping among heterogeneous display devices.
  • the UCE may provide this abstraction.
  • the UCE may also be responsible for delivering graphics data over a network connection in remote desktop configurations.
  • resource contention should be avoided, performance optimizations should be enacted, and security should be robust; these responsibilities may also rest with the UCE.
  • the 3D Graphics Interface 195 may include a low-level, immediate-mode (stateless) graphics service such as Direct3D®, OpenGL®, or the like.
  • a purpose of the 3D Graphics Interface may be to provide an abstract interface over the features of the particular graphics hardware configuration.
  • the 3D Graphics Interface may service a single display device; the UCE may parse and distribute the CDWM's rendering instructions among multiple graphics output devices 197 in a multiple-display system via multiple device drivers 196.
  • the component architecture depicted in Figure 1B is that of an illustrative embodiment.
  • the figure is intended to illustrate functions that the invention may include. These functions may be distributed among a fewer or greater number of software components than those represented in the figure, according to the capabilities of the platform and the desired feature set.
  • a system that lacks theme management might derive all stock resources from the system, likely as static resources managed by the CDWM itself, rather than from a separate theme manager.
  • a platform that allows pluggable window managers may replace the Application Programming Interface 190a in the CDWM with a Pluggable Window Manager Interface in order to abstract the details of composited UI object and resource management.
  • Another possible variation may eliminate the Subsystem Programming Interface 190b if legacy application compatibility is not required.
  • FIG. 2 illustrates a general method for performing desktop compositing according to an illustrative aspect of the invention.
  • Steps 201 through 205 describe the interaction of a composition aware application using compositing desktop window manager (CDWM) APIs to create and manage a window and window content.
  • Steps 207 and 209 depict the interaction between legacy, invalidation-model window manager applications and the CDWM to composite legacy window content.
  • the compositing desktop window manager receives requests from a composition-aware application to (1) create a composited window and (2) attach a content object.
  • the invention is not limited to a single content object per window; an application can dynamically create and attach to a window (as well as detach and destroy) any number of content objects via the CDWM API, further described below.
  • a content object consists of a raster surface of specified size and pixel format to be used as a diffuse texture mapped to an application- or system- defined mesh, along with optional accessory resources such as additional textures (light map, specular map, bump/normal map, etc), lights and a pixel shader.
  • the pixel format of the diffuse content texture may be any of the available formats supported by the video hardware installed on the system, but for the purposes of the current illustration, may be 32-bit ARGB.
  • the application may be implicitly aware that the alpha (A) channel may be used to vary the transparency level of the content pixel, thus affording fine control over the amount of desktop background information modulating with the source pixel on final rendering.
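The per-pixel alpha modulation described above follows the standard source-over blend. A minimal sketch — the function name and the 0–255 channel convention are assumptions for illustration, not the patent's API:

```python
def blend_over(content_argb, background_rgb):
    """Source-over blend of one 32-bit ARGB content pixel onto an opaque
    desktop background pixel: the alpha (A) channel controls how much of
    the background modulates with the source on final rendering."""
    a, r, g, b = content_argb
    alpha = a / 255.0
    return tuple(
        round(alpha * src + (1.0 - alpha) * dst)
        for src, dst in zip((r, g, b), background_rgb)
    )

# Fully opaque content replaces the background pixel entirely.
opaque = blend_over((255, 200, 100, 50), (0, 0, 0))
# ~50% alpha lets half of the desktop pixel show through.
half = blend_over((128, 255, 255, 255), (0, 0, 0))
```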
  • the CDWM allocates a state block for the window to which it attaches a CDWM-implemented content object.
  • the content object allocates the resources requested or attaches resources forwarded by the application, and then marshals these resources to the UCE to allow ready access on UCE update requests.
  • In step 205, the application notifies the CDWM of an unsolicited change to the window or the window content.
  • These changes can affect any window or content state, but for purpose of simplicity, the illustration depicts three common update requests: content size, window position or scale, or a change to the pixels of the content's diffuse texture.
  • Step 207 can be more generally described as placing the legacy window and graphics subsystem in "composition mode", in which the rendering of each individual window is redirected to a separate memory buffer.
  • the Legacy Graphical User Interface Subsystem 192 redirects the output of the graphics instructions involved in rendering the window to a bitmapped memory surface associated with the window.
  • redirection buffers may be managed by either the CDWM or the legacy window manager 192a, but for the purpose of this illustration, surface resource management is centralized in the CDWM.
  • Each redirection buffer either constitutes or is used to generate a diffuse content texture resource for the window.
  • the legacy window manager 192a need not invoke the CDWM window and content creation APIs; the legacy subsystem-CDWM communication channel for notifications is distinct from that of the application interface, and the CDWM derives composited window attributes (frame and border style, caption, etc) and state (hidden/shown, minimized/maximized, etc) from existing legacy window properties.
  • the legacy window manager 192a informs the CDWM 190 of any change affecting the redirected window content texture that may necessitate a visual update.
  • the CDWM 190 discriminates among size, position/scale, and pixel-level texture update requests, and acts accordingly.
  • On a size update (step 211), the CDWM first determines whether a frame is associated with the target window (step 213). If a frame is associated with the window (step 215), the CDWM determines the appropriate size and orientation of the frame primitive based on a two- or three-dimensional extent explicitly provided by a composition-aware application, or on a combination of legacy and CDWM window metrics and the updated dimensions of the redirected legacy surface.
  • the CDWM makes the appropriate changes to the position information in the vertices in the frame mesh, and forwards the vertex data buffer to the UCE.
  • the UCE places the mesh update directive and the new vertex information on a queue for asynchronous processing. If the window does not have a frame, step 215 may be bypassed. In the case of either framed or frameless windows, size changes affecting the content area may cause the CDWM to resize the content mesh and queue the appropriate mesh update request and data to the UCE (step 217).
  • the CDWM determines the new transformation parameters and queues a transform resource update request along with the data to the UCE for asynchronous processing (step 221).
  • the resource minimally consists of a four-by-four transformation matrix, but may contain additional data to support filtered transforms.
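A four-by-four matrix of this kind transforms window vertices in homogeneous coordinates. A minimal sketch, assuming a row-major matrix with translation in the last column — the patent does not specify a convention:

```python
def apply_transform(m, point):
    """Apply a 4x4 row-major transformation matrix to a 3D point in
    homogeneous coordinates (w = 1), then divide out w."""
    x, y, z = point
    v = (x, y, z, 1.0)
    out = [sum(m[row][col] * v[col] for col in range(4)) for row in range(4)]
    w = out[3]
    return (out[0] / w, out[1] / w, out[2] / w)

# A matrix that scales by 2 and translates by (10, 0, 0) -- the kind of
# combined position/scale update described for step 221:
m = [
    [2.0, 0.0, 0.0, 10.0],
    [0.0, 2.0, 0.0, 0.0],
    [0.0, 0.0, 2.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]
moved = apply_transform(m, (1.0, 1.0, 1.0))
```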
  • In step 223, the CDWM receives an update request involving a change to the pixel data of the diffuse content texture, i.e., the application has updated its content within its window.
  • In step 225, the CDWM services the request by queuing the new pixel information to the UCE for asynchronous processing.
  • a change to the window icon or caption text may also necessitate a redraw of the CDWM-managed icon or caption content object, respectively, associated with the window.
  • Window input focus may be reflected in the appearance of the frame, and thus in the case of a legacy window, the legacy window manager may deliver an input focus change update to the CDWM, which re-renders the frame and possibly other content accordingly.
  • In step 227, the UCE processes incoming composition and resource updates from the CDWM, and at intervals synchronized with the video refresh rates of each active video graphics adapter participating in the composition of the desktop, re-renders the desktop (or the appropriate portion thereof in a multiple-display configuration) to a display-sized backing buffer. This is accomplished using the immediate-mode rendering services provided by a 3D graphics engine (such as Microsoft Direct3D®), which in turn transfers the desktop to a primary display surface.
  • the CDWM may define the window anatomy using various components, including a base content object and one or more child content objects.
  • the base content object defines the window frame, or border, and consists of a base geometry, base extent, base material properties and base content margins.
  • the base and child content objects may each be entirely defined and managed by the system or in the case of custom content elements, may be managed by the application. Content objects are discussed in more detail below.
  • Figure 3 illustrates an application window according to an illustrative aspect of the invention.
  • Application window 301 may include various regions and components.
  • the frame or base content 303 of the window 301 may host child content including buttons 305 (e.g., used to restore, maximize, minimize, close the window, etc.), an indicative icon 307, scrollbars 309, menu bar 311, and window caption text 313.
  • a primary content object area 315 may be derived from the redirection buffer obtained from the Legacy Window and Graphical User Interface Subsystem, or be created and attached to the standard base content and rendered by a composition-aware owning application.
  • Figure 3 is merely illustrative of basic window elements, and that additional or different window elements may additionally or alternatively be used.
  • window frame elements may alternatively be provided by an application, e.g., to provide a distinct look and feel to an application program.
  • an application program provides the scroll bar elements as custom child content objects so that they manifest an appearance and behavior peculiar to the application program.
  • an application may elect to remove or reposition one or more of the stock frame elements using the CDWM API.
  • An application need not be limited to a single primary content area, a restriction prevalent in the prior art.
  • the CDWM may support multiple application-created and rendered content areas associated with a single window.
  • the CDWM provides flexibility in the manner in which a window may be drawn. That is, the CDWM may allow an application to alter the default anatomy of a window by allowing applications to define multiple custom content objects, each having an arbitrary shape, instead of limiting each application to a single, rectangular client content area.
  • each CDWM window may be comprised of a base content object (i.e., the frame) and a collection of one or more child content objects.
  • Each content object may be defined by a unique set of content attributes, and can be configured to optionally receive keyboard and mouse events.
  • the CDWM maps mouse hit-test points relative to application-defined, content-local, 3D coordinates, and delivers mouse event notifications to the application.
  • Content objects may be managed entirely by the system, or in the case of custom content elements, may be managed by the application. Examples of system-managed content objects include the application indicative icon, frame buttons (e.g., minimize, restore, close), caption text, and certain menu bars and scroll bars.
  • Application-managed content objects include those content object(s) to which the application renders its primary visual output, e.g., text by a word processor, a numeric grid by a spreadsheet application, or images by a photo editing application.
  • the content texture may be a bitmap managed by the system, or in the case of custom content, the application.
  • the content texture may be mapped linearly to the content geometry in a single repeat.
  • the aspect ratio may be determined by the content geometry, and texture coordinates may be exposed in the content geometry.
  • Magnification of content may be controlled with a scaling transform that affects the mapping of the content texture to its geometry.
  • the CDWM may provide a default interactive mechanism by which the user can adjust the zoom factor, such as a system-provided menu option, slider control, and/or mouse and keyboard combinations.
  • a content surface whose diffuse texture is in a format supporting per-pixel alpha may be initialized by the system to zero alpha at the discretion of the application (or the system in the case of a stock content object). Therefore the underlying base content object may be displayed in unpainted areas of the content surface. This enhances both the programming model and user experience because applications are not required to erase the content surface before rendering, and the user is spared flicker and stale or unpainted areas in the window.
  • certain content objects may have no material properties associated with them because it would be undesirable to have the content interact with light or the environment in a manner distracting to the user or otherwise interfering with the user's activities.
  • the visual appearance of a content object may be determined solely by its texture, geometry and perhaps the per-vertex or per-pixel alpha value in such embodiments.
  • Figure 6 illustrates an example of a window 601 with a dynamic non-standard anatomy as described herein.
  • Window 601 has a base frame object 603 of a non-standard shape (i.e., non-rectangular), frame button objects 605 of non-standard shape (not rectangular) positioned in a non-standard location (other than the top right corner of the window), system-provided indicative frame icon object 607 in a non-standard position (other than the top left corner of the window), and frame window caption object 613 also in a non- standard position (not left justified in the top of the frame).
  • the application associated with the window has defined two primary content object areas 615a and 615b.
  • Primary content object area 615a is of regular (i.e., rectangular) shape, whereas primary content object area 615b is of an irregular, non-rectangular shape.
  • Window 601 may also include application-defined frame button objects 617 and 619 providing back and forward navigation control, respectively, e.g., in a browsing context.
  • the CDWM may render the base portion of the application window 301 as a three- dimensional (3D) object.
  • a 3D mesh primitive may be used to define the window object's shape (base geometry), a primary diffuse texture may be mapped to the 3D geometry of the mesh, and optional material properties which may include lighting, shading, refraction, blur and other special effect parameters and resources, including ancillary textures, applied during the rendering process.
  • Ancillary textures may be used as resources for graphical effects well known in the art in order to provide "live,” physically modeled interaction with light sources, cursors, and other UI objects in the desktop environment.
  • textures may serve as the source of per-pixel 3D normal information (normal/bump mapping), light masks (ambient, diffuse and specular light filters), reflection sources (e.g. reflection of the cursor when hovered over the window), static environment maps, and the like.
  • the vertex format of the base geometry may optionally include a 32-bit diffuse color component in ARGB format and texture coordinate pairs {tu_n, tv_n} for mapping up to n textures to the mesh geometry, as described above.
  • each integer increment of tu and tv may define a repeat of the texture in the respective dimension. For example, values ranging from {0.0, 0.0} (texture left, top) to {1.0, 1.0} (texture right, bottom) represent a single repeat over the entire mesh, whereas {0.0, 0.0} to {6.0, 4.0} define six repetitions in the x-dimension and four repetitions in the y-dimension.
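The repeat convention can be illustrated with a small helper that maps a normalized mesh position (0..1 along each axis) to texture coordinates; the function name and signature are illustrative, not part of the described API:

```python
def texcoord(u_norm, v_norm, tu_range=(0.0, 1.0), tv_range=(0.0, 1.0)):
    """Map a normalized mesh position to texture coordinates. The integer
    span of each range is the number of times the texture repeats along
    that dimension of the mesh."""
    tu = tu_range[0] + u_norm * (tu_range[1] - tu_range[0])
    tv = tv_range[0] + v_norm * (tv_range[1] - tv_range[0])
    return tu, tv

# {0.0, 0.0}..{1.0, 1.0}: a single repeat over the entire mesh.
single = texcoord(0.5, 0.5)
# {0.0, 0.0}..{6.0, 4.0}: six repeats in x, four repeats in y, so the
# mesh midpoint lands three repeats across and two repeats down.
tiled = texcoord(0.5, 0.5, (0.0, 6.0), (0.0, 4.0))
```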
  • a content extent may be defined as a pair of three-dimensional points defining a bounding extent {x_left, y_top, z_front, x_right, y_bottom, z_back}, or the coordinates of the smallest box that contains the base geometry. This is analogous to the 2D bounding window rectangle {x_left, y_top, x_right, y_bottom}. The triplet {x_right - x_left, y_bottom - y_top, z_back - z_front} defines the width, height and depth of the content's extent. The extent is calculated and managed by the system and represents the size and local position of the content.
  • If the window object is resizable, manipulating the base content's extent is the means by which the CDWM may resize the window.
  • the position of each vertex in a resizable mesh might not simply be scaled to the new extent.
  • a predefined vertex position filter function along with applicable parameters may be specified by the application at window creation time, or selected by the CDWM as a default.
  • the role of the vertex resizing filter function is to determine how each vertex in the target mesh behaves when its bounding extent is altered. Every filter function should determine for every member vertex the displacement direction and magnitude in each dimension (x, y, z).
  • the simplest filter function determines the direction (positive or negative) and the magnitude (scaled relative to the new extent, or offset by an amount equal to that of one of the six faces of the mesh's bounding box in a 3D space). How each vertex behaves in a resizing operation can be described on a per-vertex, per-dimension basis as a property associated with the vertex itself, or can be defined for the mesh as a whole in geometric terms.
  • An example of the latter method is a pair of vectors {mx_left, my_top, mz_front} and {mx_right, my_bottom, mz_back} defining six sizing margin planes, each associated with a face of the mesh bounding box, effectively dividing the volume of the bounding box into 27 cubic subregions.
  • the sizing margin values may remain constant regardless of the size of the mesh, or may be calculated based on the initial size of the bounding box.
  • vertices occurring in the upper, left, front cubic subregion are offset by the same magnitude and direction as the upper-left-front corner of the bounding extent.
  • Vertices occurring in the centermost cubic subregion are scaled relative to the new extent of that subregion.
  • Vertices occurring in the front, center cubic subregion are scaled relative to the new extent of that subregion in the x and y dimension, but are displaced by the same magnitude and in the same direction as the mesh's front Z bounding plane.
  • Figure 7 illustrates an example of a mesh resize operation in a 2-dimensional space.
  • a window 701 has rounded corners with a corner radius 707. If a window resize operation merely scales the mesh on which the window is based, the corner radius would scale with the mesh. However, if the corner radius is scaled, the radius of the rounded corners may become too large or small and detract from the user experience and the usability of the user interface. Thus, as the window 701 is resized, the corner radius preferably does not change.
  • the mesh may be divided into three segments per dimension (x, y, z as applicable).
  • the window is divided into 9 quadrants 703a-i.
  • the window may be divided into 27 regions.
  • Each dimension may be divided equally or unequally, thus allowing for equally or unequally sized regions.
  • regions bounded by the bounding box may be made as small as necessary to encompass material that should not be scaled.
  • quadrants are offset in each dimension in which the quadrant is bounded by the bounding box, and scaled in each dimension in which the quadrant is bounded by a region divider 705a-d.
  • regions 703a, 703c, 703g, and 703i are bounded by the bounding box on at least one side in both the X and Y dimensions, so mesh vertices in regions 703a, 703c, 703g, and 703i retain the same offset from the bounding box as the window is resized.
  • Regions 703b and 703h are bounded by the bounding box on at least one side in the Y (vertical) dimension, but bounded only by region dividers 705 in the X (horizontal) dimension. Thus, mesh vertices in regions 703b and 703h will retain their offsets in the Y dimension, but be scaled in the X dimension.
  • Regions 703d and 703f are bounded by the bounding box on at least one side in the X (horizontal) dimension, but bounded only by region dividers 705 in the Y (vertical) dimension. Thus, mesh vertices in regions 703d and 703f will retain their offsets in the X dimension, but be scaled in the Y dimension.
  • Region 703e is bounded by dividing lines 705 in both the X and Y dimensions, so mesh vertices falling within region 703e will be scaled in both the X and Y dimensions.
  • One of skill in the art will recognize the extension of this algorithm to 3 dimensions by including a Z dimension as described in the preceding paragraphs.
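The offset-versus-scale behavior above can be sketched per axis in 2D (the 3D case simply adds a Z axis). The names, and the simplification that both faces of an axis share one margin value, are assumptions for illustration:

```python
def resize_axis(coord, old_min, old_max, new_min, new_max, margin_lo, margin_hi):
    """Reposition one vertex coordinate along one axis of a nine-region
    resize. margin_lo/margin_hi are the distances from the bounding-box
    faces (e.g. a corner radius) that must not be scaled."""
    if coord <= old_min + margin_lo:
        # Bounded by the low edge of the box: keep the offset from that edge.
        return new_min + (coord - old_min)
    if coord >= old_max - margin_hi:
        # Bounded by the high edge: keep the offset from that edge.
        return new_max - (old_max - coord)
    # Middle band: scale relative to the new extent of the band.
    old_band = (old_max - margin_hi) - (old_min + margin_lo)
    new_band = (new_max - margin_hi) - (new_min + margin_lo)
    t = (coord - (old_min + margin_lo)) / old_band
    return (new_min + margin_lo) + t * new_band

def resize_vertex_2d(x, y, old_box, new_box, margins):
    """old_box/new_box: (x0, y0, x1, y1); margins: (mx, my)."""
    (ox0, oy0, ox1, oy1), (nx0, ny0, nx1, ny1) = old_box, new_box
    mx, my = margins
    return (resize_axis(x, ox0, ox1, nx0, nx1, mx, mx),
            resize_axis(y, oy0, oy1, ny0, ny1, my, my))

# A 100x100 window with a 10-unit corner radius, widened to 200x100:
# a vertex on a corner arc keeps its offset from the right edge, while a
# vertex in the center band is scaled across the new width.
corner = resize_vertex_2d(95, 95, (0, 0, 100, 100), (0, 0, 200, 100), (10, 10))
center = resize_vertex_2d(50, 50, (0, 0, 100, 100), (0, 0, 200, 100), (10, 10))
```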
  • Another variation of a mesh resizing filter function may interpret hand-authored vertex metadata rather than rely on a global geometric construct such as sizing margins to determine whether the vertex position scales or offsets in any direction. Such a function might be used to preserve complex surface topology such as ridges and troughs during a mesh resize.
  • Another variation of a mesh resizing filter function may allow vertices to be displaced in each dimension in a linear or nonlinear manner, with discrimination bits and function coefficients stored as per-vertex metadata. Such a function enables effects such as linear or non-linear, localized or generalized bulging or collapsing concomitant with mesh resize.
  • the base content margins define the boundaries to which child content is constrained.
  • Content margins may be three-dimensional boundaries defined in the same manner as sizing margins. However, unlike sizing margins, content margins may scale linearly with window scale, and might not influence mesh resizing.
  • Local and desktop-global resources and parameters as specified according to the values of abstract material properties, in combination with pixel shaders, comprise the data and mechanism by which the CDWM may implement the rendering of physically modeled desktop content.
  • High-level content material properties define the manner in which the content interacts with light and the surrounding environment.
  • the rendering of complex materials such as frosted glass may use techniques not natively supported in video hardware.
  • the CDWM implements the material properties using one of a small number of predefined pixel shaders.
  • a pixel shader is a small routine loaded into the display hardware that manipulates the values of pixels prior to display based on a pre-defined set of resources, including but not limited to light sources, textures, and vertices in a mesh primitive, as well as parameters such as transforms and metrics.
  • the CDWM may select from among its collection of predefined pixel shaders the appropriate shader to render a particular set of object material properties, which include ambient color (intensity and transparency), diffuse color (intensity and transparency), specular color (intensity and transparency), reflection scalar, refraction index, diffuse texture, and bump texture, each of which is described further below.
  • Desktop-global properties may be used to define global environment properties, such as eye position, global light source(s), environment maps, and the like. The resources and parameters that define these desktop-global properties may be forwarded together with the base window material properties to the 3D Graphics Interface as parameters to the active pixel shader immediately prior to rendering the window.
  • Ambient color simulates light hitting the surface of the object from all directions.
  • ambient intensity determines the relative amount of ambient light contacting the surface of the object, and a 32-bit ARGB value may be used to specify the ambient color and transparency.
  • ambient intensity may range from 0.0 (zero ambient light, giving a uniformly black appearance) to 1.0 (maximum intensity of the specified color distributed uniformly over the object). The effect of ambient intensity with a white ambient color allows control over the general brightness of the object.
  • Diffusion intensity determines the amount of directional light scattered in all directions after contacting the object's surface.
  • the light itself is provided by either one or more directional lights or the cubic light map.
  • diffuse color may be specified by a 32-bit ARGB value that dictates the color, where the alpha component dictates the transparency of the light reflected diffusely.
  • the diffusion intensity value ranges from 0.0 (no light is reflected diffusely, giving the object a uniformly black appearance) to 1.0 (all light is reflected diffusely, giving the object a shaded appearance according to the diffusion color value). Lit surfaces will appear more realistically modeled as the sum of the ambient and diffusion intensity values approaches 1.0.
  • specular intensity controls how much light is reflected off the object's surface directly back at the viewer, and specular color may be specified as an ARGB color of the object.
  • the light source itself may be in the form of either one or more directional lights or a cubic light map.
  • high specular intensity values may be used to model a shiny surface with sharp highlights, whereas low values may be used to model a matte surface with faint or absent highlights.
  • the alpha component of the color determines the transparency of the specular highlights.
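The ambient, diffuse and specular intensities described above combine in the familiar way during shading. A grayscale, single-light sketch — the shininess exponent and the clamp to 1.0 are common conventions, not values taken from the patent:

```python
def shade(ambient_i, diffuse_i, specular_i, n_dot_l, r_dot_v, shininess=32):
    """Hypothetical per-pixel intensity combining the ambient, diffuse and
    specular material properties (grayscale, one directional light).
    n_dot_l: cosine between surface normal and light direction.
    r_dot_v: cosine between reflected light and view direction."""
    ambient = ambient_i                                   # light from all directions
    diffuse = diffuse_i * max(n_dot_l, 0.0)               # scattered directional light
    specular = specular_i * (max(r_dot_v, 0.0) ** shininess)  # sharp highlight
    return min(ambient + diffuse + specular, 1.0)

# Surface facing the light, viewer away from the highlight: a matte look.
matte = shade(0.2, 0.7, 0.0, n_dot_l=1.0, r_dot_v=1.0)
# Same surface with a strong specular term aimed at the viewer: a shiny
# highlight, clamped at full intensity.
shiny = shade(0.2, 0.7, 1.0, n_dot_l=1.0, r_dot_v=1.0)
```

As the text notes, the surface reads as well-modeled when ambient plus diffuse intensity approaches 1.0 (here 0.2 + 0.7 = 0.9).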
  • Reflectivity, like specularity, determines the amount of light that is reflected directly back at the viewer from the surface of the object. Reflection differs from specularity in that reflection applies to the entire environment, not just the light source.
  • a reflectivity value of 0.0 produces no reflection of the environment in the surface
  • a value of 1.0 produces mirror-like reflection of the environment in the surface.
  • the environment may be modeled using a combination of the cubic environment map and the mouse cursor.
  • the mouse cursor as well as static features of the environment may be reflected from the window surface to a degree controlled by the reflection intensity scalar.
  • the refraction index of each object determines the angle of transmission of light traveling through it.
  • Table 1 The refraction indices of various media which may be simulated are shown below in Table 1.
  • the angle of refraction may then be used to select the proper pixel from the background to render on the visible surface of the object following further processing associated with other material properties. Optimizations for the purpose of real-time rendering of refraction may incorporate the Fresnel technique, a method appreciated by those of skill in the art.
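The transmission angle follows Snell's law, n1 * sin(theta1) = n2 * sin(theta2). Since Table 1 is not reproduced here, the sketch below uses common textbook indices for air and glass as stand-ins, not values from the table:

```python
import math

def refraction_angle(theta_incident_deg, n_from=1.0003, n_into=1.52):
    """Snell's law. Defaults sketch light passing from air (n ~ 1.0003)
    into glass (n ~ 1.52); both are typical textbook values. Returns the
    transmission angle in degrees, or None on total internal reflection."""
    s = (n_from / n_into) * math.sin(math.radians(theta_incident_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.degrees(math.asin(s))

# Light entering glass bends toward the normal (~19.2 degrees for 30 in);
# the bent ray determines which background pixel shows through the surface.
into_glass = refraction_angle(30.0)
# Going the other way at a steep angle, the ray cannot escape the glass.
trapped = refraction_angle(80.0, n_from=1.52, n_into=1.0003)
```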
  • Visual styles may be used to define CDWM visual and behavioral policy. Visual styles generally refer to user-selectable themes that specify elaborate, hand-designed graphics and behavioral attributes applied to common user interface elements. Applications may optionally override some of these attributes, whereas others are selectively enforced by the system in the interest of consistency in the user interface.
  • Visual attributes include the appearance of common window content such as the frame area (base content), non-client buttons, and other application-independent elements.
  • Behavioral attributes include window and desktop transition animations, the manner in which a window is interactively moved or resized with the mouse (e.g., snap, glue and stretch and constraint), and other application- independent behaviors.
  • Visual and behavioral policy may be centralized in the CDWM rather than having that policy distributed throughout the software rendering pipeline, thus providing a more consistent end-user experience and a simpler development environment.
  • the default (or custom) texture of a visual style may comprise an alpha level and/or a bitmap based on which each pixel is modified.
  • an alpha level may be used to modify a transparency level, as is known in the art.
  • the texture may comprise a bitmap with which the client and/or non-client area, or a portion of the client and/or non-client area, may be pixel shaded.
  • the bitmap may give the appearance of frosted glass.
  • Figure 5 illustrates a window 501 rendered with a frosted glass frame 503, where the refraction index may be specified to simulate glass when determining which pixel from the content behind window frame 503 should appear.
  • the CDWM can compose a window 501 with a frame 503 having a frosted glass appearance that reflects light from an optionally specified virtual light source within the 3D desktop environment, yet has an opaque client content area so that visual acuity of the client content is not diminished.
  • Desktop rendering models (invalidation versus compositing) each have a unique schema for interacting with application programs so that the application program's window(s) are maintained properly on the desktop. For example, in an invalidation model, the desktop rendering is dependent on the management and continuous updating of window "clipping regions." Clipping is the process by which rendering is limited to an appropriate area of a window. When one window is partially obscured by another, its clipping region corresponds to the inverse of the obscured area.
  • the invalidation model DWM ensures that its clipping region is applied to the output, thus ensuring that no painting will take place in the overlying window(s). If the overlying window is moved, or the underlying window is brought to the top of the Z-order, the clipping region of the underlying window is adjusted by the DWM accordingly before it sends the window a paint message to update any newly exposed content.
  • Invalidation model DWMs and compositing model DWMs thus rely on different information to draw the desktop.
  • In the invalidation model, because the DWM does not store a copy of the entire surface of each window on the desktop, the DWM must communicate with an application to refresh content during resizing and redraws. Likewise, the application expects not to need to refresh its content unless asked to do so by the DWM (unless, of course, the content is updated as a result of user input). If the application does need to independently update its own content, it asks the DWM to invalidate a portion of its own window, expecting to receive from the DWM a paint request corresponding to the invalid region.
  • the CDWM need not send the window paint messages on events such as those described above. This in turn obviates the invalidation step; the application need simply redraw all or a portion of itself as internal events dictate.
  • each DWM and/or CDWM has a unique set of APIs through which application programs expect to communicate with the DWM to ensure that the window content is kept current.
  • an application originally programmed for use with an invalidation model DWM, i.e., one that relies on paint messages to render its content, will not necessarily work with a compositing model CDWM.
  • the CDWM may provide support for applications originally developed for use in an invalidation model DWM. These applications may be referred to herein as legacy applications, and the backwards-compatible support may be referred to herein as legacy support.
  • Legacy APIs refer to APIs for use with a prior version of the operating system that used an invalidation model DWM with which the legacy application is compatible.
  • the legacy APIs 192b (Fig. 1B) allow the application to communicate with the invalidation model DWM (legacy DWM) 192a.
  • the legacy DWM may use a separate legacy API element to process various legacy notifications on behalf of the application to the CDWM, to transfer relevant state information to the CDWM, and to translate between legacy and CDWM coordinate spaces for input and focus determinations.
  • the legacy DWM may be modified to redirect data to the CDWM, as described below.
  • Figure 4 illustrates a portion of a window compositing method according to an illustrative aspect of the invention.
  • Steps 401-409 represent the initial rendering of content associated with a legacy application window whose source rendering surface (or set of instructions required to generate the surface) is obtained from the Legacy Window Manager 192a (Fig. 1B).
  • Steps 411-419 illustrate rendering of window content created by a composition-aware application program.
  • the CDWM receives an initial update notification for the primary window content from the legacy window manager.
  • This occurs as a result of a legacy application calling legacy APIs 192b to draw a window on the desktop according to the invalidation model for which the application was designed.
  • Microsoft® Word® XP may call the legacy APIs so that the legacy DWM 192a draws text input by the user.
  • the CDWM retrieves the content's default mesh from the theme manager.
  • the CDWM retrieves (or generates) the redirection surface from the Legacy Window Manager. This surface may be used as the content's diffuse texture.
  • the CDWM ensures that only the desired areas of the legacy texture are retained, so that those containing the legacy window frame, border and/or caption or not rendered.
  • One manner in which this can be accomptished expediently is by transforming the texture mapping coordinates of the mesh such that only the desired area is mapped to the mesh's x and y bounding extents.
  • the CDWM retrieves default material properties for the content. The resources and parameters required to render the legacy content have now been collected.
  • the CDWM receives information from an application program requiring the rendering of a content object associated with a window.
  • the content may optionally be accompanied by a custom mesh, custom texture, and/or custom material properties.
  • a custom mesh may be provided alone when the application program desires to have a non-standard shape for an existing content object. If the content object in question is the window base content, the custom mesh will redefine the shape of the window.
  • a custom texture and/or custom material properties may be provided alone when the application program desires to impart a non-standard appearance (i.e., other than that specified by the active theme) to a system-defined content object. If the content object in question is the window base content, the custom texture and/or material properties redefine the appearance of the window without modifying its shape. More commonly, the application creates a content object from scratch and specifies its mesh (which may be selected from a set of predefined system meshes), texture and material properties (which may be selected from a set of predefined system material properties) at creation time.
  • In step 413 the CDWM determines whether a custom content mesh was specified and, if not, retrieves a default mesh from the theme manager (step 403).
  • In step 415 the CDWM determines whether a custom texture was specified and, if not, retrieves a default texture from the theme manager.
  • In step 417 the CDWM determines whether custom material properties were specified by the application and, if not, retrieves a default set of material properties from the theme manager. The resources and parameters required to render the custom content have now been collected.
  • In step 419 the CDWM assembles a rendering instruction block via the UCE Programming Interface to render the content with references to the appropriate mesh, texture and material properties.
  • the rendering instruction block is queued for execution by the UCE.
  • the instruction block is executed by the UCE Rendering Model on expiration of the pending refresh interval of the target device(s).
  • the operating system in which the CDWM and legacy DWM are integrated inherently has the capability to render the desktop using the invalidation DWM (legacy DWM 192a) or the compositing DWM (CDWM 190). That is, the invalidation model DWM is supported by the operating system in addition to the composition model in order to provide legacy support.
  • the CDWM and/or the operating system may allow a user to select whether the compositing or legacy drawing mode should be used. The selection may be made automatically or manually.
  • the selection may be made via manual user control, in accordance with the drawing mode defined by an activated visual style (theme) selected by the user.
  • the selection may alternatively or also be based on power-conservation conditions. For example, when a portable device is disconnected from an AC power source and switches to battery power, the operating system may enforce legacy drawing mode because the video graphics processing unit (GPU) is less active and thus consumes less power.
  • GPU: video graphics processing unit
  • an operating system may provide a physically modeled graphical user interface that uses advanced 3D graphics capabilities.
  • Window frames may take on not only the appearance, but also the characteristics, of frosted glass or some other complex material that provides a simulated surface appearance having at least some transparency combined with at least some distortion of the content visible as a result of the transparency, resulting in a translucent effect.
  • the window frame also behaves like frosted glass in that it reflects content in the GUI environment, includes specular highlights indicative of virtual light sources, simulates an index of refraction similar to glass such that content behind the "frosted glass" border is slightly offset accordingly, and a bitmap may be applied via one or more pixel shaders to provide distortion of underlying content.
  • Frosted glass or other glass-like physically modeled objects provide many advantages to a user of the graphical user interface. For example, the aesthetic appearance of the glass enhances the GUI and provides a rich user experience that makes a GUI of the present invention more desirable to end-users than GUIs of other operating systems by providing a light, open feeling to the user. At the same time, physically modeled frosted glass having true or near true characteristics of glass also provides functional advantages.
  • the frosted glass appearance aids the user to understand window layer order in a multi-window environment. Shading, reflection, and specular highlights create a stronger sense of depth and layers on the desktop, and help a user determine the Z order of various windows on the desktop. While some known systems apply uniform transparency to an entire window, a user might not readily perceive what content is within the window of interest and what content is behind the window. By varying this uniform, per-pixel transparency as a function of Z-order, it is possible to ameliorate the problem, but in an unnatural, non-intuitive manner.
  • the present invention models the light-scattering behavior arising from the material imperfections in real-world frosted glass. It is this physically modeled distortion of the background that allows the user to immediately distinguish between background and foreground content. And because the effect is cumulative, overlapping frosted glass window frames become progressively more distorted from foreground to background. Thus, the user can intuitively differentiate background content underlying multiple layers of frosted glass window frames.
  • Frosted glass also allows the use of thicker borders to ease user interaction with the GUI, for example, by making it easier for a user to grab a window border with a mouse to move or resize the window, yet not obscure content beneath the window (because the glass is transparent or translucent).
  • Various frosted glass effects may be used to make it easier for a user to tell the difference between active and inactive window states.
  • frosted glass makes it easier for a user to read and/or view content on the screen, because the user can view more screen area at any given time (because the frosted glass is translucent or transparent), and the desktop appears less cluttered because glass is a non-obtrusive element on the display screen.
  • the frosted glass appearance may be easily varied by applying a different bitmap and/or a different pixel shader to the rendering of the appearance.
  • environment variables (e.g., differing light source(s), which affect reflection and specular highlights) and virtual physical properties of the glass (e.g., index of refraction, reflection, etc.) will affect the appearance of the frosted glass as well.
  • the present invention may also be used to simulate other textures and compounds, e.g., metals, plastics, paper, cotton, and other natural and synthetic materials.
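The per-content resolution described in steps 413-417 above can be sketched as a simple fallback: for each of the mesh, texture, and material properties, use the application-supplied custom value if present, else the active theme's default. This is an illustrative sketch only; the class and function names are hypothetical, not from the patent.

```python
# Hypothetical sketch of the CDWM resource-resolution flow (steps 413-417):
# each rendering resource falls back to the active theme's default when the
# application supplies no custom value.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ThemeManager:
    # Placeholder theme defaults; a real theme manager would hold real resources.
    default_mesh: str = "theme-mesh"
    default_texture: str = "theme-texture"
    default_material: str = "theme-material"


def resolve_content_resources(theme: ThemeManager,
                              custom_mesh: Optional[str] = None,
                              custom_texture: Optional[str] = None,
                              custom_material: Optional[str] = None) -> dict:
    """Collect the mesh, texture, and material used to render one content object."""
    return {
        "mesh": custom_mesh if custom_mesh is not None else theme.default_mesh,
        "texture": custom_texture if custom_texture is not None else theme.default_texture,
        "material": custom_material if custom_material is not None else theme.default_material,
    }
```

For example, a composition-aware application supplying only a custom mesh would receive that mesh together with the theme's default texture and material, mirroring the branch structure of steps 413, 415 and 417.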


Abstract

A method and system for rendering a desktop on a computer using a composited desktop model operating system are disclosed. A composited desktop window manager (190), upon receiving base object and content object information for one or more content objects from an application program, draws the window (301) to a buffer memory, and takes advantage of advanced graphics hardware and visual effects to render windows based on content on which they are drawn. The frame portion (303) of each window (301) may be generated by pixel shading a bitmap having the appearance of frosted glass based on the content of the desktop on top of which the frame is displayed. Legacy support is provided so that the operating system can draw and render windows generated by legacy applications to look consistent with non-legacy application windows.

Description

DYNAMIC WINDOW ANATOMY
FIELD OF THE INVENTION
[01] The invention relates generally to a graphical user interface of a computer operating system. More specifically, the invention provides a mechanism that allows for the possibility of having, on a window-by-window basis, multiple and/or irregularly-shaped client and non-client content areas in each window.
BACKGROUND OF THE INVENTION
[02] Computer operating systems typically have a shell that provides a graphical user interface (GUI) to an end-user. The shell consists of one or a combination of software components that provide direct communication between the user and the operating system. The graphical user interface typically provides a graphical icon-oriented and/or menu driven environment for the user to interact with the operating system, and is often based on a desktop metaphor. More specifically, the graphical user interface is designed to model the real world activity of working at a desk. The desktop environment typically occupies the entire surface of a single display device, or may span multiple display devices, and hosts subordinate user interface objects such as icons, menus, cursors and windows.
[03] Among the types of rendered objects hosted by the desktop environment are visually delineated areas of the screen known as windows. A window is typically dedicated to a unique user activity, and is created and managed by either a third party software application or a system application. Each window behaves and displays its content independently as if it were a virtual display device under control of its particular application program. Windows can typically be interactively resized, moved around the display, and arranged in stacked order so as to fully or partially overlap one another. In some windowing environments, a window can assume discrete visual or behavioral states, such as minimized in size to an icon or maximized in size to occupy the entire display surface. The collection of desktop windows is commonly assigned a top to bottom order in which they are displayed, known in the art as the Z-order, whereby any window overlies all other windows lower than itself with respect to Z-order occupying the same projected position on the screen. A single, selected window has the "focus" at any given time, and is receptive to the user's input. The user can direct input focus to another window by clicking the window with a mouse or other pointer device, or by employing a system-defined keyboard shortcut or key combination. This allows the user to work efficiently with multiple application programs, files and documents in a manner similar to the real world scenario of managing paper documents and other items which can be arbitrarily stacked or arranged on a physical desktop.
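The Z-order and focus behavior described above can be sketched in a few lines: windows are kept in a top-to-bottom list, the topmost window containing a clicked point receives focus, and a clicked window is raised to the front. A minimal illustration only; all names are hypothetical.

```python
# Minimal sketch of desktop Z-order bookkeeping: z_order[0] is the topmost
# window, and clicking a window raises it and directs input focus to it.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Window:
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


def window_at(z_order: List[Window], px: int, py: int) -> Optional[Window]:
    """Return the topmost window under the point, scanning top to bottom."""
    for win in z_order:
        if win.contains(px, py):
            return win
    return None  # click landed on the desktop background


def bring_to_front(z_order: List[Window], win: Window) -> None:
    """Raise a clicked window so it overlies all others in the Z-order."""
    z_order.remove(win)
    z_order.insert(0, win)
```

Where two windows occupy the same projected position, the scan order guarantees the one higher in the Z-order wins, matching the overlap rule stated in the paragraph above.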
[04] A drawback to many prior graphical user interface desktop implementations is their limited capacity to present visually rich content or exploit enhancements in graphical rendering technology. Such enhancements include real-time rendering of physically modeled (lit, shaded, textured, transparent, reflecting, and refracting) two and three-dimensional content and smooth, high-performance animations. In contrast to the limited services available for utilizing graphical rendering enhancements on the desktop, visually rich content is possible within certain application programs running windowed or full screen within the graphical user interfaces of Windows® brand operating systems and like operating system shells. The types of application programs that present such content are video games with real time 3D animation and effects, advanced graphical authoring tools such as ray tracers and advanced 2D and 3D publishing applications. Since the visual output of these programs is either restricted to the content area of its application window(s) or rendered full-screen to the exclusion of other windows and the desktop itself, the rich graphical output of the application program in no way contributes to the presentation of the desktop environment.
[05] Computer operating systems employ a software layer responsible for managing user interface objects such as icons, menus, cursors, windows and desktops; arbitrating events from input devices such as the mouse and keyboard; and providing user interface services to software applications. This software layer may be referred to as the Desktop Window Manager (DWM). The rendering logic, input event routing, and application programming interfaces (APIs) of the Desktop Window Manager (DWM) collectively embody user interface policy, which in turn defines the overall user experience of the operating system. A primary reason for the lack of rich, visual desktops up to the present has been the methods with which DWMs manage and render the desktop. Prior DWM implementations employ an "invalidation" model for rendering the desktop that evolved primarily from the need to conserve video and system memory resources as well as CPU and GPU bandwidth.
[06] In the invalidation model, when a window is resized or moved, or when an application wishes to redraw all or part of a window, the affected portion of the display is "invalidated". The DWM internally invalidates areas affected by a window size or move, whereas an application attempting to redraw all or a portion of its own window instructs the operating system, via an API, to invalidate the specified area of its window. In either case, the DWM processes the invalidation request by determining the subset of the requested region that is in actual need of an on-screen update. The DWM typically accomplishes this by consulting a maintained list of intersecting regions associated with the target window, other windows overlying the target, clipping regions associated with the affected windows, and the visible boundaries of the display. The DWM subsequently sends each affected application a paint message specifying the region in need of an update in a prescribed top-to-bottom order. Applications can choose to either honor or ignore the specified region. Any painting performed by an application outside the local update region is automatically clipped by the DWM using services provided by a lower level graphical rendering engine such as the Graphics Device Interface (GDI).
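The region arithmetic at the heart of the invalidation model can be sketched as: the on-screen update region is the invalidated region minus everything hidden beneath overlying windows. This deliberately simplified sketch models regions as sets of pixel coordinates rather than the rectangle/region lists a real DWM maintains; all names are illustrative, not from the patent.

```python
# Simplified sketch of invalidation-model region clipping: regions are sets of
# pixel coordinates, and overlying windows are subtracted from the invalidated
# area before any paint message would be sent.
def rect_region(x, y, w, h):
    """Region covered by an axis-aligned rectangle, as a set of pixels."""
    return {(i, j) for i in range(x, x + w) for j in range(y, y + h)}


def update_region(invalid, overlying_windows):
    """Subset of the invalidated region actually needing an on-screen update:
    the invalid region minus everything covered by windows above the target."""
    region = set(invalid)
    for win in overlying_windows:
        region -= win  # clip away pixels hidden by an overlying window
    return region
```

A real invalidation DWM would additionally intersect the result with the window's clipping region and the visible display bounds, as the paragraph above describes; the subtraction step shown here is the essential clipping idea.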
[07] An advantage of the invalidation-messaging model is conservation of display memory. That is, an invalidation based DWM only needs to maintain enough buffer memory to draw a single desktop, without "remembering" what might be underneath presently displayed content. However, because windows on the desktop are rendered in a top-down order, features such as non-rectangular windows and rich 2D animations via GDI require CPU intensive calculations involving complex regions and/or extensive sampling of the display surface (thereby limiting the potential for graphics hardware-based acceleration), whereas other features such as transparency, shadows, 3D graphics and advanced lighting effects are extremely difficult and very resource intensive.
[08] By way of example, the Microsoft Windows® XP window manager, historically known as USER, has served as the dominant component of the graphical user interface subsystem (now known as Win32) since the advent of the Windows® brand operating system. USER employs the 2-dimensional Graphics Device Interface (GDI) graphic rendering engine to render the display. GDI is the other major subcomponent of Win32, and is based on rendering technology present in the original Windows® brand operating system. USER renders each window to the display using an invalidation-messaging model in concert with GDI clipping regions and 2D drawing primitives. A primary activity of USER in rendering the desktop involves the identification of regions of the display in need of visual update, and informing applications of the need and location to draw, as per the invalidation model of desktop rendering.
[09] The next development in desktop rendering is a bottom-to-top rendering approach referred to as desktop compositing. In a compositing DWM, or CDWM, the desktop is drawn from the bottom layer up to the top layer. That is, the desktop background is drawn first, followed by icons, folders, and content sitting directly on the desktop, followed by the folder(s) up one level, and so forth. By rendering the desktop from the bottom up, each iterative layer can base its content on the layer below it. However, desktop compositing is a memory intensive process because the CDWM maintains in memory a copy of each item drawn to the desktop. Prior to recent market changes and manufacturing techniques that have made advanced video hardware and computer memory far more affordable, only commercial, expensive, high-end computing systems have been able to implement compositing engines, such as for preparing special effects for movies.
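The bottom-to-top ordering that distinguishes compositing from the invalidation model can be sketched directly: paint the background first, then each window in reverse Z-order, so every layer can base its content on what is already beneath it. Names are hypothetical; this shows only the ordering, not actual rasterization.

```python
# Illustrative sketch of bottom-to-top desktop compositing: given a window
# list held top-to-bottom (as a Z-order usually is), the paint order is the
# background followed by the windows from bottom-most to top-most.
def composite_desktop(background, windows_top_to_bottom):
    """Return the paint order a compositing DWM would use."""
    paint_order = [background]
    # Reverse Z-order: the bottom-most window is drawn first, the topmost last,
    # so each layer can blend against the layers already composed below it.
    paint_order.extend(reversed(windows_top_to_bottom))
    return paint_order
```

The memory cost noted in the paragraph above follows from this ordering: to recompose any layer, the CDWM must retain a copy of every item it has drawn, rather than only the final merged desktop.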
[10] The evolution of mid- and lower-end computer video hardware has been driven in large part by the graphical services available in popular operating systems. However, the graphical services available in popular operating systems have not significantly advanced for a variety of reasons, including the need to maintain compatibility with older application software and the limited capabilities of the affordable range of video hardware. More recently, however, real-time 3D computer games have overtaken operating systems as the primary market incentive for evolving retail video hardware, which has in a short time attained an exceptional level of sophistication. Real time, hardware-based 3D acceleration is now available to consumers at reasonable cost. Thus, graphics hardware features once considered highly advanced, such as accelerated texture and lighting algorithms, 3D transformations and the ability to directly program the GPU are readily available. At present, generally only game software and highly specialized graphics applications actively exploit such features, and in order to do so they must bypass the legacy Win32 window manager (USER) and GDI.
[11] Another obstacle in implementing a compositing desktop model is that legacy applications written for use with an invalidation model DWM will not function properly in a compositing environment. This is because the core rendering logic of the legacy application is based on the operating system's invalidation-model DWM APIs. That is, rather than render window content in direct response to user interaction or changes in internal state, the legacy application will draw only upon receiving a paint message generated either by the operating system or its own invalidation request. The most difficult remedy consists of devising a means by which the compositing DWM surrogates the legacy GUI platform on behalf of the application. The simpler alternatives consist of excluding the application from the composited desktop environment (an approach known in the art as "sand boxing"), or simply abandoning legacy application compatibility altogether.
[12] Presently, UI elements residing in the non-client area cannot be modified by the application. In addition, there is no straightforward and robust means to query, modify, or selectively override the standard frame layout, or render non-client elements individually. For example, there is no practical way for an application to relocate the system icon or non-client buttons (e.g., restore, maximize, minimize, close) and add or track custom non-client buttons. In addition, applications are limited to placing application content in a single rectangular region (the client area) unless the application wants to assume rendering and hit-testing responsibility for the entire non-client area (as well as the client area) of the window. Loosening any of these restrictions within the existing USER framework would render unusable many popular legacy applications that anticipate and depend on them.
[13] Thus, it would be an advancement in the art to provide a rich, full featured operating system that renders a desktop using a compositing model, and to provide a desktop window manager that allows dynamic window architectures. It would be a further advancement in the art to provide a desktop window manager that allows dynamic window architectures yet supports legacy applications so that legacy applications also work in the dynamic architectural model.
BRIEF SUMMARY OF THE INVENTION
[14] The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to the more detailed description provided below.
[15] To overcome limitations in the prior art described above, and to overcome other limitations that will be apparent upon reading and understanding the present specification, the present invention is directed to a composited desktop providing advanced graphics and rendering capabilities.
[16] A first illustrative aspect of the invention provides a data processing system that draws windows with dynamic anatomies. The data processing system has a memory that stores window properties comprising, for each window for which properties are stored, properties for a base object and properties for at least one content object. The data processing system also has a compositing desktop window manager software module that composes a desktop based on the window properties of each window for which properties are stored.
[17] Another aspect of the invention provides a data structure for storing window information for windows having non-uniform, dynamic anatomies. The data structure includes a first data field storing base object properties for a base object of a window, and a second data field storing content object properties for one or more content objects of the window.
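The per-window data structure described in this aspect can be sketched as one base object plus a list of content objects, each carrying its own property set. The type and field names below are illustrative only; the patent does not prescribe them.

```python
# Sketch of the dynamic-anatomy window record: a base object's properties in
# one field, and the properties of one or more content objects in another.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class BaseObject:
    properties: Dict[str, object] = field(default_factory=dict)


@dataclass
class ContentObject:
    properties: Dict[str, object] = field(default_factory=dict)


@dataclass
class WindowRecord:
    base: BaseObject
    contents: List[ContentObject] = field(default_factory=list)
```

Because the content objects are a list rather than a single fixed client rectangle, a window described this way can carry multiple and/or irregularly-shaped client and non-client areas, which is the flexibility the summary above claims.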
BRIEF DESCRIPTION OF THE DRAWINGS
[18] A more complete understanding of the present invention and the advantages thereof may be acquired by referring to the following description in consideration of the accompanying drawings, in which like reference numbers indicate like features, and wherein:
[19] Figure 1A illustrates an operating environment that may be used for one or more aspects of an illustrative embodiment of the invention.
[20] Figure 1B illustrates the distribution of function and services among components in an illustrative embodiment of a composited desktop platform.
[21] Figure 2 illustrates a compositing method according to an illustrative aspect of the invention.
[22] Figure 3 illustrates a window according to an illustrative aspect of the invention.
[23] Figure 4 illustrates a portion of a window compositing method according to an illustrative aspect of the invention.
[24] Figure 5 illustrates a frosted glass framed window rendered according to an illustrative aspect of the invention.
[25] Figure 6 illustrates a window with a dynamic window anatomy.
[26] Figure 7 illustrates regions used during mesh resizing.
DETAILED DESCRIPTION OF THE INVENTION
[27] In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration various embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope and spirit of the present invention.
[28] The present invention provides a desktop window manager (DWM) that uses desktop compositing as its preferred rendering model. The inventive desktop window manager is referred to herein as a Compositing Desktop Window Manager (CDWM). The CDWM, together with the composition subsystem, referred to as the Unified Compositing Engine (UCE), provides 3D graphics and animation, shadows, transparency, advanced lighting techniques and other rich visual features on the desktop. The compositing rendering model used herein intrinsically eliminates the invalidation step in rendering and minimizes or eliminates the need to transmit paint and other notification messages because the system retains sufficient state information to render each window as required.
Illustrative Operating Environment
[29] Figure 1 illustrates an example of a suitable computing system environment 100 in which the invention may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
[30] The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers; server computers; portable and hand-held devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; distributed computing environments that include any of the above systems or devices; and the like.
[31] The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
[32] With reference to Figure 1, an illustrative system for implementing the invention includes a general purpose computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Advanced Graphics Port (AGP) bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
[33] Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
[34] The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during startup, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, Figure 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
[35] The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, Figure 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
[36] The drives and their associated computer storage media discussed above and illustrated in Figure 1, provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In Figure 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 184 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 183. Computer 110 may also include a digitizer 185 for use in conjunction with monitor 184 to allow a user to provide input using a stylus input device 186. In addition to the monitor, computers may also include other peripheral output devices such as speakers 189 and printer 188, which may be connected through an output peripheral interface 187.
[37] The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in Figure 1. The logical connections depicted in Figure 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
[38] When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, Figure 1 illustrates remote application programs 182 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
Illustrative Embodiments
[39] The invention may use a compositing desktop window manager (CDWM) to draw and maintain the desktop display using a composited desktop model, i.e., a bottom-to-top rendering methodology. The CDWM may maintain content in a buffer memory area for future reference. The CDWM composes the desktop by drawing the desktop from the bottom up, beginning with the desktop background and proceeding through overlapping windows in reverse Z order. While composing the desktop, the CDWM may draw each window based in part on the content on top of which the window is being drawn, and based in part on other environmental factors (e.g., light source, reflective properties, etc.). For example, the CDWM may use the alpha channel of an ARGB format texture to provide transparency to a window, and may selectively emphasize portions of window content (e.g., the frame) based on a virtual light source.
[40] The CDWM may reside as part of the operating system 134, 144, or may reside independently of the operating system, e.g., in other program modules 136, 146. In addition, the CDWM may rely upon a lower level graphics compositing subsystem, referred to herein as a Unified Compositing Engine (UCE), further described below and in co-pending application serial number (attorney docket number 50037.201US01), filed October 23, 2003, entitled "System and Method for a Unified Composition Engine in a Graphics Processing System", herein incorporated by reference in its entirety for all purposes. In one illustrative embodiment the UCE is based on or uses Direct3D® and DirectX® technology by Microsoft Corporation of Redmond, Washington. In alternative embodiments other graphics compositing subsystems may be used, such as variations of the X Window platform based on the OpenGL® graphics engine by Silicon Graphics, Inc. of Mountain View, California, and the like. The UCE enables 3D graphics and animation, transparency, shadows, lighting effects, bump mapping, environment mapping, and other rich visual features on the desktop.
[41] Figure 1B illustrates a component architecture according to an illustrative embodiment of a desktop composition platform. A Compositing Desktop Window Manager (CDWM) 190 may include an Application Programming Interface 190a through which a composition-aware Application Software 191 obtains CDWM window and content creation and management services; a Subsystem Programming Interface 190b, through which the Legacy Windowing Graphics Subsystem 192 sends update notifications for changes affecting the redirected graphics output of individual windows (window graphical output redirection is described in more detail below); and a UI Object Manager 190c which maintains a Z-ordered repository for desktop UI objects such as windows and their associated content. The UI Object Manager may communicate with a Theme Manager 193 to retrieve resources, object behavioral attributes, and rendering metrics associated with an active desktop theme.
[42] The Legacy Graphical User Interface Subsystem 192 may include a Legacy Window Manager 192a and Legacy Graphics Device Interface 192b. The Legacy Window Manager 192a provides invalidation-model windowing and desktop services for software applications developed prior to the advent of the CDWM. The Legacy Graphics Device Interface 192b provides 2D graphics services to both legacy applications as well as the Legacy Window Manager. The Legacy Graphics Device Interface, based on the invalidation model for rendering the desktop, may lack support for 3D, hardware-accelerated rendering primitives and transformations, and might not natively support per-pixel alpha channel transparency in bitmap copy and transfer operations. Together, the Legacy Window Manager 192a and Graphics Device Interface 192b continue to serve to decrease the cost of ownership for users who wish to upgrade their operating system without sacrificing the ability to run their favorite or critical software applications that use the invalidation model. In order to achieve seamless, side-by-side integration of legacy application windows with composition-aware application windows in a manner that imposes little or no discernible end-user penalty, there may be active participation of the Legacy Graphical User Interface Subsystem 192 in the compositing process. Indeed, the perceived platform environment for legacy applications preferably does not change in order to avoid compromising their robustness on the composited desktop, yet the manner in which legacy windows are rendered to the desktop is fundamentally altered. The invention describes how this is achieved through the addition of a feature described herein as window graphical output redirection.
[43] A Unified Compositing Engine (UCE) 194 may service rendering instructions and coalesce resources emitted from the CDWM via a Programming Interface 194a. In a broad sense, the role of the UCE relative to the CDWM is analogous to that of the Legacy Graphics Device Interface 192b relative to the Legacy Window Manager 192a. The UCE Programming Interface 194a provides the CDWM, and ultimately, applications, an abstract interface to a broad range of graphics services. Among these UCE services are resource management, encapsulation from multiple-display scenarios, and remote desktop support.
[44] Graphics resource contention between CDWM write operations and rendering operations may be arbitrated by an internal Resource Manager 194b. Requests for resource updates and rendering services are placed on the UCE's Request Queue 194c by the Programming Interface subcomponent 194a. These requests may be processed asynchronously by the Rendering Module 194d at intervals coinciding with the refresh rate of the display devices installed on the system. Thus, the Rendering Module 194d of the UCE 194 may dequeue CDWM requests, access and manipulate resources stored in the Resource Manager 194b as necessary, and assemble and deliver display-specific rendering instructions to the 3D Graphics Interface 195.
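The queue-then-drain pattern described above can be sketched minimally as follows. This is an illustrative sketch only; the class and method names (UCEQueue, process_frame) are hypothetical and not part of the patent's described interfaces.

```python
# Hypothetical sketch of the UCE request queue: update requests accumulate
# on a queue and are drained asynchronously, once per display refresh.
from collections import deque

class UCEQueue:
    def __init__(self):
        self._requests = deque()

    def enqueue(self, request):
        """Called by the programming interface (194a) for each CDWM request."""
        self._requests.append(request)

    def process_frame(self):
        """Called once per vertical refresh; drains all pending requests."""
        processed = []
        while self._requests:
            processed.append(self._requests.popleft())
        return processed

queue = UCEQueue()
queue.enqueue(("update-transform", "window-1"))
queue.enqueue(("update-pixels", "window-2"))
frame_work = queue.process_frame()  # both requests handled in one frame
```

The point of the sketch is only the decoupling: callers return immediately after enqueueing, and the rendering module batches all pending work at the next refresh interval.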
[45] Rendering the desktop to multiple displays requires abstraction of the differences in refresh rate, pixel format support, and device coordinate mapping among heterogeneous display devices. The UCE may provide this abstraction.
[46] The UCE may also be responsible for delivering graphics data over a network connection in remote desktop configurations. In order to efficiently remote the desktop of one particular system to another, resource contention should be avoided, performance optimizations should be enacted and security should be robust. These responsibilities may also rest with the UCE.
[47] The 3D Graphics Interface 195 may include a low-level, immediate-mode (stateless) graphics service such as Direct3D®, OpenGL®, or the like. A purpose of the 3D Graphics Interface may be to provide an abstract interface over the features of the particular graphics hardware configuration. The 3D Graphics Interface may service a single display device; the UCE may parse and distribute the CDWM's rendering instructions among multiple graphics output devices 197 in a multiple-display system via multiple device drivers 196.
[48] It should be noted that the component architecture depicted in Figure 1B is that of an illustrative embodiment. The figure is intended to illustrate functions that the invention may include. These functions may be distributed among a fewer or greater number of software components than those represented in the figure, according to the capabilities of the platform and the desired feature set. For example, a system that lacks theme management might derive all stock resources from the system, likely as static resources managed by the CDWM itself, rather than from a separate theme manager. A platform that allows pluggable window managers may replace the Application Programming Interface 190a in the CDWM with a Pluggable Window Manager Interface in order to abstract the details of composited UI object and resource management. Another possible variation may eliminate the Subsystem Programming Interface 190b if legacy application compatibility is not required. The subcomponents of the UCE 194 depicted in Figure 1B may be broken out into separate processes, folded into the CDWM itself or integrated into the 3D Graphics Interface. Thus a wide range of particular component designs is possible, each of which is capable of fulfilling either the entire range or a subset of the functions comprising the invention.
[49] Figure 2 illustrates a general method for performing desktop compositing according to an illustrative aspect of the invention. Steps 201 through 205 describe the interaction of a composition-aware application using compositing desktop window manager (CDWM) APIs to create and manage a window and window content. Steps 207 and 209 depict the interaction between legacy, invalidation-model window manager applications and the CDWM to composite legacy window content.
[50] In step 201, the compositing desktop window manager (CDWM) receives requests from a composition-aware application to (1) create a composited window and (2) attach a content object. The invention is not limited to a single content object per window; an application can dynamically create and attach to a window (as well as detach and destroy) any number of content objects via the CDWM API, further described below. A content object consists of a raster surface of specified size and pixel format to be used as a diffuse texture mapped to an application- or system-defined mesh, along with optional accessory resources such as additional textures (light map, specular map, bump/normal map, etc.), lights and a pixel shader. The pixel format of the diffuse content texture may be any of the available formats supported by the video hardware installed on the system, but for the purposes of the current illustration, may be 32-bit ARGB. When requesting this format the application may be implicitly aware that the alpha (A) channel may be used to vary the transparency level of the content pixel, thus affording fine control over the amount of desktop background information modulating with the source pixel on final rendering. In step 203, the CDWM allocates a state block for the window to which it attaches a CDWM-implemented content object. The content object allocates the resources requested or attaches resources forwarded by the application, and then marshals these resources to the UCE to allow ready access on UCE update requests. In step 205, the application notifies the CDWM of an unsolicited change to the window or the window content. These changes can affect any window or content state, but for purposes of simplicity, the illustration depicts three common update requests: content size, window position or scale, or a change to the pixels of the content's diffuse texture.
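As a rough illustration of how the alpha (A) channel of a 32-bit ARGB content pixel modulates the desktop background, a straight (non-premultiplied) source-over blend can be sketched as below. This is a simplified sketch; real compositors, including Direct3D-based ones, commonly use premultiplied alpha instead.

```python
def blend_over(src_argb, dst_rgb):
    """Source-over blend of a 32-bit ARGB content pixel onto an opaque
    desktop pixel: the per-pixel alpha controls how much background
    shows through. src_argb is (a, r, g, b); dst_rgb is (r, g, b),
    all channels 0-255."""
    a, sr, sg, sb = src_argb
    dr, dg, db = dst_rgb
    t = a / 255.0  # alpha as a 0.0-1.0 blend factor
    return (round(sr * t + dr * (1 - t)),
            round(sg * t + dg * (1 - t)),
            round(sb * t + db * (1 - t)))
```

With alpha 0 the desktop pixel is unchanged; with alpha 255 the content pixel fully replaces it; intermediate values mix the two, which is the "fine control" the paragraph describes.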
[51] The process of compositing a legacy window begins at desktop composition initialization, with the CDWM 190 delivering a request to the legacy windowing and graphics subsystem 192 to redirect the graphical output of each legacy window to a temporary memory location (step 207). Step 207 can be more generally described as placing the legacy window and graphics subsystem in "composition mode", in which the rendering of each individual window is redirected to a separate memory buffer. In an illustrative embodiment, the Legacy Graphical User Interface Subsystem 192 redirects the output of the graphics instructions involved in rendering the window to a bitmapped memory surface associated with the window. However, the invention encompasses the ability to retain the native drawing instructions and associated parameters, and to execute these instructions in the UCE during the process of compositing the next video frame for a target display device. These redirection buffers (surfaces or drawing instruction blocks) may be managed by either the CDWM or the legacy window manager 192a, but for the purpose of this illustration, surface resource management is centralized in the CDWM. Each redirection buffer either constitutes or is used to generate a diffuse content texture resource for the window. The legacy window manager 192a need not invoke the CDWM window and content creation APIs; the legacy subsystem-CDWM communication channel for notifications is distinct from that of the application interface, and the CDWM derives composited window attributes (frame and border style, caption, etc.) and state (hidden/shown, minimized/maximized, etc.) from existing legacy window properties. In step 209, the legacy window manager 192a informs the CDWM 190 of any change affecting the redirected window content texture that may necessitate a visual update.
[52] In steps 211, 219 and 223, the CDWM 190 discriminates from among size, position/scale and pixel-level texture update requests, and acts accordingly. On a size update (step 211), the CDWM first determines whether a frame is associated with the target window (step 213). If a frame is associated with the window (step 215), the CDWM determines the appropriate size and orientation of the frame primitive based on a two- or three-dimensional extent explicitly provided by a composition-aware application, or on a combination of legacy and CDWM window metrics and the updated dimensions of the redirected legacy surface. When the frame size has been determined, the CDWM makes the appropriate changes to the position information in the vertices in the frame mesh, and forwards the vertex data buffer to the UCE. The UCE places the mesh update directive and the new vertex information on a queue for asynchronous processing. If the window does not have a frame, step 215 may be bypassed. In the case of either framed or frameless windows, size changes affecting the content area may cause the CDWM to resize the content mesh and queue the appropriate mesh update request and data to the UCE (step 217).
[53] On a position (including rotation) or scale update (step 219), the CDWM determines the new transformation parameters and queues a transform resource update request along with the data to the UCE for asynchronous processing (step 221). The resource minimally consists of a four-by-four transformation matrix, but may contain additional data to support filtered transforms.
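A minimal sketch of such a four-by-four transform resource, assuming Direct3D-style row vectors (p' = p · M) and covering only scale plus translation; the function names are illustrative, not from the patent:

```python
def make_transform(scale, translate):
    """Build a row-major 4x4 matrix applying a scale followed by a
    translation -- the minimal transform resource described above."""
    sx, sy, sz = scale
    tx, ty, tz = translate
    return [[sx, 0,  0,  0],
            [0,  sy, 0,  0],
            [0,  0,  sz, 0],
            [tx, ty, tz, 1]]

def apply(matrix, point):
    """Transform a 3D point (implicit w = 1) as a row vector: p' = p * M."""
    x, y, z = point
    row = [x, y, z, 1.0]
    return tuple(sum(row[i] * matrix[i][j] for i in range(4))
                 for j in range(3))
```

For example, doubling a window's scale and moving it 10 units right maps the point (1, 1, 1) to (12, 2, 2). Rotation, if present, would occupy the upper-left 3x3 block of the same matrix.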
[54] In step 223, the CDWM receives an update request involving a change to the pixel data of the diffuse content texture, i.e., the application has updated its content within its window. In step 225, the CDWM services the request by queuing the new pixel information to the UCE for asynchronous processing.
[55] It will be appreciated by those of skill in the art that additional update requests may be supported in addition to those depicted in Figure 2. For example, a change to the window icon or caption text may also necessitate a redraw of the CDWM-managed icon or caption content object, respectively, associated with the window. Window input focus may be reflected in the appearance of the frame, and thus in the case of a legacy window, the legacy window manager may deliver an input focus change update to the CDWM, which re-renders the frame and possibly other content accordingly.
[56] In step 227, the UCE processes incoming composition and resource updates from the CDWM, and at intervals synchronized with the video refresh rates of each active video graphics adapter participating in the composition of the desktop, re-renders the desktop (or the appropriate portion thereof in a multiple-display configuration) to a display-sized backing buffer. This is accomplished using the immediate-mode rendering services provided by a 3D graphics engine (such as Microsoft Direct3D®), which in turn transfers the desktop to a primary display surface.
[57] In order to draw the window in 3D, the CDWM may define the window anatomy using various components, including a base content object and one or more child content objects. The base content object defines the window frame, or border, and consists of a base geometry, base extent, base material properties and base content margins. The base and child content objects may each be entirely defined and managed by the system or, in the case of custom content elements, may be managed by the application. Content objects are discussed in more detail below.
[58] Figure 3 illustrates an application window according to an illustrative aspect of the invention. Application window 301 may include various regions and components. The frame or base content 303 of the window 301 may host child content including buttons 305 (e.g., used to restore, maximize, minimize, close the window, etc.), an indicative icon 307, scrollbars 309, menu bar 311, and window caption text 313. A primary content object area 315 may be derived from the redirection buffer obtained from the Legacy Window and Graphical User Interface Subsystem, or be created and attached to the standard base content and rendered by a composition-aware owning application. Those of skill in the art will appreciate that Figure 3 is merely illustrative of basic window elements, and that additional or different window elements may additionally or alternatively be used. In addition, window frame elements may alternatively be provided by an application, e.g., to provide a distinct look and feel to an application program. An example would be where an application program provides the scroll bar elements as custom child content objects so that they manifest an appearance and behavior peculiar to the application program. Moreover, an application may elect to remove or reposition one or more of the stock frame elements using the CDWM API. An application need not be limited to a single primary content area, a restriction prevalent in the prior art.
[59] The CDWM may support multiple application-created and rendered content areas associated with a single window. In order to provide applications the capability to provide a more unique user experience, in one or more embodiments of the invention the CDWM provides flexibility in the manner in which a window may be drawn. That is, the CDWM may allow an application to alter the default anatomy of a window by allowing applications to define multiple custom content objects, each having an arbitrary shape, instead of limiting each application to a single, rectangular client content area.
[60] Thus, each CDWM window may be comprised of a base content object (i.e., the frame) and a collection of one or more child content objects. Each content object may be defined by a unique set of content attributes, and can be configured to optionally receive keyboard and mouse events. The CDWM maps mouse hit-test points relative to application-defined, content-local, 3D coordinates, and delivers mouse event notifications to the application. Content objects may be managed entirely by the system, or in the case of custom content elements, may be managed by the application. Examples of system-managed content objects include the application indicative icon, frame buttons (e.g., minimize, restore, close), caption text, and certain menu bars and scroll bars. Application-managed content objects include those content object(s) to which the application renders its primary visual output, e.g., text by a word processor, numeric grid by a spreadsheet application, or images by a photo editing application.
[61] The content texture may be a bitmap managed by the system, or in the case of custom content, the application. The content texture may be mapped linearly to the content geometry in a single repeat. The aspect ratio may be determined by the content geometry, and texture coordinates may be exposed in the content geometry. Magnification of content may be controlled with a scaling transform that affects the mapping of the content texture to its geometry. The CDWM may provide a default interactive mechanism by which the user can adjust the zoom factor, such as a system-provided menu option, slider control, and/or mouse and keyboard combinations.
[62] Prior to each re-rendering, a content surface whose diffuse texture is in a format supporting per-pixel alpha may be initialized by the system to zero alpha at the discretion of the application (or the system in the case of a stock content object). Therefore, the underlying base content object may be displayed in unpainted areas of the content surface. This enhances both the programming model and user experience because applications are not required to erase the content surface before rendering, and the user is spared flicker and stale or unpainted areas in the window.
[63] In some embodiments, certain content objects, particularly those to which the application renders its primary graphical output, may have no material properties associated with them because it would be undesirable to have the content interact with light or the environment in a manner distracting to the user or otherwise interfering with the user's activities. The visual appearance of a content object may be determined solely by its texture, geometry and perhaps the per-vertex or per-pixel alpha value in such embodiments.
[64] Figure 6 illustrates an example of a window 601 with a dynamic non-standard anatomy as described herein. Window 601 has a base frame object 603 of a non-standard shape (i.e., non-rectangular), frame button objects 605 of non-standard shape (not rectangular) positioned in a non-standard location (other than the top right corner of the window), system-provided indicative frame icon object 607 in a non-standard position (other than the top left corner of the window), and frame window caption object 613 also in a non-standard position (not left justified in the top of the frame). In Figure 6, the application associated with the window has defined two primary content object areas 615a and 615b. Primary content object area 615a is of regular (i.e., rectangular) shape, whereas primary content object area 615b is of an irregular, non-rectangular shape. Window 601 may also include application-defined frame button objects 617 and 619 providing back and forward navigation control, respectively, e.g., in a browsing context.
[65] The CDWM may render the base portion of the application window 301 as a three-dimensional (3D) object. A 3D mesh primitive may be used to define the window object's shape (base geometry), a primary diffuse texture may be mapped to the 3D geometry of the mesh, and optional material properties, which may include lighting, shading, refraction, blur and other special effect parameters and resources, including ancillary textures, may be applied during the rendering process. Ancillary textures may be used as resources for graphical effects well known in the art in order to provide "live," physically modeled interaction with light sources, cursors, and other UI objects in the desktop environment. Thus, textures may serve as the source of per-pixel 3D normal information (normal/bump mapping), light masks (ambient, diffuse and specular light filters), reflection sources (e.g. reflection of the cursor when hovered over the window), static environment maps, and the like.
[66] The vertex format of the base geometry may optionally include a 32-bit diffuse color component in ARGB format and texture coordinate pairs {tun, tvn} for mapping up to n textures to the mesh geometry, as described above. As is well established in the art, each integer increment of tu and tv may define a repeat of the texture in the respective dimension. For example, values ranging from {0.0, 0.0} (texture left, top) to {1.0, 1.0} (texture right, bottom) represent a single repeat over the entire mesh, whereas {0.0, 0.0} to {6.0, 4.0} define six repetitions in the x-dimension and four repetitions in the y-dimension.
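The repeat behavior of integer tu/tv increments can be illustrated with a toy nearest-texel sampler using wrap addressing. This is a sketch for illustration only, not the actual hardware sampling path, and the function name is hypothetical.

```python
def sample_wrapped(texture, tu, tv):
    """Sample a texture with wrap addressing: each integer increment of
    tu/tv repeats the texture, so coordinates spanning {0.0, 0.0} to
    {6.0, 4.0} tile it 6 times in x and 4 times in y. texture is a 2D
    list indexed [row][col]; nearest-texel lookup, no filtering."""
    h = len(texture)
    w = len(texture[0])
    # keep only the fractional part so the texture repeats per unit interval
    u = tu % 1.0
    v = tv % 1.0
    col = min(int(u * w), w - 1)
    row = min(int(v * h), h - 1)
    return texture[row][col]
```

Because only the fractional part of the coordinate is used, sampling at (1.75, 1.75) returns the same texel as sampling at (0.75, 0.75), which is exactly the repeat behavior the paragraph describes.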
[67] A content extent may be defined as a pair of three-dimensional points defining a bounding extent {xleft, ytop, zfront, xright, ybottom, zback}, or the coordinates of the smallest box that contains the base geometry. This is analogous to the 2D bounding window rectangle {xleft, ytop, xright, ybottom}. The triplet {xright-xleft, ybottom-ytop, zback-zfront} defines the width, height and depth of the content's extent. The extent is calculated and managed by the system and represents the size and local position of the content.
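A sketch of computing the bounding extent and the width/height/depth triplet from a vertex list, assuming (as the left/top/front naming suggests) that y grows downward and z grows away from the viewer; the function name is illustrative:

```python
def content_extent(vertices):
    """Compute the bounding extent {xleft, ytop, zfront, xright, ybottom,
    zback} of a mesh -- the smallest box containing every vertex -- plus
    the (width, height, depth) triplet derived from it."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    zs = [v[2] for v in vertices]
    extent = (min(xs), min(ys), min(zs),   # xleft, ytop, zfront
              max(xs), max(ys), max(zs))   # xright, ybottom, zback
    width = extent[3] - extent[0]
    height = extent[4] - extent[1]
    depth = extent[5] - extent[2]
    return extent, (width, height, depth)
```

Dropping the z components reduces this to the familiar 2D bounding rectangle the paragraph compares against.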
[68] If the window object is resizable, manipulating the base content's extent is the means by which the CDWM may resize the window. In order to preserve edge and corner contours, the position of each vertex in a resizable mesh might not simply be scaled to the new extent. To enable fine control over mesh resizing, a predefined vertex position filter function along with applicable parameters may be specified by the application at window creation time, or selected by the CDWM as a default. The role of the vertex resizing filter function is to determine how each vertex in the target mesh behaves when its bounding extent is altered. Every filter function should determine for every member vertex the displacement direction and magnitude in each dimension (x, y, z).
[69] The simplest filter function determines the direction (positive or negative), and the magnitude (scaled relative to the new extent or offset by an amount equal to that of one of the six faces of the mesh's bounding box in a 3D space). How each vertex behaves in a resizing operation can be described on a per-vertex, per-dimension basis as a property associated with the vertex itself, or can be defined for the mesh as a whole in geometric terms. An example of the latter method is a pair of vectors {mxleft, mytop, mzfront} and {mxright, mybottom, mzback} defining six sizing margin planes, each associated with a face of the mesh bounding box, effectively dividing the volume of the bounding box into 27 cubic subregions. The sizing margin values may remain constant regardless of the size of the mesh, or may be calculated based on the initial size of the bounding box. In an arbitrary mesh resizing operation, vertices occurring in the upper, left, front cubic subregion (bounded by {xleft, ytop, zfront, mxleft, mytop, mzfront}) are offset by the same magnitude and direction as the upper-left-front corner of the bounding extent. Vertices occurring in the centermost cubic subregion (bounded by {mxleft, mytop, mzfront, mxright, mybottom, mzback}) are scaled relative to the new extent of that subregion. Vertices occurring in the front, center cubic subregion are scaled relative to the new extent of that subregion in the x and y dimensions, but are displaced by the same magnitude and in the same direction as the mesh's front Z bounding plane.
[70] To aid in understanding the above-described principle, Figure 7 illustrates an example of a mesh resize operation in a 2-dimensional space. A window 701 has rounded corners with a corner radius 707. If a window resize operation merely scales the mesh on which the window is based, the corner radius would scale with the mesh. However, if the corner radius is scaled, the radius of the rounded corners may become too large or small and detract from the user experience and the usability of the user interface. Thus, as the window 701 is resized, the corner radius preferably does not change. In order to prevent the corner radius from scaling, the mesh may be divided into three segments per dimension (x, y, z as applicable). Thus, in the present example, the window is divided into 9 quadrants 703a-i. In a 3D space, the window may be divided into 27 regions. Each dimension may be divided equally or unequally, thus allowing for equally sized regions or unequally sized regions. When regions are unequally sized, regions bounded by the bounding box may be made as small as necessary to encompass material that should not be scaled.
[71] During a window resize operation, quadrants are offset in each dimension in which the quadrant is bounded by the bounding box, and scaled in each dimension in which the quadrant is bounded by a region divider 705a-d. For example, regions 703a, 703c, 703g, and 703i are bounded by the bounding box on at least one side in both the X and Y dimensions, so mesh vertices in regions 703a, 703c, 703g, and 703i retain the same offset from the bounding box as the window is resized. Regions 703b and 703h are bounded by the bounding box on at least one side in the Y (vertical) dimension, but bounded only by region dividers 705 in the X (horizontal) dimension. Thus, mesh vertices in regions 703b and 703h will retain their offsets in the Y dimension, but be scaled in the X dimension. Regions 703d and 703f are bounded by the bounding box on at least one side in the X (horizontal) dimension, but bounded only by region dividers 705 in the Y (vertical) dimension. Thus, mesh vertices in regions 703d and 703f will retain their offsets in the X dimension, but be scaled in the Y dimension. Region 703e is bounded by dividing lines 705 in both the X and Y dimensions, so mesh vertices falling within region 703e will be scaled in both the X and Y dimensions. One of skill in the art will recognize the extension of this algorithm to 3 dimensions by including a Z dimension as described in the preceding paragraphs.
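The per-dimension offset-versus-scale behavior of Figure 7 can be sketched for the 2D case as follows. This is a minimal sketch under assumed conventions (margins and extents passed as plain tuples, coordinates measured from the top-left of the bounding box); the function names are illustrative, and a 3D version would apply the same per-axis filter to z.

```python
def resize_axis(coord, old_size, new_size, margin_lo, margin_hi):
    """Filter one coordinate during a resize: vertices within a margin of
    either edge keep their offset from that edge, while vertices in the
    center band scale with the band (the Figure 7 behavior, per axis)."""
    if coord <= margin_lo:                       # near the low edge: offset
        return coord
    if coord >= old_size - margin_hi:            # near the high edge: offset
        return new_size - (old_size - coord)
    old_band = old_size - margin_lo - margin_hi  # center band: scale
    new_band = new_size - margin_lo - margin_hi
    return margin_lo + (coord - margin_lo) * new_band / old_band

def resize_vertex_2d(vertex, old_extent, new_extent, margins):
    """Apply the filter in x and y. margins = (left, top, right, bottom);
    extents are (width, height)."""
    (x, y), (w0, h0), (w1, h1) = vertex, old_extent, new_extent
    left, top, right, bottom = margins
    return (resize_axis(x, w0, w1, left, right),
            resize_axis(y, h0, h1, top, bottom))
```

For a 100x100 window resized to 200x150 with 10-unit margins, a corner vertex at (5, 5) is unchanged, a vertex at (95, 95) keeps its 5-unit offset from the opposite edges, and a vertex in the center region moves proportionally, so corner geometry such as a rounded-corner contour is preserved.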
[72] Another variation of a mesh resizing filter function may interpret hand-authored vertex metadata rather than rely on a global geometric construct such as sizing margins to determine whether the vertex position scales or offsets in any direction. Such a function might be used to preserve complex surface topology such as ridges and troughs during a mesh resize. Another variation of a mesh resizing filter function may allow vertices to be displaced in each dimension in a linear or nonlinear manner, with discrimination bits and function coefficients stored as per-vertex metadata. Such a function enables effects such as linear or non-linear, localized or generalized bulging or collapsing concomitant with mesh resize.
[73] The base content margins define the boundaries to which child content is constrained. Content margins may be three-dimensional boundaries defined in the same manner as sizing margins. However, unlike sizing margins, content margins may scale linearly with window scale, and might not influence mesh resizing.
[74] Local and desktop-global resources and parameters, as specified according to the values of abstract material properties, in combination with pixel shaders, comprise the data and mechanism by which the CDWM may implement the rendering of physically modeled desktop content.
[75] High-level content material properties define the manner in which the content interacts with light and the surrounding environment. The rendering of complex materials such as frosted glass may use techniques not natively supported in video hardware. As a result, the CDWM implements the material properties using one of a small number of predefined pixel shaders. A pixel shader is a small routine loaded into the display hardware that manipulates the values of pixels prior to display based on a pre-defined set of resources, including but not limited to light sources, textures, and vertices in a mesh primitive, as well as parameters such as transforms and metrics. The CDWM may select from among its collection of predefined pixel shaders the appropriate shader to render a particular set of object material properties, which include ambient color (intensity and transparency), diffuse color (intensity and transparency), specular color (intensity and transparency), reflection scalar, refraction index, diffuse texture, and bump texture, each of which is described further below. Desktop-global properties may be used to define global environment properties, such as eye position, global light source(s), environment maps, and the like. The resources and parameters that define these desktop-global properties may be forwarded together with the base window material properties to the 3D Graphics Interface as parameters to the active pixel shader immediately prior to rendering the window.
[76] Ambient color simulates light hitting the surface of the object from all directions. As a material property applicable to any CDWM-managed UI content object, ambient intensity determines the relative amount of ambient light contacting the surface of the object, and a 32-bit ARGB value may be used to specify the ambient color and transparency. In one illustrative embodiment, ambient intensity may range from 0.0 (zero ambient light, giving a uniformly black appearance) to 1.0 (maximum intensity of the specified color distributed uniformly over the object). The effect of ambient intensity with a white ambient color allows control over the general brightness of the object.
[77] Diffusion intensity determines the amount of directional light scattered in all directions after contacting the object's surface. The light itself is provided either by one or more directional lights or the cubic light map. As a material property applicable to any CDWM-managed UI content object, diffuse color may be specified by a 32-bit ARGB value that dictates the color, where the alpha component dictates the transparency of the light reflected diffusely. The diffusion intensity value ranges from 0.0 (no light is reflected diffusely, giving the object a uniformly black appearance) to 1.0 (all light is reflected diffusely, giving the object a shaded appearance according to the diffusion color value). Lit surfaces will appear more realistically modeled as the sum of the ambient and diffusion intensity values approaches 1.0.
[78] Specular intensity controls how much light is reflected off the object's surface directly back at the viewer, and specular color may be specified as an ARGB color of the object. The light source itself may be in the form of either one or more directional lights or a cubic light map. As a material property applicable to any CDWM-managed UI content object, high specular intensity values may be used to model a shiny surface with sharp highlights, whereas low values may be used to model a matte surface with faint or absent highlights. The alpha component of the color determines the transparency of the specular highlights.
[79] Reflectivity, like specularity, determines the amount of light that is reflected directly back at the viewer from the surface of the object. Reflection differs from specularity in that reflection applies to the entire environment, not just the light source. As a material property applicable to any CDWM-managed UI content object, a reflectivity value of 0.0 produces no reflection of the environment in the surface, and a value of 1.0 produces mirror-like reflection of the environment in the surface. The environment may be modeled using a combination of the cubic environment map and the mouse cursor. Thus, the mouse cursor as well as static features of the environment may be reflected from the window surface to a degree controlled by the reflection intensity scalar.
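The ambient, diffuse, and specular terms described in paragraphs [76] through [78] combine in the manner of a conventional Phong-style lighting model. A minimal CPU-side sketch follows; the function, the material dictionary keys, and the `shininess` exponent are illustrative assumptions, not the CDWM's API:

```python
def shade(n, l, v, mat, light_color):
    """Phong-style combination of ambient, diffuse, and specular terms.
    n, l, v are unit vectors (surface normal, light direction, view
    direction) as 3-tuples; `mat` holds illustrative material scalars."""
    def dot(a, b):
        return max(0.0, sum(x * y for x, y in zip(a, b)))

    ambient = mat["ambient_intensity"]
    diffuse = mat["diffuse_intensity"] * dot(n, l)
    # reflect l about n: r = 2(n.l)n - l, then compare with the view vector
    nl = sum(x * y for x, y in zip(n, l))
    r = tuple(2 * nl * nc - lc for nc, lc in zip(n, l))
    specular = mat["specular_intensity"] * dot(r, v) ** mat["shininess"]
    # per-channel result, clamped to [0, 1]
    return tuple(min(1.0, (ambient + diffuse) * c + specular) for c in light_color)
```

Consistent with paragraph [77], the surface looks most naturally lit when ambient and diffuse intensities sum to about 1.0, since anything above that clamps toward white.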
[80] The refraction index of each object determines the angle of transmission of light traveling through it. Snell's law, n₁ sin θ₁ = n₂ sin θ₂, may be used, where n₁ and n₂ are the refraction indices of mediums 1 and 2, and θ₁ and θ₂ are the incident and transmission angles, respectively, of light relative to the surface normal. Therefore, if medium 1 represents the desktop environment with an assigned refraction index of 1.0 (no refraction), and medium 2 is that of the window base object, the angle of refraction is determined as θ_obj = sin⁻¹(sin θ_env / n_obj). Known refraction indices for various media which may be simulated are shown below in Table 1.
Table 1
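As a brief illustration of the formula in paragraph [80] (the table data itself is not reproduced here), the refraction angle can be evaluated directly. Glass is commonly modeled with a refraction index near 1.5, a widely published value; the function below is a sketch, not the CDWM's implementation:

```python
import math

def refraction_angle(theta_env, n_obj, n_env=1.0):
    """Angle of transmission through a surface via Snell's law:
    n_env * sin(theta_env) = n_obj * sin(theta_obj). Angles in radians.
    Returns None past the critical angle (total internal reflection)."""
    s = n_env * math.sin(theta_env) / n_obj
    if abs(s) > 1.0:
        return None
    return math.asin(s)

# Glass (n ~ 1.5): a ray arriving at 30 degrees from the surface normal
# continues at roughly 19.5 degrees inside the medium.
theta = refraction_angle(math.radians(30), 1.5)
```

Paragraph [81] then uses this angle to pick which background pixel appears on the surface.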
[81] Once the angle of refraction has been determined, it may then be used to select the proper pixel from the background to render on the visible surface of the object, following further processing associated with other material properties. Optimizations for the purpose of real-time rendering of refraction may incorporate the Fresnel technique, a method appreciated by those of skill in the art.

[82] Visual styles (themes) may be used to define CDWM visual and behavioral policy. Visual styles generally refer to user-selectable themes that specify elaborate, hand-designed graphics and behavioral attributes applied to common user interface elements. Applications may optionally override some of these attributes, whereas others are selectively enforced by the system in the interest of consistency in the user interface. Visual attributes include the appearance of common window content such as the frame area (base content), non-client buttons, and other application-independent elements. Behavioral attributes include window and desktop transition animations, the manner in which a window is interactively moved or resized with the mouse (e.g., snap, glue, stretch, and constraint), and other application-independent behaviors. Visual and behavioral policy may be centralized in the CDWM rather than having that policy distributed throughout the software rendering pipeline, thus providing a more consistent end-user experience and a simpler development environment.
[83] According to an illustrative embodiment of the invention, the default (or custom) texture of a visual style may comprise an alpha level and/or a bitmap based on which each pixel is modified. For example, an alpha level may be used to modify a transparency level, as is known in the art. In addition, the texture may comprise a bitmap with which the client and/or non-client area, or a portion of the client and/or non-client area, may be pixel shaded. In one illustrative embodiment, for example, the bitmap may give the appearance of frosted glass. Figure 5 illustrates a window 501 rendered with a frosted glass frame 503, where the refraction index may be specified to simulate glass when determining which pixel from the content behind window frame 503 should appear. By taking advantage of the graphics rendering engine's advanced texturing, lighting, and 3D capabilities, and using an appropriate bitmap, the CDWM can compose a window 501 with a frame 503 having a frosted glass appearance that reflects light from an optionally specified virtual light source within the 3D desktop environment, yet has an opaque client content area so that visual acuity of the client content is not diminished.
[84] Desktop rendering models (invalidation versus compositing) each have a unique schema for interacting with application programs so that the application program's window(s) are maintained properly on the desktop. For example, in an invalidation model, the desktop rendering is dependent on the management and continuous updating of window "clipping regions." Clipping is the process by which rendering is limited to an appropriate area of a window. When one window is partially obscured by another, its clipping region corresponds to the inverse of the obscured area. If the underlying window paints its content, whether in response to a paint message or in an unsolicited manner, the invalidation model DWM ensures that its clipping region is applied to the output, thus ensuring that no painting will take place in the overlying window(s). If the overlying window is moved, or the underlying window is brought to the top of the Z-order, the clipping region of the underlying window is adjusted by the DWM accordingly before it sends the window a paint message to update any newly exposed content.
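The clipping-region bookkeeping described above can be sketched as simple region subtraction: a window's clip region is its own rectangle minus every rectangle higher in the Z-order. The following toy model is illustrative only; a real invalidation-model DWM maintains compact region structures rather than per-pixel sets:

```python
def clip_region(window_rect, overlapping_rects):
    """Illustrative sketch: the portion of `window_rect` not obscured by
    any rect higher in the Z-order. Rects are (x1, y1, x2, y2) with
    exclusive far edges; the result is a set of unit cells."""
    cells = {(x, y)
             for x in range(window_rect[0], window_rect[2])
             for y in range(window_rect[1], window_rect[3])}
    for (x1, y1, x2, y2) in overlapping_rects:
        # subtract the obscured area contributed by each overlying window
        cells -= {(x, y) for x in range(x1, x2) for y in range(y1, y2)}
    return cells
```

When the overlying window moves, the subtraction is simply recomputed before the underlying window is asked to repaint its newly exposed cells.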
[85] Invalidation model DWMs and compositing model DWMs thus rely on different information to draw the desktop. For example, in an invalidation model DWM, because the DWM does not store a copy of the entire surface of each window on the desktop, the DWM must communicate with an application to refresh content during resizing and redraws. Likewise, the application expects to not need to refresh its content unless asked to do so by the DWM (unless, of course, the content is updated as a result of user input). If the application does need to independently update its own content, it asks the DWM to invalidate a portion of its own window, expecting to receive from the DWM a paint request corresponding to the invalid region. Because in the case of the composited desktop sufficient information to draw each window in its entirety is retained by the CDWM, the CDWM need not send the window paint messages on events such as those described above. This in turn obviates the invalidation step; the application need simply redraw all or a portion of itself as internal events dictate.
[86] Due to these fundamental differences, each DWM and/or CDWM has a unique set of APIs through which application programs expect to communicate with the DWM to ensure that the window content is kept current. As a result, an application originally programmed for use with an invalidation model DWM, i.e., one that relies on paint messages to render its content, will not necessarily work with a compositing model CDWM. Thus, with reference to Figure 4, the CDWM may provide support for applications originally developed for use in an invalidation model DWM. These applications may be referred to herein as legacy applications, and the backwards-compatible support may be referred to herein as legacy support. Legacy APIs refer to APIs for use with a prior version of the operating system that used an invalidation model DWM with which the legacy application is compatible. The legacy APIs 192b (Fig. 1B) allow the application to communicate with the invalidation model DWM (legacy DWM) 192a. The legacy DWM may use a separate legacy API element to process various legacy notifications on behalf of the application to the CDWM, to transfer relevant state information to the CDWM, and to translate between legacy and CDWM coordinate spaces for input and focus determinations. The legacy DWM may be modified to redirect data to the CDWM, as described below.
[87] Figure 4 illustrates a portion of a window compositing method according to an illustrative aspect of the invention. Steps 401-409 represent the initial rendering of content associated with a legacy application window whose source rendering surface (or set of instructions required to generate the surface) is obtained from the Legacy Window Manager 192a (Fig. 1B). Steps 411-419 illustrate rendering of window content created by a composition-aware application program.
[88] In step 401, the CDWM receives an initial update notification for the primary window content from the legacy window manager as a result of a legacy application calling legacy APIs 192b to draw a window on the desktop according to the invalidation model for which the application was designed. For example, Microsoft® Word® XP may call the legacy APIs so that the legacy DWM 192a draws text input by the user. In step 403 the CDWM retrieves the content's default mesh from the theme manager. In step 405 the CDWM retrieves (or generates) the redirection surface from the Legacy Window Manager. This surface may be used as the content's diffuse texture. In step 407, the CDWM ensures that only the desired areas of the legacy texture are retained, so that those containing the legacy window frame, border and/or caption are not rendered. One manner in which this can be accomplished expediently is by transforming the texture mapping coordinates of the mesh such that only the desired area is mapped to the mesh's x and y bounding extents. In step 409, the CDWM retrieves default material properties for the content. The resources and parameters required to render the legacy content have now been collected.
[89] In step 411 the CDWM receives information from an application program requiring the rendering of a content object associated with a window. The content may optionally be accompanied by a custom mesh, custom texture, and/or custom material properties. A custom mesh may be provided alone when the application program desires to have a non-standard shape for an existing content object. If the content object in question is the window base content, the custom mesh will redefine the shape of the window. A custom texture and/or custom material properties may be provided alone when the application program desires to impart a non-standard appearance (i.e., other than that specified by the active theme) to a system-defined content object. If the content object in question is the window base content, the custom texture and/or material properties redefine the appearance of the window without modifying its shape. More commonly, the application creates a content object from scratch and specifies its mesh (which may be selected from a set of predefined system meshes), texture, and material properties (which may be selected from a set of predefined system material properties) at creation time.
[90] In step 413 the CDWM determines whether a custom content mesh was specified and, if not, retrieves a default mesh from the theme manager (step 403). In step 415 the CDWM determines whether a custom texture was specified and, if not, retrieves a default texture from the theme manager. In step 417, the CDWM determines whether custom material properties were specified by the application and, if not, retrieves a default set of material properties from the theme manager. The resources and parameters required to render the custom content have now been collected.
[91] In step 419 the CDWM assembles a rendering instruction block via the UCE Programming Interface to render the content with references to the appropriate mesh, texture, and material properties. The rendering instruction block is queued for execution by the UCE. The instruction block is executed by the UCE Rendering Model on expiration of the pending refresh interval of the target device(s).
[92] By providing legacy support, the operating system in which the CDWM and legacy DWM are integrated inherently has the capability to render the desktop using either the invalidation DWM (legacy DWM 192a) or the compositing DWM (CDWM 190). That is, the invalidation model DWM is supported by the operating system in addition to the composition model in order to provide legacy support. Thus, in systems that do not have the video hardware necessary to efficiently perform the processor-intensive calculations required for desktop compositing (e.g., in systems with low video memory, or with no 3D acceleration hardware), the CDWM and/or the operating system may allow a user to select whether the compositing or legacy drawing mode should be used. The selection may be made automatically or manually. For example, the selection may be made via manual user control, in accordance with the drawing mode defined by an activated visual style (theme) selected by the user. The selection may alternatively or also be based on power-conservation conditions. For example, when a portable device is disconnected from an AC power source and switches to battery power, the operating system may enforce legacy drawing mode because the video graphics processing unit (GPU) is less active and thus consumes less power.
[93] Using the methods and systems described above, an operating system may provide a physically modeled graphical user interface that uses advanced 3D graphics capabilities. Window frames may take on not only the appearance, but also the characteristics, of frosted glass or some other complex material that provides a simulated surface appearance having at least some transparency combined with at least some distortion of the content visible as a result of the transparency, resulting in a translucent effect. That is, not only does the present invention have the capability to make a window frame or border look like frosted glass, but the window frame also behaves like frosted glass in that it reflects content in the GUI environment, includes specular highlights indicative of virtual light sources, simulates an index of refraction similar to glass such that content behind the "frosted glass" border is slightly offset accordingly, and a bitmap may be applied via one or more pixel shaders to provide distortion of underlying content.
[94] Frosted glass or other glass-like physically modeled objects provide many advantages to a user of the graphical user interface. For example, the aesthetic appearance of the glass enhances the GUI and provides a rich user experience that makes a GUI of the present invention more desirable to end-users than GUIs of other operating systems by providing a light, open feeling to the user. At the same time, physically modeled frosted glass having true or near-true characteristics of glass also provides functional advantages.
[95] The frosted glass appearance aids the user in understanding window layer order in a multi-window environment. Shading, reflection, and specular highlights create a stronger sense of depth and layers on the desktop, and help a user determine the Z-order of various windows on the desktop. While some known systems apply uniform transparency to an entire window, a user might not readily perceive what content is within the window of interest and what content is behind the window. By varying this uniform, per-pixel transparency as a function of Z-order, it is possible to ameliorate the problem, but in an unnatural, non-intuitive manner. Rather, by incorporating into a pixel shader an adjustable blurring algorithm that samples multiple surrounding source pixels in the course of generating each destination pixel, and executing this shader in the process of rendering the window frame, the present invention models the light-scattering behavior arising from the material imperfections in real-world frosted glass. It is this physically modeled distortion of the background that allows the user to immediately distinguish between background and foreground content. And because the effect is cumulative, overlapping frosted glass window frames become progressively more distorted from foreground to background. Thus, the user can intuitively differentiate background content underlying multiple layers of frosted glass window frames.
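The blur described above, in which each destination pixel samples multiple surrounding source pixels, can be sketched as a simple box filter. This CPU version is illustrative only; the patent performs the equivalent work in a GPU pixel shader over textures rather than Python lists:

```python
def box_blur(pixels, w, h, radius=1):
    """Illustrative sketch of the frosted-glass blur: each destination
    pixel averages the surrounding source pixels within `radius`,
    scattering background detail. `pixels` is a flat, row-major list
    of grayscale values."""
    out = []
    for y in range(h):
        for x in range(w):
            acc, count = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    sx, sy = x + dx, y + dy
                    if 0 <= sx < w and 0 <= sy < h:   # clamp at the edges
                        acc += pixels[sy * w + sx]
                        count += 1
            out.append(acc / count)
    return out
```

Applying the filter once per frosted frame also reproduces the cumulative effect the paragraph notes: content behind two overlapping frames is blurred twice, and so appears more distorted than content behind one.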
[96] Frosted glass also allows the use of thicker borders to ease user interaction with the GUI, for example, by making it easier for a user to grab a window border with a mouse to move or resize the window, yet not obscure content beneath the window (because the glass is transparent or translucent). Various frosted glass effects may be used to make it easier for a user to tell the difference between active and inactive window states. In addition, frosted glass makes it easier for a user to read and/or view content on the screen, because the user can view more screen area at any given time (because the frosted glass is translucent or transparent), and the desktop appears less cluttered because glass is a non-obtrusive element on the display screen.
[97] It will be appreciated by those of skill in the art that while the figures depict a specific example of frosted glass, the invention is not so limited. The frosted glass appearance may be easily varied by applying a different bitmap and/or a different pixel shader to the rendering of the appearance. In addition, applying different environment variables (e.g., differing light source(s), which affect reflection and specular highlights) or changing virtual physical properties of the glass (e.g., index of refraction, reflection, etc.) will affect the appearance of the frosted glass as well. It will be appreciated that the present invention may also be used to simulate other textures and compounds, e.g., metals, plastics, paper, cotton, and other natural and synthetic materials.
[98] While the invention has been described with respect to specific examples including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above-described systems and techniques. Thus, the spirit and scope of the invention should be construed broadly as set forth in the appended claims.

Claims

I/We Claim:
1. A computer readable medium storing a data structure defining a window for drawing on a desktop representation displayed on a display device, comprising:
a first data field storing base content object properties for a base content object of the window; and
a second data field storing content object properties for a plurality of discrete primary content objects.
2. The computer readable medium of claim 1, wherein the first data field is subdivided to store base object properties comprising a base geometry.
3. The computer readable medium of claim 1, wherein the first data field is subdivided to store base object properties comprising base content margins, a base extent and a base material.
4. The computer readable medium of claim 2, wherein the first data field is further subdivided to store base geometry properties comprising a plurality of vertices defining a mesh.
5. The computer readable medium of claim 3, wherein the first data field is further subdivided to store base material properties comprising an ambient color, a diffusive color, and a specular color.
6. The computer readable medium of claim 5, wherein each of the ambient color, diffusive color, and specular color are defined as an ARGB value.
7. The computer readable medium of claim 3, wherein the first data field is further subdivided to store base material properties comprising a reflection scalar and a refraction index.
8. The computer readable medium of claim 3, wherein the first data field is further subdivided to store base material properties comprising a diffuse texture and a bump texture.
9. The computer readable medium of claim 1, wherein the second data field is further subdivided to store a content geometry and a content surface for each primary content object.
10. The computer readable medium of claim 9, wherein the second data field is further subdivided to store content surface properties comprising an ARGB texture for each primary content object.
11. A data processing system comprising:
a memory storing window properties comprising, for a plurality of windows for which properties are stored, properties for a base object and properties for one or more primary content objects;
a compositing desktop window manager software module that composes a desktop based on the window properties of each window for which properties are stored,
wherein for one of the plurality of windows for which properties are stored, the memory stores a plurality of primary content objects.
12. The data processing system of claim 11, wherein the properties for the base object comprise a base geometry.
13. The data processing system of claim 11, wherein the properties for the base object comprise base content margins, a base extent and a base material.
14. The data processing system of claim 12, wherein the base geometry property comprises a plurality of vertices defining a mesh.
15. The data processing system of claim 13, wherein the base material property comprises an ambient color, a diffusive color, and a specular color.
16. The data processing system of claim 15, wherein each of the ambient color, diffusive color, and specular color are defined at least by an ARGB value.
17. The data processing system of claim 13, wherein the base material property comprises a reflection scalar and a refraction index.
18. The data processing system of claim 13, wherein the base material property comprises a diffuse texture and a bump texture.
19. The data processing system of claim 11, wherein the memory stores, for at least one primary content object, a content geometry and a content surface.
20. The data processing system of claim 19, wherein the content surface comprises an ARGB texture.
21. A computer implemented method of displaying a window in a graphical user interface of a shell of an operating system, comprising:
receiving window information from an instance of an application program; and
rendering a window having a base object and a plurality of discrete primary content objects.
22. The method of claim 21, wherein rendering is based on base content margins, a base extent and a base material.
23. The method of claim 21, wherein rendering is based on a base geometry defined by a mesh.
24. The method of claim 22, wherein rendering is based on base material properties comprising an ambient color, a diffusive color, and a specular color.
25. The method of claim 24, wherein each of the ambient color, diffusive color, and specular color are defined as an ARGB value.
26. The method of claim 22, wherein rendering is based on base material properties comprising a reflection scalar and a refraction index.
27. The method of claim 21 , wherein rendering is based on base material properties comprising a diffuse texture and a bump texture.
28. The method of claim 21, wherein rendering is based on a content geometry and a content surface for each primary content object.
29. The method of claim 28, wherein rendering is based on content surface properties comprising an ARGB texture for each primary content object.
30. The method of claim 23, further comprising:
receiving user input to resize the window;
dividing the mesh into three regions per mesh dimension;
for each region, maintaining offsets of mesh vertices in any dimension by which the region is bounded by a bounding box of the window, and scaling mesh vertices in any dimension by which the region is not bounded by the bounding box of the window.
31. A method for resizing a window having two primary content objects, the window defined in part by a mesh, comprising:
dividing the mesh into three regions per mesh dimension;
for each region, maintaining offsets of mesh vertices in any dimension by which the region is bounded by a bounding box of the window, and scaling mesh vertices in any dimension by which the region is not bounded by the bounding box of the window.
32. The method of claim 31, wherein the regions are equally sized.
33. The method of claim 31, wherein the regions are not equally sized.
34. The method of claim 33, wherein regions bounded by the bounding box are as small as necessary to encompass material that should not be scaled.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/691,442 US7817163B2 (en) 2003-10-23 2003-10-23 Dynamic window anatomy
US10/691,442 2003-10-23

Publications (2)

Publication Number Publication Date
WO2005045558A2 true WO2005045558A2 (en) 2005-05-19
WO2005045558A3 WO2005045558A3 (en) 2008-06-05

Family

ID=34521879

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/019109 WO2005045558A2 (en) 2003-10-23 2004-07-28 Dynamic window anatomy

Country Status (14)

Country Link
US (1) US7817163B2 (en)
EP (1) EP1682964A4 (en)
JP (2) JP4808158B2 (en)
KR (1) KR101086570B1 (en)
CN (1) CN101288104B (en)
AU (2) AU2004279204B8 (en)
BR (1) BRPI0406387A (en)
CA (1) CA2501671A1 (en)
MX (1) MXPA05007169A (en)
MY (1) MY142719A (en)
RU (1) RU2377663C2 (en)
TW (1) TWI374385B (en)
WO (1) WO2005045558A2 (en)
ZA (1) ZA200503149B (en)

Families Citing this family (93)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7681112B1 (en) 2003-05-30 2010-03-16 Adobe Systems Incorporated Embedded reuse meta information
US7765143B1 (en) 2003-11-04 2010-07-27 Trading Technologies International, Inc. System and method for event driven virtual workspace
US7170526B1 (en) * 2004-01-26 2007-01-30 Sun Microsystems, Inc. Method and apparatus for redirecting the output of direct rendering graphics calls
US7231632B2 (en) * 2004-04-16 2007-06-12 Apple Computer, Inc. System for reducing the number of programs necessary to render an image
US7847800B2 (en) * 2004-04-16 2010-12-07 Apple Inc. System for emulating graphics operations
US7248265B2 (en) * 2004-04-16 2007-07-24 Apple Inc. System and method for processing graphics operations with graphics processing unit
US8704837B2 (en) * 2004-04-16 2014-04-22 Apple Inc. High-level program interface for graphics operations
US8134561B2 (en) 2004-04-16 2012-03-13 Apple Inc. System for optimizing graphics operations
US8239749B2 (en) 2004-06-25 2012-08-07 Apple Inc. Procedurally expressing graphic objects for web pages
US7652678B2 (en) * 2004-06-25 2010-01-26 Apple Inc. Partial display updates in a windowing system using a programmable graphics processing unit
US8453065B2 (en) 2004-06-25 2013-05-28 Apple Inc. Preview and installation of user interface elements in a display environment
US20050285866A1 (en) * 2004-06-25 2005-12-29 Apple Computer, Inc. Display-wide visual effects for a windowing system using a programmable graphics processing unit
US7761800B2 (en) * 2004-06-25 2010-07-20 Apple Inc. Unified interest layer for user interface
US8566732B2 (en) 2004-06-25 2013-10-22 Apple Inc. Synchronization of widgets and dashboards
US8302020B2 (en) * 2004-06-25 2012-10-30 Apple Inc. Widget authoring and editing environment
US7490295B2 (en) * 2004-06-25 2009-02-10 Apple Inc. Layer for accessing user interface elements
US20060059432A1 (en) * 2004-09-15 2006-03-16 Matthew Bells User interface having viewing area with non-transparent and semi-transparent regions
US20060168537A1 (en) * 2004-12-22 2006-07-27 Hochmuth Roland M Computer display control system and method
US8140975B2 (en) 2005-01-07 2012-03-20 Apple Inc. Slide show navigation
US7730418B2 (en) * 2005-05-04 2010-06-01 Workman Nydegger Size to content windows for computer graphics
US8543931B2 (en) 2005-06-07 2013-09-24 Apple Inc. Preview including theme based installation of user interface elements in a display environment
US7865830B2 (en) 2005-07-12 2011-01-04 Microsoft Corporation Feed and email content
US7843466B2 (en) * 2005-07-29 2010-11-30 Vistaprint Technologies Limited Automated image framing
US8095887B2 (en) * 2005-08-05 2012-01-10 Samsung Electronics Co., Ltd. Apparatus for providing multiple screens and method of dynamically configuring multiple screens
US7743336B2 (en) 2005-10-27 2010-06-22 Apple Inc. Widget security
US7752556B2 (en) * 2005-10-27 2010-07-06 Apple Inc. Workflow widgets
US7954064B2 (en) 2005-10-27 2011-05-31 Apple Inc. Multiple dashboards
US9104294B2 (en) 2005-10-27 2015-08-11 Apple Inc. Linked widgets
US8543824B2 (en) 2005-10-27 2013-09-24 Apple Inc. Safe distribution and use of content
US7707514B2 (en) 2005-11-18 2010-04-27 Apple Inc. Management of user interface elements in a display environment
US20070162850A1 (en) * 2006-01-06 2007-07-12 Darin Adler Sports-related widgets
US8108785B2 (en) * 2006-01-09 2012-01-31 Microsoft Corporation Supporting user multi-tasking with clipping lists
US20070245250A1 (en) * 2006-04-18 2007-10-18 Microsoft Corporation Desktop window manager using an advanced user interface construction framework
US8155682B2 (en) * 2006-05-05 2012-04-10 Research In Motion Limited Handheld electronic device including automatic mobile phone number management, and associated method
US8497874B2 (en) * 2006-08-01 2013-07-30 Microsoft Corporation Pixel snapping for anti-aliased rendering
US8144166B2 (en) * 2006-08-01 2012-03-27 Microsoft Corporation Dynamic pixel snapping
US8869027B2 (en) 2006-08-04 2014-10-21 Apple Inc. Management and generation of dashboards
AU2007291927B2 (en) * 2006-08-27 2008-07-17 Miwatch Holdings Limited GSM mobile watch phone
US8508552B2 (en) * 2006-09-08 2013-08-13 Microsoft Corporation Pixel snapping with relative guidelines
CN100583022C (en) * 2006-09-27 2010-01-20 联想(北京)有限公司 Method for capturing computer screen image
US7712047B2 (en) * 2007-01-03 2010-05-04 Microsoft Corporation Motion desktop
US20080168367A1 (en) * 2007-01-07 2008-07-10 Chaudhri Imran A Dashboards, Widgets and Devices
US7996787B2 (en) * 2007-02-06 2011-08-09 Cptn Holdings Llc Plug-in architecture for window management and desktop compositing effects
KR101415023B1 (en) * 2007-04-26 2014-07-04 삼성전자주식회사 Apparatus and method for providing information through network
JP4858313B2 (en) * 2007-06-01 2012-01-18 富士ゼロックス株式会社 Workspace management method
US8954871B2 (en) 2007-07-18 2015-02-10 Apple Inc. User-centric widgets and dashboards
US20090021486A1 (en) * 2007-07-19 2009-01-22 Apple Inc. Dashboard Surfaces
US8667415B2 (en) 2007-08-06 2014-03-04 Apple Inc. Web widgets
US8156467B2 (en) * 2007-08-27 2012-04-10 Adobe Systems Incorporated Reusing components in a running application
US8176466B2 (en) 2007-10-01 2012-05-08 Adobe Systems Incorporated System and method for generating an application fragment
US8407605B2 (en) * 2009-04-03 2013-03-26 Social Communications Company Application sharing
US8397168B2 (en) 2008-04-05 2013-03-12 Social Communications Company Interfacing with a spatial virtual communication environment
JP4995057B2 (en) * 2007-12-07 2012-08-08 キヤノン株式会社 Drawing apparatus, printing apparatus, drawing method, and program
US9189250B2 (en) * 2008-01-16 2015-11-17 Honeywell International Inc. Method and system for re-invoking displays
US8356258B2 (en) 2008-02-01 2013-01-15 Microsoft Corporation Arranging display areas utilizing enhanced window states
US9619304B2 (en) 2008-02-05 2017-04-11 Adobe Systems Incorporated Automatic connections between application components
US8266637B2 (en) * 2008-03-03 2012-09-11 Microsoft Corporation Privacy modes in a remote desktop environment
CN102067085B (en) * 2008-04-17 2014-08-13 微系统道格有限公司 Method and system for virtually delivering software applications to remote clients
US8379058B2 (en) * 2008-06-06 2013-02-19 Apple Inc. Methods and apparatuses to arbitrarily transform windows
US20090328080A1 (en) * 2008-06-25 2009-12-31 Microsoft Corporation Window Redirection Using Interception of Drawing APIs
US8656293B1 (en) 2008-07-29 2014-02-18 Adobe Systems Incorporated Configuring mobile devices
US20100054632A1 (en) * 2008-09-02 2010-03-04 Motorola, Inc. Method and Apparatus to Facilitate Removing a Visual Distraction From an Image Being Captured
KR101520067B1 (en) * 2008-10-02 2015-05-13 삼성전자 주식회사 Graphic processing method and apparatus implementing window system
US8490026B2 (en) * 2008-10-27 2013-07-16 Microsoft Corporation Painting user controls
US8363067B1 (en) * 2009-02-05 2013-01-29 Matrox Graphics, Inc. Processing multiple regions of an image in a graphics display system
US20110060993A1 (en) * 2009-09-08 2011-03-10 Classified Ventures, Llc Interactive Detailed Video Navigation System
WO2011060442A2 (en) * 2009-11-16 2011-05-19 Citrix Systems, Inc. Methods and systems for selective implementation of progressive display techniques
CN102156999B (en) * 2010-02-11 2015-06-10 腾讯科技(深圳)有限公司 Generation method and device thereof for user interface
US20110252376A1 (en) 2010-04-07 2011-10-13 Imran Chaudhri Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications
US9823831B2 (en) * 2010-04-07 2017-11-21 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
CN102065336B (en) * 2010-07-21 2013-06-26 深圳市创维软件有限公司 Digital television receiver and method for determining multistage window shearing relation of digital television receiver
US20120092381A1 (en) * 2010-10-19 2012-04-19 Microsoft Corporation Snapping User Interface Elements Based On Touch Input
WO2012159361A1 (en) * 2011-08-01 2012-11-29 华为技术有限公司 Distributed processing method and system for virtual desktop
TWI489370B (en) 2012-10-16 2015-06-21 智邦科技股份有限公司 System and method for rendering widget
US9477404B2 (en) 2013-03-15 2016-10-25 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US9530243B1 (en) 2013-09-24 2016-12-27 Amazon Technologies, Inc. Generating virtual shadows for displayable elements
US9591295B2 (en) 2013-09-24 2017-03-07 Amazon Technologies, Inc. Approaches for simulating three-dimensional views
US9437038B1 (en) 2013-09-26 2016-09-06 Amazon Technologies, Inc. Simulating three-dimensional views using depth relationships among planes of content
US9224237B2 (en) * 2013-09-27 2015-12-29 Amazon Technologies, Inc. Simulating three-dimensional views using planes of content
US9367203B1 (en) 2013-10-04 2016-06-14 Amazon Technologies, Inc. User interface techniques for simulating three-dimensional depth
US9703445B2 (en) 2014-05-07 2017-07-11 International Business Machines Corporation Dynamic, optimized placement of computer-based windows
US9659394B2 (en) * 2014-06-30 2017-05-23 Microsoft Technology Licensing, Llc Cinematization of output in compound device environment
US10147158B2 (en) 2014-12-13 2018-12-04 Microsoft Technology Licensing, Llc Frame invalidation control with causality attribution
US9911395B1 (en) * 2014-12-23 2018-03-06 Amazon Technologies, Inc. Glare correction via pixel processing
CN105068714B (en) * 2015-08-10 2019-02-05 联想(北京)有限公司 A kind of display control method and electronic equipment
US10915952B2 (en) 2015-12-18 2021-02-09 Trading Technologies International, Inc. Manipulating trading tools
GB2563282B (en) * 2017-06-09 2022-01-12 Displaylink Uk Ltd Bypassing desktop composition
WO2018231258A1 (en) * 2017-06-16 2018-12-20 Microsoft Technology Licensing, Llc Generating user interface containers
CN115769604A (en) * 2020-05-27 2023-03-07 马尔科·杜里奇 Notification application for computing devices
CN114546204B (en) * 2022-04-21 2022-08-12 广东统信软件有限公司 Window management method, computing device and readable storage medium
US11886685B1 (en) 2022-10-20 2024-01-30 Stardock Corporation Desktop container peek interface
CN116013210A (en) * 2023-01-18 2023-04-25 广州朗国电子科技股份有限公司 Display effect adjusting method, system, equipment and storage medium
US12067348B1 (en) 2023-08-04 2024-08-20 Bank Of America Corporation Intelligent webpage UI optimization leveraging UI and viewer extraction telemetry

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4694404A (en) * 1984-01-12 1987-09-15 Key Bank N.A. High-speed image generation of complex solid objects using octree encoding
US4696404A (en) 1986-08-27 1987-09-29 Corella Arthur P Heat sealed package with perforated compartment seal
JPH0546568A (en) 1991-08-08 1993-02-26 Internatl Business Mach Corp <Ibm> Dispersion application execution device and method
US5487145A (en) 1993-07-09 1996-01-23 Taligent, Inc. Method and apparatus for compositing display items which minimizes locked drawing areas
US5682550A (en) * 1995-06-07 1997-10-28 International Business Machines Corporation System for restricting user access to default work area settings upon restoration from nonvolatile memory where the settings are independent of the restored state information
US5708717A (en) 1995-11-29 1998-01-13 Alasia; Alfred Digital anti-counterfeiting software method and apparatus
IL116804A (en) 1996-01-17 1998-12-06 R N S Remote Networking Soluti Application user interface redirector
GB2312119B (en) 1996-04-12 2000-04-05 Lightworks Editing Systems Ltd Digital video effects apparatus and method therefor
US5870088A (en) * 1996-05-09 1999-02-09 National Instruments Corporation System and method for editing a control via direct graphical user interaction
US5751283A (en) * 1996-07-17 1998-05-12 Microsoft Corporation Resizing a window and an object on a display screen
US5923328A (en) * 1996-08-07 1999-07-13 Microsoft Corporation Method and system for displaying a hierarchical sub-tree by selection of a user interface element in a sub-tree bar control
US6061695A (en) 1996-12-06 2000-05-09 Microsoft Corporation Operating system shell having a windowing graphical user interface with a desktop displayed as a hypertext multimedia document
WO1998033151A1 (en) * 1997-01-24 1998-07-30 Sony Corporation Device, method, and medium for generating graphic data
JP3361951B2 (en) 1997-02-25 2003-01-07 大日本スクリーン製造株式会社 Print data processing apparatus and method
US6208347B1 (en) * 1997-06-23 2001-03-27 Real-Time Geometry Corporation System and method for computer modeling of 3D objects and 2D images by mesh constructions that incorporate non-spatial data such as color or texture
US6486886B1 (en) 1997-07-15 2002-11-26 Silverbrook Research Pty Ltd Bump map compositing for simulated digital painting effects
JP3350923B2 (en) 1998-01-06 2002-11-25 横河電機株式会社 Instrument diagram display device and recording medium recording instrument diagram display program
US6870546B1 (en) * 1998-06-01 2005-03-22 Autodesk, Inc. Protectable expressions in objects having authorable behaviors and appearances
AU2001238321A1 (en) 2000-02-16 2001-08-27 Goamerica, Inc. Document creation and scheduling of applications' jobs
AU2001276583A1 (en) 2000-07-31 2002-02-13 Hypnotizer Method and system for receiving interactive dynamic overlays through a data stream and displaying it over a video content
WO2002015128A1 (en) 2000-08-18 2002-02-21 Amcor Limited System for creating an artwork
US7051288B2 (en) 2001-02-15 2006-05-23 International Business Machines Corporation Method, system, and product for a java-based desktop to provide window manager services on UNIX
JP3790126B2 (en) 2001-05-30 2006-06-28 株式会社東芝 Spatiotemporal domain information processing method and spatiotemporal domain information processing system
US7181699B2 (en) 2001-06-13 2007-02-20 Microsoft Corporation Dynamic resizing of dialogs
US7047500B2 (en) 2001-11-16 2006-05-16 Koninklijke Philips Electronics N.V. Dynamically configurable virtual window manager
US6816159B2 (en) 2001-12-10 2004-11-09 Christine M. Solazzi Incorporating a personalized wireframe image in a computer software application
US7028266B2 (en) 2002-04-05 2006-04-11 Microsoft Corporation Processing occluded windows during application sharing
US6980209B1 (en) 2002-06-14 2005-12-27 Nvidia Corporation Method and system for scalable, dataflow-based, programmable processing of graphics data
US7342589B2 (en) 2003-09-25 2008-03-11 Rockwell Automation Technologies, Inc. System and method for managing graphical data
US7839419B2 (en) 2003-10-23 2010-11-23 Microsoft Corporation Compositing desktop window manager
US20050275661A1 (en) 2004-06-10 2005-12-15 Cihula Joseph F Displaying a trusted user interface using background images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP1682964A4 *

Also Published As

Publication number Publication date
AU2009217377B2 (en) 2011-04-28
CN101288104B (en) 2012-08-01
RU2005115848A (en) 2006-01-20
KR20060105421A (en) 2006-10-11
CN101288104A (en) 2008-10-15
JP2010267274A (en) 2010-11-25
ZA200503149B (en) 2006-11-29
JP4819175B2 (en) 2011-11-24
RU2377663C2 (en) 2009-12-27
EP1682964A4 (en) 2009-01-28
JP2007522535A (en) 2007-08-09
AU2004279204B2 (en) 2009-06-18
AU2004279204A8 (en) 2008-10-02
MY142719A (en) 2010-12-31
BRPI0406387A (en) 2005-08-09
JP4808158B2 (en) 2011-11-02
AU2004279204B8 (en) 2009-11-19
WO2005045558A3 (en) 2008-06-05
KR101086570B1 (en) 2011-11-23
US20050088452A1 (en) 2005-04-28
US7817163B2 (en) 2010-10-19
CA2501671A1 (en) 2005-04-23
TWI374385B (en) 2012-10-11
EP1682964A2 (en) 2006-07-26
MXPA05007169A (en) 2005-08-26
AU2009217377A1 (en) 2009-10-08
TW200519724A (en) 2005-06-16
AU2004279204A1 (en) 2005-06-23

Similar Documents

Publication Publication Date Title
US7817163B2 (en) Dynamic window anatomy
US7839419B2 (en) Compositing desktop window manager
US6931601B2 (en) Noisy operating system user interface
EP3111318B1 (en) Cross-platform rendering engine
US20060107229A1 (en) Work area transform in a graphical user interface
US8432396B2 (en) Reflections in a multidimensional user interface environment
NO326851B1 (en) Systems and methods for providing controllable texture sampling

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 1478/DELNP/2005; Country of ref document: IN)
WWE Wipo information: entry into national phase (Ref document number: 2501671; Country of ref document: CA)
WWE Wipo information: entry into national phase (Ref document number: 2006536548, Country of ref document: JP; Ref document number: 200503149, Country of ref document: ZA)
WWE Wipo information: entry into national phase (Ref document number: 1020057007426; Country of ref document: KR)
WWE Wipo information: entry into national phase (Ref document number: 2004776616; Country of ref document: EP)
AK Designated states (Kind code of ref document: A2): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW
AL Designated countries for regional patents (Kind code of ref document: A2): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
WWE Wipo information: entry into national phase (Ref document number: 2004279204, Country of ref document: AU; Ref document number: 20048013383, Country of ref document: CN)
ENP Entry into the national phase (Ref document number: 2005115848; Country of ref document: RU; Kind code of ref document: A)
WWP Wipo information: published in national office (Ref document number: 2004279204; Country of ref document: AU)
WWE Wipo information: entry into national phase (Ref document number: PA/A/2005/007169; Country of ref document: MX)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase (Ref country code: DE)
WWP Wipo information: published in national office (Ref document number: 2004776616; Country of ref document: EP)