WO2002021451A1 - Method and system for simultaneous creation and use of computer programs in virtual reality


Info

Publication number
WO2002021451A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
environment
programs
scene graph
program
Prior art date
Application number
PCT/US2001/027630
Other languages
English (en)
Inventor
Ross Barna
Ryan Tecco
Original Assignee
Neochi Llc
Priority date
Filing date
Publication date
Application filed by Neochi Llc filed Critical Neochi Llc
Priority to AU2001288811A1
Publication of WO2002021451A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/54 Interprogram communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality

Definitions

  • the present invention generally relates to virtual reality systems, and more specifically relates to a method and system for simultaneously creating and using multiple distributed virtual reality programs.
  • Virtual reality refers to the presentation of a three-dimensional artificial environment that may be perceived as reality by the user.
  • a user may interact with and be projected into the virtual environment with the implementation of devices that allow the system to receive signals from the user.
  • Effective virtual reality immerses the user in computer generated sensory data which may include audio, visual, and tactile data.
  • Visual data is delivered to the user on various devices such as a projector screen, monitor, head mount display, retinal projection, or special goggles.
  • Display devices may be immersive or non-immersive.
  • An immersive device is one which uses separate images for the right and left eyes of the user and encompasses a significant portion of the user's field of vision.
  • An example of such a device is the CAVE (Computer Automatic Virtual Environment).
  • the CAVE is an elaborate virtual reality system that projects images around the user (e.g., on the walls, floor and/or ceiling) not merely on a monitor.
  • the CAVE produces separate images for the left and right eyes, resulting in a stereoscopic effect which produces an illusion of depth. In addition, the CAVE allows multiple users to experience the virtual reality simultaneously.
  • Non-immersive display can be accomplished using a CRT display or LCD projector. These devices do not simulate stereo vision and they cover only a small portion of the user's vision.
  • Input devices may be conventional devices, such as keyboard and mouse, or specialized devices, such as data glove, eye-motion detector, or voice recognition devices.
  • Virtual reality programs operate on the various input and output devices to generate the images and other sensory data perceived by the user. Manipulating the images that create the virtual reality environment, for example, requires sophisticated algorithms and computer programming technology. Virtual reality programs involve significant graphics manipulation, most of which is not uniquely specific to the program. Consistent with modern programming techniques, much of the graphics processing is performed by graphics libraries. Many of the standard graphics libraries, and hence VR programs, implement scene graphs.
  • Scene graphs are directed acyclic graph data structures for representing multi-dimensional aspects of the scene, i.e., the visual presentation of the VR environment.
  • the information about aspects of the scene such as shape, transformation (location in space) and properties take the form of nodes attached to each other in a deliberate manner as to constitute a graph.
  • the links connecting the nodes establish relationships between the aspects of the presentation that the nodes represent.
  • a virtual reality program that simulates the motion of a car may generate a scene graph containing numerous nodes branching out of a root node, where the individual nodes represent parts of the car, such as the wheels, doors, windows, mirrors, steering wheel, signals, lights, brake pedal, and acceleration pedal etc.
  • the nodes are connected so as to establish that if the transformation (position) of the wheels moves, the rest of the car also moves in corresponding fashion.
  • the connections between the nodes also establish that if one of the doors moves (opens), the wheels do not necessarily move.
  • Rendering entails traversing a scene graph to determine information corresponding to the shapes defined in the graph and the associated properties, and generating display signals in accordance with the information.
  • the program traverses, in a particular order, the scene graph containing the nodes describing aspects of the car, rendering each aspect in the correct position relative to the other aspects/nodes, thereby creating the presentation of a car.
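  • For illustration only (this sketch is not the patent's code, and every name in it is hypothetical), the scene graph and traversal described above can be modeled in C++ as nodes holding child links, rendered by a depth-first walk:

```cpp
// Illustrative sketch (not the patent's code): a minimal scene graph with
// transform and geometry nodes and a depth-first render traversal.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

struct Node {
    std::string name;
    std::vector<std::shared_ptr<Node>> children;  // directed, acyclic links
    virtual ~Node() = default;
};

struct Transform : Node { float tx = 0, ty = 0, tz = 0; };  // position in space
struct Geometry  : Node {};                                 // shape data lives here

// Rendering traverses the graph in a fixed order, presenting each aspect
// relative to its parent (cf. the car example above).
void traverse(const std::shared_ptr<Node>& n, int depth = 0) {
    std::cout << std::string(depth * 2, ' ') << n->name << '\n';
    for (const auto& child : n->children) traverse(child, depth + 1);
}

int main() {
    auto car   = std::make_shared<Transform>(); car->name   = "car";
    auto wheel = std::make_shared<Geometry>();  wheel->name = "wheel";
    auto door  = std::make_shared<Geometry>();  door->name  = "door";
    car->children = {wheel, door};  // moving "car" moves wheels and doors too
    traverse(car);
}
```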
  • OpenGL (www.opengl.org: the contents of which are incorporated herein by reference) is a library of functions that can perform basic graphics tasks such as drawing, shading, transforming, lighting, texturing, and projecting. It also includes advanced features such as mipmaps, antialiasing, projected textures and platform-specific extensions. OpenGL is hardware accelerated on certain platforms and can be used very effectively in virtual reality while still being applicable to the high quality work of the film industry. OpenGL does not support any hierarchical scene graph or multiprocessing model. OpenGL is written in C.
  • IRIS Performer (www.sgi.com/software/performer: the contents of which are incorporated herein by reference) is built directly on top of OpenGL and is considered the standard tool of the high-end visualization and simulation industry. Users range from military and government to film post-production studios and TV stations. Performer adds a hierarchical scene graph to OpenGL that allows programmers a more intuitive and efficient way of managing objects and the transformations applied to them. Performer also adds a multiprocess pipeline model to OpenGL. This pipeline significantly improves performance in rendering and database access on multiprocessor platforms, and also allows for pipelining and parallelization of the tasks that normally occur in OpenGL applications. As a result, multiprocessor computers will run Performer applications much faster than uniprocessor machines.
  • Performer is supported on LINUX and IRIX operating systems and can handle a larger bandwidth of data than OpenGL alone. Performer also supports the special hardware of SGI workstations, which are the state of the art in the field (see www.sgi.com/products).
  • Multigen Vega (www.multigen.com/products/vega1.htm: the contents of which are incorporated herein by reference) is the military's tool of choice for creating war simulations. It is essentially identical to Performer but has extensions that simulate special effects, load special terrain databases and support various simulation-specific needs. Vega is supported on IRIX.
  • World Tool Kit (www.sense8.com/products/wtk.html: the contents of which are incorporated herein by reference) is a clone of IRIS Performer that operates on Windows NT and IRIX. It also includes a client/server tool that allows users on different computers to all use the same applications.
  • DIVE (Distributed Interactive Virtual Environments)
  • DEVA (www.aig.cs.man.ac.uk/systems/Deva: the contents of which are incorporated herein by reference) is geared toward developing intelligent techniques of describing behavior and mitigating metaphysical differences between these behaviors. DEVA also addresses the management of multiple users at distributed locations. DEVA is based on top of MAVERIK, an OpenGL rendering system.
  • Some systems allow each user to affect the virtual environment simultaneously, but do not allow a single user to execute multiple, independent programs. This is because these systems are designed specifically for multi-user environments without considerations for multi-program design.
  • the implementation of such systems is concerned mainly with quick and efficient updating of shared or distributed databases.
  • the present invention is a system and method for creating and using virtual reality (VR) computer programs.
  • the invention allows for the simultaneous display of multiple independent VR programs by managing the VR display and other sensory output devices on which these programs operate.
  • the system includes the capacity to display programs that are running on any machine connected to a network, e.g., LAN and/or Internet, or on the machine running the VR display device (or devices).
  • the system operates the graphics subsystem that creates images of the virtual environment, and services and manages the programs that operate with the system.
  • the system maintains a central mechanism (Construct) for processing the presentation of one or more application VR programs operating concurrently.
  • the system acts as an interface between the various applications and the output device or devices that comprise the VR presentation to the user.
  • the applications may be interactive or self-contained; may be operating locally or remotely over a network; and may be written in any language.
  • the applications are limited only by the imagination of the programmers, provided the programs conform to the system API (application program interface).
  • Each program operates as if it were an independent program, where instructions affecting the presentation (VR environment) are processed by the central mechanism.
  • the system combines current graphics systems with a distributed object system.
  • the system provides an API supported by at least one graphics library and uses a scene graph schema for managing the data comprising the presentation of the VR environment.
  • Upon receipt of instructions affecting the presentation, the system updates the scene graph accordingly and realizes the change, typically by updating the display, though this naturally extends to other output media.
  • the system maintains the scene graph using distributed objects and system identifiers for each node and provides the system identifiers to the application programs as needed.
  • the applications use the system identifiers provided by the system in their instructions relating to the VR environment.
  • Figure 1 is a block diagram of the preferred embodiment of the present invention
  • Figure 2 is a block diagram of a Construct in accordance with the preferred embodiment
  • Figure 3 is an illustration of an application program in accordance with the preferred embodiment
  • Figure 4 is a flow chart showing a method of processing an application program in accordance with the preferred embodiment
  • Figure 5 is a flow chart showing a method of processing by the Space Manager in accordance with the preferred embodiment
  • Figure 6 is a block diagram of an Implementation portion of the Construct in accordance with the preferred embodiment
  • Figure 7 is an illustration of interprocess communication in accordance with the preferred embodiment
  • Figure 8 is an illustration of broadcast interprocess communication in accordance with the preferred embodiment
  • Figure 9 is a block diagram of the hierarchy among types of objects in accordance with the preferred embodiment.
  • Figure 10 is a block diagram of a scene graph data structure in accordance with the preferred embodiment.
  • a system enables users to participate in a virtual reality (VR) environment generated and manipulated by one or more independent VR application programs.
  • the system provides a primary runtime environment called the Construct
  • the Construct is the platform in which the virtual reality presentation is generated.
  • the user operating the virtual reality session starts the Construct.
  • the user may implement one or more application programs to participate in the same virtual reality session.
  • the system facilitates the use of VR application programs that operate on the environment, various output devices that project the environment perceived by the user(s) and optionally various input devices that determine each user's attention and movements.
  • the system also provides tools for programmers creating VR applications. Such tools include a framework for creating specialized space management programs, application program interface (API) and other developmental libraries.
  • the Construct is the central program that receives the multiple and contemporaneous inputs/outputs (influences) in the VR environment from application programs. Influences on the environment include requests for functionality from application programs operating on the environment.
  • the system provides the capability for displaying multiple applications concurrently, sharing resources and facilitating cooperation between the applications. This allows the user to move between applications without closing them down.
  • any aspect of the virtual reality session experienced by the user may be shared by the applications.
  • Such interoperability is facilitated by the distributed nature of the underlying mechanics of these VR programs. Due to the distributed nature of the mechanics, the usage, implementation and interface of the programs are loosely coupled and can be located on different machines. Such distribution aids in gaining scalability, modularity and extensibility.
  • a scene graph is a directed acyclical graph data structure that represents a hierarchy of elements (termed nodes) that can be delivered to a rendering system and turned into an image on the appropriate device, e.g., immersive display.
  • Scene graphs aid application programmers in thinking about the scenes that they build and manipulate.
  • the format of the scene graph is conducive to efficient processing by computer graphics systems.
  • the system supports basic scene graph libraries available in current graphics systems such as IRIS Performer, JAVA3D, SSG, WTK, and VRML. The use of different scene graph libraries allows the various graphics and scene graph systems to be interchanged without affecting the functionality or requiring recompilation of the Construct.
  • Each application program may be designed to generate and manipulate its own scene graph, but in operation all functions affecting the scene graph are executed at the Construct's scene graph.
  • the system provides a uniform API for managing functions affecting the scene graph.
  • the API provides a common communication format between application programs and the system.
  • Application VR programs may be designed independently of the system, then written in compliance with the API and operate seamlessly with the system.
  • the system uses shared graphics libraries. The API, scene graph and graphics libraries are discussed below.
  • the Construct is implemented in an object-oriented programming language. As is known in the field, the Construct may be implemented in other languages and adapted for other computer platforms. According to the preferred embodiment, the Construct uses objects to generate and facilitate the VR experience.
  • an object is a collection of data and functionality that are conceptually related. For example, a typical program has an object to manage file operations and such object may be called "file object".
  • the nodes that comprise the scene graph are objects.
  • the scene graph is the data structure used for storing and managing the VR environment as it is to be perceived by the user. Hence, the nodes are the building blocks that comprise the scene graph and thereby the VR environment.
  • If the VR environment is to include, for example, a ball, the properties, features, and functionality of that ball are collected and managed in one or more nodes.
  • the node (or nodes) may be said to represent the ball. If the VR environment is also to have a cat, the cat is represented by another node (or group of nodes).
  • the Construct and application programs generate and manipulate various objects, including nodes, as each proceeds to operate in a VR session. Examples of some of the objects typically used with this system are set forth below.
  • a space manager is used to manage the presentation of a VR session produced by the operation of multiple applications.
  • the space manager is a program that controls the allocation of space within the VR environment and updates the environment to reflect changes requested by application programs or caused by the user.
  • the Construct alerts the Space Manager and adjusts parameters of the environment accordingly. Without a space manager, the Construct would otherwise execute the directions of each application program individually, without any regard for the presentations of the other applications functioning contemporaneously.
  • each application may be allocated a distinct space in which to operate its VR presentation.
  • the space manager may be designed with varying levels of complexity. For example, a space manager may provide each application with a distinct origin in space, but the presentation of each application is not confined to a particular sub-space. Alternatively, the space manager may allow the user to designate an origin for each application. The space manager may also allow users to modify parameters of the management or shrink programs at the user's command. Naturally, to operate effectively, there can be only one operating space manager associated with any given Construct at any given time.
  • the system for generating the Construct may be a distributed system implemented using multiple computers.
  • the various application programs may be implemented on different computers networked to the computer implementing the Construct.
  • portions of the system, e.g., the space manager, may likewise reside on any machine in the network.
  • the Construct 114 and Space Manager 118 may be implemented on a computer 110, while application programs 116 that use the Construct may reside on the same computer 110 or another computer, e.g., computer 112.
  • the Space Manager may be located on any machine, e.g. computer 110 or computer 112.
  • the application programs communicate with the Construct 114 via a communication network 100, such as a Local Area Network, Wide Area Network or the Internet.
  • the display device(s) 120 are situated and connected to the computer 124 at the user's location.
  • the input device(s) 122 are also connected to the user's computer 124.
  • the Construct 114 uses graphics libraries to affect the display of the scene graph manipulated by the Space Manager and application programs. Specifically, if an application program wants to change the appearance of individual graphic elements in the environment, it requests or instructs the Construct 114 to do so.
  • the application programs themselves are independent of any specific graphics library.
  • the application programs use the API to communicate with the Construct.
  • the Construct 114 forwards the request to the Space Manager 118 by placing a notification on a queue accessible by the Space Manager 118.
  • the Space Manager 118 fulfills the request by updating the scene graph. This typically involves calling one or more functions in the applicable graphics library.
  • the Display component of the Construct proceeds to render the scene graph, effecting an updated presentation by the display or other output devices.
  • the user starts the system and a blank environment appears. This environment is the visible manifestation of the Construct.
  • the display, communication and associated management utilities are started.
  • application programs on the local machine, or on any machine connected to the network may manipulate the scene graph which in turn affects the display.
  • Application programs are executed in a conventional manner. For instance, a user may run an application program using a command interpreter or from any other program.
  • the Space Manager places the application program in the virtual environment and once placed, the user may interact and experience the program through the immersive display.
  • an application program establishes communication with the Construct using the Common Object Request Broker Architecture (CORBA).
  • CORBA is a standard set by the Object Management Group for communications between distributed objects. By using this standard, application programs may be designed independently and yet interface with the Construct without customization. Each application program creates its own variables and calls functions that manipulate the runtime environment. To the programmer, a geometric object that they are manipulating may appear to exist in the program, but instead they are manipulating a representation of that geometric object that actually resides in the Construct runtime environment. This analogy is similar to abstractions in modern operating systems where programmers believe they have access to a device but in reality they are just manipulating an abstraction of that device. CORBA allows this analogy to go beyond actions on a single computer to allow programmers to manipulate local objects that actually reside remotely. Various implementations of CORBA exist in the computer industry and are written for various platforms and languages. The API utilizes an appropriate implementation of CORBA to communicate with the Construct, which also uses an appropriate implementation of CORBA.
  • For example, if the Construct is running on a UNIX-based machine and uses an implementation of CORBA written in C, and a programmer wants to write an application program in Java on a Windows machine, the development libraries for Java use the Java-specific implementation of CORBA and interface with the Construct effectively.
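  • The remote-object abstraction described above might be sketched as follows (a hedged illustration of the proxy/stub idea, not an actual ORB API; all names are hypothetical):

```cpp
// Hedged sketch of the proxy idea behind CORBA (not an actual ORB API):
// the application holds a local stub; calls are forwarded to the object's
// real home in the Construct. All names here are hypothetical.
#include <cstdint>
#include <iostream>

using ObjectId = std::uint64_t;   // system identifier assigned remotely

// What the application sees: a local object with ordinary methods.
class TransformProxy {
public:
    explicit TransformProxy(ObjectId id) : id_(id) {}
    void addChild(ObjectId child) {
        // A real ORB would marshal the call and send it over the wire;
        // here we only show the shape of the interaction.
        sendRequest(id_, "addChild", child);
    }
private:
    static void sendRequest(ObjectId target, const char* op, ObjectId arg) {
        std::cout << "request -> object " << target << ": "
                  << op << "(" << arg << ")\n";
    }
    ObjectId id_;
};

int main() {
    TransformProxy t(42);   // stub for a Transform living in the Construct
    t.addChild(43);         // feels local; executes remotely
}
```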
  • the development libraries are used to write application programs for the system and define a set of graphics functionality that are carried out by the runtime environment resulting in the images sent to the user.
  • One standard and important developmental tool is the Application Program Interface (API).
  • Programmers writing software for use with the run-time system are required to use the system's API.
  • This API contains all of the functionality needed to communicate between the program and the system, create and modify elements of a scene graph, and communicate with other application programs.
  • the system API abstracts graphics procedures from implementation-specific libraries such as OpenGL, Performer, World Tool Kit and others. The procedure formats are generic across libraries, without changing the interface to the programmer. All system API classes derive from objects which are the root of all distributed functionality.
  • the inheritance hierarchy of the API can be seen in Figure 9 showing how the different classes (objects in runtime) are related to one another in terms of function and data inheritance.
  • the API may be supported under C, C++, Java, Perl and several other languages, and may be ported to new languages by implementing a CORBA object request broker for the target language.
  • the framework includes documentation and specifications for writing a space manager. It may include a code skeleton which lays out the basic functionality requirements for space management. Any manager that fulfills the basic functionality requirements may be implemented with the present system.
  • the Construct is the conceptual center of the system, and is responsible for the management and display of application programs using the system.
  • the Construct is responsible for displaying the visual state of the runtime environment.
  • Application programs written with the aid of and in accordance with the API for this system connect to the Construct which manages their display to the user.
  • the Construct is run on the machine that controls the display device (or output devices).
  • the applications may be distributed across different machines on the network. In this way, the system achieves separation between the displaying machine and the machines on which application programs run.
  • the Construct may be implemented in C++ and use the Common Object Request Broker Architecture (CORBA).
  • Protocol standards may be used, such as Remote Procedure Calls, Message Passing Interface, DCOM (Microsoft's version of CORBA), and SOAP (Microsoft standard for XML RPC). Therefore, programs operating in the system only need conform to the system specifications for communication, leaving the system responsible for interfacing with the specific graphics libraries (e.g., Performer, WTK, and OpenGL).
  • the Construct is an interface between the applications and the display devices.
  • the Construct 200 includes several functions which are conceptually divided into components.
  • the Map Service 212 is the component that facilitates the translation between its own identifiers and memory locations, and those from application programs.
  • Implementation module 214, which may be a sub-component of Map Service, provides graphics functionality to the generic CORBA interfaces.
  • Display module 216, which may also be a sub-component of Map Service, maintains the scene graph and facilitates rendition on the output devices, e.g., immersive display device 230.
  • the Construct may contain display hardware drivers 218 to take the visualization information from the Display module 216 and render it on the display devices 230.
  • the Implementation and Display may be structured as components of the Construct without the intermediary Map Service component or with the Map Service as a third component.
  • the Construct begins with an initialization process involving test communication between various components within the system, including Map Service and CORBA utilities. If any essential component is missing, the system may report the problem and shut down. Upon completion of the initialization process, the Construct is in a state ready to accept application programs.
  • the Construct generates a basic two-node scene graph (224 and 226).
  • application programs generate and manipulate nodes 222, and each of these nodes is represented in the Implementation
  • the Display module 216 maintains the relationships between and among the nodes, and hence the scene graph. Nodes 222 in the Implementation are added to the scene graph as additional nodes 228. Though not specifically shown, it should be appreciated that applications generate and use objects other than nodes which are also represented and supported by functionality at the Implementation component, similar to nodes 222 but not necessarily represented at the Display component. For example, a file object, enabling functionality involving file management, is an object that is not a node and is not realized in the presentation. See figure 9 and accompanying text for additional examples. An application program may be run from any CORBA compliant application attached to the Construct directly or via a network.
  • when an application 300 begins processing in its local address space, it creates a client 310 to interface with the Construct.
  • the client establishes communication with the Construct and requests information regarding the various components of the Construct. This information is stored by the client for future use and the application program is ready to operate transparently with the system.
  • the application program 300 creates and manipulates objects 312 representing various elements of the VR environment.
  • When an application program creates an object, the Map Service 212 must be informed in order to keep track of all the objects representing elements in the VR environment.
  • the API automatically interfaces with the Map Service without explicit directions from the application programmer.
  • the Map Service registers the new object and returns a system ID number for the newly created object to the application.
  • the application program interacts with the Implementation 214 via the API which in turn affects the Display 216 which controls the immersive display device sending images to the user.
  • the client 310 created by the application program is associated with the other objects created by that application, and contains shared information that may be required by the other objects.
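  • The client-creation and registration flow above might look like the following sketch (hypothetical names; in the actual system the client/Construct link is carried over CORBA):

```cpp
// Illustrative sketch of the registration flow described above.
#include <cstdint>
#include <iostream>
#include <memory>
#include <unordered_map>

using ObjectId = std::uint64_t;

// Construct side: the Map Service tracks every object in the environment.
class MapService {
public:
    ObjectId registerObject(std::shared_ptr<void> impl) {
        ObjectId id = next_++;
        table_[id] = std::move(impl);   // system ID -> Construct memory
        return id;
    }
private:
    ObjectId next_ = 1;
    std::unordered_map<ObjectId, std::shared_ptr<void>> table_;
};

// Application side: the client caches Construct information and is shared
// by every object the application creates.
class Client {
public:
    explicit Client(MapService& ms) : ms_(ms) {}   // "connect" to the Construct
    ObjectId createObject() {
        return ms_.registerObject(std::make_shared<int>(0));  // placeholder impl
    }
private:
    MapService& ms_;
};

int main() {
    MapService construct;          // stands in for the Construct's Map Service
    Client client(construct);      // step: the application creates its client
    ObjectId node = client.createObject();   // Map Service returns a system ID
    std::cout << "application now refers to the node as ID " << node << '\n';
}
```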
  • a VR session begins at step 400, with the initialization of the Construct which generates a blank scene graph. Substance is added to the blank scene graph by the operation of application programs.
  • an application program When an application program starts, it creates a client object (step 420), and establishes communication with the Constmct, (step 422). Typically, steps 420 and 422 are performed once for each application program as it joins the VR session. While the session may involve a variety of VR functionality, the general process involves creation of a variety of objects, and the manipulation of those objects.
  • the application creates a scene object which is communicated to the Construct.
  • the scene object contains general information (and optionally functionality) about the application and its presentation that may be used by other objects or processes.
  • the Construct determines whether the application seeks to create an object or manipulate an object. If the application is creating an object, the Construct then determines at step 432 whether the object to be created is a scene object. If the object is a scene object, the process continues to step 440, where the application program creates a scene object that is mirrored at the Construct.
  • the application program provides the scene object with attributes that describe its presentation within the session. These attributes may indicate, for example, whether the presentation must be close to the user and whether it may intersect with the presentation from other programs.
  • at step 432, the scene object generates a Pseudo Root which is also communicated to the Construct.
  • the Pseudo Root is the root node of the scene graph from the perspective of the application. At the Construct, the application's scene graph is only a subgraph.
  • the Space Manager recognizes the Pseudo Root and at step 446 interprets its attributes. The Space Manager may determine whether translation (moving in space) or scaling (changing size) are required.
  • at step 448, the Space Manager attaches the Pseudo Root to the main scene graph, according to the attributes specified in the scene object.
  • the API informs the Construct that the application program seeks to create an object at step 440, and that it is not a scene object at step 442.
  • the application program then creates an object at step 450.
  • the application program provides the object with properties including its relationship within the scene.
  • the API informs the Construct to create the object at step 452 and, at step 454, the Map Service creates an object at Implementation.
  • the application program manipulates objects according to its design.
  • the API instructs the Construct that the application program seeks to manipulate an object, for example, a node of a scene graph.
  • the application program manipulates the object, performing some function, at step 460.
  • the request for functionality defined by the object is communicated to the Construct where, at step 464, the Implementation proceeds to realize the functionality, possibly referencing the appropriate graphics library.
  • the Map Service, which is a component of the Construct, is responsible for object creation and management. Objects are uniformly referenced by their system object identifier assigned by the Map Service. The Map Service also provides the mapping between the system object identifiers and the memory at the Construct. The functionality of the objects, though “controlled” or “executed” by the applications, is realized at the Construct, where the scene graph and graphics libraries are located. By using the system API, the applications are insulated from the specific details of implementation at the Construct. The Map Service receives the system object identifier from an application and proceeds to realize the functionality via Implementation.
  • the Implementation is composed of an interface which declares the expected functionality and the graphics-library-specific implementation which supplies the functionality to the interface using a specific graphics library. To change the graphics implementation that the system uses to generate images, the entire Implementation is replaced when the Construct is compiled.
  • each different kind of object 614 is defined by a set of functions 616, generically composed of responsibilities 622.
  • the responsibilities of the various objects are fulfilled by a graphics library 618.
  • the graphics libraries support the functionality defined for the objects and the library may be easily substituted with another graphics library.
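  • The interface/implementation split might be sketched as follows (hypothetical names; per the text, the actual backends would be libraries such as OpenGL, Performer or WTK, selected when the Construct is compiled):

```cpp
// Sketch of the Implementation split: the interface declares the expected
// functionality; a library-specific implementation fulfills it.
#include <iostream>

class RenderBackend {                 // the "Implementation" interface
public:
    virtual void drawTriangle() = 0;  // a responsibility every backend fulfills
    virtual ~RenderBackend() = default;
};

// One possible backend; another (e.g., a Performer- or WTK-based one)
// could be substituted without touching callers.
class StubGLBackend : public RenderBackend {
public:
    void drawTriangle() override { std::cout << "triangle drawn\n"; }
};

int main() {
    StubGLBackend backend;            // chosen at compile time
    RenderBackend& impl = backend;    // callers see only the interface
    impl.drawTriangle();
}
```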
  • objects are created using the previously set definitions and are hence supported. Many instances of the same kind of object may be created. Where there are, for example, two instances of the same kind of object 614, arrows point to the same set 616 to indicate that both objects have, by definition, the same functionality (responsibilities).
  • such created objects are called instantiated objects.
  • the Map Service assigns system object identifiers to each new instantiated object created by an application program.
  • an object is created, assigned a memory address, and the memory address is sent to the respective graphics library 618 to visually realize the addition of a new object.
  • although the application program knows the local memory address of the object, it does not know the system memory address at the Construct.
  • the application program uses the system object identifiers to reference the objects when interfacing with the Construct.
  • when the application calls/executes a responsibility, it provides the system object identifier to the Map Service which "translates" the identifier into a (system) memory address at the Construct.
  • the Map Service proceeds to relay the address and other information to the graphics library 618.
  • the information about the object being added begins as a memory address in an application program, is mapped to a system identifier and is then mapped to a memory address in the Construct.
  • a typical function/responsibility 622 of the Transform object 614 is adding a node to a scene graph, called addChild.
  • AddChild accepts as a parameter an identifier of the node to be added to the scene graph.
  • addChild is called with the identifier of the Text node. This may be denoted "Transform::addChild(Text)".
  • the Text node is assigned a system object identifier by the Map Service when the Text node is created.
  • the Transform object calls addChild using the system object identifier.
  • the Map Service then provides the memory address corresponding to the system identifier for the Text node.
  • addChild 622 references the graphics library 618 to realize the addition of the Text node identified by its memory address to the scene graph.
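  • A hedged sketch of the Transform::addChild(Text) walk-through above (hypothetical names): the application passes system identifiers, and the Map Service translates them to Construct memory before the graphics library is invoked:

```cpp
// Sketch of the ID-to-address translation described above.
#include <cstdint>
#include <iostream>
#include <stdexcept>
#include <string>
#include <unordered_map>
#include <vector>

using ObjectId = std::uint64_t;

struct GraphNode { std::string kind; std::vector<GraphNode*> children; };

// Stand-in for the graphics library call that realizes the addition.
void libraryAddChild(GraphNode* parent, GraphNode* child) {
    parent->children.push_back(child);
    std::cout << child->kind << " attached to " << parent->kind << '\n';
}

class MapService {
public:
    ObjectId put(GraphNode* n) { table_[++next_] = n; return next_; }
    GraphNode* resolve(ObjectId id) const {        // ID -> memory address
        auto it = table_.find(id);
        if (it == table_.end()) throw std::runtime_error("unknown ID");
        return it->second;
    }
    void addChild(ObjectId parent, ObjectId child) {
        libraryAddChild(resolve(parent), resolve(child));
    }
private:
    ObjectId next_ = 0;
    std::unordered_map<ObjectId, GraphNode*> table_;
};

int main() {
    MapService ms;
    GraphNode transform{"Transform", {}}, text{"Text", {}};
    ObjectId tId = ms.put(&transform);   // IDs assigned on creation
    ObjectId xId = ms.put(&text);
    ms.addChild(tId, xId);               // "Transform::addChild(Text)"
}
```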
  • the Display module provides an interface between the applications and the output devices, along with the libraries associated with those devices. This means that to support a new device or library, the Display is replaced with a suitable one. Application programs do not need to know what display device they will present their scene on at runtime. This allows one application to be written for a variety of devices instead of requiring a different application to be written every time the target display hardware changes. This flexibility is possible because of the Display interface, which can be implemented with a variety of different libraries. Hardware devices supported by the Display module include CAVEs, BOOMs, HMDs, a variety of tracking devices and flat-screen monitors. Many of the hardware implementations are provided by specific libraries such as VRCO's CAVElib and Ohio State's VRJuggler.
  • the Space Manager stores this knowledge and manipulates the scene graph accordingly.
  • An important characteristic of the Space Manager is that it is a nonessential part of the system that is implemented with the same tools (API) that are used to make user-level programs for the system. This allows the Space Manager to be removed, changed and restarted without affecting other parts of the system and allows it to exist on any computer capable of communicating with the system (Figure 1: 110, 112).
  • the Space Manager is also able to recognize a scene graph previously managed by a different space manager and internalize information about the state of that graph.
  • the Space Manager connects the received sub-graphs to the main graph through the root node or navigation node.
  • the root node represents the origin or center of the environment; (0, 0, 0) in an X, Y, Z Cartesian coordinate system.
  • the other nodes are called the navigation nodes. Anything attached to the root node appears to remain stationary and anything attached to a navigation node appears to move with the user's coordinate system. This is because instead of moving the user in the scene, the scene moves around the user.
  • interface mechanisms, such as a virtual menu for the user to manipulate parameters, and other GUI mechanisms are connected to the root node. Since they do not move, they remain accessible to the user regardless of the user's position in the environment.
  • typical elements of the VR environment are connected to a navigation node directly or through other nodes forming a path from the navigation node so that they move naturally as the user navigates the environment.
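  • The navigation-node behavior, i.e., moving the scene around the user rather than the user through the scene, might be sketched as follows (hypothetical names):

```cpp
// Minimal sketch of the navigation-node idea: the navigation node carries
// the inverse of the user's motion, so attached content appears to move
// past a fixed viewer.
#include <iostream>

struct Vec3 { float x = 0, y = 0, z = 0; };

struct NavigationNode {
    Vec3 offset;                         // applied to everything attached
    void onUserMoved(const Vec3& step) {
        // The scene is translated opposite to the user's step.
        offset.x -= step.x; offset.y -= step.y; offset.z -= step.z;
    }
};

int main() {
    NavigationNode nav;
    nav.onUserMoved({0, 0, 2});          // user walks 2 units forward
    std::cout << "scene shifted to z = " << nav.offset.z << '\n';  // -2
    // Root-node content (menus, GUI) receives no such offset and stays put.
}
```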
  • the Space Manager need not continually monitor the scene for changes that it will need to act on. Instead, the changes occurring in the application programs register notifications into a queue that the Space Manager monitors. The Space Manager uses this queue to refresh its representation of only the changed parts of the scene. This optimization allows the Space Manager more time to do its most important job, managing space.
  • the Space Manager maintains an internal model of the scene composed of boundary representations.
  • the boundary representations may be in the form of a sphere or box.
  • the Space Manager uses the boundary representations in its calculations for intersection and in its representation of occupied space. Then the Space Manager updates the scene graph to reflect the calculations.
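  • Intersection testing with sphere boundary representations, as the Space Manager might perform it, can be sketched as follows (hypothetical names):

```cpp
// Sketch of sphere-bound intersection testing.
#include <iostream>

struct SphereBound {
    float x, y, z;   // center
    float r;         // radius enclosing an application's presentation
};

// Two bounds intersect when the distance between centers is at most
// the sum of the radii (compared squared, to avoid a square root).
bool intersects(const SphereBound& a, const SphereBound& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float sum = a.r + b.r;
    return dx * dx + dy * dy + dz * dz <= sum * sum;
}

int main() {
    SphereBound app1{0, 0, 0, 2}, app2{3, 0, 0, 2};
    std::cout << (intersects(app1, app2) ? "overlap" : "clear") << '\n';
    // An "exclusive" program would be repositioned on overlap.
}
```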
  • the Space Manager is an independent program that may be used in conjunction with the Construct to enhance the VR presentation. When the Space Manager is executed, it must first determine whether there is a space manager currently or previously associated with the Construct. The Space Manager process begins with initialization. Referring to Figure 5, at step 502 the Space Manager queries the Construct to determine whether the scene / VR environment is being managed by another space manager. Since the Construct may use or associate with only one space manager at any given time, if there is another space manager in operation, the incoming Space Manager exits the system at step 503. Provided no other space manager is active, at step 504 the Construct sets up the Space Manager for use with the system, and at step 506 any space manager previously associated with the Construct is expired.
  • the Space Manager queries the Construct to determine whether the environment was previously managed and, if so, at step 507, the Space Manager assimilates the scene graph previously generated, generating appropriate internal representations. Once the initialization steps are completed, the Space Manager's general operations are driven by signals from the applications. At step 510, the Space Manager waits for a signal from the Construct or an application program. When a signal is received, the process continues to step 512, where the Space Manager determines what type of signal was received. If the signal indicates the creation of a new pseudo root, the process continues to step 520, where the Space Manager receives the new pseudo root and interprets its attributes (step 522).
  • the Space Manager positions the pseudo root within the central scene graph and, at step 526, the pseudo root is attached to the scene graph. If at step 512 the Space Manager determines that the received signal indicates changing a node, the process instead continues to step 530, where the Space Manager changes the node accordingly and, at step 532, recalculates the scene graph.
  • after step 526 or step 532, the process returns to step 510 to await another signal.
  • the Space Manager makes its management decisions based on attributes that programs choose. Attributes may be divided into groups, for example, intersection, locality and attachment.
  • the intersection attribute describes whether the program can intersect with other programs and may take the values of "exclusive” (preventing the intersection of programs) or “inclusive” (allowing intersection of programs).
  • the locality attribute describes approximately how dense a program's space usage is and may take the values of "environmental” (allowing the program to move throughout the scene graph) or "localized” (limiting the program to move within an area smaller than the entire scene graph).
  • the attachment attribute describes the location of the space in reference to other spaces and may take the values of "attached" (indicating a particular reference point), "detached" (indicating the absence of a reference point), "user_attached" (indicating reference with respect to the user), "x_aligned" (indicating reference with respect to the x-axis rather than a point), "y_aligned" (indicating reference with respect to the y-axis) and/or "z_aligned" (indicating reference with respect to the z-axis).
  • the default combination is
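  • The attribute groups above might be captured as plain enumerations, as in this hypothetical sketch (the text does not specify the default combination, so the defaults shown are illustrative only):

```cpp
// Sketch of the three attribute groups; values follow the text above.
enum class Intersection { Exclusive, Inclusive };
enum class Locality     { Environmental, Localized };
enum class Attachment   { Attached, Detached, UserAttached,
                          XAligned, YAligned, ZAligned };

struct SceneAttributes {          // carried by an application's scene object
    // Defaults here are illustrative only; the patent's default combination
    // is not specified in this text.
    Intersection intersection = Intersection::Exclusive;
    Locality     locality     = Locality::Localized;
    Attachment   attachment   = Attachment::Detached;
};

int main() {
    SceneAttributes a;            // the Space Manager reads these on placement
    (void)a;
}
```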
  • IPC may be used for one-to-one communication as well as one-to-many (also called broadcast) communication.
  • One-to-One IPC is implemented using properties, signals, events, and fat pipes. All of these are implemented within the confines of the CORBA run-time system. One-to-one IPC most often takes place with a confirmation that the communication took place, implemented as reliable TCP.
  • a client makes a request to a common interface ORB (object request broker) which directs the request to the appropriate server that contains the object.
  • Figure 7 illustrates four generic forms of interprocess communication that are supported by the system: Property, Signal, Event and Fat Pipe. These concepts are somewhat different in principle and implementation from the paradigms in modern operating systems that bear similar names.
  • each program generates a client object to handle general operations and processing.
  • the client objects of the running application programs send and receive messages among each other to achieve interprocess communication.
  • Properties are communications sent by one program 710 to describe itself to another program 712.
  • Signals are communications sent from one program 714 to another program 716 concerning system conditions or instructions to take an action.
  • Events are general communications between programs (718, 720), indicated with a bidirectional arrow.
  • Fat pipes are bidirectional communications used to transfer large files or set up data streams between programs (722,724).
  • Properties: A property is a distributed attribute that can be accessed by another process.
  • a property contains data that one program offers to other programs. Programs possess data values, which they then export by way of properties. Once a data value is exported, a change in that local value updates the property as well.
  • Properties are often simple types of data structures, e.g., integers, floating point numbers, but may be complex aggregate data structures as well, e.g. lists, tables.
  • An example of a property could be color and a value of that property could be yellow.
  • a property has an associated value and is accessible to any distributed application through the API.
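  • A property export might be sketched as follows (hypothetical names), using the color/yellow example from the text:

```cpp
// Sketch of a property as a distributed attribute: a local value is
// exported, and later local changes update the exported property that
// other programs read.
#include <iostream>
#include <string>
#include <unordered_map>

class PropertyTable {                       // what other programs query
public:
    void exportValue(const std::string& name, const std::string& value) {
        values_[name] = value;
    }
    const std::string& get(const std::string& name) { return values_[name]; }
private:
    std::unordered_map<std::string, std::string> values_;
};

int main() {
    PropertyTable props;
    props.exportValue("color", "yellow");   // the example from the text
    props.exportValue("color", "red");      // a local change updates the property
    std::cout << props.get("color") << '\n';
}
```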
  • Signals are used for system related information. Signals are used to transmit information and instructions about termination, relocation, execution, suspension and other process-level functionality. Signals have a higher priority than events (described below).
  • a signal is a message targeted to notify a process of a system condition. Received signals are interpreted by the receiving end, which calls the appropriate function according to the signal received. The system defines a standard set of signals and the user cannot define any additional signals.
  • Signals can be executed asynchronously. Signal operations that change shared data are expected to provide their own mutual exclusion to prevent data corruption. (Typically one process may not access the shared data while another process is about to change the data value.) Signals are push-based, meaning that an application can receive one without any warning. Applications with the proper permissions may generate signals. One process initiates the signal and the other process receives and performs the defined operation. Signals typically do not return a value. An event is a targeted, definable message that is sent by one program to another program or set of programs. Events have varying delivery types such as guaranteed, unreliable and "best-guess" transmission. Events may be used for non-system related message passing and are freely expandable and usable by applications.
  • An event is a message targeted to notify a process of a user-defined condition.
  • Application programmers may define or even standardize sets of events that their applications recognize and/or send.
  • the runtime system does not define events.
  • Received events are interpreted by the receiving end, which calls (executes) an appropriate function according to the information received in the event.
  • An event that cannot be interpreted by the receiving process results in a null operation - to avoid making both the application and runtime system unstable.
  • Events are used for non-critical, application message passing. Any application can define any arbitrary number of events. These events may be standardized across a set of applications such as "window" managers or may be transient for the lifetime of the application and published to a central authority. Events may also be assigned priorities. Thus a newly arrived event with a higher priority than all currently queued events will be executed first. The highest event priority is generally not higher than the lowest signal priority since, in general, signals have precedence over events. Events can be defined to ensure delivery and execution or they can be defined to make a best effort at delivery. Best-effort delivery is often useful for non-critical operations such as animation transform updates. Event execution may be deferred to execute an arriving signal because signals have a higher priority than events.
  • after the signal is processed, event execution may or may not continue.
  • the handler After matching an event ID with the associated routine, the handler checks to see if any signals have arrived in the queue. If a signal has arrived, the handler first removes the signal from the queue and processes the signal. If no signal has arrived, or if the intervening signal has non-fatal behavior with respect to the process, the event routine is executed.
  • Fat Pipes are stream-like connections that can be established between a program and the system or between two programs. These streams are used to transfer large amounts of raw data, such as files or network streams.
  • a fat pipe operates to transfer large blocks of data between different processes. The most common use for this mechanism is transferring 3D models between one location and another.
  • the fat pipe mechanism provides a disk cache management system for processes that wish to temporarily acquire a model for addition to the scene graph for the lifetime of the process.
  • the fat pipe implementation on application request, acquires the model, caches it, and purges it when either the use of the model is discontinued or when the application is terminated.
  • the fat pipe operates by formatting the binary data and then transferring it over a pre-defined CORBA channel.
  • the fat pipe also defines quench and suspension operations for use by algorithms managing the efficient flow of traffic on the network.
  • the fat pipe can be told to quench (drop the transfer rate) a large bulk transfer, or suspend the transfer entirely. This is useful when other, prioritized operations such as important signals and events must occur.
  • the fat pipe can either use TCP or reliable UDP to transmit data.
  • the fat pipe is implemented as a pull-based mechanism.
  • To use a fat pipe, a process must first negotiate the file transfer with another process. This involves requesting a file, communicating about its availability, transmitting the file, performing compression and checksumming and closing down the pipe. Fat pipes that are used to transmit more than two large data sets between the same processes are kept open until a time-out to eliminate the repeated cost and complexity in creating the connection.
  • a fat pipe "writes" its data to the disk cache object, which is then responsible for writing the data temporarily to disk or to a memory location.
  • the fat pipe is not concerned with where (disk or memory) the cache writes data.
  • Applications may choose to use in-memory databases and stores to improve performance. If the cache fills, the pipe receives a suspend operation until the cache can resolve the problem, either by a preset behavior or by notifying the user.
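  • A fat-pipe transfer loop with quench and suspend controls might be sketched as follows (hypothetical names; per the text, the real mechanism formats binary data and runs over a pre-defined CORBA channel):

```cpp
// Sketch of chunked bulk transfer so priority traffic can interleave.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

class FatPipe {
public:
    void quench()  { chunk_ = chunk_ / 2 ? chunk_ / 2 : 1; }  // drop the rate
    void suspend() { suspended_ = true; }                      // pause entirely
    void resume()  { suspended_ = false; }

    void send(const std::vector<char>& data) {
        for (std::size_t off = 0; off < data.size(); ) {
            if (suspended_) break;                 // e.g., cache full upstream
            std::size_t n = std::min(chunk_, data.size() - off);
            std::cout << "sent " << n << " bytes\n";
            off += n;
        }
    }
private:
    std::size_t chunk_ = 4096;
    bool suspended_ = false;
};

int main() {
    FatPipe pipe;
    pipe.quench();                                 // priority traffic pending
    pipe.send(std::vector<char>(10000));
}
```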
  • FIG 8 illustrates the broadcast interprocess communication (IPC) that is supported by the system. Broadcasting allows one program to send data to multiple other programs simultaneously and solves the problem of transmitting identical data to many targets. Broadcasts are critical system events that must reach all processes, or a subset thereof.
  • One-to-many IPC allows shared communication to all processes (Figure 8).
  • the one-to-many model requires features not included in CORBA, and therefore, this is implemented by way of a separate broadcast mechanism built on UDP (user datagram protocol).
  • Another problem with broadcast implementation is that it must be reliable. When the system is shut down and sends a termination broadcast to all processes, all processes must be guaranteed to receive it. To address this issue, a set of UDP network objects with simple reliability algorithms is used to accomplish the task of transmitting the broadcast.
  • a visualization of web-server activity in a CAVE may be implemented by writing a program for the web-server that parses server logs and uses that information to manipulate a scene graph.
  • the manipulated scene graph is a sub-graph of the system wide scene graph used by the Construct, which may be displaying on a CAVE, to render images for the user. This means that a change in web-server activity changes parsed information which, under control of a programmer, affects the display of the visualization in the CAVE.
  • a set of distributed programs such as the web-server visualization program discussed above, manipulates a scene graph using a defined set of functionality.
  • This set of functionality is called the Application Programming Interface, sometimes also referred to as development libraries.
  • An object 950 may represent any data-functionality combination.
  • objects such as file 960, bound 962, color 964, and font 968 have functionality relating to their names.
  • Nodes 970 are objects that may comprise the scene graph, with each object having some functionality in common (making it a node) and some unique according to its name.
  • transform 988, geometry 986, text 984, and light 982 are different types of nodes.
  • for a given element of the environment, the geometry node may represent its shape, the transform node may indicate changes in modeling the geometry, the text node may represent text to appear in the environment, and the light node may represent shading and color of the element.
  • Bound (962): A boundary representation (sphere or box) which is used in intersection testing, and visibility testing to speed processing. The Bound is used in the Space Manager.
  • Color (964): Color in RGBA (Red, Green, Blue, Alpha) format. This may be used to designate a single color or a list of colors.
  • File (960): Represents a file which may be used to store complex geometry information, for example, the geometry and color information for a geothermal data visualization.
  • Font (968): A text font which is used in conjunction with Text.
  • Geom (986): A Node which is a collection of Points, Lines, Triangles, Normals, Textures and/or VertexLists, which may or may not be indexed. This is the basic building block for creating visible material in the scene graph. Geoms can be created using pre-existing files or alternatively created in real time.
  • Light (982): A Node which represents a light source of type POINT, SPOT or OMNI.
  • the Light also has color and orientation. The default orientation is at the origin along the -Z axis.
  • Node (970): The basic building block of the scene graph. This object can have children and can be a child. Any object which can be part of the scene graph must be descended (in an object-oriented sense) from Node.
  • Text (984): A Node which when combined with a Font can produce text in the VR environment.
  • Texture (not shown): Image data in RGBA format which can be applied to Geom data.
  • Transform (988): A Node that can translate, scale, rotate and shear its child Nodes.
  • Triangle (not shown): A Node that is a triangle defined by a 3-member VertexList. Triangles can be used to build up complicated geometry and modify the geometry dynamically.
  • Vec3 (976): A vector in 3-D Cartesian space (X, Y, Z).
  • Figure 10 shows an example of an application program 900 generating a sub-graph that may be added to the environment. Building the scene graph may be accomplished with C++ code. The code is illustrated in the lower portion of Figure 10, while the scene graph generated by the code is presented in the upper portion of Figure 10.
  • the program begins by creating a Scene (R) 912 which represents the root of the application's sub-graph but is only a subtree in the system-wide scene graph. Then, on lines 2-3, the program creates various nodes 916 and 918 that are to be used.
  • the nodes created and used in this example are arbitrary; in practice, the creation of specific nodes is dictated by the particular program being designed.
  • the program adds nodes to the scene graph repeatedly using the addChild routine.
  • two transform nodes T1, T2 are added to the root node.
  • three transform nodes T3, T4, T5 are added to the second transform node T2.
  • a single geometry node is added with a link to each of the transform nodes T1-T5.
  • alternatively, five geometry nodes may be added, one geometry to each transform, or some mixture thereof.
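  • The figure's code listing is not reproduced in this text; a hedged reconstruction from the description above (the class names follow the object list, but the exact API is an assumption) might read:

```cpp
// Hedged reconstruction of the Figure 10 listing from the prose above.
#include <vector>

struct Node {
    std::vector<Node*> children;
    void addChild(Node& c) { children.push_back(&c); }
};
struct Scene : Node {};        // pseudo root "R"
struct Transform : Node {};
struct Geom : Node {};

int main() {
    Scene R;                                  // line 1: create the Scene (root)
    Transform T1, T2, T3, T4, T5;             // lines 2-3: nodes to be used
    Geom G;
    R.addChild(T1);  R.addChild(T2);          // two transforms under the root
    T2.addChild(T3); T2.addChild(T4); T2.addChild(T5);
    // A single geometry node linked beneath each transform: legal because
    // the scene graph is a directed acyclic graph, not a tree.
    T1.addChild(G); T2.addChild(G); T3.addChild(G);
    T4.addChild(G); T5.addChild(G);
}
```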
• The topology of the system is most accurately characterized as a star topology, with requests fanning inward from the programs to the Construct.
• This fanning-in topology is a design requirement that plays a pivotal role in determining the scalability and performance of the system.
• The scalability of the system is typically limited by the memory and processing power of the machine that runs the Construct.
• The system can scale in various dimensions, such as: the complexity of geometry in the system, the number of active programs, the frequency of function calls, and the frequency of communication between programs (IPC).
• The performance of the system is limited only by the processing power of the graphics subsystem and by the interconnect's bandwidth and latency.
• The graphics subsystem's performance affects the user's experience asynchronously from the rest of the system. For instance, if graphics performance is fast but the system is overburdened with requests, the user will still experience the environment smoothly; only changes to the environment will appear jerky.
• The interconnect between the application programs and the Construct (whether co-located or connected via Ethernet) determines the performance of the programs and the user's experience of them. Hence, the performance of the graphics subsystem and that of the interconnect are independent. (A thread sketch of this decoupling follows this list.)
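One way to picture this decoupling is a thread sketch in which program requests queue up for the Construct while the render loop proceeds at its own pace. Everything below (the names, the per-frame request budget, the timings) is an illustrative assumption, not the patent's implementation:

    #include <chrono>
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <thread>

    std::mutex mtx;
    std::queue<std::function<void()>> requests;   // updates fanning in from programs

    void submit(std::function<void()> update) {   // called by application programs
        std::lock_guard<std::mutex> lock(mtx);
        requests.push(std::move(update));
    }

    void renderLoop() {                           // the graphics subsystem's own rate
        for (int frame = 0; frame < 300; ++frame) {
            {
                std::lock_guard<std::mutex> lock(mtx);
                int budget = 64;                  // cap scene-graph work per frame so
                while (!requests.empty() && budget-- > 0) {  // rendering stays smooth
                    requests.front()();           // apply one queued update; a backlog
                    requests.pop();               // only delays *changes*, not frames
                }
            }
            // drawScene();                       // render whatever the graph now holds
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    }

    int main() {
        std::thread renderer(renderLoop);
        for (int i = 0; i < 10000; ++i)           // programs flooding in requests
            submit([] { /* mutate the scene graph here */ });
        renderer.join();
        return 0;
    }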
• The system works with a variety of virtual reality hardware.
• Immersive display devices are fed images by special-purpose graphics workstations. These workstations can generate realistic images at a frequency high enough that users perceive a coherent experience rather than a sequence of discrete images.
• The system is flexible and supports a wide range of virtual reality related hardware; it is not constrained to specific display technologies and also accommodates other devices, for example, motion trackers.
• A motion tracker most frequently uses an electromagnetic device that knows its location and orientation in space. These trackers transmit the location and orientation of the user's head to the program that is rendering the images for the immersive system. This information allows the user to move around within the virtual environment. (A sketch of this head-tracking path follows this list.)
• The system is also designed for use on non-immersive displays, such as current monitor technologies.
• The system works with a variety of virtual reality hardware because hardware support depends on the graphics library in use. For instance, if a specialized device requires a modified version of OpenGL, the system uses that version of OpenGL to support the hardware.
• A range of additional input and output technologies may be connected to the system:
• recognition systems such as IBM ViaVoice and Dragon NaturallySpeaking;
• gesture recognition using neural networks, where a user's gestures can be interpreted and transformed into functional actions or events;
• wireless tracking technologies involving stereo matching algorithms, providing a wireless solution to the problem of dangling wires in virtual environments;
• neurological electrical signal input devices, which interpret facial muscle movement and brain wave activity;
• distributed sensor data collection devices for seamless remote data collection and the integration of the collected data into applications;
• eye tracking devices, which monitor the movements of the eyes and leverage this capability to improve interface technologies;
• computer vision techniques, which generate volumetric models of the user and determine the location of various parts of the user's body;
• tactile feedback/haptics technologies, which generate force feedback and physical stimulation; and
• audio servers.
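As an illustration of the head-tracking path described above for immersive displays, the sketch below turns a tracker's reported pose into a simple view matrix for the renderer. The Pose fields, the yaw-only rotation, and the column-major 4x4 layout are assumptions made for brevity; real trackers and graphics libraries differ:

    #include <cmath>

    struct Pose {                    // what a head tracker might report
        float x, y, z;               // position of the user's head
        float yaw, pitch, roll;      // orientation, in radians
    };

    // Build a view matrix (yaw-only, for brevity): rotate the world opposite
    // the head's yaw, then translate it opposite the head's position, so the
    // user appears to move through the virtual environment.
    void poseToView(const Pose& p, float out[16]) {
        float c = std::cos(p.yaw), s = std::sin(p.yaw);
        float m[16] = {              // column-major, OpenGL-style, R_y(-yaw)
             c, 0,  s, 0,
             0, 1,  0, 0,
            -s, 0,  c, 0,
             0, 0,  0, 1
        };
        m[12] = s * p.z - c * p.x;   // translation column: -R_y(-yaw) * position
        m[13] = -p.y;
        m[14] = -(s * p.x + c * p.z);
        for (int i = 0; i < 16; ++i) out[i] = m[i];
    }

    int main() {
        Pose head{0, 1.7f, 2.0f, 0.5f, 0, 0};   // hypothetical tracker sample
        float view[16];
        poseToView(head, view);                 // feed 'view' to the renderer
        return 0;
    }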

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method and system for creating and using virtual reality computer programs (116), several of which may be presented simultaneously on the user's display (120). The programs may run on several separate computers connected by a network (100). A central program (110) supports and manages the program or programs operating with the system. A graphics subsystem (112) is used by the central program to create the images in the virtual environment. The graphics subsystem comprises graphics libraries compatible with the user's display. The central program also uses a scene graph structure that maintains the virtual environment while the program runs. An application program interface facilitates communication between the programs and the system.
PCT/US2001/027630 2000-09-07 2001-09-06 Procede et systeme de creation et utilisation simultanees de programmes d'ordinateurs en realite virtuelle WO2002021451A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001288811A AU2001288811A1 (en) 2000-09-07 2001-09-06 Method and system for simultaneously creating and using multiple virtual reality programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US65672600A 2000-09-07 2000-09-07
US09/656,726 2000-09-07

Publications (1)

Publication Number Publication Date
WO2002021451A1 true WO2002021451A1 (fr) 2002-03-14

Family

ID=24634297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/027630 WO2002021451A1 (fr) 2000-09-07 2001-09-06 Procede et systeme de creation et utilisation simultanees de programmes d'ordinateurs en realite virtuelle

Country Status (2)

Country Link
AU (1) AU2001288811A1 (fr)
WO (1) WO2002021451A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5625576A (en) * 1993-10-01 1997-04-29 Massachusetts Institute Of Technology Force reflecting haptic interface
US5734805A (en) * 1994-06-17 1998-03-31 International Business Machines Corporation Apparatus and method for controlling navigation in 3-D space
US5825363A (en) * 1996-05-24 1998-10-20 Microsoft Corporation Method and apparatus for determining visible surfaces
US5861885A (en) * 1993-03-23 1999-01-19 Silicon Graphics, Inc. Method and apparatus for indicating selected objects by spotlight
US6005548A (en) * 1996-08-14 1999-12-21 Latypov; Nurakhmed Nurislamovich Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods
US6054991A (en) * 1991-12-02 2000-04-25 Texas Instruments Incorporated Method of modeling player position and movement in a virtual reality system
US6057856A (en) * 1996-09-30 2000-05-02 Sony Corporation 3D virtual reality multi-user interaction with superimposed positional information display for each user
US6064389A (en) * 1997-05-27 2000-05-16 International Business Machines Corporation Distance dependent selective activation of three-dimensional objects in three-dimensional workspace interactive displays

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100382029C (zh) * 2003-10-20 2008-04-16 上海科技馆 用计算机及视频设备模拟捕鱼的方法及装置
CN1299244C (zh) * 2005-06-02 2007-02-07 中国科学院力学研究所 一种三维场景动态建模和实时仿真的系统及方法
US8615714B2 (en) 2007-01-22 2013-12-24 Textron Innovations Inc. System and method for performing multiple, simultaneous, independent simulations in a motion capture environment
US9013396B2 (en) 2007-01-22 2015-04-21 Textron Innovations Inc. System and method for controlling a virtual reality environment by an actor in the virtual reality environment
US8599194B2 (en) 2007-01-22 2013-12-03 Textron Innovations Inc. System and method for the interactive display of data in a motion capture environment
CN101853162A (zh) * 2010-06-01 2010-10-06 电子科技大学 一种可编辑的网页三维几何造型渲染方法
CN101853162B (zh) * 2010-06-01 2013-01-09 电子科技大学 一种可编辑的网页三维几何造型渲染方法
WO2013019162A1 (fr) * 2011-08-04 2013-02-07 Playware Studios Asia Pte Ltd Procédé et système d'hébergement de mondes virtuels transitoires qui peuvent être créés, hébergés et terminés à distance et automatiquement
AU2012290740B2 (en) * 2011-08-04 2017-03-30 Playware Studios Asia Pte Ltd Method and system for hosting transient virtual worlds that can be created, hosted and terminated remotely and automatically
CN103116576A (zh) * 2013-01-29 2013-05-22 安徽安泰新型包装材料有限公司 一种语音手势交互翻译装置及其控制方法
CN109887069A (zh) * 2013-04-19 2019-06-14 华为技术有限公司 在屏幕上显示3d场景图的方法
CN104980599A (zh) * 2015-06-17 2015-10-14 上海斐讯数据通信技术有限公司 一种手语语音通话方法及系统
CN107707726A (zh) * 2016-08-09 2018-02-16 深圳市鹏华联宇科技通讯有限公司 一种用于正常人与聋哑人通讯的终端和通话方法
US10416769B2 (en) 2017-02-14 2019-09-17 Microsoft Technology Licensing, Llc Physical haptic feedback system with spatial warping
US20210318998A1 (en) * 2020-04-10 2021-10-14 International Business Machines Corporation Dynamic schema based multitenancy
US11995047B2 (en) * 2020-04-10 2024-05-28 International Business Machines Corporation Dynamic schema based multitenancy

Also Published As

Publication number Publication date
AU2001288811A1 (en) 2002-03-22

Similar Documents

Publication Publication Date Title
US10062354B2 (en) System and methods for creating virtual environments
Doerr et al. CGLX: a scalable, high-performance visualization framework for networked display environments
US7676356B2 (en) System, method and data structure for simulated interaction with graphical objects
US20210090315A1 (en) Artificial reality system architecture for concurrent application execution and collaborative 3d scene rendering
US20120050300A1 (en) Architecture For Rendering Graphics On Output Devices Over Diverse Connections
US20160225188A1 (en) Virtual-reality presentation volume within which human participants freely move while experiencing a virtual environment
Febretti et al. Omegalib: A multi-view application framework for hybrid reality display environments
US20100289804A1 (en) System, mechanism, and apparatus for a customizable and extensible distributed rendering api
WO2002021451A1 (fr) Procede et systeme de creation et utilisation simultanees de programmes d'ordinateurs en realite virtuelle
Bierbaum et al. Software tools for virtual reality application development
Amselem A window on shared virtual environments
WO2017006223A1 (fr) Moteur graphique permettant de créer et d'exécuter des applications avec des interfaces multi-sensorielles
Snowdon et al. The aviary vr system: A prototype implementation
WO2002052410A1 (fr) Procede de manipulation d'un systeme distribue d'objets informatiques
Valkov et al. Viargo-a generic virtual reality interaction library
Duval et al. PAC-C3D: A new software architectural model for designing 3d collaborative virtual environments
Castillo-Effen et al. Modeling and visualization of multiple autonomous heterogeneous vehicles
Eilemann Equalizer Programming and User Guide: The official reference for developing and deploying parallel, scalable OpenGL applications using the Equalizer parallel rendering framework
Arsenault et al. DIVERSE: A software toolkit to integrate distributed simulations with heterogeneous virtual environments
Ruffaldi et al. Coco-a framework for multicore visuo-haptics in mixed reality
Capin et al. A taxonomy of networked virtual environments
Kessler A flexible framework for the development of distributed, multi-user virtual environment applications
Metze et al. Towards a general concept for distributed visualisation of simulations in Virtual Reality environments.
Lacoche et al. Providing plasticity and redistribution for 3D user interfaces using the D3PART model
Ferreira et al. Multiple display viewing architecture for virtual environments over heterogeneous networks

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM EC EE ES FI GB GD GE HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP