NL2002841C2 - Immersive collaborative environment using motion capture, head mounted display, and cave. - Google Patents


Info

Publication number
NL2002841C2
NL2002841C2
Authority
NL
Netherlands
Prior art keywords
virtual reality
users
simulation
avatars
collision
Prior art date
Application number
NL2002841A
Other languages
Dutch (nl)
Other versions
NL2002841A (en)
Inventor
Michael K Dobbins
Pascale Rondot
Eric D Shone
Michael R Yokell
Kevin J Abshire
Anthony Ray Harbor
Scott Lovell
Michael K Barron
Original Assignee
Lockheed Corp
Priority date
Filing date
Publication date
Priority claimed from US12/355,771 external-priority patent/US8615383B2/en
Application filed by Lockheed Corp filed Critical Lockheed Corp
Publication of NL2002841A publication Critical patent/NL2002841A/en
Application granted granted Critical
Publication of NL2002841C2 publication Critical patent/NL2002841C2/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/15 Vehicle, aircraft or watercraft design
    • G06F30/20 Design optimisation, verification or simulation
    • G06F2111/00 Details relating to CAD techniques
    • G06F2111/02 CAD in a network environment, e.g. collaborative CAD or distributed simulation

Abstract

A collaborative visualization system integrates motion capture and virtual reality, along with kinematics and computer-aided design (CAD), for the purpose, for example, of evaluating an engineering design. A virtual reality simulator creates a full-scale, three-dimensional virtual reality simulation responsive to computer-aided design (CAD) data. Motion capture data is obtained from users simultaneously interacting with the virtual reality simulation. The virtual reality simulator animates in real time avatars responsive to motion capture data from the users. The virtual reality simulation, including the interactions of the one or more avatars and also objects, is displayed as a three-dimensional image in a common immersive environment using one or more head mounted displays so that the users can evaluate the CAD design to thereby verify that tasks associated with a product built according to the CAD design can be performed by a predetermined range of user sizes.

Description

Title: Immersive collaborative environment using motion capture, head mounted display, and cave
BACKGROUND
1. Related Applications
[0001] This application claims priority to and the benefit of U.S. Patent Application Serial No. 12/355,771, by Dobbins et al., titled "Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave" filed January 17, 2009, incorporated herein by reference in its entirety. This application also relates to: U.S. Provisional Patent Application Serial No. 61/022,185, by Dobbins et al., titled "System and Program Product for Providing a Collaborative Immersive Environment and Related Methods" filed January 18, 2008; U.S. Patent Application Serial No. 12/355,770, by Dobbins et al., titled "Providing a Collaborative Immersive Environment Using a Spherical Camera and Motion Capture" filed January 17, 2009; and U.S. Patent Application Serial No. 12/355,768, by Dobbins et al., titled "Portable Immersive Environment Using Motion Capture and Head Mounted Display" filed January 17, 2009, each of which is incorporated herein by reference in its entirety.
2. Field of Invention
[0002] The present invention relates generally to virtual reality and motion capture, and, more particularly, to systems and program products which allow persons to interact with real and artificial environments using motion capture.
3. Background
[0003] Various techniques and technologies exist which allow users to interact with or analyze their environment. For example, motion capture techniques are used in the fields of sports, medicine, and entertainment, especially video gaming and animation. In sports, for example, motion capture enables a golfer's swing to be digitally recorded for analysis. In medicine, orthopedic rehabilitation can employ motion capture to provide feedback to the patient, illustrating correct or incorrect techniques with the patient's movements during walking, for example. In animation, motion capture allows for an actor's movements and even facial expressions to be digitally recorded in a computer model. Later, animators use the actor's recorded motions as the basis for the motions of a computer-generated character. Likewise, video games use motion capture to facilitate the animation of life-like characters within the games.
[0004] Virtual reality technologies allow a user to interact with a computer-simulated environment. Most virtual reality environments rely on computer screens or stereoscopic displays and are primarily visual experiences. A popular example of virtual reality technology is a flight simulator video game, in which the player pilots a virtual aircraft in a computer-simulated environment.
[0005] Telepresence refers to technologies which allow the user to experience, or be present at, a remote location. For example, telepresence includes a remote video camera in which the user can control the pan, tilt, and zoom, if the display is of sufficient size and quality to allow the user to feel present at the remote location.
[0006] None of these technologies alone provides a collaborative immersive environment for evaluating a design through interaction, virtual training on a task, and validating a simulation with a real-world video.
SUMMARY OF INVENTION
[0007] In view of the foregoing, embodiments of the present invention, for example, provide a collaborative visualization system, which integrates motion capture and virtual reality, along with kinematics and computer-aided design (CAD), for the purpose of, amongst others, evaluating a design. Embodiments of the present invention also provide, e.g., portable motion capture systems, which allow one or more persons to interact with real and artificial environments, and head mounted displays for evaluating a design, virtual training on a task, and validating a simulation with a real-world video, amongst others. Embodiments of the present invention further provide, for example, objects to be tracked by a motion capture system and incorporated into a virtual reality simulation. Embodiments of the collaborative visualization system can include, for example, an immersive environment with one or more simultaneous users, with one or more external observers, with real-time interactions and scaling, and with the ability to switch between a simulation and a telepresence view. An immersive environment can, for example, generate a three-dimensional or stereoscopic image which appears to surround the viewer. That is, the viewer is "immersed" in the artificial environment.
[0008] Embodiments of the present invention include, for example, a virtual reality simulator. The virtual reality simulator can receive data from CAD designs and display a simulation constructed from the data to a user via, for example, a head mounted display, resulting in full-scale and stereoscopic images. The images can be, for example, so detailed that the user can read the wording on the working knobs and switches within the simulation.
[0009] Embodiments of the present invention further include, for example, a motion capture system incorporated into the virtual reality simulator. The motion capture system, for example, can track the movements and interactions of a user (who is wearing a motion capture ensemble with a flexible and adjustable level of detail, from just the head to a full body suit and gloves) within the virtual reality simulation so that when the user's head rotates or tilts, the view of the simulation rendered in the head mounted display changes accordingly. The motion capture system, for example, can store the tracked movements and interactions of a user for later use, including, for example, training or design evaluation purposes. The motion capture system also can track the movements and interactions of multiple users within the virtual reality simulation such that the multiple users are represented by avatars in real time within the simulation. Thus, multiple users can simulate a coordinated activity within the simulation, such as performing routine maintenance on an aircraft, for example. In addition, the motion capture system, for example, can track the movements and interactions of objects, including tools and props used by a user.
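The patent gives no implementation, but the head-tracking behavior described above (the rendered view rotating with the user's head) can be sketched in a few lines. The function name and axis conventions here are illustrative assumptions, not anything from the specification:

```python
import math

def rotate_view(forward, yaw_deg):
    """Rotate a view's forward vector about the vertical (y) axis by the
    tracked head yaw, as a renderer might when the user's head turns."""
    yaw = math.radians(yaw_deg)
    x, y, z = forward
    return (x * math.cos(yaw) + z * math.sin(yaw),
            y,
            -x * math.sin(yaw) + z * math.cos(yaw))

# A user looking down -z who turns 90 degrees ends up looking down -x.
new_forward = rotate_view((0.0, 0.0, -1.0), 90.0)
```

In a full renderer this rotated vector would feed the view matrix of the head mounted display on every frame of motion capture data.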
[0010] Embodiments of the present invention also include, for example, an immersive observation system where multiple observers can view, in real-time, the virtual reality simulation including the interactions of the avatars in a common, immersive environment so as to evaluate the CAD design. The immersive observation system can include a CAVE (Cave Automatic Virtual Environment), such as, a reconfigurable 8-foot-high by 10-foot-wide by 10-foot-long room with displays on three walls and the floor, where one or more observers view a common environment in an immersive and interactive way, including stereoscopic and full-scale images. Advantageously, using the CAVE, the designers of the CAD design can observe end-users interacting extensively with the proposed design in the simulator to analyze and evaluate the design without having to build expensive prototypes. For example, observers can view users within the simulation performing the routine maintenance operations on the aircraft. In addition, trainees can observe, for example, trainers performing various tasks by viewing avatars of the trainers responsive to recorded motion capture data.
[0011] According to the embodiments of the present invention, the virtual reality simulator can scale each avatar in real time. That is, a 5' 4" user within the simulation can be scaled in real-time to be a 6' 2" avatar within the simulation. The images rendered in the 5' 4" user's head mounted display will correspond to the perspective expected by someone 6' 2". An observer of the simulation will see a 6' 2" avatar. Scaling may be accomplished by applying a ratio to each aspect of the data in the translation from motion capture data to simulation data. Alternately, scaling may be accomplished in real time by positioning the avatar's head, hands, and feet in the correct location to allow kinematics software to solve for the other joints. In addition, scaling may be accomplished in post processing, as opposed to in real time.
[0012] According to embodiments of the present invention, the virtual reality simulator can include interactions with the simulated environment. In one such example, the virtual reality simulator can include collision detection software to provide feedback within the simulation. If a user sticks his or her hand where the simulation indicates that a wall should be, for example, a virtual collision is detected. The hand motion can be set either to stop at the collision or to disappear from the view of the user (because it is, after all, behind the wall), and the panel of the wall can be set to change color (or exhibit some other behavior) to provide feedback and indicate that a collision has occurred. In a preferred configuration, the wall turns red; similarly, if a knee "collides" with a toolbox in the simulation, the toolbox turns red. In an exemplary configuration, the collision triggers a sound; the sound can be a directional sound indicating a direction of the collision with respect to the user or observer. Various types of sounds can further provide information regarding the collision, such as its severity or the objects involved in the collision. For example, a user hitting the user's head on part of an aircraft can result in a different sound than an object colliding with the floor. In addition, the simulation can alter its behavior based on detected collisions, by opening a door or panel, for example.
[0013] In addition, embodiments of the present invention can include a portable motion capture system for capturing tasks in the field. This system includes motion tracking markers (perhaps on a suit, body, or other apparel), a plurality of cameras installed on a tripod or clamped on a rigid structure so that the cameras can track the movements of a user wearing the motion capture markers, and a computer to record the images from the cameras. The portable motion capture system allows a remote procedure, such as a field maintenance operation, to be recorded. Because of the integrated nature of the virtual reality simulator and the motion capture system provided by the embodiments of the present invention, the data from a field maintenance operation can later be studied in the virtual reality simulator for interactions with a new design or for real-time evaluation of, for example, an existing design or a sequence of operations on an existing design. Moreover, according to embodiments of the present invention, a portable motion capture system can be utilized for real-time design evaluation in the field, for presentation of a design, and for training at a remote location. Evaluation of a design can include, for example, evaluating a design with respect to an environment, an analysis of the ergonomics of the design, a study of tasks associated with the design, and other considerations as understood by those skilled in the art.
[0014] Furthermore, embodiments of the present invention include methods of validating a simulation with real-world video using immersive technology. For example, according to an embodiment of such a method, a spherical camera captures real-world video, or a real-world still photograph, at a remote location. Later, the video or photograph is rendered in a head mounted display. A motion capture system collects the user's head rotation information, which is used to control the pan, tilt, and zoom of the video. Then the user can switch between displaying the real-world video and the simulation as a way of validating the simulation. In addition, the video can be displayed on a desktop or in a CAVE. As an example, real-world images captured from the deck of an aircraft carrier with a spherical camera can be used to validate a simulation of that environment.
[0015] Embodiments of the present invention include, for example, systems and associated methods of providing an immersive environment with multiple simultaneous users, with external observers, with real-time interactions and scaling, and with the ability to switch between a simulation and a telepresence view, as will be understood by those skilled in the art. Embodiments of the present invention provide improved approaches to evaluate designs without having to build prototypes and to train personnel without the need for prototypes or on location travel.
BRIEF DESCRIPTION OF DRAWINGS
[0016] So that the manner in which the features and benefits of the invention, as well as others which will become apparent, may be understood in more detail, a more particular description of the invention briefly summarized above may be had by reference to the embodiments thereof which are illustrated in the appended drawings, which form a part of this specification.
It is also to be noted, however, that the drawings illustrate only various embodiments of the invention and are therefore not to be considered limiting of the invention's scope as it may include other effective embodiments as well.
[0017] Figure 1 is a schematic diagram of a system to provide a collaborative immersive environment for the evaluation of an engineering design according to an embodiment of the present invention;
[0018] Figure 2 is an environmental view illustrating four users wearing motion capture equipment interacting with a virtual reality simulation, according to another embodiment of the present invention;
[0019] Figure 3 is an environmental view of a motion capture glove according to an embodiment of the present invention;
[0020] Figure 4 is a perspective view of a Hergo Easy Mount Mobile Computer Cart - Star Base according to an embodiment of the present invention;
[0021] Figure 5 is a perspective view of a head-mounted display according to an embodiment of the present invention;
[0022] Figure 6 is a perspective view of a head tracker to be used in conjunction with the CAVE according to an embodiment of the present invention;
[0023] Figure 7 is an environmental view of wand hardware to be used in conjunction with the CAVE according to an embodiment of the present invention;
[0024] Figure 8A is a perspective view of a spherical camera according to an embodiment of the present invention;
[0025] Figure 8B is a perspective view of motion capture cameras according to an embodiment of the present invention;
[0026] Figure 9 is a perspective view of avatars within a virtual reality simulation according to an embodiment of the present invention;
[0027] Figure 10 is a schematic block diagram of a collaborative visualization system according to an embodiment of the present invention;
[0028] Figure 11 is a schematic flow diagram of a method to provide a collaborative immersive environment for the evaluation of an engineering design according to an embodiment of the present invention; and
[0029] Figure 12 is a schematic flow diagram of a method of validating a simulation with real-world video using immersive technology according to an embodiment of the present invention.
DETAILED DESCRIPTION OF INVENTION
[0030] The present invention will now be described more fully hereinafter with reference to the accompanying drawings, which illustrate embodiments of the invention. This invention may, however, be embodied in many different forms and should not be construed as limited to the illustrated embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
[0031] As illustrated in Figures 1 and 10, embodiments of the present invention include a collaborative visualization system 20 which integrates motion capture and virtual reality technologies, along with kinematics and CAD, for the purpose of, amongst others, evaluating a design, virtual training on a task, and validating a simulation with a real-world video. In addition, embodiments of the present invention include, for example, immersive observation environments as well.
[0032] Embodiments of the present invention include, for example, a virtual reality simulator 58 to create the virtual reality environment. The virtual reality simulator can receive data from a CAD program and create a virtual reality simulation of the design projected to a user via a head mounted display 40, as illustrated, for example, in Figures 1, 2, and 5. In addition, the simulation may be displayed via a desktop or a CAVE 44. The virtual reality simulation is three-dimensional (3D) and allows a user to inspect, evaluate, and interact with the CAD design. The simulation environment is labeled immersive because the simulation is 3D and full-scale, and the user's view can rotate throughout the simulation so that the user becomes immersed in the simulation.
[0033] Embodiments provide for evaluation of designs for aircraft, space systems, spacecraft, ships, and missile systems, which often utilize an extensive evaluation process traditionally requiring, for example, expensive mock-ups and prototypes. The extensive evaluation process can advantageously include ergonomic analysis and task analysis for operation and maintenance tasks, as understood by those skilled in the art.
[0034] According to an exemplary embodiment of the present invention, the virtual reality software used includes ENVISION (D5) from DELMIA. As understood by those skilled in the art, this software provides a physics-based, 3D environment specifically for designing, verifying, and rapid prototyping of concept designs involving structures, mechanical systems, and humans. As understood by those skilled in the art, the software enhances system and subsystem level models with physics-based motion, virtual reality immersion, and ergonomic evaluation capabilities for highly accurate 3D simulation, analysis, and visualization. In addition, according to an exemplary embodiment of the present invention, other software components used to create the virtual reality environment include: PolyWorks from InnovMetric Software Inc., a software tool used to convert point cloud data to polygons; NuGraf from Okino Computer Graphics, a polygon conversion and reduction tool; Deep Exploration or Deep Server from Right Hemisphere, a polygon conversion and reduction tool; and MS Visual Studio from Microsoft, a code development suite. According to an exemplary embodiment of the present invention, the hardware supporting this software can include four Dell Precision Workstation 670s, each with a 16x DVD-ROM, a 48/32 CDRW, dual 3.0 GHz Xeon processors with 2 MB L2 cache, an 800 FSB, 4 GB RAM, an nVidia Quadro FX3400 with 256 MB, and a 136 GB hard drive. As understood by those skilled in the art, the virtual reality simulator can include a computer program product, stored in one or more tangible computer readable media and readable by a computer so that the computer program product operates to perform the various instructions when read by the computer as described herein.
[0035] As illustrated in Figures 1, 2, and 3, according to an embodiment of the present invention, the motion capture system 30 includes, for example, users wearing bodysuits 32, gloves 34, and headgear 36 with markers 52 at known locations on the suit, such as the knee, the wrist, and the top of the shoulders. The motion capture system can further include real objects, or props, also having markers, to be represented in the virtual reality simulator. Cameras 54, as illustrated in Figure 8B, then digitally record the locations of the markers as the users move around and interact in the simulation, capturing a set of data. This motion capture data can then be made available, in real time, to the virtual reality simulator 58 so that the users and their movements are modeled as avatars 56 within the simulation and so that feedback is provided to the users. An avatar is an electronic image that represents, and is manipulated or driven by, a computer user, as in a computer game or other virtual reality setting, typically as an incarnation in human form.
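As a rough illustration of how recorded marker locations might drive an avatar, the sketch below derives a few joint positions from labeled marker coordinates before handing them to a simulator. The marker names and the midpoint heuristic are hypothetical, not taken from the patent:

```python
def frame_to_joints(markers):
    """Estimate simple joint centers from a dict of marker name -> (x, y, z)."""
    def midpoint(a, b):
        # Average two marker positions component-wise.
        return tuple((pa + pb) / 2.0 for pa, pb in zip(markers[a], markers[b]))
    return {
        # A pair of shoulder-top markers approximates the base of the neck.
        "neck": midpoint("left_shoulder", "right_shoulder"),
        # Wrist markers stand in for hand positions.
        "left_hand": markers["left_wrist"],
        "right_hand": markers["right_wrist"],
    }

# One captured frame (positions in meters).
frame = {
    "left_shoulder": (-0.2, 1.5, 0.0),
    "right_shoulder": (0.2, 1.5, 0.0),
    "left_wrist": (-0.4, 1.0, 0.1),
    "right_wrist": (0.4, 1.0, 0.1),
}
joints = frame_to_joints(frame)
```

A real pipeline would solve a full skeleton from many more markers; this shows only the shape of the marker-to-joint translation.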
[0036] According to an exemplary embodiment of the present invention, the motion capture system 30 includes up to twenty-four cameras 54 (e.g., 12 Eagle-i and 12 Hawk-i from Motion Analysis Corporation, as shown in Figure 8B), for example, mounted on a truss 38 (e.g., 20′ long by 15′ wide by 10′ high) used to track up to six concurrent users wearing head-mounted displays 40, gloves 34 (e.g., TALON Gloves from Motion Analysis Corporation), and body suits 32. According to an embodiment of the present invention, the motion capture system 30 uses software from Motion Analysis Corporation called EVaRT V5.0.4 and includes the following plug-ins: Animation Plugins, RT2 Animation Plugins, Calcium 4, Talon Streaming 4, Talon Viewer 4, and EVaRT5. As understood by those skilled in the art, this software allows templates and props to be created and customized for tracking everything from physical mockups to a full human body. In addition, VRSim's SimIO module can be used to multicast the data collected by EVaRT to be used by the simulation engine software ENVISION (D5). As understood by those skilled in the art, embodiments can, for example, incorporate any number of cameras.
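The specification names VRSim's SimIO module as multicasting the EVaRT data to the simulation engine, but does not disclose a wire format. Under assumed values (the multicast address, port, and JSON payload are all hypothetical), one frame of joint data could be serialized and multicast over UDP like this:

```python
import json
import socket
import struct

MCAST_GROUP = "239.0.0.1"  # hypothetical multicast group
MCAST_PORT = 5005          # hypothetical port

def encode_frame(frame_number, joints):
    """Serialize one capture frame (joint name -> [x, y, z]) as JSON bytes."""
    return json.dumps({"frame": frame_number, "joints": joints}).encode("utf-8")

def send_frame(sock, payload):
    """Multicast one encoded frame to any listening simulation engines."""
    sock.sendto(payload, (MCAST_GROUP, MCAST_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Keep multicast traffic on the local network segment.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, struct.pack("b", 1))
payload = encode_frame(1, {"head": [0.0, 1.7, 0.0]})
# send_frame(sock, payload)  # uncomment to actually transmit
```

Multicast fits this use because several consumers (the simulator, viewers, a CAVE) can subscribe to the same stream without the capture host tracking them individually.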
[0037] According to an embodiment of the present invention, the virtual reality simulator can, for example, scale each avatar in real time. That is, a 5' 4" user within the simulation can be scaled in real time to be a 6' 2" avatar within the simulation. The images rendered in the 5' 4" user's head mounted display will correspond to the perspective expected by someone 6' 2". An observer of the simulation will see a 6' 2" avatar, and the avatar's posture will match that of the user. Scaling may be accomplished by applying a ratio to each aspect of the data in the translation from motion capture data to simulation data. Alternately, scaling may be accomplished by positioning the avatar's head, hands, and feet in the correct locations to allow kinematics software to solve for the other joints. Figure 9 illustrates four avatars scaled to different sizes, all driven from the same user; hence, all have the same posture. Note also that scaling can be accomplished in post processing, as opposed to in real time. That is, for a remote training example, a user, i.e., a trainee, of a first size, e.g., 6' 2", can be displayed an avatar of the first size, e.g., 6' 2", responsive to motion capture data from a user, i.e., a trainer, of a second size different than the first size, e.g., 5' 4".
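The ratio-based scaling described above can be sketched directly. This is a minimal illustration (heights in inches, joint names invented for the example), not the patent's code:

```python
def scale_pose(joints, user_height, avatar_height):
    """Apply the avatar/user height ratio to every captured coordinate."""
    ratio = avatar_height / user_height
    return {name: tuple(c * ratio for c in pos)
            for name, pos in joints.items()}

# A 5'4" (64 in) user driving a 6'2" (74 in) avatar: every captured
# position is stretched by 74/64 before the simulator renders the avatar.
captured = {"head": (0.0, 64.0, 0.0), "hand": (10.0, 40.0, 5.0)}
scaled = scale_pose(captured, user_height=64.0, avatar_height=74.0)
```

The alternative the paragraph mentions, pinning the avatar's head, hands, and feet and letting kinematics software solve the remaining joints, is not shown here.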
[0038] As illustrated in Figures 1, 2, and 5, according to embodiments of the present invention, a user experiences the virtual reality environment visually through a head-mounted display 40, and each of the head mounted displays can have a different perspective of the virtual reality simulation. The head-mounted display 40 can include a stereo display helmet worn by users for immersive visualization of full-scale data. As understood by those skilled in the art, one type of head-mounted display 40, the VR1280 from Virtual Research, has the ability to display at 1280 x 1024 at 60 Hz in mono or stereo. As understood by those skilled in the art, a head mounted display can include separate left-eye and right-eye displays with different images so that a user views an image in the head mounted display stereoscopically.
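The stereoscopic display described here amounts to rendering the scene twice from two horizontally offset eye positions. A minimal sketch, assuming a typical 64 mm interpupillary distance (a value chosen for illustration, not given in the patent):

```python
def eye_positions(head_pos, right_vector, ipd=0.064):
    """Offset the tracked head position by half the interpupillary
    distance along the head's right vector, once per eye (meters)."""
    half = ipd / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_vector))
    right = tuple(h + half * r for h, r in zip(head_pos, right_vector))
    return left, right

# Head at 1.7 m, facing so that +x is "to the user's right".
left_eye, right_eye = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
```

A renderer would draw the scene from each of these two positions and send the resulting images to the display's left and right panels.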
[0039] In an exemplary embodiment of the present invention, the software used to display the 3D scene can be based on the OpenSceneGraph 3D graphics toolkit. MiniViz from VRSim is a viewer that allows the user to view a running ENVISION (D5) simulation. The viewer loads models in the environment and then references the ENVISION (D5) simulation for the positions of the models and tracked viewpoints.
[0040] Embodiments of the present invention provide computer workstations to support the head-mounted displays. In an exemplary embodiment of the present invention, the hardware used to support four head-mounted displays includes four Dell Precision Workstation 670s, each with a 16x DVD-ROM, a 48/32 CDRW, dual 3.0 GHz Xeon processors with 2 MB L2 cache, an 800 FSB, 4 GB RAM, an nVidia Quadro FX3400 with 256 MB, and a 136 GB hard drive. For convenience and as understood by those skilled in the art, a Hergo Easy Mount Mobile Computer Cart - Star Base 42, as illustrated in Figure 4, can be used to mount each set of equipment, including the computer, keyboard, mouse, head-mounted display 40, Talon Gloves 34, and a flat panel monitor from Hergo, according to an embodiment of the present invention.
[0041] According to embodiments of the present invention, the virtual reality simulator can include, for example, interactions with the simulated environment. In one such example, the virtual reality simulator can include collision detection software to provide feedback within the simulation. If a user sticks a hand where the simulation indicates that a wall should be, a virtual collision is detected. The hand can be stopped at the time of the collision or be allowed to disappear from the view of the user, and the panel of the wall can change color (or exhibit some other behavior) to provide feedback and indicate that a collision has occurred. In a preferred configuration, the wall turns red. Similarly, if a knee "collides" with a bomb in the simulation, the bomb turns red. In an exemplary embodiment, an appearance of the object is altered in response to the collision. In an exemplary configuration, the collision triggers a sound; the sound can be a directional sound indicating a direction of the collision with respect to the user or observer. Various types of sounds can further provide information regarding the collision, such as its severity or the objects involved in the collision. For example, a user hitting the user's head on part of an aircraft can result in a different sound than an object colliding with the floor. In addition, the simulation can alter its behavior based on detected collisions, by opening a door or panel, for example. In addition, this collision detection feature can be used to facilitate the grabbing of a virtual object within the simulation, permitting the object to be moved or otherwise manipulated in the virtual environment. Collisions can also occur between avatars and between simulated objects, such as between a simulated hand tool and a simulated wall.
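The collision feedback above can be illustrated with a simple axis-aligned bounding-box overlap test. Real collision software operating on CAD geometry is far more involved; the box shapes and color values below are assumptions for the sketch:

```python
def aabb_overlap(box_a, box_b):
    """Overlap test for axis-aligned boxes given as
    ((min_x, min_y, min_z), (max_x, max_y, max_z))."""
    (amin, amax), (bmin, bmax) = box_a, box_b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

def collision_feedback(hand_box, wall_box, wall_color):
    """Turn the wall red when the hand's box intersects it, as the
    paragraph describes; otherwise leave its color unchanged."""
    return "red" if aabb_overlap(hand_box, wall_box) else wall_color

# A hand reaching into a wall panel triggers the color change.
hand = ((0.9, 1.0, 0.0), (1.1, 1.2, 0.1))
wall = ((1.0, 0.0, -1.0), (1.2, 2.5, 1.0))
color = collision_feedback(hand, wall, "grey")
```

The same test, run between any two tracked boxes, covers the avatar-to-avatar and object-to-object collisions the paragraph mentions, and a detected overlap could equally trigger a sound or a grab instead of a color change.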
[0042] Embodiments of the present invention can also include, for example, an immersive observation environment as well. The observation environment allows designers to observe and interact with the virtual reality simulation, including the avatars 56 (see Figure 9), which can be driven in real time by motion capture data from the users. As illustrated in Figure 1, the observation environments can include the CAVE (Cave Automatic Virtual Environment) 44, a reconfigurable, e.g., 8-foot-high by 10-foot-wide by 10-foot-long, room with displays on three walls and the floor, where one or more observers view a common environment in an immersive way, including stereoscopic and full-scale images. As understood by those skilled in the art, the CAVE 44 from Mechdyne displays a resolution of 1280 x 1024 at 96 Hz. Therefore, designers and other observers can view in real time the detailed interaction of the users within the simulation. In an exemplary configuration, one wall of the CAVE 44 may be used to display all individual viewpoints, with the other two walls and the floor of the CAVE 44 being used to display an overall view. In this configuration, an observer immersed in the simulation can view avatar 56 posture and the avatar viewpoint simultaneously. Other available data, such as temperature, distance measurements, time, etc., whether visible or not, may also be displayed on the one wall of the CAVE 44, according to an exemplary configuration.
[0043] In an exemplary embodiment of the present invention, the software used to display the 3D scene is based on OpenSceneGraph, an open-source 3D graphics toolkit. Much like the stand-alone viewer for the head-mounted display, the CAVE 44 can use the MiniViz software program. The projections of each screen, however, can be mapped using an Application Programmer's Interface (API) called CAVELib from Mechdyne. Also running the CAVE 44 are other programs, which can include Vega, Vega Prime, and Ensight Gold, according to an embodiment of the present invention.
[0044] Embodiments of the present invention can also include a head tracker 46 (see Figure 6) and wand hardware 48 (see Figure 7) for use in the CAVE 44, allowing observers to move around easily within the simulation. According to an exemplary embodiment of the present invention, the InterSense IS-900 uses a combination of inertial and ultrasonic sensors to determine the positions of the head tracker and wand. The software running the head tracker and wand can include a program called Trackd 5.5 from VRCO. As understood by those skilled in the art, the Trackd application takes information from the head tracker and wand and makes that information available to either CAVELib or the ENVISION (D5) simulation. This tracking information is applied to correct the projections on the walls and the floor of the CAVE 44.
[0045] Embodiments of the present invention can also include, for example, an immersive environment with the ability to switch between a simulation and a telepresence view. The telepresence view is available through the CAVE 44, a head-mounted display 40, or a desktop display. The content for the telepresence view is gathered via a spherical camera 50 (see Figure 8A) at a remote location, such as, for example, the surface of an aircraft carrier. According to an embodiment of the present invention, the spherical camera 50 has a set of six digital cameras embedded in one device that captures 75% of a sphere. The dynamic scene can be displayed in the head-mounted display 40 or the CAVE 44, where the user can be immersed in real-world video information to help validate simulations. As understood by those skilled in the art, the Ladybug2 is a 4.7-megapixel video capture device using six 1024x768 CCDs. According to an exemplary embodiment of the present invention, the Ladybug2 from Point Grey Research comes with a software development environment, the Ladybug2 SDK, and a tool called LadybugCap that allows content to be captured; this software produces spherical AVI files for playback, as understood by those skilled in the art. In an exemplary embodiment, the Ladybug3 can be used for better resolution but a lower frame rate.
[0046] As understood by those skilled in the art, the spherical camera 50 produces data files in 1GB increments, each of which may hold from 2 seconds to 2 minutes of video. So, 30 seconds of video capture can turn into 15 files at 1GB each. These files require translation into a viewable format. Other solutions use a video editor to paste the first-level video files together to form a single video file, reducing the quality in the process to produce a smaller, second-level video file. In contrast, embodiments of the present invention read the first-level video files into buffers and provide indexing, such as the first file, the last file, the current file, the file before the current, and the file after the current. This allows the video group with many files to be played as if it were a single video file.
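The multi-file indexing scheme described above can be sketched as follows; the class name, file names, and bookkeeping fields are illustrative assumptions, not the patent's implementation:

```python
class VideoFileGroup:
    """Plays a group of sequential 1 GB capture files as one logical video.

    Keeps a small index -- first, last, current, previous, and next file --
    so that only files near the playback point need to be buffered.
    """

    def __init__(self, paths):
        self.paths = sorted(paths)   # first-level files in capture order
        self.current = 0             # index of the file being played

    def index(self):
        """Return the five bookkeeping entries used for buffering."""
        cur = self.current
        return {
            "first": self.paths[0],
            "last": self.paths[-1],
            "current": self.paths[cur],
            "previous": self.paths[cur - 1] if cur > 0 else None,
            "next": self.paths[cur + 1] if cur < len(self.paths) - 1 else None,
        }

    def advance(self):
        """Move playback to the next file, as if the group were one video."""
        if self.current < len(self.paths) - 1:
            self.current += 1
        return self.paths[self.current]
```

In use, a player would prefetch the "previous" and "next" entries into buffers while decoding "current", so a 15-file capture plays back seamlessly.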
[0047] According to an embodiment of the present invention, real-world objects can be scanned to create models for the virtual reality simulator. In an exemplary embodiment of the present invention, the Creaform HandyScan EXAscan can be employed. In another exemplary embodiment of the present invention, a Leica HDS 3000 can be employed. The Leica HDS 3000 is a laser-based device that scans equipment to quickly create high-fidelity 3D models where they do not yet exist. As understood by those skilled in the art, in practice, the true resolution of the scanner is 1/4” point accuracy for unmodeled data and 1/8” point accuracy for data modeled based on multiple points from a point cloud. According to an exemplary embodiment of the present invention, the software used to capture the area, set the resolution, and register the point clouds together is Cyclone. PolyWorks is also used to register the clouds and produce polygon geometry to be incorporated into the simulation. According to an exemplary embodiment of the present invention, the hardware supporting the device can be a Dell Precision M70 laptop with a 1.86GHz processor, 1GB RAM, and an Nvidia Quadro FX Go 1400.
[0048] Embodiments of the present invention can include, for example, a portable motion capture system for capturing tasks in the field. According to an embodiment of the portable motion capture system, the system can include markers at predetermined locations, a motion capture suit, a plurality of cameras installed on a tripod so that the cameras can track the movements of a user wearing the motion capture suit, and a computer or other storage medium to record the images from the cameras. According to another embodiment of the portable motion capture system for capturing tasks in the field, the system can include markers at predetermined locations, a motion capture suit, a plurality of cameras clamped on a rigid structure so that the cameras can track the movements of a user wearing the motion capture suit, and a computer or other storage medium to record the images from the cameras and to digitally record locations of the markers associated with the bodysuit, responsive to the recorded images from the plurality of cameras. In a training application, for example, motions of a remote trainer at a first location can be tracked and captured for training of a trainee at a second location, either later or in real time. In addition, motions of one or more users can be tracked and captured for task analysis, including, for example, ergonomic and efficiency analysis. In another embodiment, for example, a portable motion capture system can be utilized for real-time design evaluation in the field and for design presentation or demonstration, including, for example, of a new design.
[0049] The embodiments of the present invention include a method of evaluating an engineering design, as illustrated in Figure 11. The method includes simulating a CAD design in a virtual reality environment (step 80) and driving one or more avatars within the virtual reality simulation using motion capture data obtained from a user interacting with the virtual reality simulation (step 81). The method continues with displaying the virtual reality simulation, including the interactions of the one or more avatars, to multiple observers in a common, immersive environment in real-time so as to evaluate the CAD design (step 82) to thereby verify that tasks associated with a product built according to the CAD design can be performed by a predetermined range of user sizes.
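To illustrate how driving an avatar from motion capture data (step 81) can support evaluation across a predetermined range of user sizes, a captured pose might be uniformly rescaled about a root point before animating the avatar. This is only a minimal sketch; the joint dictionary and the uniform-scaling approach are assumptions for illustration, not the patent's method:

```python
def scale_pose(joint_positions, scale, root=(0.0, 0.0, 0.0)):
    """Uniformly scale a captured pose about a root point.

    Lets one user's motion drive an avatar of a different body size,
    e.g. to check whether a task remains feasible for smaller or
    larger users (a hypothetical 5th-percentile or 95th-percentile body).
    """
    rx, ry, rz = root
    scaled = {}
    for name, (x, y, z) in joint_positions.items():
        scaled[name] = (rx + scale * (x - rx),
                        ry + scale * (y - ry),
                        rz + scale * (z - rz))
    return scaled
```

A reach check could then compare the scaled hand position against a control's location in the simulated CAD geometry.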
[0050] The embodiments of the present invention include a method of validating a simulation with real-world video using immersive technology, as illustrated in Figure 12. The method includes capturing real-world video by a spherical camera at a remote location (step 90) and rendering the video in a head mounted display (step 91). The method continues with capturing the head rotation information of a user by a motion capture system (step 92) and controlling the pan, tilt, and zoom of the video by the head rotation information of the user (step 93). The method also includes switching, under user control, between displaying the real-world video and the simulation (step 94).
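Step 93's mapping of head rotation onto the spherical video window could look like the following sketch; the function name, angle conventions, and clamping ranges are illustrative assumptions rather than details from the patent:

```python
def head_to_view(yaw_deg, pitch_deg, fov_deg=90.0,
                 min_fov=30.0, max_fov=120.0):
    """Map captured head rotation onto pan/tilt/zoom of spherical video.

    Yaw controls pan, pitch controls tilt; the field of view acts as
    zoom and is clamped to the limits of the spherical capture.
    """
    pan = yaw_deg % 360.0                       # wrap around the sphere
    tilt = max(-90.0, min(90.0, pitch_deg))     # clamp at the poles
    fov = max(min_fov, min(max_fov, fov_deg))   # zoom limits
    return pan, tilt, fov
```

Each frame, the motion capture system's head-rotation reading would be passed through such a mapping to reposition the viewport in the spherical video.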
[0051] The embodiments of the present invention include a computer program product, stored on tangible computer memory media, operable on a computer, the computer program product comprising a set of instructions that, when executed by the computer, cause the computer to perform various operations. The operations include, for example, receiving CAD data, generating video signals to simulate in virtual reality the design from the CAD data, providing for the tracking of multiple users interacting with each other and the simulation, providing for the tracking of objects interacting with the simulation, generating scaled avatars within the simulation, generating video signals for the common immersive environment, and receiving user input to select between video, graphics, or both together.
[0052] Embodiments can also include a computer program product, being stored in one or more tangible computer readable media and readable by a computer so that the computer program product operates to perform instructions described herein when read by the computer. The instructions include recording at a first location full-body motion capture data for one or more trainers performing one or more tasks by a portable motion capture system. The instructions include animating one or more avatars within a virtual reality simulation by a virtual reality simulator at a second location, responsive to recorded motion capture data for the one or more trainers at the first location so that each of the one or more trainers corresponds to one of the one or more avatars. The instructions include displaying the virtual reality simulation, including the one or more animated avatars, as a three-dimensional image that appears to surround one or more trainees to thereby define a common immersive environment using one or more head mounted displays so that the one or more trainees can analyze the one or more tasks performed. The instructions can also include obtaining motion capture data for one or more trainees interacting with the virtual reality simulation through a motion capture system; animating one or more avatars within a virtual reality simulation by a virtual reality simulator in real time, responsive to motion capture data for the one or more trainees at the second location; detecting a collision between an avatar animated by a trainee and a simulated object in the virtual reality simulation by the virtual reality simulator; and altering a color of the simulated object in the virtual reality simulation by the virtual reality simulator to provide feedback for the detected collision.
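The collision-feedback step described above (detecting a collision between a trainee-driven avatar and a simulated object, then altering the object's color) can be sketched with a simple bounding-sphere test. The data structures, the alert color, and the sphere test itself are assumptions for illustration; a production simulator would test against the actual CAD geometry:

```python
import math

ALERT_COLOR = (1.0, 0.0, 0.0)  # hypothetical highlight color (red)

def detect_collision(avatar_joint, obj_center, obj_radius):
    """Bounding-sphere test: does the avatar joint penetrate the object?"""
    return math.dist(avatar_joint, obj_center) < obj_radius

def apply_collision_feedback(avatar_joint, sim_object):
    """Alter the simulated object's color when a collision is detected."""
    if detect_collision(avatar_joint, sim_object["center"],
                        sim_object["radius"]):
        sim_object["color"] = ALERT_COLOR
    return sim_object
```

Run once per frame per tracked joint, this gives the trainee immediate visual feedback that a hand or tool has clipped into the simulated equipment.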
[0053] The embodiments of the present invention can also include a system for training, at a remote location, tasks associated with, for example, the operation or maintenance of an aircraft. Likewise, the tasks can be associated with the operation or maintenance of a design for an aircraft, a space system, a spacecraft, a ship, or a missile system.
[0054] Embodiments of the present invention further include a method of simulating a task. The method includes recording full-body motion capture data for one or more users performing one or more tasks by a portable motion capture system. The method includes animating one or more avatars within a virtual reality simulation by a virtual reality simulator responsive to motion capture data for the one or more users so that each of the one or more users corresponds to one of the one or more avatars. The method includes displaying the virtual reality simulation, including the one or more animated avatars, as a three-dimensional image using one or more head mounted displays so that each of the one or more head mounted displays can provide a different perspective of the virtual reality simulation.
[0055] The system includes a portable motion capture system 30, 42 at a first location positioned to track the movements of one or more users, e.g., trainers, and to record full-body motion capture data for the one or more users performing one or more tasks. The system can include a virtual reality simulator 58 positioned to receive the recorded motion capture data from the first location and capable of animating one or more avatars 56 within a three-dimensional virtual reality simulation at a second, different location, responsive to the recorded motion capture data. The system can include an immersive observation system to display the virtual reality simulation, including the one or more animated avatars 56, as a three-dimensional image that appears to surround one or more trainees to thereby define a common immersive environment 20 using one or more head mounted displays 40 so that each of the one or more head mounted displays 40 can have a different perspective of the virtual reality simulation and so that the one or more trainees can analyze the one or more tasks performed.
[0056] It is important to note that while embodiments of the present invention have been described in the context of a fully functional system, those skilled in the art will appreciate that the mechanism of at least portions of the present invention and/or aspects thereof are capable of being distributed in the form of a computer readable medium of instructions in a variety of forms for execution on a processor, processors, or the like, and that the present invention applies equally regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of computer readable media include but are not limited to: nonvolatile, hard-coded type media such as read only memories (ROMs), CD-ROMs, and DVD-ROMs, or erasable, electrically programmable read only memories (EEPROMs), recordable type media such as floppy disks, hard disk drives, CD-R/RWs, DVD-RAMs, DVD-R/RWs, DVD+R/RWs, flash drives, and other newer types of memories, and transmission type media such as digital and analog communication links. For example, such media can include both operating instructions and operations instructions related to the design and evaluation program product and the method steps, described above.
[0057] In the drawings and specification, there has been disclosed a typical preferred embodiment of the invention, and although specific terms are employed, the terms are used in a descriptive sense only and not for purposes of limitation. The invention has been described in considerable detail with specific reference to these illustrated embodiments. It will be apparent, however, that various modifications and changes can be made within the spirit and scope of the invention as described in the foregoing specification and as defined in the attached claims.
List of elements for Figures 1 & 10
Figure 1 - ref. 1 = 3D Laser Scanner
Figure 10
- ref. 2 = HEAD MOUNTED DISPLAY
- ref. 3 = 4 MOBILE PCs
- ref. 4 = ANALYSIS TALON GLOVES
- ref. 5 = 24 CAMERAS
- ref. 6 = SIMULATION CREATION STATIONS
- ref. 7 = CAVE DISPLAY SYSTEMS PCs
- ref. 8 = RECONFIGURABLE CAVE DISPLAY
- ref. 9 = SERVERS AND STORAGE
- ref. 10 = STORAGE DATA SERVER
- ref. 11 = APP, DOMAIN, NONREAL-TIME
DATA, SERVER, DEEP SERVER, SQL SERVER, ENVISION D5R16
- ref. 12 = SYMANTEC BACKUP SOFTWARE
AND BACKUP DOMAIN SERVER
- ref. 13 = DATA BACKUP, TERA STATION
1 TB OF NETWORK ATTACHED STORAGE
- ref. 14 = DATA BACKUP, TERA STATION PRO
2 TB OF NETWORK ATTACHED STORAGE

Claims (10)

1. Een systeem voor het evalueren van een engineering ontwerp, waarbij het systeem omvat: - een virtual reahty simulator die is ingericht voor het ontvangen van computer-aided-design (CD) gegevens die een CAD ontwerp omvatten, en voor het creëren van een drie-dimensionale virtual reahty simulatie op schaal vanuit de CAD ontwerp data, waarbij de drie-dimensionale virtual reality simulatie wordt weergegeven op één of meer stereoscopische op het hoofd bevestigde weergave-eenheden, waar elk van de één of meer op het hoofd bevestigde weergave-eenheden een verschillend perspectief van de virtual reality simulatie heeft; - een opnamesysteem voor beweging dat is geïntegreerd in de virtual reahty simulator en is ingericht voor het simultaan volgen van bewegingen van één of meer gebruikers binnen de virtual reality simulatie, waarbij het opnamesysteem omvat; één of meer lichaamskledingstukken, handschoenen en hoofdtoestellen met markeerelementen op vooraf bepaalde locaties voor het waarnemen van beweging; een meervoudig aantal camera’s op vooraf geselecteerde locaties die zijn ingericht voor het opnemen van beelden die bewegingen registreren van de één of meer gebruikers die hun lichaamskledingstuk, handschoenen en hoofdtoestellen voor het opnemen van beweging dragen; en één of meer computers voor het digitaal registreren van locaties van de markeerelementen die zijn verbonden met de lichaamskledingstukken, handschoenen, en hoofdtoestellen wanneer de één of meer gebruikers in de simulatie interacteren, in respons op de opgenomen beelden van het meervoudig aantal camera’s. 
- een immersie observatiesysteem waarbij de één of meer gebruikers de virtual reality simulatie stereoscopisch en in real time bezien voor het daardoor definiëren van een gemeenschappelijke immersie omgeving zodat de één of meer gebruikers het CAD ontwerp kunnen evalueren, waarbij de gevolgde bewegingen van de één of meer gebruikers één of meer avatars binnen de virtual reality simulatie besturen en de bewegingen van de één of meer avatars corresponderen met de gevolgde bewegingen van de één of meer gebruikers, en waarbij de virtual reality simulator voorts is ingericht voor het onafhankelijk in real time schalen van elk van de één of meer avatars om daardoor te verifiëren dat taken die verbonden zijn met een product dat is vervaardigd overeenkomstig het CAD ontwerp, uitgevoerd kunnen worden door een vooraf bepaald bereik van gebruikers afmetingen, en waarbij de virtual reality simulator voorts is ingericht voor het detecteren van een botsing tussen een avatar en de driedimensionale virtual reality simulatie van het CAD ontwerp en om een gedeelte van de avatar uit het bhkveld van één of meer van de gebruikers te doen verdwijnen voor het verschaffen van feedback voor de gedetecteerde botsing.A system for evaluating an engineering design, the system comprising: - a virtual reahty simulator adapted to receive computer-aided design (CD) data comprising a CAD design, and for creating a three-dimensional virtual reality simulation on a scale from the CAD design data, the three-dimensional virtual reality simulation being displayed on one or more stereoscopic head-mounted display units, where each of the one or more head-mounted display units units has a different perspective of the virtual reality simulation; - a motion recording system that is integrated into the virtual reahty simulator and is adapted to simultaneously monitor the movements of one or more users within the virtual reality simulation, the recording system comprising; one or more body garments, gloves, 
and main devices with marker elements at predetermined locations for observing motion; a plurality of cameras at preselected locations adapted to record images that record movements of one or more users wearing their body garment, gloves, and head motion recording devices; and one or more computers for digitally recording locations of the marker elements associated with the body garments, gloves, and main devices when the one or more users interact in the simulation, in response to the recorded images of the plurality of cameras. - an immersion observation system in which the one or more users view the virtual reality simulation stereoscopically and in real time to thereby define a common immersion environment so that the one or more users can evaluate the CAD design, whereby the movements followed by the one or more users control one or more avatars within the virtual reality simulation and the movements of the one or more avatars correspond to the followed movements of the one or more users, and wherein the virtual reality simulator is further adapted to scale each of them independently in real time of the one or more avatars to thereby verify that tasks associated with a product manufactured in accordance with the CAD design can be performed through a predetermined range of user dimensions, and wherein the virtual reality simulator is further adapted to detect of a collision between an avatar and the dri edimensional virtual reality simulation of the CAD design and to remove part of the avatar from the bhk field of one or more of the users to provide feedback for the detected collision. 2. 
Een systeem volgens conclusie 1, waarbij de virtual reahty simulator voorts is ingericht om een geluid uit een bepaalde richting te verschaffen, waar het geluid uit een bepaalde richting een richting van de botsing ten opzichte van de gebruiker aangeeft, en waarbij de virtual reahty simulator voorts is ingericht om de weergave van een gedeelte van de driedimensionale virtual reahty simulatie te wijzigen om feedback op de gedetecteerde botsing te verschaffen.A system according to claim 1, wherein the virtual reahty simulator is further adapted to provide a sound from a certain direction, where the sound from a certain direction indicates a direction of the collision relative to the user, and wherein the virtual reahty The simulator is further adapted to change the representation of a portion of the three-dimensional virtual response simulation to provide feedback on the detected collision. 3. Een systeem volgens conclusie 1, waarbij de botsing een eerste botsing is; en waarbij de virtual reahty simulator voorts is ingericht voor het detecteren van een tweede botsing tussen een gesimuleerd object en de virtual reahty simulatie van het CAD ontwerp, en voor wijzigen van de weergave van het gesimuleerde object voor het verschaffen van feedback voor de gedetecteerde tweede botsing.A system according to claim 1, wherein the collision is a first collision; and wherein the virtual reahty simulator is further adapted to detect a second collision between a simulated object and the virtual reahty simulation of the CAD design, and to change the display of the simulated object to provide feedback for the detected second collision . 4. 
Een systeem volgens conclusie 1, waarbij de één of meer lichaamskledingstukken, handschoenen en hoofdtoestellen met markeerelementen op vooraf bepaalde locaties voor het waarnemen van beweging een meervoudig aantal lichaamskledingstukken, handschoenen en hoofdtoestellen met markeerelementen op vooraf bepaalde locaties voor het waarnemen van beweging zijn; waarbij de één of meer avatars een meervoudig aantal avatars zijn; waarbij de een of meer gebruikers een meervoudig aantal gebruikers zijn, gerepresenteerd als een corresponderend meervoudig aantal avatars in de driedimensionale virtual reality simulatie; en waarbij het bewegingsopnamesysteem de bewegingen en interacties registreert van het meervoudig aantal gebruikers gerepresenteerd als het meervoudig aantal avatars binnen de driedimensionale virtual reality simulatie om daarmee een gecoördineerde activiteit binnen de simulatie te simuleren.A system according to claim 1, wherein the one or more body garments, gloves and main devices with marker elements at predetermined locations for detecting movement are a plurality of body garments, gloves, and main devices with marker elements at predetermined locations for detecting movement; wherein the one or more avatars are a multiple number of avatars; the one or more users being a multiple number of users, represented as a corresponding multiple number of avatars in the three-dimensional virtual reality simulation; and wherein the motion recording system records the movements and interactions of the plurality of users represented as the plurality of avatars within the three-dimensional virtual reality simulation to thereby simulate coordinated activity within the simulation. 5. 
Een systeem volgens conclusie 1, waarbij het immersie-observatiesysteem omvat een configureerbare weergave-eenheid omvattende drie wanden en een vloer, waarbij de weergave-eenheid stereoscopische beelden op volledige schaal omvat van de één of meer gebruikers, voor het definiëren van een Cave Automatic Virtual Environment (CAVE).A system according to claim 1, wherein the immersion observation system comprises a configurable display unit comprising three walls and a floor, the display unit comprising full-scale stereoscopic images of the one or more users for defining a Cave Automatic Virtual Environment (CAVE). 6. Computerprogrammaprodukt, dat is op geslagen in één of meer tastbare door een computer leesbare media die leesbaar zijn door de computer zodat het computerprogrammaprodukt in werking treedt voor het uitvoeren van de hierna volgende instructies wanneer deze worden gelezen door de computer: het ontvangen van computer-aided design (CAD) gegevens door een virtual reality simulator; het genereren van videosignalen door de virtual reality simulator voor het simuleren in een virtual reality van een ontwerp vanuit de CAD gegevens weergaven, waarbij de videosignalen worden gericht naar één of meer op het hoofd bevestigde weergave-eenheden, waarbij elk van de één of meer op het hoofd bevestigde weergave-eenheden een verschillend perspectiefvan de virtual reality simulatie heeft, waarbij elk van de één of meer op het hoofd bevestigde weergave-eenheden afzonderlijke weergave-eenheden voor het linkeroog en het rechteroog omvatten met verschillende signalen zodat een gebruiker een beeld in het op het hoofd bevestigde weergave-eenheid stereoscopisch ziet; het volgen van bewegingen van één of meer gebruikers die interacteren met de virtual reality simulatie; het genereren van geregistreerde gegevens van beweging voor de één of meer gebruikers in respons op de gevolgde bewegingen van de één of meer gebruikers; waarbij het opnamesysteem omvat; één of meer 
lichaamskledingstukken, handschoenen en hoofdtoestellen met markeerelementen op vooraf bepaalde locaties voor het waarnemen van beweging; een meervoudig aantal camera’s op vooraf geselecteerde locaties die zijn ingericht voor het opnemen van beelden die bewegingen registreren van de één of meer gebruikers die hun lichaamskledingstuk, handschoenen en hoofdtoestellen voor het opnemen van beweging dragen; en één of meer computers voor het digitaal registreren van locaties van de markeerelementen die zijn verbonden met de lichaamskledingstukken, handschoenen, en hoofdtoestellen wanneer de één of meer gebruikers in de simulatie interacteren, in respons op de op genomen beelden van het meervoudig aantal camera’s; en het in real time aansturen van één of meer op schaal gebrachte avatars in de simulatie door de virtual reality simulator gebruikmakend van de geregistreerde gegevens voor beweging voor de één of meer gebruikers, waarbij bewegingen van één of meer van de avatars corresponderen met de geregistreerde bewegingen van de één of meer gebruikers, zodat een gebruiker van een eerste afmeting een avatar simuleert van een tweede afmeting, waarbij de tweede afmeting verschilt met de eerste afmeting, en zodanig dat de één of meer gebruikers het CAD ontwerp kunnen evalueren; het detecteren van een botsing tussen een avatar van de één of meer geschaalde avatars en de driedimensionale virtual reality simulatie van het CAD ontwerp door de virtual reality simulator; en het doen verdwijnen van een deel van de avatar van het blikveld van een gebruiker van de één of meer gebruikers door de virtual reality simulator voor het verschaffen van feedback voor de gedetecteerde botsing.6. 
Computer program product stored in one or more tangible computer readable media readable by the computer so that the computer program product becomes operative to execute the following instructions when read by the computer: receiving computer -aided design (CAD) data by a virtual reality simulator; generating video signals by the virtual reality simulator for simulating in virtual reality a design from the CAD data displays, wherein the video signals are directed to one or more head-mounted display units, each of the one or more on the head-mounted display units have a different perspective of the virtual reality simulation, each of the one or more head-mounted display units comprising separate display units for the left eye and the right eye with different signals so that a user displays an image in the display unit attached to the head stereoscopically; following the movements of one or more users who interact with the virtual reality simulation; generating recorded motion data for the one or more users in response to the followed motion of the one or more users; wherein the recording system comprises; one or more body garments, gloves, and main devices with marker elements at predetermined locations for observing motion; a plurality of cameras at preselected locations adapted to record images that record movements of one or more users wearing their body garment, gloves, and head motion recording devices; and one or more computers for digitally recording locations of the marker elements associated with the body garments, gloves, and main devices when the one or more users interact in the simulation, in response to the captured images of the plurality of cameras; and controlling one or more scaled avatars in real time in the simulation by the virtual reality simulator using the recorded data for movement for the one or more users, wherein movements of one or more of the avatars correspond to the recorded movements of the one or more users, so that a user of a 
first dimension simulates an avatar of a second dimension, the second dimension differing from the first dimension, and such that the one or more users can evaluate the CAD design; detecting a collision between an avatar of the one or more scaled avatars and the three-dimensional virtual reality simulation of the CAD design by the virtual reality simulator; and causing a part of the avatar to disappear from the field of view of a user of one or more users by the virtual reality simulator to provide feedback for the detected collision. 7. Een computerprogrammaprodukt volgens conclusie 6, waarbij het programmaprodukt voorts werkt voor het uitvoeren van de volgende instructie: het verschaffen van een geluid in een bepaalde richting als respons op de gedetecteerde botsing, waarbij het geluid in een bepaalde richting een richting van de botsing ten opzichte van de gebruiker aangeeft.A computer program product according to claim 6, wherein the program product further functions to execute the following instruction: providing a sound in a particular direction in response to the detected collision, wherein the sound in a particular direction is in a direction of the collision to the user. 8. Een computerprogrammaproduct volgens conclusie 6, waarbij het programmaproduct voorts werkt voor het uitvoeren van de volgende instructies: het wijzigen van een aanzicht van een gedeelte van de driedimensionale virtual reality simulatie door de virtual reality simulator voor het verschaffen van feedback voor de gedetecteerde botsing.A computer program product according to claim 6, wherein the program product further functions to execute the following instructions: modifying a view of a portion of the three-dimensional virtual reality simulation by the virtual reality simulator to provide feedback for the detected collision. 9. 
9. A computer-implemented method for evaluating an engineering design, the method comprising:

creating a three-dimensional virtual reality simulation, by a virtual reality simulator, in response to computer-aided design (CAD) data of a CAD design;

animating in real time one or more avatars in the virtual reality simulation, by the virtual reality simulator, in response to motion data captured by a motion capture system from the one or more users who simultaneously interact with the virtual reality simulation, so that each of the one or more users corresponds to one of the one or more avatars, wherein movements of the one or more users control the one or more avatars and movements of the one or more avatars correspond to the movements of the one or more users, the motion capture system comprising: one or more body suits, gloves, and head devices with marker elements at predetermined locations for sensing motion; a plurality of cameras at preselected locations adapted to record images that register the movements of the one or more users wearing their motion-capture body suits, gloves, and head devices; and one or more computers for digitally recording the locations of the marker elements attached to the body suits, gloves, and head devices while the one or more users interact in the simulation, in response to the images recorded by the plurality of cameras;

displaying the virtual reality simulation, including the interactions of the one or more avatars, as a three-dimensional image that appears to surround the one or more users in real time, thereby defining a common immersive environment, using one or more head-mounted display units so that the one or more users each have a different perspective of the virtual reality simulation in response to the captured motion data;

detecting, by the virtual reality simulator, a collision between an avatar of the one or more scaled avatars and the three-dimensional virtual reality simulation of the CAD design; and

causing a part of the avatar to disappear from the field of view of a user of the one or more users to provide feedback for the detected collision.

10. The computer-implemented method of claim 9, wherein the feedback for the detected collision comprises at least modifying a view of a portion of the three-dimensional virtual reality simulation and providing a sound from a certain direction, the sound from the certain direction indicating a direction of the collision relative to the user, to provide feedback on the detected collision.
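The collision-feedback steps of claims 9 and 10 (detect an avatar/CAD collision, hide the colliding avatar part, and derive a direction for a spatial audio cue) can be sketched in simplified form. This is a minimal illustration, not the patented implementation: the class names, the axis-aligned-box stand-in for CAD geometry, and the single-point test per avatar part are all assumptions made for brevity.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box standing in for a piece of CAD geometry."""
    min_pt: tuple
    max_pt: tuple

    def contains(self, p):
        # True when point p lies inside the box on every axis.
        return all(lo <= c <= hi
                   for c, lo, hi in zip(p, self.min_pt, self.max_pt))

@dataclass
class AvatarPart:
    """One tracked avatar segment (e.g. a gloved hand)."""
    name: str
    position: tuple   # marker-derived position in world space
    visible: bool = True

def check_collisions(parts, cad_boxes):
    """Return the avatar parts whose tracked position penetrates CAD geometry."""
    return [part for part in parts
            if any(box.contains(part.position) for box in cad_boxes)]

def apply_feedback(colliding, user_position):
    """Hide each colliding part and compute a direction vector for audio.

    Hiding the part makes it 'disappear' from the user's view (claim 9);
    the direction vector would drive a sound source placed toward the
    collision relative to the user (claim 10).
    """
    events = []
    for part in colliding:
        part.visible = False
        direction = tuple(p - u for p, u in zip(part.position, user_position))
        events.append((part.name, direction))
    return events
```

In a real system the containment test would be replaced by mesh-level collision queries against the CAD model, run once per tracked frame; the sketch only shows how a detected collision maps to the two feedback channels the claims describe.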
NL2002841A 2009-01-17 2009-05-05 Immersive collaborative environment using motion capture, head mounted display, and cave. NL2002841C2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US35577109 2009-01-17
US12/355,771 US8615383B2 (en) 2008-01-18 2009-01-17 Immersive collaborative environment using motion capture, head mounted display, and cave

Publications (2)

Publication Number Publication Date
NL2002841A NL2002841A (en) 2010-07-20
NL2002841C2 true NL2002841C2 (en) 2014-10-20

Family

ID=42357641

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2002841A NL2002841C2 (en) 2009-01-17 2009-05-05 Immersive collaborative environment using motion capture, head mounted display, and cave.

Country Status (5)

Country Link
KR (1) KR20100084597A (en)
CA (1) CA2662318C (en)
DK (1) DK177693B1 (en)
NL (1) NL2002841C2 (en)
TR (1) TR200903142A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101845231B1 (en) 2011-06-14 2018-04-04 삼성전자주식회사 Image processing apparatus and method
KR101389894B1 (en) * 2012-07-18 2014-04-29 주식회사 도담시스템스 Virtual reality simulation apparatus and method using motion capture technology and
KR101495673B1 (en) * 2013-08-14 2015-02-25 한국항공우주연구원 Spacecraft take off and landing experience system
KR102077108B1 (en) 2013-09-13 2020-02-14 한국전자통신연구원 Apparatus and method for providing contents experience service
KR101493614B1 (en) * 2013-11-01 2015-02-13 (주)세이프텍리서치 Ship Navigation Simulator and Design Method by using Augmented Reality Technology and Virtual Bridge System
US9599821B2 (en) * 2014-08-08 2017-03-21 Greg Van Curen Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
KR101659849B1 (en) * 2015-01-09 2016-09-29 한국과학기술원 Method for providing telepresence using avatars, and system and computer-readable recording medium using the same
WO2016144279A1 (en) 2015-03-06 2016-09-15 Ors Filiz Mujdehan Virtual reality based remote learning system and method
US10043311B2 (en) * 2015-09-16 2018-08-07 The Boeing Company Immersive design management system
CN106409053A (en) * 2016-11-24 2017-02-15 四川艾瑞智信科技有限公司 Portable simulation teaching device based on virtual reality
KR101922677B1 (en) * 2016-12-06 2019-02-20 주식회사 상화 Experience apparatus
FR3062489B1 (en) * 2017-02-01 2020-12-25 Peugeot Citroen Automobiles Sa ANALYSIS DEVICE FOR DETERMINING A DETECTION PERIOD CONTRIBUTING TO A LATENCY TIME WITHIN AN IMMERSIVE SYSTEM OF VIRTUAL REALITY
FR3062488B1 (en) 2017-02-01 2020-12-25 Peugeot Citroen Automobiles Sa ANALYSIS DEVICE FOR DETERMINING A LATENCY TIME OF AN IMMERSIVE SYSTEM OF VIRTUAL REALITY
CN110267028B (en) * 2019-06-24 2021-04-30 中冶智诚(武汉)工程技术有限公司 Signal synchronous display system for five-surface LED-CAVE
KR102472579B1 (en) * 2020-07-16 2022-11-30 주식회사 아이팝 Motion capture system for virtual fire fighting and motion capture method using the same
KR102456069B1 (en) * 2020-07-17 2022-10-18 주식회사 아이팝 Ratio correction method between a real model and virtual model
CN112288880A (en) * 2020-10-29 2021-01-29 重庆建工住宅建设有限公司 Building assembled fitment design system based on AR technique
KR102447171B1 (en) * 2022-03-18 2022-09-27 헬리오센 주식회사 Building touring guide marking system for metaverse
KR20240006373A (en) 2022-07-06 2024-01-15 주식회사 와플코퍼레이션 Method of managing audition and apparatus performing the same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7761269B1 (en) * 2000-04-14 2010-07-20 Ford Global Technologies, Llc System and method of subjective evaluation of a vehicle design within a virtual environment using a virtual reality
ES2233201B1 (en) * 2003-11-21 2006-07-16 Seat, S.A. MIXED REALITY SIMULATION SYSTEM.

Also Published As

Publication number Publication date
CA2662318C (en) 2014-12-02
CA2662318A1 (en) 2010-07-17
DK200900534A (en) 2010-07-18
KR20100084597A (en) 2010-07-27
DK177693B1 (en) 2014-03-10
NL2002841A (en) 2010-07-20
TR200903142A1 (en) 2010-08-23

Similar Documents

Publication Publication Date Title
US8615383B2 (en) Immersive collaborative environment using motion capture, head mounted display, and cave
US8624924B2 (en) Portable immersive environment using motion capture and head mounted display
NL2002841C2 (en) Immersive collaborative environment using motion capture, head mounted display, and cave.
Krichenbauer et al. Augmented reality versus virtual reality for 3d object manipulation
Araujo et al. Snake charmer: Physically enabling virtual objects
TWI567659B (en) Theme-based augmentation of photorepresentative view
US20170092223A1 (en) Three-dimensional simulation system for generating a virtual environment involving a plurality of users and associated method
Kim et al. Haptic interaction and volume modeling techniques for realistic dental simulation
Spanlang et al. A first person avatar system with haptic feedback
Drossis et al. Interaction with immersive cultural heritage environments using virtual reality technologies
Song et al. An immersive VR system for sports education
Zaldívar-Colado et al. A mixed reality for virtual assembly
Camporesi et al. The effects of avatars, stereo vision and display size on reaching and motion reproduction
De Paolis et al. Augmented Reality, Virtual Reality, and Computer Graphics: 4th International Conference, AVR 2017, Ugento, Italy, June 12-15, 2017, Proceedings, Part I
Lyne Development of virtual reality applications for the construction industry using the Oculus Rift head mounted display
Onyesolu et al. A survey of some virtual reality tools and resources
Thalmann et al. Virtual reality software and technology
Bernal A system for immersive medical and engineering training based on serious games
EP4151291A1 (en) Information processing device, information processing method, and program
US11393153B2 (en) Systems and methods performing object occlusion in augmented reality-based assembly instructions
Nesamalar et al. An introduction to virtual reality techniques and its applications
Chaikhamwang et al. The development of public relations for school of computer and information technology chiangrai rajabhat university using virtual reality technology
Chung et al. Optimizing Camera Setup for In-Home First-person Rendering Mixed Reality Gaming System
Magnenat-Thalmann et al. Virtual reality software and technology
Akinjala et al. Animating human movement & gestures on an agent using Microsoft kinect

Legal Events

Date Code Title Description
MM Lapsed because of non-payment of the annual fee

Effective date: 20190601