WO2003015057A1 - Augmented reality-based firefighter training system and method - Google Patents


Info

Publication number
WO2003015057A1
Authority
WO
WIPO (PCT)
Prior art keywords
firefighter
ruggedized
scba
extinguishing agent
nozzle
Prior art date
Application number
PCT/US2002/025065
Other languages
English (en)
French (fr)
Inventor
John Franklin Ebersole
John Franklin Ebersole, Jr.
Todd Joseph Furlong
Mark Stanley Bastian
John Franklin Walker
Original Assignee
Information Decision Technologies Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/927,043 external-priority patent/US7110013B2/en
Priority claimed from US10/123,364 external-priority patent/US6822648B2/en
Application filed by Information Decision Technologies Llc filed Critical Information Decision Technologies Llc
Priority to CA002456858A priority Critical patent/CA2456858A1/en
Priority to EP02752732A priority patent/EP1423834A1/de
Publication of WO2003015057A1 publication Critical patent/WO2003015057A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/10 - Geometric effects
    • G06T15/40 - Hidden part removal
    • A - HUMAN NECESSITIES
    • A62 - LIFE-SAVING; FIRE-FIGHTING
    • A62B - DEVICES, APPARATUS OR METHODS FOR LIFE-SAVING
    • A62B9/00 - Component parts for respiratory or breathing apparatus
    • A62B9/006 - Indicators or warning devices, e.g. of low pressure, contamination
    • A62C - FIRE-FIGHTING
    • A62C99/00 - Subject matter not provided for in other groups of this subclass
    • A62C99/0081 - Training methods or equipment for fire-fighting
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0176 - Head mounted characterised by mechanical features
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • This invention relates to training firefighters in an augmented reality (AR) simulation that includes creation of graphics depicting fire, smoke, and application of an extinguishing agent; extinguishing agent interaction with fire and airborne particles; applying computer generated extinguishing agent via an instrumented and ruggedized firefighter's nozzle; displaying the simulated phenomena anchored to real-world locations as seen through an instrumented and ruggedized head-worn display; and occlusion of virtual objects by a real person and/or other moveable objects.
  • the firefighter should wear a real SCBA and use a real firefighter's vari-nozzle to extinguish the fires.
  • These items can be instrumented to measure their positions and orientations for use in the system.
  • these items should be rugged so that they can withstand the rigorous treatment they would undergo in a real operational environment. This includes the need for protection against shock and penetration from contaminants, such as dirt and water. This allows safe, cost-effective training with greater realism than pure virtual reality (VR) simulations.
  • Tasks (1) to (4) are applicable to any fire situation - reactive or interactive. Therefore, any significant improvement in developing training skills for Tasks (1) to (4) will result in a significantly skilled firefighter for both reactive and interactive scenarios.
  • An objective of this invention is to demonstrate the feasibility of augmented reality as the basis for an untethered, ARBT system to train firefighters.
  • Two enabling technologies will be exploited: a flexible, wearable belt PC and an augmented reality head-mounted display (HMD).
  • the HMD can be integrated with a firefighter's SCBA. This augments the experience by allowing the user to wear a real firefighter's SCBA (Self-Contained Breathing Apparatus) while engaging in computer-enhanced fire training situations.
  • Augmented reality is a hybrid of a virtual world and the physical world in which virtual stimuli (e.g. visual, acoustic, thermal, olfactory) are dynamically superimposed on sensory stimuli from the physical world.
  • the inventive ARBT system has significant potential to produce a training program that increases skills in Tasks (1) to (4), and such a program is adaptable to essentially any fire department, large or small, whether on land, air, or sea.
  • Opportunities for Augmented Reality for Training: Augmented reality has emerged as a training tool and can be a medium for successful delivery of training.
  • the cost of an effective training program built around augmented reality-based systems arises primarily from considerations of the computational complexity and the number of senses required by the training exercises. Because of the value of training firefighters in Tasks (1) to (4) for any fire situation, and because the program emphasizes firefighter reactions to (vs. interactions with) fire and smoke, training scenarios can be precomputed.
  • PC technology is capable of generating virtual world stimuli in real time.
  • the opportunity identified above, which has focused on reactions of firefighters to fire and smoke in training scenarios, is amenable to augmented reality.
  • In augmented reality, sensory stimuli from portions of a virtual world are superimposed on sensory stimuli from the real world. If we consider a continuous scale running from the purely physical world to completely virtual worlds, the hybrid situations in between are termed augmented reality.
  • the position on a reality scale is determined by the ratio of virtual world sensory information to real world information.
  • This invention creates a firefighter training solution that builds on the concept of an augmented physical world, known as augmented reality. Ideally, all training should take place in the real world. However, due to such factors as cost, safety, and environment, we have moved some or all of the hazards of the real world to the virtual world while maintaining the critical training parameters of the real world, e.g., we are superimposing virtual fire and smoke onto the real world.
  • Mitler (1991) divides fire models into two basic categories: deterministic and stochastic models. Deterministic models are further divided into zone models, field models, hybrid zone/field models, and network models. For purposes of practicality and space limitations, we limit the following discussions to deterministic models, specifically zone type fire models. Mitler goes on to prescribe that any good fire model must describe convective heat and mass transfer, radiative heat transfer, ignition, pyrolysis and the formation of soot. For our purposes, models of flame structure are also of importance.
  • This aspect of the invention comprises an orthogonal plane billboarding technique that allows textures with fuzzy edges to be used to convey the sense of a soft-edged 3D model in the space. This is done by creating three orthogonal (perpendicular) planes. A texture map is mapped onto each of these planes, consisting of profile views of the object of interest as silhouettes of how they look from each of these directions.
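The orthogonal-plane construction described above can be sketched in code. The function names and the unit half-size below are illustrative, not from the patent; a silhouette texture of the object (front, top, and side profile views) would be mapped onto each quad:

```python
import numpy as np

def orthogonal_plane_quads(half_size=1.0):
    """Three mutually perpendicular quads (in the XY, XZ, and YZ planes),
    each centered at the origin, returned as 4x3 vertex arrays."""
    s = float(half_size)
    xy = np.array([[-s, -s, 0], [s, -s, 0], [s, s, 0], [-s, s, 0]], float)
    xz = np.array([[-s, 0, -s], [s, 0, -s], [s, 0, s], [-s, 0, s]], float)
    yz = np.array([[0, -s, -s], [0, s, -s], [0, s, s], [0, -s, s]], float)
    return xy, xz, yz

def plane_normal(quad):
    """Unit normal of a quad, computed from two of its edge vectors."""
    n = np.cross(quad[1] - quad[0], quad[3] - quad[0])
    return n / np.linalg.norm(n)
```

Because the three plane normals are mutually perpendicular, at least one quad always faces the viewer substantially, which is what conveys the soft-edged 3D impression.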
  • One application of this orthogonal plane billboard is the modeling of a human head and torso, which can be used to occlude fire, water, and smoke in the invention.
  • fidelity is increased by providing sufficient accuracy in real-time such that a computer can generate virtual images and mix them with the image data from the specially mounted camera in a way that the user sees the virtual and real images mixed in real time.
  • the head-mounted tracker allows a computer to synchronize the virtual and real images such that the user sees an image being updated correctly with his or her head.
  • the headphones further enhance the virtual/real experience by providing appropriate aural input.
  • Augmented Reality Equipment: A description of augmented reality was presented above. Commercial off-the-shelf technologies exist with which to implement augmented reality applications, including helmet-mounted displays (HMDs), position tracking equipment, and live/virtual mixing of imagery.
  • FIG 3 illustrates the geometric particle representation associated with flames.
  • FIG 4 illustrates the three particle systems used to represent a fire.
  • FIG 5 illustrates the idea of two-layer smoke obscuration.
  • FIG 6 illustrates particle arrangement for a surface representation of a particle system.
  • FIG 7 illustrates a surface mesh for a surface representation of a particle system.
  • FIG 8 illustrates the technologies that combine to create an AR firefighter training system, and method.
  • FIG 9 is a diagram indicating a nozzle, extinguishing agent stream, and fire and smoke plume.
  • FIG 10 is a variant of FIG 9 in which the extinguishing agent stream is modeled with multiple cone layers to represent multiple velocities across the profile of the stream.
  • FIG 11 is a diagram of the three orthogonal planes that contain the three texture mapped images of a human head, useful in understanding the invention.
  • FIG 12 is a diagram of a human head and torso, which will be compared to graphical components in FIG 13; and
  • FIG 13 is a diagram of two sets of orthogonal planes, along with the joint between the two sets, for the human head and torso of FIG 12.
  • FIG 14 is a diagram of the main components of the HMD integrated with the SCBA.
  • FIG 16 is a diagram of a headphone attachment design.
  • FIG 17 is a diagram of a bumper that can be used to protect a mirror.
  • FIG 18 is an exploded view of a mirror mount design that places minimum pressure on the mirror to minimize distortion due to compression.
  • FIG 19 depicts a structure that can be used to protect a mirror from being bumped and to prevent the mirror mount from being hooked from underneath.
  • FIG 20 is a cross-sectional view of the structure of FIG19.
  • FIGS 21-23 are top, side, and end views, respectively, of the structure in FIGS 19 and 20.
  • FIG 26 is a perspective view of a nozzle and all of the major instrumentation components involved in the preferred embodiment of the invention, except for the cover;
  • FIG 27 is the same as FIG 26, but with the cover in place;
  • FIG 28 is a top view of the fully assembled ruggedized nozzle of FIG 27;
  • FIG 29 is a front view of the fully assembled ruggedized nozzle of FIG 27.
  • FIG 30 schematically depicts the basic optical and tracking components of the preferred embodiment of the invention and one possible arrangement of them.
  • FIG 31 shows the same components with an overlay of the optical paths and relationships to the components.
  • FIG 32 is a dimensioned engineering drawing of the prism of FIG 30.
  • FIG 33 shows the components of FIG 30 in relation to the SCBA face mask and protective shell.
  • FIG 34 shows externally visible components, including the SCBA and protective shell.
  • FIG 1 is a block diagram indicating the hardware components of the augmented reality (AR) firefighter training system.
  • Imagery from a head-worn video camera 4 is mixed in video mixer 3 via a linear luminance key with computer-generated (CG) output that has been converted to NTSC using VGA-to-NTSC encoder 2.
  • the luminance key removes white portions of the computer-generated imagery and replaces them with the camera imagery.
  • Black computer graphics remain in the final image, and luminance values for the computer graphics in between white and black are blended appropriately with the camera imagery.
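The luminance-keying behavior described above can be sketched in software terms. The Rec. 601 luma weights below are a common choice and an assumption here, since the text does not specify the exact luminance formula used by the hardware mixer:

```python
import numpy as np

def luminance_key(cg_rgb, cam_rgb):
    """Linear luminance key: white CG pixels are replaced by camera
    imagery, black CG pixels are kept, and intermediate luminance values
    blend the two.  Inputs are float RGB arrays in [0, 1], shape (H, W, 3)."""
    # Rec. 601 luma as the keying signal (an assumed formula).
    luma = (0.299 * cg_rgb[..., 0] + 0.587 * cg_rgb[..., 1]
            + 0.114 * cg_rgb[..., 2])[..., None]
    return luma * cam_rgb + (1.0 - luma) * cg_rgb
```

A fully white CG pixel (luma 1) yields pure camera imagery; a fully black one (luma 0) stays black, matching the description of the mixer's behavior.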
  • the final image is displayed to a user in head-mounted display (HMD) 5.
  • This system requires significant hardware to accomplish the method of the invention.
  • the user interfaces with the system using an instrumented and ruggedized firefighter's SCBA and vari-nozzle. This equipment must be ruggedized since shock sensitive parts are mounted on it. Additionally, other equipment is used to mix the real and computer generated images and run the simulation.
  • Two preferred trackers for this invention are the INTERSENSE (InterSense, Inc., 73 Second Avenue, Burlington, MA 01803, USA) IS-900™ and IS-600 Mark 2 Plus™.
  • the HMD 52 can be either see-through or non-see-through in this invention.
  • the preferred method of attachment to the SCBA mask is to cut out part or all of the transparent (viewing) portion of the mask to allow the bulk of the HMD to stick out, while placing the HMD optics close to the wearer's eyes. Holes drilled through the mask provide attachment points for the HMD.
  • the preferred HMD for this invention is the VIRTUAL RESEARCH (Virtual Research Systems, Inc., 3824 Vienna Drive, Aptos, California 95003) V6™ for a non-see-through method. (See FIG 14)
  • any SCBA mask 53 (FIG 14) can be used with this invention.
  • One preferred mask is the SCOTT (Scott Aviation, A Scott Technologies Company, Lancaster, NY 14086) AV2000™.
  • This mask is an example of the state of the art for firefighting equipment, and the design of the mask has a hole near the wearer's mouth that allows easy breathing and speaking when a regulator is not attached.
  • the mask face seal, the "catcher's mask" style straps for attaching the mask, and the nose cup are features that are preserved.
  • the rest of the SCBA can be blacked out by using an opaque substance such as tape, foam, plastic, rubber, silicone, paint, or preferably a combination of plastic, silicone, and paint. This is done to ensure that the trainee doesn't see the un-augmented real world by using his or her un-augmented peripheral vision to see around AR smoke or other artificial (computer-generated virtual) obstacles.
  • the instrumentation that is protected in the instrumented SCBA consists of (1) the head mounted display (HMD) used to show an image to the user; (2) the camera used to acquire the image the user is looking at; (3) the InterSense InertiaCube used to measure the SCBA orientation; (4) two InterSense SoniDiscs used to measure the SCBA position; and (5) the prism used to shift the image in front of the user's eyes so that the image is in front of the camera. All of this equipment, except for the prism, has electrical connections that carry signals through a tether to a computer, which receives and processes these signals.
  • the eye 137 of the person wearing the SCBA looks through the optics 138 to see the image formed on the display element inside the electronics portion 139 of the HMD.
  • the image of the outside world is captured by camera 135, which looks through a prism 140 that has two reflective surfaces to bend the path of light to the camera 135.
  • the tracking components include the InertiaCube 136 and two SoniDiscs 141 which are positioned on either side of the camera, one going into the figure, and one coming out of the figure.
  • the InertiaCube 136 can be placed anywhere there is room on the structure, but the SoniDiscs 141 must be at the top of the structure in order to have access to the external tracking components.
  • FIG 31 shows detailed sketches of the light paths.
  • Upon entering the prism 150 at the polished transparent entrance point 153, the FOV 151 of the camera 155 is temporarily reduced due to the refractive index of the glass, preferably SFL6, which has a very high index of refraction while maintaining a relatively low density.
  • This reduced FOV 151 is reflected first off mirrored surface 152 and then mirrored surface 148 before exiting through surface 149.
  • the FOV 151 is restored to its original size, and any aberrations due to the glass are eliminated, since the light enters and exits the prism perpendicular to surfaces 153 and 149.
  • the image captured by the camera 155 is effectively taken from the virtual eye-point 147, even though the real eye-point of the camera is at point 154.
  • the virtual eye-point 147 would ideally be at the same point as the user's eye-point 144. To make this happen, however, the optics would have to be bigger. It is preferred to use smaller optics that place the virtual eye-point 147 slightly forward of the user's eye-point 144. This arrangement tends to be acceptable to most users. Even though the virtual eye-point 147 isn't lined up exactly with the eye 144, the HMD (146 and 145) as well as the prism 150 are all co-located on the same optical axis 143, thereby minimizing the disorientation of the user.
  • mirrors are used, there are several choices for materials. The lightest and cheapest are plastic mirrors. A step up in quality would be the use of glass mirrors, especially if they are front-surface mirrors (versus the typical back-surfaced mirrors used in households). The highest durability and quality can be achieved with metallic mirrors.
  • Metallic mirrors, preferably aluminum, can be manufactured with tough, highly reflective surfaces. Metal mirrors can also be made with built-in mounting points, yielding a mirror very well suited to the invention's needs for light weight, compact size, and durability.
  • the very simplest alternative method of camera arrangement would be to put the camera parallel to and directly above the optical axis of HMD 146 and 145. This method negates the need for a prism or mirrors, but it loses all of the benefit of the virtual eye-point on the optical axis of the HMD 146 and 145.
  • a hard plastic electronics enclosure or shell 170 attaches to the SCBA 174 preferably with bolts 173, providing a means for hiding from view and protecting from the elements all electronic equipment, except for the SoniDisc speakers 169, which must be exposed to the air to allow the separate tracking devices (not shown) to receive the ultrasonic chirps coming from the SoniDiscs.
  • the plastic shell 170 that surrounds all of the equipment should be made of a tough material, such as nylon, that can withstand the shock of being dropped, yet is slightly bendable, allowing for a little bit of inherent shock-mounting for the equipment.
  • the HMD 176, prism 171, camera 168, and/or tracking equipment 167 and 179 can be mounted to the SCBA 174 and plastic shell 170 with rubber mounting points (not shown).
  • the HMD, prism, camera, and/or tracking equipment can all be mounted together with a very rigid structure, for example a metallic frame (not shown). That rigid structure could then be mounted separately to the plastic shell, preferably with shock- absorbing mounts.
  • the signal wires (not shown) coming from the instrumentation 168, 176, 167, and 179 come out of the plastic shell 170 through a single hole 166 with built-in strain relief, ensuring that the cables cannot be pulled out of the plastic shell 170 through normal use, and also ensuring that pulling on the cables will not create unacceptable cable wear.
  • the cables coming out of the plastic shell can be attached to a single, specialized connector either mounted on the plastic cover at the exit point 166, or preferably attached to the belt of the wearer. From that connector, another single cable connects this one connector to the computer (not shown) and other equipment (not shown) by splitting the cable into its sub-connectors as needed by the various components.
  • This specialized connector provides for easy connect and disconnect of the equipment from the user.
  • the equipment is protected by a plastic cover which protects the overall assembly from both shock and penetration by foreign agents, such as water and dirt.
  • the InertiaCube and SoniDiscs are relatively easy to knock out of calibration and are sensitive to shock; the protective cover provides a great deal of shock protection for them.
  • One potential issue with the HMD 176 is the build-up of heat, as the HMD gives off a substantial amount.
  • One method is to put vent holes (not shown) in the plastic cover 170, allowing direct access to cool air outside the cover 170; however, that can let in foreign contaminants.
  • the preferred method is to have one-way valves 172, 180 inside the SCBA mask 174. In this preferred method, as the user breathes in, the air comes in through the mouthpiece as normal, but is then redirected by a one-way valve 175, up through a one-way valve 172 into shell 170, over the electronics, and away from the electronics through another one-way valve 180 before entering the user's airway. When exhaling, the moist, warm air gets a direct path to the outside via the one-way valve 175. This redirection of air is preferably accomplished with typical one-way rubber valves.
  • Additional hardware is attached to a firefighter's vari-nozzle 193 to allow control of a virtual water stream.
  • the nozzle used is an Elkhart vari-nozzle.
  • the instrumentation for the nozzle consists of (1) a potentiometer used to measure the nozzle fog pattern; (2) a potentiometer used to measure the nozzle bail angle; (3) an INTERSENSE (Burlington, MA) InertiaCube used to measure the nozzle orientation; and (4) two INTERSENSE (Burlington, MA) SoniDiscs used to measure the nozzle position. All of this equipment is connected by wiring that carries message signals through a tether to a computer and associated equipment (including an analog-to-digital converter) which receives and processes these signals.
  • the InertiaCube and SoniDiscs are equipment from the InterSense IS-600 line of tracking equipment. If the end user of the nozzle calls for the use of tracking equipment other than the IS-600 line, the invention could readily be adapted to protect equipment from the IS-900 line from InterSense, and 3rd Tech's optical tracking equipment.
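As a hypothetical illustration of how the nozzle's potentiometer signals might be turned into usable values after the analog-to-digital converter (the function name and calibration endpoints are invented for this sketch, not taken from the patent; real endpoints would come from bench calibration):

```python
def adc_to_angle(counts, counts_min, counts_max, angle_min, angle_max):
    """Linearly map a raw ADC reading from a nozzle potentiometer to a
    physical value such as bail angle or fog-pattern setting.  Readings
    outside the calibrated range are clamped."""
    counts = min(max(counts, counts_min), counts_max)
    frac = (counts - counts_min) / float(counts_max - counts_min)
    return angle_min + frac * (angle_max - angle_min)
```

For a hypothetical 10-bit converter, `adc_to_angle(raw, 0, 1023, 0.0, 90.0)` would map the bail's full potentiometer travel onto a 0-90 degree range for the simulation.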
  • At least two potentiometers are mounted directly to the vari-nozzle 193.
  • the InertiaCube 194 and SoniDiscs 183 are attached to a rigid, yet floating hard plastic island 181 which holds these items firmly in place.
  • This island 181 is attached to a rigid hard plastic base 190 by two narrow, flexible posts 189.
  • the base 190 is rigidly attached to the nozzle.
  • the connection can be held much more securely with this method than with standard plugs. By using solder or strong screw-down terminals, quality wire connections can be assured.
  • a separate cable (not shown) connects this common mounting block 195 to the computer and associated equipment which receives data from the nozzle.
  • the specific wires that can be easily mounted in this way include (a) the leads from the SoniDiscs 183, and (b) the wires attached to the leads 188 of the potentiometer under the plate 187 and the potentiometer inside the pattern selector 192.
  • the cable connection to the InertiaCube 194 may not be suitable to separately wire in this fashion since the InertiaCube signals may be sensitive to interference due to shielding concerns, though it should be possible to use a connector provided from InterSense.
  • the wires and/or cables are routed through a hole in the nozzle (not shown), and down the hose (not shown) to the end where they can come out and connect to the needed equipment. This method keeps the wires from being visible to the user.
  • the equipment is protected by a plastic cover 198, which protects the overall assembly from both shock and penetration by foreign agents, such as water and dirt.
  • the INTERSENSE (Burlington, MA) InertiaCube is sensitive to shock, especially from the action of the metal bail handle 184 hitting the metal stops at either extreme of its range of motion.
  • the InertiaCube and SoniDiscs are mounted on an island which is held up by two soft polymeric pillars. This provides a great deal of protection against shock from slamming the bail handle all the way forward very quickly.
  • This island is also surrounded by a thin layer of padding (not shown in the figures) located between the island 181, and the cover 198 to protect the island from horizontal shock.
  • This thin layer also provides further protection from penetration by foreign agents, and can be made such that an actual seal is made around the circumference of the island.
  • a small shoe made of soft material is used as a bumper 191 to prevent shock caused by setting the device down too rapidly or dropping it.
  • the shoe also provides wear resistance to the base part in the assembly.
  • the shoe is also a visual cue to the user that the device is to be set down using the shoe as a contact point.
  • the protective design uses an alternating yellow and black color scheme (not shown) to get the user's attention that this is not a standard part. Additionally, a sign attached to the cover (not shown) is used which indicates that the device is to be used for training purposes only.
  • One alternative to the display setup diagrammed in FIG 1 is the use of optical see-through AR.
  • camera 4 and video mixer 3 are absent, and HMD 5 is one that allows its wearer to see computer graphics overlaid on his/her direct view of the real world.
  • This embodiment is not currently preferred for fire fighting because current see-through technology does not allow black smoke to obscure a viewer's vision.
  • a second alternative to the display setup diagrammed in FIG 1 is capturing and overlaying the camera video signal in the computer, which removes the video mixer 3 from the system diagram.
  • This allows high-quality imagery to be produced because the alpha, or transparency channel of the computer 1 graphics system may be used to specify the amount of blending between camera and CG imagery.
  • This embodiment is not currently preferred because the type of image blending described here requires additional delay of the video signal over the embodiment of FIG 1, which is undesirable in a fire fighting application because it reduces the level of responsiveness and interactivity of the system.
  • a third alternative to the display setup diagrammed in FIG 1 is producing two CG images and using one as an external key for luminance keying in a video mixer.
  • two VGA-to-NTSC encoders 2 are used to create two separate video signals from two separate windows created on the computer 1.
  • One window is an RGB image of the scene
  • a second window is a grayscale image representing the alpha channel.
  • the RGB image may be keyed with the camera image using the grayscale alpha signal as the keying image.
  • Such an embodiment allows controllable transparency with a minimum of real-world video delay.
  • FIG 1 diagrams the two 6 degree-of-freedom (6DOF) tracking stations 7 and 8 present in all embodiments of the system.
  • One tracking station 7 is attached to the HMD 5 and is used to measure a user's eye location and orientation in order to align the CG scene with the real world. In addition to matching the real-world and CG eye locations, the fields of view must be matched for proper registration.
  • the second tracking station 8 measures the location and orientation of a nozzle 9 that may be used to apply virtual extinguishing agents. Prediction of the 6DOF locations of 7 and 8 is done to account for system delays and allow correct alignment of real and virtual imagery. The amount of prediction is varied to allow for a varying CG frame rate.
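A minimal sketch of the delay-compensating prediction described above, assuming a simple first-order (constant-velocity) extrapolation; the patent does not specify the predictor, so this is illustrative only:

```python
def predict_position(pos, vel, latency_s):
    """First-order prediction of a tracked 3D position: extrapolate along
    the current velocity estimate to compensate for system delay.  The
    prediction interval latency_s would be varied with the current CG
    frame rate, as described above."""
    return [p + v * latency_s for p, v in zip(pos, vel)]

# Example: with 50 ms of total delay, a point moving at 0.4 m/s along x
# is drawn 2 cm ahead of its last measured position.
predicted = predict_position([1.0, 0.0, 0.0], [0.4, 0.0, 0.0], 0.050)
```

The same extrapolation would be applied to both tracking stations (HMD and nozzle), with the interval recomputed each frame.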
  • the system uses an InterSense IS-600 tracking system 6, and it also supports the InterSense IS-900 and the Ascension Flock of Birds.

SOFTWARE
  • A method for real-time depiction of fire is diagrammed in FIGS 2-4.
  • a particle system is employed for each of the persistent flame, intermittent flame, and buoyant plume components of a fire, as diagrammed in FIG 4.
  • the particles representing persistent and intermittent flames are created graphically as depicted in FIG 3.
  • Four triangles make up a fire particle, with transparent vertices 12-15 at the edges and an opaque vertex 16 in the center. Smooth shading of the triangles interpolates vertex colors over the triangle surfaces.
  • the local Y axis 27 of a fire particle is aligned to the direction of particle velocity, and the particle is rotated about the local Y axis 27 to face the viewer, a technique known as "billboarding."
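The velocity-aligned billboarding described above can be sketched as follows (the function name and vector conventions are illustrative):

```python
import numpy as np

def billboard_basis(velocity, to_viewer):
    """Orthonormal basis for a velocity-aligned billboard: local Y follows
    the particle velocity, and the quad is rotated about that axis so its
    normal points as nearly toward the viewer as the Y constraint allows.
    (Degenerate if velocity and viewer direction are exactly parallel.)"""
    y = velocity / np.linalg.norm(velocity)
    z = to_viewer - np.dot(to_viewer, y) * y   # viewer direction, minus its Y component
    z = z / np.linalg.norm(z)
    x = np.cross(y, z)
    return x, y, z   # columns of the particle's rotation matrix
```

Each flame particle's quad is then drawn in the (x, y) plane of this basis, so it stretches along its velocity while still facing the camera.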
  • a fire texture map is projected through both the persistent and intermittent flame particle systems and rotated about a vertical axis to give a horizontal swirling effect.
  • the flame base 17 is used as the particle emitter for the three particle systems, and buoyancy and drag forces are applied to each system to achieve acceleration in the persistent flame, near-constant velocity in the intermittent flame, and deceleration in the buoyant plume.
  • An external force representing wind or vent flow may also be applied to affect the behavior of the fire plume particles.
  • When flame particles are born, they are given a velocity directed towards the center of the fire and a life span inversely proportional to their initial distance from the flame center.
  • the emission rate of intermittent flame particles fluctuates sinusoidally at a rate determined by a correlation with the flame base area. Flame height may be controlled by appropriately specifying the life span of particles in the center portion of the flame.
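A minimal sketch of the particle dynamics described above, using an Euler integration step; the buoyancy, drag, and life-span constants are assumptions for illustration, not values from the patent:

```python
import numpy as np

def step_particle(pos, vel, dt, buoyancy=2.0, drag=0.5, wind=(0.0, 0.0, 0.0)):
    """One Euler integration step for a plume particle: buoyancy pushes it
    up (+Y), drag opposes its velocity, and an optional wind/vent force
    acts as the external influence mentioned above."""
    pos = np.asarray(pos, float)
    vel = np.asarray(vel, float)
    accel = np.array([0.0, buoyancy, 0.0]) - drag * vel + np.asarray(wind, float)
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel

def life_span(dist_from_center, k=1.0, eps=1e-3):
    """Life span inversely proportional to a particle's initial distance
    from the flame center, per the description above."""
    return k / (dist_from_center + eps)
```

Tuning the drag relative to buoyancy yields the three regimes named in the text: acceleration (persistent flame), near-constant velocity (intermittent flame), and deceleration (buoyant plume).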
  • a number of graphical features contribute to the realistic appearance of the fire and smoke plume diagrammed in FIG 4.
  • Depth buffer writing is disabled when drawing the particles to allow blending without the need to order the drawing of the particles from back to front.
  • a light source is placed in the center of the flames, and its brightness fluctuates in unison with the emission rate of the intermittent flame particle system. The light color is based on the average color of the pixels in the fire texture map applied to the flame particles. Lighting is disabled when drawing the flame particles to allow them to be at full brightness, and lighting is enabled when drawing the smoke particles to allow the light source at the center of the flame to cast light on the smoke plume.
  • a billboarded, texture-mapped, polygon with a texture that is a round shape fading from bright white in the center to transparent at the edges is placed in the center of the flame to simulate a glow.
  • the RGB color of the polygon is the same as the light source, and the alpha of the polygon is proportional to the density of smoke in the atmosphere. When smoke is dense, the glow polygon masks the individual particles, making the flames appear as a flickering glow through smoke.
  • the glow width and height is scaled accordingly with the flame dimensions.
  • FIG 5 describes the concept of two layers of smoke in a compartment.
  • smoke from the buoyant plume rises to the top of a room and spreads out into a layer, creating an upper layer 20 and a lower layer 21 with unique optical densities.
  • the lower layer has optical density k1
  • the upper layer has optical density k2
  • a polygonal model of the real room and contents is created.
  • the model is aligned to the corresponding real-world room using the system of FIG 1.
  • the above equations are applied to modify the vertex colors to reflect smoke obscuration. Smooth shading interpolates between vertex colors so that per-pixel smoke calculations are not required. If the initial color, Ci, of the vertices is white, and the smoke color, Cs, is black, the correct amount of obscuration of the real world will be achieved using the luminance keying method described above.
  • the above equations can be applied to the alpha value of vertices of the room model.
  • color values are generally specified using an integer range of 0 to 255 or a floating point range of 0 to 1.0.
  • this color specification does not take into account light sources such as windows to the outdoors, overhead fluorescent lights, or flames, which will shine through smoke more than non-luminous objects such as walls and furniture.
  • a luminance component was added to the color specification to affect how objects are seen through smoke.
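A sketch of the per-vertex obscuration computation, assuming a Beer-Lambert style transmission term (the patent's exact equations are not reproduced here); the path lengths through the lower layer (density k1) and upper layer (density k2) are taken as inputs, and the function name is hypothetical:

```python
import math

def obscured_color(c_init, c_smoke, k1, d1, k2, d2):
    """Blend a vertex color toward the smoke color according to transmission
    along a sight line crossing d1 of the lower layer and d2 of the upper."""
    t = math.exp(-(k1 * d1 + k2 * d2))  # fraction of light transmitted
    return tuple(ci * t + cs * (1.0 - t) for ci, cs in zip(c_init, c_smoke))
```

With white initial vertices and black smoke, as described above, the result drives the luminance-key mixing directly.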
  • the same polygonal model used for smoke obscuration is also used to allow real-world elements to occlude the view of virtual objects such as smoke and fire.
  • a fire plume behind a real desk that has been modeled is occluded by the polygonal model. In the combined AR view, it appears as if the real desk is occluding the view of the fire plume.
  • Graphical elements such as flame height, smoke layer height, upper layer optical density, and lower layer optical density may be given a basis in physics by allowing them to be controlled by a zone fire model.
  • a file reader developed for the system allows CFAST models to control the simulation.
  • CFAST, or Consolidated Fire and Smoke Transport, is a zone model developed by the National Institute of Standards and Technology (NIST) and used worldwide for compartment fire modeling.
  • Upper layer temperature calculated by CFAST is monitored by the simulation to predict the occurrence of flashover, or full room involvement in a fire. The word "flashover" is displayed to a trainee and the screen is turned red to indicate that this dangerous event in the development of a fire has occurred.
  • a key component in a fire fighting simulation is simulated behavior and appearance of an extinguishing agent.
  • water application from a vari-nozzle 9, 23, and 25 has been simulated using a particle system.
  • a surface representation of a particle system was devised. This representation allows very few particles to represent a water stream, as opposed to alternative methods that would require the entire volume of water to be filled with particles.
  • Behavior such as initial water particle velocity and hose stream range for different nozzle settings is assigned to a water particle system. Water particles are then constrained to emit in a ring pattern from the nozzle location each time the system is updated. This creates a series of rings of particles 22 as seen in FIG 6.
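The ring-emission step might look like the following sketch (nozzle axis fixed along +Z; names and spread handling are hypothetical):

```python
import math

def emit_ring(nozzle_pos, axis_speed, ring_radius, n=16):
    """Emit one ring of water particles around the nozzle axis (+Z here).
    Repeating this every update yields the series of rings whose surface
    stands in for the full water volume."""
    particles = []
    for i in range(n):
        a = 2.0 * math.pi * i / n
        pos = (nozzle_pos[0] + ring_radius * math.cos(a),
               nozzle_pos[1] + ring_radius * math.sin(a),
               nozzle_pos[2])
        vel = (0.0, 0.0, axis_speed)  # per-nozzle-setting spread would go here
        particles.append((pos, vel))
    return particles
```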
  • collision detection with the polygonal room and contents model is employed.
  • a ray is created from a particle's current position and its previous position, and the ray is tested for intersection with room polygons to detect collisions.
  • the particle's velocity component normal to the surface is reversed and scaled according to an elasticity coefficient.
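The collision response can be written compactly; the elasticity value is hypothetical:

```python
def reflect_velocity(vel, normal, elasticity=0.3):
    """Reverse and scale the velocity component normal to the surface,
    leaving the tangential component unchanged. `normal` must be unit length."""
    vn = sum(v * n for v, n in zip(vel, normal))  # signed normal speed
    return tuple(v - (1.0 + elasticity) * vn * n for v, n in zip(vel, normal))
```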
  • the same collision method is applied to smoke particles when they collide with the ceiling of a room. Detection of collision may be accomplished in a number of ways.
  • the "brute force" approach involves testing every particle against every polygon.
  • a space partitioning scheme may be applied to the room polygons in a preprocessing stage to divide the room into smaller units.
  • Some space partitioning schemes include creation of a uniform 3-D grid, binary space partitioning (BSP), and octree space partitioning (OSP).
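As an illustration of the simplest of these schemes, a uniform 3-D grid can be built in a preprocessing pass. A full implementation would insert each polygon into every cell its bounding box overlaps; this sketch hashes centroids only, and all names are hypothetical:

```python
import math

def grid_cell(point, cell_size):
    # Map a 3-D point to its integer cell index.
    return tuple(math.floor(c / cell_size) for c in point)

def build_grid(polygons, cell_size):
    """polygons: iterable of (centroid, polygon_data) pairs. The resulting
    dict lets a particle's ray be tested only against nearby polygons."""
    grid = {}
    for centroid, data in polygons:
        grid.setdefault(grid_cell(centroid, cell_size), []).append(data)
    return grid
```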
  • Water particles that collide with the surface on which the flame base is located are stored as particles that can potentially contribute to extinguishment.
  • the average age of these particles is used in conjunction with the nozzle angle to determine the average water density for the extinguishing particles.
  • Triangles are created using the particle locations as vertices. If a triangle is determined to be on top of the fire, then an extinguishment algorithm is applied to the fire.
  • Extinguishing a fire primarily involves reducing and increasing the flame height in a realistic manner. This is accomplished by managing three counters that are given initial values representing extinguish time, soak time, and reflash time. If intersection between water stream and flame base is detected, the extinguish time counter is decremented, and the flame height is proportionately decreased until both reach zero. If water is removed before the counter reaches zero, the counter is incremented until it reaches its initial value, which increments the flame height back to its original value. After flame height reaches zero, continued application of water decrements the soak counter until it reaches zero. If water is removed before the soak counter reaches zero, the reflash counter decrements to zero and the flames re-ignite and grow to their original height.
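The three-counter scheme can be sketched as a small state object; the time constants are hypothetical, and the instantaneous regrowth on reflash is a simplification of the description above:

```python
class FireState:
    """Extinguish / soak / reflash counters driving flame height."""

    def __init__(self, extinguish=5.0, soak=3.0, reflash=2.0, height=1.0):
        self.t_ext0, self.t_soak0, self.t_reflash0 = extinguish, soak, reflash
        self.t_ext, self.t_soak, self.t_reflash = extinguish, soak, reflash
        self.height0 = height

    @property
    def height(self):
        # Flame height decreases proportionately with the extinguish counter.
        return self.height0 * self.t_ext / self.t_ext0

    def update(self, dt, water_on_fire):
        if water_on_fire:
            if self.t_ext > 0.0:
                self.t_ext = max(0.0, self.t_ext - dt)          # knock down flames
            elif self.t_soak > 0.0:
                self.t_soak = max(0.0, self.t_soak - dt)        # soak; out at zero
        else:
            if self.t_ext > 0.0:
                self.t_ext = min(self.t_ext0, self.t_ext + dt)  # flames regrow
            elif self.t_soak > 0.0:
                self.t_reflash = max(0.0, self.t_reflash - dt)
                if self.t_reflash == 0.0:                       # re-ignition
                    self.t_ext = self.t_ext0
                    self.t_reflash = self.t_reflash0
```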
  • FIG 9 represents the preferred embodiment of a real-time graphical simulation of an extinguishing agent 29 (e.g., water or foam) exiting a nozzle 28 in the vicinity of a fire and smoke plume 30.
  • each extinguishing agent, fire, or smoke particle will have a mass and a velocity associated with it.
  • a force on the fire and smoke particles can be calculated from the speed and direction of extinguishing agent particles.
  • an equation of the form (other actual forms are envisioned, but they will mainly show similar characteristics) gives the change in particle velocity: Δv = K · (v_agent − v_particle), where K is a factor that can be adjusted (from a nominal value of 1).
  • a force on the fire and smoke particles can be calculated based on the change in velocity: F = Mass · Δv / Δt, where Mass is the mass of the fire or smoke particles and Δt is the time in between simulation updates.
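Taken together, the relations above amount to the following sketch, with vector quantities handled per component (function name hypothetical):

```python
def agent_force(agent_vel, particle_vel, mass, dt, k=1.0):
    """Force = Mass * dv / dt, with dv = K * (v_agent - v_particle) the
    velocity change imparted by the extinguishing agent over one update;
    K is the adjustable factor with nominal value 1."""
    return tuple(mass * k * (va - vp) / dt
                 for va, vp in zip(agent_vel, particle_vel))
```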
  • the application of the calculated force simulates the visual effect of the extinguishing agent stream causing airflow that alters the motion of fire and smoke particles. Additionally, the calculations can be applied to smoke or other particles that are not part of a fire and smoke plume, such as extinguishing agent passing through ambient steam or ambient smoke particles in a room.
  • the invention extends to other aerosols, gases, and particulate matter, such as dust, chemical smoke, and fumes.
  • a further refinement for determining a force to apply to particles in the fire and smoke plume 35 entails modeling the extinguishing agent 32-34 as cones 32, 33, and 34 (which are affected by gravity and will droop) emanating from the nozzle 31, where the additional concentric cones 32 and 33 apply varying force.
  • One embodiment that can produce the cones 32, 33, and 34 can be a system of rings (the system of rings may be modeled as a particle system) emitted from the nozzle, which, when connected, form cones 32, 33, and 34.
  • fire and smoke particles 35 which are contained mostly inside the inner cone 34 of the extinguishing agent 32-34 can have one level of force applied, and fire and smoke particles 35 which are not contained within cone 34, but are contained within cones 33 or 32 can have a different, often smaller, force applied to them.
  • multiple levels of velocity from extinguishing agent and air entrainment can be easily simulated to apply multiple levels of force to the fire and smoke.
  • the additional cones 33 and 32 do not have to be drawn in the simulation, as they could be used strictly in determining the force to apply to the fire and smoke.
  • the force applied to a particle can be modeled as: (A) the extinguishing agent cone(s) 32, 33, 34 each having a velocity associated with them, (B) a difference in velocity between a particle 35 and the extinguishing agent cone(s) 32, 33, 34 can be calculated, (C) a force can be calculated that scales with that difference, and (D) the particles 35 will accelerate based on the force calculated, approaching the velocity of the extinguishing agent inside of the cone(s) 32, 33, 34.
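Steps (A) through (D) could be sketched as follows, classifying a particle by the half-angle of the innermost containing cone. Gravity droop of the cones is ignored here, and all names and values are hypothetical:

```python
import math

def cone_index(nozzle, axis, particle, half_angles):
    """Index of the innermost cone (half_angles sorted narrow to wide)
    containing the particle, or None. `axis` must be a unit vector."""
    d = tuple(p - n for p, n in zip(particle, nozzle))
    mag = math.sqrt(sum(c * c for c in d)) or 1e-9
    cos_a = sum(di * ai for di, ai in zip(d, axis)) / mag
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    for i, ha in enumerate(half_angles):
        if angle <= ha:
            return i
    return None

def cone_force(particle_vel, cone_vels, idx, mass, dt, k=1.0):
    # (B)-(C): force scales with the velocity difference between the particle
    # and its containing cone; inner cones carry higher agent velocity.
    if idx is None:
        return (0.0, 0.0, 0.0)
    return tuple(mass * k * (cv - pv) / dt
                 for cv, pv in zip(cone_vels[idx], particle_vel))
```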
  • the results of the simulated effects described above can be observed by drawing particles as computer-generated graphics primitives using real-time graphics software and hardware.
  • the invention is applicable to such areas as training simulations and computer games.
  • the texture map is faded between these two extremes as the orientation changes, to accomplish a desirable graphical mixing result that matches the silhouette of the object (or human) while maintaining a softer border around the edge of the silhouette contained in the texture map.
  • the appearance of a soft (“fuzzy") border is made by fading to transparent the edges of the object silhouette in the texture map.
  • a series of three texture maps containing silhouettes 36, 42, and 37 are shown mapped onto each of three orthogonal planes 38, 40 and 41, respectively.
  • the texture maps may fade to transparent at their edges for a fuzzy appearance to the shape.
  • the orthogonal planes are each broken up into 4 quadrants defined by the intersection of the planes, and the 12 resulting quadrants are rendered from back to front for correct alpha blending in OpenGL of the texture maps and planes, with depth buffering enabled.
  • a plane is perpendicular to the view plane of a virtual viewpoint looking at the object, the plane is rendered to be completely transparent.
  • a linear fade is used to completely fade the texture map to completely opaque when the plane is parallel to the view plane. This fade from opaque to transparent as the planes are turned relative to a viewer is responsible for the large part of the desirable fuzzy appearance to the shape.
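The linear fade can be expressed with a single dot product (unit vectors assumed; in practice the result would also be combined with the texture's own alpha, and the function name is hypothetical):

```python
import math

def plane_alpha(plane_normal, view_dir):
    """Opacity of one billboard plane: 1 when parallel to the view plane
    (normal aligned with the view direction), 0 when perpendicular to it,
    fading linearly with the angle between the two orientations."""
    cos_a = abs(sum(n * v for n, v in zip(plane_normal, view_dir)))
    angle = math.acos(min(1.0, cos_a))          # 0 .. pi/2
    return 1.0 - angle / (math.pi / 2.0)
```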
  • the texture maps used to shade the three planes were created from a digital image of a person, then made into grayscale silhouettes, and so match the silhouette of a human user very well.
  • the edges 39 of the human silhouettes in the texture maps were blurred so that they would fade linearly from solid white (which represents the silhouette) to solid black (which represents non-silhouette portions of the texture map) to look better in an augmented reality situation where a little bit of a fuzzy edge is desirable. This fuzzy edge spans the equivalent of approximately 0.5 inches in real distance.
  • FIG 12 depicts a diagram of a human 44 wearing a head tracker 43 on a head mounted display.
  • the virtual representation of human 44 is shown used in the inventive technique in FIG 13.
  • Item 43 in FIG 12 is a motion tracker, in this case a six-degree-of-freedom motion tracker that measures the head location and orientation.
  • the position and orientation information from tracker 43 can be applied to orthogonal plane billboards.
  • an orthogonal plane billboard torso 47 can be created in approximately the correct place relative to the joint.
  • the torso in this instance may be designed to remain upright, only rotating about a vertical axis.
  • the head in this instance has full 6 degree-of-freedom motion capability based on the data coming from the tracker worn on the head of the user. This allows the head orthogonal plane billboard to be lined up correspondingly with the user's head.
  • the torso orthogonal plane billboard is attached to the pivot point 46 and is placed "hanging" straight down from that point, and has 4 degrees of freedom: three to control its position in space, and one controlling the horizontal orientation.
  • the head and torso models, when lined up to a real human, occlude computer-generated graphics in a scene. If augmented reality video mixing is achieved with a luminance key to combine live and virtual images, white head and torso models will mask out a portion of a computer-generated image for replacement with a live video image of the real world.
  • This invention can be applied to any real world movable objects for which an occlusion model may be needed.
  • the above technique is also applicable to a movable non-person real world object. If the non-person object has no joints, then such an implementation is simpler since the complexity of coupling the separate head and torso models is avoided.
  • the technique for the single movable real-world physical object is functionally identical to the above method when only the head model is used.
  • 3-D audio allows sound volume to diminish with distance from a sound emitter, and it also works with stereo headphones to give directionality to sounds.
  • 3-D audio emitters are attached to the fire and the hose nozzle.
  • the fire sound volume is proportional to physical volume of the fire.
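A sketch of the gain computation, combining inverse-distance rolloff (the default model in audio APIs such as OpenAL) with the fire-volume proportionality; names and constants are hypothetical:

```python
import math

def fire_sound_gain(fire_volume, max_volume, emitter_pos, listener_pos,
                    ref_dist=1.0):
    """Gain in [0, 1]: proportional to the physical volume of the fire and
    rolled off with distance from the emitter (clamped inside ref_dist)."""
    d = math.dist(emitter_pos, listener_pos)
    rolloff = ref_dist / max(d, ref_dist)
    return (fire_volume / max_volume) * rolloff
```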
  • Appendix A contains settings for the parameters of particle systems used in the invention. These parameters are meant to be guidelines that give realistic behavior for the particles. Many of the parameters are changed within the program, but the preferred starting parameters for flames, smoke, steam, and water are listed in the appendix.
  • Flashover refers to the point in the evolution of a compartment fire in which the fire transitions from local burning to involvement of the entire compartment.
  • a zone-type fire model should provide sufficient accuracy for meeting our training objectives.
  • available zone models include the Consolidated Fire and Smoke Transport Model (CFAST) from NIST and the WPI fire model from Worcester Polytechnic Institute, among others.
  • the outputs of a zone-type fire model can be extended to achieve a visual representation of a compartment fire.
  • Task 1 Infrastructure for Real-Time Display of Fire Task Summary.
  • the organization and structuring of information to be displayed is as important as actual display processing for real-time dynamical presentation of augmented environments.
  • as a firefighter moves through a scenario (using an augmented reality device), the location, extent, and density of fire and smoke change.
  • an approach is to examine the transfer of data to and from the hard disk, through system memory, to update display memory with the desired frequency.
  • Precomputation of the bulk of a firefighter training simulation implies that most of the operations involved in real-time presentation of sensory information revolve around data transfer.
  • Task 2 Visual Representation of Smoke and Fire Task Summary.
  • the way in which sensory stimuli are presented in an ARBT scenario may or may not affect task performance by a student. It is essential to capture the aspects of the sensory representations of fire and smoke that affect student behavior in a training scenario, without the computational encumbrance of those aspects that do not affect behavior.
  • For the purposes of providing sensory stimuli for firefighter training we need to know not only the spatial distribution and time evolution of temperature and hot gases in a compartment fire, but also the visible appearance of smoke and flame, along with sounds associated with a burning compartment, taken over time. There are three tiers of attributes of fire and smoke:
  • a zone-type fire model can be used to determine the location and extent of the smoke and flame. In addition to these quantities, the zone-type fire model also will yield aerosol densities in a given layer. Values for optical transmission through smoke can be calculated using a standard model such as found in the CFAST (Consolidated Fire and Smoke Transport) model, or in the EOSAEL (Electro-Optical Systems Atmospheric Effects Library) code.
  • the intermittent flame region in a fire oscillates with regularity; the oscillations arise from instabilities at the boundary between the fire plume and the surrounding air.
  • the instabilities generate vortex structures in the flame which in turn rise through the flame resulting in observed oscillations.
  • the visual dynamics of flame can be modeled from empirical data such as is known in the art. Measures of Success. This task can be judged on the aesthetics of the visual appearance of the simulated fire and smoke. Ultimately, the visual appearance of fire and smoke should be evaluated relative to the efficacy of an ARBT system.
  • Task 3 Position Anchoring Task Summary. Augmented reality techniques rely on superimposing information onto a physical scene. Superposition means that information is tied to objects or events in the scene. As such, it is necessary then to compensate for movement by an observer in order to maintain the geometric relations between superimposed information and underlying physical structures in the scene.
  • Position sensors in the form of a head tracker can, in real-time, calculate changes in location caused by movement of a firefighter within a training scenario. Virtual objects will be adjusted accordingly to remain "fixed" to the physical world.
  • Appendix A excerpt, "Middle Color" RGBA values: (1.0, 1.0, 1.0, 0.85), (1.0, 1.0, 1.0, 1.0), (1.0, 1.0, 1.0, 0.45)

PCT/US2002/025065 2001-08-09 2002-08-07 Augmented reality-based firefighter training system and method WO2003015057A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CA002456858A CA2456858A1 (en) 2001-08-09 2002-08-07 Augmented reality-based firefighter training system and method
EP02752732A EP1423834A1 (de) 2001-08-09 2002-08-07 Ergänztes realitätsgestütztes trainingssystem und verfahren für feuerwehrleute

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09/927,043 US7110013B2 (en) 2000-03-15 2001-08-09 Augmented reality display integrated with self-contained breathing apparatus
US09/927,043 2001-08-09
US10/123,364 US6822648B2 (en) 2001-04-17 2002-04-16 Method for occlusion of movable objects and people in augmented reality scenes
US10/123,364 2002-04-16

Publications (1)

Publication Number Publication Date
WO2003015057A1 true WO2003015057A1 (en) 2003-02-20

Family

ID=26821473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/025065 WO2003015057A1 (en) 2001-08-09 2002-08-07 Augmented reality-based firefighter training system and method

Country Status (3)

Country Link
EP (1) EP1423834A1 (de)
CA (1) CA2456858A1 (de)
WO (1) WO2003015057A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109117533A (zh) * 2018-07-27 2019-01-01 上海宝冶集团有限公司 基于bim结合vr的电子厂房消防方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5059124A (en) * 1990-08-03 1991-10-22 Masahiro Tsujita Imitation apparatus for fire extinguishing training
US5660549A (en) * 1995-01-23 1997-08-26 Flameco, Inc. Firefighter training simulator
US5823784A (en) * 1994-05-16 1998-10-20 Lane; Kerry S. Electric fire simulator
US5920492A (en) * 1996-04-26 1999-07-06 Southwest Research Institute Display list generator for fire simulation system
US5984684A (en) * 1996-12-02 1999-11-16 Brostedt; Per-Arne Method and system for teaching physical skills
US6129552A (en) * 1996-07-19 2000-10-10 Technique-Pedagogie-Securite Equipements Teaching installation for learning and practicing the use of fire-fighting equipment

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1465115A2 (de) * 2003-03-14 2004-10-06 British Broadcasting Corporation Verfahren und Vorrichtung zum Erzeugen einer erwünschten Ansicht einer Szene aus einem gewählten Blickpunkt
EP1465115A3 (de) * 2003-03-14 2005-09-21 British Broadcasting Corporation Verfahren und Vorrichtung zum Erzeugen einer erwünschten Ansicht einer Szene aus einem gewählten Blickpunkt
EP1775556A1 (de) * 2005-10-13 2007-04-18 Honeywell International Inc. Synthetische zunehmende schwindende Landsicht in der Endphase der Landung
US7719483B2 (en) 2005-10-13 2010-05-18 Honeywell International Inc. Synthetic vision final approach terrain fading
US8135227B2 (en) 2007-04-02 2012-03-13 Esight Corp. Apparatus and method for augmenting sight
US11023092B2 (en) 2007-10-24 2021-06-01 Sococo, Inc. Shared virtual area communication environment based apparatus and methods
US10728144B2 (en) 2007-10-24 2020-07-28 Sococo, Inc. Routing virtual area based communications
US10659511B2 (en) 2007-10-24 2020-05-19 Sococo, Inc. Automated real-time data stream switching in a shared virtual area communication environment
US9618748B2 (en) 2008-04-02 2017-04-11 Esight Corp. Apparatus and method for a dynamic “region of interest” in a display system
US10366514B2 (en) 2008-04-05 2019-07-30 Sococo, Inc. Locating communicants in a multi-location virtual communications environment
US9390503B2 (en) 2010-03-08 2016-07-12 Empire Technology Development Llc Broadband passive tracking for augmented reality
US8610771B2 (en) 2010-03-08 2013-12-17 Empire Technology Development Llc Broadband passive tracking for augmented reality
AT509799B1 (de) * 2010-04-29 2013-01-15 Gerhard Gersthofer Übungsanordnung mit feuerlöscher zur brandbekämpfung
WO2013111145A1 (en) * 2011-12-14 2013-08-01 Virtual Logic Systems Private Ltd System and method of generating perspective corrected imagery for use in virtual combat training
US11088971B2 (en) 2012-02-24 2021-08-10 Sococo, Inc. Virtual area communications
US11020624B2 (en) 2016-04-19 2021-06-01 KFT Fire Trainer, LLC Fire simulator
US11951344B2 (en) 2016-04-19 2024-04-09 KFT Fire Trainer, LLC Fire simulator
US10948727B2 (en) 2016-05-25 2021-03-16 Sang Kyu MIN Virtual reality dual use mobile phone case
EP3466295A4 (de) * 2016-05-25 2020-03-11 Sang Kyu Min Virtual-reality-gehäuse für mobiltelefon zur für doppelten verwendung
WO2018009075A1 (en) * 2016-07-07 2018-01-11 Real Training As Training system
NO341406B1 (en) * 2016-07-07 2017-10-30 Real Training As Training system
NO20161132A1 (en) * 2016-07-07 2017-10-30 Real Training As Training system
FR3064801A1 (fr) * 2017-03-31 2018-10-05 Formation Conseil Securite Simulateur de manipulation de dispositif d’extinction d’incendie
WO2020256965A1 (en) * 2019-06-19 2020-12-24 Carrier Corporation Augmented reality model-based fire extinguisher training platform
CN112930061A (zh) * 2021-03-04 2021-06-08 贵州理工学院 一种vr-box体验互动装置
FR3120731A1 (fr) * 2021-03-14 2022-09-16 Ertc Center Systeme d’entrainement aux risques et menaces nrbc
WO2022194696A1 (fr) * 2021-03-14 2022-09-22 Ertc Center Systeme d'entrainement aux risques et menaces nrbc

Also Published As

Publication number Publication date
EP1423834A1 (de) 2004-06-02
CA2456858A1 (en) 2003-02-20

Similar Documents

Publication Publication Date Title
US6809743B2 (en) Method of generating three-dimensional fire and smoke plume for graphical display
EP1423834A1 (de) Ergänztes realitätsgestütztes trainingssystem und verfahren für feuerwehrleute
Vince Introduction to virtual reality
Vince Essential virtual reality fast: how to understand the techniques and potential of virtual reality
Anthes et al. State of the art of virtual reality technology
US8195084B2 (en) Apparatus and method of simulating a somatosensory experience in space
CN102540464B (zh) 提供环绕视频的头戴式显示设备
US20030210228A1 (en) Augmented reality situational awareness system and method
US20020191004A1 (en) Method for visualization of hazards utilizing computer-generated three-dimensional representations
US7046214B2 (en) Method and system for accomplishing a scalable, multi-user, extended range, distributed, augmented reality environment
WO2007133209A1 (en) Advanced augmented reality system and method for firefighter and first responder training
US20060114171A1 (en) Windowed immersive environment for virtual reality simulators
JP2007229500A (ja) ユーザーを仮想現実に没入させるための方法及び装置
AU2002366994A1 (en) Method and system to display both visible and invisible hazards and hazard information
Spanlang et al. A first person avatar system with haptic feedback
Vichitvejpaisal et al. Firefighting simulation on virtual reality platform
Hatsushika et al. Underwater VR experience system for scuba training using underwater wired HMD
Wilson et al. Design of monocular head-mounted displays for increased indoor firefighting safety and efficiency
Rodrigue et al. Mixed reality simulation with physical mobile display devices
AU2002355560A1 (en) Augmented reality-based firefighter training system and method
USRE45525E1 (en) Apparatus and method of simulating a somatosensory experience in space
Segura et al. Interaction and ergonomics issues in the development of a mixed reality construction machinery simulator for safety training
Schoor et al. Elbe Dom: 360 Degree Full Immersive Laser Projection System.
Bachelder Helicopter aircrew training using fused reality
Gupta et al. Training in virtual environments

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG UZ VN YU ZA ZM ZW

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NO NZ OM PH PL PT RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG UZ VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2002355560

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 2456858

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2002752732

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2002752732

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP