EP1423834A1 - Augmented reality-based training system and method for firefighters - Google Patents
- Publication number
- EP1423834A1 (application EP02752732A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- firefighter
- ruggedized
- scba
- extinguishing agent
- nozzle
- Prior art date
- Legal status
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/40—Hidden part removal
-
- A—HUMAN NECESSITIES
- A62—LIFE-SAVING; FIRE-FIGHTING
- A62B—DEVICES, APPARATUS OR METHODS FOR LIFE-SAVING
- A62B9/00—Component parts for respiratory or breathing apparatus
- A62B9/006—Indicators or warning devices, e.g. of low pressure, contamination
-
- A—HUMAN NECESSITIES
- A62—LIFE-SAVING; FIRE-FIGHTING
- A62C—FIRE-FIGHTING
- A62C99/00—Subject matter not provided for in other groups of this subclass
- A62C99/0081—Training methods or equipment for fire-fighting
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Definitions
- This invention relates to training firefighters in an augmented reality (AR) simulation that includes creation of graphics depicting fire, smoke, and application of an extinguishing agent; extinguishing agent interaction with fire and airborne particles; applying computer-generated extinguishing agent via an instrumented and ruggedized firefighter's nozzle; displaying the simulated phenomena anchored to real-world locations as seen through an instrumented and ruggedized head-worn display; and occlusion of virtual objects by a real person and/or other moveable objects.
- Augmented reality (AR) technology allows overlay of computer-generated graphics on a person's view of the real world. With AR, computer generated fire, smoke, and extinguishing agents can safely replace live fire training while still allowing trainees to view and interact with each other and the real-world environment. This training can be done most effectively by the firefighter using the actual equipment that would be used in a real world scenario.
- the firefighter should wear a real SCBA and use a real firefighter's vari-nozzle to extinguish the fires.
- These items can be instrumented to measure their positions and orientations for use in the system.
- these items should be rugged so that they can withstand the rigorous treatment they would undergo in a real operational environment. This includes the need for protection against shock and penetration from contaminants, such as dirt and water. This allows safe, cost-effective training with greater realism than pure virtual reality (VR) simulations.
- the primary objective of this invention is the development of an augmented reality-based training (ARBT) system for fire fighting, with application to rescue and hazardous material mitigation.
- Tasks (1) to (4) are applicable to any fire situation - reactive or interactive. Therefore, any significant improvement in developing training skills for Tasks (1) to (4) will result in a significantly skilled firefighter for both reactive and interactive scenarios.
- An objective of this invention is to demonstrate the feasibility of augmented reality as the basis for an untethered, ARBT system to train firefighters.
- Two enabling technologies will be exploited: a flexible, wearable belt PC and an augmented reality head-mounted display (HMD).
- the HMD can be integrated with a firefighter's SCBA. This augments the experience by allowing the user to wear a real firefighter's SCBA (Self-Contained Breathing Apparatus) while engaging in computer-enhanced fire training situations.
- a real firefighter's SCBA including catcher's mask-style straps, which gives the user the sensation of being in a real firefighting situation
- a head motion tracker which is mounted on the SCBA to track the position and orientation of the firefighter's head
- a specially mounted camera and mirror or camera and prism configuration which takes a video image of what is directly in front of the user's eyes
- a video display screen which is used to project the computer-enhanced image the firefighter will see in front of his eyes
- specially mounted head phones which can be used to project appropriate sounds into the firefighter's ears.
- the user would carry a firefighter's vari-nozzle, instrumented and ruggedized for use.
- Augmented reality is a hybrid of a virtual world and the physical world in which virtual stimuli (e.g. visual, acoustic, thermal, olfactory) are dynamically superimposed on sensory stimuli from the physical world.
- This invention demonstrates a foundation for developing a prototype untethered ARBT system which will support the critical fire fighting tasks of (1) navigation, (2) situation awareness, (3) stress management, and (4) problem solving.
- the system and method of this invention can be not only a low-cost training tool for fire academies and community fire departments, but also provides a test bed for evaluating future fire fighting technologies, such as decision aids, heads-up displays, and global positioning systems for the 21st century firefighter.
- the primary opportunity for an ARBT system is the training of firefighters in the areas of Tasks (1) to (4) above for reactive scenarios.
- the inventive ARBT system has the significant potential to produce
- a training program that aims to increase skills in the Tasks (1) to (4) is adaptable to essentially any fire department, large or small, whether on land, air, or sea.
- Opportunities for Augmented Reality for Training. Augmented reality has emerged as a training tool and can serve as a medium for the successful delivery of training.
- the cost of an effective training program built around augmented reality-based systems arises primarily from considerations of the computational complexity and the number of senses required by the training exercises. Because of the value of training firefighters in Tasks (1) to (4) for any fire situation, and because the program emphasizes firefighter reactions to (vs. interactions with) fire and smoke, training scenarios can be precomputed.
- PC technology is capable of generating virtual-world stimuli in real time.
- the opportunity identified above - which focuses on the reactions of firefighters to fire and smoke in training scenarios - is amenable to augmented reality.
- In augmented reality, sensory stimuli from portions of a virtual world are superimposed on sensory stimuli from the real world. If we consider a continuous scale going from the physical world to completely virtual worlds, then hybrid situations are termed augmented reality.
- the position on a reality scale is determined by the ratio of virtual world sensory information to real world information.
- This invention creates a firefighter training solution that builds on the concept of an augmented physical world, known as augmented reality. Ideally, all training should take place in the real world. However, due to such factors as cost, safety, and environment, we have moved some or all of the hazards of the real world to the virtual world while maintaining the critical training parameters of the real world, e.g., we are superimposing virtual fire and smoke onto the real world.
- augmented reality may be a superior approach when compared to completely virtual reality.
- exercise simulators such as stationary bicycles, treadmills or stair climbing machines do not adequately capture either the physical perception or the distribution of workload on the musculoskeletal systems that would be produced by actually walking or crawling in the physical world.
- a firefighter can see his/her fellow firefighters, not just a computer representation as in pure virtual reality.
- Opportunities also exist for self-contained augmented reality.
- a low-cost, flexible, wearable belt PC technology may be used in augmented reality firefighter training.
- This technology, combined with augmented reality and precomputed fire scenarios to handle Tasks (1) to (4) above for various physical locations, allows a firefighter to move untethered anywhere, anytime, inexpensively and safely. This adds significantly more realistic training experiences.
- Mitler (1991) divides fire models into two basic categories: deterministic and stochastic models. Deterministic models are further divided into zone models, field models, hybrid zone/field models, and network models. For purposes of practicality and space limitations, we limit the following discussions to deterministic models, specifically zone type fire models. Mitler goes on to prescribe that any good fire model must describe convective heat and mass transfer, radiative heat transfer, ignition, pyrolysis and the formation of soot. For our purposes, models of flame structure are also of importance.
- Zone models are based on finite element analysis (FEA).
- In a zone model of a fire, a region is divided into a few control volumes, or zones. The conditions within each volume are usually assumed to be approximately constant.
- two or more zones typically are used: an upper layer, a lower layer, and, optionally, the fire plume, the ceiling, and, if present, a vent.
- Zone models take the form of an initial value problem for a system of differential and algebraic equations. Limitations of zone models include ambiguity in the number and location of zones, doubt on the validity of empirical expressions used to describe processes within and between zones, and inapplicability of zones to structures with large area or complex internal configurations.
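- The initial-value-problem form of a zone model can be illustrated with a deliberately minimal sketch: a single hot upper layer gains the fire's convective heat and loses heat to the walls, integrated with explicit Euler steps. All names and coefficients (`q_fire`, `h_loss`, and so on) are hypothetical illustration values, not drawn from any validated zone model.

```python
def upper_layer_temperature(q_fire=500e3, t_end=60.0, dt=0.1):
    """Euler integration of a toy one-zone energy balance: the hot upper
    layer gains the fire's heat and loses heat to the walls. All
    coefficients are illustrative, not from any validated fire model."""
    m, cp = 50.0, 1005.0        # layer mass [kg], air heat capacity [J/kg.K]
    h_loss, area = 10.0, 40.0   # wall loss coefficient [W/m^2.K], wall area [m^2]
    T, T_amb = 293.0, 293.0     # layer and ambient temperature [K]
    t = 0.0
    while t < t_end:
        # dT/dt = (heat release - wall losses) / thermal mass of the layer
        dT = (q_fire - h_loss * area * (T - T_amb)) / (m * cp)
        T += dT * dt
        t += dt
    return T

# After one simulated minute the layer has heated well above ambient
# but has not yet reached its steady-state temperature.
print(round(upper_layer_temperature(), 1))
```

A real zone model couples several such equations (mass, energy, species) per zone and solves them together as a system of differential and algebraic equations.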
- Experiential learning is based on the premise that people best learn new skills by successfully performing tasks requiring those skills.
- the application of virtual reality to the delivery of training builds on the promise of experiential learning to maximize the transfer of training into the task environment.
- virtual reality interfaces also hold the potential for being more motivating than traditional training delivery media by making the training experience itself more fun and interesting. Augmented reality retains these strengths while providing a real world experience for the firefighter.
- This aspect of the invention comprises an orthogonal plane billboarding technique that allows textures with fuzzy edges to be used to convey the sense of a soft-edged 3D model in the space. This is done by creating three orthogonal (perpendicular) planes. A texture map is mapped onto each of these planes, consisting of profile views of the object of interest as silhouettes of how they look from each of these directions.
- One use of this orthogonal plane billboard is modeling a human head and torso, which can then occlude fire, water, and smoke in the invention.
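- As a sketch of the orthogonal plane billboarding geometry, the following hypothetical snippet builds the three mutually perpendicular quads onto which the silhouette textures would be mapped; the function names and sizes are illustrative, not taken from the actual system.

```python
import numpy as np

def orthogonal_billboard(size=1.0):
    """Build three mutually perpendicular quads (in the XY, XZ, and YZ
    planes), each intended to carry a silhouette texture of the occluder
    (e.g. a head) as seen along the remaining axis."""
    h = size / 2.0
    xy = np.array([[-h, -h, 0], [h, -h, 0], [h, h, 0], [-h, h, 0]], float)
    xz = np.array([[-h, 0, -h], [h, 0, -h], [h, 0, h], [-h, 0, h]], float)
    yz = np.array([[0, -h, -h], [0, h, -h], [0, h, h], [0, -h, h]], float)
    return xy, xz, yz

def plane_normal(quad):
    """Unit normal of a planar quad from two of its edge vectors."""
    n = np.cross(quad[1] - quad[0], quad[3] - quad[0])
    return n / np.linalg.norm(n)

# The three plane normals are mutually orthogonal, as the technique requires.
xy, xz, yz = orthogonal_billboard()
for q in (xy, xz, yz):
    print(plane_normal(q))
```

In a renderer, each quad would be texture-mapped with the corresponding profile silhouette and drawn with soft (fuzzy-edged) alpha, giving the impression of a volumetric occluder.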
- fidelity is increased by providing sufficient accuracy in real-time such that a computer can generate virtual images and mix them with the image data from the specially mounted camera in a way that the user sees the virtual and real images mixed in real time.
- the head-mounted tracker allows a computer to synchronize the virtual and real images such that the user sees an image being updated correctly with his or her head.
- the headphones further enhance the virtual/real experience by providing appropriate aural input.
- Augmented Reality Equipment. A description of augmented reality was presented above. Commercial off-the-shelf technologies exist with which to implement augmented reality applications, including helmet-mounted displays (HMDs), position tracking equipment, and live/virtual mixing of imagery.
- FIG 1 is a block diagram indicating the hardware components of an embodiment of the augmented reality (AR) firefighter training system, also useful in the method of the invention.
- FIG 2 illustrates the geometric particle representation associated with smoke.
- FIG 3 illustrates the geometric particle representation associated with flames.
- FIG 4 illustrates the three particle systems used to represent a fire.
- FIG 5 illustrates the idea of two-layer smoke obscuration.
- FIG 6 illustrates particle arrangement for a surface representation of a particle system.
- FIG 7 illustrates a surface mesh for a surface representation of a particle system.
- FIG 8 illustrates the technologies that combine to create an AR firefighter training system, and method.
- FIG 9 is a diagram indicating a nozzle, extinguishing agent stream, and fire and smoke plume.
- FIG 10 is a variant of FIG 9 in which the extinguishing agent stream is modeled with multiple cone layers to represent multiple velocities in the stream's profile.
- FIG 11 is a diagram of the three orthogonal planes that contain the three texture mapped images of a human head, useful in understanding the invention.
- FIG 12 is a diagram of a human head and torso, which will be compared to graphical components in FIG 13; and
- FIG 13 is a diagram of two sets of orthogonal planes, along with the joint between the two sets, for the human head and torso of FIG 12.
- FIG 14 is a diagram of the main components of the HMD integrated with the SCBA.
- FIG 15 is a diagram of a two-mirror layout for placing the camera viewpoint immediately in front of the wearer's eyes.
- FIG 16 is a diagram of a headphone attachment design.
- FIG 17 is a diagram of a bumper that can be used to protect a mirror.
- FIG 18 is an exploded view of a mirror mount design that places minimum pressure on the mirror to minimize distortion due to compression.
- FIG 19 depicts a structure that can be used to protect a mirror from being bumped and to prevent the mirror mount from being hooked from underneath.
- FIG 20 is a cross-sectional view of the structure of FIG 19.
- FIGS 21-23 are top, side, and end views, respectively, of the structure in FIGS 19 and 20.
- FIGS 24 and 25 are top and side views, respectively, of a rugged mirror mount.
- FIG 26 is a perspective view of a nozzle and all of the major instrumentation components involved in the preferred embodiment of the invention, except for the cover;
- FIG 27 is the same as FIG 26, but with the cover in place;
- FIG 28 is a top view of the fully assembled ruggedized nozzle of FIG 27;
- FIG 29 is a front view of the fully assembled ruggedized nozzle of FIG 27.
- FIG 30 schematically depicts the basic optical and tracking components of the preferred embodiment of the invention and one possible arrangement of them.
- FIG 31 shows the same components with an overlay of the optical paths and relationships to the components.
- FIG 32 is a dimensioned engineering drawing of the prism of FIG 30.
- FIG 33 shows the components of FIG 30 in relation to the SCBA face mask and protective shell.
- FIG 34 shows externally visible components, including the SCBA and protective shell.
- FIG 1 is a block diagram indicating the hardware components of the augmented reality (AR) firefighter training system.
- Imagery from a head-worn video camera 4 is mixed in video mixer 3 via a linear luminance key with computer-generated (CG) output that has been converted to NTSC using VGA-to-NTSC encoder 2.
- the luminance key removes white portions of the computer-generated imagery and replaces them with the camera imagery.
- Black computer graphics remain in the final image, and luminance values for the computer graphics in between white and black are blended appropriately with the camera imagery.
- the final image is displayed to a user in head-mounted display (HMD) 5.
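- The luminance-key mixing rule described above can be sketched per pixel as follows. This is an illustrative reconstruction (using the Rec. 601 luma weights), not the actual behavior of the video mixer hardware; the function names are hypothetical.

```python
def luminance(rgb):
    """Rec. 601 luma of an 8-bit RGB pixel, normalized to [0, 1]."""
    r, g, b = rgb
    return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0

def luma_key_pixel(cg, camera):
    """Linear luminance key: white CG pixels pass the camera through,
    black CG pixels keep the graphics, and intermediate luminance
    values blend the two proportionally."""
    a = luminance(cg)  # keying coefficient derived from the CG pixel
    return tuple(round(a * cam + (1.0 - a) * c)
                 for c, cam in zip(cg, camera))

# White CG background lets the camera image through; black CG remains.
print(luma_key_pixel((255, 255, 255), (10, 20, 30)))
print(luma_key_pixel((0, 0, 0), (10, 20, 30)))
```

Drawing the computer graphics on a white background therefore makes the real world show through everywhere the graphics are absent.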
- This system requires significant hardware to accomplish the method of the invention.
- the user interfaces with the system using an instrumented and ruggedized firefighter's SCBA and vari-nozzle. This equipment must be ruggedized since shock sensitive parts are mounted on it. Additionally, other equipment is used to mix the real and computer generated images and run the simulation.
- FIG 14 summarizes the main components of the HMD as integrated with an SCBA, including instrumentation required for tracking in an AR or VR environment.
- the major components of the integrated HMD are a video camera 51, a motion tracker 50, a head-mounted display (HMD) 52, headphones 48, and a self-contained breathing apparatus (SCBA) mask 53 with "catcher's mask" style straps 49 to hold the SCBA onto the wearer's head.
- Any sufficiently lightweight video camera 51 can be used for the camera in the HMD.
- a PANASONIC ® (Matsushita Electric Corporation of America, One Panasonic Way, Secaucus, NJ 07094) GP-KS162 micro ("lipstick" style) camera with a 7 mm lens (GP-LM7TA) is preferably used in the invention. Because the camera must be worn on the user's head, it should be lightweight and minimally obtrusive. The camera's field of view (FOV) must be close to that of the HMD for minimal distortion, i.e., to best map the image of the real world to the AR world; this also contributes to maximizing the user's perception of presence in the AR world. If the HMD is an optical see-through display, the camera is not required. However, the current preferred embodiment of the invention is a video-based AR display.
- the ideal location of a camera for minimal offset from the wearer's eyes is inside the user's eyes.
- the optical path may be folded using mirrored surfaces to allow the camera viewpoint to coincide with the wearer's eye location with the camera at some distance from the wearer's eyes.
- These surfaces can be provided using either mirrors or prisms with reflective coatings.
- a prism is the preferred embodiment for image reflection. Using an even number of mirrored surfaces as in FIG 15 allows the camera images to be used as-is, and using an odd number of mirrored surfaces requires the camera image to be flipped before it is displayed to the wearer of the display.
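- The even-versus-odd rule follows from the fact that each mirrored surface applies a reflection with determinant -1: an even number of reflections composes to an ordinary rotation (determinant +1), while an odd number leaves a mirror-reversed image. A small sketch with hypothetical helper names, using 2x2 matrices for the in-plane optical fold:

```python
def reflection_matrix(nx, ny):
    """2x2 Householder reflection about a line with unit normal (nx, ny)."""
    return [[1 - 2 * nx * nx, -2 * nx * ny],
            [-2 * nx * ny, 1 - 2 * ny * ny]]

def matmul2(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def image_is_flipped(mirror_normals):
    """Compose one reflection per mirrored surface; determinant -1 means
    the camera image is mirror-reversed and must be flipped in software."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for nx, ny in mirror_normals:
        m = matmul2(reflection_matrix(nx, ny), m)
    return det2(m) < 0

# Two mirrored surfaces (as in FIG 15): image usable as-is.
print(image_is_flipped([(1.0, 0.0), (0.0, 1.0)]))  # even count
print(image_is_flipped([(1.0, 0.0)]))              # odd count
```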
- FIG 17 shows a bumper design for protecting a mirror and whatever this mirror may bump against.
- FIG 18 shows a mirror mount design that both protects a mirror 57 and allows a minimal amount of pressure to be applied to this mirror by using a mechanism that sandwiches the mirror between parts 58 and 59, reducing the distortion that is created, especially if plastic mirrors are used.
- FIGS 19-23 show an angular frame structure 260 to shield the mirror mount 250 and prevent it from getting hooked from underneath.
- the structure 260 is mounted to the HMD housing 263 independent from the mirror mount 250, and it uses rubber mounts 262 at mounting points 261.
- FIGS 24 and 25 show the rugged mirror mount 250 with a raised area 252 at the front to prevent the mirror (not shown, but it would be resident at location 251) from getting bumped from that direction.
- the mirror mount 250 is mounted vertically through bolt holes 254, and then a bend 253 at a 45 degree angle is used to place the mirror at the correct 45 degree angle relative to the camera (not shown), which is looking directly down.
- a hole 255 is cut out of the mirror mount 250 to reduce weight.
- two cameras are required. They should be spaced the same distance apart as the wearer's eyes and possibly have an interpupillary distance (IPD) adjustment.
- Each camera can be mounted similarly to the way a single camera mount is described above.
- motion tracking equipment 50 is used to provide real-time, six-degree-of-freedom (6 DOF) position and orientation information about a tracked object.
- the camera can be tracked.
- Knowledge of the camera field of view and its position and orientation allows a computer to overlay computer- generated images on the camera video that appear to be anchored to locations in 3-D space.
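- This overlay step can be sketched as a pinhole projection: given the tracker's pose and the camera field of view, a world-anchored point maps to a pixel coordinate. The example below is a simplification that tracks only yaw (a full system would apply the tracker's complete 6 DOF rotation); all function names and values are illustrative assumptions.

```python
import math

def project_point(world_pt, cam_pos, cam_yaw_deg, fov_deg=40.0,
                  width=640, height=480):
    """Project a world-space anchor into pixel coordinates for a
    yaw-only pinhole camera looking along +z at yaw = 0."""
    yaw = math.radians(cam_yaw_deg)
    dx = world_pt[0] - cam_pos[0]
    dy = world_pt[1] - cam_pos[1]
    dz = world_pt[2] - cam_pos[2]
    # Rotate the offset into the camera frame (yaw about the y axis).
    cx = math.cos(yaw) * dx - math.sin(yaw) * dz
    cz = math.sin(yaw) * dx + math.cos(yaw) * dz
    if cz <= 0:
        return None  # the anchor is behind the camera
    # Focal length in pixels derived from the horizontal field of view.
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    u = width / 2.0 + f * cx / cz
    v = height / 2.0 - f * dy / cz
    return (u, v)

# An anchor straight ahead of an untranslated, unrotated camera
# lands at the image center.
print(project_point((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 0.0))
```

Because the projection depends on the pose, computer-generated imagery drawn at the returned pixel location appears anchored to its 3-D world position as the head moves.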
- attaching the tracker to a user's head allows the user's eye positions to be tracked, enabling the see-through embodiment of this technology.
- Any tracker that provides suitable 6 DOF measurements can be used as a part of this invention.
- Example technologies for motion tracking include magnetic, acousto-inertial, and optical.
- Two preferred trackers for this invention are the INTERSENSE (InterSense, Inc., 73 Second Avenue, Burlington, MA 01803, USA) IS-900TM and IS-600 Mark 2 PlusTM.
- the HMD 52 can be either see-through or non-see-through in this invention.
- the preferred method of attachment to the SCBA mask is to cut out part or all of the transparent (viewing) portion of the mask to allow the bulk of the HMD to stick out, while placing the HMD optics close to the wearer's eyes. Holes drilled through the mask provide attachment points for the HMD.
- the preferred HMD for this invention is the VIRTUAL RESEARCH (Virtual Research Systems, Inc., 3824 Vienna Drive, Aptos, California 95003) V6TM for a non-see-through method. (See FIG 14)
- Headphones 48 must be attached to the SCBA if audio is part of the AR application. Two requirements for the headphones are that they should not block out real-world sounds, and they should not interfere with donning the mask or other firefighter equipment.
- a pair of headphones 55 (AIWA [AIWA AMERICA, INC., 800 Corporate Drive Mahwah, NJ 07430] HP-A091 Stereo HeadphonesTM) rigidly mounted 54 to the SCBA mask at a distance from the wearer's ears can be used (see FIG 16). Additional strength can be added to the shafts that connect the headphones to the SCBA by means of hardened epoxy resin, which can be accomplished with JB WELDTM (JB Weld Company, P.O. Box 483, Sulphur Springs, TX 75483).
- any SCBA mask 53 (FIG 14) can be used with this invention.
- One preferred mask is the SCOTT (Scott Aviation, A Scott Technologies Company, Erie, Lancaster, NY 14086) AV2000TM.
- This mask is an example of the state of the art for firefighting equipment, and the design of the mask has a hole near the wearer's mouth that allows easy breathing and speaking when a regulator is not attached.
- the mask face seal, the "catcher's mask" style straps for attaching the mask, and the nose cup are features that are preserved.
- the rest of the SCBA can be blacked out by using an opaque substance such as tape, foam, plastic, rubber, silicone, paint, or preferably a combination of plastic, silicone, and paint. This is done to ensure that the trainee doesn't see the un-augmented real world by using his or her un-augmented peripheral vision to see around AR smoke or other artificial (computer-generated virtual) obstacles.
- the instrumentation that is protected in the instrumented SCBA consists of (1) the head mounted display (HMD) used to show an image to the user; (2) the camera used to acquire the image the user is looking at; (3) the InterSense InertiaCube used to measure the SCBA orientation; (4) two InterSense SoniDiscs used to measure the SCBA position; and (5) the prism used to shift the image in front of the user's eyes so that the image is in front of the camera. All of this equipment, except for the prism, has electrical connections that carry signals through a tether to a computer, which receives and processes these signals.
- the eye 137 of the person wearing the SCBA looks through the optics 138 to see the image formed on the display element inside the electronics portion 139 of the HMD.
- the image of the outside world is captured by camera 135, which looks through a prism 140 that has two reflective surfaces to bend the path of light to the camera 135.
- the tracking components include the InertiaCube 136 and two SoniDiscs 141 which are positioned on either side of the camera, one going into the figure, and one coming out of the figure.
- the InertiaCube 136 can be placed anywhere there is room on the structure, but the SoniDiscs 141 must be at the top of the structure in order to have access to the external tracking components.
- FIG 31 shows detailed sketches of the light paths.
- Upon entering the prism 150 at the polished transparent entrance point 153, the FOV 151 of the camera 155 is temporarily reduced due to the refractive index of the glass, preferably SFL6, as it has a very high index of refraction while maintaining a relatively low density.
- This reduced FOV 151 is reflected first off mirrored surface 152 and then mirrored surface 148 before exiting through surface 149.
- the FOV 151 is restored to its original size, and any aberrations due to the glass are eliminated since the light enters and exits the prism perpendicular to surfaces 153 and 149.
- the image captured by the camera 155 is effectively taken from the virtual eye-point 147, even though the real eye-point of the camera is at point 154.
- the virtual eye-point 147 would ideally be at the same point as the user's eye-point 144. To make this happen, however, the optics would have to be bigger. It is preferred to use smaller optics that place the virtual eye-point 147 slightly forward of the user's eye-point 144. This arrangement tends to be acceptable to most users. Even though the virtual eye-point 147 isn't lined up exactly with the eye 144, the HMD (146 and 145) as well as the prism 150 are all co-located on the same optical axis 143, thereby minimizing the disorientation of the user.
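- The temporary FOV reduction inside the prism follows Snell's law at the perpendicular entrance face: each ray of the camera's viewing cone refracts toward the surface normal, shrinking the cone's half-angle in the glass, and the exit through a parallel face restores it. A small sketch using the SFL6 index of 1.8 stated below (the 40 degree camera FOV is an assumed example value):

```python
import math

def fov_inside_glass(fov_air_deg, n=1.8):
    """Half-angle of the camera's cone after refraction into glass of
    index n at a face it meets perpendicularly (SFL6 has n ~ 1.8);
    returns the full cone angle inside the glass, in degrees."""
    half_air = math.radians(fov_air_deg / 2.0)
    half_glass = math.asin(math.sin(half_air) / n)
    return 2.0 * math.degrees(half_glass)

# A 40 degree cone in air narrows to roughly 22 degrees inside SFL6.
print(round(fov_inside_glass(40.0), 1))
```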
- FIG 32 shows the detailed dimensions and descriptions of the preferred prism.
- Surfaces 157 and 159 are mirrored and surfaces 160 and 158 are polished. All other surfaces are plain or preferably painted black. All units in the drawing are inches.
- the prism can be mounted to the SCBA by fixing mounting plates that define tapped holes on the two sides of the prism.
- the prism material is preferably SFL6, which has a refractive index of 1.8.
- mirrors are used, there are several choices for materials. The lightest and cheapest are plastic mirrors. A step up in quality would be the use of glass mirrors, especially if they are front-surface mirrors (versus the typical back-surfaced mirrors used in households). The highest durability and quality can be achieved with metallic mirrors.
- Metallic mirrors, preferably aluminum, can be manufactured with tough, highly reflective surfaces. Metal mirrors can also be made with built-in mounting points, yielding a mirror very well suited to the invention's needs for light weight, compact size, and durability.
- the very simplest alternative method of camera arrangement would be to put the camera parallel to and directly above the optical axis of HMD 146 and 145. This method negates the need for a prism or mirrors, but it loses all of the benefit of the virtual eye-point on the optical axis of the HMD 146 and 145.
- the HMD 170 (FIGS 33 and 34) is mounted directly to the SCBA 174.
- the InertiaCube 167 (FIG 34) and SoniDiscs 179 are attached rigidly to the camera 168 /prism 171 assembly (or mirrors if those are used), locking their positions together. By locking the position tracking equipment directly to the camera/prism assembly, one can ensure that the computer- generated imagery will correspond to the camera's position.
- a hard plastic electronics enclosure or shell 170 attaches to the SCBA 174 preferably with bolts 173, providing a means for hiding from view and protecting from the elements all electronic equipment, except for the SoniDisc speakers 169, which must be exposed to the air to allow the separate tracking devices (not shown) to receive the ultrasonic chirps coming from the SoniDiscs.
- the plastic shell 170 that surrounds all of the equipment should be made of a tough material, such as nylon, that can withstand the shock of being dropped, yet is slightly bendable, allowing for a little bit of inherent shock-mounting for the equipment.
- the HMD 176, prism 171, camera 168, and/or tracking equipment 167 and 179 can be mounted to the SCBA 174 and plastic shell 170 with rubber mounting points (not shown).
- the HMD, prism, camera, and/or tracking equipment can all be mounted together with a very rigid structure, for example a metallic frame (not shown). That rigid structure could then be mounted separately to the plastic shell, preferably with shock- absorbing mounts.
- the signal wires (not shown) coming from the instrumentation 168, 176, 167, and 179 come out of the plastic shell 170 through a single hole 166 with built-in strain relief, ensuring that the cables cannot be pulled out of the plastic shell 170 through normal use, and also ensuring that pulling on the cables will not create unacceptable cable wear.
- the cables coming out of the plastic shell can be attached to a single, specialized connector either mounted on the plastic cover at the exit point 166, or preferably attached to the belt of the wearer. From that connector, another single cable connects this one connector to the computer (not shown) and other equipment (not shown) by splitting the cable into its sub-connectors as needed by the various components.
- This specialized connector provides for easy connect and disconnect of the equipment from the user.
- the equipment is protected by a plastic cover which protects the overall assembly from both shock and penetration by foreign agents, such as water and dirt.
- the InertiaCube and SoniDiscs can fall out of calibration easily and are sensitive to shock, so this enclosure provides a great deal of shock protection.
- One potential issue with the HMD 176 is the build-up of heat, as the HMD gives off a substantial amount of heat.
- One method is to put vent holes (not shown) in the plastic cover 170, allowing direct access to cool air outside the cover 170; however, that can allow in foreign contaminants.
- the preferred method is to have one-way valves 172, 180 inside the SCBA mask 174. In this preferred method, as the user breathes in air, the air comes in through the mouthpiece as is normal, but then the air gets redirected by a one-way valve 175, up through a one-way valve 172 and thus into shell 170, then over the electronics, and away from the electronics through another one-way valve 180 before entering the user's airway. When exhaling, the moist, warm air gets a direct path to the outside via the one-way valve 175. This redirection of air is preferably accomplished through the use of typical one-way rubber valves.
- the protective design uses an obvious indicator to the user that the device is for training use only, and not for use in real emergencies.
- the preferred method uses an alternating yellow and black color scheme to get the user's attention that this is not a standard part. Additionally, a sign is used which indicates that the device is to be used for training purposes only.
- Additional hardware is attached to a firefighter's vari-nozzle 193 to allow control of a virtual water stream.
- the nozzle used is an Elkhart vari-nozzle.
- the instrumentation for the nozzle consists of (1) a potentiometer used to measure the nozzle fog pattern; (2) a potentiometer used to measure the nozzle bail angle; (3) an INTERSENSE (Burlington, MA) InertiaCube used to measure the nozzle orientation; and (4) two INTERSENSE (Burlington, MA) SoniDiscs used to measure the nozzle position. All of this equipment is connected by wiring that carries measurement signals through a tether to a computer and associated equipment (including an analog-to-digital converter) which receives and processes these signals.
- the InertiaCube and SoniDiscs are equipment from the InterSense IS-600 line of tracking equipment. If the end user of the nozzle calls for tracking equipment other than the IS-600 line, the invention could readily be adapted to protect equipment from the InterSense IS-900 line or 3rd Tech's optical tracking equipment.
- At least two potentiometers are mounted directly to the vari-nozzle 193.
- the InertiaCube 194 and SoniDiscs 183 are attached to a rigid, yet floating hard plastic island 181 which holds these items firmly in place.
- This island 181 is attached to a rigid hard plastic base 190 by two narrow, flexible posts 189.
- the base 190 is rigidly attached to the nozzle.
- a hard plastic cover 198 attaches to the base 190, providing a means for hiding all electronic equipment from view, except for the speakers 182 (FIG 27) on top of the SoniDiscs 183, which must be exposed.
- This cover 198 also constrains the floating island 181 from freely drifting laterally on the very flexible posts 189.
- the potentiometer underneath the plate 187 which measures the angle of the bail 184, is attached to the nozzle 193 by a soft polymeric coupling 185.
- the signal wires coming from certain instrumentation are attached to the mounting block 195 using screw-down connectors or soldering posts 186 integrated inside of the cover.
- the connection is held much more securely with this method than with standard plugs. By using solder or strong screw-down terminals, the wire connections are assured a quality connection.
- a separate cable (not shown) connects this common mounting block 195 to the computer and associated equipment which receives data from the nozzle.
- the specific wires that can be easily mounted in this way include (a) the leads from the SoniDiscs 183, and (b) the wires attached to the leads 188 of the potentiometer under the plate 187 and the potentiometer inside the pattern selector 192.
- the cable connection to the InertiaCube 194 may not be suitable to wire separately in this fashion, since the InertiaCube signals may be sensitive to interference due to shielding concerns, though it should be possible to use a connector provided by InterSense.
- the wires and/or cables are routed through a hole in the nozzle (not shown), and down the hose (not shown) to the end where they can come out and connect to the needed equipment. This method keeps the wires from being visible to the user.
- the equipment is protected by a plastic cover 198, which protects the overall assembly from both shock and penetration by foreign agents, such as water and dirt.
- the INTERSENSE (Burlington, MA) InertiaCube is sensitive to shock, especially the shock of the metal bail handle 184 hitting the metal stops at either extreme of its range of motion.
- the InertiaCube and SoniDiscs are mounted on an island which is held up by two soft polymeric pillars. This provides a great deal of protection against shock from slamming the bail handle all the way forward very quickly.
- This island is also surrounded by a thin layer of padding (not shown in the figures) located between the island 181, and the cover 198 to protect the island from horizontal shock.
- This thin layer also provides further protection from penetration by foreign agents, and can be made such that an actual seal is made around the circumference of the island.
- a small shoe made of soft material is used as a bumper 191 to prevent shock caused by setting the device down too rapidly or dropping it.
- the shoe also provides wear resistance to the base part in the assembly.
- the shoe is also a visual cue to the user that the device is to be set down using the shoe as a contact point.
- the protective design uses an alternating yellow and black color scheme (not shown) to get the user's attention that this is not a standard part. Additionally, a sign attached to the cover (not shown) is used which indicates that the device is to be used for training purposes only.
- One alternative to the display setup diagrammed in FIG 1 is the use of optical see-through AR.
- camera 4 and video mixer 3 are absent, and HMD 5 is one that allows its wearer to see computer graphics overlaid on his/her direct view of the real world.
- This embodiment is not currently preferred for fire fighting because current see-through technology does not allow black smoke to obscure a viewer's vision.
- a second alternative to the display setup diagrammed in FIG 1 is capturing and overlaying the camera video signal in the computer, which removes the video mixer 3 from the system diagram.
- This allows high-quality imagery to be produced because the alpha, or transparency, channel of the computer 1 graphics system may be used to specify the amount of blending between camera and CG imagery.
- This embodiment is not currently preferred because the type of image blending described here requires additional delay of the video signal over the embodiment of FIG 1, which is undesirable in a fire fighting application because it reduces the level of responsiveness and interactivity of the system.
- a third alternative to the display setup diagrammed in FIG 1 is producing two CG images and using one as an external key for luminance keying in a video mixer.
- two VGA-to-NTSC encoders 2 are used to create two separate video signals from two separate windows created on the computer 1.
- One window is an RGB image of the scene
- a second window is a grayscale image representing the alpha channel.
- the RGB image may be keyed with the camera image using the grayscale alpha signal as the keying image.
- Such an embodiment allows controllable transparency with a minimum of real-world video delay.
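The luminance keying described above can be sketched per pixel as follows. This is a hypothetical illustration, not code from the patent; the function name and the [0, 1] floating-point color convention are assumptions.

```python
def luminance_key_pixel(cg, key, cam):
    """Blend one CG pixel over one camera pixel using a grayscale key value.

    key = 1.0 selects the CG color, key = 0.0 selects the camera color,
    and intermediate key values blend the two linearly, per channel.
    """
    return tuple(key * c + (1.0 - key) * v for c, v in zip(cg, cam))
```

In the external-key arrangement, the grayscale alpha window supplies `key` and the RGB window supplies `cg`, so controllable transparency is achieved without delaying the camera signal through the computer.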
- FIG 1 diagrams the two 6 degree-of-freedom (6DOF) tracking stations 7 and 8 present in all embodiments of the system.
- One tracking station 7 is attached to the HMD 5 and is used to measure a user's eye location and orientation in order to align the CG scene with the real world. In addition to matching the real-world and CG eye locations, the fields of view must be matched for proper registration.
- the second tracking station 8 measures the location and orientation of a nozzle 9 that may be used to apply virtual extinguishing agents. Prediction of the 6DOF locations of 7 and 8 is done to account for system delays and allow correct alignment of real and virtual imagery. The amount of prediction is varied to allow for a varying CG frame rate.
- the system uses an InterSense IS-600 tracking system 6, and it also supports the InterSense IS-900 and Ascension Flock of Birds.

SOFTWARE
- A method for real-time depiction of fire is diagrammed in FIGS 2-4.
- a particle system is employed for each of the persistent flame, intermittent flame, and buoyant plume components of a fire, as diagrammed in FIG 4.
- the particles representing persistent and intermittent flames are created graphically as depicted in FIG 3.
- Four triangles make up a fire particle, with transparent vertices 12-15 at the edges and an opaque vertex 16 in the center. Smooth shading of the triangles interpolates vertex colors over the triangle surfaces.
- the local Y axis 27 of a fire particle is aligned to the direction of particle velocity, and the particle is rotated about the local Y axis 27 to face the viewer, a technique known as "billboarding.”
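The billboarding step above can be sketched as building an orthonormal basis for each particle. This is an illustrative sketch; the helper and function names are assumptions, not part of the specification.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def billboard_basis(particle_pos, particle_vel, eye_pos):
    """Build an orthonormal basis for a velocity-aligned billboard.

    The local Y axis follows the particle velocity; the particle is then
    rotated about that axis so its face (local Z) points toward the viewer.
    """
    y = normalize(particle_vel)
    to_eye = normalize(tuple(e - p for e, p in zip(eye_pos, particle_pos)))
    x = normalize(cross(y, to_eye))   # right vector, perpendicular to Y and the view
    z = cross(x, y)                   # facing direction, perpendicular to Y
    return x, y, z
```

The returned axes define the rotation applied to the four-triangle fire particle each frame.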
- a fire texture map is projected through both the persistent and intermittent flame particle systems and rotated about a vertical axis to give a horizontal swirling effect.
- Smoke particles, used to represent the buoyant plume portion of a flame, are created graphically as depicted in FIG 2.
- a texture map 11 representing a puff of smoke is applied to each particle 10, which consists of two triangles, and transparency of the texture-mapped particle masks the appearance of polygon edges.
- Smoke particles 10 are rotated about two axes to face the viewer, a technique known as "spherical billboarding.”
- the flame base 17 is used as the particle emitter for the three particle systems, and buoyancy and drag forces are applied to each system to achieve acceleration in the persistent flame, near-constant velocity in the intermittent flame, and deceleration in the buoyant plume.
- An external force representing wind or vent flow may also be applied to affect the behavior of the fire plume particles.
- When flame particles are born, they are given a velocity directed towards the center of the fire and a life span inversely proportional to their initial distance from the flame center.
- the emission rate of intermittent flame particles fluctuates sinusoidally at a rate determined by a correlation with the flame base area. Flame height may be controlled by appropriately specifying the life span of particles in the center portion of the flame.
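The birth rule and fluctuating emission rate described above might be sketched as follows. The constants and function names are illustrative assumptions, not values from the patent.

```python
import math

def birth_flame_particle(pos, flame_center, base_speed=1.5, k_life=0.8):
    """Give a newborn flame particle a velocity directed toward the flame
    center and a life span inversely proportional to its initial distance
    from the center.  base_speed and k_life are illustrative constants."""
    d = tuple(c - p for c, p in zip(flame_center, pos))
    dist = math.sqrt(sum(x * x for x in d)) or 1e-6
    velocity = tuple(base_speed * x / dist for x in d)
    life_span = k_life / dist
    return velocity, life_span

def intermittent_emission_rate(t, base_rate, freq_hz, depth=0.5):
    """Sinusoidally fluctuating emission rate for the intermittent flame
    particle system; freq_hz stands in for the rate derived from a
    correlation with the flame base area."""
    return base_rate * (1.0 + depth * math.sin(2.0 * math.pi * freq_hz * t))
```

Particles born near the center receive long life spans, which raises the flame height there, matching the control described above.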
- a number of graphical features contribute to the realistic appearance of the fire and smoke plume diagrammed in FIG 4.
- Depth buffer writing is disabled when drawing the particles to allow blending without the need to order the drawing of the particles from back to front.
- a light source is placed in the center of the flames, and its brightness fluctuates in unison with the emission rate of the intermittent flame particle system. The light color is based on the average color of the pixels in the fire texture map applied to the flame particles. Lighting is disabled when drawing the flame particles to allow them to be at full brightness, and lighting is enabled when drawing the smoke particles to allow the light source at the center of the flame to cast light on the smoke plume.
- a billboarded, texture-mapped polygon, with a texture that is a round shape fading from bright white in the center to transparent at the edges, is placed in the center of the flame to simulate a glow.
- the RGB color of the polygon is the same as the light source, and the alpha of the polygon is proportional to the density of smoke in the atmosphere. When smoke is dense, the glow polygon masks the individual particles, making the flames appear as a flickering glow through smoke.
- the glow width and height are scaled accordingly with the flame dimensions.
- FIG 5 describes the concept of two layers of smoke in a compartment.
- smoke from the buoyant plume rises to the top of a room and spreads out into a layer, creating an upper layer 20 and a lower layer 21 with unique optical densities.
- the lower layer has optical density k1
- the upper layer has optical density k2
- a polygonal model of the real room and contents is created.
- the model is aligned to the corresponding real-world room using the system of FIG 1.
- the above equations are applied to modify the vertex colors to reflect smoke obscuration. Smooth shading interpolates between vertex colors so that per-pixel smoke calculations are not required. If the initial color, Ci, of the vertices is white, and the smoke color, Cs, is black, the correct amount of obscuration of the real world will be achieved using the luminance keying method described above.
- the above equations can be applied to the alpha value of vertices of the room model.
- color values are generally specified using an integer range of 0 to 255 or a floating point range of 0 to 1.0.
- this color specification does not take into account light sources such as windows to the outdoors, overhead fluorescent lights, or flames, which will shine through smoke more than non-luminous objects such as walls and furniture.
- a luminance component was added to the color specification to affect how objects are seen through smoke.
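A standard form consistent with the layered obscuration model above is Beer-Lambert attenuation per layer. The sketch below is an assumption about the form of the equations referenced above (which are not reproduced in this text), not a quotation of them.

```python
import math

def smoke_obscured_color(c_initial, c_smoke, segments):
    """Attenuate a vertex color through smoke using a Beer-Lambert-style law.

    segments: list of (k, L) pairs -- optical density and path length
    through each smoke layer between the eye and the vertex.  The vertex
    color blends from its initial color toward the smoke color as the
    total transmission falls.
    """
    transmission = 1.0
    for k, L in segments:
        transmission *= math.exp(-k * L)
    return tuple(transmission * ci + (1.0 - transmission) * cs
                 for ci, cs in zip(c_initial, c_smoke))
```

With white initial vertices and black smoke, the result is the grayscale value needed by the luminance keying method described above.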
- One additional component to the layered smoke model is the addition of a smoke particle system, as depicted in FIG 2.
- a smoke particle system is placed in the upper, denser layer 20 to give movement to the otherwise static obscuration model.
- To determine the volume and optical density of the upper smoke layer, one method is to assign volume and density characteristics to the buoyant plume smoke particles. When a buoyant plume smoke particle fades after hitting the ceiling of a room, the volume and optical density of the particle can be added to the upper layer to change the height and optical density of the layer.
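The bookkeeping just described might look like the following sketch. The volume-weighted density averaging and the names are assumptions; the patent does not give the exact arithmetic.

```python
def absorb_plume_particle(layer_volume, layer_density, p_volume, p_density,
                          ceiling_area):
    """When a buoyant-plume smoke particle fades at the ceiling, add its
    volume and optical density to the upper layer (volume-weighted) and
    recompute the layer depth, which grows downward from the ceiling."""
    new_volume = layer_volume + p_volume
    new_density = (layer_density * layer_volume
                   + p_density * p_volume) / new_volume
    layer_depth = new_volume / ceiling_area
    return new_volume, new_density, layer_depth
```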
- the same polygonal model used for smoke obscuration is also used to allow real-world elements to occlude the view of virtual objects such as smoke and fire.
- a fire plume behind a real desk that has been modeled is occluded by the polygonal model. In the combined AR view, it appears as if the real desk is occluding the view of the fire plume.
- Graphical elements such as flame height, smoke layer height, upper layer optical density, and lower layer optical density may be given a basis in physics by allowing them to be controlled by a zone fire model.
- a file reader developed for the system allows CFAST models to control the simulation.
- CFAST, or Consolidated Fire and Smoke Transport, is a zone model developed by the National Institute of Standards and Technology (NIST) and used worldwide for compartment fire modeling.
- Upper layer temperature calculated by CFAST is monitored by the simulation to predict the occurrence of flashover, or full room involvement in a fire. The word "flashover" is displayed to a trainee and the screen is turned red to indicate that this dangerous event in the development of a fire has occurred.
- a key component in a fire fighting simulation is simulated behavior and appearance of an extinguishing agent.
- water application from a vari-nozzle 9, 23, and 25 has been simulated using a particle system.
- a surface representation of a particle system was devised. This representation allows very few particles to represent a water stream, as opposed to alternative methods that would require the entire volume of water to be filled with particles.
- Behavior such as initial water particle velocity and hose stream range for different nozzle settings is assigned to a water particle system. Water particles are then constrained to emit in a ring pattern from the nozzle location each time the system is updated. This creates a series of rings of particles 22 as seen in FIG 6.
- the regular emission pattern and spacing of particles allows a polygon surface to easily be created using the particles as triangle vertices, as seen in the wireframe mesh 24 in FIG 7.
- the surface 24 is texture-mapped with a water texture, and the texture map is translated in the direction of flow at the speed of the flow.
- a second surface particle system that is wider than the first is given a more transparent texture map to soften the hard edge of the first surface particle system representation.
- a third particle system using small billboards to represent water droplets is employed to simulate water splashing.
- collision detection with the polygonal room and contents model is employed.
- a ray is created from a particle's current position and its previous position, and the ray is tested for intersection with room polygons to detect collisions.
- the particle's velocity component normal to the surface is reversed and scaled according to an elasticity coefficient.
- the same collision method is applied to smoke particles when they collide with the ceiling of a room. Detection of collision may be accomplished in a number of ways.
- the "brute force" approach involves testing every particle against every polygon.
- a space partitioning scheme may be applied to the room polygons in a preprocessing stage to divide the room into smaller units.
- Some space partitioning schemes include creation of a uniform 3-D grid, binary space partitioning (BSP), and octree space partitioning (OSP).
- a simpler approach to collisions that is applicable in an empty rectangular room is the use of an axis-aligned bounding box.
- particles are simply given minimum and maximum X, Y, and Z coordinates, and a collision is registered if the particle position meets or exceeds the specified boundaries.
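The bounding-box collision above, combined with the elasticity-scaled reversal of the normal velocity component described earlier, can be sketched as follows. The elasticity default is illustrative.

```python
def collide_aabb(pos, vel, box_min, box_max, elasticity=0.4):
    """Clamp a particle to an axis-aligned bounding box and reflect the
    velocity component normal to any face the particle meets or exceeds,
    scaled by an elasticity coefficient."""
    pos, vel = list(pos), list(vel)
    for i in range(3):
        if pos[i] <= box_min[i]:
            pos[i] = box_min[i]
            vel[i] = -vel[i] * elasticity   # reverse and damp the normal component
        elif pos[i] >= box_max[i]:
            pos[i] = box_max[i]
            vel[i] = -vel[i] * elasticity
    return tuple(pos), tuple(vel)
```

In an empty rectangular room this replaces the per-polygon ray tests entirely.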
- steam is generated when water particles collide at or near the location of the fire.
- Steam particle emitters are placed at the collision locations and they are given an emittance rate that is scaled by the size of the fire and the inverse of the collision's distance from the fire.
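The emittance scaling just described might be written as below; the constant `k` and the minimum-distance clamp are illustrative assumptions.

```python
def steam_emittance(fire_size, dist_to_fire, k=1.0, min_dist=0.1):
    """Emission rate for a steam emitter placed at a water-collision point:
    scaled by the size of the fire and the inverse of the collision's
    distance from the fire, clamped to avoid division by zero."""
    return k * fire_size / max(dist_to_fire, min_dist)
```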
- Steam particles are rendered as spherically billboarded, texture-mapped polygons similar to the smoke particles in FIG 2, but with a different texture map 11 and different particle behavior.
- steam is generated when a hose stream is aimed at the upper, hot gas layer.
- Steam particle systems may be placed in this layer to simulate this phenomenon.
- Steam emittance in the upper layer can be directly proportional to the temperature of the upper layer as calculated by CFAST.
- Water particles that collide with the surface on which the flame base is located are stored as particles that can potentially contribute to extinguishment.
- the average age of these particles is used in conjunction with the nozzle angle to determine the average water density for the extinguishing particles.
- Triangles are created using the particle locations as vertices. If a triangle is determined to be on top of the fire, then an extinguishment algorithm is applied to the fire.
- Extinguishing a fire primarily involves reducing and increasing the flame height in a realistic manner. This is accomplished by managing three counters that are given initial values representing extinguish time, soak time, and reflash time. If intersection between water stream and flame base is detected, the extinguish time counter is decremented, and the flame height is proportionately decreased until both reach zero. If water is removed before the counter reaches zero, the counter is incremented until it reaches its initial value, which increments the flame height back to its original value. After flame height reaches zero, continued application of water decrements the soak counter until it reaches zero. If water is removed before the soak counter reaches zero, the reflash counter decrements to zero and the flames re-ignite and grow to their original height.
- the rate at which the extinguish and soak counters are decremented can be scaled by the average water density for more realistic behavior.
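The three-counter extinguishment logic above can be sketched as a small state machine. The counter values, and the choice to reset the soak counter on re-ignition, are illustrative assumptions.

```python
class FireState:
    """Extinguish / soak / reflash bookkeeping for one flame (or one grid
    square of a flame base)."""

    def __init__(self, extinguish_time=100, soak_time=50, reflash_time=30,
                 full_height=2.0):
        self.extinguish_init = extinguish_time
        self.soak_init = soak_time
        self.reflash_init = reflash_time
        self.extinguish = extinguish_time
        self.soak = soak_time
        self.reflash = reflash_time
        self.full_height = full_height

    @property
    def flame_height(self):
        # Flame height falls and rises in proportion to the extinguish counter.
        return self.full_height * self.extinguish / self.extinguish_init

    def update(self, water_on_base):
        if water_on_base:
            if self.extinguish > 0:
                self.extinguish -= 1        # knock the flames down
            elif self.soak > 0:
                self.soak -= 1              # soak the fuel after knockdown
            self.reflash = self.reflash_init
        elif self.extinguish > 0:
            if self.extinguish < self.extinguish_init:
                self.extinguish += 1        # water removed early: flames recover
        elif self.soak > 0:
            self.reflash -= 1               # knocked down but not yet soaked
            if self.reflash <= 0:           # re-ignition (soak reset is a sketch choice)
                self.extinguish = self.extinguish_init
                self.soak = self.soak_init
                self.reflash = self.reflash_init
```

One instance per grid square gives the independent per-square behavior described below for larger fires.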
- a flame base is divided into a 2-D grid of smaller areas.
- Each grid square is an emitter for three particle systems: persistent flames, intermittent flames, and buoyant plume.
- When flame particles are born in a grid square, they are given a velocity directed towards the center of the flame base and a life span inversely proportional to their initial distance from the flame center. This allows multiple flame particle systems to appear as a single fire.
- Each grid square has an independent flame height, extinguish counter, soak counter, and reflash counter. This allows portions of a flame to be extinguished while other portions continue to burn. This is especially useful for larger fires where the hose stream can only be directed at one part of the fire at a time.
- FIG 9 represents the preferred embodiment of a real-time graphical simulation of an extinguishing agent 29 (e.g., water or foam) exiting a nozzle 28 in the vicinity of a fire and smoke plume 30.
- each extinguishing agent, fire, or smoke particle will have a mass and a velocity associated with it.
- a force on the fire and smoke particles can be calculated from the speed and direction of extinguishing agent particles.
- this interaction may be described by an equation of the form (other actual forms are envisioned, but they will mainly show similar characteristics):

PVout = PVin + (K / R) (ExtAV - PVin)

where:
- PVout is the velocity (3-D vector) of the smoke and fire particles after the force is applied
- PVin is the velocity (3-D vector) of the smoke and fire particles before the force is applied
- ExtAV is the velocity (3-D vector) of the extinguishing agent particles
- R is the radial distance between the smoke and fire particles and the extinguishing agent particles
- K is a factor that can be adjusted (from a nominal value of 1) to tune the strength of the effect
- a force on the fire and smoke particles can then be calculated based on the change in velocity:

F = Mass (PVout - PVin) / Δt

where:
- F is the actual force (3-D vector) to be applied to the smoke and fire particles
- Mass is the mass of the fire or smoke particles
- Δt is the time between simulation updates
- the application of the calculated force simulates the visual effect of the extinguishing agent stream causing airflow that alters the motion of fire and smoke particles. Additionally, the calculations can be applied to smoke or other particles that are not part of a fire and smoke plume, such as extinguishing agent passing through ambient steam or ambient smoke particles in a room.
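A sketch of the velocity and force calculation follows. The inverse-distance, relative-velocity form is one plausible instance of the equation family described above (the text notes other forms with similar characteristics are envisioned); the names and constants are assumptions.

```python
def agent_force_on_particle(pv_in, ext_av, r, mass, dt, k=1.0):
    """Velocity change and force on a smoke/fire particle near an
    extinguishing-agent particle.

    pv_in, ext_av: 3-D velocities (tuples); r: radial distance between the
    particles; k: adjustable factor with a nominal value of 1.
    """
    r = max(r, 1e-6)                        # avoid division by zero
    # Pull the particle velocity toward the agent velocity, weaker with distance.
    pv_out = tuple(v + k * (a - v) / r for v, a in zip(pv_in, ext_av))
    # Force implied by the velocity change over one simulation step.
    force = tuple(mass * (vo - vi) / dt for vo, vi in zip(pv_out, pv_in))
    return pv_out, force
```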
- the invention extends to other aerosols, gases, and particulate matter, such as dust, chemical smoke, and fumes.
- a further refinement for determining a force to apply to particles in the fire and smoke plume 35 would entail modeling extinguishing agent 32-34 as cones 32, 33, and 34 (which are affected by gravity and will droop) emitted from the nozzle 31, where the multiple additional concentric cones 32 and 33 allow varying levels of force to be applied.
- One embodiment that can produce the cones 32, 33, and 34 can be a system of rings (the system of rings may be modeled as a particle system) emitted from the nozzle, which, when connected, form cones 32, 33, and 34.
- fire and smoke particles 35 which are contained mostly inside the inner cone 34 of the extinguishing agent 32-34 can have one level of force applied, and fire and smoke particles 35 which are not contained within cone 34, but are contained within cones 33 or 32 can have a different, often smaller, force applied to them.
- multiple levels of velocity from extinguishing agent and air entrainment can be easily simulated to apply multiple levels of force to the fire and smoke.
- the additional cones 33 and 32 do not have to be drawn in the simulation, as they could be used strictly in determining the force to apply to the fire and smoke.
- the force applied to a particle can be modeled as: (A) the extinguishing agent cone(s) 32, 33, 34 each having a velocity associated with them, (B) a difference in velocity between a particle 35 and the extinguishing agent cone(s) 32, 33, 34 can be calculated, (C) a force can be calculated that scales with that difference, and (D) the particles 35 will accelerate based on the force calculated, approaching the velocity of the extinguishing agent inside of the cone(s) 32, 33, 34.
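The cone-containment test and per-cone force levels might be sketched as follows. The cone angles and force values are illustrative, and the droop under gravity mentioned above is ignored in this sketch.

```python
import math

def point_in_cone(p, apex, axis, half_angle_rad):
    """True if point p lies inside an infinite cone with the given apex,
    unit axis direction, and half angle."""
    d = tuple(pc - ac for pc, ac in zip(p, apex))
    length = math.sqrt(sum(c * c for c in d))
    if length == 0.0:
        return True                          # the apex itself counts as inside
    cos_angle = sum(dc * ac for dc, ac in zip(d, axis)) / length
    return cos_angle >= math.cos(half_angle_rad)

def force_level(p, apex, axis, inner_half, mid_half, outer_half,
                forces=(3.0, 2.0, 1.0)):
    """Pick a force magnitude by which cone (inner 34, middle 33, outer 32)
    contains the particle; smaller forces apply in the outer cones."""
    for half_angle, f in zip((inner_half, mid_half, outer_half), forces):
        if point_in_cone(p, apex, axis, half_angle):
            return f
    return 0.0
```

The outer cones need not be drawn; they serve only to select the force level, as noted above.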
- the results of the simulated effects described above can be observed by drawing particles as computer-generated graphics primitives using real-time graphics software and hardware.
- the invention is applicable to such areas as training simulations and computer games.
- texture maps representing the movable object are mapped onto three orthogonal planes.
- a tracking sensor is used to determine the actual position and orientation of the object, as appropriate.
- when a plane is parallel to the view plane (facing the viewer), the texture map is opaque.
- when a plane is perpendicular to the view plane (edge-on to the viewer), the texture map is transparent.
- the texture map is faded between these two extremes as the orientation changes between these two extremes, to accomplish a desirable graphical mixing result that matches the silhouette of the object (or human) while maintaining a softer border around the edge of the silhouette contained in the texture map.
- the appearance of a soft (“fuzzy") border is made by fading to transparent the edges of the object silhouette in the texture map.
- a series of three texture maps containing silhouettes 36, 42, and 37 are shown mapped onto each of three orthogonal planes 38, 40 and 41, respectively.
- the texture maps may fade to transparent at their edges for a fuzzy appearance to the shape.
- the orthogonal planes are each broken up into 4 quadrants defined by the intersection of the planes, and the 12 resulting quadrants are rendered from back to front for correct alpha blending in OpenGL of the texture maps and planes, with depth buffering enabled.
- when a plane is perpendicular to the view plane of a virtual viewpoint looking at the object, the plane is rendered completely transparent.
- a linear fade is used to bring the texture map to completely opaque when the plane is parallel to the view plane. This fade from opaque to transparent as the planes turn relative to a viewer is responsible for a large part of the desirable fuzzy appearance of the shape.
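One plausible reading of the linear fade described above is a fade linear in the angle between a plane's normal and the view direction; the function name and this interpretation are assumptions.

```python
import math

def plane_alpha(plane_normal, view_dir):
    """Opacity of one orthogonal billboard plane: 1.0 when the plane is
    parallel to the view plane (its normal along the view direction),
    0.0 when edge-on, fading linearly with the angle between them."""
    dot = abs(sum(n * v for n, v in zip(plane_normal, view_dir)))
    n1 = math.sqrt(sum(c * c for c in plane_normal))
    n2 = math.sqrt(sum(c * c for c in view_dir))
    angle = math.acos(min(1.0, dot / (n1 * n2)))
    return 1.0 - angle / (math.pi / 2.0)
```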
- the texture maps used to shade the three planes were created from a digital image of a person, then made into grayscale silhouettes, and so match the silhouette of a human user very well.
- the edges 39 of the human silhouettes in the texture maps were blurred so that they fade linearly from solid white (which represents the silhouette) to solid black (which represents non-silhouette portions of the texture map), which looks better in an augmented reality situation where a slightly fuzzy edge is desirable. This fuzzy edge spans the equivalent of approximately 0.5 inches in real distance.
- FIG 12 depicts a diagram of a human 44 wearing a head tracker 43 on a head mounted display.
- the virtual representation of human 44 is shown used in the inventive technique in FIG 13.
- Item 43 in FIG 12 is a motion tracker, in this case a six-degree-of-freedom motion tracker that measures the head location and orientation.
- the position and orientation information from tracker 43 can be applied to orthogonal plane billboards.
- an orthogonal plane billboard torso 47 can be created in approximately the correct place relative to the joint.
- the torso in this instance may be designed to remain upright, only rotating about a vertical axis.
- the head in this instance has full 6 degree-of-freedom motion capability based on the data coming from the tracker worn on the head of the user. This allows the head orthogonal plane billboard to be lined up correspondingly with the user's head.
- the torso orthogonal plane billboard is attached to the pivot point 46 and is placed "hanging" straight down from that point, and has 4 degrees of freedom: three to control its position in space, and one controlling the horizontal orientation.
- the head and torso models when lined up to a real human, occlude computer-generated graphics in a scene. If augmented reality video mixing is achieved with a luminance key to combine live and virtual images, white head and torso models will mask out a portion of a computer-generated image for replacement with a live video image of the real world.
- This invention can be applied to any real world movable objects for which an occlusion model may be needed.
- the above technique is also applicable to a movable non-person real world object. If the non-person object has no joints, then such an implementation is simpler since the complexity of coupling the separate head and torso models is avoided.
- the technique for the single movable real-world physical object is functionally identical to the above method when only the head model is used.
- 3-D audio allows sound volume to diminish with distance from a sound emitter, and it also works with stereo headphones to give directionality to sounds.
- 3-D audio emitters are attached to the fire and the hose nozzle.
- the fire sound volume is proportional to physical volume of the fire.
- Appendix A contains settings for the parameters of particle systems used in the invention. These parameters are meant to be guidelines that give realistic behavior for the particles. Many of the parameters are changed within the program, but the preferred starting parameters for flames, smoke, steam, and water are listed in the appendix.
- Flashover refers to the point in the evolution of a compartment fire in which the fire transitions from local burning to involvement of the entire compartment.
- a zone-type fire model should provide sufficient accuracy for meeting our training objectives.
- there are several zone models, including the Consolidated Fire and Smoke Transport model (CFAST) from NIST and the WPI fire model from Worcester Polytechnic Institute, among others.
- the outputs of a zone-type fire model can be extended to achieve a visual representation of a compartment fire.
- Task 1 Infrastructure for Real-Time Display of Fire Task Summary.
- the organization and structuring of information to be displayed is as important as actual display processing for real-time dynamical presentation of augmented environments.
- as a firefighter moves through a scenario (using an augmented reality device), the location, extent, and density of fire and smoke change.
- an approach is to examine the transfer of data to and from the hard disk, through system memory, to update display memory with the desired frequency.
- Precomputation of the bulk of a firefighter training simulation implies that most of the operations involved in real-time presentation of sensory information revolve around data transfer.
- Task 2 Visual Representation of Smoke and Fire Task Summary.
- the way in which sensory stimuli are presented in an ARBT scenario may or may not affect task performance by a student. It is essential to capture the aspects of the sensory representations of fire and smoke that affect student behavior in a training scenario without the computational encumbrance of those aspects that do not affect behavior.
- For the purposes of providing sensory stimuli for firefighter training we need to know not only the spatial distribution and time evolution of temperature and hot gases in a compartment fire, but also the visible appearance of smoke and flame, along with sounds associated with a burning compartment, taken over time. There are three tiers of attributes of fire and smoke:
- Second tier: opacity, luminosity, and dynamics
- a zone-type fire model can be used to determine the location and extent of the smoke and flame. In addition to these quantities, the zone-type fire model also will yield aerosol densities in a given layer. Values for optical transmission through smoke can be calculated using a standard model such as found in the CFAST (Consolidated Fire and Smoke Transport) model, or in the EOSAEL (Electro-Optical Systems Atmospheric Effects Library) code.
- it is known that the intermittent flame region in a fire oscillates with regularity, and that the oscillations arise from instabilities at the boundary between the fire plume and the surrounding air.
- the instabilities generate vortex structures in the flame which in turn rise through the flame resulting in observed oscillations.
- The visual dynamics of flame can be modeled from empirical data such as are known in the art. Measures of Success. This task can be judged on the aesthetics of the visual appearance of the simulated fire and smoke. Ultimately, the visual appearance of fire and smoke should be evaluated relative to the efficacy of an ARBT system.
- Task 3 Position Anchoring Task Summary. Augmented reality techniques rely on superimposing information onto a physical scene; superposition means that information is tied to objects or events in the scene. It is therefore necessary to compensate for movement by an observer in order to maintain the geometric relations between superimposed information and the underlying physical structures in the scene.
- Position sensors in the form of a head tracker can, in real time, track changes in location caused by movement of a firefighter within a training scenario. Virtual objects are adjusted accordingly so that they remain "fixed" to the physical world.
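The per-frame compensation amounts to re-expressing each world-fixed virtual object in the viewer's current reference frame. A 2-D sketch under simplifying assumptions (position plus yaw only); a real HMD pipeline would use full 6-DOF poses with quaternion orientations:

```python
import math

def world_to_view(obj_world, head_pos, head_yaw_rad):
    """Express a world-fixed virtual object in the viewer's frame.

    Re-running this each frame with the latest head-tracker reading
    keeps the object visually anchored to the physical scene.
    2-D illustration only: (x, y) positions and a single yaw angle.
    """
    # Translate into a head-centered frame, then undo the head rotation
    dx = obj_world[0] - head_pos[0]
    dy = obj_world[1] - head_pos[1]
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

For example, an object one meter ahead stays at the same view-space position only as long as the head does not move; any translation or rotation of the head changes the returned coordinates, which is exactly the adjustment that keeps the overlay registered to the room.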
- Task 4 Authoring Tools Task Summary.
- An authoring system typically takes the form of a visual programming interface over a modular toolkit of fundamental processes.
- A training instructor can use an authoring tool to visually select and sequence modules to create the desired training course, without ever having to resort to direct programming in a computer language such as C or FORTRAN.
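Underneath such a visual interface sits a registry of composable modules that the instructor chains into a scenario. A minimal sketch of that modular-toolkit idea; the module names and state dictionary are hypothetical, chosen only to illustrate sequencing without hand-written simulation code:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# A module is any function that transforms scenario state.
ModuleFn = Callable[[Dict], Dict]

@dataclass
class Scenario:
    """Ordered chain of modules, as a visual authoring tool might build."""
    steps: List[ModuleFn] = field(default_factory=list)

    def add(self, module: ModuleFn) -> "Scenario":
        self.steps.append(module)
        return self

    def run(self, state: Dict) -> Dict:
        for step in self.steps:
            state = step(state)
        return state

# Toy modules standing in for real simulation building blocks.
def ignite(state):
    return {**state, "fire": True}

def spread_smoke(state):
    return {**state, "smoke_layer_m": 0.5 if state.get("fire") else 0.0}

scenario = Scenario().add(ignite).add(spread_smoke)
result = scenario.run({})
```

The instructor's visual act of dragging modules into order corresponds to the `.add(...)` calls; no C or FORTRAN is written, only a sequence over pre-built blocks.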
- Task 5 ARBT Technology Demonstration Task Summary. The previous tasks developed the pieces of an augmented reality fire simulation; it remains to pull everything together into a coherent demonstration showing the suitability of the selected technologies for delivering training to firefighters. Approach. A scenario consisting of a real room and a virtual fire is to be constructed, and a problem-solving situation will be presented to prospective trainees.
- Middle Color (RGBA) values from the rendering parameter listing: (1.0, 1.0, 1.0, 1.0), (1.0, 1.0, 1.0, 0.85), (1.0, 1.0, 1.0, 1.0), (1.0, 1.0, 1.0, 0.45)
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Business, Economics & Management (AREA)
- General Engineering & Computer Science (AREA)
- Emergency Management (AREA)
- Optics & Photonics (AREA)
- Pulmonology (AREA)
- General Health & Medical Sciences (AREA)
- Computer Graphics (AREA)
- Public Health (AREA)
- Geometry (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US123364 | 1980-02-21 | ||
US927043 | 1992-08-10 | ||
US09/927,043 US7110013B2 (en) | 2000-03-15 | 2001-08-09 | Augmented reality display integrated with self-contained breathing apparatus |
US10/123,364 US6822648B2 (en) | 2001-04-17 | 2002-04-16 | Method for occlusion of movable objects and people in augmented reality scenes |
PCT/US2002/025065 WO2003015057A1 (en) | 2001-08-09 | 2002-08-07 | Augmented reality-based firefighter training system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1423834A1 true EP1423834A1 (de) | 2004-06-02 |
Family
ID=26821473
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP02752732A Withdrawn EP1423834A1 (de) | 2001-08-09 | 2002-08-07 | Ergänztes realitätsgestütztes trainingssystem und verfahren für feuerwehrleute |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1423834A1 (de) |
CA (1) | CA2456858A1 (de) |
WO (1) | WO2003015057A1 (de) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109117533A (zh) * | 2018-07-27 | 2019-01-01 | 上海宝冶集团有限公司 | 基于bim结合vr的电子厂房消防方法 |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2400513B (en) * | 2003-03-14 | 2005-10-05 | British Broadcasting Corp | Video processing |
US7719483B2 (en) | 2005-10-13 | 2010-05-18 | Honeywell International Inc. | Synthetic vision final approach terrain fading |
EP3515063A1 (de) | 2007-04-02 | 2019-07-24 | Esight Corp. | Verfahren zur sehverstärkung |
US7769806B2 (en) | 2007-10-24 | 2010-08-03 | Social Communications Company | Automated real-time data stream switching in a shared virtual area communication environment |
WO2009146130A2 (en) | 2008-04-05 | 2009-12-03 | Social Communications Company | Shared virtual area communication environment based apparatus and methods |
US9853922B2 (en) | 2012-02-24 | 2017-12-26 | Sococo, Inc. | Virtual area communications |
US9069851B2 (en) | 2009-01-15 | 2015-06-30 | Social Communications Company | Client application integrating web browsing and network data stream processing for realtime communications |
CA3043204C (en) | 2009-11-19 | 2021-08-31 | Esight Corp. | Apparatus and method for a dynamic "region of interest" in a display system |
US8610771B2 (en) | 2010-03-08 | 2013-12-17 | Empire Technology Development Llc | Broadband passive tracking for augmented reality |
AT509799B1 (de) * | 2010-04-29 | 2013-01-15 | Gerhard Gersthofer | Übungsanordnung mit feuerlöscher zur brandbekämpfung |
WO2013111145A1 (en) * | 2011-12-14 | 2013-08-01 | Virtual Logic Systems Private Ltd | System and method of generating perspective corrected imagery for use in virtual combat training |
WO2013119802A1 (en) | 2012-02-11 | 2013-08-15 | Social Communications Company | Routing virtual area based communications |
WO2013181026A1 (en) | 2012-06-02 | 2013-12-05 | Social Communications Company | Interfacing with a spatial virtual communications environment |
US11020624B2 (en) | 2016-04-19 | 2021-06-01 | KFT Fire Trainer, LLC | Fire simulator |
KR101867153B1 (ko) * | 2016-05-25 | 2018-06-12 | 민상규 | 가상 현실 겸용 핸드폰 케이스 |
NO20161132A1 (en) * | 2016-07-07 | 2017-10-30 | Real Training As | Training system |
FR3064801B1 (fr) * | 2017-03-31 | 2019-08-09 | Formation Conseil Securite | Simulateur de manipulation de dispositif d’extinction d’incendie |
WO2020256965A1 (en) * | 2019-06-19 | 2020-12-24 | Carrier Corporation | Augmented reality model-based fire extinguisher training platform |
CN112930061A (zh) * | 2021-03-04 | 2021-06-08 | 贵州理工学院 | 一种vr-box体验互动装置 |
FR3120731B1 (fr) * | 2021-03-14 | 2023-04-28 | Ertc Center | Systeme d’entrainement aux risques et menaces nrbc |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5059124A (en) * | 1990-08-03 | 1991-10-22 | Masahiro Tsujita | Imitation apparatus for fire extinguishing training |
GB9409790D0 (en) * | 1994-05-16 | 1994-07-06 | Lane Kerry | Fire fighting simulator |
US5660549A (en) * | 1995-01-23 | 1997-08-26 | Flameco, Inc. | Firefighter training simulator |
US5920492A (en) * | 1996-04-26 | 1999-07-06 | Southwest Research Institute | Display list generator for fire simulation system |
US6129552A (en) * | 1996-07-19 | 2000-10-10 | Technique-Pedagogie-Securite Equipements | Teaching installation for learning and practicing the use of fire-fighting equipment |
US5984684A (en) * | 1996-12-02 | 1999-11-16 | Brostedt; Per-Arne | Method and system for teaching physical skills |
- 2002
- 2002-08-07 WO PCT/US2002/025065 patent/WO2003015057A1/en not_active Application Discontinuation
- 2002-08-07 EP EP02752732A patent/EP1423834A1/de not_active Withdrawn
- 2002-08-07 CA CA002456858A patent/CA2456858A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See references of WO03015057A1 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109117533A (zh) * | 2018-07-27 | 2019-01-01 | 上海宝冶集团有限公司 | 基于bim结合vr的电子厂房消防方法 |
Also Published As
Publication number | Publication date |
---|---|
CA2456858A1 (en) | 2003-02-20 |
WO2003015057A1 (en) | 2003-02-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6809743B2 (en) | Method of generating three-dimensional fire and smoke plume for graphical display | |
EP1423834A1 (de) | Ergänztes realitätsgestütztes trainingssystem und verfahren für feuerwehrleute | |
Vince | Introduction to virtual reality | |
Vince | Essential virtual reality fast: how to understand the techniques and potential of virtual reality | |
US8195084B2 (en) | Apparatus and method of simulating a somatosensory experience in space | |
CN102540464B (zh) | 提供环绕视频的头戴式显示设备 | |
US20020191004A1 (en) | Method for visualization of hazards utilizing computer-generated three-dimensional representations | |
US20030210228A1 (en) | Augmented reality situational awareness system and method | |
US7046214B2 (en) | Method and system for accomplishing a scalable, multi-user, extended range, distributed, augmented reality environment | |
WO2007133209A1 (en) | Advanced augmented reality system and method for firefighter and first responder training | |
US20060114171A1 (en) | Windowed immersive environment for virtual reality simulators | |
JP2007229500A (ja) | ユーザーを仮想現実に没入させるための方法及び装置 | |
AU2002366994A1 (en) | Method and system to display both visible and invisible hazards and hazard information | |
Spanlang et al. | A first person avatar system with haptic feedback | |
Vichitvejpaisal et al. | Firefighting simulation on virtual reality platform | |
Hatsushika et al. | Underwater vr experience system for scuba training using underwater wired hmd | |
Wilson et al. | Design of monocular head-mounted displays for increased indoor firefighting safety and efficiency | |
SE523098C2 (sv) | Anordning och förfarande för att i en reell omgivning skapa en virtuell företeelse | |
Rodrigue et al. | Mixed reality simulation with physical mobile display devices | |
AU2002355560A1 (en) | Augmented reality-based firefighter training system and method | |
USRE45525E1 (en) | Apparatus and method of simulating a somatosensory experience in space | |
Segura et al. | Interaction and ergonomics issues in the development of a mixed reality construction machinery simulator for safety training | |
Schoor et al. | Elbe Dom: 360 Degree Full Immersive Laser Projection System. | |
Sibert et al. | Initial assessment of human performance using the gaiter interaction technique to control locomotion in fully immersive virtual environments | |
Gupta et al. | Training in virtual environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20040304 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL LT LV MK RO SI |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20080301 |