WO1997019398A1 - A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method - Google Patents

A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method

Info

Publication number
WO1997019398A1
Authority
WO
WIPO (PCT)
Prior art keywords
tactile
user
control device
virtual reality
tactile stimulus
Prior art date
Application number
PCT/IB1996/001231
Other languages
French (fr)
Inventor
David Victor Keyson
Original Assignee
Philips Electronics N.V.
Philips Norden Ab
Priority date
Filing date
Publication date
Application filed by Philips Electronics N.V., Philips Norden Ab filed Critical Philips Electronics N.V.
Priority to EP96935264A priority Critical patent/EP0806002A1/en
Priority to JP9519547A priority patent/JPH10513593A/en
Publication of WO1997019398A1 publication Critical patent/WO1997019398A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/64 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 Driving on land or water; Flying
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/013 Force feedback applied to a game

Definitions

  • a method for presenting virtual reality enhanced with tactile stimuli and a system for executing the method.
  • the invention relates to a method for enabling a user of a data processing system to navigate through a virtual reality environment, wherein the method comprises generating a sequence of visual representations of the virtual reality environment on a visual display in response to the user manipulating a control device.
  • the invention also relates to a data processing system for executing the method. Further advantageous aspects are recited in dependent Claims.
  • US Patent 5,459,382 to Jacobus et al describes a method for providing tactile feedback in a virtual reality environment with six degrees of freedom.
  • Dependent on the actual motion control effected by the user on an interface device, first a virtual reality force field generator is activated, which in turn causes a force signal to be generated back to the interface device.
  • Through the definition of a conservative force field, various types of force feedback can be emulated for the user, allowing the user to exercise operations that later will have to be executed in the real world.
  • the reference does not allow exercising with unexpected and, in particular, dynamic disturbances that would make real-life operating so much more complicated.
  • the reference relates principally to simulating the real world, rather than a fully virtual reality such as is proper to entertainment systems.
  • the method according to the invention is characterized by providing via a feedback mechanism appertaining to the manipulated control device, a preprogrammed dynamic tactile stimulus from an available database with tactile stimuli, to the user under selective and combined control by an actual interval of the sequence of generated visual representations.
  • tactile stimuli in particular dynamic stimuli, when added to visual, and possibly auditory, stimuli, can contribute significantly to the user's impression of being actually involved in a virtual reality scenery such as in video games.
  • Video and/or graphics are used to create visual representations of a specific scenery from a variety of viewpoints.
  • the visual representation is altered through the control device as if the user were moving through the scenery.
  • the visual representation shown on the display is a mapping of the virtual scenery onto a plane selected through the control device.
  • a tactile texture is joined to the video information.
  • tactile representations are mappings of attributes, pre-assigned to events in the virtual reality scenery, onto the control device.
  • the tactile stimulus in the present invention is generated under control of the occurrence of at least a specific one of the visual representations of the virtual scenery, in combination with an actual pointing direction of the control device on a predetermined image subfield.
  • the occurrence of a specific pixel pattern or texture, a particular color, or a particular level of greyness is accompanied by one or more particular tactile stimuli.
  • This integration of visual and tactile is utilized to let the user associate a tactile stimulus with an object, an area or an event in the virtual reality scenery.
  • the tactile stimulus in the invention is generated depending on a rate at which certain visual representations succeed each other in the sequence. This combination is employed to let the user associate the tactile stimulus with a speed at which the user moves through the virtual reality scenery.
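  • The coupling described above (a stimulus selected by what is displayed under the control device's pointing direction, paced by how fast the visual representations succeed each other) can be sketched as follows. The stimulus database, the names, and the base frame rate are illustrative assumptions, not data from the patent.

```python
# Hypothetical sketch: a small tactile "database" mapping a displayed surface
# under the pointer to a stimulus waveform and amplitude; the stimulus rate
# scales with the rate of the image sequence. All entries are illustrative.
STIMULI = {
    "grass":        ("drag", 0.2),
    "cobblestones": ("pulse_train", 0.6),
    "asphalt":      ("smooth", 0.05),
}

def select_stimulus(subfield_texture, frames_per_second, base_fps=25.0):
    """Pick the stimulus for the texture in the pointed-at image subfield and
    scale its repetition rate with the speed of the visual sequence."""
    waveform, amplitude = STIMULI[subfield_texture]
    rate_scale = frames_per_second / base_fps  # faster movement -> faster stimuli
    return waveform, amplitude, rate_scale

waveform, amp, scale = select_stimulus("cobblestones", frames_per_second=50.0)
```

Moving twice as fast through the scenery (50 instead of 25 frames per second here) doubles the rate of the cobblestone pulses, matching the bullet above.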
  • a video recording is made of a real scenery from a plurality of positions and from a plurality of viewing angles.
  • the video data are used to provide the user with particular ones of a variety of visual representations of a virtual reality on a display when moving through the scenery. That is, the displayed image as a whole changes when the user moves, about as if perceived through a moving car's windscreen.
  • the user is enabled to navigate through this virtual reality by controlling, for example, a mouse, trackball or joystick provided with features that provide force feedback. Tactile textures are associated with the visual representation.
  • the visual representation of grass is combined with a slight amount of drag felt through the mouse or trackball when moving across a lawn.
  • a representation of a tree is felt as an impenetrable object.
  • a lake is represented by a pit and a dynamic wave resistance pattern when moving.
  • a resistance with directional characteristics gives the impression of going uphill or going in the teeth of a gale, etcetera.
  • the landscape has different types of surface regarding texture (hardness, flatness) and extent, such as mud, asphalt, grass, loose sand, cobblestones, brushwood, frozen underground, and all kinds of obstacles such as trees, stone walls, hills, rivers, ditches, flock of sheep, turnpikes, etcetera.
  • the relevant area of the landscape being displayed at a particular moment depends on the bike's actual direction and location with respect to the scenery.
  • the surface of the area is tactilely represented by stimuli experienced by the user through the control device.
  • the above tactile stimuli correspond to spatial characteristics, such as locations or orientations in the virtual scenery.
  • a conditional tactile stimulus could be the occurrence of irregular and tiny shocks, such as simulating tomatoes thrown by a (virtual) angry farmer when he is passed too closely.
  • tactile stimuli give a new dimension to virtual reality and considerably increase the sense of the user of actually being involved.
  • the synergetic combination of video, audio and tactile stimuli in a software application considerably extends the concept of multimedia applications. Not only video games but also (semi-) professional application may benefit from tactile enhancement.
  • a computer model of a novel suspension for a motorcycle or a computer model of a novel steering geometry for a car may use the invention to provide a first impression of the behaviour of the suspension or steering in all kinds of terrain and at all kinds of speed.
  • Figure 1 is a block diagram of a system according to the invention;
  • Figures 2-4 illustrate various events in an example of a software application that combines video and tactile stimuli;
  • Figure 5 is a first geometry diagram of the invention;
  • Figure 6 is a second geometry diagram of the invention. Throughout the drawing, same reference numerals indicate similar or corresponding features.
  • FIG. 1 is a diagram of a data processing system 100 according to the invention.
  • System 100 comprises a display 102, a control device 104 and a data processing apparatus 106.
  • Control device 104 comprises, for example, one or more of the devices disclosed in published European patent application 0 489 469.
  • Manipulation of device 104 controls, via apparatus 106, an image or a sequence of images that are shown on display 102.
  • Apparatus 106 supplies to device 104 control signals that generate stimuli for being tactilely sensed by the user. The signals have a predetermined relationship with the images actually shown on display 102.
  • Apparatus 106 supplies to display 102 video and/or graphics data that represents a virtual reality scenery.
  • the visual representation of the scenery changes in response to manipulation of control device 104.
  • Manipulation of device 104 gives the visual impression of travelling through the scenery. For example, successive images are interrelated through a perspective transformation or through a displacement transformation.
  • Apparatus 106 also supplies the control signals to evoke tactile feedback to the user through control device 104.
  • the control signals are generated in synchronism with the movement through the virtual reality scenery or with the occurrence of some event in the virtual reality scenery, i.e. , the control signals are related to a succession of images.
  • Apparatus 106 therefore is operative to combine video or graphics data with tactile data, the latter determining the generation of the control signals.
  • the parallel supply of tactile data and video data is pre-determined in the sense that the generation of a particular image or of a sequence of particular images is accompanied by a particular tactile feedback. This is realized, for example, much in the same way as video or graphics data are accompanied by audio data in conventional video games.
  • the user of the system in the invention is enabled to select the video data interactively and the corresponding tactile data are then generated automatically. For example, a more rapid sequence of images then also brings about a more rapid sequence of the associated tactile stimuli.
  • system 100 comprises a first storage means 108 to store video data for generation of the visual representations under control of control device 104, and second storage means 110 to store tactile data for control of the generation of the tactile stimulus in a predetermined relationship with the video data.
  • First and second storage means 108 and 110 are preferably physically integrated with one another, and the tactile data is preferably logically combined with the video data.
  • the video data throughput must be correspondingly high, whereas the throughput of tactile data is lower than that of video data, as tactile stimuli are typically low-frequency signals.
  • the integration allows reading of the video data and the tactile data, which is logically combined with the video data, without substantially hampering the flow of video data by undesired, time-consuming context switching.
  • One way to achieve this is, for example, merging tactile data with some video data records, identifying the tactile data as such upon reading, and supplying the tactile data to a look-up table 112.
  • the tactile data are then just signals to control the generation of the tactile stimuli at control device 104.
  • look-up table 112 is user-programmable to enable selection or tuning of the tactile feedback.
  • storage means 108 and 110 are integrated with each other in a single information carrier 114 or cartridge such as a diskette or a CD. System 100 can then be used with different software applications by simply exchanging the carrier 114.
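  • A minimal sketch of the read-out path described above, assuming a record layout of our own invention: tactile records are merged into the stream on the carrier, identified by a tag when read, and translated through a user-programmable look-up table (cf. look-up table 112) so the video flow is not interrupted by context switching.

```python
# User-programmable look-up table: tactile code -> control-signal parameters.
# Codes and parameter values are illustrative assumptions.
lut = {
    0x01: {"force": 0.3, "pattern": "drag"},   # e.g. grass
    0x02: {"force": 0.8, "pattern": "shock"},  # e.g. pothole
}

def demultiplex(records):
    """Split an interleaved record stream into video frames and tactile
    control signals, translating tactile codes through the look-up table."""
    video_frames, tactile_signals = [], []
    for tag, payload in records:
        if tag == "video":
            video_frames.append(payload)
        elif tag == "tactile":
            tactile_signals.append(lut[payload])  # payload is a tactile code
    return video_frames, tactile_signals

stream = [("video", b"frame0"), ("tactile", 0x01), ("video", b"frame1")]
frames, signals = demultiplex(stream)
```

Because the tactile records ride along in the same stream, exchanging the carrier 114 swaps both the imagery and its tactile annotations at once.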
  • Figures 2-4 give some examples of snap-shots of a video game of the kind mentioned above, wherein the user is supposed to play the role of a police officer on a motorcycle 200.
  • Control device 104 controls speed and direction of motorcycle 200 through the virtual reality scenery.
  • Control device 104 comprises, for example, a first trackball or joystick for directional control and a second trackball or joystick for speed control, both the speed controller and the directional controller being provided with force-feedback means.
  • the officer is to chase a vehicle 202 driven by poachers 204, 206, 208 and 210. Poachers 204-210 are strategically trying to stay beyond reach of the strong arm of the law and drive their truck 202 wildly through the scenery. The scenery changes as the hunt continues. End of the game is when, for example, either the poachers escape or the user has come so close that the truck's license plate 212 can be read, i.e.
  • plate 212 is being displayed in legible characters.
  • the video representation of motorcycle 200 is made to lean with respect to the vertical and horizontal axes in the scenery in synchronism with a change of direction brought about via control device 104 to intensify the impression of the user riding motorcycle 200 himself.
  • motorcycle 200 is trying to catch truck 202 speeding along a bumpy path 204 of loose sand. It is hard to ride a motorcycle along a predetermined line through loose sand as the rear wheel tends to start wobbling, trying to throw the motorcycle off course, and as steering requires a considerable torque to be applied to the handlebars when changing directions.
  • This is made to be felt at control device 104 by system 100 exerting alternating transversal forces on the user's hand, depending on, e.g., speed and leaning angle.
  • the user therefore has to concentrate very hard in order not to lose control over motorcycle 200 and, at the same time, not to lose sight of truck 202.
  • motorcycle 200 is ridden on a grey autumn day, in a drizzle at dusk (dinner time), along a road 300 with bumps and potholes that are made to be felt as sudden shocks at control device 104, and fallen leaves that tend to gather where the road makes a bend, and that are made to be felt as a sudden loss of resistance at control device 104.
  • motorcycle 200 is chasing truck 202 along a street.
  • Poacher 210 is trying to hit the officer by throwing objects at him, such as, say, tomatoes, spare wheels and other things that are carried in the trunk of a truck for this purpose. So, in addition to keeping motorcycle 200 on course, the user has to avoid projectiles 402, 404 and 406 that otherwise may hit the windscreen of the motorcycle and block the user's vision, or, worse, throw motorcycle 200 off the track altogether.
  • the tactile stimuli introduced under Figures 2 and 3 relate to a spatial characteristic: the virtual texture of some areas in the scenery.
  • the tactile stimulus in Figure 4 is made conditional in that it depends on the occurrence of the situation wherein certain events coincide: both the projectile 400 and motorcycle 200 are to occupy the same space in the virtual reality scenery at the same time in order to generate the tactile stimuli.
  • temporal tactile stimuli may be introduced.
  • the front tyre of motorcycle 200 may sustain damage, as a result of which it loses pressure.
  • Steering motorcycle 200 then becomes increasingly difficult, which is felt at control device 104 as wobbling reaction forces of increasing amplitude.
  • the pavement of the route through the virtual reality scenery can be made to feel increasingly slippery with time, e.g., through reduction of reaction forces on the user manipulating control device 104.
  • tactile stimuli can be used in a video game dealing with how to negotiate obstacles on different types of terrain in a cross-country run, steeple chase or voyage of discovery; tactile stimuli can be used to simulate the water waves in a video game relating to canoeing in wild water or to the mooring of a vessel to an oil-rig in a storm; in a video game concerning a low-level flight of a helicopter over broken ground and among obstacles, tactile cues may be used to transmit the vibration that in reality is inherent in this type of aircraft.
  • the tactile data to accompany the video data may be calculated in real-time when moving through the virtual reality scenery.
  • This approach can be used, for example in a professional or semi-professional environment, when a computer model for a suspension system or a steering geometry of a vehicle is tested.
  • Using the invention provides a first impression of the behaviour of the suspension in all kinds of terrain and at all kinds of speed. The developer is thus enabled to make a purposive selection of parameter values describing the best performing computer trials for real testing later on.
  • tactual fields can be used in combination with auditory and visual images. For example, one could spatially navigate through linked video information, similar to moving through a virtual reality environment, and encounter various heard, seen and felt objects. Tactual representations for specific objects may extend beyond a representation of the object's physical surface plane. Dynamic fields can be created using tactual information to suggest a certain feeling or experience. They can also be used to guide the user in a suggested direction of movement. The fields may be displayed using force feedforward (active hand-independent device movement) and feedback through a device such as a 3-D trackball with force feedback. Furthermore, two or more dynamic tactual fields can be combined to create a new experience. Sound effects (e.g., from a video source) may also accompany tactual effects. Several examples of dynamic tactual fields are described below.
  • the conveyor belt acts like a walking platform that moves.
  • the input device (e.g., a trackball) begins to move and leads the user to a predesignated position.
  • This movement can be based on using a constant force, which causes the input device to accelerate.
  • the latter motion can be time-based.
  • the time in which the user should be moved from point a to point b is specified.
  • the amount of force on the device is adjusted in real-time to maintain the average velocity at a required value, given time and distance.
  • the bumpy road consists of two combined tactual fields, being a "road field” and a "texture field”, respectively.
  • the road guides the user along a predesignated course by providing force feedback walls. The user can also feel the width of the path. Unlike the conveyor belt, the device does not move on its own; rather, the user can passively follow a felt path.
  • the texture can be changed at any point along the path by superposing a texture field on the road field.
  • the texture field can be adjusted to provide "bump" sensations at a given intensity.
  • the control-pixel distance between "bumps” can also be specified in the x and y directions. By using more than one texture field on a path, the texture can change as the user moves along the road. Sounds associated with moving over bumps have also been used in synchronization with the felt bumps to enhance the experience.
  • a cyclone or swirling effect can be created to enhance the experience of falling into a "hole” .
  • the user can see and hear effects and feel the device (e.g., ball) enter a circling-like mode.
  • the effect can also be used to create vibrations using very tight swirls.
  • ConvF.F is the force vector, composed of two elements x and y, applied to the conveyor belt (force-based) object; f (user-variable) is the force magnitude, and alpha (user-variable) is the angle over which the conveyor is rotated.
  • kp, kd and ki are constants with typical values of 48, 16 and 0.01 respectively, and
  • errorP is the difference between the desired cursor position DP at a particular moment and the current cursor position CP
  • a is the cursor position at the moment the conveyor belt started
  • b is the position the cursor should be moved towards
  • total_time user variable
  • delta errorP = errorP - (previous value of errorP)
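  • The conveyor-belt position controller defined by the variables above can be sketched as a PID law with the stated typical gains (kp = 48, kd = 16, ki = 0.01) and errorP = DP - CP. The linear interpolation of the desired position from a to b over total_time, and the per-tick loop shape, are assumptions for illustration.

```python
# One-axis sketch; in the document the force is a two-element (x, y) vector,
# so this would be applied per axis.
KP, KD, KI = 48.0, 16.0, 0.01

def conveyor_force(a, b, total_time, t, cp, prev_error, integral):
    """Return (force, errorP, integral) for one control tick at time t."""
    # Desired cursor position DP moves linearly from a to b over total_time
    # (interpolation scheme assumed, not specified in the text).
    frac = min(t / total_time, 1.0)
    dp = a + (b - a) * frac
    error = dp - cp                  # errorP = DP - CP
    delta = error - prev_error       # delta errorP
    integral += error
    force = KP * error + KD * delta + KI * integral
    return force, error, integral
```

Adjusting the force each tick in this way keeps the average velocity near the value implied by the user-specified total_time, as the preceding bullets describe.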
  • road.F is the force vector, composed of two elements x and y, applied to the road object, F is the force magnitude, and alpha is the angle over which the road is rotated, and
  • texture.F is the force vector applied on the texture object, composed of two elements x and y, and F is the force magnitude
  • Gx is the granularity of the texture in the x direction and Gy is the granularity of the texture in the y direction.
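  • Superposing a texture field on a road field, as described for the bumpy road, might look as follows. The force laws (a restoring "wall" outside the path, a sinusoidal bump pattern with granularities Gx and Gy) are illustrative assumptions; only the field structure comes from the text.

```python
import math

def road_force(x, path_center_x, half_width, wall_stiffness=2.0):
    """Force-feedback wall: zero inside the path, restoring force outside,
    so the user can passively feel the width of the path."""
    offset = x - path_center_x
    if abs(offset) <= half_width:
        return 0.0
    return -wall_stiffness * (offset - math.copysign(half_width, offset))

def texture_force(x, y, intensity, gx, gy):
    """Periodic bump force; bumps repeat every gx control-pixels in x
    and every gy control-pixels in y (assumed sinusoidal profile)."""
    fx = intensity * math.sin(2 * math.pi * x / gx)
    fy = intensity * math.sin(2 * math.pi * y / gy)
    return fx, fy

def combined_force(x, y, path_center_x, half_width, intensity, gx, gy):
    """Bumpy road = road field + texture field, superposed."""
    tx, ty = texture_force(x, y, intensity, gx, gy)
    return road_force(x, path_center_x, half_width) + tx, ty
```

Swapping in a second texture_force with different intensity or granularity along a later stretch of the path changes the felt surface, as the bullets above note.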
  • F is the force vector, composed of two elements x and y, applied during the Swirl effect
  • C1 and C2 are constants representing the force magnitude in the x and y directions, respectively
  • f is the frequency of the swirl
  • t (0 ≤ t ≤ d) is the current time
  • d is the duration of the effect
  • m1 and m2 are constants (0 ≤ m1 ≤ 1, 0 ≤ m2 ≤ 1) representing the start and end points, respectively, of the slope function affecting the amplitude.
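  • Putting the swirl variables together: a circular force of frequency f whose x and y magnitudes are C1 and C2, shaped by a slope function with start and end points m1 and m2 (fractions of the duration d). The trapezoidal envelope below is an assumed shape; the text does not define the slope function exactly.

```python
import math

def swirl_force(t, d, f, c1, c2, m1=0.1, m2=0.9):
    """Return the (Fx, Fy) swirl force at time t, 0 <= t <= d."""
    # Assumed envelope: ramp up until m1*d, hold at 1, ramp down after m2*d.
    if t < m1 * d:
        env = t / (m1 * d)
    elif t > m2 * d:
        env = (d - t) / ((1.0 - m2) * d)
    else:
        env = 1.0
    phase = 2 * math.pi * f * t
    return c1 * env * math.sin(phase), c2 * env * math.cos(phase)
```

With a high frequency f and small C1, C2 (very tight swirls), the same formula degenerates into the vibration effect mentioned earlier.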

Abstract

A method for enabling a user of a data processing system to navigate through a virtual reality environment comprises generating a sequence of visual representations of the virtual reality environment on a visual display in response to the user manipulating a control device. Furthermore, the method provides, via a feedback mechanism appertaining to the manipulated control device, a preprogrammed dynamic tactile stimulus from an available database with tactile stimuli to the user. This is done under selective and combined control by an actual interval of the sequence of generated visual representations and, furthermore, by instantaneous mapping of a pointing direction of the control device onto a predetermined image subfield.

Description

A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method.
FIELD OF THE INVENTION
The invention relates to a method for enabling a user of a data processing system to navigate through a virtual reality environment, wherein the method comprises generating a sequence of visual representations of the virtual reality environment on a visual display in response to the user manipulating a control device. The invention also relates to a data processing system for executing the method. Further advantageous aspects are recited in dependent Claims.
BACKGROUND ART
For some time, film makers have recognized the fact that the impression of a visual representation is enhanced when combined with stimuli for other sensory perceptions such as sound. Certain films have been presented in an entourage wherein the audience is seated on a platform that is made to move in synchronism with actions displayed on the screen. For example, presenting a film of a roller coaster ride is accompanied by the platform being tilted and shaken so as to give an audience a sensation of genuinely being on the ride, or presenting a film about an earthquake uses infra-sound to physically shake the audience. The audience has however no influence over the video information being shown, nor over the tilting of the platform or over the shaking.
State of the art flight simulators use video information in combination with tilting of a platform accommodating a mock-up cockpit with a flight deck. Here, the person in the cockpit guides through the flight deck's controls both the movement of the platform and the video information being displayed. The chosen path of flight determines the video information shown, and the tilting angle determines the direction of gravity that in turn is felt as inertia by the person flying by the seat of his pants. It is clear that use, development and maintenance of these systems are very expensive and that they can be exploited commercially only for mass entertainment or for professional use.
US Patent 5,459,382 to Jacobus et al describes a method for providing tactile feedback in a virtual reality environment with six degrees of freedom. Dependent on the actual motion control effected by the user on an interface device, first a virtual reality force field generator is activated, which in turn causes a force signal to be generated back to the interface device. Through the definition of a conservative force field, various types of force feedback can be emulated for the user, allowing the user to exercise operations that later will have to be executed in the real world. However, the reference does not allow exercising with unexpected and, in particular, dynamic disturbances that would make real-life operating so much more complicated. Moreover, the reference relates principally to simulating the real world, rather than a fully virtual reality such as is proper to entertainment systems.
SUMMARY OF THE INVENTION
Therefore, amongst other things it is an object of the invention to provide a low-cost method for enabling a user to experience an enhanced virtual reality, in particular whilst having to cope with such unexpected disturbances. It is a further object to provide such a method in a multimedia environment. Now, according to one of its aspects, the method according to the invention is characterized by providing, via a feedback mechanism appertaining to the manipulated control device, a preprogrammed dynamic tactile stimulus from an available database with tactile stimuli, to the user under selective and combined control by an actual interval of the sequence of generated visual representations. The inventor has recognized that tactile stimuli, in particular dynamic stimuli, when added to visual, and possibly auditory, stimuli, can contribute significantly to the user's impression of being actually involved in a virtual reality scenery such as in video games. Video and/or graphics are used to create visual representations of a specific scenery from a variety of viewpoints. The visual representation is altered through the control device as if the user were moving through the scenery. The visual representation shown on the display is a mapping of the virtual scenery onto a plane selected through the control device. A tactile texture is joined to the video information. Similar to the visual representations, tactile representations are mappings of attributes, pre-assigned to events in the virtual reality scenery, onto the control device. When moving through the virtual environment shown on the display, the user feels through the control device a texture or other stimulus associated with a particular surface area or object in the virtual environment, so as to intensify the impression of actually being there.
Control devices that provide tactile feedback have been disclosed in European Patent Application 489 469, corresponding US Patent Application Serial No. 08/678,115 (PHN 13,522) and non-prepublished European Patent Application 95200599.9, corresponding US Patent Application Serial No. 08/615,559 (PHN 15,232), all assigned to the present assignee. These applications disclose a trackball and a computer mouse whose revolving members are manipulated by a user, but are also driven by electric motors under software control so as to provide positive and negative torques. These devices are used to provide tactile stimuli for guiding a cursor through a labyrinth shown on the display. The positive and negative torques are experienced by the user as positive and negative reaction forces as if the revolving member's movements were constrained to the fixed path shown. The tactile feedback is related to specific absolute coordinates of the cursor on the screen.
The tactile stimulus in the present invention is generated under control of the occurrence of at least a specific one of the visual representations of the virtual scenery, in combination with an actual pointing direction of the control device on a predetermined image subfield. The occurrence of a specific pixel pattern or texture, a particular color, or a particular level of greyness is accompanied by one or more particular tactile stimuli. This integration of visual and tactile is utilized to let the user associate a tactile stimulus with an object, an area or an event in the virtual reality scenery. Alternatively or supplementarily, the tactile stimulus in the invention is generated depending on a rate at which certain visual representations succeed each other in the sequence. This combination is employed to let the user associate the tactile stimulus with a speed at which the user moves through the virtual reality scenery. For example, a video recording is made of a real scenery from a plurality of positions and from a plurality of viewing angles. The video data are used to provide the user with particular ones of a variety of visual representations of a virtual reality on a display when moving through the scenery. That is, the displayed image as a whole changes when the user moves, about as if perceived through a moving car's windscreen. The user is enabled to navigate through this virtual reality by controlling, for example, a mouse, trackball or joystick provided with features that provide force feedback. Tactile textures are associated with the visual representation. The visual representation of grass is combined with a slight amount of drag felt through the mouse or trackball when moving across a lawn. The impression of a path of cobblestones is evoked by a sequence of alternating pulling and pushing forces that change more rapidly the quicker one moves through the virtual reality scenery. A representation of a tree is felt as an impenetrable object. 
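The cobblestone example, alternating pulling and pushing forces that succeed each other more rapidly as the user moves faster, could be generated along the following lines. This is an illustrative sketch only (names and pitch value are assumptions, not the patent's implementation); tying the force to distance travelled automatically makes the pulse rate scale with speed.

```python
import math

def cobblestone_force(distance: float, stone_pitch: float = 1.0,
                      amplitude: float = 1.0) -> float:
    """Alternating pulling (+) and pushing (-) force as a function of the
    distance travelled across the virtual cobblestones.  Because the force
    depends on distance rather than time, the pulses alternate more rapidly
    the quicker the user moves through the virtual reality scenery."""
    return amplitude * math.sin(2.0 * math.pi * distance / stone_pitch)
```

The same distance-driven pattern, with the amplitude raised and the pitch shortened, would serve for rougher surfaces such as brushwood.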
A lake is represented by a pit and a dynamic wave resistance pattern when moving. A resistance with directional characteristics gives the impression of going uphill or going in the teeth of a gale, etcetera. These tactile stimuli associated with objects or events in the virtual reality scenery are used, e.g., in a video game wherein the user has to ride a (virtual) motorbike cross-country through a virtual landscape by controlling one or more control devices of the kind specified above. The landscape has different types of surface regarding texture (hardness, flatness) and extent, such as mud, asphalt, grass, loose sand, cobblestones, brushwood, frozen underground, and all kinds of obstacles such as trees, stone walls, hills, rivers, ditches, flocks of sheep, turnpikes, etcetera. The relevant area of the landscape being displayed at a particular moment depends on the bike's actual direction and location with respect to the scenery. The surface of the area is tactilely represented by stimuli experienced by the user through the control device. The above tactile stimuli correspond to spatial characteristics, such as locations or orientations in the virtual scenery. One could also provide tactile stimuli with temporal characteristics or conditional tactile stimuli. In the example of the virtual motorbike, incidental gusts of wind, trying to throw the motorbike out of control, are represented by a short-lasting sudden sideways reaction force exerted by the control device on the user. A conditional tactile stimulus could be the occurrence of irregular and tiny shocks, such as would simulate tomatoes thrown by a (virtual) angry farmer when being passed too closely.
It is clear that tactile stimuli give a new dimension to virtual reality and considerably increase the user's sense of actually being involved. The synergetic combination of video, audio and tactile stimuli in a software application considerably extends the concept of multimedia applications. Not only video games but also (semi-)professional applications may benefit from tactile enhancement. For example, a computer model of a novel suspension for a motorcycle or a computer model of a novel steering geometry for a car may use the invention to provide a first impression of the behaviour of the suspension or steering in all kinds of terrain and at all kinds of speed.
BRIEF DESCRIPTION OF THE DRAWING
The invention is explained in further detail and by way of example with reference to the accompanying drawing wherein:
Figure 1 is a block diagram of a system in the invention;
Figures 2-4 illustrate various events in an example of a software application that combines video and tactile stimuli;
Figure 5 is a first geometry diagram of the invention;
Figure 6 is a second geometry diagram of the invention.
Throughout the drawing, same reference numerals indicate similar or corresponding features.
DETAILED EMBODIMENTS
Figure 1 is a diagram of a data processing system 100 according to the invention. System 100 comprises a display 102, a control device 104 and a data processing apparatus 106. Control device 104 comprises, for example, one or more of the devices disclosed in published European patent application 0 489 469. Manipulation of device 104 controls, via apparatus 106, an image or a sequence of images that are shown on display 102. Apparatus 106 supplies to device 104 control signals that generate stimuli for being tactilely sensed by the user. The signals have a predetermined relationship with the images actually shown on display 102.
Apparatus 106 supplies to display 102 video and/or graphics data that represents a virtual reality scenery. The visual representation of the scenery changes in response to manipulation of control device 104. Manipulation of device 104 gives the visual impression of travelling through the scenery. For example, successive images are interrelated through a perspective transformation or through a displacement transformation.
Apparatus 106 also supplies the control signals to evoke tactile feedback to the user through control device 104. The control signals are generated in synchronism with the movement through the virtual reality scenery or with the occurrence of some event in the virtual reality scenery, i.e., the control signals are related to a succession of images. Apparatus 106 therefore is operative to combine video or graphics data with tactile data, the latter determining the generation of the control signals. The parallel supply of tactile data and video data is pre-determined in the sense that the generation of a particular image or of a sequence of particular images is accompanied by a particular tactile feedback. This is realized, for example, much in the same way as video or graphics data are accompanied by audio data in conventional video games. The user of the system in the invention is enabled to select the video data interactively and the corresponding tactile data are then generated automatically. For example, a more rapid sequence of images then also brings about a more rapid sequence of the associated tactile stimuli.
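The pre-determined pairing of tactile data with the video stream might be demultiplexed along these lines. The record layout, names and table entries below are invented for illustration; a user-programmable look-up table in the spirit of element 112 of Figure 1 serves the same routing role.

```python
# Records in the merged stream are (kind, payload) tuples; "video" payloads
# go to the display pipeline, while "tactile" payloads index a look-up table
# translating them into control signals for the force-feedback device.
TACTILE_LUT = {            # user-programmable look-up table (hypothetical)
    "grass":       {"drag": 0.2},
    "cobblestone": {"pulse_amplitude": 0.8},
    "tree":        {"wall_stiffness": 5.0},
}

def demultiplex(stream):
    """Split a merged record stream into video frames and tactile control
    signals, so the video flow is not hampered by context switching."""
    video, tactile_signals = [], []
    for kind, payload in stream:
        if kind == "tactile":
            # tactile data are just keys selecting a stimulus from the LUT
            tactile_signals.append(TACTILE_LUT[payload])
        else:
            video.append(payload)
    return video, tactile_signals
```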
More specifically, system 100 comprises a first storage means 108 to store video data for generation of the visual representations under control of control device 104, and second storage means 110 to store tactile data for control of the generation of the tactile stimulus in a predetermined relationship with the video data. First and second storage means 108 and 110 are preferably physically integrated with one another, and the tactile data is preferably logically combined with the video data. In order to create smoothly moving pictures, the video data throughput must be correspondingly high, while the throughput of tactile data is lower than that of video data as tactile stimuli are typically low-frequency signals. The integration allows reading of the video data and the tactile data, which is logically combined with the video data, without substantially hampering the flow of video data by undesired, time-consuming context switching. One way to achieve this is, for example, merging tactile data with some video data records, identifying the tactile data as such upon reading, and supplying the tactile data to a look-up table 112. The tactile data are then just signals to control the generation of the tactile stimuli at control device 104. Preferably, look-up table 112 is user-programmable to enable selection or tuning of the tactile feedback. Preferably, storage means 108 and 110 are integrated with each other in a single information carrier 114 or cartridge such as a diskette or a CD. System 100 can then be used with different software applications by simply exchanging the carrier 114.

Figures 2-4 give some examples of snap-shots of a video game of the kind mentioned above, wherein the user is supposed to play the role of a police officer on a motorcycle 200. The user interacts with the game through control device 104 that controls speed and direction of motorcycle 200 through the virtual reality scenery.
Control device 104 comprises, for example, a first trackball or joystick for directional control and a second trackball or joystick for speed control, both the speed controller and the directional controller being provided with force-feedback means. The officer is to chase a vehicle 202 driven by poachers 204, 206, 208 and 210. Poachers 204-210 are desperately trying to stay beyond reach of the strong arm of the law and drive their truck 202 wildly through the scenery. The scenery changes as the hunt continues. End of the game is when, for example, either the poachers escape or the user has come so close that the truck's license plate 212 can be read, i.e., plate 212 is being displayed in legible characters. Preferably, the video representation of motorcycle 200 is made to lean with respect to the vertical and horizontal axes in the scenery in synchronism with a change of direction brought about via control device 104, to intensify the impression of the user riding motorcycle 200 himself.

In Figure 2 motorcycle 200 is trying to catch truck 202 speeding along a bumpy path 204 of loose sand. It is hard to ride a motorcycle along a predetermined line through loose sand as the rear wheel tends to start wobbling, trying to throw the motorcycle off course, and as steering requires a considerable torque to be applied to the handlebars when changing directions. This is made to be felt at control device 104 by system 100 exerting alternating transversal forces on the user's hand depending on, e.g., speed and leaning angle. The user therefore has to concentrate very hard in order not to lose control over motorcycle 200 and, at the same time, not to lose sight of truck 202.
In Figure 3 motorcycle 200 is ridden on a grey autumn day, in a drizzle at dusk (dinner time), along a road 300 with bumps and potholes that are made to be felt as sudden shocks at control device 104, and fallen leaves that tend to gather where the road makes a bend, and that are made to be felt as a sudden loss of resistance at control device 104.
In Figure 4 motorcycle 200 is chasing truck 202 along a street. Poacher 210 is trying to hit the officer by throwing objects at him, such as, say, tomatoes, spare wheels and other things that are carried in the trunk of a truck for this purpose. So, in addition to keeping motorcycle 200 on course, the user has to avoid projectiles 402, 404 and 406 that otherwise may hit the windscreen of the motorcycle and block the user's vision, or, worse, throw motorcycle 200 off the track altogether. Note that the tactile stimuli introduced under Figures 2 and 3 relate to a spatial characteristic: the virtual texture of some areas in the scenery. The tactile stimulus in Figure 4 is made conditional in that it depends on the occurrence of the situation wherein certain events coincide: both the projectile 400 and motorcycle 200 are to occupy the same space in the virtual reality scenery at the same time in order to generate the tactile stimuli. In addition to spatial and conditional tactile stimuli, temporal tactile stimuli may be introduced. For example, the front tyre of motorcycle 200 may sustain damage as a result of which it is losing pressure. Steering motorcycle 200 then becomes increasingly difficult, which is felt at control device 104 as wobbling reaction forces of increasing amplitude. Further, the pavement of the route through the virtual reality scenery can be made to feel increasingly slippery with time, e.g., through reduction of reaction forces on the user manipulating control device 104.
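A conditional tactile stimulus of the Figure 4 kind reduces to a coincidence test: the shock is generated only when projectile and motorcycle occupy (approximately) the same space at the same time. The following sketch uses hypothetical names and force values purely for illustration.

```python
def conditional_stimulus(projectile_pos, bike_pos, radius=1.0,
                         shock=(0.0, -3.0)):
    """Return a short, sudden reaction force only when the projectile and
    the motorcycle coincide within `radius` in the virtual scenery;
    otherwise no stimulus is generated at the control device."""
    dx = projectile_pos[0] - bike_pos[0]
    dy = projectile_pos[1] - bike_pos[1]
    if dx * dx + dy * dy <= radius * radius:
        return shock          # coincidence of events: the conditional shock
    return (0.0, 0.0)         # no coincidence: no tactile stimulus
```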
Variations on this theme and refinements are manifold. Software applications integrating visual and tactile stimuli extend well beyond the above good-guy/bad-guy scenario. As alternatives to the latter scenario, consider the following: tactile stimuli can be used in a video game dealing with how to negotiate obstacles on different types of terrain in a cross-country run, steeple chase or voyage of discovery; tactile stimuli can be used to simulate the water waves in a video game relating to canoeing in wild water or to the mooring of a vessel to an oil-rig in a storm; in a video game concerning a low-level flight of a helicopter over broken ground and among obstacles, tactile cues may be used to transmit the vibration that in reality is inherent in this type of aircraft.
Alternatively, the tactile data to accompany the video data may be calculated in real-time when moving through the virtual reality scenery. This approach can be used, for example, in a professional or semi-professional environment, when a computer model for a suspension system or a steering geometry of a vehicle is tested. Using the invention provides a first impression of the behaviour of the suspension in all kinds of terrain and at all kinds of speed. The developer is thus enabled to make a purposive selection of the parameter values describing the best-performing computer trials for real testing later on.
The above examples merely illustrate an enhanced form of virtual reality obtained by adding tactile stimuli to video and audio information. They are by no means intended to limit the scope of the present invention.
FURTHER IMPLEMENTATIONS OF THE TACTILE STIMULI
In creating and enhancing experiences within pseudo-worlds such as those which may appear in a video film, a number of tactual fields can be used in combination with auditory and visual images. For example, one could spatially navigate through linked video information, similar to moving through a virtual reality environment, and encounter various heard, seen and felt objects. Tactual representations for specific objects may extend beyond a representation of the object's physical surface plane. Dynamic fields can be created using tactual information to suggest a certain feeling or experience. They can also be used to guide the user in a suggested direction of movement. The fields may be displayed using force feedforward (active hand-independent device movement) and feedback through a device such as a 3-D trackball with force feedback. Furthermore, two or more dynamic tactual fields can be combined to create a new experience. Sound effects (e.g., from a video source) may also accompany tactual effects. Several examples of dynamic tactual fields are described below.
The "conveyor belt"
The conveyor belt acts like a walking platform that moves. When the user enters the range of the conveyor belt, the input device (e.g. trackball) begins to move and leads the user to a predesignated position. This movement can be based on using a constant force, which causes the input device to accelerate. Alternatively, the latter motion can be time-based. In the time-based mode, the time in which the user should be moved from point a to point b is specified. The amount of force on the device is adjusted in real-time to maintain the average velocity at a required value, given time and distance.

A "bumpy road"
The bumpy road consists of two combined tactual fields, being a "road field" and a "texture field", respectively. The road guides the user along a predesignated course by providing force feedback walls. The user can also feel the width of the path. Unlike the conveyor belt, the device does not move on its own; rather, the user can passively follow a felt path. The texture can be changed at any point along the path by superposing a texture field on the road field. The texture field can be adjusted to provide "bump" sensations at a given intensity. The control-pixel distance between "bumps" can also be specified in the x and y directions. By using more than one texture field on a path, the texture can change as the user moves along the road. Sounds associated with moving over bumps have also been used in synchronization with the felt bumps to enhance the experience.
A "swirling effect"
A cyclone or swirling effect can be created to enhance the experience of falling into a "hole". The user can see and hear effects and feel the device (e.g., ball) enter a circling-like mode. The effect can also be used to create vibrations using very tight swirls.
Conveyor belt - force based
ConvF.F.x = F * cos(alpha)
ConvF.F.y = F * sin(alpha)
Here ConvF.F is the force vector, composed of two elements x and y, applied on the conveyor belt (force based) object, F (user-variable) is the force magnitude, and alpha (user-variable) is the angle over which the conveyor is rotated.
Conveyor belt - time based
ConvT.F.x = F * cos(alpha)
ConvT.F.y = F * sin(alpha)
Here ConvT.F is the force vector, composed of two elements x and y, applied on the conveyor belt (time based) object, alpha (user-variable) is the angle over which the conveyor is rotated, and F = kp * errorP + kd * delta_errorP + ki * integral_errorP
Here kp, kd and ki are constants with typical values of 48, 16 and 0.01 respectively, and
errorP = length(DP - CP)
Here errorP is the difference between the desired cursor position DP at a particular moment and the current cursor position CP, and
DP = a + (elapsed_time / total_time) * (b - a)
Here a is the cursor position at the moment the conveyor belt started, b is the position the cursor should be moved towards, and total_time (user variable) is the amount of time the cursor movement should take, and
delta_errorP = errorP - (previous value of errorP)
Here integral_errorP = sum of all calculated errorP values so far, with errorP and integral_errorP held in the ranges [-25, 25] and [-5000, 5000], respectively.
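A runnable sketch of the time-based conveyor controller defined by the formulas above, using the typical gains kp = 48, kd = 16, ki = 0.01 from the text. For simplicity this is one-dimensional (the patent's version resolves F along the conveyor angle alpha) and takes the desired position DP to move linearly from a towards b, both of which are modelling assumptions here.

```python
class TimeBasedConveyor:
    """PID-style force controller that moves the cursor from a to b in
    total_time, adjusting the force in real time to keep the average
    velocity at the required value (1-D sketch of the 2-D original)."""
    KP, KD, KI = 48.0, 16.0, 0.01   # typical gain values from the text

    def __init__(self, a, b, total_time):
        self.a, self.b, self.total_time = a, b, total_time
        self.prev_error = 0.0
        self.integral = 0.0

    def force(self, elapsed_time, cursor_pos):
        # desired position DP interpolates linearly between a and b
        desired = self.a + (elapsed_time / self.total_time) * (self.b - self.a)
        # errorP = DP - CP, clamped to [-25, 25] as in the text
        error = max(-25.0, min(25.0, desired - cursor_pos))
        delta = error - self.prev_error
        # integral term clamped to [-5000, 5000]
        self.integral = max(-5000.0, min(5000.0, self.integral + error))
        self.prev_error = error
        return self.KP * error + self.KD * delta + self.KI * self.integral
```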
Road object
road.F.x = F * s * sin(alpha)
road.F.y = F * s * (-cos(alpha))
Here road.F is the force vector, composed of two elements x and y, applied to the road object, F is the force magnitude, and alpha is the angle over which the road is rotated, and
s = 1, if the cursor is in section A
s = 0, if the cursor is in section B
s = -1, if the cursor is in section C

Texture object
texture.F.x = -w.x * F * k
texture.F.y = -w.y * F * k
Here texture.F is the force vector applied on the texture object, composed of two elements x and y, and F is the force magnitude, and
w = unit vector of v
Here v = (current cursor position) - (cursor position the last time k was equal to 1), and
k = 1, if abs(v.x) > Gx or abs(v.y) > Gy
k = 0, otherwise.
Here, Gx is the granularity of the texture in the x direction and Gy is the granularity of the texture in the y direction.
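The road and texture fields above, and their superposition into the "bumpy road" of the earlier section, can be sketched as follows. This is a minimal illustration under the stated formulas; the section labels and function names are assumptions.

```python
import math

def road_force(section, F, alpha):
    """Road field: s = +1/0/-1 for sections A/B/C, so the cursor meets
    force-feedback walls in the border sections and no force on the path."""
    s = {"A": 1.0, "B": 0.0, "C": -1.0}[section]
    return (F * s * math.sin(alpha), F * s * -math.cos(alpha))

def texture_force(cursor, last_bump, Gx, Gy, F):
    """Texture field: a bump (k = 1) fires once the cursor has moved more
    than one grain (Gx, Gy) from the previous bump position; the force
    opposes the unit vector w of the displacement v, and the reference
    point is reset so the next bump fires one grain further on."""
    vx, vy = cursor[0] - last_bump[0], cursor[1] - last_bump[1]
    if abs(vx) > Gx or abs(vy) > Gy:
        n = math.hypot(vx, vy)                      # length of v
        return (-vx / n * F, -vy / n * F), cursor   # k = 1: bump, reset
    return (0.0, 0.0), last_bump                    # k = 0: no force

def bumpy_road(section, cursor, last_bump, Gx, Gy, F_road, F_tex, alpha):
    """Superposing the two fields yields the 'bumpy road'."""
    rx, ry = road_force(section, F_road, alpha)
    (tx, ty), ref = texture_force(cursor, last_bump, Gx, Gy, F_tex)
    return (rx + tx, ry + ty), ref
```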
Swirl
swirl.F.x = F * Cx * sin(2 * pi * f * t)
swirl.F.y = F * Cy * cos(2 * pi * f * t)
Here swirl.F is the force vector, composed of two elements x and y, applied during the swirl effect, Cx and Cy are constants representing the force magnitude in the x and y directions, respectively, f is the frequency of the swirl, t (0 ≤ t ≤ d) is the current time, d is the duration of the effect, and
F = m1 + (t/d) * (m2 - m1)
Here m1 and m2 are constants (0 ≤ m1 ≤ 1, 0 < m2 ≤ 1) representing the start and end points, respectively, of the slope function affecting the amplitude.
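The swirl formulas can be sketched as below, reading the sinusoid argument as 2π·f·t (frequency times current time, an interpretation of the garbled original) and ramping the amplitude linearly from m1 to m2 over the duration d.

```python
import math

def swirl_force(t, duration, freq, Cx, Cy, m1, m2):
    """Rotating force vector whose amplitude grows linearly from m1 to m2
    over the duration of the effect; very tight swirls (high freq) are
    felt through the device as vibration."""
    F = m1 + (t / duration) * (m2 - m1)          # linear amplitude slope
    return (F * Cx * math.sin(2 * math.pi * freq * t),
            F * Cy * math.cos(2 * math.pi * freq * t))
```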

Claims

CLAIMS:
1. A method for enabling a user of a data processing system to navigate through a virtual reality environment, wherein the method comprises generating a sequence of visual representations of the virtual reality environment on a visual display in response to the user manipulating a control device, characterized by providing via a feedback mechanism appertaining to the manipulated control device, a preprogrammed dynamic tactile stimulus from an available database with tactile stimuli, to the user under selective and combined control by an actual interval of the sequence of generated visual representations.
2. A method as claimed in Claim 1, furthermore comprising a step of instantaneous mapping of a pointing direction of the control device on a predetermined image subfield, and furthermore mapping said dynamic tactile stimulus on said subfield.
3. A method as claimed in Claims 1 or 2, wherein the dynamic tactile stimulus is selected from the database under control of at least one of the following: an occurrence of a specific visual representation of at least one specific object actually displayed, or a rate of change of the visual representation of such at least one specific object.
4. A method as claimed in Claims 1, 2, or 3, wherein said dynamic tactile stimulus emulates a physical interference with said user moving through said virtual reality environment.
5. A method as claimed in any of Claims 1 to 4, wherein said tactile stimulus emulates moving in a topographical or meteorological disturbance associated with said virtual reality environment.
6. A method as claimed in any of Claims 1 to 5, wherein said tactile stimulus emulates an impact with an external virtual object in said virtual reality environment.
7. A method as claimed in any of Claims 1 to 6, wherein encounter with said tactile stimulus is mapped on a subfield of the visual display in congruence with a particular visual area representation of said subfield.
8. A method as claimed in Claim 7, wherein said image subfield is displayed with a particular texture, colour, or greyness level that is correlated to said tactile stimulus provided.
9. A method as claimed in any of Claims 1 to 8, wherein a selection of said tactile stimulus is accompanied by associated sound.
10. A method as claimed in any of Claims 1 to 9, and allowing superposition of two or more individual tactile stimuli.
11. A data processing system for enabling a user to navigate through a virtual reality environment scenery and comprising a display, and furthermore a control device for user-interactive generation of a sequence of visual representations of the scenery on the display, characterized in that the control device is operative to provide through at least one feedback mechanism of the control device to the user a preprogrammed dynamic tactile stimulus from a database repertoire of tactile stimuli under selective and combined control of an actual interval of the sequence of generated visual representations, and furthermore instantaneous mapping of a pointing direction of the control device on a predetermined image subfield.
12. A system as claimed in Claim 11, comprising first storage means to store image data for the generation of the visual representations under control of the control device, and second storage means to store tactile data for control of the generation of the tactile stimulus in a predetermined relationship with the video data.
13. A system as claimed in Claim 12, wherein the first and second storage means are physically integrated with one another, and wherein the tactile data is logically combined with the image data.
14. A system as claimed in Claims 12 or 13, and comprising a look-up table for control of the generation of the tactile stimulus in response to the tactile data.
15. A system as claimed in Claim 14, wherein the look-up table is user-programmable.
16. An information carrier comprising the first and second storage means for use with the system as claimed in Claims 13, 14 or 15.
PCT/IB1996/001231 1995-11-24 1996-11-15 A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method WO1997019398A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP96935264A EP0806002A1 (en) 1995-11-24 1996-11-15 A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method
JP9519547A JPH10513593A (en) 1995-11-24 1996-11-15 Method for presenting virtual reality enhanced by tactile stimuli and system for performing the method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP95203234 1995-11-24
EP95203234.0 1995-11-24

Publications (1)

Publication Number Publication Date
WO1997019398A1 true WO1997019398A1 (en) 1997-05-29

Family

ID=8220862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB1996/001231 WO1997019398A1 (en) 1995-11-24 1996-11-15 A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method

Country Status (3)

Country Link
EP (1) EP0806002A1 (en)
JP (1) JPH10513593A (en)
WO (1) WO1997019398A1 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JO1406B1 (en) * 1984-11-02 1986-11-30 سميث كلاين اند فرينش لابوراتوريز ليمتد Chemical compounds
US7623114B2 (en) * 2001-10-09 2009-11-24 Immersion Corporation Haptic feedback sensations based on audio output from computer devices
US6703550B2 (en) * 2001-10-10 2004-03-09 Immersion Corporation Sound data output and manipulation using haptic feedback
WO2007029811A1 (en) * 2005-09-08 2007-03-15 Sega Corporation Game machine program, game machine, and recording medium storing game machine program
KR101244442B1 (en) * 2011-04-13 2013-03-18 한국기술교육대학교 산학협력단 Haptic Controller and Device Controlling System Using Thereof
WO2015121970A1 (en) * 2014-02-14 2015-08-20 富士通株式会社 Educational tactile device and system
CN110456973B (en) * 2019-07-16 2022-08-19 江苏铁锚玻璃股份有限公司 Touch-based transparent and opaque one-key switching method and device for intelligent vehicle window

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1992018925A1 (en) * 1991-04-20 1992-10-29 W. Industries Limited Haptic computer output device
US5255211A (en) * 1990-02-22 1993-10-19 Redmond Productions, Inc. Methods and apparatus for generating and processing synthetic and absolute real time environments
US5405152A (en) * 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback


Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999038064A3 (en) * 1998-01-23 1999-09-30 Koninkl Philips Electronics Nv Multiperson tactual virtual environment
WO1999038064A2 (en) * 1998-01-23 1999-07-29 Koninklijke Philips Electronics N.V. Multiperson tactual virtual environment
GB2337828A (en) * 1998-05-25 1999-12-01 Daewoo Electronics Co Ltd Virtual reality simulation apparatus
WO2000071217A1 (en) * 1999-05-21 2000-11-30 Michael Charles Cooke A feedback assembly for computer games
WO2007117649A2 (en) * 2006-04-06 2007-10-18 Immersion Corporation Systems and methods for enhanced haptic effects
WO2007117649A3 (en) * 2006-04-06 2008-09-12 Immersion Corp Systems and methods for enhanced haptic effects
US10152124B2 (en) 2006-04-06 2018-12-11 Immersion Corporation Systems and methods for enhanced haptic effects
EP3287874A1 (en) * 2006-04-06 2018-02-28 Immersion Corporation Systems and methods for enhanced haptic effects
US9468846B2 (en) 2009-01-30 2016-10-18 Performance Designed Products Llc Tactile feedback apparatus and method
CN105094798A (en) * 2014-05-20 2015-11-25 意美森公司 Haptic design authoring tool
US10191552B2 (en) 2014-05-20 2019-01-29 Immersion Corporation Haptic authoring tool using a haptification model
EP2947547A1 (en) * 2014-05-20 2015-11-25 Immersion Corporation Haptic design authoring tool
US9921653B2 (en) 2014-05-20 2018-03-20 Immersion Corporation Haptic authoring tool using a haptification model
US9330547B2 (en) 2014-05-20 2016-05-03 Immersion Corporation Haptic effect authoring tool based on a haptification model
US9911232B2 (en) 2015-02-27 2018-03-06 Microsoft Technology Licensing, Llc Molding and anchoring physically constrained virtual environments to real-world environments
US9898864B2 (en) 2015-05-28 2018-02-20 Microsoft Technology Licensing, Llc Shared tactile interaction and user safety in shared space multi-person immersive virtual reality
US9836117B2 (en) 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
EP3267290A1 (en) * 2016-07-08 2018-01-10 Thomson Licensing Method, apparatus and system for rendering haptic effects
EP3267288A1 (en) * 2016-07-08 2018-01-10 Thomson Licensing Method, apparatus and system for rendering haptic effects
US20180013640A1 (en) * 2016-07-08 2018-01-11 Thomson Licensing Method, apparatus and system for rendering haptic effects
CN107589830A (en) * 2016-07-08 2018-01-16 汤姆逊许可公司 For the methods, devices and systems of haptic effect to be presented
WO2018010823A1 (en) * 2016-07-15 2018-01-18 Irdeto B.V. Obtaining a user input
CN109416613A (en) * 2016-07-15 2019-03-01 爱迪德技术有限公司 Obtain user's input
US11113380B2 (en) 2016-07-15 2021-09-07 Irdeto B.V. Secure graphics
US11727102B2 (en) 2016-07-15 2023-08-15 Irdeto B.V. Obtaining a user input
CN106128323A (en) * 2016-09-06 2016-11-16 卓汎有限公司 A kind of vehicle window virtual reality display system
CN110045826A (en) * 2019-04-01 2019-07-23 北京小马智行科技有限公司 Virtual reality experience methods, devices and systems applied to vehicle

Also Published As

Publication number Publication date
JPH10513593A (en) 1998-12-22
EP0806002A1 (en) 1997-11-12

Similar Documents

Publication Publication Date Title
EP0806002A1 (en) A method for presenting virtual reality enhanced with tactile stimuli, and a system for executing the method
Hock et al. CarVR: Enabling in-car virtual reality entertainment
CN105807922A (en) Implementation method, device and system for virtual reality entertainment driving
US6426752B1 (en) Game device, method of processing and recording medium for the same
WO2018012395A1 (en) Simulation system, processing method, and information storage medium
JP6719308B2 (en) Simulation system and program
EP3082122A1 (en) Applied layout in virtual motion-acceleration spherical simulator
US20040110565A1 (en) Mobile electronic video game
KR102161646B1 (en) System and method for interworking virtual reality and indoor exercise machine
JPH07507402A (en) Driver training system with performance data feedback
EP0970414B1 (en) Multiperson tactual virtual environment
JP3273038B2 (en) Virtual experience type game device
CN110728878A (en) Somatosensory interactive VR driving simulation device
JP4114822B2 (en) Image generating apparatus and information storage medium
US7246103B2 (en) Probabilistic model of distraction for a virtual reality environment
JP3883224B2 (en) Image composition apparatus and image composition method
JPH06277362A (en) Three-dimensional game device
RU2149667C1 (en) Apparatus for exercising and competing, particularly, for performing sportive locomotions and games
JP3273017B2 (en) Image synthesis device and virtual experience device using the same
JP3770290B2 (en) Image processing device, amusement facility and vehicle for amusement facility
JP2002224434A (en) Image composing device, virtual experience device, and image composing method
JPH113437A (en) Image synthesizer and image synthesizing method
Lawlor Virtual reality vehicle simulator phase 1
JPH11134515A (en) Game device and game screen compositing method
JP3638669B2 (en) Image composition method and game device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

WWE Wipo information: entry into national phase

Ref document number: 1996935264

Country of ref document: EP

ENP Entry into the national phase

Ref country code: JP

Ref document number: 1997 519547

Kind code of ref document: A

Format of ref document f/p: F

121 Ep: the epo has been informed by wipo that ep was designated in this application

WWP Wipo information: published in national office

Ref document number: 1996935264

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1996935264

Country of ref document: EP