WO1999057896A1 - Environment simulation apparatus and method - Google Patents

Environment simulation apparatus and method Download PDF

Info

Publication number
WO1999057896A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
environment
producing
further including
location
Prior art date
Application number
PCT/CA1999/000411
Other languages
French (fr)
Inventor
Richard Ronald Alm
Original Assignee
Edti Exhibit Management Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Edti Exhibit Management Inc. filed Critical Edti Exhibit Management Inc.
Priority to AU38046/99A priority Critical patent/AU3804699A/en
Priority to CA002330635A priority patent/CA2330635A1/en
Publication of WO1999057896A1 publication Critical patent/WO1999057896A1/en

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/33Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
    • A63F13/338Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using television networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N21/8133Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/205Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/409Data transfer via television network
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/69Involving elements of the real world in the game world, e.g. measurement in live races, real video

Definitions

  • This invention relates to an environment simulation apparatus and method and more particularly to a system for recording or transmitting and reproducing at a second location, motion, video and audio sensory information sensed at a first location.
  • conventionally, the ability to experience conditions or an environment situated at a first location in space is available only to those who are able to be physically present at the first location.
  • Motion pictures and television have enabled experiencing some features of an environment, namely visual and audio features, at a second, remote location from the first location.
  • a method of generating signals representative of an environment includes the steps of:
  • an apparatus for generating signals representative of an environment includes at least one measurement device for producing at least one environment signal indicative of a property of the environment, a television camera for producing at least one television signal in response to visual and audio stimuli as observed from the first location, and an inserter for producing a composite signal including the environment signal and the television signal.
  • the apparatus includes a transmitter for transmitting the composite signal to a remote location.
  • the apparatus may include a recorder for recording the composite signal.
  • the apparatus includes a data acquisition system for producing a plurality of movement signals representing movement relative to a plurality of axes, such movement signals including acceleration signals representing acceleration along three mutually orthogonal axes and rotation signals representing rotation about three mutually orthogonal axes.
  • the data acquisition system produces a plurality of codes representing respective features of the environment, including a plurality of codes representing movement relative to respective axes and more particularly acceleration codes representing acceleration of a sensor along at least one axis and rotation codes representing rotation of the sensor about at least one axis.
  • the data acquisition system produces acceleration codes representing acceleration of the sensor along three mutually orthogonal axes respectively and rotation codes representing rotation of the sensor about the three mutually orthogonal axes respectively.
  • the apparatus includes a data acquisition system for producing vibration codes representing vibration at the first location.
  • the inserter encodes the television signal with the acceleration, rotation and vibration codes in a vertical blanking interval of the television signal.
  • the apparatus includes a position detection system for generating position data indicative of the geographical position of the first location and preferably, the inserter produces the composite signal such that it includes the position data.
  • the apparatus includes a data acquisition system for producing a vibration code representing vibration at the first location and preferably, the inserter produces the composite signal such that it includes the vibration code.
  • a method of simulating an environment includes the steps of:
  • an apparatus for simulating an environment includes a receiver for receiving a composite signal including an environment signal and a television signal representing the environment, a television data decoder for decomposing the composite signal into the television signal and the environment signal, transducers for producing images and sounds perceptible at a second location in response to the television signal and a transducer at the second location for controlling at least one feature of the environment in response to the environment signal.
  • the apparatus includes a processor for extracting position information from the environment signal and for generating a graphical image in response to the position information.
  • the processor combines the graphical image with a video portion of the television signal.
  • the apparatus includes a display for displaying images including the graphical image and an image represented by the video portion of the television signal.
  • the apparatus includes a processor for extracting motion data from the environment signal and a support operable to move in response to the motion data.
  • the apparatus includes a processor for extracting vibration data from the environment signal and a vibration transducer actuated in response to the vibration data.
  • the apparatus includes a processor for extracting position information from the environment signal, generating a graphical image in response to the position information, combining the graphical image with a video portion of the television signal, extracting motion data from the environment signal, extracting vibration data from the environment signal, and displaying images including the graphical image and an image represented by the video portion of the television signal while moving a support in response to the motion data and actuating a vibration transducer in response to the vibration data.
  • a method of simulating at a second location, at least one feature of an environment at a first location includes the steps of:
  • an apparatus for simulating at a second location, at least one feature of an environment at a first location.
  • the apparatus includes a signal generator for generating signals representative of the environment at the first location and an environment simulator at the second location.
  • the signal generator includes at least one measurement device for producing at least one environment signal indicative of a property of the environment, a television camera for producing at least one television signal in response to visual and audio stimuli as observed from the first location, and an inserter for producing a composite signal including the environment signal and the television signal.
  • the environment simulator includes a receiver for receiving the composite signal including the environment signal and the television signal representing the environment, a television data decoder for decomposing the composite signal into the television signal and the environment signal, transducers for producing images and sounds in response to the television signal and a transducer for controlling at least one feature of the environment in response to the environment signal.
  • Figure 1 is a block diagram of an apparatus for simulating at a second location at least one feature of an environment at a first location, according to a first embodiment of the invention;
  • Figure 2 is a perspective view of a motion capture subsystem, according to the first embodiment showing axes about which motion measurements are taken;
  • Figure 3 is a block diagram of a multimedia data encoder, according to the first embodiment of the invention.
  • Figure 4 is a flowchart of a packet routine run by a processor in the multimedia data encoder, according to the first embodiment of the invention;
  • Figure 5 is a schematic representation of a data packet produced by the multimedia data encoder, according to the first embodiment of the invention.
  • Figure 6 is a block diagram of a personal computer multimedia system according to the first embodiment of the invention.
  • Figure 7 is a flowchart of a channel data interrupt routine run by the personal computer, according to the first embodiment of the invention.
  • Figure 8 is a schematic representation of a motion platform according to the first embodiment of the invention.
  • Figure 9 is an apparatus for the simultaneous recording of motion, video and audio according to a second embodiment of the invention.
  • an apparatus for simulating at a second location, at least one feature of an environment at a first location includes an environment signal production system 10 and an environment simulator 12.
  • the environment signal production system 10 includes an environment data acquisition system 11 and a combiner 13 which cooperate to generate signals representative of an environment .
  • the environment data acquisition system 11 includes four subsystems, namely, a motion capture subsystem 18, an audio/video capture subsystem 20, a vibration capture subsystem 22, and a position profile subsystem 24.
  • the environment data acquisition system 11 is a portable remote system designed to be placed in an existing system (not shown), the environment of which is to be measured and simulated. Such environment may be that of a cockpit of an aircraft, for example.
  • the motion capture subsystem 18 includes an observer station 26 and a motion sensor 28.
  • the observer station 26 is a chair-like structure having a seat 30 and a seat back 32 in which an individual observer in the actual environment being sensed may be seated.
  • the dimensions and structure of the observer station 26 will be determined by the system in which the environment signal production system 10 is placed, such as a pilot's cockpit or race car seat.
  • the chair-like structure forms part of the system in which the environment signal production system 10 is placed, and is a pilot's seat in an aircraft.
  • the motion sensor 28 is affixed to the bottom of the observer station 26.
  • the motion sensor 28 is a MotionPak (Trademark) sensor manufactured by the Systron Donner company of California.
  • the motion sensor 28 includes an inertial sensor cluster with six degrees of freedom, which measures linear acceleration along three mutually perpendicular axes x, y and z, and which measures angular rate of rotation about these same axes.
  • the motion sensor is connected to the bottom of the observer station such that the positive x axis is defined to lie in a direction parallel to the plane of the seat 30 and perpendicular to and away from the line of intersection of the seat 30 and the seat back 32.
  • the positive x, y and z axes are defined to constitute a "right-hand" coordinate system such that the relation between the axes may be represented respectively by the outstretched thumb, index and middle fingers of the human right hand.
  • the positive y axis lies in a direction perpendicular to the positive x axis, rotated clockwise from it when viewed from above, and parallel to the plane of the seat 30.
  • the positive z axis extends in a direction perpendicular to both the positive x axis and the positive y axis and extends downwards from the plane of the seat 30.
  • the three positive axes have the directions shown in Figure 2.
  • the use of the negative sense in respect of any or all of these axes is taken to mean the direction 180 degrees opposite to the direction of the corresponding positive axis.
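The "right-hand" convention described above can be checked numerically: in a right-handed coordinate system, the cross product of the positive x and y unit vectors yields the positive z unit vector. A minimal sketch in plain Python:

```python
# Numerical check of the right-hand coordinate convention: the cross
# product of the positive x and y unit vectors must equal the positive
# z unit vector.

def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

x_axis, y_axis = (1, 0, 0), (0, 1, 0)
print(cross(x_axis, y_axis))  # (0, 0, 1): the positive z axis
```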
  • Acceleration is measured along each axis x, y and z and angular rate of rotation about each axis is measured in planes perpendicular to the x, y and z axes.
  • Positive rotation is measured in a clockwise direction in each plane when viewed from the origin of the three axes along the corresponding positive axis as shown in Figure 2.
  • the use of the negative sense in respect of any or all of these rotational vectors is taken to mean the rotational sense opposite to the sense of the corresponding positive vector.
  • in response to linear acceleration and angular rate of rotation measurements, the motion sensor 28 generates analog voltage signals representing linear acceleration along, and angular rate of rotation about, each of the axes x, y and z respectively.
  • the amplitude of each signal generated in response to linear acceleration varies in the range between -7.5VDC and +7.5VDC, according to the instantaneous linear acceleration measured along the associated axis, in linear proportion to a measurable range of linear accelerations from -2.0g to +2.0g, where g represents the acceleration due to gravity at the earth's surface at sea level and is equal to 9.81 m/s².
  • a negative value of acceleration represents an acceleration in a direction opposite to the positive axis, or along the negative axis.
  • each signal generated in response to angular rate of rotation varies in the range of -2.5VDC to +2.5VDC, according to the instantaneous angular rate of rotation about respective axes x, y and z, in linear proportion to a measurable range of angular velocities from
  • a negative value of rate of rotation represents angular rate of rotation in a rotational sense counterclockwise when viewed from the origin.
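The voltage-to-units mappings above can be sketched as follows. The acceleration mapping (-7.5 to +7.5 VDC in linear proportion to -2.0g to +2.0g) is stated in the text; the rate channels span -2.5 to +2.5 VDC but their full-scale angular velocity is truncated in the source, so the value used below is a placeholder assumption, not a figure from the document:

```python
# Sketch: converting the motion sensor's analog voltages to physical
# units. Acceleration: -7.5..+7.5 V maps linearly to -2.0..+2.0 g
# (stated in the text). Rate: -2.5..+2.5 V; the full-scale angular
# velocity is truncated in the source, so MAX_RATE_DPS is an assumption.

G = 9.81               # m/s^2, gravitational acceleration at sea level
MAX_RATE_DPS = 100.0   # assumed full-scale angular rate, degrees/second

def volts_to_acceleration(v):
    """Map an acceleration-channel voltage (+/-7.5 V) to m/s^2."""
    return (v / 7.5) * 2.0 * G   # linear proportion: +/-7.5 V = +/-2 g

def volts_to_rate(v):
    """Map a rate-channel voltage (+/-2.5 V) to deg/s (assumed scale)."""
    return (v / 2.5) * MAX_RATE_DPS

print(volts_to_acceleration(3.75))  # 9.81 (i.e. +1.0 g)
```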
  • the sensor has x, y and z linear acceleration outputs 34, 36, and 38 respectively which continuously provide analog voltage signals corresponding to the instantaneous linear acceleration measured by the motion sensor 28 along the positive x, y and z axes.
  • the sensor thus acts as a data acquisition system or means for producing acceleration signals representing acceleration along three mutually orthogonal axes.
  • the sensor also has x, y and z rate outputs 40, 42, and 44 respectively which continuously provide analog voltage signals corresponding to the instantaneous angular velocity measured by the motion sensor 28 about the x, y and z axes respectively.
  • the sensor thus acts as a data acquisition system or means for producing rotation signals representing rotation about three mutually orthogonal axes.
  • the linear acceleration outputs 34, 36 and 38 and the rate outputs 40, 42 and 44 are shown generally at 45 and are connected to the combiner 13.
  • the motion sensor 28 acts as a data acquisition system or means for producing a plurality of movement signals representing movement relative to a plurality of axes. Such movement signals are indicative of properties of the sensed environment, the properties being acceleration and rotation of the environment in space. Hence it may be said that the sensor acts as a measurement device or means for producing at least one environment signal indicative of a property of the environment.
  • the audio/video capture subsystem 20 includes a stabilizing platform 46, a video camera 48 and a microphone 52.
  • the stabilizing platform 46 is physically positioned in proximity to the observer station 26 and the video camera 48 is mounted on the stabilizing platform 46.
  • the locations of the stabilizing platform 46 and video camera 48 are selected to maximize the quality of the video images produced by the video camera 48 and to capture as faithfully as possible images that would be detected by an individual seated at the observer station 26.
  • the stabilizing platform 46 includes a Glidecam (R) system which dampens the movement to which the video camera 48 is subjected, to reduce jitter in video signals produced by the video camera 48.
  • the microphone 52 measures audio disturbances in the audio range and is physically positioned in proximity to the observer station 26 in order to detect audio disturbances audible at the observer station 26.
  • the location of the microphone 52 is selected to maximize the quality of the audio signals detected by the microphone 52 and to represent, as faithfully as possible, audio disturbances that would be detected by an individual seated at the observer station 26.
  • the microphone 52 converts audio disturbances into electrical signals which it transmits on a microphone cable 56 to the video camera 48.
  • the video camera 48 represents visual images seen by an observer seated at the observer station 26 with conventional composite video signals according to a known video format which, in this embodiment, is that specified by the National Television System Committee (NTSC). Such video signals are transmitted along a video cable 50 to the combiner 13.
  • the video camera acts as a television camera or means for producing at least one television signal in response to visual and audio stimuli as observed from the first location.
  • the vibration capture subsystem 22 includes a low frequency microphone 54.
  • the low frequency microphone 54 is physically positioned in proximity to the observer station 26 in order to detect low frequency vibrations in a frequency range of 10Hz to 30Hz, as detected by an observer at the observer station 26.
  • the location of the low frequency microphone 54 is selected to maximize the quality of low frequency signals detected by the low frequency microphone 54 and to represent, as faithfully as possible, low frequency disturbances that would be detected by an individual seated at the observer station 26.
  • the low frequency microphone 54 converts low frequency audio disturbances into electrical signals which it transmits along a vibration signal line 58 to the combiner 13.
  • the low frequency microphone 54 thus acts as vibration signal producing means for producing signals representing vibration experienced in the environment.
  • since vibration is a property of the environment being sensed, it may be said that the low frequency microphone also acts as a measurement device or means for producing at least one environment signal indicative of a property of the environment, where vibration is the property being sensed.
  • the position profile subsystem 24 includes an antenna 60, and a global positioning system (GPS) processor 62.
  • the antenna 60 receives GPS signals from a satellite network (not shown) established to compile and broadcast such information.
  • the antenna is a Marine IV (Trademark) antenna manufactured by Ashtec Inc. of Sunnyvale, California.
  • the antenna 60 is connected to the GPS processor 62 which decodes the signals received from the satellite network to produce a serial GPS data message in accordance with the National Marine Electronics Association (NMEA) recommendation 0183 at an output 64.
  • the output 64 is connected to the combiner 13.
  • the GPS processor 62 is a G12 GPS board manufactured by Ashtec Inc. of Sunnyvale, California.
  • the antenna 60 and GPS processor 62 thus act as a position detection system or means for generating position data indicative of the geographical position of the first location.
  • the first location is in an aircraft and therefore, as the aircraft moves, longitude and latitude positions of the aircraft are represented by said position data.
  • the signal produced by the GPS system acts as an environment signal indicative of a property of the environment, where such property is the geographical position of the environment in space.
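As a hedged illustration of the NMEA 0183 serial message mentioned above, the sketch below extracts latitude and longitude from a standard GGA sentence; the field layout is standard NMEA 0183, and the sample sentence is illustrative rather than taken from the document:

```python
# Hedged sketch: pulling latitude and longitude out of an NMEA 0183
# $GPGGA sentence such as a GPS receiver's serial output. Coordinates
# arrive as ddmm.mmmm / dddmm.mmmm with N/S and E/W hemisphere flags.

def parse_gga_lat_lon(sentence):
    """Return (lat, lon) in signed decimal degrees from a GGA sentence."""
    fields = sentence.split(',')
    lat_raw, lat_hem = fields[2], fields[3]   # ddmm.mmmm and N/S
    lon_raw, lon_hem = fields[4], fields[5]   # dddmm.mmmm and E/W

    def to_degrees(raw, degree_digits):
        # whole degrees, then decimal minutes converted to degrees
        return float(raw[:degree_digits]) + float(raw[degree_digits:]) / 60.0

    lat = to_degrees(lat_raw, 2) * (1 if lat_hem == 'N' else -1)
    lon = to_degrees(lon_raw, 3) * (1 if lon_hem == 'E' else -1)
    return lat, lon

# Illustrative sentence, not from the document:
sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(parse_gga_lat_lon(sample))
```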
  • the combiner 13 includes a multimedia data encoder 70, a TV data encoder 72 and a television transmitter 74.
  • the multimedia data encoder is shown generally at 70 and includes a scaling and offset circuit 76, a multiplexer (MUX) 78, an analog to digital converter 80, an I/O circuit 82, a microprocessor 84, an RS-232-C interface circuit 86, random access memory (RAM) 88 and read only memory (ROM) 90.
  • the scaling and offset circuit 76 has a plurality of inputs A, B, C, D, E, F, G for receiving the signals representing instantaneous linear acceleration and instantaneous angular velocity provided by the linear acceleration outputs and the rate outputs, and for receiving the vibration signal from the low frequency microphone 54. Essentially, the scaling and offset circuit scales the signals received at the inputs to 0-5 volts. The scaling and offset circuit thus has a plurality of outputs 92 which provide scaled and offset signals representing scaled versions of the motion signals and the low frequency microphone signal respectively.
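The scale-and-offset step above can be sketched as a simple linear mapping of each bipolar input onto the 0-5 volt window; the full-scale values follow the sensor ranges given earlier (+/-7.5 V acceleration, +/-2.5 V rate), and applying the same linear mapping to every channel is an assumption:

```python
# Sketch of the scaling-and-offset stage: shift a bipolar signal so its
# minimum sits at 0 V, then scale the result into the 0-5 V window the
# multiplexer/ADC expects.

def scale_and_offset(v_in, full_scale):
    """Map a voltage in [-full_scale, +full_scale] onto [0, 5] volts."""
    return (v_in + full_scale) * (5.0 / (2.0 * full_scale))

# On an acceleration channel: -7.5 V -> 0 V, 0 V -> 2.5 V, +7.5 V -> 5 V.
print(round(scale_and_offset(0.0, 7.5), 6))  # 2.5
```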
  • the multiplexer has a plurality of inputs 94 which are connected to the outputs 92 of the scaling and offset circuit to receive the scaled and offset signals.
  • the multiplexer also has a control input 96 which is connected to an output 99 of the I/O circuit 82 for receiving signals for directing the MUX to select an input 94.
  • the MUX further has an output 98 which is connected to the analog to digital converter 80.
  • the I/O circuit has an output 100 which is connected to the analog to digital converter 80 to control the analog to digital converter 80 and further has an input 102 for receiving 16 bit digital signals produced by the analog to digital converter 80.
  • the I/O circuit further includes a universal asynchronous receiver/transmitter 104 which is connected to the RS-232-C interface 86.
  • the I/O circuit 82 also has a 16 bit data bus output 106 for communication with the microprocessor 84.
  • the ROM 90 is programmed with codes which direct the microprocessor 84 to execute a packet routine 108.
  • the packet routine establishes buffers within the RAM 88, including a header buffer 110, a length buffer 112, an accelerationX buffer 114, an accelerationY buffer 116, an accelerationZ buffer 118, a rateX buffer 120, a rateY buffer 122, a rateZ buffer 124, a low frequency buffer 126, and a checksum buffer 128. Each of these buffers has a 2 byte length.
  • the packet routine is shown generally at 108 and includes a first block of codes 130 which direct the microprocessor 84 to select one of the channels A-G.
  • After selecting a channel, block 132 directs the microprocessor 84 to write to the I/O circuit 82 to cause the output 100 to produce a signal to direct the analog to digital converter 80 to begin conversion and to present, at the input 102 of the I/O circuit, a 16 bit digital value representing the instantaneous value of the signal appearing at the selected input 94.
  • When channels A, B or C are selected, the acceleration signals are applied to the A/D converter, which produces acceleration codes representing acceleration of the sensor 28 along the three mutually orthogonal axes respectively.
  • the A/D converter thus acts as a data acquisition system or means for producing acceleration codes representing acceleration of the sensor along three mutually orthogonal axes respectively.
  • the rotation signals are applied to the A/D converter, which produces rotation codes representing rotation of the sensor 28 about the three mutually orthogonal axes respectively.
  • the A/D converter thus further acts as a data acquisition system or means for producing rotation codes representing rotation of the sensor about the three mutually orthogonal axes respectively.
  • the acceleration codes and rotation codes may be generally referred to as movement codes.
  • the A/D converter thus acts as a data acquisition system or means for producing a plurality of codes representing movement relative to respective axes.
  • the vibration signal is applied to the A/D converter which produces vibration codes representing vibration of the observer's station at the first location.
  • the A/D converter thus further acts as a data acquisition system or means for producing a vibration code representing vibration at the first location.
  • the A/D converter acts as a data acquisition system or means for producing a plurality of codes representing respective features of the environment.
  • Block 134 then directs the microprocessor 84 to read the I/O circuit output 106 to obtain the code or digital value produced by the analog to digital converter 80 and block 136 directs the microprocessor 84 to store the code so obtained in the appropriate buffer 114-126 corresponding to the input 94 selected at block 130.
  • Block 138 then directs the microprocessor 84 to repeat the above process for each input 94 until signals from each channel A-G have been read and corresponding codes have been stored in corresponding buffers 114-126 in the RAM 88.
  • Block 140 then directs the microprocessor 84 to store a header in the header buffer 110, the header being a predefined value used to identify that a data packet produced by the system contains motion data.
  • Block 142 then directs the microprocessor 84 to scan the contents of the buffers 110-126 to calculate and store in the length buffer 112, a number indicating the total number of bytes stored in buffers 110-126.
  • Block 144 then directs the microprocessor 84 to use the contents of buffers 110-126 to calculate a checksum value, as is common in the art, and to store such checksum value in the checksum buffer 128.
  • Block 146 then directs the microprocessor 84 to write to the I/O circuit 82 and specifically to the UART 104, the contents of the buffers 110-128.
  • the UART 104 produces an environment signal which is a serial bitstream including a data packet as shown generally at 150 in Figure 5 including a header field 152, a length field 154, an accelerationX field 156, an accelerationY field 158, an accelerationZ field 160, a rateX field 162, a rateY field 164, a rateZ field 166, a vibration field 168 and a checksum field 170.
  • Block 146 of the packet routine directs the microprocessor 84 to control the UART 104 such that the contents of fields 152 through 170 contain the contents of the buffers 110-128 respectively.
  • the baud rate of the data packet 150 so produced is as high as possible; in this particular embodiment, however, a limit of approximately 115.2 kbps is imposed by limitations of other equipment.
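The packet assembly performed by blocks 140 through 146 can be sketched as follows. The header constant and the checksum rule (a 16-bit sum of the preceding bytes) are assumptions for illustration; the text above identifies the checksum only as one "common in the art".

```python
import struct

MOTION_HEADER = 0xA55A  # assumed 2-byte header value identifying a motion packet

def build_motion_packet(samples):
    """Assemble a motion data packet: header, length, seven 16-bit fields, checksum.

    `samples` holds the A/D codes for accelX/Y/Z, rateX/Y/Z and vibration,
    mirroring buffers 114-126.
    """
    assert len(samples) == 7
    body = struct.pack('<HH', MOTION_HEADER, 0) + struct.pack('<7H', *samples)
    # The length field counts the total bytes of buffers 110-126 (header..payload).
    body = body[:2] + struct.pack('<H', len(body)) + body[4:]
    # Assumed checksum: 16-bit sum of all preceding bytes.
    checksum = sum(body) & 0xFFFF
    return body + struct.pack('<H', checksum)

pkt = build_motion_packet([100, 200, 300, 400, 500, 600, 700])
print(len(pkt))  # header 2 + length 2 + payload 14 + checksum 2 = 20 bytes
```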
  • the multimedia data encoder 70 receives analog signals from the motion sensor and from the low frequency microphone and produces an environment signal which is a serial bitstream in compliance with the RS-232-C specification, having a protocol including fields for carrying payload data representing the instantaneous values of linear acceleration, rotation and vibration experienced by the observer in the sensed environment.
  • the environment data acquisition system 11 thus produces an environment signal, i.e. the serial bitstream, indicative of a plurality of properties of the environment.
  • the properties include linear acceleration, rotation and vibration.
  • the TV data encoder 72 includes a TES5 digital inserter manufactured by the Norpak Corporation of Kanata, Ontario. This device is a highly flexible digital video platform which carries out encoding, insertion, reception, bridging and multiplexing of data in the vertical blanking interval (VBI) and active video of any component serial digital television signal.
  • the Norpak unit has a plurality of inputs and has a single output, only some of the inputs being used in this embodiment.
  • the inputs used include a serial digital program video input 172 for receiving the serial digital program video composite signal produced by the camera 48.
  • the inputs used in this embodiment further include a serial port A input 174 and a serial port B input 176.
  • the TV data encoder further includes a serial digital program video output 178.
  • the serial port A input 174 is connected to the serial port output 87 of the multimedia data encoder and is therefore operable to receive the data packet 150 representing motion and low frequency information from the data encoder 70.
  • the serial port B input 176 is connected to the output 64 of the GPS processor 62 and is thereby operable to receive GPS position data messages from the GPS receiver, in a serial data format.
  • the output 178 of the inserter produces a conventional composite video signal in which data received at serial ports A and B, 174 and 176 respectively, are encoded on any combination of lines 10 to 25 (7 to 22 in 625-line systems) in any video format including NTSC, PAL, SECAM, 525 and 625 line. Data can be encoded and transmitted in this portion of the composite television signal at data rates up to 115.2 kbps.
  • the Norpak unit provides built-in forward error correction.
  • the TV data encoder 72 thus has an output 178 which provides a composite video signal on which motion and low frequency data and GPS data are encoded on a portion of the video signal.
  • the TV data encoder 72 thus acts as an inserter, or means for producing a composite signal including the environment signal and the television signal.
  • the inserter encodes the television signal produced by the camera 48 with the acceleration, rotation and vibration codes and with the position data.
  • the composite video signal produced by the TV data encoder 72 includes the acceleration, rotation and vibration codes and position data.
  • the TV data inserter thus acts as means for encoding the acceleration, rotation and vibration codes in a vertical blanking interval of the television signal and as means for producing the composite video signal such that it includes the position data and as means for producing the composite video signal such that it includes the vibration code.
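As a rough plausibility check on the VBI encoding capacity described above (a back-of-the-envelope sketch only; the bytes-per-line figure is an assumed NABTS-style value, not taken from the text), the available lines comfortably exceed the 115.2 kbps serial limit:

```python
# Illustrative VBI throughput estimate (assumed figures, not from the patent).
FIELDS_PER_SECOND = 60        # NTSC field rate, approximately
LINES_AVAILABLE = 16          # lines 10 to 25 of each field
PAYLOAD_BYTES_PER_LINE = 33   # assumed NABTS-style payload per VBI line

bps = FIELDS_PER_SECOND * LINES_AVAILABLE * PAYLOAD_BYTES_PER_LINE * 8
print(bps)            # 253440 bits/s
print(bps >= 115200)  # True: above the 115.2 kbps serial-port limit
```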
  • the composite video signal is received at an input 180 of the television transmitter 74.
  • the television transmitter 74 is conventional and transmits the composite video signal, including encoded motion and low frequency data, and GPS data for reception by remotely located television receivers, using conventional radio waves or using cable systems.
  • the television transmitter thus acts as means for transmitting the composite signal to a remote location.
  • the environment signal production system acts as a signal generator or means for generating signals representative of the environment.
  • the environment simulator is shown generally at 12.
  • the environment simulator includes a TV receiver 182, a PC multimedia system 184, a display 186, a loudspeaker 188, a motion platform shown generally at 190, and a vibrator shown generally at 192.
  • the environment simulator is at a second location, remote from the first location, such as an amusement park.
  • the TV receiver 182 receives radio frequency signals containing the composite video signal including encoded motion and low frequency data, from the TV transmitter 74.
  • the TV receiver demodulates the radio frequency signals to provide a baseband composite video signal at an output 194 thereof.
  • the TV receiver thus acts as means for receiving a composite signal including an environment signal and a television signal representing the environment.
  • the PC multimedia system is shown generally at 184 and includes a personal computer having a high speed microprocessor 196, temporary memory 198, permanent memory 199 and a personal computer bus 200.
  • Connected to the personal computer bus 200 are a TV data decoder card 202, a video display adapter card 204, an audio card 206 and a serial interface card 208.
  • the microprocessor 196 is an Intel Pentium(R) 300 MHz processor.
  • the TV data decoder card 202 is a
  • the TV data decoder card 202 has a composite video input 210, a composite video output 212 and a PC bus interface 214.
  • the composite video input 210 is connected to receive the baseband composite video signal, complete with environmental data, from the output 194 of the TV receiver 182 shown in Figure 1.
  • the TV data decoder card 202 extracts the environmental data encoded on the vertical blanking interval of the composite video signal and provides a pure composite video signal without environmental data, at the composite video output 212.
  • Environmental data extracted from the VBI of the composite video signal received at the input 210 is presented to the PC bus in such a manner that the microprocessor 196 is interrupted when a data packet is available.
  • the PC bus interface 214 provides to the microprocessor 196, data decoded from the composite video signal and an interrupt vector indicating the channel A or B on which the data was received.
  • the TV data decoder acts as means for decomposing the composite signal into the television signal and the environment signal.
  • the video display adapter card 204 is, in this embodiment, an ATI-TV ISA bus card manufactured by ATI of California. This bus card has a PC bus interface 216, a video input 218, a video port output 220 and an audio output 222.
  • the composite video output 212 of the TV data decoder card 202 is connected to the video input 218 of the video display adapter card 204 to receive the composite video signal.
  • the video display adapter card has video buffer memory (not shown) for digitizing video signals received at the video input 218 and has display memory (not shown) for storing digitized video signals for driving the display 186 shown in Figure 1 through the video port 220.
  • the video display adapter card includes an audio decoder (not shown) which decodes audio from the composite video signal appearing at the video input 218 and presents pure audio signals at the audio output 222 for amplification by an audio amplifier (not shown) to drive the loudspeaker 188 shown in Figure 1.
  • the video display adapter card, display, audio amplifier and loudspeaker thus act as means for producing images and sounds perceptible at a second location in response to the television signal.
  • the video display adapter card 204 further includes a processor (not shown) for transferring the contents of the video buffer memory (not shown) to the video display memory (not shown) and for receiving data from the PC bus interface 216 as provided by the microprocessor 196, and for writing such data into the video display buffer, along with the contents of the video input buffer, to cause graphical images to overlay video images represented by the pure composite video signal appearing at the video input 218.
  • the video display adapter 204 also has control registers (not shown), which are loaded by the microprocessor 196 through the PC bus interface 216, to allow the microprocessor 196 to control positioning and sizing of video and graphical images represented by the contents of the video display buffer.
  • the audio card 206 is a standard audio card having a PC bus interface 224 and an audio output 226.
  • the effective component in the audio card is a digital to analog converter (not shown), which receives input data from the microprocessor 196 through the PC bus interface 224 and provides analog signals in response thereto, at the audio output 226.
  • the audio output 226 is connected to an audio amplifier (not shown) for driving the vibrator 192 shown in Figure 1.
  • the serial interface card 208 is a standard serial interface found on any personal computer and has a bus interface 228 for receiving data from the microprocessor 196 and has an output 230 for representing such data in an RS-232-C format.
  • the output 230 is connected to the motion platform shown generally at 190 in Figure 1.
  • the permanent memory 199 includes codes readable by the microprocessor 196, for directing the processor to execute a channel data interrupt routine 232.
  • the channel data interrupt routine 232 directs the microprocessor 196 to establish in the temporary memory 198, a full packet buffer 236, a motion packet buffer 238, a vibration buffer 240, a GPS data receive buffer 242 and a graphics data buffer 244.
  • the channel data interrupt routine is shown generally at 232 and begins upon receipt of a channel data interrupt from the TV data decoder card 202 at the PC bus 200.
  • block 246 directs the microprocessor to read an interrupt vector provided by the TV data decoder PC bus interface 214 to determine whether the instant data packet is associated with channel A or channel B. If associated with channel A, the data packet relates to motion data and vibration data and block 248 directs the microprocessor to store the packet in the full packet buffer 236.
  • the full packet buffer has a plurality of two byte fields for storing the contents of respective fields 152 through 170 of the data packet 150 shown in Figure 5.
  • block 250 directs the microprocessor 196 to load the motion packet buffer 238 with the contents of the full packet buffer 236, with the exception of the contents of the very low frequency field and the checksum field.
  • the microprocessor thus acts as means for extracting motion data from the environment signal.
  • Block 252 then directs the microprocessor 196 to transmit the contents of the motion packet buffer 238 to the serial interface card 208, which provides a motion data serial bitstream similar to that shown in Figure 5, without the vibration field 168 and the checksum field 170, to the motion platform 190 shown in Figure 1.
  • Block 254 then directs the microprocessor 196 to retrieve the contents of the vibration field 168 from the full packet buffer 236 and store such contents in the vibration buffer 240.
  • the microprocessor thus acts as means for extracting vibration data from the environment signal.
  • Block 256 then directs the microprocessor 196 to write the contents of the vibration buffer 240 to the audio card 206, which produces an analog signal at its output 226, for presentation to an amplifier and to the vibrator 192 shown in Figure 1.
  • the audio card 206 thus acts as means for actuating a vibration transducer in response to the vibration data. More generally, the audio card acts as means for operating a transducer at the second location, in response to the environment signal.
  • If the instant data packet is associated with channel B, the microprocessor is directed to block 258 which directs it to store the data packet in the GPS data buffer 242.
  • the microprocessor thus acts as means for extracting position information from the environment signal.
  • Block 260 then directs the microprocessor 196 to translate the contents of each field in the GPS data packet into a corresponding graphical component, and to store such graphical component in graphics fields within the graphics data buffer 244.
  • the microprocessor thus acts as means for generating a graphical image in response to the position information.
  • Block 262 then directs the microprocessor to write the contents of the graphics data buffer 244 to the video display adapter card 204.
  • the processor (not shown) in the video display adapter, in response, sets the contents of the video display memory accordingly, such that graphical images representing the GPS data will appear on the display 186 shown in Figure 1.
  • the video display adapter card thus acts as means for combining the graphical image with a video portion of the television signal and the display acts as means for displaying images including the graphical image and an image represented by the video portion of the television signal .
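The channel routing performed by the interrupt routine above can be sketched as follows. The parsing mirrors the packet layout of Figure 5, with the same assumed header and sum-of-bytes checksum noted earlier; this is an illustrative sketch, not the patented implementation.

```python
import struct

def parse_motion_packet(pkt):
    """Split a received 20-byte motion packet into its named fields (per Figure 5)."""
    header, length = struct.unpack('<HH', pkt[:4])
    fields = struct.unpack('<7H', pkt[4:18])
    (checksum,) = struct.unpack('<H', pkt[18:20])
    # Assumed checksum rule: 16-bit sum of all bytes preceding the checksum field.
    if checksum != sum(pkt[:18]) & 0xFFFF:
        raise ValueError('checksum mismatch')
    names = ('accelX', 'accelY', 'accelZ', 'rateX', 'rateY', 'rateZ', 'vibration')
    return dict(zip(names, fields))

def route_packet(channel, pkt, motion_out, vibration_out, gps_out):
    """Mimic blocks 246-258: channel A feeds motion and vibration, channel B feeds GPS."""
    if channel == 'A':
        data = parse_motion_packet(pkt)
        vibration_out(data.pop('vibration'))  # vibration field goes to the audio card
        motion_out(data)                      # motion fields, without vibration/checksum
    else:
        gps_out(pkt)                          # channel B: raw GPS message for display
```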
  • the motion platform is shown generally at 190 and includes a six-degree-of-freedom motion platform which, in this embodiment, is provided by Servos and Simulation Inc. of Maitland, Florida, U.S.A.
  • the motion platform includes a VME computer 270 and a six- degree-of-freedom unit shown generally at 272.
  • the VME computer 270 has an input 288 which is connected to the output 230 of the serial interface card 208 to receive the serial bitstream representing the motion data protocol from the serial interface card.
  • the VME computer 270 communicates with an interface unit 273 of the six-degree-of-freedom unit 272 using a 100Base-T Ethernet connection 286 and the TCP/IP protocol.
  • the six-degree-of-freedom unit 272 has a base 274 to which is mounted a plurality of extendable actuators, one of which is shown at 276. Each of the actuators has a distal end portion connected to a platform 278, on which is mounted a seat shown generally at 280 having a generally horizontal portion 282 and a generally vertical portion 284.
  • the VME computer 270 and the six- degree-of-freedom unit 272 are sold as a single system as model number 710-6-X-115.
  • Software operable to run on the VME computer 270, specific to the present embodiment, is sold under the serial number 710-6-X-55ASW. Effectively, the hardware and software cooperate to move the seat 280 in response to the contents of respective data fields within the motion data bitstream received from the serial interface card 208 shown in Figure 8.
  • the software run by the VME computer includes transfer functions (not shown), for translating the data contained in each field of the motion data protocol into an associated range of extension of one or more actuators of the six-degree-of-freedom unit 272, to cause the seat 280 to move in such a manner that a person sitting in the seat 280 experiences generally the same feeling of movement experienced by a person sitting in the seat 30 shown in Figure 1, from which the signals were derived.
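The transfer-function idea can be sketched as a clamped linear mapping from a motion field to an actuator extension. The gain, travel limits and the direct field-to-actuator coupling below are illustrative assumptions only; the actual 710-6-X-55ASW software is proprietary and not described further in the text.

```python
def field_to_extension(code, gain=0.001, min_ext=0.0, max_ext=0.3):
    """Map a 16-bit motion code (assumed centered at 32768) to an actuator
    extension in metres, clamped to the actuator's travel limits."""
    centered = code - 32768                    # signed excursion about rest
    ext = (max_ext + min_ext) / 2 + centered * gain
    return max(min_ext, min(max_ext, ext))     # clamp to physical travel

print(field_to_extension(32768))  # rest position -> 0.15
```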
  • the motion system thus acts as means for moving a support in response to the motion data.
  • the motion platform 190 is positioned relative to the display 186, the loudspeaker 188 and the vibrator 192 such that a person sitting in the seat 280 can simultaneously view the displayed images, hear the reproduced sounds and feel the motion and vibration.
  • the PC multimedia system 184, display 186, motion platform 190 and vibrator 192 cooperate to act as means for displaying images including the graphical image and an image represented by the video portion of the television signal while moving a support in response to the motion data and actuating a vibration transducer in response to the vibration data.
  • the TV transmitter (74 in Figure 1) and the TV receiver (182 in Figure 1) are replaced with a composite television signal recorder 374 and a composite television signal playback unit 382 respectively.
  • This allows the composite signal representing the environment to be recorded and played back at a later date or time. It also facilitates transfer of the composite signal to editing equipment (not shown) which may be used to edit, alter or add to the video, sound, motion data or position data as desired.
  • the recorder 374 thus acts as means for recording said composite signal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Television Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and apparatus for simulating at a second location, at least one feature of an environment at a first location. The apparatus includes a signal generator for generating signals representative of the environment at the first location and an environment simulator at the second location. The signal generator includes at least one measurement device for producing at least one environment signal indicative of a property of the environment, a television camera for producing at least one television signal in response to visual and audio stimuli as observed from the first location, and an inserter for producing a composite signal including the environment signal and the television signal. The environment simulator includes a receiver for receiving the composite signal including the environment signal and the television signal representing the environment, a television data decoder for decomposing the composite signal into the television signal and the environment signal, transducers for producing images and sounds in response to the television signal and, a transducer for controlling at least one feature of the environment in response to the environment signal.

Description

ENVIRONMENT SIMULATION APPARATUS AND METHOD
BACKGROUND OF THE INVENTION
This invention relates to an environment simulation apparatus and method and more particularly to a system for recording or transmitting and reproducing at a second location, motion, video and audio sensory information sensed at a first location.
The ability to experience conditions or an environment situated at a first location in space has traditionally been limited to those who are able to be physically present at the first location. Motion pictures and television have enabled experiencing some features of an environment, namely visual and audio features, at a second location remote from the first location.
The experience provided by motion pictures and television has been improved by three dimensional cameras and by enhanced audio signal processing, as evidenced by IMAX(R) and Dolby(R) surround sound.
The experience of a simulation at a second location of an environment at a first location, has been enhanced by providing motion to an amusement park ride while showing motion pictures and playing sounds. However, this requires specialized equipment and motion picture equipment of a commercial nature, which is impractical for consumer users.
What would be desirable, therefore, is the ability to integrate environment signals with television signals depicting the environment, to enable users to purchase add-on equipment for existing television sets to experience motion stimuli in addition to television pictures and sound. With the advent of new high definition television sets and digital video disk (DVD) technology, a user can experience a reasonable reproduction of the sights and sounds of a sensed environment, at less cost than with commercial systems employing motion pictures.
BRIEF SUMMARY OF THE INVENTION
In accordance with one aspect of the invention, there is provided a method of generating signals representative of an environment. The method includes the steps of:
a) producing at least one environment signal indicative of a property of the environment;
b) producing at least one television signal in response to visual and audio stimuli as observed from a first location;
c) producing a composite signal including the environment signal and the television signal; and
d) operating a transducer at the second location in response to the environment signal.
In accordance with another aspect of the invention, there is provided an apparatus for generating signals representative of an environment. The apparatus includes at least one measurement device for producing at least one environment signal indicative of a property of the environment, a television camera for producing at least one television signal in response to visual and audio stimuli as observed from the first location and an inserter for producing a composite signal including the environment signal and the television signal.
Preferably, the apparatus includes a transmitter for transmitting the composite signal to a remote location.
Alternatively, the apparatus may include a recorder for recording the composite signal.
Preferably, the apparatus includes a data acquisition system for producing a plurality of movement signals representing movement relative to a plurality of axes, such movement signals including acceleration signals representing acceleration along three mutually orthogonal axes and rotation signals representing rotation about three mutually orthogonal axes.
Preferably, the data acquisition system produces a plurality of codes representing respective features of the environment, including a plurality of codes representing movement relative to respective axes and more particularly acceleration codes representing acceleration of a sensor along at least one axis and rotation codes representing rotation of the sensor about at least one axis.
It is desirable if the data acquisition system produces acceleration codes representing acceleration of the sensor along three mutually orthogonal axes respectively and rotation codes representing rotation of the sensor about the three mutually orthogonal axes respectively.
Preferably, the apparatus includes a data acquisition system for producing vibration codes representing vibration at the first location.
Preferably, the inserter encodes the television signal with the acceleration, rotation and vibration codes in a vertical blanking interval of the television signal.
Preferably, the apparatus includes a position detection system for generating position data indicative of the geographical position of the first location and preferably, the inserter produces the composite signal such that it includes the position data.
Preferably, the apparatus includes a data acquisition system for producing a vibration code representing vibration at the first location and preferably, the inserter produces the composite signal such that it includes the vibration code.
In accordance with another aspect of the invention, there is provided a method of simulating an environment. The method includes the steps of:
a) receiving a composite signal including an environment signal and a television signal representing the environment;
b) decomposing the composite signal into the television signal and the environment signal; and
c) producing images and sounds perceptible at a second location in response to the television signal; and
d) operating a transducer at the second location for controlling at least one feature of the environment in response to the environment signal.
In accordance with another aspect of the invention, there is provided an apparatus for simulating an environment. The apparatus includes a receiver for receiving a composite signal including an environment signal and a television signal representing the environment, a television data decoder for decomposing the composite signal into the television signal and the environment signal, transducers for producing images and sounds perceptible at a second location in response to the television signal and a transducer at the second location for controlling at least one feature of the environment in response to the environment signal.
Preferably, the apparatus includes a processor for extracting position information from the environment signal and for generating a graphical image in response to the position information. Preferably the processor combines the graphical image with a video portion of the television signal.
Preferably, the apparatus includes a display for displaying images including the graphical image and an image represented by the video portion of the television signal.
Preferably, the apparatus includes a processor for extracting motion data from the environment signal and a support operable to move in response to the motion data.
Preferably, the apparatus includes a processor for extracting vibration data from the environment signal and a vibration transducer actuated in response to the vibration data.
Preferably, the apparatus includes a processor for extracting position information from the environment signal, generating a graphical image in response to the position information, combining the graphical image with a video portion of the television signal, extracting motion data from the environment signal, extracting vibration data from the environment signal, and displaying images including the graphical image and an image represented by the video portion of the television signal while moving a support in response to the motion data and actuating a vibration transducer in response to the vibration data.
In accordance with another aspect of the invention, there is provided a method of simulating at a second location, at least one feature of an environment at a first location. The method includes the steps of:
a) generating signals representative of the environment at the first location by producing at least one environment signal indicative of a property of the environment, producing at least one television signal in response to visual and audio stimuli as observed from the first location and producing a composite signal including the environment signal and the television signal; and
b) at the second location, receiving the composite signal, decomposing the composite signal into the television signal and the environment signal, producing images and sounds in response to the television signal and operating a transducer for controlling the property of the environment in response to the environment signal.
In accordance with another aspect of the invention, there is provided an apparatus for simulating at a second location, at least one feature of an environment at a first location. The apparatus includes a signal generator for generating signals representative of the environment at the first location and an environment simulator at the second location. The signal generator includes at least one measurement device for producing at least one environment signal indicative of a property of the environment, a television camera for producing at least one television signal in response to visual and audio stimuli as observed from the first location, and an inserter for producing a composite signal including the environment signal and the television signal. The environment simulator includes a receiver for receiving the composite signal including the environment signal and the television signal representing the environment, a television data decoder for decomposing the composite signal into the television signal and the environment signal, transducers for producing images and sounds in response to the television signal and a transducer for controlling at least one feature of the environment in response to the environment signal.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
In drawings which illustrate embodiments of the invention, Figure 1 is a block diagram of an apparatus for simulating at a second location at least one feature of an environment at a first location, according to a first embodiment of the invention;
Figure 2 is a perspective view of a motion capture subsystem, according to the first embodiment showing axes about which motion measurements are taken;
Figure 3 is a block diagram of a multimedia data encoder, according to the first embodiment of the invention;
Figure 4 is a flowchart of a packet routine run by a processor in the multimedia data encoder, according to the first embodiment of the invention;
Figure 5 is a schematic representation of a data packet produced by the multimedia data encoder, according to the first embodiment of the invention;
Figure 6 is a block diagram of a personal computer multimedia system according to the first embodiment of the invention;
Figure 7 is a flowchart of a channel data interrupt routine run by the personal computer, according to the first embodiment of the invention;
Figure 8 is a schematic representation of a motion platform according to the first embodiment of the invention; and
Figure 9 illustrates an apparatus for the simultaneous recording of motion, video and audio according to a second embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
Referring to Figure 1, an apparatus for simulating at a second location, at least one feature of an environment at a first location, according to a first embodiment of the invention includes an environment signal production system 10 and an environment simulator 12.
Environment Signal Production System
The environment signal production system 10 includes an environment data acquisition system 11 and a combiner 13 which cooperate to generate signals representative of an environment.
Environment Data Acquisition System
The environment data acquisition system 11 includes four subsystems, namely, a motion capture subsystem 18, an audio/video capture subsystem 20, a vibration capture subsystem 22, and a position profile subsystem 24. The environment data acquisition system 11 is a portable remote system designed to be placed in an existing system (not shown), the environment of which is to be measured and simulated. Such an environment may be that of a cockpit of an aircraft, for example.
Motion Capture Subsystem

Referring now to Figure 2, the motion capture subsystem 18 includes an observer station 26 and a motion sensor 28. The observer station 26 is a chair-like structure having a seat 30 and a seat back 32 in which an individual observer in the actual environment being sensed may be seated. Typically, the dimensions and structure of the observer station 26 will be determined by the system in which the environment signal production system 10 is placed, such as a pilot's cockpit or race car seat. In this embodiment the chair-like structure forms part of the system in which the environment signal production system 10 is placed, and is a pilot's seat in an aircraft.
The motion sensor 28 is affixed to the bottom of the observer station 26. In this embodiment the motion sensor 28 is a MotionPak (Trademark) sensor manufactured by the Systron Donner company of California.
The motion sensor 28 includes an inertial sensor cluster with six degrees of freedom, which measures linear acceleration along three mutually perpendicular axes x, y and z, and which measures angular rate of rotation about these same axes. The motion sensor is connected to the bottom of the observer station such that the positive x axis is defined to lie in a direction parallel to the plane of the seat 30 and perpendicular to and away from the line of intersection of the seat 30 and the seat back 32. Furthermore, the positive x, y and z axes are defined to constitute a "right-hand" coordinate system such that the relation between the axes may be represented respectively by the outstretched thumb, index and middle fingers of the human right hand.
Thus, with the positive x axis having been defined as described above, the positive y axis lies in a direction perpendicular and in a clockwise direction to the positive x axis, viewed from above, and parallel to the plane of the seat 30. The positive z axis extends in a direction perpendicular to both the positive x axis and the positive y axis and extends downwards from the plane of the seat 30.
The three positive axes have the directions shown in Figure 2. In accordance with convention, the use of the negative sense in respect of any or all of these axes is taken to mean the direction 180 degrees opposite to the direction of the corresponding positive axis.
Acceleration is measured along each axis x, y and z and angular rate of rotation about each axis is measured in planes perpendicular to the x, y and z axes. Positive rotation is measured in a clockwise direction in each plane when viewed from the origin of the three axes along the corresponding positive axis as shown in Figure 2. In accordance with convention, the use of the negative sense in respect of any or all of these rotational vectors is taken to mean the rotational sense opposite to the sense of the corresponding positive vector.
In response to linear acceleration and angular rate of rotation measurements, the motion sensor 28 generates analog voltage signals representing linear acceleration and angular rate of rotation about each of the axes x, y and z respectively. The amplitude of each signal generated in response to linear acceleration varies in the range between -7.5VDC and +7.5VDC, according to the instantaneous linear acceleration measured along the associated axis, in linear proportion to a measurable range of linear accelerations from -2.0g to +2.0g, where g represents the gravitational acceleration at the surface of the earth at sea level and is equal to 9.81 m/s². A negative value of acceleration represents an acceleration in a direction opposite to the positive axis, or along the negative axis.
The amplitude of each signal generated in response to angular rate of rotation varies in the range of -2.5VDC to +2.5VDC, according to the instantaneous angular rate of rotation about respective axes x, y and z, in linear proportion to a measurable range of angular velocities from -100 degrees per second to +100 degrees per second. A negative value of rate of rotation represents angular rate of rotation in a rotational sense counterclockwise when viewed from the origin.
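The stated linear proportions can be sketched as simple conversions. The following is an illustrative sketch only, not part of the disclosed apparatus; it merely applies the ranges given above (±7.5 VDC corresponding to ±2.0 g, and ±2.5 VDC corresponding to ±100 degrees per second):

```python
# Illustrative sketch: convert motion sensor output voltages to physical
# units using the linear proportions stated in the description.

def volts_to_acceleration_g(v):
    """Map a -7.5..+7.5 VDC acceleration output onto -2.0..+2.0 g."""
    return v * 2.0 / 7.5

def volts_to_rate_dps(v):
    """Map a -2.5..+2.5 VDC rate output onto -100..+100 degrees/second."""
    return v * 100.0 / 2.5

full_scale_g = volts_to_acceleration_g(7.5)   # full positive scale: 2.0 g
full_scale_rate = volts_to_rate_dps(-2.5)     # full negative scale: -100.0 deg/s
```

A negative voltage simply yields a negative acceleration or rate, matching the sign conventions described above.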
The sensor has x, y and z linear acceleration outputs 34, 36, and 38 respectively which continuously provide analog voltage signals corresponding to the instantaneous linear acceleration measured by the motion sensor 28 along the positive x, y and z axes. The sensor thus acts as a data acquisition system or means for producing acceleration signals representing acceleration along three mutually orthogonal axes. The sensor also has x, y and z rate outputs 40, 42, and 44 respectively which continuously provide analog voltage signals corresponding to the instantaneous angular velocity measured by the motion sensor 28 about the x, y and z axes respectively. The sensor thus acts as a data acquisition system or means for producing rotation signals representing rotation about three mutually orthogonal axes.
Referring back to Figure 1, the linear acceleration outputs 34, 36 and 38 and the rate outputs 40, 42 and 44 are shown generally at 45 and are connected to the combiner 13. Generally, the motion sensor 28 acts as a data acquisition system or means for producing a plurality of movement signals representing movement relative to a plurality of axes. Such movement signals are indicative of properties of the sensed environment, the properties being acceleration and rotation of the environment in space. Hence it may be said that the sensor acts as a measurement device or means for producing at least one environment signal indicative of a property of the environment.
The Audio/Video Capture Subsystem
The audio/video capture subsystem 20 includes a stabilizing platform 46, a video camera 48 and a microphone 52. The stabilizing platform 46 is physically positioned in proximity to the observer station 26 and the video camera 48 is mounted on the stabilizing platform 46. The locations of the stabilizing platform 46 and video camera 48 are selected to maximize the quality of the video images produced by the video camera 48 and to capture as faithfully as possible images that would be detected by an individual seated at the observer station 26. In this embodiment, the stabilizing platform 46 includes a Glidecam (R) system which dampens the movement to which the video camera 48 is subjected, to reduce jitter in video signals produced by the video camera 48.
The microphone 52 measures audio disturbances in the audio range and is physically positioned in proximity to the observer station 26 in order to detect audio disturbances audible at the observer station 26. The location of the microphone 52 is selected to maximize the quality of the audio signals detected by the microphone 52 and to represent, as faithfully as possible, audio disturbances that would be detected by an individual seated at the observer station 26. In operation, the microphone 52 converts audio disturbances into electrical signals which it transmits on a microphone cable 56 to the video camera 48.
The video camera 48 represents visual images seen by an observer seated at the observer station 26 with conventional composite video signals according to a known video format which, in this embodiment, is that specified by the National Television System Committee (NTSC). Such video signals are transmitted along a video cable 50 to the combiner 13. Thus, the video camera acts as a television camera or means for producing at least one television signal in response to visual and audio stimuli as observed from the first location.
Vibration Capture Subsystem
The vibration capture subsystem 22 includes a low frequency microphone 54. The low frequency microphone 54 is physically positioned in proximity to the observer station 26 in order to detect low frequency vibrations in a frequency range of 10 Hz to 30 Hz, as detected by an observer at the observer station 26. The location of the low frequency microphone 54 is selected to maximize the quality of low frequency signals detected by the low frequency microphone 54 and to represent, as faithfully as possible, low frequency disturbances that would be detected by an individual seated at the observer station 26.
In operation, the low frequency microphone 54 converts low frequency audio disturbances into electrical signals which it transmits along a vibration signal line 58 to the combiner 13. The low frequency microphone 54 thus acts as vibration signal producing means for producing signals representing vibration experienced in the environment. As vibration is a property of the environment being sensed, it may be said that the low frequency microphone also acts as a measurement device or means for producing at least one environment signal indicative of a property of the environment, where vibration is the property being sensed.
Position Profile Subsystem
The position profile subsystem 24 includes an antenna 60, and a global positioning system (GPS) processor 62. The antenna 60 receives GPS signals from a satellite network (not shown) established to compile and broadcast such information. In this embodiment the antenna is a Marine IV (Trademark) antenna manufactured by Ashtec Inc. of Sunnyvale, California. The antenna 60 is connected to the GPS processor 62 which decodes the signals received from the satellite network to produce a serial GPS data message in accordance with the National Marine Electronics Association (NMEA) recommendation 0183 at an output 64.
The output 64 is connected to the combiner 13. In this embodiment the GPS processor 62 is a G12 GPS board manufactured by Ashtec Inc. of Sunnyvale, California. The antenna 60 and GPS processor 62 thus act as a position detection system or means for generating position data indicative of the geographical position of the first location. In the embodiment shown, the first location is in an aircraft and therefore as the aircraft moves, longitude and latitude positions of the aircraft are represented by said position data.
The signal produced by the GPS system acts as an environment signal indicative of a property of the environment, where such property is the geographical position of the environment in space.
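As a sketch of what decoding such a position message involves, the following extracts latitude and longitude from a standard NMEA 0183 "GGA" sentence. This is illustrative only and not part of the disclosure; the sample sentence is a common textbook example rather than data from the described system, and checksum validation is omitted for brevity:

```python
# Illustrative sketch: extract latitude/longitude in decimal degrees from
# an NMEA 0183 $GPGGA sentence of the kind emitted under recommendation 0183.

def nmea_to_degrees(field, hemisphere):
    """Convert NMEA (d)ddmm.mmmm plus an N/S/E/W flag to signed degrees."""
    value = float(field)
    degrees = int(value // 100)        # leading digits are whole degrees
    minutes = value - degrees * 100    # remainder is minutes of arc
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence):
    """Return (latitude, longitude) in decimal degrees from a GGA sentence."""
    fields = sentence.split(",")
    return (nmea_to_degrees(fields[2], fields[3]),
            nmea_to_degrees(fields[4], fields[5]))

lat, lon = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,,,,*47")
```

The decoded decimal degrees are what a downstream display would translate into a graphical position overlay.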
Combiner
The combiner 13 includes a multimedia data encoder 70, a TV data encoder 72 and a television transmitter 74.
Referring to Figure 3, the multimedia data encoder is shown generally at 70 and includes a scaling and offset circuit 76, a multiplexer (MUX) 78, an analog to digital converter 80, an I/O circuit 82, a microprocessor 84, an RS-232-C interface circuit 86, random access memory (RAM) 88 and read only memory (ROM) 90.
The scaling and offset circuit 76 has a plurality of inputs A, B, C, D, E, F, G for receiving the signals representing instantaneous linear acceleration and instantaneous angular velocity provided by the linear acceleration outputs and the rate outputs, and for receiving the vibration signal from the low frequency microphone 54. Essentially, the scaling and offset circuit scales the signals received at the inputs to 0-5 volts. The scaling and offset circuit thus has a plurality of outputs 92 which provide scaled and offset signals representing scaled versions of the motion signals and the low frequency microphone signal respectively.
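The scaling and offset operation, followed by the analog to digital conversion described below, can be sketched as follows. This is an illustrative sketch, not part of the disclosure: the text states only that inputs are scaled to 0-5 volts, so the linear transfer function and the unsigned 16-bit quantization are assumptions:

```python
# Illustrative sketch: map a bipolar sensor voltage into the 0-5 V window
# expected by the A/D converter, then quantize to an unsigned 16-bit code.
# The linear transfer function and 16-bit unsigned coding are assumptions.

def scale_and_offset(v, v_min, v_max):
    """Linearly map v in [v_min, v_max] onto 0.0..5.0 volts."""
    return (v - v_min) / (v_max - v_min) * 5.0

def adc_16bit(v):
    """Quantize a 0-5 V signal to a 16-bit code (0..65535), clamping."""
    code = int(round(v / 5.0 * 65535))
    return max(0, min(65535, code))

# An acceleration channel (-7.5..+7.5 V) at rest (0 V) sits at mid-scale:
rest_code = adc_16bit(scale_and_offset(0.0, -7.5, 7.5))
```

Under these assumptions a channel at rest produces a mid-scale code, and the full positive input produces the maximum code.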
The multiplexer has a plurality of inputs 94 which are connected to the outputs 92 of the scaling and offset circuit to receive the scaled and offset signals. The multiplexer also has a control input 96 which is connected to an output 99 of the I/O circuit 82 for receiving signals for directing the MUX to select an input 94. The MUX further has an output 98 which is connected to the analog to digital converter 80. The I/O circuit has an output 100 which is connected to the analog to digital converter 80 to control the analog to digital converter 80 and further has an input 102 for receiving 16 bit digital signals produced by the analog to digital converter 80.
The I/O circuit further includes a universal asynchronous receiver/transmitter 104 which is connected to the RS-232-C interface 86. The I/O circuit 82 also has a 16 bit data bus output 106 for communication with the microprocessor 84.
The ROM 90 is programmed with codes which direct the microprocessor 84 to execute a packet routine 108. The packet routine establishes buffers within the RAM 88, including a header buffer 110, a length buffer 112, an accelerationX buffer 114, an accelerationY buffer 116, an accelerationZ buffer 118, a rateX buffer 120, a rateY buffer 122, a rateZ buffer 124, a low frequency buffer 126, and a checksum buffer 128. Each of these buffers has a 2 byte length.
Referring to Figures 3 and 4, the packet routine is shown generally at 108 and includes a first block of codes 130 which direct the microprocessor 84 to select a channel A,
B, C, D, E, F, G, by writing to the I/O circuit 82 to control the output 99 to cause the multiplexer 78 to direct signals from one of the inputs 94 to its output 98.
After selecting a channel, block 132 directs the microprocessor 84 to write to the I/O circuit 82 to cause the output 100 to produce a signal to direct the analog to digital converter 80 to begin conversion and to present, at the input 102 of the I/O circuit, a 16 bit digital value representing the instantaneous value of the signal appearing at the selected input 94. When channels A, B or C are selected, the acceleration signals are applied to the A/D converter, which produces acceleration codes representing acceleration of the sensor 28 along the three mutually orthogonal axes respectively. The A/D converter thus acts as a data acquisition system or means for producing acceleration codes representing acceleration of the sensor along three mutually orthogonal axes respectively.
Similarly, when channels D, E or F are selected, the rotation signals are applied to the A/D converter, which produces rotation codes representing rotation of the sensor 28 about the three mutually orthogonal axes respectively. The A/D converter thus further acts as a data acquisition system or means for producing rotation codes representing rotation of the sensor about the three mutually orthogonal axes respectively. The acceleration codes and rotation codes may be generally referred to as movement codes. The A/D converter thus acts as a data acquisition system or means for producing a plurality of codes representing movement relative to respective axes.
Similarly, when channel G is selected, the vibration signal is applied to the A/D converter which produces vibration codes representing vibration of the observer's station at the first location. The A/D converter thus further acts as a data acquisition system or means for producing a vibration code representing vibration at the first location.
In general, it may be said that by the production of the acceleration, rotation and vibration codes, the A/D converter acts as a data acquisition system or means for producing a plurality of codes representing respective features of the environment.
Still referring to Figures 3 and 4, block 134 then directs the microprocessor 84 to read the I/O circuit output 106 to obtain the code or digital value produced by the analog to digital converter 80 and block 136 directs the microprocessor 84 to store the code so obtained in the appropriate buffer 114-126 corresponding to the input 94 selected at block 130.
Block 138 then directs the microprocessor 84 to repeat the above process for each input 94 until signals from each channel A-G have been read and corresponding codes have been stored in corresponding buffers 114-126 in the RAM 88.
Block 140 then directs the microprocessor 84 to store a header in the header buffer 110, the header being a predefined value used to identify that a data packet to be produced by the system relates to that of motion data.
Block 142 then directs the microprocessor 84 to scan the contents of the buffers 110-126 to calculate and store in the length buffer 112, a number indicating the total number of bytes stored in buffers 110-126.
Block 144 then directs the microprocessor 84 to use the contents of buffers 110-126 to calculate a checksum value, as is common in the art, and to store such checksum value in the checksum buffer 128. Block 146 then directs the microprocessor 84 to write to the I/O circuit 82 and specifically to the UART 104, the contents of the buffers 110-128.
Essentially, the UART 104 produces an environment signal which is a serial bitstream including a data packet as shown generally at 150 in Figure 5, including a header field 152, a length field 154, an accelerationX field 156, an accelerationY field 158, an accelerationZ field 160, a rateX field 162, a rateY field 164, a rateZ field 166, a vibration field 168 and a checksum field 170. Block 146 of the packet routine directs the microprocessor 84 to control the UART 104 such that the contents of fields 152 through 170 contain the contents of the buffers 110-128 respectively.
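The packet assembly performed by blocks 130 through 146 can be sketched as follows. This is an illustrative sketch, not part of the disclosure: each field is two bytes wide, matching the buffers described above, but the header value is hypothetical and the checksum rule (a 16-bit sum of all preceding bytes) is an assumption, since the text says only that a checksum is calculated "as is common in the art":

```python
# Illustrative sketch of the data packet 150 of Figure 5. Each field is two
# bytes. The header constant and the sum-of-bytes checksum are assumptions.

import struct

MOTION_HEADER = 0x4D44  # hypothetical marker identifying a motion data packet

def build_packet(ax, ay, az, rx, ry, rz, vibration):
    """Pack header, length, six movement codes and the vibration code,
    then append a trailing 16-bit checksum field."""
    payload = struct.pack(">7H", ax, ay, az, rx, ry, rz, vibration)
    # length counts the bytes of buffers 110-126: header + length + 7 codes
    length = 2 + 2 + len(payload)
    body = struct.pack(">HH", MOTION_HEADER, length) + payload
    checksum = sum(body) & 0xFFFF
    return body + struct.pack(">H", checksum)

packet = build_packet(100, 200, 300, 10, 20, 30, 5)
```

With nine two-byte fields plus the checksum field, the complete packet is twenty bytes, which at 115.2 kbps permits many packets per video frame.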
It is desirable that the baud rate of the data packet 150 so produced be as high as possible; in this particular embodiment, however, a limitation of approximately 115.2 kbps is imposed, due to limitations of other equipment.
Effectively, the multimedia data encoder 70 receives analog signals from the motion sensor and from the low frequency microphone and produces an environment signal which is a serial bitstream in compliance with the RS-232-C specification, having a protocol including fields for carrying payload data representing the instantaneous values of linear acceleration, rotation and vibration experienced by the observer in the sensed environment. The environment data acquisition system 11 thus produces an environment signal, i.e. the serial bitstream, indicative of a plurality of properties of the environment. In this embodiment the properties include linear acceleration, rotation and vibration.
TV Data Encoder
Referring back to Figure 1, in this embodiment the TV data encoder 72 includes a TES5 digital inserter manufactured by the Norpak Corporation of Kanata, Ontario. This device is a highly flexible digital video platform which carries out encoding, insertion, reception, bridging and multiplexing of data in the vertical blanking interval (VBI) and active video of any component serial digital television signal. The Norpak unit has a plurality of inputs and has a single output, only some of the inputs being used in this embodiment. The inputs used include a serial digital program video input 172 for receiving the serial digital program video composite signal produced by the camera 48. The inputs used in this embodiment further include a serial port A input 174 and a serial port B input 176. The TV data encoder further includes a serial digital program video output 178. The serial port A input 174 is connected to the serial port output 87 of the multimedia data encoder and is therefore operable to receive the data packet 150 representing motion and low frequency information from the data encoder 70.
The serial port B input 176 is connected to the output 64 of the GPS processor 62 and is thereby operable to receive GPS position data messages from the GPS processor, in a serial data format.
The output 178 of the inserter produces a conventional composite video signal in which data received at serial ports A and B, 174 and 176 respectively, are encoded on any combination of lines 10 to 25 (7 to 22 in 625-line systems) in any video format including NTSC, PAL, SECAM, 525 and 625 line. Data can be encoded and transmitted in this portion of the composite television signal, at data rates up to 115.2 kbps. The Norpak unit provides built-in forward error correction.
The TV data encoder 72 thus has an output 178 which provides a composite video signal on which motion, low frequency and GPS data are encoded in a portion of the video signal. The TV data encoder 72 thus acts as an inserter, or means for producing a composite signal including the environment signal and the television signal.
In this embodiment, there are two environment signals, one from the data encoder 70 and one from the GPS processor 62. Hence, the inserter encodes the television signal produced by the camera 48 with the acceleration, rotation and vibration codes and with the position data. Hence the composite video signal produced by the TV data encoder 72 includes the acceleration, rotation and vibration codes and position data. The TV data inserter thus acts as means for encoding the acceleration, rotation and vibration codes in a vertical blanking interval of the television signal and as means for producing the composite video signal such that it includes the position data and as means for producing the composite video signal such that it includes the vibration code.
The composite video signal is received at an input 180 of the television transmitter 74. In this embodiment, the television transmitter 74 is conventional and transmits the composite video signal, including encoded motion and low frequency data, and GPS data for reception by remotely located television receivers, using conventional radio waves or using cable systems. The television transmitter thus acts as means for transmitting the composite signal to a remote location.
In general, it will be appreciated that the environment signal production system acts as a signal generator or means for generating signals representative of the environment.
Environment Simulator
Referring back to Figure 1, the environment simulator is shown generally at 12. The environment simulator includes a TV receiver 182, a PC multimedia system 184, a display 186, a loudspeaker 188, a motion platform shown generally at 190, and a vibrator shown generally at 192. The environment simulator is at a second location, remote from the first location, such as an amusement park.
The TV receiver 182 receives radio frequency signals containing the composite video signal including encoded motion and low frequency data, from the TV transmitter 74. The TV receiver demodulates the radio frequency signals to provide a baseband composite video signal at an output 194 thereof. The TV receiver thus acts as means for receiving a composite signal including an environment signal and a television signal representing the environment.
Referring to Figure 6, the PC multimedia system is shown generally at 184 and includes a personal computer having a high speed microprocessor 196, temporary memory 198, permanent memory 199 and a personal computer bus 200. Connected to the personal computer bus are a TV data decoder card 202, a video display adapter card 204, an audio card 206 and a serial interface card 208.
In this embodiment, the microprocessor 196 is an Intel Pentium (R) 300 MHz processor.
In this embodiment, the TV data decoder card 202 is a
Norpak TTX71X PC bus card receiver which interfaces directly with the PC bus 200. The TV data decoder card 202 has a composite video input 210, a composite video output 212 and a PC bus interface 214. The composite video input 210 is connected to receive the baseband composite video signal, complete with environmental data, from the output 194 of the TV receiver 182 shown in Figure 1.
Referring back to Figure 6, the TV data decoder card 202 extracts the environmental data encoded on the vertical blanking interval of the composite video signal and provides a pure composite video signal, without environmental data, at the composite video output 212. Environmental data extracted from the VBI of the composite video signal received at the input 210 is presented to the PC bus in such a manner that the microprocessor 196 is interrupted when a data packet is available. Effectively, the PC bus interface 214 provides to the microprocessor 196, data decoded from the composite video signal and an interrupt vector indicating the channel A or B on which the data was received. Thus, the TV data decoder acts as means for decomposing the composite signal into the television signal and the environment signal.
The video display adapter card 204 is, in this embodiment, an ATI-TV ISA bus card manufactured by ATI of California. This bus card has a PC bus interface 216, a video input 218, a video port output 220 and an audio output 222. The composite video output 212 of the TV data decoder card 202 is connected to the video input 218 of the video display adapter card 204 to receive the composite video signal. The video display adapter card has video buffer memory (not shown) for digitizing video signals received at the video input 218 and has display memory (not shown) for storing digitized video signals for driving the display 186 shown in Figure 1 through the video port 220. In addition the video display adapter card includes an audio decoder (not shown) which decodes audio from the composite video signal appearing at the video input 218 and presents pure audio signals at the audio output 222 for amplification by an audio amplifier (not shown) to drive the loudspeaker 188 shown in Figure 1. The video display adapter card, display, audio amplifier and loudspeaker thus act as means for producing images and sounds perceptible at a second location in response to the television signal.
Referring back to Figure 6, the video display adapter card 204 further includes a processor (not shown) for transferring the contents of the video buffer memory (not shown) to the video display memory (not shown) and for receiving data from the PC bus interface 216 as provided by the microprocessor 196, and for writing such data into the video display buffer, along with the contents of the video input buffer, to cause graphical images to overlay video images represented by the pure composite video signal appearing at the video input 218. The video display adapter 204 also has control registers (not shown), which are loaded by the microprocessor 196 through the PC bus interface 216, to allow the microprocessor 196 to control positioning and sizing of video and graphical images represented by the contents of the video display buffer.
The audio card 206 is a standard audio card having a PC bus interface 224 and an audio output 226. The effective component in the audio card is a digital to analog converter (not shown), which receives input data from the microprocessor 196, through the PC bus interface 224 and provides analog signals in response thereto, at the audio output 226. The audio output 226 is connected to an audio amplifier (not shown) for driving the vibrator 192 shown in Figure 1.
Referring back to Figure 6, the serial interface card 208 is a standard serial interface found on any personal computer and has a bus interface 228 for receiving data from the microprocessor 196 and has an output 230 for representing such data in an RS-232-C format. The output 230 is connected to the motion platform shown generally at 190 in Figure 1.
Referring back to Figure 6, the permanent memory 199 includes codes readable by the microprocessor 196, for directing the processor to execute a channel data interrupt routine 232.
The channel data interrupt routine 232 directs the microprocessor 196 to establish in the temporary memory 198, a full packet buffer 236, a motion packet buffer 238, a vibration buffer 240, a GPS data receive buffer 242 and a graphics data buffer 244.
Referring to Figures 6 and 7, the channel data interrupt routine is shown generally at 232 and begins upon receipt of a channel data interrupt from the TV data decoder card 202 at the PC bus 200. Upon receiving the channel data interrupt at the microprocessor 196, block 246 directs the microprocessor to read an interrupt vector provided by the TV data decoder PC bus interface 214 to determine whether or not the instant data packet is associated with channel A or B. If associated with channel A, the data packet relates to motion data and vibration data and block 248 directs the microprocessor to store the packet in the full packet buffer 236. The full packet buffer has a plurality of two byte fields for storing the contents of respective fields 152 through 170 of the data packet 150 shown in Figure 5.
Referring back to Figures 6 and 7, block 250 directs the microprocessor 196 to load the motion packet buffer 238 with the contents of the full packet buffer 236, with the exception of the contents of the vibration field and the checksum field. The microprocessor thus acts as means for extracting motion data from the environment signal.
Block 252 then directs the microprocessor 196 to transmit the contents of the motion packet buffer 238 to the serial interface card 208, which provides a motion data serial bitstream similar to that shown in Figure 5, without the vibration field 168 and the checksum field 170, to the motion platform 190 shown in Figure 1.
Referring back to Figures 6 and 7, block 254 then directs the microprocessor 196 to retrieve the contents of the vibration field 168 from the full packet buffer 236 and store such contents in the vibration buffer 240. The microprocessor thus acts as means for extracting vibration data from the environment signal.
Block 256 then directs the microprocessor 196 to write the contents of the vibration buffer 240 to the audio card 206, which produces an analog signal at its output 226, for presentation to an amplifier and to the vibrator 192 shown in Figure 1. The audio card 206 thus acts as means for actuating a vibration transducer in response to the vibration data. More generally, the audio card acts as means for operating a transducer at the second location, in response to the environment signal.
Referring back to Figures 6 and 7, in the event that upon receiving the channel data interrupt, the TV data decoder card 202 provides an interrupt vector indicating that the instant data is associated with that of channel B, the microprocessor is directed to block 258 which directs it to store the instant data packet in the GPS data receive buffer 242. The microprocessor thus acts as means for extracting position information from the environment signal.
Block 260 then directs the microprocessor 196 to translate the contents of each field in the GPS data packet into a corresponding graphical component, and to store such graphical component in graphics fields within the graphics data buffer 244. The microprocessor thus acts as means for generating a graphical image in response to the position information.
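The description does not specify how a GPS field is translated into a graphical component at block 260. The sketch below illustrates one simple possibility, in which a latitude/longitude pair is mapped linearly onto display coordinates; the function name, the map bounds and the equirectangular projection are illustrative assumptions, not part of the disclosure.

```python
def gps_to_pixel(lat, lon, map_bounds, screen_w, screen_h):
    """Map a latitude/longitude pair onto pixel coordinates of a map
    overlay, assuming a simple equirectangular (linear) projection.

    map_bounds is (lat_min, lat_max, lon_min, lon_max) of the overlay.
    """
    lat_min, lat_max, lon_min, lon_max = map_bounds
    x = (lon - lon_min) / (lon_max - lon_min) * (screen_w - 1)
    # Screen y grows downward while latitude grows upward, so invert.
    y = (lat_max - lat) / (lat_max - lat_min) * (screen_h - 1)
    return round(x), round(y)
```

Under this assumption, the graphics data buffer 244 would hold the resulting pixel coordinates, which the video display adapter then renders over the television image.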
Block 262 then directs the microprocessor to write the contents of the graphics data buffer 244 to the video display adapter card 204. The processor (not shown) in the video display adapter, in response, sets the contents of the video display memory accordingly, such that graphical images representing the GPS data will appear on the display 186 shown in Figure 1. The video display adapter card thus acts as means for combining the graphical image with a video portion of the television signal and the display acts as means for displaying images including the graphical image and an image represented by the video portion of the television signal.
In effect, when a data packet from either channel A or channel B is received, the contents of respective fields within the corresponding data packet are parsed and transmitted to a corresponding device on the PC bus.
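The dispatch performed by blocks 246 through 262 can be summarized as follows. This is a hedged sketch only: the description gives the buffer roles (full packet buffer 236, motion packet buffer 238, vibration buffer 240, GPS data buffer 242) but not the exact field ordering of packet 150, so the assumption here is that the vibration and checksum fields occupy the last two positions.

```python
def dispatch_packet(channel, packet):
    """Route a decoded data packet to the device buffers, in the manner
    of the channel data interrupt routine for channels A and B.

    `packet` is a sequence of two-byte field values.  Field positions
    are assumed, not taken from the disclosure.
    """
    if channel == "A":
        full_packet = list(packet)            # full packet buffer 236
        # Motion buffer 238: everything except the vibration and
        # checksum fields, here assumed to be the last two fields.
        motion = full_packet[:-2]             # -> serial interface card
        vibration = full_packet[-2]           # buffer 240 -> audio card
        return {"motion": motion, "vibration": vibration}
    elif channel == "B":
        return {"gps": list(packet)}          # buffer 242 -> graphics
    raise ValueError("unknown channel: %r" % channel)
```

The interrupt vector read at block 246 plays the role of the `channel` argument, selecting which branch handles the packet.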
Referring to Figure 8, the motion platform is shown generally at 190 and includes a six-degree-of-freedom motion platform which, in this embodiment, is provided by Servos and Simulation Inc. of Maitland, Florida, U.S.A. The motion platform includes a VME computer 270 and a six-degree-of-freedom unit shown generally at 272. The VME computer 270 has an input 288 which is connected to the output 230 of the serial interface card 208 to receive the serial bitstream representing the motion data protocol from the serial interface card. The VME computer 270 communicates with an interface unit 273 of the six-degree-of-freedom unit 272, using a 100Base-T Ethernet connection 286 and the TCP/IP protocol.
The six-degree-of-freedom unit 272 has a base 274 to which is mounted a plurality of extendable actuators, one of which is shown at 276. Each of the actuators has a distal end portion connected to a platform 278, on which is mounted a seat shown generally at 280 having a generally horizontal portion 282 and a generally vertical portion 284.
In this embodiment, the VME computer 270 and the six-degree-of-freedom unit 272 are sold as a single system as model number 710-6-X-115. Software operable to run on the VME computer 270, specific to the present embodiment, is sold under the serial number 710-6-X-55ASW. Effectively, the hardware and software cooperate to move the seat 280 in response to the contents of respective data fields within the motion data bitstream received from the serial interface card 208 shown in Figure 8. The software run by the VME computer includes transfer functions (not shown), for translating the data contained in each field of the motion data protocol into an associated range of extension of one or more actuators of the six-degree-of-freedom unit 272, to cause the seat 280 to move in such a manner that a person sitting in the seat 280 experiences generally the same feeling of movement experienced by a person sitting in the seat 30 shown in Figure 1, from which the signals were derived. The motion system thus acts as means for moving a support in response to the motion data.
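The disclosure states only that each motion field is translated by a transfer function into a range of actuator extension; the vendor's actual functions are not shown. A minimal linear sketch, with entirely illustrative limits (a signed two-byte field value and a 0-300 mm stroke), would look like this:

```python
def field_to_extension(value, field_min=-32768, field_max=32767,
                       ext_min_mm=0.0, ext_max_mm=300.0):
    """Linearly map a two-byte motion field value onto an actuator's
    extension range, clamping out-of-range inputs.

    The linear form and all numeric limits are assumptions for
    illustration; the actual transfer functions are proprietary.
    """
    value = max(field_min, min(field_max, value))
    span = field_max - field_min
    return ext_min_mm + (value - field_min) / span * (ext_max_mm - ext_min_mm)
```

In practice one such function (possibly nonlinear, and coupling several actuators) would exist per motion field, so that the seat 280 reproduces the motion sensed at seat 30.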
Referring back to Figure 1, the motion platform 190 is positioned relative to the display 186, the loudspeaker 188 and the vibrator 192 such that a person sitting in the seat 280 may observe the display 186, hear sounds produced by the loudspeaker 188 and experience vibrations produced by the vibrator 192, in such a manner that visual, audio, vibratory and motional effects experienced by a person sitting in seat 30 shown in Figure 1 are similarly experienced by a person sitting on the seat 280. In other words, the PC multimedia system 184, display 186, motion platform 190 and vibrator 192 cooperate to act as means for displaying images including the graphical image and an image represented by the video portion of the television signal while moving a support in response to the motion data and actuating a vibration transducer in response to the vibration data.
Alternatives

Referring to Figure 9, in an alternative embodiment, the TV transmitter (74 in Figure 1) and the TV receiver (182 in Figure 1) are replaced with a composite television signal recorder 374 and a composite television signal playback unit 382 respectively. This allows the composite signal representing the environment to be recorded and played back at a later date or time. It also facilitates transfer of the composite signal to editing equipment (not shown) which may be used to edit, alter or add to the video, sound, motion data or position data as desired. The recorder 374 thus acts as means for recording said composite signal.
While specific embodiments of the invention have been described and illustrated, such embodiments should be considered illustrative of the invention only and not as limiting the invention as construed in accordance with the accompanying claims.

Claims

What is claimed is:
1. A method of generating signals representative of an environment, the method comprising the steps of:
a) producing at least one environment signal indicative of a property of said environment;
b) producing at least one television signal in response to visual and audio stimuli as observed from a first location; and
c) producing a composite signal including said environment signal and said television signal.
2. A method as claimed in claim 1 further including the step of transmitting said composite signal to a remote location.
3. A method as claimed in claim 1 further including the step of recording said composite signal.
4. A method as claimed in claim 1 further including the step of producing a plurality of movement signals representing movement relative to a plurality of axes.
5. A method as claimed in claim 4 further including the step of producing acceleration signals representing acceleration along respective axes.
6. A method as claimed in claim 5 further including the step of producing acceleration signals representing acceleration along three mutually orthogonal axes.
7. A method as claimed in claim 4 further including the step of producing rotation signals representing rotation about respective axes.
8. A method as claimed in claim 7 further including the step of producing rotation signals representing rotation about three mutually orthogonal axes.
9. A method as claimed in claim 1 further including the step of producing a plurality of codes representing respective features of said environment.
10. A method as claimed in claim 9 further including the step of producing a plurality of codes representing movement relative to respective axes.
11. A method as claimed in claim 10 further including the step of producing acceleration codes representing acceleration of a sensor along at least one axis.
12. A method as claimed in claim 11 further including the step of producing rotation codes representing rotation of said sensor about at least one axis.
13. A method as claimed in claim 10 further including the steps of:
a) producing acceleration codes representing acceleration of a sensor along three mutually orthogonal axes respectively; and
b) producing rotation codes representing rotation of said sensor about said three mutually orthogonal axes respectively.
14. A method as claimed in claim 13 further including the step of producing vibration codes representing vibration at said first location.
15. A method as claimed in claim 14 further including the step of encoding said television signal with said acceleration, rotation and vibration codes.
16. A method as claimed in claim 15 further including the step of encoding said acceleration, rotation and vibration codes in a vertical blanking interval of said television signal.
17. A method as claimed in claim 1 further including the step of generating position data indicative of the geographical position of said first location and producing said composite signal such that it includes said position data.
18. A method as claimed in claim 17 further including the step of producing a vibration code representing vibration at said first location and producing said composite signal such that it includes said vibration code.
19. An apparatus for generating signals representative of an environment, the apparatus comprising:
a) at least one measurement device for producing at least one environment signal indicative of a property of said environment;
b) a television camera for producing at least one television signal in response to visual and audio stimuli as observed from a first location; and
c) an inserter for producing a composite signal including said environment signal and said television signal.
20. An apparatus as claimed in claim 19 further including a transmitter for transmitting said composite signal to a remote location.
21. An apparatus as claimed in claim 19 further including a recorder for recording said composite signal.
22. An apparatus as claimed in claim 19 further including a data acquisition system for producing a plurality of movement signals representing movement relative to a plurality of axes.
23. An apparatus as claimed in claim 22 wherein said data acquisition system produces acceleration signals representing acceleration along respective axes.
24. An apparatus as claimed in claim 23 wherein said data acquisition system produces acceleration signals representing acceleration along three mutually orthogonal axes.
25. An apparatus as claimed in claim 22 wherein said data acquisition system produces rotation signals representing rotation about respective axes.
26. An apparatus as claimed in claim 25 wherein said data acquisition system produces rotation signals representing rotation about three mutually orthogonal axes.
27. An apparatus as claimed in claim 22 wherein said data acquisition system produces a plurality of codes representing respective features of said environment.
28. An apparatus as claimed in claim 27 wherein said data acquisition system produces a plurality of codes representing movement relative to respective axes.
29. An apparatus as claimed in claim 28 wherein said data acquisition system produces acceleration codes representing acceleration of a sensor along at least one axis.
30. An apparatus as claimed in claim 29 wherein said data acquisition system produces rotation codes representing rotation of said sensor about at least one axis.
31. An apparatus as claimed in claim 28 wherein said data acquisition system produces acceleration codes representing acceleration of a sensor along three mutually orthogonal axes respectively and rotation codes representing rotation of said sensor about said three mutually orthogonal axes respectively.
32. An apparatus as claimed in claim 31 wherein said data acquisition system produces vibration codes representing vibration at said first location.
33. An apparatus as claimed in claim 32 wherein said inserter encodes said television signal with said acceleration, rotation and vibration codes.
34. An apparatus as claimed in claim 33 wherein said inserter encodes said television signal with said acceleration, rotation and vibration codes in a vertical blanking interval of said television signal.
35. An apparatus as claimed in claim 19 further including a position detection system for generating position data indicative of the geographical position of said first location and wherein said inserter produces said composite signal such that it includes said position data.
36. An apparatus as claimed in claim 35 further including a data acquisition system for producing a vibration code representing vibration at said first location and wherein said inserter produces said composite signal such that it includes said vibration code.
37. An apparatus for generating signals representative of an environment, the apparatus comprising:
a) means for producing at least one environment signal indicative of a property of said environment;
b) means for producing at least one television signal in response to visual and audio stimuli as observed from a first location; and
c) means for producing a composite signal including said environment signal and said television signal.
38. An apparatus as claimed in claim 37 further including means for transmitting said composite signal to a remote location.
39. An apparatus as claimed in claim 37 further including means for recording said composite signal.
40. An apparatus as claimed in claim 37 further including means for producing a plurality of movement signals representing movement relative to a plurality of axes.
41. An apparatus as claimed in claim 40 further including means for producing acceleration signals representing acceleration along respective axes.
42. An apparatus as claimed in claim 41 further including means for producing acceleration signals representing acceleration along three mutually orthogonal axes.
43. An apparatus as claimed in claim 40 further including means for producing rotation signals representing rotation about respective axes.
44. An apparatus as claimed in claim 43 further including means for producing rotation signals representing rotation about three mutually orthogonal axes.
45. An apparatus as claimed in claim 37 further including means for producing a plurality of codes representing respective features of said environment.
46. An apparatus as claimed in claim 45 further including means for producing a plurality of codes representing movement relative to respective axes.
47. An apparatus as claimed in claim 46 further including means for producing acceleration codes representing acceleration of a sensor along at least one axis.
48. An apparatus as claimed in claim 47 further including means for producing rotation codes representing rotation of said sensor about at least one axis.
49. An apparatus as claimed in claim 46 further including:
a) means for producing acceleration codes representing acceleration of a sensor along three mutually orthogonal axes respectively; and
b) means for producing rotation codes representing rotation of said sensor about said three mutually orthogonal axes respectively.
50. An apparatus as claimed in claim 49 further including means for producing vibration codes representing vibration at said first location.
51. An apparatus as claimed in claim 50 further including means for encoding said television signal with said acceleration, rotation and vibration codes.
52. An apparatus as claimed in claim 51 further including means for encoding said acceleration, rotation and vibration codes in a vertical blanking interval of said television signal.
53. An apparatus as claimed in claim 37 further including means for generating position data indicative of the geographical position of said first location and producing said composite signal such that it includes said position data.
54. An apparatus as claimed in claim 53 further including means for producing a vibration code representing vibration at said first location and producing said composite signal such that it includes said vibration code.
55. A method of simulating an environment, the method comprising the steps of:
a) receiving a composite signal including an environment signal and a television signal representing said environment;
b) decomposing said composite signal into said television signal and said environment signal;
c) producing images and sounds perceptible at a second location in response to said television signal; and
d) operating a transducer at said second location in response to said environment signal.
56. A method as claimed in claim 55 further including the step of extracting position information from said environment signal.
57. A method as claimed in claim 56 further including the step of generating a graphical image in response to said position information.
58. A method as claimed in claim 57 further including the step of combining said graphical image with a video portion of said television signal.
59. A method as claimed in claim 58 further including the step of displaying images including said graphical image and an image represented by said video portion of said television signal.
60. A method as claimed in claim 55 further including the step of extracting motion data from said environment signal and moving a support in response to said motion data.
61. A method as claimed in claim 55 further including the step of extracting vibration data from said environment signal and actuating a vibration transducer in response to said vibration data.
62. A method as claimed in claim 55 further including the steps of:
a) extracting position information from said environment signal;
b) generating a graphical image in response to said position information;
c) combining said graphical image with a video portion of said television signal;
d) extracting motion data from said environment signal;
e) extracting vibration data from said environment signal; and
f) displaying images including said graphical image and an image represented by said video portion of said television signal while moving a support in response to said motion data and actuating a vibration transducer in response to said vibration data.
63. An apparatus for simulating an environment, the apparatus comprising:
a) a receiver for receiving a composite signal including an environment signal and a television signal representing said environment;
b) a television data decoder for decomposing said composite signal into said television signal and said environment signal;
c) transducers for producing images and sounds perceptible at a second location in response to said television signal; and
d) a transducer at said second location for controlling at least one feature of said environment in response to said environment signal.
64. An apparatus as claimed in claim 63 further including a processor for extracting position information from said environment signal.
65. An apparatus as claimed in claim 64 wherein said processor is programmed to generate a graphical image in response to said position information.
66. An apparatus as claimed in claim 65 wherein said processor is programmed to combine said graphical image with a video portion of said television signal.
67. An apparatus as claimed in claim 66 further including a display for displaying images including said graphical image and an image represented by said video portion of said television signal.
68. An apparatus as claimed in claim 63 further including:
a) a processor for extracting motion data from said environment signal; and
b) a support operable to move in response to said motion data.
69. An apparatus as claimed in claim 63 further including a processor for extracting vibration data from said environment signal and a vibration transducer actuated in response to said vibration data.
70. An apparatus as claimed in claim 63 further including a processor for:
a) extracting position information from said environment signal;
b) generating a graphical image in response to said position information;
c) combining said graphical image with a video portion of said television signal;
d) extracting motion data from said environment signal;
e) extracting vibration data from said environment signal; and
f) displaying images including said graphical image and an image represented by said video portion of said television signal while moving a support in response to said motion data and actuating a vibration transducer in response to said vibration data.
71. An apparatus for simulating an environment, the apparatus comprising:
a) means for receiving a composite signal including an environment signal and a television signal representing said environment;
b) means for decomposing said composite signal into said television signal and said environment signal;
c) means for producing images and sounds perceptible at a second location in response to said television signal; and
d) means for operating a transducer at said second location in response to said environment signal.
72. An apparatus as claimed in claim 71 further including means for extracting position information from said environment signal.
73. An apparatus as claimed in claim 72 further including means for generating a graphical image in response to said position information.
74. An apparatus as claimed in claim 73 further including means for combining said graphical image with a video portion of said television signal.
75. An apparatus as claimed in claim 74 further including means for displaying images including said graphical image and an image represented by said video portion of said television signal.
76. An apparatus as claimed in claim 71 further including means for extracting motion data from said environment signal and moving a support in response to said motion data.
77. An apparatus as claimed in claim 71 further including means for extracting vibration data from said environment signal and actuating a vibration transducer in response to said vibration data.
78. An apparatus as claimed in claim 71 further including:
a) means for extracting position information from said environment signal;
b) means for generating a graphical image in response to said position information;
c) means for combining said graphical image with a video portion of said television signal;
d) means for extracting motion data from said environment signal;
e) means for extracting vibration data from said environment signal; and
f) means for displaying images including said graphical image and an image represented by said video portion of said television signal while moving a support in response to said motion data and actuating a vibration transducer in response to said vibration data.
79. A method of simulating at a second location, at least one feature of an environment at a first location, the method comprising the steps of:
a) generating signals representative of said environment at said first location by:
i) producing at least one environment signal indicative of a property of said environment;
ii) producing at least one television signal in response to visual and audio stimuli as observed from said first location; and
iii) producing a composite signal including said environment signal and said television signal; and
b) at said second location:
i) receiving said composite signal;
ii) decomposing said composite signal into said television signal and said environment signal;
iii) producing images and sounds in response to said television signal; and
iv) operating a transducer for controlling said property of said environment in response to said environment signal.
80. An apparatus for simulating at a second location, at least one feature of an environment at a first location, the apparatus comprising:
a) a signal generator for generating signals representative of said environment at said first location, said signal generator including:
i) at least one measurement device for producing at least one environment signal indicative of a property of said environment;
ii) a television camera for producing at least one television signal in response to visual and audio stimuli as observed from said first location; and
iii) an inserter for producing a composite signal including said environment signal and said television signal; and
b) an environment simulator at said second location, said environment simulator including:
i) a receiver for receiving said composite signal including said environment signal and said television signal representing said environment;
ii) a television data decoder for decomposing said composite signal into said television signal and said environment signal;
iii) transducers for producing images and sounds in response to said television signal; and
iv) a transducer for controlling at least one feature of said environment in response to said environment signal.
81. An apparatus for simulating at a second location, at least one feature of an environment at a first location, the apparatus comprising:
a) means for generating signals representative of said environment, said means for generating signals including:
i) means for producing at least one environment signal indicative of a property of said environment;
ii) means for producing at least one television signal in response to visual and audio stimuli as observed from said first location; and
iii) means for producing a composite signal including said environment signal and said television signal; and
b) an environment simulator at said second location, said environment simulator including:
i) means for receiving said composite signal;
ii) means for decomposing said composite signal into said television signal and said environment signal;
iii) means for producing images and sounds in response to said television signal; and
iv) means for operating a transducer for controlling said property of said environment in response to said environment signal.
PCT/CA1999/000411 1998-05-05 1999-05-05 Environment simulation apparatus and method WO1999057896A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU38046/99A AU3804699A (en) 1998-05-05 1999-05-05 Environment simulation apparatus and method
CA002330635A CA2330635A1 (en) 1998-05-05 1999-05-05 Environment simulation apparatus and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US7264998A 1998-05-05 1998-05-05
US09/072,649 1998-05-05

Publications (1)

Publication Number Publication Date
WO1999057896A1 true WO1999057896A1 (en) 1999-11-11

Family

ID=22108948

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA1999/000411 WO1999057896A1 (en) 1998-05-05 1999-05-05 Environment simulation apparatus and method

Country Status (3)

Country Link
AU (1) AU3804699A (en)
CA (1) CA2330635A1 (en)
WO (1) WO1999057896A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4814896A (en) * 1987-03-06 1989-03-21 Heitzman Edward F Real time video data acquistion systems
EP0696022A1 (en) * 1993-04-20 1996-02-07 Kabushiki Kaisha Ace Denken Driving simulation system


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1585109A1 (en) * 2003-01-17 2005-10-12 Sony Corporation Information transmission method and device, information recording or reproduction method and device, and recording medium
EP1585109A4 (en) * 2003-01-17 2009-04-29 Sony Corp Information transmission method and device, information recording or reproduction method and device, and recording medium
US8094588B2 (en) 2003-01-17 2012-01-10 Sony Corporation Information transmission method, information transmission apparatus, information recording or reproducing method, information recording or reproducing apparatus and recording medium
US9185440B2 (en) 2003-01-17 2015-11-10 Sony Corporation Information transmission method and device, information recording or reproduction method and device, and recording medium
EP1587100A1 (en) * 2003-01-21 2005-10-19 Sony Corporation Data recording medium, recording method and recorder, reproducing method and reproducer, and data transmitting method and transmitter
EP1587100A4 (en) * 2003-01-21 2011-04-06 Sony Corp Data recording medium, recording method and recorder, reproducing method and reproducer, and data transmitting method and transmitter

Also Published As

Publication number Publication date
CA2330635A1 (en) 1999-11-11
AU3804699A (en) 1999-11-23


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2330635

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: KR

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase