WO2006064817A1 - Operation simulator for railway - Google Patents


Info

Publication number
WO2006064817A1
Authority
WO
WIPO (PCT)
Prior art keywords
train
vehicle
eye
image
virtual
Prior art date
Application number
PCT/JP2005/022896
Other languages
French (fr)
Japanese (ja)
Inventor
Hitoshi Tsunashima
Takashi Kojima
Original Assignee
Nihon University
Priority date
Filing date
Publication date
Application filed by Nihon University filed Critical Nihon University
Priority to JP2006548862A priority Critical patent/JPWO2006064817A1/en
Publication of WO2006064817A1 publication Critical patent/WO2006064817A1/en

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00: Simulators for teaching or training purposes
    • G09B9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles

Definitions

  • The present invention relates to a railway driving simulator, and more particularly to a railway driving simulator used for training in railway driving operation and for research on driver error.
  • The block system prevents trains from colliding by dividing the track into block sections: a section occupied by one train is closed, and other trains are prohibited from entering it.
  • A traffic signal is used to indicate whether a train is present in the block section ahead and whether entry is permitted. Signals are therefore the basis of safe train operation.
  • ATS: automatic train stop device
  • ATC: automatic train control device
  • The EB device detects an abnormal condition of the operating crew and stops the train. For example, if it detects that a crew member has performed no driving operation on the master controller or brake for one minute, an alarm buzzer sounds; if no driving or reset operation is performed within five seconds, the train is stopped.
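The two-stage timing logic described above (one minute without any operation, then a five-second reset window) can be sketched as follows. This is a minimal illustration; the class and method names, the state representation, and the use of Python are not from the patent.

```python
class EBDevice:
    """Sketch of the EB device's decision logic (hypothetical API)."""
    NO_OPERATION_LIMIT = 60.0   # seconds without any driving operation
    RESET_WINDOW = 5.0          # seconds to respond after the buzzer sounds

    def __init__(self):
        self.last_operation = 0.0
        self.buzzer_at = None    # time the alarm buzzer started, or None

    def on_operation(self, now):
        """Any master-controller, brake, or reset operation cancels the alarm."""
        self.last_operation = now
        self.buzzer_at = None

    def update(self, now):
        """Returns 'ok', 'buzzer', or 'emergency_brake'."""
        if self.buzzer_at is not None:
            if now - self.buzzer_at >= self.RESET_WINDOW:
                return "emergency_brake"
            return "buzzer"
        if now - self.last_operation >= self.NO_OPERATION_LIMIT:
            self.buzzer_at = now
            return "buzzer"
        return "ok"
```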
  • A railway driving simulator allows a railway to be operated virtually: when a driving operation is performed with the vehicle's controller, the virtual vehicle runs in accordance with that operation. In general, the field of view visible to the driver is projected onto a screen.
  • In one known simulator, the route images between each pair of stations, from the station where the driving lesson starts to the station where it ends, and the route images near each intermediate stop are stored in advance, and a route image corresponding to the vehicle's travel is projected by switching between and reproducing the inter-station route images and the images near each stop station according to the driving operation (see, for example, Patent Document 1).
  • A railway driving simulator has also been proposed in which the forward route image is displayed in front of the cab and a side image of the vicinity of the stop station is displayed at the side of the cab (see, for example, Patent Document 2).
  • Such driving simulators are used not only in the railway field but also in the automobile field. A driving simulator for safe-driving guidance has been disclosed in which the traveling speed in the image changes with the steering operation and the degree of depression of the accelerator pedal, and which checks whether the driver keeps to the road and observes the specified speed (see, for example, Patent Document 3), as has a simulation device that, when designing a vehicle such as an automobile, evaluates habitability and maneuverability based on the sensibility of the driver and passengers (see, for example, Patent Document 4).
  • In the ITS (Intelligent Transport System) field, driving simulators are used to evaluate driving support systems and to analyze the driving behavior of elderly drivers.
  • In the aircraft field, simulators are used to evaluate the stability and maneuverability of aircraft under development and to study piloting methods.
  • Patent Document 1: JP-A-10-161516
  • Patent Document 2: JP-A-11-1237830
  • Patent Document 3: Japanese Patent No. 2592343
  • Patent Document 4: JP-A-7-271289
  • The object of the present invention is to provide a low-cost railway driving simulator that can give the driver the same sense of presence and tension as actual driving, and whose driving environment can be changed flexibly.
  • The railway driving simulator of the present invention that achieves the above object displays a 3D landscape on a screen in front of the driver's seat according to the operation of operating means by a driver wearing polarized glasses. It comprises: train control means for obtaining the traveling speed of the virtual train according to the operation of the operating means, and for determining the travel position of the virtual train from the obtained travel speed and the elapsed time; storage means storing data relating to the track on which the virtual train travels and the tangible objects disposed along the track; 3D image generating means for generating the scenes visible from the left eye and the right eye of the virtual driver according to the travel speed and travel position obtained by the train control means; and 3D image display means for projecting each scene generated by the 3D image generating means onto the screen through polarizing plates whose polarization axes are orthogonal to each other.
  • the train control means includes the 3D image display means.
  • Since the train control means obtains the travel speed and travel position of the train, the 3D image generation means generates the landscape that would be visible to the driver of an actual train, and the 3D image display means projects it three-dimensionally onto the screen in front of the driver, a driver wearing polarized glasses can experience the same realism and tension as in actual driving.
  • If a master controller that changes the traveling speed of the virtual train and a brake valve that applies pressure to the brakes of the virtual train are provided as the operating means, an instrument panel that displays the traveling speed and pressure is provided as the display means, and the train control means displays on the instrument panel the running speed changed by the master controller and the pressure applied by the brake valve, the sense of presence and tension can be further enhanced.
  • The train control means may include: a memory storing data on the vehicles constituting the virtual train and data including the positions of the traffic lights and platforms arranged along the track; a calculation unit that calculates the traveling speed and travel position and that uses the data stored in the memory to obtain the displacement, including sway, of the vehicles constituting the virtual train at each position on the track; a signal control unit that controls the display of the traffic lights based on the travel speed and travel position obtained by the calculation unit; and a door opening/closing control unit that controls the opening and closing of the doors of the virtual train.
  • If the 3D image generation means is provided with a left-eye 3D image generation unit that moves the scenery seen from the left eye and a right-eye 3D image generation unit that moves the scenery seen from the right eye, based on the displacement obtained by the calculation unit, then a vehicle swaying as passengers get on and off can be reproduced relatively, by moving the landscape projected on the screen, further enhancing the sense of presence and tension.
  • The train control means may further include a sound generation unit that, based on the data on running sounds and alarm sounds stored in the memory, generates sound, including the running sound produced as the virtual train travels and various alarm sounds, and outputs it to a predetermined speaker; the sound generation unit may also generate a brake sound, in conjunction with the instrument panel display, when the brake valve is operated.
  • Command means may be provided for selecting a simulation environment, including the travel route of the virtual train and the types of vehicles constituting the virtual train, and for instructing the train control means and the 3D image generation means accordingly.
  • According to the present invention, the speed and pressure that change with the driver's operation are displayed on the instrument panel, while a three-dimensional landscape, like that seen from a swaying vehicle, unfolds in front of the driver's seat for the driver wearing polarized glasses, according to changes in travel speed and travel position. The driver can therefore be given the same sense of presence and tension as when driving an actual vehicle.
  • In addition, simulation environments such as the travel route and the types of trains to be operated can be set flexibly by menu selection, program changes, and so on, so a low-cost railway operation simulator can be realized.
  • FIG. 1 is a configuration diagram of a railway driving simulator to which an embodiment of the present invention is applied.
  • FIG. 2 is a diagram showing, as an example, the real space in which the driver views the screen and the virtual space in which the virtual camera corresponding to the eye, within the PC, views through the virtual plane.
  • FIG. 3 is a diagram showing, as an example, the relationship between the 3D image generation processing timing of the left-view PC and right-view PC and the processing timing of the vehicle PC.
  • FIG. 5 is a plan structural view showing a frame constituting an image representing a forward landscape.
  • FIG. 8 is a diagram showing an example of a vehicle model.
  • FIG. 9 is a diagram showing an example of a vehicle model.
  • FIG. 11 is a diagram showing an example of the track irregularity calculated in units of blocks.
  • FIG. 13 is a diagram showing the brake sound.
  • The railway driving simulator provides a simulated driver's seat of a railway vehicle. When a driver sitting in the seat operates the operating means, the forward landscape and the display on the display means change according to the operation, as if an actual vehicle were being driven, giving the driver the feeling of driving a real vehicle.
  • FIG. 1 is a configuration diagram of a railway driving simulator to which an embodiment of the present invention is applied.
  • The railway driving simulator shown in FIG. 1 comprises: a vehicle 2 with a driver's seat; a speaker 1 that outputs the sounds generated as the train runs; a pair of visual-field computers for the left eye and right eye that generate the forward landscape visible from the driver's seat of the vehicle 2 (corresponding to the 3D image generating means of the present invention; hereinafter "left-view PC" and "right-view PC") 4A, 4B; a left-eye projector and a right-eye projector that project the forward landscapes generated by the left-view PC and right-view PC 4A, 4B (corresponding to the 3D image display means of the present invention; hereinafter "left projector" and "right projector") 3A, 3B; a screen 6 arranged in front of the driver's seat, onto which the left projector and right projector 3A, 3B project the forward landscapes superimposed on one another; and a command personal computer (hereinafter "command PC") 7 that sets the simulation environment by selecting the travel route of the virtual train and the types of vehicles constituting the virtual train.
  • The vehicle 2 has a controller 22 (corresponding to the operating means of the present invention) for operating the virtual train, an instrument panel 21 (corresponding to the display means of the present invention) that displays the traveling speed of the virtual train and the pressure of the brake valve according to the operation of the controller 22, and a vehicle-control personal computer (hereinafter "vehicle PC") 20.
  • the controller 22 includes a master controller 22A that changes the traveling speed of the virtual train, and a brake valve 22B that brakes the virtual train.
  • The vehicle PC 20, the left-view PC 4A, and the right-view PC 4B are connected by a LAN (Local Area Network) and communicate using TCP (Transmission Control Protocol) and UDP (User Datagram Protocol).
  • the vehicle PC 20 controls the traveling of the vehicle 2 by a CPU (central processing unit).
  • The CPU performs the acceleration/deceleration and braking processing commanded by the controller 22 in accordance with a program stored in the memory unit. For example, when the driver accelerates with the master controller 22A or brakes with the brake valve 22B, the CPU of the vehicle PC 20 calculates the traveling speed and pressure and displays the results on the instrument panel 21.
  • The vehicle PC 20 obtains the travel position and travel speed of the virtual train at predetermined time intervals based on the data stored in the memory unit, and calculates the displacement of the vehicle accompanying vehicle sway and track curves based on a vehicle model and the like. The calculated travel position, travel speed, and vehicle displacement are supplied to the left-view PC 4A and the right-view PC 4B.
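The periodic update of travel speed and travel position described above amounts to integrating the commanded acceleration over each tick. A minimal sketch, assuming a simple trapezoidal update (the function and parameter names are illustrative, not from the patent):

```python
def update_train_state(position_m, speed_mps, accel_mps2, dt_s):
    """One simulation tick: integrate speed and travel position.

    position_m : current travel position along the track [m]
    speed_mps  : current travel speed [m/s]
    accel_mps2 : net acceleration from notch/brake commands [m/s^2]
    dt_s       : elapsed time since the last tick [s]
    """
    # Speed cannot go negative under braking; the train simply stops.
    new_speed = max(0.0, speed_mps + accel_mps2 * dt_s)
    # Trapezoidal update of the travel position from the mean speed.
    new_position = position_m + 0.5 * (speed_mps + new_speed) * dt_s
    return new_position, new_speed
```

The returned position and speed would be what the vehicle PC supplies to the two view PCs each cycle.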
  • The vehicle PC 20 stores in its memory unit the running sound of the train, the warning sounds of the traffic lights arranged along the track, and the brake sound, recorded from an actual train. The sound generation unit (shown in FIG. 15) plays these back with the volume and frequency changed as appropriate, and outputs the generated sound from the speaker 1.
  • The vehicle PC 20 obtains the travel position and travel speed at regular distance intervals or at regular time intervals, and supplies them to both the left-view PC 4A and the right-view PC 4B.
  • The CPUs of the left-view PC 4A and right-view PC 4B execute processing to generate the forward scenery visible from the left eye and the right eye of the driver sitting in the driver's seat.
  • Each CPU, following an image generation program stored in a memory unit (corresponding to the storage means of the present invention), forms a three-dimensional CG (computer graphics) model of the track on which the virtual train runs and of the traffic lights, platforms, buildings, and so on arranged along it, and generates a 2D image by rendering the stereoscopic model with the driver's left eye or right eye as the viewpoint.
  • By generating 2D images with rendering that moves the viewpoints of the driver's left and right eyes, the forward view appears to move, and the driver of the virtual train can experience the same vibrations and curves as on an actual train.
  • In the present embodiment, the displacement of the vehicle, including vehicle sway, is calculated using a vehicle model. The vehicle model calculates the vibration when the vehicle 2 travels or when passengers get on and off, taking into account track irregularity, the irregular forces applied when passengers board and alight, the inertial force of the vehicle, gravity, and so on. The details will be described later.
  • The CPUs of the left-view PC 4A and right-view PC 4B form a 3D model of the track and of the traffic lights, platforms, buildings, and distant landscape arranged along it, within the view frustum between the near clip plane and the far clip plane of a perspective projection, and generate each forward landscape by rendering with either the driver's left eye or right eye as the viewpoint. This stereoscopic model is generated in common by one of the left-view PC 4A and right-view PC 4B, and both CPUs render based on the generated common model.
  • Each of the left projector 3A and right projector 3B is provided with a polarizing plate, the polarization axes of the two plates being orthogonal, and the left-eye landscape and right-eye landscape are projected through these plates. The driver sitting in the driver's seat wears polarized glasses whose left lens has the same polarization axis as the plate attached to the left projector 3A and whose right lens has the same polarization axis as the plate attached to the right projector 3B. By viewing the CG landscape projected on the screen through these glasses, the driver sees a 3D image equivalent to the view from the driver's seat of an actual train.
  • Since the CG landscape projected on the screen moves (sways) according to the calculated displacement of the vehicle, the driver feels the vehicle sway on curves, while running, and when passengers get on and off. In addition, since running sounds and warning sounds are emitted from the speakers as the virtual train travels, the driver can obtain a sense of realism and tension similar to that of driving an actual train.
  • In the present embodiment, the 3D image generation means ("left-view PC" and "right-view PC") and the 3D image display means ("left projector" and "right projector") are configured as individual units, but the left and right units need not necessarily be separate. The screen and the 3D image display means may also be integrated, for example as a head-mounted display. Furthermore, the 3D image display means need not necessarily generate images by computer graphics; it may instead accumulate image data of actual travel routes and perform image-based rendering (IBR) from that bit stream.
  • FIG. 2 is a diagram showing, as an example, the real space in which the driver views the screen and the virtual space in which the virtual camera corresponding to the eye, within the PC, views through the virtual plane.
  • As shown in FIG. 2, a virtual space (corresponding to a view frustum) within a certain range, viewed by a virtual camera corresponding to the human eye through a virtual plane, is projected onto the screen in front of the driver. The relationship between this virtual space and the virtual camera must therefore be made the same as the relationship between the driver and the screen in real space.
  • The interpupillary distance of humans is 63 mm on average, that is, 31.5 mm each to the left and right of center. The left-view PC 4A and right-view PC 4B therefore generate the left-eye image and right-eye image by shifting the left and right virtual cameras 4a and 4b accordingly. The width of the virtual planes 4x and 4y is set to 800 mm. The left-eye virtual camera 4a of the left-view PC 4A is shifted 31.5 mm to the left of the center of the virtual plane 4x, and the right-eye virtual camera 4b of the right-view PC 4B is shifted 31.5 mm to the right of the center of the virtual plane 4y; each generates the landscape within the range visible through the 800 mm virtual planes 4x, 4y located 1000 mm ahead of the virtual cameras 4a and 4b. If each generated landscape is projected from its projector, the driver can view the projected image stereoscopically and accurately.
  • The projectors of the present embodiment are arranged, with height adjustment, at a position where the driver's shadow is not cast on the screen 6. For this reason the distances from the left projector 3A and right projector 3B to the top and bottom of the screen 6 differ, so the projectors are provided with a keystone correction function so that the landscape projected on the screen 6 is not deformed into a trapezoid.
  • FIG. 3 is a diagram showing, as an example, the relationship between the 3D image generation processing timing of the left-view PC and right-view PC and the processing timing of the vehicle PC.
  • The images generated by the left-view PC 4A and right-view PC 4B and projected onto the screen through the left projector 3A and right projector 3B must be equivalent to images captured simultaneously by virtual cameras shifted by the interpupillary distance. The landscapes generated by the left-view PC 4A and right-view PC 4B and projected from the left projector 3A and right projector 3B therefore need to be synchronized frame by frame, and it is preferable to process each frame in about 1/30 to 1/60 second.
  • the interval at which the generated image of one frame is projected on the screen is defined as one period T.
  • The vehicle PC 20 calculates, by the vehicle model, the displacement of the vehicle due to vehicle sway or track curves, along with the travel speed and pressure, and supplies them simultaneously to the left-view PC 4A and the right-view PC 4B. The calculated running speed and pressure are displayed on the instrument panel 21.
  • Based on the supplied vehicle displacement and travel speed, the left-view PC 4A calculates the coordinates of its virtual camera (viewpoint) with respect to the track, the tangible objects along it, and the displacement due to vehicle sway or track curves, renders the 3D model, notifies the vehicle PC 20 by timing B1, and temporarily stores the generated frame in its buffer memory. The right-view PC 4B likewise calculates the coordinates of its virtual camera (viewpoint), renders the 3D model, notifies the vehicle PC 20 by timing B2, and temporarily stores the generated frame in its buffer memory.
  • Having confirmed that notifications have been received from both the left-view PC 4A and the right-view PC 4B, the vehicle PC 20 has each image temporarily stored in buffer memory made visible at timing C, when one period T has elapsed, and the visualized images are output to the left projector and right projector. The left projector and right projector then simultaneously project the superimposed 2D parallax images onto the screen.
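The wait-for-both-notifications step above is a classic rendezvous: projection may only proceed once both view PCs have buffered their frame. A minimal sketch using threads and a barrier in place of the LAN notifications B1/B2 (all names are illustrative; the real system uses separate machines):

```python
import threading

# Barrier standing in for the notifications B1/B2: timing C is only
# reached once BOTH view PCs have finished rendering their frame.
render_done = threading.Barrier(2)
frames = {}

def render_view(eye, state):
    """Placeholder for one view PC's render of a single frame."""
    frames[eye] = f"frame({eye}, pos={state['pos']})"
    render_done.wait()          # corresponds to notification B1 / B2

def run_cycle(state):
    """One period T: render both eyes, then 'project' them together."""
    threads = [threading.Thread(target=render_view, args=(e, state))
               for e in ("left", "right")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()                # timing C: both buffers are ready
    return frames["left"], frames["right"]  # projected simultaneously
```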
  • FIG. 4 is a cross-sectional structural view showing the frames constituting an image representing the forward scenery, and FIG. 5 is a plan structural view showing the frames constituting the image representing the forward scenery.
  • As shown in FIGS. 4 and 5, an image generated by the left-view PC and right-view PC has a camera frame 100 representing the view seen from the virtual camera through the virtual plane. A train frame 101 indicating the position of the virtual train is formed on the camera frame 100, and a base frame 102 indicating the ground is formed on the train frame 101. On the base frame 102, block frames 103 are formed, in which unit blocks of the forward scenery are sequentially replenished as the virtual train advances; on these, the tracks of the own line and other lines, and the traffic lights, platforms, trees, buildings, and other structures (tangible objects) are formed.
  • The landscape structure within the range entering the view frustum, as seen from the virtual camera through the virtual plane, is replenished block by block in the traveling direction of the virtual train. In other words, every time the virtual train passes one block, one block is added.
  • The virtual camera moves in accordance with the traveling speed of the virtual train, and the left-view PC and right-view PC each generate the image seen through the virtual plane from the left or right viewpoint of the train moving through the three-dimensional model formed by the blocks contained in the view frustum. The scenery visible from the driver's seat of the virtual train thus moves backwards in sequence according to the traveling speed, so that to the driver the virtual train appears to be running.
  • FIG. 6 is a flowchart of the image generation performed in the left-view PC and the right-view PC.
  • When image generation processing is started in step S30, device generation and attribute setting are performed by the programs stored in the left-view PC and right-view PC in step S31. In step S32, information such as the length of the body of the virtual train, the positions of the bogies, and the driver's viewpoint position is received from the memory unit of the vehicle PC.
  • In step S33, the structures constituting the forward scenery, such as tracks, traffic lights, platforms, buildings, and trees, are read from the memory units of the left-view PC and right-view PC. In step S34, track information is read, including the curves, gradients, roadbed, track irregularity level, and friction coefficient of the own line; the positions and roadbeds of other lines; signalling equipment such as traffic lights and ground units; station stop positions and boarding rates; the timetable of the preceding train; and the placement positions and placement methods of structures.
  • Based on the vehicle model, the vehicle PC calculates the displacement of the vehicle due to vehicle sway, namely lateral displacement y, vertical displacement z, and the roll, pitch, and yaw angles, and due to track curves, and also calculates the travel distance of the train starting from the first station. The displacement due to vehicle sway or track curves, combined with the travel distance of the train, is referred to here as the vehicle displacement.
  • In step S35, each of the left-view PC and right-view PC receives the vehicle displacement information from the vehicle PC, and in step S36 calculates the position and orientation of the vehicle relative to the ground, based on the train's travel distance in the received vehicle displacement information.
  • In step S37, it is determined from the calculated position and orientation of the vehicle whether the vehicle has crossed a block boundary; here, a block boundary occurs every 25 m. If the vehicle has crossed a boundary, a new block is arranged in step S38 based on the own-line data stored in the memory unit, and structures are arranged on the new block according to the placement-method information: the position and orientation of the block frame are calculated from the own-line data and arranged on the base frame; the position and orientation of the other-line frame are calculated from the other-line data and arranged in the block frame; and the structures making up the forward landscape, such as tracks, traffic lights, platforms, buildings, and trees, are placed on the appropriate frames.
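The block-boundary test in step S37 reduces to checking whether the train's travel position has entered a new 25 m interval since the last frame. A minimal sketch (function name and return convention are illustrative):

```python
BLOCK_LENGTH = 25.0  # a block boundary occurs every 25 m (step S37)

def blocks_to_add(prev_position_m, new_position_m):
    """Indices of the 25 m blocks whose boundary the train has crossed
    between two frames; step S38 would arrange one new block frame,
    with its structures, for each returned index."""
    prev_block = int(prev_position_m // BLOCK_LENGTH)
    new_block = int(new_position_m // BLOCK_LENGTH)
    return list(range(prev_block + 1, new_block + 1))
```

Returning a list (rather than a single flag) also covers the case where a fast train crosses more than one boundary in a single update interval.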
  • In step S39, the vehicle is moved according to its position and orientation relative to the ground. The base frame is moved by calculating its position and orientation relative to the train frame from the calculated position and orientation of the vehicle; this changes the position and direction of the train relative to the ground. The train frame is moved by calculating its position and orientation relative to the camera frame from the vehicle displacement due to sway received from the vehicle PC; this makes the vehicle sway relative to the track.
  • A completion notification is then sent to the vehicle PC in step S41, and the generated frame is temporarily stored in the buffer memory. When the vehicle PC has received completion notifications from both the left-view PC and the right-view PC in step S42, it orders each 2D image temporarily stored in buffer memory to be output to the projectors at the predetermined timing. The left-view PC and right-view PC then output their 2D images simultaneously and project them onto the screen via the projectors. The same operation is thereafter repeated, generating images at 30 frames per second.
  • Fig. 7 is a schematic diagram for explaining the shaking of the vehicle.
  • FIG. 7(A) is a reference example in which a vibration device 27 is provided in the vehicle 2. The vibration device 27 shakes the floor of the vehicle 2, thereby reproducing the sway of the entire vehicle 2. In this case, the left-view PC and right-view PC can generate the landscape and project it onto the screen without taking the displacement due to vehicle sway into account.
  • FIG. 7(B) is a schematic diagram showing the case where the vehicle 2 sways with the ground as reference and the case where the ground sways with the floor of the vehicle 2 as reference.
  • The vehicle model consists of a numerical analysis model and parameters for calculating, from track irregularity and the like, the sway of the car body, and is incorporated in the vehicle PC 20. Using the vehicle model incorporated in its memory unit, the vehicle PC 20 of the present embodiment calculates, based on the traveling speed of the virtual train and so on, the displacement amounts in five components: lateral y, vertical z, roll, pitch, and yaw. The left-view PC 4A and right-view PC 4B move and rotate the landscape image represented by computer graphics based on the displacement amounts calculated by the vehicle PC 20, generating a "swaying landscape image" that is projected onto the screen.
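Turning the five displacement components into a camera pose can be sketched as follows. The eye-height value and the coupling of roll into lateral/vertical eye displacement are illustrative assumptions, not from the patent; the patent only states that the landscape is moved and rotated according to the displacement amounts.

```python
import math

EYE_HEIGHT = 1.2  # assumed height of the driver's eyes above the roll centre [m]

def camera_pose(disp):
    """Convert the five vehicle-displacement components into a virtual
    camera pose (positions in metres, angles in radians).

    disp holds lateral 'y', vertical 'z', and 'roll', 'pitch', 'yaw'
    from the vehicle model. Rolling tilts the car body, so eyes mounted
    EYE_HEIGHT above the roll centre are additionally shifted sideways
    by h*sin(roll) and lowered by h*(1 - cos(roll)).
    """
    roll, pitch, yaw = disp["roll"], disp["pitch"], disp["yaw"]
    x = disp["y"] + EYE_HEIGHT * math.sin(roll)          # lateral eye offset
    z = disp["z"] - EYE_HEIGHT * (1.0 - math.cos(roll))  # vertical eye offset
    return (x, z), (roll, pitch, yaw)
```

The view PCs would apply the inverse of this pose to the CG landscape, so the scenery, not the physical cab, appears to sway.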
  • FIGS. 8 and 9 are diagrams showing an example of the vehicle model: FIG. 8(A) is a vehicle model representing the entire vehicle, FIG. 8(B) shows the inputs and outputs of the vehicle model, FIG. 9(A) is a vehicle model representing the lateral plane of the vehicle, and FIG. 9(B) is a vehicle model representing the longitudinal plane of the vehicle.
  • As shown in FIG. 8(A), FIG. 9(A), and FIG. 9(B), the vehicle model is given the track irregularities calculated as the track is created, and calculates the vehicle displacement amount in five vibration components: lateral y, vertical z, roll, pitch, and yaw.
  • Vehicle sway is caused by track irregularity, cant (the raising of the outer rail on a curve), inertial force, the irregular forces applied when passengers board and alight, gravity, and so on; these can be given to the vehicle model as inputs. The sway of the vehicle 2 is reflected by moving (shaking) the generated image based on the displacement amount of the vehicle 2.
  • FIG. 10 is a diagram showing how the track irregularity is created.
  • This trajectory error can be obtained by passing a random number (white noise) through the filter of the first-order lag element, which is the same as the actual trajectory.
  • the track deviation is calculated while the vehicle 2 is traveling.
  • the trajectory error is calculated using Equations 1 to 4.
  • Fig. 11 shows an example of such a trajectory error calculated in units of blocks.
  • Equation 3 is a random number of ⁇ 1 ⁇ ⁇ 1.
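The first-order-lag filtering described above can be sketched as follows. Since Equations 1 to 4 are not reproduced in this excerpt, the discrete filter coefficient and the amplitude scale below are assumptions; only the structure (uniform white noise in (-1, 1) smoothed by a first-order lag) follows the text.

```python
import random

def track_irregularity(n, dt=0.5, tau=10.0, scale=0.003, seed=0):
    """Generate a track irregularity profile by passing uniform white
    noise in (-1, 1) through a discrete first-order lag filter:
        y[k] = a * y[k-1] + (1 - a) * w[k],   a = tau / (tau + dt).
    dt, tau, and scale are illustrative values, not the patent's."""
    a = tau / (tau + dt)
    y = 0.0
    out = []
    rng = random.Random(seed)
    for _ in range(n):
        w = rng.uniform(-1.0, 1.0)   # white noise sample in (-1, 1)
        y = a * y + (1.0 - a) * w    # first-order lag smoothing
        out.append(scale * y)        # scale to a metre-order amplitude
    return out

profile = track_irregularity(1000)
```

The lag filter removes the high-frequency content of the raw noise, so successive samples change only gradually, which is what makes the profile resemble measured track geometry.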
  • FIG. 12 is a diagram explaining the irregular loading applied to the vehicle.
  • The period from time t41 to time t43 is the shortest stop time, and the period from time t41 to time t42 is the boarding/alighting time T40.
  • Passenger boarding and alighting are simulated by defining a boarding rate and a shortest stop time for each station on the virtual train's route and using those defined values.
  • The shaking of the vehicle as passengers board and alight is simulated by applying irregular forces to one point of the vehicle 2. For example, since the number of alighting passengers (2a) in FIG. 12 is twice the number of boarding passengers (a), the magnitude of the shake given to the vehicle 2 is also doubled. To avoid the unnatural impression of the shaking stopping abruptly when the boarding/alighting time ends, a small shake may be generated until the doors close.
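A minimal sketch of this boarding/alighting excitation, following the description above: the force amplitude is proportional to the number of passengers moving, and a small residual shake continues after the boarding time until the doors close. The unit force and the residual fraction are assumed values.

```python
import random

def boarding_forces(n_board, n_alight, boarding_time, door_close_time,
                    dt=0.1, unit_force=600.0, seed=1):
    """Sample the irregular force applied at one point of the car while
    passengers board and alight (cf. FIG. 12). Twice the passengers
    gives twice the shake; after the boarding time a small residual
    force continues until the doors close so the shaking does not stop
    abruptly. unit_force and the 10% residual fraction are assumptions."""
    rng = random.Random(seed)
    amp = unit_force * (n_board + n_alight)   # amplitude ~ passenger count
    steps = int(round(door_close_time / dt))
    forces = []
    for k in range(steps):
        t = k * dt
        if t < boarding_time:
            f = amp * rng.uniform(-1.0, 1.0)          # full shake
        else:
            f = 0.1 * amp * rng.uniform(-1.0, 1.0)    # residual until doors close
        forces.append(f)
    return forces

f_small = boarding_forces(10, 10, boarding_time=20.0, door_close_time=25.0)
f_large = boarding_forces(20, 20, boarding_time=20.0, door_close_time=25.0)
```

Feeding these forces into the vehicle model gives the stationary swaying the driver perceives while the train stands at a platform.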
  • FIG. 13 shows the squeal sound of the brake: FIG. 13(A) shows how the volume changes with speed, and FIG. 13(B) shows the friction coefficient underlying the brake sound as the speed changes.
  • The vehicle PC 20 responds to operation of the controller 22 and updates the display on the instrument panel 21.
  • When the brake valve pressure and the traveling speed v are calculated from the operation of the controller 22, sound data recorded from an actual train and stored in the memory unit is played back to generate the brake sound, and the volume A and the frequency are changed according to the calculated brake valve pressure and traveling speed V.
  • The volume A increases in proportion to the brake valve pressure (BC pressure P) as shown in Equation 5, and also changes with the speed V as shown in Equation 6.
  • FIG. 13(A) is a graph showing the change according to Equation 6.
  • The frequency f varies with the BC pressure P and the brake friction coefficient μ, that is, with the braking force, as shown in Equation 7.
  • FIG. 13(B) is a diagram showing the relationship between the quantity in Equation 7 and the traveling speed.
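Since Equations 5 to 7 themselves are not reproduced in this excerpt, the following sketch only mirrors their qualitative behavior: volume A proportional to BC pressure P and varying with speed V, and frequency f varying with the braking force, i.e. with P and the friction coefficient μ. All coefficients, and the speed-dependent friction model, are assumptions.

```python
def brake_sound(bc_pressure, speed, a0=0.2, kp=0.002, v_ref=60.0,
                f0=800.0, kf=0.5):
    """Compute a playback volume A (0..1) and frequency f [Hz] for the
    recorded brake squeal from the BC pressure and the traveling speed.
    The proportional forms and every coefficient here are placeholders
    standing in for the patent's Equations 5-7."""
    mu = 0.35 / (1.0 + 0.01 * speed)   # friction falls with speed (assumed)
    # Eq. 5/6 stand-in: volume grows with P, scaled by a speed factor.
    volume = min(1.0, (a0 + kp * bc_pressure) * min(1.0, speed / v_ref))
    # Eq. 7 stand-in: pitch rises with braking force mu * P.
    freq = f0 + kf * mu * bc_pressure
    return volume, freq

v1, f1 = brake_sound(bc_pressure=200.0, speed=40.0)
v2, f2 = brake_sound(bc_pressure=400.0, speed=40.0)
```

Doubling the BC pressure at the same speed raises both the playback volume and the pitch, which is the behavior the text attributes to the squeal sound.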
  • FIG. 14 is a diagram for explaining the operation of the railway driving simulator.
  • The railway driving simulator includes the controller 22, the vehicle PC 20, the left view PC 4A and the right view PC 4B (which perform the same operation and are shown here as the view display PC 4 for convenience of explanation), the speaker 1, the left projector 3A and the right projector 3B (which likewise perform the same operation and are shown together as the projector 3), and the screen 6.
  • The vehicle PC 20 includes: a memory storing route data 200, comprising track information (curves, gradients, roadbed, track irregularity level and friction coefficient, the positions and roadbeds of other lines, signal equipment such as traffic lights and wayside units, station stop positions and boarding rates, preceding trains, and the layout positions and methods of structures) and sounds along the route (such as crossing bells, in-car announcements, and the warning sounds of traffic signals); a memory storing vehicle data 201, comprising vehicle characteristic information, instrument panel image attribute information, and vehicle sounds such as the train running sound and the door opening/closing sound; a preceding train control processing unit 202; a signal control processing unit 203 that controls signal displays and alarm sounds; a security device control processing unit 204 that handles operation by the controller 22 and safety devices such as ATS and ATC; a conductor control processing unit 205; a door control processing unit 206; a passenger control processing unit 207; a drive control processing unit 208 that calculates the traveling speed and traveling position of the virtual train according to the operation; a track control processing unit 209 that reads the track information of the route from the route data and sets parameters in the vehicle model; a vehicle model calculation processing unit 210 that calculates the displacement due to vehicle shake based on the parameters supplied from the track control processing unit 209, the traveling speed and position supplied from the drive control processing unit 208, and the passenger information supplied from the passenger control processing unit 207; and a sound generation processing unit 211 that reproduces the route sounds and vehicle sounds stored in the memory.
  • The view display PC 4 includes a memory unit (the storage unit of the present invention) storing landscape data 400, which represents tangible objects such as tracks, traffic lights, platforms, and buildings used to generate images of the scenery ahead of the driver's seat, and an image generation control processing unit 401 that generates the scenery unfolding in front of the driver based on the landscape data 400 and on the traveling speed, traveling position, and car-body displacement supplied from the vehicle PC 20.
  • When the driver performs an operation using the controller 22, a signal corresponding to the operation is input to the drive control processing unit 208 (step S1).
  • Next, the route data 200 stored in the memory is read out and supplied to the preceding train control processing unit 202, the signal control processing unit 203, the security device control processing unit 204, the conductor control processing unit 205, the passenger control processing unit 207, and the drive control processing unit 208 (step S2).
  • the vehicle data 201 stored in the memory is supplied to the security device control processing unit 204 and the drive control processing unit 208 (step S3).
  • the preceding train control processing unit 202 supplies the traveling position information of the preceding train generated based on the route data 200 to the signal control processing unit 203 (step S4).
  • the signal control processing unit 203 supplies signal information to be displayed on the traffic light to the safety device control processing unit 204 based on the supplied traveling position information of the preceding train (step S5).
  • the security device control processing unit 204 sends data related to the security function to the drive control processing unit 208 (step S6).
  • The security device control processing unit 204 sends alarm information to the sound generation processing unit 211 based on the supplied data relating to the security function (step S7).
  • The conductor control processing unit 205 generates door control information, in-car broadcast information, and buzzer information using the route data 200 read from the memory and information from the drive control processing unit 208, and sends the generated information to the door control processing unit 206 and the sound generation processing unit 211 (step S8).
  • the door control processing unit 206 sends the door opening / closing information to the passenger control processing unit 207, the sound generation processing unit 211, and the security device control processing unit 204 based on the door control information (step S9).
  • The passenger control processing unit 207 calculates passenger weight information based on the boarding rate and the shortest stop time included in the route data 200 read from the memory and sends it to the drive control processing unit 208, and sends boarding/alighting information to the vehicle model calculation processing unit 210 (step S10).
  • The drive control processing unit 208 calculates the traveling speed, traveling position, etc. of the virtual train based on the route data and vehicle data read from the memory, sends the traveling speed and traveling position to the conductor control processing unit 205 (step S11), sends the traveling position to the track control processing unit 209, and sends the acceleration and traveling position to the vehicle model calculation processing unit 210 and the sound generation processing unit 211 (step S14).
  • The track control processing unit 209 calculates a track irregularity and sends it to the vehicle model calculation processing unit 210 (step S13).
  • the vehicle model calculation processing unit 210 calculates the displacement of the spring and sends it to the sound generation processing unit 211.
  • The drive control processing unit 208 supplies the traveling position and traveling speed to the image generation processing unit 401 (step S16).
  • The vehicle model calculation processing unit 210 calculates the displacement due to vehicle shake (the car-body displacement) and sends it to the image generation processing unit 401 of the view display PC 4 (step S17).
  • the view display PC 4 reads the landscape data 400 stored in the memory unit and sends it to the image generation processing unit 401 (step S18).
  • The image generation processing unit 401 generates an image representing the "swaying landscape" based on the traveling position and traveling speed supplied from the drive control processing unit 208 and the car-body displacement supplied from the vehicle model calculation processing unit 210, and supplies each generated frame to the projector 3 in accordance with commands from the vehicle PC 20 (step S19).
  • The projector 3 projects the generated image onto the screen 6 (step S21).
  • the sound generation processing unit 211 outputs sound reproduced based on the supplied alarm information, broadcast information, buzzer information, and the like from the speaker 1 (step S20).
  • As described above, the railway driving simulator controls the running state of the virtual train according to the driver's operation, displays the view from the virtually traveling vehicle as a three-dimensional "swaying landscape" based on the traveling position, the traveling speed, and the car-body displacement due to vehicle shake, and outputs the running sound, signal warning sounds, and brake squeal in accordance with the displayed image, so that the driver obtains the sense of presence and tension of driving an actual train.
  • In addition, commands issued from the command PC select predetermined data stored in the memory unit, so the simulator environment can be set and changed flexibly.
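One full update cycle of FIG. 14 can be condensed into a short loop. Each processing unit is reduced to a placeholder function whose name mirrors the text (drive control 208, track control 209, vehicle model 210, image generation 401); the gains, the irregularity lookup, and the 20 ms cycle time are all assumptions, not values from the patent.

```python
def drive_control(notch, brake, state, dt):
    """208: integrate an acceleration derived from the controller input."""
    accel = 0.8 * notch - 1.0 * brake           # assumed gains [m/s^2]
    state["v"] = max(0.0, state["v"] + accel * dt)
    state["x"] += state["v"] * dt
    return accel

def track_control(x):
    """209: look up the track irregularity at position x (placeholder)."""
    return 0.003 * ((hash(int(x)) % 1000) / 500.0 - 1.0)

def vehicle_model(irregularity, accel):
    """210: car-body displacement from irregularity and acceleration
    (a static gain stands in for the dynamic model here)."""
    return 0.5 * irregularity

def image_generation(x, v, body_disp):
    """401: describe the frame that the projector 3 would receive."""
    return {"position": x, "speed": v, "offset": body_disp}

state = {"x": 0.0, "v": 0.0}
frames = []
for _ in range(100):                            # 100 cycles at 20 ms (assumed)
    a = drive_control(notch=1.0, brake=0.0, state=state, dt=0.02)
    disp = vehicle_model(track_control(state["x"]), a)
    frames.append(image_generation(state["x"], state["v"], disp))
```

The point of the sketch is the ordering: the controller input drives 208, whose position feeds 209, whose parameters feed 210, whose displacement (together with position and speed) feeds 401 and, in the real system, the projector and the sound generation unit 211.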


Abstract

An operation simulator for railway, in which a view is displayed in 3D on a screen in front of an operator seat in concert with operation by an operator wearing polarization glasses and in which conditions of the operation are displayed. The simulator has a train control means for obtaining traveling speed and traveling position of an imaginary train according to the operation; a 3D image creation means having a storage section in which tangible bodies arranged on a rail track and along the rail track are stored and creating, depending on the traveling speed and traveling position, individual views as seen by the left and right eyes of an imaginary operator; and a 3D image display means for projecting the created image on the screen through a polarization plate. When views as seen from the left eye and right eye are created, they are projected at the same time.

Description

Specification

Railway driving simulator

Technical field

[0001] The present invention relates to a railway driving simulator, and more particularly to a railway driving simulator used for railway driving operation training and for research on driver errors.

Background art

[0002] Conventionally, a block system has generally been adopted for railway vehicle operation from the viewpoint of ensuring railway safety. The block system prevents trains from colliding by setting block sections on the track: each section is occupied by a single train, and other trains are prohibited from entering it. Traffic signals indicate whether a train is present in a block section and whether entry is permitted. Traffic signals are therefore the basis for operating trains safely.
However, train accidents have occurred in which, because the driver dozed off during operation, a signal was overlooked and the train mistakenly entered an occupied block section and collided with another train. For this reason, backup systems have been adopted that warn the driver and at the same time apply the brakes and stop the train automatically, so that safety is ensured even when a signal is missed.

(1) ATS (automatic train stop device): when a train approaching a signal showing a stop indication does not stop normally, a control signal sent from the ground sounds an alarm bell in the cab to alert the driver, or operates the brakes automatically, stopping the train before the signal.

(2) ATC (automatic train control device): a device that automatically brakes the train based on signals and speed information from the ground and releases the brakes once the train is below the speed limit.

(3) EB device: a device that detects an abnormal condition of the operating crew and stops the train. For example, if it is detected that the crew of a motor vehicle has performed no operation on the master controller, the brakes, or the like for one minute, an alarm buzzer sounds, and if no driving or reset operation is performed within five seconds, the train is brought to an emergency stop.
[0003] However, even with these backup systems, accidents still occur: collisions with cars and people at level crossings, accidents in which track maintenance workers are struck, and trains passing stations at which they should have stopped.

[0004] For this reason, the following problems are considered to remain in the above backup systems:
(1) They do not themselves reduce the number of errors.
(2) They should operate only when an error occurs, but they may intervene even during normal operation.
(3) New errors occur if they are handled incorrectly.

[0005] Therefore, rather than relying on backup systems alone, driving operation training and error analysis are needed so that drivers do not make errors. However, since there are limits to training and experiments using actual vehicles, it is considered effective to train repeatedly and to conduct experiments using a railway driving simulator.

[0006] A railway driving simulator allows railway driving operations to be performed virtually. In general, when a driving operation is performed with the vehicle's controller, the vehicle travels virtually in accordance with that operation, and the view seen by the driver is projected onto a screen accordingly.

[0007] For example, one simulator holds route images between each pair of stations from the station where a driving lesson starts to the station where it ends, together with route images near each intermediate stop, and projects route images corresponding to the vehicle's travel by switching between the inter-station images and the near-station images in accordance with the driving operation (see, for example, Patent Document 1).

[0008] In addition, since increasing the correlation between the simulated driving operation and the route images enables highly accurate fixed-point stop training, a railway driving simulator has been proposed in which a forward route image is displayed in front of the cab and a side image of the vicinity of the stop station is displayed at the side of the cab (see, for example, Patent Document 2).

[0009] Such driving simulators are used not only in the railway field but also in the automotive field. A drive simulator for safe driving guidance has been disclosed in which the traveling speed in the image changes with the steering and accelerator operation and compliance with the road and the specified speed is checked (see, for example, Patent Document 3), as has a simulation device that evaluates the comfort and maneuverability experienced by drivers and passengers on a sensory basis when designing vehicles such as automobiles (see, for example, Patent Document 4). In the development of ITS (Intelligent Transport Systems), simulators are used to evaluate driving support systems and to analyze the driving behavior of elderly drivers. In the aviation field, they are used to evaluate the stability and controllability of aircraft under development and to study piloting methods.
Patent Document 1: JP-A-10-161516
Patent Document 2: JP-A-11-237830
Patent Document 3: Japanese Patent No. 2592343
Patent Document 4: JP-A-7-271289
Disclosure of the invention

Problems to be solved by the invention

[0010] However, conventional railway driving simulators have the drawback that when the driver looks at the scenery projected on the screen in front of the driver's seat, the sense of presence and tension of actual driving is not obtained, so it cannot be said that sufficient effectiveness is ensured for driving operation training and error analysis.

[0011] Moreover, if effectiveness is pursued by incorporating a driving environment that reproduces an actual vehicle in detail, the system configuration cannot be changed flexibly according to the train model and the like.

[0012] Furthermore, providing a motion device that shakes the driver's seat according to the running state in order to heighten realism is expensive, and such systems are difficult to introduce in small companies and facilities.

[0013] In view of the above circumstances, an object of the present invention is to provide a low-cost railway driving simulator that gives the same sense of presence and tension as actual driving and whose driving environment can be changed flexibly.
Means for solving the problem

[0014] The railway driving simulator of the present invention that achieves the above object is a railway driving simulator in which scenery is displayed in 3D on a screen in front of the driver's seat in accordance with the operation of operating means by a driver wearing polarized glasses, and in which the operation state is displayed on display means at the driver's seat. The simulator comprises: train control means that obtains the traveling speed of a virtual train according to the operation of the operating means and obtains the traveling position of the virtual train from the obtained traveling speed and the elapsed time; 3D image generation means that has a storage unit storing data on the track on which the virtual train travels and on tangible objects arranged along the track, and that generates the respective scenes seen by the left eye and the right eye of a virtual driver according to the traveling speed and traveling position obtained by the train control means; and 3D image display means that projects the generated scenes onto the screen through polarizing plates whose polarization axes are orthogonal to each other. When the scenes seen from the left eye and from the right eye have been generated by the 3D image generation means, the train control means causes the 3D image display means to project them simultaneously.
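The left-eye/right-eye generation described above can be sketched at its simplest: the two virtual cameras are placed on either side of the virtual driver's head, separated by the interpupillary distance, and each rendered view is then projected through its own polarizing plate (axes orthogonal). The 65 mm eye separation and the head position used below are assumptions, not values from the patent.

```python
EYE_SEPARATION = 0.065  # interpupillary distance [m] (assumed)

def eye_positions(head, lateral_axis):
    """Return (left_eye, right_eye) camera positions given the head
    position and the unit lateral axis of the cab, as (x, y, z) tuples.
    Each camera renders its own view, which is projected through a
    polarizer so the matching lens of the glasses passes only that view."""
    half = EYE_SEPARATION / 2.0
    left = tuple(h - half * u for h, u in zip(head, lateral_axis))
    right = tuple(h + half * u for h, u in zip(head, lateral_axis))
    return left, right

left, right = eye_positions(head=(0.0, 1.1, 0.0), lateral_axis=(1.0, 0.0, 0.0))
```

Because both projectors write onto the same screen, the simultaneous projection required by the train control means is what keeps the two eye images temporally aligned.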
[0015] In this way, the train control means obtains the traveling speed and traveling position of the train, the 3D image generation means generates the scenery that a driver sees when an actual train runs, and the generated scenery is projected stereoscopically by the 3D image display means onto the screen in front of the driver, so a driver wearing polarized glasses can obtain the same sense of presence and tension as in actual driving.

[0016] Further, if a master controller that changes the traveling speed of the virtual train and a brake valve that applies pressure to the brake wheels that brake the virtual train are provided as the operating means, an instrument panel that displays the traveling speed and the pressure is provided as the display means, and the train control means displays on the instrument panel the traveling speed changed by the master controller and the pressure applied by the brake valve, the sense of presence and tension can be heightened further.

[0017] Furthermore, the train control means may comprise a memory storing data on the vehicles constituting the virtual train and data including the position information of the traffic signals and platforms arranged along the track, together with a calculation unit that obtains the traveling speed and traveling position and, using the data stored in the memory, obtains the displacement, including shake, of the vehicles constituting the virtual train at each position on the track; it may also comprise a signal control unit that controls the display of the traffic signals based on the traveling speed and traveling position obtained by the calculation unit, and a door opening/closing control unit that controls the opening and closing of the passenger doors of the virtual train. If the 3D image generation means comprises a left-eye 3D image generation unit that moves the generated scenery seen from the left eye and a right-eye 3D image generation unit that moves the generated scenery seen from the right eye, based on the displacement obtained by the calculation unit, then a vehicle that shakes as the train runs and as passengers board and alight can be reproduced relatively by moving the scenery projected on the screen, further heightening the sense of presence and tension.

[0018] The train control means may also comprise a sound generation unit that, based on the running sound and alarm sound data stored in the memory, generates sound including the running sound produced by the travel of the virtual train and the warning sounds of the traffic signals and outputs it to a predetermined speaker. If, when the brake valve is operated, the sound generation unit changes the volume and frequency of the squeal sound data stored in the storage unit according to the traveling speed and pressure displayed on the instrument panel and outputs the result from the predetermined speaker, the sense of presence and tension can be heightened further.
[0019] Furthermore, if command means is provided that selects a simulation environment, including the travel route of the virtual train and the types of vehicles constituting it, and issues commands to the train control means and the 3D image generation means, the operating environment of the simulator can be set and changed more flexibly.

The invention's effect

[0020] According to the present invention, the changing speed and pressure are displayed on the instrument panel in accordance with the driver's operation, while, in response to changes in the traveling speed and traveling position, a driver wearing polarized glasses sees a stereoscopic landscape unfold in front of the driver's seat just as if viewed from a swaying vehicle. The driver can therefore be given the same sense of presence and tension as when driving an actual vehicle. In addition, the simulator environment, such as the travel route and the type of train operated, can be set flexibly through menu selection, program changes, and the like, so a low-cost railway driving simulator can be realized.
Brief Description of Drawings

[0021]
[FIG. 1] Configuration diagram of a railway driving simulator to which an embodiment of the present invention is applied.
[FIG. 2] Diagram showing, as an example, the real space in which the driver looks at the screen and the virtual space in which the virtual cameras corresponding to the eyes look through virtual planes inside the PCs.
[FIG. 3] Diagram showing, as an example, the relationship between the 3D image generation processing timing of the left view PC and right view PC and the processing timing of the vehicle PC.
[FIG. 4] Sectional structure diagram showing the frames constituting an image of the forward landscape.
[FIG. 5] Plan structure diagram showing the frames constituting an image of the forward landscape.
[FIG. 6] Flowchart showing how images are generated in the left view PC and the right view PC.
[FIG. 7] Schematic diagram explaining vehicle shake.
[FIG. 8] Diagram showing an example of the vehicle model.
[FIG. 9] Diagram showing an example of the vehicle model.
[FIG. 10] Diagram showing how track irregularity is generated.
[FIG. 11] Diagram showing an example of track irregularity calculated in units of blocks.
[FIG. 12] Diagram explaining the irregular loading applied to the vehicle.
[FIG. 13] Diagram showing the squeal sound of the brake.
[FIG. 14] Diagram explaining the operation of the railway driving simulator.
Explanation of symbols

1 Speaker
2 Vehicle
3A Left projector
3B Right projector
4 View display PC
4A Left view PC
4B Right view PC
4a, 4b Virtual cameras
4x, 4y Virtual planes
5 LAN
6 Screen
7 Command PC
20 Vehicle PC
21 Instrument panel
22 Controller
22A Master controller
22B Brake valve
27 Motion device
100 Camera frame
101 Train frame
102 Base frame
103 Block frame
200 Route data
201 Vehicle data
202 Preceding train control processing unit
203 Signal control processing unit
204 Security device control processing unit
205 Conductor control processing unit
206 Door control processing unit
207 Passenger control processing unit
208 Drive control processing unit
209 Track control processing unit
210 Vehicle model calculation processing unit
211 Sound generation processing unit
400 Landscape data
401 Image generation control processing unit
発明を実施するための最良の形態 BEST MODE FOR CARRYING OUT THE INVENTION
以下、本発明の実施形態を図面に基づいて説明する。  Hereinafter, embodiments of the present invention will be described with reference to the drawings.
本発明の鉄道用運転シミュレータは、鉄道用車両の模擬的な運転席を設け、その運 転席に座った運転士が操作手段を操作すると、その操作に合わせて、あたかも実際 の車両を運転しているのと同様に前方風景が変化すると共に、表示手段の表示が変 化し、運転士に実際の車両を運転してレ、るときの感覚を与えることが可能な仕組みで ある。 The railway driving simulator according to the present invention provides a simulated driver's seat for a railway vehicle, and when a driver sitting in the driver's seat operates the operating means, the actual vehicle is driven according to the operation. The front landscape changes and the display of the display means changes as It is a mechanism that can give the driver the feeling of driving an actual vehicle.
[0024] FIG. 1 is a configuration diagram of a railway driving simulator to which an embodiment of the present invention is applied.
The railway driving simulator shown in FIG. 1 comprises: a vehicle 2 provided with a driver's cab; a speaker 1 that outputs the sounds produced by a running train; a pair of left-eye and right-eye view-display personal computers (corresponding to the 3D image generation means of the present invention; hereinafter the "left view PC" and "right view PC") 4A, 4B that generate the forward scenery visible from the driver's cab of the vehicle 2; a left-eye projector and a right-eye projector (corresponding to the 3D image display means of the present invention; hereinafter the "left projector" and "right projector") 3A, 3B that project the forward scenery generated by the left view PC 4A and the right view PC 4B, respectively; a screen 6 arranged in front of the driver's cab, onto which the forward scenery from the left projector 3A and the right projector 3B is rear-projected and superimposed; and a command personal computer (hereinafter the "command PC") 7 that sets up the simulation environment by selecting the route on which the virtual train runs, the types of cars making up the virtual train, and so on.
[0025] The vehicle 2 includes: a controller (corresponding to the operating means of the present invention) 22 for driving the virtual train; an instrument panel (corresponding to the display means of the present invention) 21 that displays the running speed of the virtual train and the brake valve pressure according to the operation of the controller 22; and a vehicle-control personal computer (hereinafter the "vehicle PC") 20 that controls the running of the virtual train according to the operation of the controller 22. The controller 22 has a master controller 22A that changes the running speed of the virtual train and a brake valve 22B that brakes the virtual train.
[0026] The vehicle PC 20, the left view PC 4A, and the right view PC 4B are connected via a LAN (Local Area Network) 5 and can communicate with one another using protocols such as TCP (Transmission Control Protocol) and UDP (User Datagram Protocol).
[0027] Because real-time behavior is essential in the railway driving simulator of this embodiment and high-speed communication is required between the PCs, UDP is used; the left view PC 4A and the right view PC 4B use software that can handle UDP easily (for example, Winsock), although the embodiment is not necessarily limited to this.
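The kind of connectionless, low-latency exchange described above can be sketched as follows. This is an illustration only: the actual simulator uses Winsock on Windows PCs, and the port number and packet layout below are invented for the example.

```python
import socket
import struct

# Hypothetical port and packet layout, not taken from the patent.
VIEW_PC_ADDR = ("127.0.0.1", 50000)

def send_vehicle_state(sock, position_m, speed_mps, displacement):
    """Pack the vehicle state into a fixed-size datagram and send it.

    displacement is (y, z, roll, pitch, yaw) as floats. UDP is chosen
    because in a real-time loop a lost frame is better than a late one.
    """
    payload = struct.pack("<7d", position_m, speed_mps, *displacement)
    sock.sendto(payload, VIEW_PC_ADDR)

def recv_vehicle_state(sock):
    payload, _addr = sock.recvfrom(1024)
    values = struct.unpack("<7d", payload)
    return values[0], values[1], values[2:]

# Loopback round trip standing in for vehicle PC -> view PC.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(VIEW_PC_ADDR)
rx.settimeout(2.0)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_vehicle_state(tx, 1250.0, 22.3, (0.01, -0.005, 0.002, 0.0, 0.001))
pos, speed, disp = recv_vehicle_state(rx)
print(pos, speed, disp)
```

Since no acknowledgement or retransmission is involved, each view PC simply consumes the most recent datagram, which matches the once-per-frame supply of vehicle state described in the text.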
[0028] The vehicle PC 20 controls the running of the vehicle 2 with a CPU (central processing unit). The CPU performs the acceleration/deceleration and braking processing commanded through the controller 22 in accordance with a program stored in the memory unit. For example, when the driver accelerates with the master controller 22A or brakes with the brake valve 22B, the CPU of the vehicle PC 20 calculates the running speed and pressure and displays the results on the instrument panel 21.
[0029] Based on the data stored in the memory unit, the vehicle PC 20 also obtains the running position and running speed of the virtual train at predetermined time intervals, and calculates the displacement of the vehicle due to vehicle sway and track curves from a vehicle model or the like. The calculated running position, running speed, and vehicle displacement are then supplied to the left view PC 4A and the right view PC 4B.
[0030] Furthermore, the vehicle PC 20 generates the running noise of the virtual train, the warning sounds of the signals placed along the track, and the squeal of the brakes: based on audio data recorded from actual trains and stored in the memory unit, the sound generation unit (shown in FIG. 15) generates these sounds with their volume and frequency varied, and the generated sound is output from the speaker 1.
[0031] Alternatively, the vehicle PC 20 may obtain the running position and running speed at fixed distance intervals or at fixed time intervals and supply them to both the left view PC 4A and the right view PC 4B.
[0032] The CPUs of the left view PC 4A and the right view PC 4B each execute processing that generates the forward scenery seen from the left eye and the right eye, respectively, of the driver seated in the cab. Each CPU generates a computer-graphics (CG) 3D model of the forward scenery based on an image generation program stored in its memory unit (corresponding to the storage unit of the present invention), data on tangible objects such as the track on which the virtual train runs and the signals, platforms, and buildings along it, and the information on running position and running speed supplied from the vehicle PC 20. A 2D image is then generated by rendering the 3D model from the viewpoint of the driver's left eye or right eye. By additionally taking into account the vehicle displacement, including vehicle sway, supplied from the vehicle PC 20 and shifting the left-eye and right-eye viewpoints accordingly during rendering, the driver of the virtual train watching the forward scenery can be made to feel the same sway and curves as on an actual train.
[0033] Here, the vehicle displacement in this embodiment, including vehicle sway, is calculated using a vehicle model. The vehicle model computes the sway that occurs when the vehicle 2 actually runs and when passengers board and alight; its inputs include track irregularities, the irregular loading applied as passengers board and alight, the inertia force of the vehicle, gravity, and so on. Details are given later.
[0034] In this embodiment, the CPUs of the left view PC 4A and the right view PC 4B each form a 3D model of the track, the signals, platforms, buildings, and distant scenery along it that fall within the view frustum between the near clipping plane and the far clipping plane of the perspective projection, and each forward scene is generated by rendering from the viewpoint of either the driver's left eye or right eye.
[0035] Note that this 3D model may be generated in common by the CPU of either the left view PC 4A or the right view PC 4B, with both CPUs then rendering their respective images from the generated common 3D model.
[0036] The CG scenery generated by the CPUs of the left view PC 4A and the right view PC 4B is projected onto the screen 6 from the left projector 3A and the right projector 3B as the left-eye scene and the right-eye scene.
[0037] Polarizing plates whose polarization axes are orthogonal to each other are attached to the left projector 3A and the right projector 3B, and the left-eye scene and the right-eye scene are projected through them. The driver seated in the cab therefore wears polarized glasses fitted with a left-eye polarizing lens having the same polarization axis as the plate attached to the left projector 3A and a right-eye polarizing lens having the same polarization axis as the plate attached to the right projector 3B. By viewing the CG scenery projected on the screen through these polarized glasses, a 3D image equivalent to looking forward from the driver's cab of an actual train appears before the driver's eyes.
[0038] Moreover, the CG scenery projected on the screen moves (sways) according to the calculated vehicle displacement, so to the driver the vehicle feels as if it is swaying when the track curves, when the train runs, when passengers board and alight, and so on. In addition, running noise, warning sounds, and the like are played from the speaker in time with the running of the virtual train, so the driver can experience the same sense of presence and tension as when driving an actual train.
[0039] In the railway driving simulator of this embodiment, the 3D image generation means is configured as the separate "left view PC" and "right view PC", and the 3D image display means as the separate "left projector" and "right projector"; however, the left and right sides need not be separate structures and may each be configured as a single integrated unit.
[0040] The screen and the 3D image display means may also be integrated, for example as a head-mounted display. Furthermore, the 3D image display means need not necessarily generate images by computer graphics; image data of an actual route may be accumulated, and images may be generated from those bitstreams by image-based rendering (IBR).
[0041] Next, the parallax images generated by the left view PC 4A and the right view PC 4B will be described.
[0042] FIG. 2 shows, as an example, the real space in which the driver views the screen and the virtual space in which the virtual cameras, which correspond to the eyes within the PCs, look through the virtual planes.
[0043] In the image generation performed by the left view PC 4A and the right view PC 4B, the virtual space within a certain range (corresponding to the view frustum) that the virtual camera, standing in for the human eye, sees through the virtual plane is projected onto the screen in front of the driver. The relationship between this virtual space and the virtual camera must therefore be made the same as the relationship between the driver and the screen in real space.
[0044] As shown as the real space in FIG. 2, the interpupillary distance of a human (the driver) is 63 mm on average, that is, 31.5 mm on each side of center. Accordingly, the images generated by the left view PC 4A and the right view PC 4B are produced as a left-eye image and a right-eye image by shifting the virtual cameras 4a and 4b by 31.5 mm to the left and right, respectively.
[0045] Next, the distance between the driver's eyes and the screen 6 is set to 1775.5 mm and the width of the screen 6 to 1422 mm. If the width of the virtual planes 4x and 4y is set to 800 mm, then, because the relationship between each of the driver's eyes and the screen 6 must be similar to the relationship in virtual space between the virtual cameras 4a, 4b and the virtual planes 4x, 4y, the distance between the virtual cameras 4a, 4b of the left view PC 4A and the right view PC 4B and their virtual planes 4x, 4y becomes 1 m.
[0046] Therefore, as shown as the virtual space in FIG. 2, the left-eye virtual camera 4a of the left view PC 4A is shifted 31.5 mm to the left of the center of the virtual plane 4x, and the right-eye virtual camera 4b of the right view PC 4B is shifted 31.5 mm to the right of the center of the virtual plane 4y. If each PC generates the scenery within the range visible through its 800 mm wide virtual plane 4x, 4y located 1000 mm ahead of the virtual cameras 4a, 4b, and each generated scene is projected from its projector, the driver can view the projected images in accurate stereoscopy.
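The similar-triangle relationship and camera offsets described in [0044] to [0046] can be sketched numerically. Only the measurements quoted in the text are used; the helper function names are invented for illustration.

```python
# Values from the text.
EYE_SEPARATION_MM = 63.0        # average interpupillary distance
SCREEN_WIDTH_MM = 1422.0        # physical screen width
EYE_TO_SCREEN_MM = 1775.5       # driver's eyes to screen
VIRTUAL_PLANE_WIDTH_MM = 800.0  # chosen width of the virtual plane

def virtual_plane_distance():
    """Camera-to-plane distance that keeps the virtual-space geometry
    similar to the real eye/screen geometry (similar triangles)."""
    return VIRTUAL_PLANE_WIDTH_MM * EYE_TO_SCREEN_MM / SCREEN_WIDTH_MM

def camera_offsets():
    """Lateral offsets of the left and right virtual cameras from the
    centers of their virtual planes (negative = left)."""
    half = EYE_SEPARATION_MM / 2.0
    return -half, +half

d = virtual_plane_distance()
print(round(d))          # approximately 999 mm, i.e. about 1 m
print(camera_offsets())  # (-31.5, 31.5)
```

The computed distance of roughly 999 mm confirms the 1 m figure quoted in [0045] as a rounded value of the similar-triangle result.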
[0047] The projectors of this embodiment are placed with their height adjusted so that the driver's shadow is not cast on the screen 6. The distances from the left projector 3A and the right projector 3B to the top and bottom of the screen 6 therefore differ, so the projectors are provided with a keystone correction function so that the scenery projected on the screen 6 is not distorted into a trapezoid.
[0048] FIG. 3 shows, as an example, the relationship between the timing of the 3D image generation processing by the left view PC and the right view PC and the timing of the processing by the vehicle PC.
[0049] The images generated by the left view PC 4A and the right view PC 4B and rear-projected onto the screen through the left projector 3A and the right projector 3B must, as shown in FIG. 2, be equivalent to images captured simultaneously by virtual cameras offset by the interpupillary distance. The scenes generated by the left view PC 4A and the right view PC 4B and projected from the left projector 3A and the right projector 3B must therefore be synchronized frame by frame. Each frame is preferably processed in about 1/30 to 1/60 of a second.
[0050] In FIG. 3, the interval at which each generated frame is projected onto the screen is defined as one period T.
[0051] In each period, the vehicle PC 20 calculates the vehicle displacement due to vehicle sway and track curves from the vehicle model, together with the running speed and pressure, and supplies the calculated displacement to the left view PC 4A and the right view PC 4B simultaneously at timing A. The calculated running speed and pressure are displayed on the instrument panel 21.
[0052] Based on the supplied vehicle displacement and running speed, the left view PC 4A computes the coordinates of its virtual camera (viewpoint) with respect to the track, the tangible objects along it, and the displacement due to vehicle sway and track curves, renders the 3D model, notifies the vehicle PC 20 at timing B1 that one frame has been generated, and temporarily stores the frame in its buffer memory.
[0053] Likewise, based on the supplied vehicle displacement and running speed, the right view PC 4B computes the coordinates of its virtual camera (viewpoint), renders the 3D model, notifies the vehicle PC 20 at timing B2 that one frame has been generated, and temporarily stores the frame in its buffer memory.
[0054] After confirming that notifications have been received from both the left view PC 4A and the right view PC 4B, the vehicle PC 20 has each PC make visible the image temporarily stored in its buffer memory at timing C, when one period T has elapsed, and output the visible images to the left projector and the right projector, respectively.
[0055] The left projector and the right projector rear-project the two 2D parallax images simultaneously, superimposed on the screen.
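The per-frame handshake of FIG. 3 (state published at timing A, render completion reported at timings B1/B2, simultaneous flip at timing C) behaves like a barrier. A rough sketch, with the three PCs modeled as threads in one process rather than hosts on a LAN:

```python
import threading

class FrameSync:
    """Minimal stand-in for the A / B1,B2 / C handshake described above."""

    def __init__(self, n_renderers):
        # Renderers plus the vehicle PC all meet at the barrier.
        self.done = threading.Barrier(n_renderers + 1)
        self.frame_buffer = {}

    def renderer(self, name, state):
        # Timings B1/B2: render from the shared state, buffer the frame,
        # then report completion by reaching the barrier.
        self.frame_buffer[name] = f"frame rendered by {name} at x={state['x']}"
        self.done.wait()

    def vehicle_pc(self):
        # Timing A happened when the state was published; timing C is
        # reached only once both renderers have reported, so the two
        # buffered images can be flipped to the projectors together.
        self.done.wait()
        return sorted(self.frame_buffer)

sync = FrameSync(2)
state = {"x": 1250.0}  # vehicle state published at timing A
for name in ("left", "right"):
    threading.Thread(target=sync.renderer, args=(name, state)).start()
result = sync.vehicle_pc()
print(result)
```

The barrier guarantees that neither buffered image becomes visible before both exist, which is the stereo consistency requirement stated in [0049].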
[0056] Next, the image generation processing that produces the forward scenery as the virtual train advances will be described.
[0057] FIG. 4 is a cross-sectional structural diagram showing the frames that make up the image representing the forward scenery, and FIG. 5 is a plan structural diagram showing those frames.
[0058] As shown in FIGS. 4 and 5, the image generated by the left view PC and the right view PC has a camera frame 100 representing the field of view seen from the virtual camera through the virtual plane; on the camera frame 100 a train frame 101 indicating the position of the virtual train is formed, and on the train frame 101 a base frame 102 representing the ground is formed. On the base frame 102, a block frame 103 is formed in which unit blocks of forward scenery are replenished one after another as the virtual train advances, and on the block frame 103 structures (tangible objects) such as the own track, other tracks, signals, platforms, standing trees, and buildings are formed.

[0059] As shown in FIG. 5, the scenery visible from the virtual camera through the virtual plane (the structures falling within the view frustum) is replenished block by block in the direction of travel of the virtual train. That is, each time the virtual train passes one block, one block is added.
[0060] Meanwhile, the virtual camera moves in accordance with the running speed of the virtual train, and the left view PC and the right view PC each generate the image, seen through the virtual plane from the left or right viewpoint of the moving train, of the 3D model formed by the blocks contained in the view frustum. As a result, the scenery visible from the cab of the virtual train moves backward in step with the running speed, and to the driver the virtual train appears to be running.
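The block replenishment described in [0058] to [0060] can be sketched as a sliding window of fixed-length blocks kept loaded ahead of the train. The 25 m block length matches the boundary spacing given later in the text; every other name and value here is invented for illustration.

```python
BLOCK_LENGTH_M = 25.0   # boundary spacing stated in the text
VISIBLE_BLOCKS = 40     # assumed number of blocks the view frustum spans

def update_blocks(loaded_blocks, train_position_m):
    """Keep blocks loaded from the train's current block forward."""
    current = int(train_position_m // BLOCK_LENGTH_M)
    for index in range(current, current + VISIBLE_BLOCKS):
        if index not in loaded_blocks:
            # In the simulator this is where track data and structures
            # (signals, platforms, buildings...) are placed on the block.
            loaded_blocks[index] = f"block {index}"
    # Blocks behind the train are no longer visible and can be dropped.
    for index in [i for i in loaded_blocks if i < current]:
        del loaded_blocks[index]
    return loaded_blocks

blocks = {}
update_blocks(blocks, 0.0)    # initial fill: blocks 0..39
update_blocks(blocks, 26.0)   # the train has crossed the 25 m boundary
print(min(blocks), max(blocks))
```

Crossing a boundary thus adds exactly one new block at the far end and retires one behind the train, so the amount of loaded scenery stays constant regardless of distance traveled.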
[0061] FIG. 6 is a flowchart showing how images are generated in the left view PC and the right view PC.
[0062] As shown in FIG. 6, when the image generation processing starts in step S30, devices are created and attributes are set in step S31 in the programs stored in the left view PC and the right view PC.
[0063] Next, in step S32, information such as the body length of the virtual train, the positions of the bogies, and the driver's viewpoint position is received from the memory unit of the vehicle PC.
[0064] Then, in step S33, the structures making up the forward scenery, such as tracks, signals, platforms, buildings, and standing trees, are read from the memory units of the left view PC and the right view PC. Further, in step S34, track information is read, including the curves, gradients, roadbed, track irregularity level, and friction coefficient of the own track; the positions and roadbeds of other tracks; signal equipment such as signals and wayside coils; station stop positions and boarding rates; the timetable of the preceding train; and the placement positions and placement methods of the structures.
[0065] Meanwhile, based on the vehicle model, the vehicle PC calculates the vehicle displacement due to vehicle sway and track curves, namely the lateral displacement y, vertical displacement z, roll angle φ, pitch angle θ, and yaw angle ψ, as well as the distance the train has traveled from the starting station.
[0066] Here, the vehicle displacement due to vehicle sway and track curves and the distance traveled by the train are collectively referred to as the vehicle displacement.
[0067] The left view PC and the right view PC each receive the vehicle displacement information from the vehicle PC in step S35, and in step S36 calculate the position and orientation of the vehicle relative to the ground based on the train travel distance contained in the received vehicle displacement information.
[0068] Then, in step S37, it is determined from the calculated position and orientation of the vehicle whether the vehicle has crossed a block boundary. When the vehicle has crossed a block boundary, a new block is placed in step S38 based on the own-track data stored in the memory unit, and structures are placed on the new block according to the placement method information. Here, there is a block boundary every 25 m.
[0069] Structure placement proceeds as follows. First, the position and orientation of the block frame are calculated from the own-track data, and the block frame is placed on the base frame. The positions and orientations of other-track frames are calculated from the other-track data and placed on the block frame. Placement is then completed by arranging the structures making up the forward scenery, such as tracks, signals, platforms, buildings, and standing trees, on the appropriate frames.
[0070] Next, in step S39, the vehicle is moved according to its position and orientation relative to the ground.
[0071] Here, the position and orientation of the base frame relative to the train frame are calculated from the computed position and orientation of the vehicle, and the base frame is moved. This amounts to changing the position and orientation of the train relative to the ground.
[0072] The train frame is moved by calculating its position and orientation relative to the camera frame based on the vehicle displacement due to vehicle sway received from the vehicle PC. This amounts to swaying the vehicle relative to the track.
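The frame hierarchy of [0069] to [0072] (camera frame, train frame, base frame, block frames, structures) is essentially a scene graph in which each frame stores its pose relative to its parent. A minimal 2D sketch with invented values; the real simulator composes full 3D transforms:

```python
import math

class Frame:
    """A node holding a 2D pose (translation + heading) relative to a parent."""

    def __init__(self, parent=None, x=0.0, y=0.0, heading=0.0):
        self.parent, self.x, self.y, self.heading = parent, x, y, heading

    def to_world(self, px, py):
        """Map a point in this frame's coordinates up the chain to the root."""
        c, s = math.cos(self.heading), math.sin(self.heading)
        wx = self.x + c * px - s * py
        wy = self.y + s * px + c * py
        return self.parent.to_world(wx, wy) if self.parent else (wx, wy)

camera = Frame()                      # root: the (fixed) camera frame
train = Frame(parent=camera)          # sways relative to the camera frame
base = Frame(parent=train, x=-100.0)  # ground moved back as the train "advances"
block = Frame(parent=base, x=125.0)   # 6th block: 5 * 25 m along the track
signal = block.to_world(0.0, 3.0)     # a signal 3 m to the side of the block origin
print(signal)
```

Moving the base frame (as in [0071]) or the train frame (as in [0072]) changes a single pose in the chain, and every structure below it follows automatically, which is exactly why the scenery can be "swayed" instead of the cab.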
[0073] Next, in step S40, rendering is performed to generate a 2D image from the constructed 3D model. When the rendering is finished, a completion notification is sent to the vehicle PC in step S41, and the generated image frame is temporarily stored in the buffer memory.
[0074] Meanwhile, at a predetermined timing after receiving completion notifications from both the left view PC and the right view PC in step S42, the vehicle PC instructs each of them to output the 2D image temporarily stored in its buffer memory to its projector.
[0075] The left view PC and the right view PC output their respective 2D images simultaneously and project them onto the screen via the projectors. The same operation is then repeated, and 30 to 60 image frames are projected per second.

[0076] Next, vehicle sway will be described.
[0077] FIG. 7 is a schematic diagram explaining vehicle sway.
[0078] FIG. 7(A) is a reference example in which the vehicle 2 is provided with a motion device 27.
[0079] The motion device 27 has a structure that shakes the floor of the vehicle 2 and thereby reproduces the sway of the whole vehicle 2. In this case, the left view PC and the right view PC may generate the scenery without taking the displacement due to vehicle sway into account and project it onto the screen.
[0080] FIG. 7(B) is a schematic diagram showing the case where the vehicle 2 sways relative to the ground and the case where the ground sways relative to the floor of the vehicle 2.
[0081] As shown in FIG. 7(B), the scenery moving and rotating and the vehicle being displaced are relatively the same thing for an observer on the vehicle, so the observer feels as if the vehicle is swaying.
[0082] Therefore, in the case of a railway driving simulator in which the driver's seat is fixed, the same effect can be obtained by swaying the scenery on the ground, that is, by swaying the image projected onto the screen (moving the position at which it appears).
[0083] Next, the vehicle model will be described in detail.
[0084] The vehicle model consists of a numerical analysis model and parameters covering everything from the calculation of track irregularities to the calculation of car-body sway, and it is incorporated in the vehicle PC 20.
[0085] Based on the running speed of the virtual train and the like, the vehicle PC 20 of this embodiment uses the vehicle model incorporated in its memory unit to calculate five displacement quantities: lateral y, vertical z, rolling φ, pitching θ, and yawing ψ.
[0086] Based on the displacement quantities calculated by the vehicle PC 20, the left view PC 4A and the right view PC 4B move and rotate the computer-graphics scenery image, thereby projecting a "swaying scenery image" onto the screen.
[0087] これにより、固定された車両からその風景画像を見た人は、車両 2が揺れているように感ずる。 [0087] As a result, a person viewing that landscape image from the fixed vehicle feels as if the vehicle 2 were swaying.
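The equivalence described in paragraphs [0081]–[0087] — shaking the projected scenery instead of the cab — can be sketched as a coordinate transform applied to scene points before rendering. The following is a minimal illustration, not the patent's implementation; the function name, the displacement dictionary, and the roll-only rotation are assumptions for the example.

```python
import math

def shake_view(point, disp):
    """Apply the car-body displacement to a scene point so that a fixed
    observer perceives the cab itself as swaying.

    point: (x, y, z) scene coordinate in front of the cab
    disp:  dict with lateral 'y', vertical 'z' and roll angle 'phi'
           (radians), as a vehicle model might compute them

    Moving the scenery by the *opposite* of the car-body displacement is,
    for the on-board observer, equivalent to moving the car body.
    """
    x, y, z = point
    phi = -disp.get("phi", 0.0)               # counter-rotate by the roll angle
    # roll about the viewing (x) axis, then counter-translate
    yr = y * math.cos(phi) - z * math.sin(phi)
    zr = y * math.sin(phi) + z * math.cos(phi)
    return (x, yr - disp.get("y", 0.0), zr - disp.get("z", 0.0))

# With zero displacement the scene is unchanged:
print(shake_view((10.0, 1.0, 2.0), {"y": 0.0, "z": 0.0, "phi": 0.0}))
```

A renderer would apply this transform (or the equivalent camera motion) to every vertex each frame, so that the projected landscape "shakes" while the driver's seat stays fixed.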
[0088] 図 8及び図 9は、車両モデルの一例を示す図である。  8 and 9 are diagrams showing an example of the vehicle model.
[0089] 図 8及び図 9に示す車両モデルは、車両の揺れを模擬するために、車体、空気ば ね、左右動ダンパからなる振動系がモデル化されている。 [0090] 図 8 (A)は、車両全体を示す車両モデルであり、図 8 (B)は、車両モデルの入力と 出力とを示す。また、図 9 (A)は、車両の左右平面を表す車両モデルであり、図 9 (B) は、車両の前後平面を表す車両モデルである。 In the vehicle model shown in FIGS. 8 and 9, a vibration system including a vehicle body, an air spring, and a left and right motion damper is modeled in order to simulate the shaking of the vehicle. FIG. 8 (A) is a vehicle model showing the entire vehicle, and FIG. 8 (B) shows the input and output of the vehicle model. 9A is a vehicle model representing the left and right planes of the vehicle, and FIG. 9B is a vehicle model representing the front and back planes of the vehicle.
[0091] 図 8 (A)、図 9 (A)及び図 9 (B)における車両モデルは、軌道の作成に応じて計算された軌道狂い y_BF、z_BF、φ_BF、y_BR、z_BR、φ_BR が与えられることにより、左右 y、上下 z、ローリング φ、ピッチング θ、ヨーイング φ からなる 5つの振動パターンによる、車両の変位量が計算されるように構成されている。 [0091] The vehicle models in Fig. 8 (A), Fig. 9 (A) and Fig. 9 (B) are given the track irregularities y_BF, z_BF, φ_BF, y_BR, z_BR, φ_BR calculated when the track is created, and are configured to calculate the vehicle displacement in five vibration patterns: lateral y, vertical z, rolling φ, pitching θ, and yawing φ.
[0092] 図 8 (B)に示すように、車両の揺れは、軌道の狂い、カント、慣性力、乗降客による不規則な加圧、重力などによって生じることから、車両モデルには、軌道狂いのほかに、カント（カーブにおいて外側のレールを高くすること）φ_BF、φ_BR、慣性力、乗降客により加わる力などを入力することができる。 [0092] As shown in Fig. 8 (B), car-body motion arises from track irregularities, cant, inertial forces, irregular loading by boarding and alighting passengers, gravity and so on. Accordingly, besides the track irregularities, the cant (raising the outer rail on a curve) φ_BF, φ_BR, inertial forces, and the forces applied by passengers can be input to the vehicle model.
[0093] なお、カントと軌道狂いの記号が同じなのは、軌道狂いがカントを含めたものとなっているからである。また、慣性力は、車体 2の重心位置に作用する。 [0093] The cant and the track irregularity share the same symbols because the track irregularity is defined so as to include the cant. The inertial force acts at the position of the center of gravity of the car body 2.
[0094] このような車両モデルに基づいて、車両 2の振動（揺れ）が再現されるが、左用視界 PC4A及び右用視界 PC4Bそれぞれによる画像生成の際に車両 2の揺れを反映させることにしてもよいし、生成された画像を、車両 2の変位量に基づいて移動（揺らす）させることにしてもよい。 [0094] The vibration (sway) of the vehicle 2 is reproduced on the basis of such a vehicle model; the sway may be reflected when the left-view PC 4A and the right-view PC 4B each generate their images, or the generated images may be moved (shaken) according to the displacement of the vehicle 2.
[0095] 尚、ここでは車両 2の床以下の振動については考慮していない。 [0095] Note that vibration below the floor of the vehicle 2 is not considered here.
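As a rough illustration of how a vibration system of car body, air spring, and lateral damper yields a displacement, the sketch below integrates a single lateral degree of freedom excited by a track irregularity. The function, the parameter values (m, k, c), and the integration scheme are illustrative assumptions, not the patent's numerical model, which covers five degrees of freedom.

```python
def step_body_lateral(y, v, y_track, dt, m=30000.0, k=4.0e5, c=3.0e4):
    """One integration step of the lateral car-body motion, modelled as a
    single mass riding on the air springs (stiffness k) with a lateral
    damper (c), excited by the track irregularity y_track.

    m, k, c and dt are illustrative values, not the patent's parameters.
    Returns the updated displacement y and velocity v.
    """
    a = (k * (y_track - y) + c * (0.0 - v)) / m   # spring + damper force / mass
    v += a * dt                                   # semi-implicit Euler update
    y += v * dt
    return y, v

# The body relaxes toward a constant 5 mm lateral track offset:
y, v = 0.0, 0.0
for _ in range(20000):                            # 20 s at dt = 1 ms
    y, v = step_body_lateral(y, v, 0.005, dt=0.001)
print(round(y, 4))                                # settles near 0.005
```

In the full model the same kind of update would be run for all five displacement quantities (lateral, vertical, rolling, pitching, yawing), driven by the irregularities at the front and rear bogies.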
[0096] 次に、車両モデルに使用される軌道狂いについて説明する。 [0096] Next, the trajectory error used in the vehicle model will be described.
[0097] 図 10は、軌道狂いが作成される状況を示す図である。 FIG. 10 is a diagram showing a situation where a trajectory error is created.
[0098] 図 10に示すように、車両が左方から右方に向けて 1ブロック走行するたびに新たな軌道狂いが作成される。軌道には不規則な凹凸があり、この不規則な凹凸によって軌道狂いが発生する。 [0098] As shown in Fig. 10, a new stretch of track irregularity is created each time the vehicle travels one block from left to right. The track has irregular unevenness, and this unevenness gives rise to the track irregularity.
[0099] この軌道狂いは、乱数(ホワイトノイズ)を 1次遅れ要素のフィルタに通すことにより、 実際に走行している軌道と同様のものが得られる。  [0099] This trajectory error can be obtained by passing a random number (white noise) through the filter of the first-order lag element, which is the same as the actual trajectory.
[0100] ここで、 1ブロックが 25mに設定されている場合は、車両 PC20により走行位置及び 走行速度の情報が生成され、生成された情報によって、車両 2が 25m進んだと判断 されたときは、その都度新たな軌道が生成され、古い軌道は削除される。したがって 、軌道狂いは、 25m (1ブロック)を単位として求められる。 [0100] Here, when one block is set to 25 m, information on the driving position and the driving speed is generated by the vehicle PC 20, and when it is determined that the vehicle 2 has advanced 25 m by the generated information Each time, a new trajectory is created and the old trajectory is deleted. Therefore Orbital deviation is calculated in units of 25m (1 block).
[0101] 本実施形態の鉄道用運転シミュレータでは、車両 2を走行させている段階で軌道狂いを算出する。軌道の狂いは数式 1〜数式 4を用いて算出される。 [0101] In the railway driving simulator of this embodiment, the track irregularity is calculated while the vehicle 2 is running. The irregularity is calculated using Equations 1 to 4.
[0102] また、そのようにブロックを単位として算出された軌道狂いの一例を図 11に示す。 [0102] Fig. 11 shows an example of such a trajectory error calculated in units of blocks.
[0103] [数 1] [0103] [Equation 1]
また、上記 uに乱数を入力する。x = 0のとき y = y_0、距離が x = hのときの狂い yは数式 2で算出される。 A random number is input to u above. When x = 0, y = y_0; the irregularity y at distance x = h is calculated by Equation 2.
[0104] [数 2] [0104] [Equation 2]
数式 1、数式 2に基づいて、数式 3及び数式 4に示すプログラムを繰り返すことにより 、不規則な凹凸の軌道が算出される。 Based on Equations 1 and 2, by repeating the programs shown in Equations 3 and 4, irregular irregular trajectories are calculated.
[数 3] [Equation 3]
ここで、数式 3に示す αは −1 ≤ α < 1 の乱数である。 Here, α shown in Equation 3 is a random number with −1 ≤ α < 1.
[0106] [数 4] [0106] [Equation 4]
y_n = z + (y_{n-1} − z) exp(−2πfh)
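The block-wise generation described above — white noise passed through a first-order lag filter, with a fresh 25 m block computed as the train advances — might be sketched as follows. The function name, the smoothing constant standing in for the filter coefficient exp(−2πfh), and the amplitude are assumptions for illustration; the patent's own Equations 1–3 are only partially legible in the source.

```python
import random

def track_block(n_points, prev=0.0, smooth=0.9, amp=0.003, rng=None):
    """Generate one block (e.g. 25 m) of track irregularity by passing
    uniform white noise in [-amp, amp) through a first-order lag filter,
    in the spirit of Equations 1-4.

    `smooth` plays the role of exp(-2*pi*f*h) in Equation 4; `prev` is the
    last value of the previous block, so consecutive blocks join smoothly.
    """
    rng = rng or random.Random(0)          # fixed seed for repeatability
    y, out = prev, []
    for _ in range(n_points):
        z = amp * rng.uniform(-1.0, 1.0)   # white-noise target value
        y = z + (y - z) * smooth           # first-order lag step (Eq. 4 form)
        out.append(y)
    return out

block = track_block(100)
print(len(block), max(abs(v) for v in block) < 0.003)
```

Each time the simulator decides the train has advanced one block, a new list like this would be generated and the oldest one discarded, matching the rolling-window behaviour of paragraph [0100].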
次に、乗客が乗降する際に車両に与える不規則な加圧について説明する。 Next, irregular pressurization applied to the vehicle when passengers get on and off will be described.
[0107] 図 12は、車両に与える不規則な加圧を説明する図である。  FIG. 12 is a diagram for explaining irregular pressurization applied to the vehicle.
[0108] 図 12において、縦軸に乗車率、横軸に時間を表したとき、乗客が乗降するときの乗 車率の変移を実線グラフで示す。  In FIG. 12, when the occupancy rate is shown on the vertical axis and the time is shown on the horizontal axis, the transition of the occupancy rate when passengers get on and off is shown by a solid line graph.
[0109] 時刻 t41で車両 2の扉が開かれ、乗客が降車すると乗車率が低下する。そして、乗 客の降車が終了すると新たな乗客が乗車するので、乗車率が上昇する。時刻 t42で 乗客の乗降が終了すると、時刻 t43で車両 2の扉が閉じられる。  [0109] When the door of vehicle 2 is opened at time t41 and the passenger gets off, the boarding rate decreases. And when passengers get off, new passengers get on, and the boarding rate rises. When passengers get on and off at time t42, the door of vehicle 2 is closed at time t43.
[0110] 時刻 t41から時刻 t43までが最短停車時間であり、時刻 t41から時刻 t42までが乗 降時間 T40である。  [0110] From time t41 to time t43 is the shortest stop time, and from time t41 to time t42 is the getting on / off time T40.
[0111] 乗客による乗降は、仮想的列車の走行路線におけるそれぞれの駅で乗車率と最短 停車時間とを定義し、その定義されたそれぞれの乗車率と最短停車時間とを用いる ことにより模擬される。  [0111] Passenger boarding / exiting is simulated by defining the boarding rate and the shortest stop time at each station on the virtual train route, and using the defined boarding rate and the shortest stop time. .
[0112] 乗客の乗降に伴う車両の揺れは、車両 2の 1点に不規則な力を加えることにより模擬される。具体的には、例えば、図 12における降車客（2a）は、乗車客（a）の 2倍であるから、車両 2を揺らす大きさも 2倍とする。なお、乗降時間を過ぎると突然揺れが止まるという不自然さをなくすため、閉扉されるまでは小さな揺れを発生させることにしてもよい。 [0112] The sway of the vehicle caused by passengers boarding and alighting is simulated by applying an irregular force to one point of the vehicle 2. Specifically, since the number of alighting passengers (2a) in Fig. 12 is, for example, twice the number of boarding passengers (a), the magnitude of the shake given to the vehicle 2 is also doubled. To avoid the unnatural impression of the shake stopping abruptly once the boarding time has elapsed, a small shake may be kept until the doors are closed.
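The boarding model of paragraph [0112] — an irregular force at one point of the car body, scaled by the number of moving passengers and kept small until the doors close — could be sketched like this. All names and magnitudes are illustrative assumptions, not values from the patent.

```python
import random

def boarding_force(t, t_open, t_board_end, t_close, n_off, n_on,
                   unit_force=600.0, rng=None):
    """Irregular force applied at one point of the car body while
    passengers board and alight.

    Between door-open (t_open) and end of boarding (t_board_end) the force
    scales with the number of moving passengers -- e.g. twice the alighting
    passengers gives twice the shake.  A small residual shake is kept until
    the doors close (t_close), so the motion does not stop unnaturally.
    """
    rng = rng or random.Random(0)          # fixed seed for repeatability
    if t < t_open or t >= t_close:
        return 0.0                         # doors closed: no boarding force
    if t < t_board_end:
        scale = (n_off + n_on) * unit_force
    else:
        scale = 0.1 * unit_force           # small shake until the doors close
    return scale * rng.uniform(-1.0, 1.0)  # irregular (random) force

print(boarding_force(5.0, t_open=0.0, t_board_end=20.0, t_close=25.0,
                     n_off=2, n_on=1) != 0.0)
```

This force would be fed into the vehicle model as the "irregular pressurization by passengers" input shown in Fig. 8 (B).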
[0113] 次に、風景に合わせてスピーカ 1から出力されるブレーキの軋り音について説明す る。  [0113] Next, the brake noise output from the speaker 1 in accordance with the scenery will be described.
[0114] 図 13は、ブレーキの軋り音を示す図であり、図 13 (A)は、速度に応じて変化させる音量を示し、図 13 (B)は、速度が変化したときのブレーキの摩擦係数を示す。 [0114] Fig. 13 illustrates the brake squeal: Fig. 13 (A) shows the volume varied according to speed, and Fig. 13 (B) shows the brake friction coefficient as the speed changes.
[0115] 車両 PC20は、コントローラ 22の操作に対応して、計器パネル 21に表示するブレーキ弁の圧力や走行速度 vを算出する際に、実際の列車の軋り音を録音してメモリユニットに格納されている音声データを再生し、ブレーキの軋り音を生成するときに、算出したブレーキ弁の圧力や走行速度 vに従って音量 Aと周波数とを変化させる。 [0115] When, in response to operation of the controller 22, the vehicle PC 20 calculates the brake-valve pressure and running speed v shown on the instrument panel 21 and generates the brake squeal by reproducing recorded sound of an actual train stored in the memory unit, it varies the volume A and the frequency according to the calculated brake-valve pressure and running speed v.
[0116] これにより、実車の軋り音に近い音響がスピーカ 1から出力される。 [0116] As a result, a sound close to the squeal of an actual train is output from the speaker 1.
[0117] 音量 Aは、数式 5に示すようにブレーキ弁の圧力(BC圧 P)に比例して大きくなり、 更に、数式 6に示すように速度 Vによっても変化する。 [0117] The volume A increases in proportion to the pressure of the brake valve (BC pressure P) as shown in Equation 5, and also changes according to the speed V as shown in Equation 6.
[0118] [数 5] [0118] [Equation 5]
A = (P / P_1) g(v)
[0119] [数 6] [0119] [Equation 6]
g(v) = (v / v_1) e^{1 − v/v_1}
尚、図 13 (A)は、数式 6による変化をグラフで示したものである。 Note that FIG. 13A is a graph showing the change according to Equation 6.
また、数式 7に示すように周波数 fは、 BC圧 Pと制輪子の摩擦係数 μ、即ちブレーキ 力によって変化する。  Further, as shown in Equation 7, the frequency f varies depending on the BC pressure P and the friction coefficient μ of the brake, that is, the braking force.
[0120] [数 7] [0120] [Equation 7]
尚、図 13 (B)は、数式 7における μ/μ_0 と走行速度との関係を示す図である。 Note that Fig. 13 (B) shows the relationship between μ/μ_0 in Equation 7 and the running speed.
[0121] これにより、車両 PC20において、コントローラ 22の操作に応じて求めた BC圧と、走 行速度とによって、スピーカ 1から出力される音響の音量及び周波数を変化させた実 録音が再生されるので、運転士に、実際に列車を運転しているような臨場感や緊張 感を与えることができる。  [0121] As a result, in the vehicle PC 20, the actual recording in which the volume and frequency of the sound output from the speaker 1 is changed according to the BC pressure obtained according to the operation of the controller 22 and the running speed is reproduced. Therefore, it can give the driver a sense of realism and tension as if he were actually driving a train.
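Putting Equations 5 and 6 together (their exact form is partly garbled in the source, so the forms used below are hedged reconstructions), the squeal volume and playback frequency might be computed as follows. The reference pressure, peak speed v1, base frequency, and the assumed speed dependence of μ/μ0 are illustrative stand-ins, since Equation 7's constants are not given here.

```python
import math

def squeal(bc_pressure, v, p_ref=400.0, v1=20.0, f0=2000.0):
    """Volume and playback frequency of the brake squeal.

    Volume grows in proportion to the brake-cylinder (BC) pressure and is
    shaped over speed by g(v) = (v/v1) * exp(1 - v/v1), which peaks at
    v = v1 and falls off toward standstill and high speed.  The playback
    frequency is scaled by the friction-coefficient ratio mu/mu0, assumed
    here to rise as the train slows (cf. Fig. 13 (B)).
    """
    g = (v / v1) * math.exp(1.0 - v / v1)     # speed shaping (Eq. 6 form)
    volume = (bc_pressure / p_ref) * g        # pressure-proportional volume
    mu_ratio = 1.0 + 0.5 / (1.0 + v / v1)     # assumed mu/mu0 speed dependence
    freq = f0 * mu_ratio                      # pitch shift of the recording
    return volume, freq

vol_peak, _ = squeal(400.0, 20.0)
vol_slow, _ = squeal(400.0, 2.0)
print(vol_peak > vol_slow)    # loudest around v = v1
```

In the simulator these two numbers would set the gain and resampling rate applied to the recorded squeal before it is sent to the speaker.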
[0122] 次に、図 1に示す鉄道用運転シミュレータにおける作用、特に、車両 PC20及び左 用視界 PC4A及び右用視界 PC4B (ここでは、説明の都合上両者を含めて視界表示 用 PC4が表示されている。 )における作用について説明する。  [0122] Next, the operation in the railway driving simulator shown in FIG. 1, in particular, the vehicle PC20, the left field of view PC4A, and the right field of view PC4B (here, for the sake of explanation, the view display PC4 is displayed including both of them) The operation of) will be described.
[0123] 図 14は、鉄道用運転シミュレータの作用を説明する図である。  FIG. 14 is a diagram for explaining the operation of the railway driving simulator.
[0124] 図 14において、鉄道用運転シミュレータは、コントローラ 22と、車両 PC20と、左用視界 PC4A及び右用視界 PC4B（ここでは、同じ作用を行うので、説明の都合上両者を含めた視界表示用 PC4が表示されている。）と、スピーカ 1と、左用プロジェクタ 3A及び右用プロジェクタ 3B（ここでは、同じ作用を行うので、説明の都合上両者を含めたプロジェクタ 3が表示されている。）と、スクリーン 6とを備えている。 [0124] In Fig. 14, the railway driving simulator includes the controller 22, the vehicle PC 20, the left-view PC 4A and right-view PC 4B (shown here as a single view-display PC 4, since both perform the same operation), the speaker 1, the left projector 3A and right projector 3B (likewise shown as a single projector 3 for convenience of explanation), and the screen 6.
[0125] 車両 PC20は、 自線のカーブ、勾配、道床、軌道狂いレベルや摩擦係数、他線の 位置や道床、及び信号機、地上子などの信号設備、駅の停止位置や乗車率、先行 列車のダイヤ、ストラクチャの配置位置や配置方法を含む軌道情報線、及び列車べ ノレ、車内放送、信号機の警報音などの路線音響を含む路線データ 200が格納され たメモリ、車両の揺れを車両モデルを含む車両の特性情報、計器パネルの映像ゃ属 性情報、列車の走行音、ドアの開閉音などの車両音響を含む車両データ 201が格 納されたメモリ、先行列車との車両間隔を制御する先行列車制御処理部 202、信号 機などの表示や警報音を制御する信号制御処理部 203、コントローラ 22による操作 、先行列車制御処理部 202や信号制御処理部 203などとの連携による ATSや ATC などの機能を果たす保安装置制御処理部 204、列車の扉の開閉を制御する車掌制 御処理部 205及びドア制御処理部 206、各駅において仮想的乗客の乗降を制御す る乗客制御処理部 207、コントローラ 22による操作に応じて仮想的列車の走行速度 や走行位置を演算する駆動制御処理部 208、線路データに基づいて当該路線の軌 道情報を読み出し車両モデルにパラメータ設定を行う軌道制御処理部 209、軌道制 御処理部 209から供給されたパラメータ、駆動制御処理部 208から供給された走行 速度や走行位置、及び乗客制御処理部 207から供給された乗降客情報に基づいて 、車両の揺れによる変位を算出する車両モデル算出処理部 210、メモリに格納され た路線音響や車両音響を再生する音響生成処理部 211とを有する。 [0125] Vehicle PC20 has its own curve, slope, roadbed, track error level and friction coefficient, position and roadbed of other lines, signal equipment such as traffic lights and ground elements, station stop positions and boarding rates, leading trains , A track information line including the layout position and layout method of the structure, a memory storing route data 200 including route sound, such as train trains, in-car broadcasts, warning lights of traffic lights, etc. Including vehicle characteristic information, instrument panel image attribute information, train running sound, vehicle data including vehicle sound such as door opening / closing sound, memory for storing vehicle 201 Train control processing unit 202, signal control processing unit 203 that controls signal display and alarm sound, operation by controller 22, ATS and ATC etc. 
in cooperation with preceding train control processing unit 202 and signal control processing unit 203 function Operation by the safety device control processing unit 204, the conductor control processing unit 205 and the door control processing unit 206 that control the opening and closing of the train doors, the passenger control processing unit 207 that controls the entry and exit of virtual passengers at each station, and the operation by the controller 22 The drive control processing unit 208 that calculates the travel speed and travel position of the virtual train according to the track, the track control processing unit 209 that reads the track information of the route based on the track data and sets the parameters in the vehicle model, the track control Based on the parameters supplied from the control processing unit 209, the traveling speed and traveling position supplied from the drive control processing unit 208, and passenger information supplied from the passenger control processing unit 207, the displacement due to the shaking of the vehicle is calculated. A vehicle model calculation processing unit 210; and a sound generation processing unit 211 that reproduces route sound and vehicle sound stored in the memory.
[0126] また、視界表示用 PC4は、運転席の前方風景を表す画像を生成するための、線路、信号機、プラットフォーム、建造物等の有体物を表す風景データ 400が格納されたメモリユニット（本発明の格納部に相当する。）と、その風景データ 400や車両 PC20から供給を受けた走行速度、走行位置、及び車両の変位を表す情報を基に、運転士前方に展開される風景を生成する画像生成制御処理部 401を有する。 [0126] The view-display PC 4 has a memory unit (corresponding to the storage unit of the present invention) storing landscape data 400 representing tangible objects such as tracks, traffic lights, platforms and buildings used to generate the image of the scene ahead of the driver's seat, and an image generation control processing unit 401 that generates the scenery unfolding in front of the driver based on that landscape data 400 and on the travel speed, travel position, and vehicle displacement supplied from the vehicle PC 20.
[0127] 先ず、コントローラ 22の操作により運転士が操作を行うと、その操作に応じた信号が 駆動制御処理部 208へ入力される(ステップ Sl)。  First, when the driver performs an operation by operating the controller 22, a signal corresponding to the operation is input to the drive control processing unit 208 (step Sl).
[0128] そして、メモリに格納された路線データ 200が読み出され、先行列車制御処理部 2 02、信号制御処理部 203、保安装置制御処理部 204、車掌制御処理部 205、乗客 制御処理部 207、駆動制御処理部 208へ供給される(ステップ S2)。  Then, the route data 200 stored in the memory is read out, and the preceding train control processing unit 202, the signal control processing unit 203, the security device control processing unit 204, the conductor control processing unit 205, and the passenger control processing unit 207 Is supplied to the drive control processing unit 208 (step S2).
[0129] また、メモリに格納された車両データ 201が、保安装置制御処理部 204、駆動制御 処理部 208へ供給される(ステップ S3)。  [0129] Further, the vehicle data 201 stored in the memory is supplied to the security device control processing unit 204 and the drive control processing unit 208 (step S3).
[0130] 先行列車制御処理部 202は、路線データ 200に基づいて生成された先行列車の 走行位置情報を信号制御処理部 203に供給する(ステップ S4)。  [0130] The preceding train control processing unit 202 supplies the traveling position information of the preceding train generated based on the route data 200 to the signal control processing unit 203 (step S4).
[0131] 信号制御処理部 203は、供給された先行列車の走行位置情報に基づいて信号機に表示する信号情報を保安装置制御処理部 204に供給する（ステップ S5）。 [0131] The signal control processing unit 203 supplies, based on the supplied traveling-position information of the preceding train, the signal information to be displayed on the traffic light to the safety device control processing unit 204 (step S5).
[0132] 保安装置制御処理部 204は、保安機能に関するデータを駆動制御処理部 208に 送る(ステップ S6)。  The security device control processing unit 204 sends data related to the security function to the drive control processing unit 208 (step S6).
[0133] 保安装置制御処理部 204は、供給された保安機能に関するデータに基づいて音 響生成処理部 211に警報情報を送る(ステップ S 7)。  The security device control processing unit 204 sends alarm information to the sound generation processing unit 211 based on the supplied data relating to the security function (step S 7).
[0134] 尚、車掌制御処理部 205は、メモリから読み出された路線データ 200や駆動制御 部 208で生成された、走行速度、走行位置等に基づいてドアの制御情報、車内放送 の情報やブザー情報を生成し、生成されたそれらの情報をドア制御処理部 206、音 響生成処理部 211に送る(ステップ S8)。 [0135] ドア制御処理部 206は、ドアの制御情報に基づいてドアの開閉情報を乗客制御処 理部 207、音響生成処理部 211、保安装置制御処理部 204に送る(ステップ S9)。 [0134] The conductor control processing unit 205 uses the route data 200 read from the memory and the driving control unit 208 to generate door control information, in-car broadcast information, The buzzer information is generated, and the generated information is sent to the door control processing unit 206 and the sound generation processing unit 211 (step S8). [0135] The door control processing unit 206 sends the door opening / closing information to the passenger control processing unit 207, the sound generation processing unit 211, and the security device control processing unit 204 based on the door control information (step S9).
[0136] 乗客制御処理部 207は、メモリから読み出された路線データ 200に含まれる乗車率 や最短停車時間とに基づいて乗客の重量情報を駆動制御処理部 208に、乗降情報 を車両モデル算出処理部 210に送る(ステップ S 10)。  [0136] Passenger control processing unit 207 calculates passenger weight information to drive control processing unit 208 based on the boarding rate and the shortest stop time included in route data 200 read from the memory, and vehicle model calculation of boarding / alighting information. The data is sent to the processing unit 210 (step S10).
[0137] 駆動制御処理部 208は、メモリから読み出された線路データ及び車両データに基づいて、仮想的列車の走行速度、走行位置などを算出し、走行速度及び走行位置を車掌制御処理部 205に送り（ステップ S11）、走行位置を軌道制御処理部 209に、加速度及び走行位置を車両モデル算出処理部 210と音響生成処理部 211とに送る（ステップ S14）。 [0137] The drive control processing unit 208 calculates the travel speed, travel position and so on of the virtual train based on the track data and vehicle data read from the memory, sends the travel speed and travel position to the conductor control processing unit 205 (step S11), and sends the travel position to the track control processing unit 209 and the acceleration and travel position to the vehicle model calculation processing unit 210 and the sound generation processing unit 211 (step S14).
[0138] なお、軌道制御処理部 209は、軌道狂いを算出し、車両モデル算出処理部 210に 送る(ステップ S 13)。  Note that the trajectory control processing unit 209 calculates a trajectory error and sends it to the vehicle model calculation processing unit 210 (step S 13).
[0139] 車両モデル算出処理部 210は、ばねの変位を算出し、音響生成処理部 211に送る  [0139] The vehicle model calculation processing unit 210 calculates the displacement of the spring and sends it to the sound generation processing unit 211.
(ステップ S 15)。  (Step S15).
[0140] 駆動制御処理部 208は、走行位置、走行速度を画像生成処理部 401に供給する( ステップ S 16)。  The drive control processing unit 208 supplies the travel position and travel speed to the image generation processing unit 401 (Step S 16).
[0141] 車両モデル算出処理部 210は、車両の揺れによる変位(車体変位)を算出し、算出 した車体変位を視界表示用 PC4の画像生成処理部 401に送る(ステップ S 17)。  [0141] The vehicle model calculation processing unit 210 calculates the displacement (vehicle body displacement) due to the shaking of the vehicle, and sends the calculated vehicle body displacement to the image generation processing unit 401 of the visibility display PC 4 (step S17).
[0142] 一方、視界表示用 PC4では、メモリユニットに格納された風景データ 400を読み出し、画像生成処理部 401に送り（ステップ S18）、画像生成処理部 401は、駆動制御処理部 208から供給された走行位置及び走行速度と、車両モデル算出処理部 210から供給された車体変位とに基づいて「揺れる風景」を表す画像を生成し、車両 PC20の指令に基づいて、生成されたフレーム毎の画像をプロジェクタ 3に供給する（ステップ S19）。 [0142] Meanwhile, the view-display PC 4 reads the landscape data 400 stored in the memory unit and sends it to the image generation processing unit 401 (step S18); the image generation processing unit 401 generates an image representing the "swaying landscape" based on the travel position and speed supplied from the drive control processing unit 208 and the car-body displacement supplied from the vehicle model calculation processing unit 210, and supplies each generated frame to the projector 3 on command from the vehicle PC 20 (step S19).
[0143] プロジェクタ 3は、生成した画像をスクリーン 6に供給する(ステップ S21)。  [0143] The projector 3 supplies the generated image to the screen 6 (step S21).
一方、音響生成処理部 211は、供給された警報情報、放送情報、ブザー情報などに 基づいて再生された音響をスピーカ 1から出力させる(ステップ S20)。  On the other hand, the sound generation processing unit 211 outputs sound reproduced based on the supplied alarm information, broadcast information, buzzer information, and the like from the speaker 1 (step S20).
[0144] このように、本発明の鉄道用運転シミュレータは、車両の運行操作に応じて車両の走行状態を制御し、走行位置及び走行速度と、車両の揺れによる車体変位とに基づいて、仮想的に走行する車両からの視界を立体的な「揺れる風景」として画像表示する一方、表示される画像に合わせて走行音や信号機の警報音、車両の軋り音が出力されるので、現実の列車を運転しているような臨場感や緊張感を得ることができる。また、指令用 PCから指令を行うことによりメモリユニットに格納された所定のデータが選択されて、シミュレータ環境を柔軟に設定変更することができる。 [0144] As described above, the railway driving simulator of the present invention controls the running state of the vehicle according to the driver's operation and displays the view from the virtually running vehicle as a stereoscopic "swaying landscape" based on the travel position, the travel speed, and the car-body displacement caused by the vehicle sway, while outputting the running sound, the warning sound of traffic lights, and the squeal of the vehicle in step with the displayed image; a sense of presence and tension close to driving a real train can thus be obtained. In addition, by issuing commands from the command PC, predetermined data stored in the memory unit is selected, so the simulator environment can be flexibly reconfigured.
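The cycle of steps S1–S21 can be reduced, for illustration, to the drive-control core that the other processing units hang off: each cycle, the controller input updates the train's speed and position, which the track, vehicle-model, and image-generation units then consume. The function name and constants below are assumptions, not the patent's vehicle data.

```python
def simulate_step(state, notch, dt, a_max=1.0, b_max=1.2):
    """One cycle of the drive control of steps S1-S21, reduced to its core.

    notch: controller position in [-1, 1]; positive = power, negative = brake
    state: dict with speed 'v' (m/s) and position 'x' (m)
    a_max / b_max: illustrative maximum acceleration / braking (m/s^2)
    """
    accel = notch * (a_max if notch >= 0 else b_max)
    v = max(0.0, state["v"] + accel * dt)   # speed never goes negative
    x = state["x"] + v * dt                 # integrate position
    return {"v": v, "x": x}

state = {"v": 0.0, "x": 0.0}
for _ in range(100):                        # 10 s of full power at dt = 0.1 s
    state = simulate_step(state, notch=1.0, dt=0.1)
print(round(state["v"], 2))                 # 10.0 m/s after 10 s at 1 m/s^2
```

In the full system, the resulting speed and position would be broadcast each cycle to the conductor, track, vehicle-model, and sound units (steps S11–S17), exactly as the data-flow description above lays out.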
本国際出願は、 2004年 12月 14日に出願した日本国特許出願 2004— 361695 号に基づく優先権を主張するものであり、 2004— 361695号の全内容を本国際出 願に援用する。  This international application claims priority based on Japanese Patent Application No. 2004-361695 filed on December 14, 2004, and the entire contents of 2004-361695 are incorporated into this international application.

Claims

請求の範囲 The scope of the claims
[1] 偏光眼鏡をかけた運転士による操作手段の操作に合わせて運転席前方のスクリーン に風景が 3D表示されると共に、該運転席の表示手段に操作状態が表示される鉄道 用運転シミュレータにおいて、  [1] In a railway driving simulator in which scenery is displayed in 3D on the screen in front of the driver's seat in accordance with the operation of the driver by a driver wearing polarized glasses, and the operating state is displayed on the driver's seat display ,
前記操作手段の操作に応じて仮想的列車の走行速度を求め、求めた該走行速度 と経過時間とに基づいて該仮想的列車の走行位置を求める列車制御手段と、 前記仮想的列車が走行する線路及び該線路沿線に配置された有体物に係るデー タが格納された格納部を有し、前記列車制御手段が求めた前記走行速度及び前記 走行位置に応じて仮想的運転士の左眼と右眼とからそれぞれ見えるそれぞれの風 景を生成する 3D画像生成手段と、  Train control means for determining a travel speed of a virtual train in accordance with an operation of the operation means, and determining a travel position of the virtual train based on the determined travel speed and elapsed time; and the virtual train travels A storage unit storing data relating to a track and a tangible object arranged along the track, and the left and right eyes of the virtual driver according to the travel speed and the travel position obtained by the train control means 3D image generation means for generating each scene that can be seen from the eyes;
前記 3D画像生成手段により生成されたそれぞれの風景を、互いに偏光軸が直交 する偏光板それぞれを介して前記スクリーンに投影する 3D画像表示手段とを備え、 前記列車制御手段は、前記 3D画像生成手段において、左眼と右眼とからそれぞ れ見えるそれぞれの風景が生成されたとき、生成されたそれぞれの風景を前記 3D画 像表示手段から同時に投影させることを特徴とする鉄道用運転シミュレータ。  3D image display means for projecting each landscape generated by the 3D image generation means onto the screen via polarizing plates whose polarization axes are orthogonal to each other, and the train control means includes the 3D image generation means A railway driving simulator characterized in that, when each landscape visible from the left eye and the right eye is generated, each generated landscape is simultaneously projected from the 3D image display means.
[2] 前記操作手段は、前記仮想的列車の走行速度を変化させる主幹制御器及び該仮 想的列車を制動する制動輪に圧力をかけるブレーキ弁を有すると共に、 [2] The operation means includes a master controller that changes a traveling speed of the virtual train and a brake valve that applies pressure to a brake wheel that brakes the virtual train.
前記表示手段は、走行速度及び圧力を表示する計器パネルを有するものであって 前記列車制御手段は、前記主幹制御器により変化させた前記走行速度及び前記 ブレーキ弁によりかけられた前記圧力を前記計器パネルに表示することを特徴とする 請求項 1記載の鉄道運転用シミュレータ。  The display means has an instrument panel for displaying a traveling speed and pressure, and the train control means is configured to display the traveling speed changed by the master controller and the pressure applied by the brake valve. The railway driving simulator according to claim 1, wherein the simulator is displayed on a panel.
[3] 前記列車制御手段は、 [3] The train control means includes:
前記仮想的列車を構成する車両に係るデータ、及び前記線路沿線に配置される 信号機及びプラットフォームの位置情報を含むデータが格納されたメモリと、 前記走行速度及び前記走行位置を求めると共に、前記メモリに格納された前記デ ータを用いて前記線路上の各位置における前記仮想的列車を構成する車両の揺れ を含む変位を求める演算部とを備えたことを特徴とする請求項 1記載の鉄道運転用 シミュレータ。 A memory in which data relating to vehicles constituting the virtual train and data including position information of traffic lights and platforms arranged along the track are stored, and the traveling speed and the traveling position are obtained and stored in the memory The railway operation according to claim 1, further comprising: a calculation unit that obtains a displacement including a shake of a vehicle constituting the virtual train at each position on the track using the stored data. for Simulator.
[4] 前記列車制御手段は、前記演算部で求めた前記走行速度及び前記走行位置に基 づいて、前記信号機の表示を制御する信号制御部、及び前記仮想的列車の乗降扉 の開閉を制御する扉開閉制御部を備えたことを特徴とする請求項 3記載の鉄道運転 用シミュレータ。  [4] The train control means controls a signal control unit that controls display of the traffic light and opening / closing of the entrance / exit door of the virtual train based on the travel speed and the travel position obtained by the calculation unit. The railway driving simulator according to claim 3, further comprising a door opening / closing control unit.
[5] 前記 3D画像生成手段は、前記演算部で求めた前記変位に基づいて、生成された前 記左眼から見える風景を移動させる左眼用 3D画像生成部及び生成された前記右眼 力 見える風景を移動させる右眼用 3D画像生成部を備えたことを特徴とする請求項 3記載の鉄道運転用シミュレータ。  [5] The 3D image generation means generates a left-eye 3D image generation unit that moves the scenery seen from the left eye based on the displacement obtained by the calculation unit, and the generated right eye force 4. The railway driving simulator according to claim 3, further comprising a right-eye 3D image generation unit that moves a viewable landscape.
[6] 前記 3D画像表示手段は、前記左眼用 3D画像生成部が生成し、移動させた前記風 景を、前記偏光板を介して投影する左眼用 3D画像投影部及び前記右眼用 3D画像 生成部が生成し、移動させた前記風景を、前記偏光板を介して投影する右眼用 3D 画像投影部を備えたことを特徴とする請求項 5記載の鉄道運転用シミュレータ。  [6] The 3D image display means includes a left-eye 3D image projecting unit that projects and moves the scenery generated and moved by the left-eye 3D image generating unit and the right-eye image. 6. The railway driving simulator according to claim 5, further comprising a right-eye 3D image projecting unit that projects the landscape generated and moved by the 3D image generating unit through the polarizing plate.
[7] 前記列車制御手段は、前記メモリに格納された走行音及び警報音に係るデータに基づいて、前記仮想的列車の走行に応じて生じる走行音、及び前記信号機の警報音を含む音響を生成し、所定のスピーカに出力する音響生成部を備えたことを特徴とする請求項 3記載の鉄道運転用シミュレータ。  [7] The railway driving simulator according to claim 3, wherein the train control means includes a sound generation unit that generates, based on the running-sound and warning-sound data stored in the memory, sound including the running sound produced as the virtual train travels and the warning sound of the traffic light, and outputs it to a predetermined speaker.
[8] 前記音響生成部は、前記ブレーキ弁が操作されたとき、前記計器パネルに表示され る前記走行速度及び前記圧力に応じて前記格納部に格納された軋り音に係るデー タの音量及び周波数を変化させて前記所定のスピーカから出力することを特徴とす る請求項 7記載の鉄道運転用シミュレータ。  [8] When the brake valve is operated, the sound generation unit is configured to output a volume of data relating to a roaring sound stored in the storage unit according to the traveling speed and the pressure displayed on the instrument panel. 8. The railway driving simulator according to claim 7, wherein the frequency is changed and output from the predetermined speaker.
[9] 前記左眼用 3D画像生成部及び前記右眼用 3D画像生成部は、前記演算部で求め た前記走行速度及び前記走行位置に応じて、前記格納部に格納された前記データ により前記線路及び該線路沿線に配置された有体物の、一定範囲における立体モ デルを形成すると共に、前記仮想的運転士の、対応する左眼又は右眼を視点とする レンダリングを行うことにより、 2D画像によるそれぞれの風景を生成し、前記列車制御 手段に通知をそれぞれが行うものであって、  [9] The 3D image generating unit for the left eye and the 3D image generating unit for the right eye may be configured to use the data stored in the storage unit according to the travel speed and the travel position obtained by the calculation unit. By forming a solid model of a certain range of a tangible object placed along the track and along the track, and rendering the virtual driver with the corresponding left eye or right eye as the viewpoint, a 2D image Each landscape is generated and notified to the train control means.
前記列車制御手段は、前記左眼用 3D画像生成部及び前記右眼用 3D画像生成部 双方から前記通知を受けたとき、生成された前記それぞれの風景を、前記左眼用 3D 画像表示部及び前記右眼用 3D画像表示部双方から同時に前記スクリーンに投影さ せることを特徴とする請求項 5記載の鉄道運転用シミュレータ。 The train control means includes the left-eye 3D image generation unit and the right-eye 3D image generation unit. When the notification is received from both, the generated scenery is projected onto the screen simultaneously from both the left-eye 3D image display unit and the right-eye 3D image display unit. Item 5. The railway driving simulator according to item 5.
[10] 前記左眼用 3D画像生成部及び前記右眼用 3D画像生成部それぞれは、前記仮想的列車の進行に伴って進行方向に前記線路及び前記有体物を補充する単位ブロックが設定されたものであって、該仮想的列車の走行に伴う前記変位に基づいて前記車両の位置を求め、求めた該位置に応じて、前記仮想的運転士の、対応する左眼又は右眼の視点を移動させて前記それぞれの風景を生成することを特徴とする請求項 9記載の鉄道運転用シミュレータ。  [10] The railway driving simulator according to claim 9, wherein each of the left-eye 3D image generation unit and the right-eye 3D image generation unit is set with unit blocks that replenish the track and the tangible objects in the direction of travel as the virtual train advances, determines the position of the vehicle based on the displacement accompanying the travel of the virtual train, and, according to the determined position, moves the viewpoint of the corresponding left eye or right eye of the virtual driver to generate the respective scenery.
[11] 前記仮想的列車の走行路線、及び該仮想的列車を構成する車両の種類を含むシミ ユレーシヨン環境を選択し、前記列車制御手段及び前記 3D画像生成手段に指令を 行なう指令手段を備えたことを特徴とする請求項 1記載の鉄道運転用シミュレータ。  [11] Provided with command means for selecting a simulation environment including a travel route of the virtual train and a type of vehicle constituting the virtual train, and giving a command to the train control means and the 3D image generation means The railway driving simulator according to claim 1.
PCT/JP2005/022896 2004-12-14 2005-12-13 Operation simulator for railway WO2006064817A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006548862A JPWO2006064817A1 (en) 2004-12-14 2005-12-13 Train driving simulator

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004361695 2004-12-14
JP2004-361695 2004-12-14

Publications (1)

Publication Number Publication Date
WO2006064817A1 true WO2006064817A1 (en) 2006-06-22

Family

ID=36587872

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/022896 WO2006064817A1 (en) 2004-12-14 2005-12-13 Operation simulator for railway

Country Status (2)

Country Link
JP (1) JPWO2006064817A1 (en)
WO (1) WO2006064817A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10244074A (en) * 1997-03-06 1998-09-14 Taito Corp Railway simulation game device
JPH1139509A (en) * 1997-07-15 1999-02-12 Taito Corp Picture display optimizing system
JP2000276038A (en) * 1999-03-19 2000-10-06 Honda Motor Co Ltd Driving simulator
JP2003330356A (en) * 2002-05-09 2003-11-19 East Japan Railway Co Bullet train simulator
JP2004151365A (en) * 2002-10-30 2004-05-27 Toyota Motor Corp Mobile object simulation device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3467773B2 (en) * 1996-10-09 2003-11-17 株式会社セガ GAME DEVICE, GAME PROCESSING METHOD, GAME EXECUTION METHOD, AND GAME SYSTEM
JP2003107603A (en) * 2001-09-28 2003-04-09 Namco Ltd Stereophonic image generating device, stereophonic image generation information and information storage medium
JP4365573B2 (en) * 2002-11-13 2009-11-18 株式会社ソフィア Game machine

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111566706A (en) * 2017-12-26 2020-08-21 株式会社音乐馆 Image generation system, image generation method, and program
KR20230065522A (en) * 2021-11-05 2023-05-12 한국교통대학교산학협력단 Miniature Model of the Personal Type Simulator for the Electric Multiple Unit
KR102597812B1 (en) 2021-11-05 2023-11-02 한국교통대학교산학협력단 Miniature Model of the Personal Type Simulator for the Electric Multiple Unit

Also Published As

Publication number Publication date
JPWO2006064817A1 (en) 2008-06-12

Similar Documents

Publication Publication Date Title
CN104851330B (en) A kind of parking simulated training method and system
WO1994024652A1 (en) Driving simulation system
WO2022134859A1 (en) Large-closed-space immersive driving system and control method
Bertollini et al. The general motors driving simulator
JP3817612B2 (en) Virtual driving system
KR20120093885A (en) Method for simulating specific movements by haptic feedback, and device implementing the method
US20180182261A1 (en) Real Time Car Driving Simulator
JP3400969B2 (en) 4-wheel driving simulator
CN108922307A (en) Drive simulating training method, device and driving simulation system
WO1998031444A1 (en) Amusement vehicle
JP4312977B2 (en) Driving simulation device and image display control method
JP2011227242A (en) Drive simulation device, control program of drive simulation device, and control method of drive simulation device
WO2006064817A1 (en) Operation simulator for railway
JP2015018074A (en) Vehicle drive simulation device
JP4280980B2 (en) Railway driving simulator
CN111316342B (en) Automatic brake simulation experience device of four-wheel automobile
WO2005066918A1 (en) Simulation device and data transmission/reception method for simulation device
JP4493575B2 (en) MOBILE BODY SIMULATOR, ITS CONTROL METHOD, AND CONTROL PROGRAM
JP2625035B2 (en) Driving simulation experience system
KR20230113881A (en) Immersive Virtual Reality Experience System and Control Method thereof
KR20190075357A (en) Experience apparatus
JP2007215750A (en) Railroad operation simulation game system, and game machine and server device used in the system
JP2002126359A (en) Game system
TWI817553B (en) System and method for training defensive driving
JPH0386187A (en) Amusement vehicle traveling on track

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2006548862

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05816575

Country of ref document: EP

Kind code of ref document: A1