CN107154197A - Immersion flight simulator - Google Patents
- Publication number
- CN107154197A CN107154197A CN201710370869.9A CN201710370869A CN107154197A CN 107154197 A CN107154197 A CN 107154197A CN 201710370869 A CN201710370869 A CN 201710370869A CN 107154197 A CN107154197 A CN 107154197A
- Authority
- CN
- China
- Prior art keywords
- module
- video
- virtual reality
- flight simulator
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/08—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
- G09B9/30—Simulation of view from aircraft
- G09B9/307—Simulation of view from aircraft by helmet-mounted projector or display
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/08—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
- G09B9/30—Simulation of view from aircraft
- G09B9/301—Simulation of view from aircraft by computer-processed or -generated image
- G09B9/302—Simulation of view from aircraft by computer-processed or -generated image the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view
Abstract
The present invention is an immersion flight simulator and belongs to the technical field of flight simulators. The immersion flight simulator comprises a head-mounted virtual reality device with a 3D binocular camera, a tracking and positioning device, a video processing device, a scene processing device and a display device, wherein the 3D binocular camera is connected with the video processing device, and the scene processing device is connected with the virtual reality device, the tracking and positioning device and the display device respectively. The immersion flight simulator of the present invention uses mixed reality technology: the system creates a cockpit scene in which the virtual and the real are fused, natural interaction between visible body parts such as the hands and the cockpit can be realized through the camera, and the trainee's perception of the terrain environment and the cabin environment during simulated flight is enhanced.
Description
Technical field
The present invention relates to the technical field of flight simulators, and in particular to an immersion flight simulator.
Background technology
With the continuing development of computer technology, virtual reality technology is increasingly applied to war simulation in military exercises and training. Current flight simulators based on virtual reality technology have several shortcomings: although the hands can be materialized in the virtual scene, other equipment and other parts of the body cannot; lighting is simulated only to a limited degree and differs considerably from real illumination; head-motion tracking is not advanced enough; video shot from several fixed viewpoints must be stitched together, and the stitched video differs from naked-eye content in angle and depth in ways that are difficult to eliminate, with problems such as the inability to look downward and a limited field of view. As a result, interaction between the person and the cockpit differs greatly from the real situation being simulated, the quality of immersion is poor, and the practicality of the flight simulator is reduced.
Summary of the invention
In order to solve the problems of prior-art flight simulators used in military exercises or training, namely that interaction between the person and the cockpit differs greatly from the real situation being simulated and that the quality of immersion is poor, the present invention provides an immersion flight simulator that can solve the above problems.
The technical scheme for realizing the object of the invention is:
An immersion flight simulator comprises a head-mounted virtual reality device with a 3D binocular camera, a tracking and positioning device, a video processing device, a scene processing device and a display device, wherein the 3D binocular camera is connected with the video processing device, and the scene processing device is connected with the virtual reality device, the tracking and positioning device and the display device respectively.
In a further technical scheme, a magnetometer, a gyroscope and an accelerometer are arranged in the head-mounted virtual reality device.
In a further technical scheme, the scene processing device is provided with:
Virtual scene driving module: generates the whole training site or battlefield simulation scene containing battlefield information;
Stereoscopic vision generation module: renders the simulation scene generated by the virtual scene driving module stereoscopically;
Virtual reality scene tracking module: sets the position and posture of the camera in the stereoscopic vision generation module according to six-degree-of-freedom information, simulating the operator's head movement track;
Live-scene capture module: calls the camera software development kit, creates a virtual camera object, sets the required parameters and performs live-scene capture;
Foreground extraction module: extracts the foreground;
Video compression module: compresses JPEG images;
Virtual-real fusion module: fuses the processed video into the virtual reality scene, ensuring that the frame rate, size and position of the video are adapted to the virtual scene.
The foreground extraction module comprises: a 2D four-point texture-mapping module, for obtaining the positional information of the foreground video image other than depth; a contour extraction module based on Hough-transformed depth-image connected domains, for obtaining the depth information of the foreground video image; an extraction module based on markers and skin-color detection, for obtaining limb images in connected regions outside the capture area; and a foreground extraction module based on a model mask and spatial-multiplexing positioning, which extracts the foreground image based on the information obtained by the 2D four-point texture-mapping module and the contour extraction module, and then merges the image from the module based on the model mask and spatial-multiplexing positioning to complete foreground extraction.
In a further technical scheme, the foreground extraction module based on the model mask and spatial-multiplexing positioning applies an equal-proportion model mask to the scene using the six-degree-of-freedom information of the real camera and of the virtual camera.
In a further technical scheme, the video compression module is optimized on the basis of the standard JPEG image compression framework, ensuring the compatibility of the algorithm.
In a further technical scheme, the virtual-real fusion module fuses the video, after three-dimensional geometric registration and occlusion processing, edge-smoothing transition, frame-interpolation compensation, illumination consistency and shadow synthesis, into the virtual reality scene, ensuring that the frame rate, size and position of the video are adapted to the virtual scene.
In a further technical scheme, the six-degree-of-freedom information refers to the six-degree-of-freedom information of the head position, generated mainly by the tracking and positioning device capturing video of the operator's head while within the traceable range of that device; otherwise the six-degree-of-freedom information of the head position is obtained from the magnetometer, gyroscope and accelerometer built into the head-mounted virtual reality device.
Beneficial effects of the present invention are:
Through the mixed reality cockpit technology of the present invention, fixing the camera on the virtual reality helmet exactly transforms the spatial positions of the human body and the operating platform into the virtual scene, and controlling the operating platform by hand adds a real tactile experience. The viewing angle follows head rotation, restoring human vision to a large degree and in particular solving the downward-view problem of some helicopters; technical difficulties such as virtual-real illumination consistency and distortion elimination are also overcome. The invention makes up for the deficiencies of current large flight simulator systems, improves the trainee's sense of immersion and sense of reality, brings simulated training closer to real training, and enhances the practicality of the simulation system; theoretical knowledge is consolidated and operating and handling capacity improved at minimal cost. With a high degree of immersion and telepresence, it converts single-aircraft and formation air drills into an engaging interactive mode, breaks through the limitations of the training ground, makes maximum use of indoor space, and improves training safety and quality while reducing training cost.
The software system of the present invention relies on a three-dimensional geographic information platform and comprehensively utilizes technologies such as three-dimensional modeling, virtual simulation, mass data management and three-dimensional spatial analysis. With goals of global sharing of training information under the immersion experience, unified allocation of training resources and scientific assistance to operational command, it integrates knowledge from multiple high-technology fields such as information processing and automatic control based on the three-dimensional geographic information platform. Its technology, together with weapon technology simulation, weapon system modeling and combat simulation, plays a significant role in military training, weapon system-of-systems evaluation, operational command and programme planning.
The present invention uses mixed reality technology: the system creates a cockpit scene in which the virtual and the real are fused, natural interaction between visible body parts such as the hands and the cockpit can be realized through the camera, and the trainee's perception of the terrain environment and the cabin environment during simulated flight is enhanced. The system architecture is simple and cost-effective; in addition to single-aircraft subjects such as take-off and landing training, it can also realistically simulate multi-aircraft subjects such as formation flying, reproducing a real battlefield environment and providing the trainee with a realistic visual, auditory and tactile experience. It effectively reduces personnel and materiel losses in the military field and reduces training cost, and thus has important application and promotional value.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the immersion flight simulator in an embodiment of the present invention.
Embodiment
Fig. 1 serves to explain the present invention, but the invention is not limited to the scope shown in Fig. 1.
To solve the prior-art problems that cockpit interaction simulation differs greatly from the real situation being simulated and that the quality of immersion is poor, the present invention uses mixed reality technology: the system creates a cockpit scene in which the virtual and the real are fused, natural interaction between visible body parts such as the hands and the cockpit can be realized through the camera, and the trainee's perception of the terrain environment and the cabin environment during simulated flight is enhanced.
As shown in Fig. 1, the immersion flight simulator comprises a head-mounted virtual reality device 2 with a 3D binocular camera 1, a tracking and positioning device 3, a video processing device 4, a scene processing device 5 and a display device, wherein the 3D binocular camera is connected with the video processing device, and the scene processing device is connected with the virtual reality device, the tracking and positioning device and the display device respectively. A magnetometer, a gyroscope and an accelerometer are arranged in the head-mounted virtual reality device.
Specifically, the scene processing device is provided with:
Virtual scene driving module: generates the whole training site or battlefield simulation scene containing battlefield information;
Stereoscopic vision generation module: renders the simulation scene generated by the virtual scene driving module stereoscopically;
Virtual reality scene tracking module: sets the position and posture of the camera in the stereoscopic vision generation module according to six-degree-of-freedom information, simulating the operator's head movement track;
Live-scene capture module: calls the camera software development kit, creates a virtual camera object, sets the required parameters and performs live-scene capture;
Foreground extraction module: extracts the foreground;
Video compression module: compresses JPEG images;
Virtual-real fusion module: fuses the processed video into the virtual reality scene, ensuring that the frame rate, size and position of the video are adapted to the virtual scene.
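The module chain above can be sketched as a simple processing pipeline. This is only an illustrative sketch: every function name is a hypothetical stand-in for one of the modules described, not an actual API of the system.

```python
# Hypothetical sketch of the scene-processing pipeline described above.
# Each stage stands in for one module; names and data are illustrative only.

def capture_frame(camera_frame):
    """Live-scene capture: pass the raw binocular frame through."""
    return camera_frame

def extract_foreground(frame):
    """Foreground extraction: keep pixels flagged as cockpit/limb foreground."""
    return [px for px in frame if px["foreground"]]

def compress(pixels):
    """Stand-in for the JPEG compression stage: wrap the payload."""
    return {"pixels": pixels, "size": len(pixels)}

def fuse_into_scene(compressed, virtual_scene):
    """Virtual-real fusion: overlay foreground content on the virtual scene."""
    return virtual_scene + [px["value"] for px in compressed["pixels"]]

frame = [{"foreground": True, "value": "hand"},
         {"foreground": False, "value": "wall"},
         {"foreground": True, "value": "stick"}]
scene = ["terrain", "sky"]
result = fuse_into_scene(compress(extract_foreground(capture_frame(frame))), scene)
print(result)  # ['terrain', 'sky', 'hand', 'stick']
```

The point of the sketch is the ordering: capture precedes extraction, extraction precedes compression, and fusion always operates on the processed video, matching the module list above.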
The foreground extraction module comprises: a 2D four-point texture-mapping module, for obtaining the positional information of the foreground video image other than depth; a contour extraction module based on Hough-transformed depth-image connected domains, for obtaining the depth information of the foreground video image; an extraction module based on markers and skin-color detection, for obtaining limb images in connected regions outside the capture area; and a foreground extraction module based on a model mask and spatial-multiplexing positioning, which extracts the foreground image based on the information obtained by the 2D four-point texture-mapping module and the contour extraction module, and then merges the image from the module based on the model mask and spatial-multiplexing positioning to complete foreground extraction.
Further, the foreground extraction module based on the model mask and spatial-multiplexing positioning applies an equal-proportion model mask to the scene using the six-degree-of-freedom information of the real camera and of the virtual camera.
Further, the video compression module is optimized on the basis of the standard JPEG image compression framework, ensuring the compatibility of the algorithm.
Further, the virtual-real fusion module fuses the video, after three-dimensional geometric registration and occlusion processing, edge-smoothing transition, frame-interpolation compensation, illumination consistency and shadow synthesis, into the virtual reality scene, ensuring that the frame rate, size and position of the video are adapted to the virtual scene.
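The edge-smoothing transition in the fusion step can be illustrated with a minimal sketch: a hard foreground mask is softened into an alpha ramp near its edge before the video row is blended over the virtual-scene row, so the virtual-real seam is gradual rather than abrupt. The pixel values and the 3-pixel feather width are illustrative assumptions, not parameters from the patent.

```python
# Sketch of edge-smoothed virtual-real blending (1-D row for clarity).
# A 0/1 foreground mask becomes a soft alpha ramp near the mask edge.

def feather_alpha(mask, feather=3):
    """Turn a hard 0/1 mask into soft alpha values near edges."""
    n = len(mask)
    alpha = [float(m) for m in mask]
    for i in range(n):
        if mask[i]:
            # distance to the nearest background pixel, capped at the feather width
            d = min((abs(i - j) for j in range(n) if not mask[j]), default=feather)
            alpha[i] = min(1.0, d / feather)
    return alpha

def blend(video, virtual, mask):
    """Alpha-blend the foreground video row over the virtual-scene row."""
    a = feather_alpha(mask)
    return [a[i] * video[i] + (1 - a[i]) * virtual[i] for i in range(len(mask))]

video   = [200, 200, 200, 200, 200, 200]   # bright foreground video
virtual = [50,  50,  50,  50,  50,  50]    # darker virtual scene
mask    = [0,   1,   1,   1,   1,   1]     # foreground starts at index 1
row = blend(video, virtual, mask)
print(row[0], row[-1])  # 50.0 200.0 -> pure virtual outside, pure video deep inside
```

Pixels just inside the mask edge take intermediate values, which is the smoothing transition the module performs before illumination and shadow processing.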
Preferably, the six-degree-of-freedom information refers to the six-degree-of-freedom information of the head position, generated mainly by the tracking and positioning device capturing video of the operator's head while within the traceable range of that device; otherwise the six-degree-of-freedom information of the head position is obtained from the magnetometer, gyroscope and accelerometer built into the head-mounted virtual reality device.
The specific foreground extraction steps are:
1. Calibration of the multiple cameras, including acquisition of the intrinsic parameters, extrinsic parameters and distortion parameters;
2. Marker detection to locate the central field of view;
3. Extraction of the foreground cabin interior from the color video collected by the live-scene capture module, with contour extraction based on Hough-transformed depth-image connected domains:
The 3D model of the whole foreground cockpit is designed in advance and saved in a file; on initialization the module loads this 3D model;
The depth image is pre-processed on the graphics processor (GPU), the CUDA API is called to produce a video mask, and parts whose depth is too large are filtered out;
The processed color image is copied from the GPU to the CPU;
The connected domains of the depth image are analyzed with OpenCV to obtain the edge of each connected domain;
The correspondence between the 3D model and the three-dimensional coordinates of the trainee's headset is determined from the position of the tracking and positioning device, and the current relative position of the 3D binocular camera within the 3D model is then calculated;
The edge pixels of each connected domain of the depth image are extracted with an improved Canny operator;
Straight-line segments are obtained with the Hough transform, reducing edge roughness and the complexity of subsequent edge decisions;
The coordinates of the edge points, with the camera as origin, are obtained;
Whether each connected-domain edge point lies spatially inside the 3D model is judged: if so, the corresponding points of the connected domain in the color image are output; if not, the points of the connected domain clipped by the 3D model boundary are output;
4. The equal-proportion model mask and spatial-multiplexing registration locate the video capture range;
5. Skin outside the capture range is detected with an improved elliptical skin-color detection method;
6. The multi-video stitching algorithm is improved.
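The depth-filtering and connected-domain steps above can be sketched in miniature: pixels whose depth exceeds a cutoff (too far to be cockpit foreground) are discarded, and the remaining pixels are grouped into 4-connected regions, which is the role OpenCV's connected-component analysis plays in the pipeline. The depth grid and the 1.5 m cutoff are illustrative assumptions, not values from the patent.

```python
# Sketch: threshold a depth image, then label 4-connected foreground regions.

def connected_components(depth, max_depth=1.5):
    """Label 4-connected regions of pixels with depth <= max_depth."""
    h, w = len(depth), len(depth[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for sy in range(h):
        for sx in range(w):
            if depth[sy][sx] <= max_depth and labels[sy][sx] == 0:
                current += 1                      # new region: flood-fill it
                stack = [(sy, sx)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and labels[y][x] == 0 \
                            and depth[y][x] <= max_depth:
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

depth = [[0.6, 0.7, 9.0, 1.2],
         [0.6, 9.0, 9.0, 1.1],
         [9.0, 9.0, 9.0, 1.0]]   # metres; 9.0 = background, filtered out
labels, n = connected_components(depth)
print(n)  # 2 -> the near patch on the left and the column on the right
```

Each labelled region's boundary pixels are what the Canny/Hough stages then refine into clean edge segments.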
More specifically, the stereoscopic vision generation module uses the Unigine engine to render stereoscopically (Stereoscopic) the whole training or battlefield simulation scene generated by the virtual scene driving subsystem, including red force command information, blue force and threat generation information, the scenario and mission information.
VR scene tracking module: within the traceable range of the tracker, the six-degree-of-freedom information of the head position is mainly generated by the tracker capturing head video; otherwise the magnetometer, gyroscope and accelerometer inside the headset obtain the six-degree-of-freedom information of the head. When the six-degree-of-freedom information arrives at the scene processing computer, the software sets the position and posture of the camera (Camera) inside Unigine, simulating the trainee's head movement track; and since the observable field of view of the camera is fixed, the scene content produced on the lenses changes accordingly.
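The source-selection logic described above is simple: prefer the external tracker's absolute pose while the head is inside its working volume, and fall back to the headset's inertial estimate otherwise. The pose tuple format and sample values below are illustrative assumptions.

```python
# Sketch of the head-pose source selection: external tracker when in range,
# built-in magnetometer/gyroscope/accelerometer estimate otherwise.
# Pose format (x, y, z, roll, pitch, yaw) is an illustrative assumption.

def select_pose(tracker_pose, imu_pose, in_tracker_range):
    """Return the six-degree-of-freedom pose from the preferred source."""
    return tracker_pose if in_tracker_range else imu_pose

tracker = (0.10, 1.60, 0.30, 0.0, 5.0, 90.0)   # absolute position + attitude
imu     = (0.00, 1.60, 0.00, 0.1, 5.2, 89.5)   # drifts without an absolute fix

pose_in  = select_pose(tracker, imu, True)
pose_out = select_pose(tracker, imu, False)
print(pose_in is tracker, pose_out is imu)  # True True
```

The selected pose is then applied to the virtual camera, which is why the rendered view follows the trainee's head in either regime.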
Live-scene capture module: based on the open-source OpenCV library and Nvidia CUDA, the live-scene capture module calls the camera SDK and performs the following operations: it creates a virtual camera object and sets parameters such as resolution and frame rate, while using omnidirectional-screen capture technology (the omnidirectional screen is composed of a large glossy transparent material; the reflections and flare produced on it by outdoor sunlight and indoor lighting would be captured by the camera and amplified).
When the trainee puts on the VR headset, in the virtual training or combat scene run by the scene computer, the HD video shot by the 3D binocular camera is processed by the video computer to generate the simulated cabin part, the avionics instrument section, the two hand-operated controls (the cyclic stick and collective lever, and the left and right weapon grips) and the two foot-operated pedals. The trainee's current spatial orientation information in the cabin (including spatial translation and spatial rotation) is captured by the VR locator, collected into the scene computer and applied by the display system software to update the scene in real time, giving the trainee the immersive feeling of flying a real aircraft.
The present invention improves the camera's inaccurate close-range depth detection by means of the 3D binocular camera: depth and disparity maps are produced from the original left-eye and right-eye video, the two maps are thresholded to extract the foreground at poorly recognized distances and depths, and close-range depth is precisely analyzed and predicted with an algorithm.
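The stereo relation behind the depth map is Z = f * B / d for a rectified binocular rig, where f is the focal length in pixels, B the baseline between the two cameras and d the left-right disparity in pixels. Small disparities (distant points) give unstable depth, which is why the text singles out close-range depth for extra processing. The focal length and baseline below are illustrative assumptions.

```python
# Sketch of disparity-to-depth conversion for a rectified stereo pair.
# focal_px and baseline_m are illustrative, not the system's actual values.

def disparity_to_depth(disparity_px, focal_px=700.0, baseline_m=0.065):
    """Depth Z = f * B / d; zero disparity means a point at infinity."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

near = disparity_to_depth(91.0)   # large disparity -> close point
far  = disparity_to_depth(5.0)    # small disparity -> distant point
print(round(near, 3), round(far, 2))  # 0.5 9.1
```

Note that a one-pixel disparity error at d = 5 shifts depth by metres, while at d = 91 it shifts depth by millimetres; thresholding the disparity map therefore isolates exactly the close foreground where depth is reliable.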
Real-time correction of the camera's radial and tangential distortion: a real-time algorithm detects the range of tangential distortion after camera calibration, and automatic correction of the radial distortion, which is mainly barrel distortion, is carried out using vanishing points.
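The radial distortion that calibration estimates and correction inverts is commonly modelled as a polynomial in the squared radius: a normalized point (x, y) maps to (x, y) scaled by 1 + k1*r^2 + k2*r^4, with barrel distortion corresponding to negative k1. The coefficients below are illustrative assumptions, not calibrated values from the system.

```python
# Sketch of the radial (barrel) distortion model; correction applies the
# inverse mapping. k1, k2 are illustrative coefficients only.

def distort(x, y, k1=-0.2, k2=0.02):
    """Apply radial distortion to a normalized image point (x, y)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The image centre is unaffected; edge points are pulled toward the centre,
# which is the characteristic look of barrel distortion.
cx, cy = distort(0.0, 0.0)
ex, ey = distort(1.0, 0.0)
print(round(ex, 2))  # 0.82
```

Tangential distortion adds two further terms (p1, p2) to the same model; the text treats it separately because its range must be detected per calibration.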
Left-eye and right-eye fusion to emulate human vision: the disparity map is turned into a red-blue image, distortion post-processing is then applied, and the content observable by both eyes is cut out. The human body within the capture area is not extracted with the skin-color detection algorithm.
Virtual reality technology and video fusion
Unified handling of illumination and shadow on virtual-real joint edges and surfaces: the normals of the edges and surfaces and the various lighting effects of the rendering engine are obtained, and the virtual and real edges and surfaces are processed in two different ways to unify the whole scene.
Synchronization of head movement with video image changes: position and posture are interpolated from head positioning data, a frame-interpolation algorithm adapted to high-frequency vibration and large-amplitude motion is implemented, and overall synchronization is finally achieved.
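The pose interpolation mentioned above is conventionally split into a linear interpolation for position and a spherical linear interpolation (slerp) for orientation, so that frames inserted between two tracked samples move smoothly even under fast head motion. Quaternions here are (w, x, y, z), and the sample poses are illustrative assumptions.

```python
# Sketch of pose interpolation between two head-tracking samples:
# lerp for position, slerp for orientation quaternions.
import math

def lerp(a, b, t):
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(len(a)))

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0:                        # take the short way around
        q1, dot = tuple(-c for c in q1), -dot
    if dot > 0.9995:                   # nearly parallel: lerp is stable enough
        q = lerp(q0, q1, t)
    else:
        theta = math.acos(dot)
        s0 = math.sin((1 - t) * theta) / math.sin(theta)
        s1 = math.sin(t * theta) / math.sin(theta)
        q = tuple(s0 * a + s1 * b for a, b in zip(q0, q1))
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)     # renormalize against rounding drift

p = lerp((0.0, 1.6, 0.0), (0.1, 1.6, 0.2), 0.5)
# midpoint between identity and a 90-degree rotation about y: 45 degrees
q = slerp((1.0, 0.0, 0.0, 0.0), (0.7071, 0.0, 0.7071, 0.0), 0.5)
print(p)  # (0.05, 1.6, 0.1)
```

Slerp keeps angular velocity constant along the interpolation, which is what prevents inserted frames from appearing to speed up or lag during rapid head rotation.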
Enhanced immersion experience and reduced vertigo: on the basis of the two functions above, the visual and audio experience of immersion is strengthened; the picture is reprocessed to eliminate a certain amount of fisheye effect, partial-pixel blurring and sharpening simulate human-eye focusing, and motion blur with effective audiovisual distance-strength variation is added to reduce the vertigo caused by inconsistency between the motion of the virtual scene and that of the real scene.
The above embodiment is only a specific implementation of the invention, but the invention is not limited to this embodiment; any equivalent modification or prior-art addition made according to the present application without departing from the concept of the present invention is regarded as falling within the technical scope of the present invention.
Claims (8)
1. An immersion flight simulator, characterized in that it comprises a head-mounted virtual reality device with a 3D binocular camera, a tracking and positioning device, a video processing device, a scene processing device and a display device, wherein the 3D binocular camera is connected with the video processing device, and the scene processing device is connected with the virtual reality device, the tracking and positioning device and the display device respectively.
2. The immersion flight simulator according to claim 1, characterized in that a magnetometer, a gyroscope and an accelerometer are arranged in the head-mounted virtual reality device.
3. The immersion flight simulator according to claim 1, characterized in that the visual scene processing device is provided with:
a virtual scene driving module: generates simulated scenes of an entire training site or battlefield containing battlefield information;
a stereoscopic vision generation module: renders the simulated scenes generated by the virtual scene driving module in stereo;
a virtual reality scene tracking module: sets the camera position and attitude in the stereoscopic vision generation module according to six-degree-of-freedom information, simulating the operator's head movement trajectory;
a real-scene acquisition module: calls the camera's software development kit, creates a virtual camera object, sets the required parameters, and captures real-scene video;
a foreground extraction module: extracts the foreground;
a video compression module: compresses JPEG images;
a virtual-real fusion module: fuses the processed video into the virtual reality scene, ensuring that the video frame rate, size, and position are adapted to the virtual scene.
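The scene-tracking step of claim 3 — driving the virtual camera's position and attitude from six-degree-of-freedom head data — can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the `Camera`-style pose representation and the Z-Y-X Euler convention are assumptions.

```python
import numpy as np

def euler_to_matrix(yaw, pitch, roll):
    """Build a rotation matrix from yaw/pitch/roll (radians), Z-Y-X order."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def apply_head_pose(six_dof):
    """Map a 6-DoF sample (x, y, z, yaw, pitch, roll) to a virtual-camera
    pose: a 3-vector position and a 3x3 orientation matrix."""
    x, y, z, yaw, pitch, roll = six_dof
    position = np.array([x, y, z], dtype=float)
    orientation = euler_to_matrix(yaw, pitch, roll)
    return position, orientation
```

Feeding each tracked sample through `apply_head_pose` and assigning the result to the render camera reproduces the operator's head trajectory inside the virtual scene.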
4. The immersion flight simulator according to claim 3, characterized in that the virtual-real fusion module registers the video through three-dimensional geometry and then, after occlusion handling, edge smoothing and transition, interpolation compensation, illumination consistency, and shadow synthesis, fuses the video into the virtual scene, ensuring that the video frame rate, size, and position are adapted to the virtual scene.
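The final compositing stage of that fusion pipeline might look like the minimal sketch below. It assumes the foreground mask has already been extracted and the video frame already scaled and positioned to match the virtual frame; the occlusion, interpolation-compensation, illumination-consistency, and shadow steps of claim 4 are omitted.

```python
import numpy as np

def fuse_frames(virtual_frame, video_frame, mask):
    """Composite a real video frame over a rendered virtual frame.

    virtual_frame, video_frame: HxWx3 uint8 arrays of identical size
    (claim 4 requires the video to be adapted to the scene first).
    mask: HxW float array in [0, 1]; 1 = foreground (keep video pixel).
    """
    alpha = mask[..., None]  # broadcast the mask over the colour channels
    fused = alpha * video_frame + (1.0 - alpha) * virtual_frame
    return fused.astype(np.uint8)
```

A soft-edged mask (values between 0 and 1 near contours) gives the smooth edge transition the claim describes.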
5. The immersion flight simulator according to claim 3, characterized in that the video compression module is optimized on the basis of the standard JPEG image compression framework, ensuring compatibility of the algorithm.
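Optimizing "within the standard JPEG framework while keeping compatibility" typically means tuning the quantization tables, which travel in the file header and so remain readable by any standard decoder. The sketch below shows the standard's per-block DCT-and-quantize step; the flat quantization table used in testing is illustrative, not from the patent.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix used by the JPEG standard."""
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] *= 1 / np.sqrt(2)          # scale the DC row for orthonormality
    return M * np.sqrt(2 / n)

def quantize_block(block, qtable):
    """Level-shift, 2-D DCT, and quantize one 8x8 block (values 0..255)."""
    M = dct_matrix()
    coeffs = M @ (block - 128.0) @ M.T
    return np.round(coeffs / qtable).astype(int)

def dequantize_block(q, qtable):
    """Inverse of quantize_block: dequantize and inverse 2-D DCT."""
    M = dct_matrix()
    return M.T @ (q * qtable) @ M + 128.0
```

Because only `qtable` changes between a stock encoder and a tuned one, the output stays decodable everywhere, which is the compatibility property the claim asserts.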
6. The immersion flight simulator according to claim 3, characterized in that the foreground extraction module comprises:
a 2D four-point texture module: obtains the positional information, other than depth, of the foreground video image;
a contour extraction module for depth-image connected domains based on the Hough transform: obtains the depth information of the foreground video image;
an extraction module based on markers and skin-color detection: obtains the limb images outside the captured connected region;
a foreground extraction module based on model masking and spatially multiplexed positioning: extracts the foreground image from the information obtained by the 2D four-point texture module and the contour extraction module, then merges in its model-masked image to complete the foreground extraction.
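The marker and skin-color detection step of claim 6 is commonly implemented with simple per-pixel color rules. The thresholds below follow a widely used illustrative RGB rule and are not values from the patent.

```python
import numpy as np

def skin_mask(frame_rgb):
    """Rough per-pixel skin-colour classifier on an HxWx3 uint8 RGB frame.

    Uses the common rule R > 95, G > 40, B > 20, R dominant, and a
    minimum spread between R and the darker channels. Returns an HxW
    boolean mask that can seed limb extraction outside the main
    connected region."""
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)
    return ((r > 95) & (g > 40) & (b > 20)
            & (r > g) & (r > b)
            & ((r - np.minimum(g, b)) > 15))
```

In practice such a mask is cleaned with morphological filtering before being merged with the depth-based contours.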
7. The immersion flight simulator according to claim 4, 5 or 6, characterized in that the foreground extraction module based on model masking and spatially multiplexed positioning applies an equal-proportion model mask to the scene using the six-degree-of-freedom information of the real camera and of the virtual camera.
8. The immersion flight simulator according to claim 7, characterized in that the six-degree-of-freedom information is either the head-position six-degree-of-freedom information generated, within the trackable range of the tracking and positioning device, from video of the operator's head captured by that device, or the head-position six-degree-of-freedom information obtained from the magnetometer, gyroscope, and accelerometer built into the head-mounted virtual reality device.
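Deriving head orientation from the headset's built-in magnetometer, gyroscope, and accelerometer, the second source named in claim 8, is usually realized with sensor fusion. A single-axis complementary-filter sketch follows; the blend factor 0.98 is a conventional choice, not a value from the patent.

```python
import math

def complementary_pitch(pitch_prev, gyro_rate, accel, dt, k=0.98):
    """Fuse one gyroscope axis with an accelerometer tilt estimate.

    pitch_prev: previous pitch estimate (radians)
    gyro_rate:  angular rate about the pitch axis (rad/s)
    accel:      (ax, ay, az) accelerometer sample in g
    dt:         sample interval in seconds
    k:          blend factor; the gyro dominates short-term response,
                the accelerometer corrects long-term drift
    """
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return k * (pitch_prev + gyro_rate * dt) + (1 - k) * accel_pitch
```

Yaw additionally needs the magnetometer as its drift reference, since gravity carries no heading information.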
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710370869.9A CN107154197A (en) | 2017-05-18 | 2017-05-18 | Immersion flight simulator |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107154197A true CN107154197A (en) | 2017-09-12 |
Family
ID=59792943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710370869.9A Pending CN107154197A (en) | 2017-05-18 | 2017-05-18 | Immersion flight simulator |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107154197A (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109272804A (en) * | 2018-09-14 | 2019-01-25 | 温州大学 | Orthogonal video positioning method for a pilot's head movement under flight cockpit constraints |
CN109686168A (en) * | 2018-12-14 | 2019-04-26 | 中国航空工业集团公司西安飞机设计研究所 | Air crew training system based on mixed reality |
CN109949659A (en) * | 2019-05-07 | 2019-06-28 | 中山艾福莱航空仿真科技有限公司 | Flight and maintenance simulator based on Prepar3D |
WO2019140945A1 (en) * | 2018-01-22 | 2019-07-25 | 中国人民解放军陆军航空兵学院 | Mixed reality method applied to flight simulator |
CN111123743A (en) * | 2020-01-03 | 2020-05-08 | 中仿智能科技(上海)股份有限公司 | Man-machine interaction system for simulating aircraft |
CN113035010A (en) * | 2019-12-24 | 2021-06-25 | 北京普德诚科技有限责任公司 | Virtual and real scene combined visual system and flight simulation device |
CN113035011A (en) * | 2021-04-02 | 2021-06-25 | 周浩洋 | 3D immersive flight cabin based on Realsense |
CN113192373A (en) * | 2021-03-18 | 2021-07-30 | 徐州九鼎机电总厂 | Periscope simulation imaging method based on immersive human-computer interaction simulation system |
CN113298955A (en) * | 2021-05-25 | 2021-08-24 | 厦门华厦学院 | Real scene and virtual reality scene fusion method and system and flight simulator |
US20210327295A1 (en) * | 2020-04-17 | 2021-10-21 | Rockwell Collins, Inc. | Head tracking with virtual avionics training products |
CN114743433A (en) * | 2021-12-23 | 2022-07-12 | 中国科学院软件研究所 | Multi-channel alarm presenting method and device for simulating threats in flight training environment |
CN115019597A (en) * | 2022-05-23 | 2022-09-06 | 中国人民解放军海军航空大学 | Aviation simulation training method, device and system based on cloud computing and cloud rendering |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101208723A (en) * | 2005-02-23 | 2008-06-25 | 克雷格·萨默斯 | Automatic scene modeling for the 3D camera and 3D video |
CN101231790A (en) * | 2007-12-20 | 2008-07-30 | 北京理工大学 | Augmented reality flight simulator based on multiple fixed cameras |
CN101305401A (en) * | 2005-11-14 | 2008-11-12 | 微软公司 | Stereo video for gaming |
CN102157011A (en) * | 2010-12-10 | 2011-08-17 | 北京大学 | Method for dynamic texture acquisition and virtual-real fusion using mobile shooting equipment |
CN103136793A (en) * | 2011-12-02 | 2013-06-05 | 中国科学院沈阳自动化研究所 | Live-action fusion method based on augmented reality and device using the same |
CN104035760A (en) * | 2014-03-04 | 2014-09-10 | 苏州天魂网络科技有限公司 | System for realizing immersive virtual reality on mobile platforms |
CN105320820A (en) * | 2015-12-02 | 2016-02-10 | 上海航空电器有限公司 | Rapid cockpit design system and method based on an immersive virtual reality platform |
CN106055113A (en) * | 2016-07-06 | 2016-10-26 | 北京华如科技股份有限公司 | Mixed-reality helmet display system and control method |
CN106157731A (en) * | 2015-04-07 | 2016-11-23 | 深圳威阿科技有限公司 | Simulated flight cockpit system with mixed reality function and processing method thereof |
CN106408515A (en) * | 2016-08-31 | 2017-02-15 | 郑州捷安高科股份有限公司 | Augmented reality-based vision synthesis system |
CN106530894A (en) * | 2017-01-10 | 2017-03-22 | 北京捷安申谋军工科技有限公司 | Virtual head-up display method and system for a flight trainer using augmented reality technology |
- 2017-05-18: Application CN201710370869.9A filed (CN); legal status: Pending
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019140945A1 (en) * | 2018-01-22 | 2019-07-25 | 中国人民解放军陆军航空兵学院 | Mixed reality method applied to flight simulator |
CN109272804B (en) * | 2018-09-14 | 2020-11-24 | 温州大学 | Orthogonal video positioning method for head movement of pilot under constraint of flight cockpit |
CN109272804A (en) * | 2018-09-14 | 2019-01-25 | 温州大学 | Orthogonal video positioning method for a pilot's head movement under flight cockpit constraints |
CN109686168A (en) * | 2018-12-14 | 2019-04-26 | 中国航空工业集团公司西安飞机设计研究所 | Air crew training system based on mixed reality |
CN109949659A (en) * | 2019-05-07 | 2019-06-28 | 中山艾福莱航空仿真科技有限公司 | Flight and maintenance simulator based on Prepar3D |
CN113035010A (en) * | 2019-12-24 | 2021-06-25 | 北京普德诚科技有限责任公司 | Virtual and real scene combined visual system and flight simulation device |
CN113035010B (en) * | 2019-12-24 | 2023-07-21 | 北京普德诚科技有限责任公司 | Virtual-real scene combined vision system and flight simulation device |
CN111123743A (en) * | 2020-01-03 | 2020-05-08 | 中仿智能科技(上海)股份有限公司 | Man-machine interaction system for simulating aircraft |
US20210327295A1 (en) * | 2020-04-17 | 2021-10-21 | Rockwell Collins, Inc. | Head tracking with virtual avionics training products |
CN113192373A (en) * | 2021-03-18 | 2021-07-30 | 徐州九鼎机电总厂 | Periscope simulation imaging method based on immersive human-computer interaction simulation system |
CN113035011A (en) * | 2021-04-02 | 2021-06-25 | 周浩洋 | 3D immersive flight cabin based on Realsense |
CN113298955A (en) * | 2021-05-25 | 2021-08-24 | 厦门华厦学院 | Real scene and virtual reality scene fusion method and system and flight simulator |
CN113298955B (en) * | 2021-05-25 | 2024-04-30 | 厦门华厦学院 | Real scene and virtual reality scene fusion method, system and flight simulator |
CN114743433A (en) * | 2021-12-23 | 2022-07-12 | 中国科学院软件研究所 | Multi-channel alarm presenting method and device for simulating threats in flight training environment |
CN114743433B (en) * | 2021-12-23 | 2023-03-24 | 中国科学院软件研究所 | Multi-channel alarm presenting method and device for simulating threats in flight training environment |
CN115019597A (en) * | 2022-05-23 | 2022-09-06 | 中国人民解放军海军航空大学 | Aviation simulation training method, device and system based on cloud computing and cloud rendering |
CN115019597B (en) * | 2022-05-23 | 2023-10-03 | 中国人民解放军海军航空大学 | Aviation simulation training method, device and system based on cloud computing and cloud rendering |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107154197A (en) | Immersion flight simulator | |
CN107134194A (en) | Immersion vehicle simulator | |
CN106157359B (en) | Design method of virtual scene experience system | |
US8040361B2 (en) | Systems and methods for combining virtual and real-time physical environments | |
EP2491530B1 (en) | Determining the pose of a camera | |
US20100182340A1 (en) | Systems and methods for combining virtual and real-time physical environments | |
US7479967B2 (en) | System for combining virtual and real-time environments | |
CN106162137B (en) | Virtual visual point synthesizing method and device | |
US5796991A (en) | Image synthesis and display apparatus and simulation system using same | |
CN106710362A (en) | Flight training method implemented by using virtual reality equipment | |
EP2175636A1 (en) | Method and system for integrating virtual entities within live video | |
CN103543827B (en) | Implementation method of an immersive outdoor-activity interaction platform based on a single camera | |
WO2019140945A1 (en) | Mixed reality method applied to flight simulator | |
CN106791778A (en) | Interior decoration design system based on AR virtual reality technology | |
CN106408515A (en) | Augmented reality-based vision synthesis system | |
CN207883156U (en) | Scenic-spot simulated flight experience apparatus | |
CN105959595A (en) | Autonomous virtual-to-real response method for real-time virtual-real interaction | |
CN109920000B (en) | Multi-camera cooperation-based dead-corner-free augmented reality method | |
CN107067299A (en) | Virtual fit method and system | |
CN112446939A (en) | Three-dimensional model dynamic rendering method and device, electronic equipment and storage medium | |
CN104463956B (en) | Construction method and device for virtual scene of lunar surface | |
CN106780754A (en) | Mixed reality method and system | |
CN113035010A (en) | Virtual and real scene combined visual system and flight simulation device | |
CN114139370A (en) | Synchronous simulation method and system for optical engine and electromagnetic imaging dual-mode moving target | |
CN207601427U (en) | Simulation laboratory based on virtual reality mixing | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | Application publication date: 20170912 |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: Room 1001-1020, North building, Internet of things building, 368 Xinshi North Road, Shijiazhuang, Hebei 050000; Applicant after: ZHONGKE HENGYUN Co.,Ltd. Address before: 050000 10th floor, IOT building, 377 xinshizhong Road, Shijiazhuang City, Hebei Province; Applicant before: HEBEI ZHONGKE HENGYUN SOFTWARE TECHNOLOGY Co.,Ltd. |
| WD01 | Invention patent application deemed withdrawn after publication | |