CN110097645B - Method for removing jitter of flight simulation picture and flight simulator - Google Patents
- Publication number
- CN110097645B (application CN201910410194.5A)
- Authority
- CN
- China
- Prior art keywords
- aircraft
- camera
- image data
- display
- picture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
Abstract
The invention relates to a method for removing jitter from a flight simulation picture, and to a flight simulator, belonging to the technical field of flight simulation. The method comprises the following steps: in the flight simulation, providing a first aircraft and a second aircraft of identical structure; acquiring background video image data of the first aircraft in motion; acquiring body image data of the second aircraft at rest; and fusing the background video image data with the body image data to obtain a fused picture, which is sent to the simulation display device for display. For high-speed, long-duration, long-distance flight simulation requiring a wide simulation space, the invention provides a dual-camera fusion method that removes jitter and breaks through the technical bottleneck caused by the simulation engine's own limitations.
Description
Technical Field
The invention relates to the technical field of flight simulation, in particular to a method for removing jitter of a flight simulation picture and a flight simulator.
Background
VR/AR (virtual reality/augmented reality) technology is being applied to simulation ever more widely in aerospace and other fields. However, the simulation engines currently in use, such as Unity3D, are aimed mainly at projects with a small motion space: because coordinates are stored as single-precision (float) values, the usable coordinate range cannot exceed 100000 (the default unit in Unity3D is the metre). Beyond this range Unity3D issues the warning "Due to floating-point precision limitations, it is recommended to bring the world coordinates of the GameObject within a smaller range", and the farther the moving airplane travels from the coordinate origin, the more it shakes; the airplane may even become seriously deformed, the characters and scale bars of the cockpit displays become misaligned, and the picture blurs, as shown in FIG. 1. The application of the Unity3D engine in fields such as aerospace, which demand a large motion range, is therefore limited.
Disclosure of Invention
In view of the above analysis, the present invention aims to provide a method for removing jitter from a flight simulation picture, and a flight simulator, so as to solve the problem of picture jitter in flight simulation.
The invention is mainly realized by the following technical scheme:
the invention discloses a method for removing jitter of a flight simulation picture, which comprises the following steps of;
in flight simulation, a first aircraft and a second aircraft with the same structure are arranged;
acquiring first image data in real time, wherein the first image data is background video image data of a first aircraft during motion;
acquiring second image data, wherein the second image data is the body image data of a second aircraft in a static state;
and carrying out image fusion on the first image data and the second image data to obtain a fusion picture, and sending the fusion picture to the simulation display equipment for displaying.
Further, the flight simulation picture is obtained based on Unity 3D; importing a three-dimensional model comprising a first aircraft, a second aircraft and an airport runway in Unity 3D; and constructing a flight simulation scene.
Further, the three-dimensional models of the first aircraft, the second aircraft and the airport runway are constructed by using 3ds max software, and the file format imported into Unity3D is FBX format.
Further, in Unity3D, a first camera is created, mounted on a first aircraft, and used for capturing background video image data during the movement of the first aircraft;
and creating a second camera which is mounted on the second aircraft and is used for shooting the body image data of the second aircraft in a static state.
Further, image fusion is carried out on the image data shot by the first camera and the second camera by using a Unity3D double-camera fusion method.
Further, the dual-camera fusion specifically includes:
1) Setting a second aircraft, a first camera and a second camera;
marking layer attributes of the second aircraft;
selecting the Culling Mask attribute of the second camera as the layer attribute of the second aircraft, setting the Clear Flags attribute as 'Depth only', and setting the Depth value as 1;
setting the Culling Mask attribute of the first camera as the layer attribute of the second aircraft; clear Flags attribute is set to "Skybox", depth value is set to 0;
2) And according to the set Depth value, the picture displayed by the second camera is positioned at the upper layer, the picture displayed by the first camera is positioned at the lower layer, and the pictures are overlapped.
Further, in the flight simulation, a script for controlling the motion of the aircraft is created in Unity3D; the script obtains the rigid-body component of the first aircraft and moves it forward along the z-axis by a fixed step, thereby controlling the motion of the first aircraft.
Further, the flight simulation scene construction comprises:
selecting a sky box as a sky background, and setting the material of the sky box in an ambient lighting function Skybox option in Unity 3D;
and arranging the runway model at the coordinate origin, arranging the first aircraft model at a set position according to the set position relation, and adjusting the position of the second aircraft model to an area which is within a set range from the origin.
Further, the fused picture also includes head-up display (HUD) data generated during the motion of the first aircraft; the HUD data are fused as follows:
1) Presetting a default display position for the HUD data in the fused picture, the default display position lying outside the cabin, d metres beyond the HUD glass;
2) Judging whether dynamic target information is present between the HUD glass and the default display position; if not, go to 3); if so, go to 4);
3) Placing the HUD data at the default display position and displaying them at a fixed, preset display size;
4) Dynamically adjusting the display size and position of the HUD data according to the dynamic target information: the dynamic display size is size1 = ω·size0, where size0 is the preset display size and ω is a scaling factor computed from α, l and m, α being the distance from the operator's eyes to the HUD glass in the fused picture during simulated operation, l the distance from the HUD glass to the dynamic target, and m the thickness of the dynamic target, i.e. the target's width along the operator's line of sight; the dynamic display position is d1 metres beyond the HUD glass outside the cabin.
in another aspect, a flight simulator is disclosed, which comprises a simulation display device, wherein the simulation picture on the simulation display device is subjected to picture jitter elimination by any one of the above methods for removing jitter.
The invention has the following beneficial effects:
the invention provides a method for removing jitter through the fusion of double cameras for high-speed, long-time and long-distance flight simulation which needs a wide flight simulation space, breaks through the technical bottleneck caused by the self limitation of a simulation engine (Unity 3D software), eliminates troubles for the majority of technologists, and brings great benefits for the development of related industrial projects, the training of personnel and the like.
In addition, the method makes low demands on the flight-simulation hardware: a small simulator can be built with nothing more than an off-the-shelf throttle lever, control stick and HTC Vive head-mounted display. It can give users an immersive virtual-reality experience and, compared with a full-scale simulator, is inexpensive and easy to deploy in quantity.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, wherein like reference numerals are used to designate like parts throughout.
FIG. 1 is a diagram illustrating image dithering in flight simulation;
FIG. 2 is a flow chart of a method for removing jitter from a flight simulation image according to an embodiment of the present invention;
FIG. 3 is a background picture taken by a first camera in an embodiment of the present invention;
FIG. 4 is the aircraft-body picture taken by the second camera in an embodiment of the present invention;
fig. 5 is a fusion picture obtained by performing image fusion in the embodiment of the present invention.
Detailed Description
The preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings, which form a part hereof, and which together with the embodiments of the invention serve to explain the principles of the invention.
Embodiment I
The embodiment discloses a method for removing jitter from a flight simulation picture, which comprises the following specific steps, as shown in fig. 2:
s1, in flight simulation, a first aircraft and a second aircraft which have the same structure are arranged;
s2, acquiring first image data, wherein the first image data is background video image data of a first aircraft during motion;
s3, acquiring second image data, wherein the second image data is the body image data of a second aircraft in a static state;
and S4, carrying out image fusion on the first image data and the second image data to obtain a fusion picture, and sending the fusion picture to a simulation display device for displaying.
Preferably, the Unity3D engine is selected as the simulation engine of this embodiment. Unity3D development is component-based rather than pure code, which greatly simplifies the simulation work and shortens the development cycle and difficulty.
Importing a three-dimensional model comprising a first aircraft, a second aircraft and an airport runway in the Unity 3D; constructing a flight simulation scene; controlling the motion of the aircraft to acquire image data and flight parameters; and transmitting and displaying the image data and the flight parameters.
The three-dimensional models of the first aircraft, the second aircraft and the airport runway can be built in 3ds Max and, after texture mapping, exported as FBX files.
Start Unity3D and create a new project. Import the FBX runway and aircraft models exported from 3ds Max, make a Prefab of each, and copy the model Prefabs, naming them runway, first aircraft and second aircraft respectively.
Preferably, the first aircraft model and the second aircraft model adopt the same structure, and the first aircraft is defined as a moving aircraft, and the second aircraft is defined as a static aircraft.
Therefore, the establishment and import of the three-dimensional model are completed, and then a flight simulation scene needs to be established.
In order to build a flight simulation scene that gives the effect of flying against a sky background, a suitable skybox is chosen as the sky background: in Unity3D, select the menu Window → Lighting to open the lighting view, and set the skybox material in the Skybox option of the Environment Lighting section. The skybox of this embodiment may be selected and downloaded from a public website.
In the scene construction, the runway model is placed at the coordinate origin and the first aircraft at a set position, according to the requirements of the simulated flight and the set positional relationship, where it waits for the take-off instruction. Since the second aircraft does not take part in the flight, there is no strict requirement on its position, which may simply be adjusted to lie within a set range of the origin.
By determining the relative positions of the first aircraft model, the second aircraft model and the airport runway, a simple flight simulation scenario is constructed, as shown in fig. 5.
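As a sketch only, the scene layout described above could be scripted as follows; the object names ("Runway", "FirstAircraft", "SecondAircraft") and the positions are illustrative assumptions, not taken from the patent:

```csharp
using UnityEngine;

// Illustrative sketch of the scene layout described above.
public class SceneLayout : MonoBehaviour
{
    void Start()
    {
        // Runway model at the coordinate origin.
        GameObject.Find("Runway").transform.position = Vector3.zero;

        // Moving (first) aircraft at its set take-off position on the runway.
        GameObject.Find("FirstAircraft").transform.position = new Vector3(0f, 2f, 50f);

        // Static (second) aircraft anywhere within a set range of the origin;
        // its exact position does not matter, since it never moves.
        GameObject.Find("SecondAircraft").transform.position = new Vector3(0f, 2f, 0f);
    }
}
```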
In order to feed the background video of the simulated flight — sky and runway — back to the operator's eyes and strengthen the sense of immersion, a first camera is created and mounted on the first aircraft model, its shooting angle adjusted so that it captures the background video of the first aircraft in motion, yielding the video image data.
However, if only a single camera is used, then once the aircraft flies beyond 100000 m Unity3D issues the warning "Due to floating-point precision limitations, it is recommended to bring the world coordinates of the GameObject within a smaller range"; the farther the aircraft is from the coordinate origin, the more it shakes, the aircraft may even deform seriously, the characters and scale bars of the cockpit display become misaligned, and the picture blurs.
This is because the coordinate range allowed by Unity3D's float precision cannot exceed 100000 m; the flight parameters of the first aircraft themselves, however, show no "jump" or "jitter" phenomena.
In order to solve the problem, a second camera is created and mounted on a second aircraft model, and the shooting angle of the camera is adjusted so that the camera can shoot the aircraft body of the second aircraft in a static state to obtain the image data of the aircraft body.
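The creation and mounting of the two cameras could be sketched as below; the names and the cockpit-viewpoint offset are illustrative assumptions:

```csharp
using UnityEngine;

// Sketch: create the two cameras and mount them on the aircraft models.
public class CameraSetup : MonoBehaviour
{
    Camera Mount(string cameraName, Transform aircraft)
    {
        var cam = new GameObject(cameraName).AddComponent<Camera>();
        cam.transform.SetParent(aircraft, false);                    // ride along with the aircraft
        cam.transform.localPosition = new Vector3(0f, 1.2f, -0.5f);  // assumed cockpit viewpoint
        return cam;
    }

    void Start()
    {
        // First camera: mounted on the moving aircraft, films the background.
        Mount("BackgroundCamera", GameObject.Find("FirstAircraft").transform);
        // Second camera: mounted on the static aircraft, films only its body.
        Mount("BodyCamera", GameObject.Find("SecondAircraft").transform);
    }
}
```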
And then, performing image fusion on the image data shot by the first camera and the second camera by using a Unity3D double-camera fusion method.
Preferably, the dual-camera fusion specifically includes:
1) Setting a second aircraft, a first camera and a second camera;
marking the Layer attribute of the second aircraft as "zucand";
setting the Culling Mask attribute of the second camera to "zucand", its Clear Flags attribute to "Depth only", and its Depth value to 1;
setting the Culling Mask attribute of the first camera to exclude "zucand", its Clear Flags attribute to "Skybox", and its Depth value to 0;
2) And according to the set Depth value, the picture displayed by the second camera is positioned at the upper layer, the picture displayed by the first camera is positioned at the lower layer, and the pictures are overlapped.
The second camera, mounted on the second aircraft, displays only the static airplane, not the moving airplane, the runway or the sky background; the first camera, mounted on the first aircraft, displays only the background scene, including the runway and sky. The two pictures are then fused: the second camera serves as the main camera and its picture sits on the upper layer; the first camera serves as the background camera and its picture sits on the lower layer. In the superimposed picture the aircraft appears to be flying, yet the aircraft actually shown is the stationary one, not the moving one, and the changing background is the picture from the moving background camera. The image fusion thus achieves its effect by a deliberate visual illusion.
Thus, even if the coordinate values of the moving aircraft exceed the range limited by Unity3D and the aircraft shakes, that shaking is never displayed: the main camera shows the static airplane, which stays well within the limits of Unity3D and does not shake, achieving the goal of jitter removal.
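The dual-camera settings above could be sketched in script form as follows; it assumes a user-defined layer named "zucand" already exists in the project and that the camera references are assigned, and note that setting the layer on the root object is a simplification (child meshes would need the same layer):

```csharp
using UnityEngine;

// Sketch of the dual-camera fusion configuration described above.
public class DualCameraFusion : MonoBehaviour
{
    public Camera backgroundCamera; // first camera, on the moving aircraft
    public Camera bodyCamera;       // second camera, on the static aircraft
    public GameObject staticAircraft;

    void Start()
    {
        int zucand = LayerMask.NameToLayer("zucand");
        staticAircraft.layer = zucand;

        // Second camera: render only the "zucand" layer, on the upper layer
        // (Depth 1), clearing only the depth buffer so the background shows through.
        bodyCamera.cullingMask = 1 << zucand;
        bodyCamera.clearFlags = CameraClearFlags.Depth;
        bodyCamera.depth = 1;

        // First camera: render everything except "zucand", on the lower layer
        // (Depth 0), clearing to the skybox.
        backgroundCamera.cullingMask = ~(1 << zucand);
        backgroundCamera.clearFlags = CameraClearFlags.Skybox;
        backgroundCamera.depth = 0;
    }
}
```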
The control of the movement of the moving aircraft specifically comprises:
newly building a script for controlling the movement of the airplane: right clicking the assembly folder of Project window with mouse, selecting Create → C # Script in the pop-up menu, and generating control file after clicking.
Because controlling the aircraft's motion involves kinematics, dynamics, control-law and atmosphere models, the real control relationships are very complex; they are simplified here without affecting the explanation of the technical scheme.
Move the airplane forward along the z-axis by a fixed step. In void Start() { }, add the key code:
Rigidbody rigidbody = GameObject.Find("plane").GetComponent<Rigidbody>();
which obtains the rigid-body component of the moving aircraft.
In void Update () { }, the following key codes are added:
plane.Translate(Vector3.forward * speed * Time.deltaTime, Space.Self);
where plane is the Transform component of the moving airplane. The Translate() function handles the movement of the moving aircraft; Vector3.forward means the aircraft advances along the z-axis; speed is a user-settable speed parameter; Time.deltaTime is the time elapsed from the previous frame to the current one; and Space.Self selects the local coordinate system.
Mount this script on the first aircraft to control its flight and obtain flight parameters, including flight speed, altitude and so on.
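The snippets above can be combined into one MonoBehaviour sketch; the object name "plane" follows the text, while the class name and default speed are assumptions:

```csharp
using UnityEngine;

// Consolidated sketch of the motion-control script built from the snippets above.
public class PlaneMove : MonoBehaviour
{
    Transform plane;
    Rigidbody planeBody;
    public float speed = 100f; // speed control parameter, user-settable (assumed value)

    void Start()
    {
        // Obtain the rigid-body component of the moving aircraft.
        planeBody = GameObject.Find("plane").GetComponent<Rigidbody>();
        plane = planeBody.transform;
    }

    void Update()
    {
        // Advance along the local z-axis by a fixed step each frame.
        plane.Translate(Vector3.forward * speed * Time.deltaTime, Space.Self);
    }
}
```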
The flight-parameter data remain smooth throughout the first aircraft's motion: no "jitter" or "jump" occurs even when the aircraft's z coordinate exceeds 100000 m, and the aircraft keeps advancing by its fixed step. The shaking of the moving airplane is thus independent of the coordinate data, which makes the return and display of the moving airplane's parameters possible.
Preferably, when the aircraft performing the flight simulation is an airplane with a head-up display (HUD), then as described above, if only a single camera is used, once the aircraft flies beyond 100000 m the characters and scale bars of the HUD in the cabin become misaligned and the display blurs — yet the flight parameters themselves show no "jumping" or "jitter".
Therefore, in this embodiment the first image data and the second image data are fused into one picture, which is sent to the simulation display device for display, eliminating the blurring of the HUD picture caused by the limits of the Unity3D software.
Specifically, a HUD-content file is created and display key code is added in void Update() { }: the third image data — the content of the first (moving) aircraft that needs to be shown on the HUD — is obtained, and display code is then added so that the HUD of the second (static) aircraft can show it. For example, the speed information displayed on the HUD is obtained by adding:
speed = rigidbody.
When viewing the fused picture, the simulation operator must watch the moving background and the HUD information at the same time. If the HUD data are fused directly onto the HUD glass of the static aircraft, the distance between the HUD data and the moving background is too great: the operator's eyes must refocus continually, causing visual fatigue and loss of concentration, and increasing the apparent shake of the HUD picture.
To solve the problem, the fusion method for the head-up display data of the embodiment includes:
1) Preset a default display position for the HUD data in the fused picture, outside the cabin and d metres beyond the HUD glass; d can be set according to the simulation scene so as to reduce the refocusing of the operator's eyes, and hence reduce visual fatigue and loss of concentration;
2) Determine whether dynamic target information is present between the HUD glass and the default display position; if not, go to 3); if so, go to 4);
3) Place the HUD data at the default display position and display them at a fixed, preset display size;
4) Dynamically adjust the display size and position of the HUD data according to the dynamic target information: the dynamic display size is size1 = ω·size0, where size0 is the preset display size and ω is a scaling factor computed from α, l and m, α being the distance from the operator's eyes to the HUD glass in the fused picture during simulated operation, l the distance from the HUD glass to the dynamic target, and m the thickness of the dynamic target, i.e. the target's width along the operator's line of sight; the dynamic display position is d1 metres beyond the HUD glass outside the cabin.
if a plurality of dynamic objects are included between the position of the flat display glass and the default display position, the dynamic display of the display size and position of the flat display data is performed based on the information of the dynamic object closest to the position of the flat display glass.
Specifically, the background video image data captured by the first camera is shown in fig. 3; the body picture taken by the second camera in fig. 4; and the resulting fused picture in fig. 5. The picture-jitter problem of high-speed, long-distance flight simulation is thereby solved.
For high-speed, long-duration, long-distance flight simulation requiring a wide simulation space, the jitter-removal method disclosed in this embodiment constructs a moving airplane and a static airplane and fuses the pictures of two cameras, displaying the moving airplane's background and parameters on the static airplane. It overcomes the technical bottleneck caused by the simulation engine's limits, removes the picture jitter, spares practitioners this trouble, and benefits the development of related industrial projects and the training of personnel.
Embodiment II
The embodiment discloses a flight simulator whose hardware comprises an off-the-shelf throttle lever, a control stick and a simulation display device, such as an HTC Vive head-mounted display.
The simulation picture on the simulation display device is de-jittered using the jitter-removal method of Embodiment I. The simulator can give the user an immersive virtual-reality experience and, compared with a full-scale simulator, is inexpensive and easy to deploy in quantity.
While the invention has been described with reference to specific preferred embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.
Claims (10)
1. A method for removing jitter from flight simulation pictures, characterized by comprising the following steps:
in flight simulation, a first aircraft and a second aircraft with the same structure are arranged;
acquiring first image data in real time, wherein the first image data is background video image data of a first aircraft during motion;
acquiring second image data, wherein the second image data is body image data of a second aircraft in a static state;
and sending a fusion picture obtained by image fusion of the first image data and the second image data to a simulation display device for display.
2. The debounce method according to claim 1, wherein the flight simulation picture is obtained based on Unity 3D;
the flight simulation comprises:
importing a three-dimensional model comprising a first aircraft, a second aircraft and an airport runway in the Unity 3D; and constructing a flight simulation scene.
3. The debounce method of claim 2, wherein importing a three-dimensional model comprising a first aircraft, a second aircraft, and an airport runway in Unity3D comprises:
constructing a three-dimensional model of the first aircraft, the second aircraft and the airport runway using 3ds max software;
exporting a three-dimensional model file in an FBX format;
and importing the exported three-dimensional model file in the FBX format into Unity3D.
4. The method of removing jitter according to claim 3,
in the Unity3D, a first camera is created, is mounted on a first aircraft and is used for shooting background video image data of the first aircraft in the motion process;
and creating a second camera which is mounted on the second aircraft and is used for shooting the body image data of the second aircraft in a static state.
5. The debounce method according to claim 4, wherein image data captured by the first camera and the second camera are image-fused using a Unity3D dual-camera fusion method.
6. The method according to claim 5, wherein the dual-camera fusion specifically comprises:
1) Setting a second aircraft, a first camera and a second camera;
marking the layer attributes of the second aircraft;
selecting the Culling Mask attribute of the second camera as the layer attribute of the second aircraft, setting the Clear Flags attribute as 'Depth only', and setting the Depth value as 1;
setting the Culling Mask attribute of the first camera as the layer attribute of the second aircraft; clear Flags attribute is set to "Skybox", depth value is set to 0;
2) And according to the set Depth value, the picture displayed by the second camera is positioned at the upper layer, the picture displayed by the first camera is positioned at the lower layer, and the pictures are overlapped.
7. The debounce method according to any one of claims 2-6, wherein in flight simulation, the rigid body component of the first aircraft is obtained by creating a script for controlling the motion of the aircraft in Unity3D, and the rigid body component of the aircraft is moved forward along the z-axis by a certain step size to control the motion of the first aircraft.
8. The debounce method according to any one of claims 2-6, wherein the flight simulation scene construction comprises:
selecting a sky box as a sky background, and setting the material of the sky box in an ambient lighting function Skybox option in the Unity 3D;
and arranging the runway model at the coordinate origin, arranging the first aircraft model at a set position according to the set position relation, and adjusting the position of the second aircraft model to be in a region within a set range from the origin.
9. The method for removing jitter according to any one of claims 1-6, wherein the fused picture further includes head-up display (HUD) data generated during the motion of the first aircraft; the HUD data are fused as follows:
1) presetting a default display position for the HUD data in the fused picture, the default display position lying outside the cabin, d metres beyond the HUD glass;
2) determining whether dynamic target information is present between the HUD glass and the default display position; if not, go to 3); if so, go to 4);
3) placing the HUD data at the default display position and displaying them at a fixed, preset display size;
4) dynamically adjusting the display size and position of the HUD data according to the dynamic target information: the dynamic display size is size1 = ω·size0, where size0 is the preset display size and ω is a scaling factor computed from α, l and m, α being the distance from the operator's eyes to the HUD glass in the fused picture during simulated operation, l the distance from the HUD glass to the dynamic target, and m the thickness of the dynamic target, i.e. the target's width along the operator's line of sight; the dynamic display position is d1 metres beyond the HUD glass outside the cabin.
10. A flight simulator comprising a simulation display device, wherein the simulation picture on the simulation display device is de-jittered by the method for removing jitter according to any one of claims 1 to 9.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN201910410194.5A (CN110097645B) | 2019-05-16 | 2019-05-16 | Method for removing jitter of flight simulation picture and flight simulator |
Publications (2)
| Publication Number | Publication Date |
| --- | --- |
| CN110097645A | 2019-08-06 |
| CN110097645B | 2023-01-20 |
Family
ID=67448297
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN201910410194.5A (Active, CN110097645B) | Method for removing jitter of flight simulation picture and flight simulator | 2019-05-16 | 2019-05-16 |
Families Citing this family (3)
| Publication Number | Publication Date | Title |
| --- | --- | --- |
| CN113838313B | 2022-02-18 | Obstacle identification method for course beacon channel clearance jitter |
| CN116165914A | 2023-05-26 | Simulation method of avionics system and related products |
| CN116310243B | 2023-08-15 | AR anti-shake method, system, equipment and storage medium based on artificial intelligence |
Family Cites Families (3)
| Publication Number | Publication Date | Title |
| --- | --- | --- |
| CN103400018B | 2016-03-09 | The system and method of a kind of flight program check and checking |
| CN106530894B | 2019-03-08 | A kind of virtual head up display method and system of flight training device |
| CN108460731A | 2018-08-28 | A method of eliminating scenery picture jitter in networking flight simulation |

2019-05-16: application CN201910410194.5A filed; patent CN110097645B granted (Active).
Legal Events
- PB01 — Publication
- SE01 — Entry into force of request for substantive examination
- GR01 — Patent grant