CN117542253A - Pilot cockpit training system - Google Patents
Pilot cockpit training system
- Publication number
- CN117542253A (application CN202311424279.1A)
- Authority
- CN
- China
- Prior art keywords
- cockpit
- training
- glasses
- scene image
- driving scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching control of vehicles or other craft
- G09B9/08—Simulators for teaching control of aircraft, e.g. Link trainer
- G09B9/30—Simulation of view from aircraft
- G09B9/301—Simulation of view from aircraft by computer-processed or -generated image
- G09B9/302—Simulation of view from aircraft, the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view
- G09B9/307—Simulation of view from aircraft by helmet-mounted projector or display
- G09B9/308—Simulation of view from aircraft by LCD, gas plasma display or electroluminescent display
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
Abstract
The invention discloses a pilot cockpit training system comprising a hemispherical green screen, a training cockpit and VR glasses; a camera, a computing unit, a gyroscope module, a storage unit, a liquid crystal display screen and convex lenses are integrated in the VR glasses. The hemispherical green screen covers the periphery of the training cockpit. The training cockpit sends the control information generated in the cockpit to the computing unit. The camera, the gyroscope module, the storage unit and the liquid crystal display screen are each communicatively connected with the computing unit. The camera acquires images of the real driving scene in the training cockpit; the gyroscope module monitors the motion data of the VR glasses; virtual driving scene image data are pre-stored in the storage unit. The computing unit fuses the real driving scene image data from the training cockpit with the virtual driving scene image data in the storage unit to obtain a mixed reality scene image, and transmits it to the liquid crystal display screen for real-time display.
Description
Technical Field
The invention belongs to the technical field of pilot flight simulation training systems, and particularly relates to a pilot cockpit training system.
Background
With the development of computer technology, virtual reality (VR) and mixed reality (MR) technologies have shown broad application prospects in many fields. Existing virtual reality technology uses a computer to generate a three-dimensional virtual world that provides visual, auditory, tactile and other sensory simulation, with which the user interacts naturally through devices such as handheld controllers. The technology relies mainly on the computer's processing power; its display field of view is close to the natural field of view of the human eye, and it largely isolates the user from surrounding sights and sounds, giving a strong sense of immersion. The implementation is relatively simple and the hardware cost is low.
The mixed reality technology currently in use fuses the real world with the virtual world for display, better realizing interaction between the user and both the real and the virtual world. Conventional implementations generally project a virtual driving scene into the real world by holographic projection based on interference and diffraction principles. This approach suffers from a narrow projection field of view, a complex structure, high cost, high power consumption and insufficient immersion.
Disclosure of Invention
To overcome the limitation that existing pilot training uses either virtual reality or mixed reality technology in isolation, the invention provides a pilot cockpit training system that combines the two. It achieves strong immersion with a large display field of view, a simple system structure and low cost, and offers a wide range of applications and strong generality.
The invention aims at realizing the following technical scheme:
the pilot cockpit training system comprises a hemispherical green curtain, a training cockpit and VR glasses, wherein a camera, a computing unit, a gyroscope module, a storage unit, a liquid crystal display screen and convex lenses are integrated in the VR glasses; the hemispherical green curtain is buckled on the periphery of the training cockpit; the training personnel wear VR glasses to carry out operation training in the training cockpit, the training cockpit is in communication connection with a computing unit of the VR glasses, and control information of the training personnel in the cockpit is sent to the computing unit; the camera, the gyroscope module, the storage unit and the liquid crystal display screen are respectively in communication connection with the computing unit; the camera is used for acquiring real driving scene images during training in the training cockpit in real time and transmitting the acquired real driving scene image data to the computing unit; the gyroscope module is used for monitoring real-time motion data of the VR glasses and transmitting the motion data to the computing unit; virtual driving scene image data corresponding to different model tasks are pre-stored in the storage unit; the computing unit is used for carrying out data processing on the real driving scene image data in the training cockpit and the virtual driving scene image data integrated in the storage unit, fusing mixed reality scene images, and transmitting the mixed reality scene images to the liquid crystal display screen for real-time image display; the display picture of the liquid crystal display screen is projected into the human eye vision of the training person after being corrected by the convex lens.
Further, a control stick, a throttle, brakes, electrical switches, buttons and an instrument panel are arranged in the training cockpit.
Further, the trainee's control information in the cockpit comprises the trainee's operations on the control stick, throttle, brakes, electrical switches and buttons.
Further, there are left and right liquid crystal display screens and left and right convex lenses; the left and right liquid crystal display screens are arranged in front of the left and right convex lenses respectively, and each convex lens sits directly in front of the corresponding eye.
Further, the cameras form a pair located at the front end of the VR glasses, and the spacing between the cameras is adjustable.
Further, the computing unit retrieves the virtual driving scene data corresponding to the current aircraft-model task stored in the storage unit, comprehensively processes the real driving scene image data transmitted by the camera, the VR glasses motion data transmitted by the gyroscope module and the control information transmitted by the training cockpit, and obtains a mixed reality scene image through mixed reality scene fusion.
Further, the mixed reality scene fusion process of the computing unit includes:
S1. Position calibration between the VR glasses and the virtual driving scene;
S2. Virtual driving scene image processing:
determining the display field of view of the virtual driving scene image, i.e. the virtual driving scene image to be displayed, according to the position and angle of the VR glasses calibrated in step S1;
rearview mirror imaging of the cockpit: from the size and position of the green screen on the rearview mirror surface captured by the VR glasses, and taking the mirror plane as the plane of symmetry, the virtual scene image in the storage unit that corresponds to the green-screen region of the rearview mirror within the virtual driving scene behind the cockpit is computed and retrieved in real time according to the planar reflection imaging principle and transmitted to the liquid crystal display in real time;
S3. Fused display of the mixed reality images:
the computing unit performs image fusion on the real in-cockpit scene image obtained by the camera, the computed virtual driving scene image and the virtual scene image to be displayed in the rearview mirror;
S4. Correction of gyroscope errors.
Further, the step S1 includes:
S11. Fixing the coordinate system of the virtual driving scene: calculating the movement direction, speed and position of the training cockpit within the virtual driving scene;
S12. Position calibration between the VR glasses and the training cockpit: establishing a coordinate system with a chosen point in the training cockpit as origin, and converting the motion data monitored by the gyroscope module in the VR glasses into this cockpit coordinate system;
S13. Converting the VR glasses' motion data from the cockpit coordinate system to the virtual driving scene coordinate system, finally determining the corresponding position and angle of the VR glasses in the virtual driving scene.
Further, in step S3 the image fusion includes: first aligning the three image frames; the computing unit then replaces the detected green pixels in the real scene frame obtained by the camera with the corresponding pixels of the virtual driving scene frame and of the rearview-mirror virtual scene frame, respectively; finally, the fused mixed reality scene image is transmitted to the liquid crystal display screens of the VR glasses. Each frame is processed in turn according to the above steps.
Further, the step S4 includes: setting marker points on the hemispherical green screen or in the training cockpit, and correcting the drift of the gyroscope at fixed time intervals using the marker points captured by the camera.
The invention has the following beneficial effects:
1. the display view angle is larger:
Traditional mixed reality glasses based on holographic projection display a virtual picture with a field of view of no more than 40 degrees; the virtual picture displayed by the invention has a far larger field of view, close to the natural field of view of the human eye.
2. The visual experience effect is outstanding:
A traditional flight simulator uses a flat or curved display screen, whose picture is limited and lacks stereoscopic depth and distance perception; the invention provides the user with a more realistic stereoscopic impression and a 360-degree picture matching the real world, and can simulate the real visual effect and impact of the whole flight process.
With an existing flight simulator worn with ordinary VR glasses, the user cannot see the real cockpit after putting on the glasses: only the throttle lever and control stick can be held, and no other operations can be performed.
Existing flight simulators using AR glasses adopt holographic projection to present the virtual world, with a narrow field of view, a transparent image, poor experience, a lack of realism, complex technology and short working time; the invention achieves a large 120-degree field of view close to the normal range of the human eye, with high image quality and a strong sense of reality.
3. Simple structure, with low costs:
Most existing mixed reality technologies adopt holographic projection imaging, with a complex manufacturing process, high cost and difficulty in mass production. On the basis of virtual reality technology, the invention places two cameras (left and right), one gyroscope module and one computing unit at the front of the glasses, converts the real-world scene captured by the cameras into a real image, fuses it with the virtual driving scene image, and then displays the result in the user's field of view through the left and right liquid crystal display screens and convex lenses. This achieves the mixed reality effect with a much simpler structure and processing technology.
4. The application scene is wider, the universality is strong:
The invention can be used as virtual reality glasses and, with its extended functions, can replace traditional mixed reality glasses by superimposing the real-world scene and the virtual scene on the liquid crystal display screens through computer processing. It thus integrates the advantages of both virtual reality and mixed reality techniques.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a training system for a pilot cockpit according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an embodiment of the invention in which an image displayed on a liquid crystal display is magnified and imaged by a convex lens;
in the figure:
1. hemispherical green screen; 2. training cockpit; 3. camera; 4. computing unit; 5. gyroscope module; 6. storage unit; 7. liquid crystal display screen; 8. convex lens.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
This embodiment is a pilot cockpit training system comprising a hemispherical green screen 1, a training cockpit 2 and VR glasses; integrated in the VR glasses are a pair of cameras 3, a computing unit 4, a gyroscope module 5, a storage unit 6, left and right liquid crystal display screens 7 and left and right convex lenses 8.
The hemispherical green screen 1 covers the periphery of the training cockpit, so that the view through the cockpit portholes is the green screen. The training cockpit 2 is provided with input devices such as a control stick, throttle, brakes, switches and buttons, and output devices such as an instrument panel (digital display) showing the aircraft's status data. The trainee wears the VR glasses while operating in the training cockpit; the training cockpit 2 is communicatively connected with the computing unit 4 of the VR glasses and sends the trainee's control information in the cockpit, including operations on the control stick, throttle, brakes, switches, buttons, etc., to the computing unit 4.
The cameras 3, the gyroscope module 5, the storage unit 6 and the liquid crystal display screens 7 are each communicatively connected with the computing unit 4. The cameras 3 acquire in real time the real driving scene images during training in the training cockpit 2 and transmit the acquired image data to the computing unit 4. The gyroscope module 5 monitors the real-time motion data of the VR glasses and transmits them to the computing unit 4; registration between the real driving scene image and the virtual driving scene image is achieved through the gyroscope module 5. Virtual driving scene image data corresponding to different aircraft-model tasks are pre-stored in the storage unit 6. The computing unit 4 processes the real driving scene image data from the training cockpit and the virtual driving scene image data in the storage unit 6 and fuses them into a mixed reality scene image: it retrieves the virtual driving scene data corresponding to the current model task pre-stored in the storage unit 6, comprehensively processes the real driving scene image data from the cameras 3, the VR glasses motion data from the gyroscope module 5 and the control information from the training cockpit 2, obtains a mixed reality scene image through mixed reality scene fusion, and transmits it to the liquid crystal display screens 7 for real-time display. The left and right liquid crystal display screens 7 are arranged in front of the left and right convex lenses 8 respectively, the convex lenses sit directly in front of the eyes, and the display pictures of the two screens are projected into the trainee's field of view after correction by the convex lenses.
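The convex lenses act as simple magnifiers: each screen sits just inside its lens's focal length, so the eye sees an enlarged virtual image at a comfortable distance. A thin-lens sketch of this arrangement (the focal length and screen distance below are illustrative assumptions, not values from the patent):

```python
def virtual_image(do_mm, f_mm):
    """Thin-lens relation 1/do + 1/di = 1/f for a screen at distance do < f
    in front of the lens. Returns (di, m): di < 0 indicates a virtual image
    on the same side as the screen; m is the upright lateral magnification."""
    di = 1.0 / (1.0 / f_mm - 1.0 / do_mm)  # image distance (negative: virtual)
    m = -di / do_mm                        # magnification seen by the eye
    return di, m

# e.g. a screen 40 mm from a 50 mm-focal-length lens appears as a virtual
# image about 200 mm away, magnified 5x.
```

Placing the screen exactly at the focal plane would send the image to infinity; VR optics typically sit slightly inside it, as sketched here.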
The pair of cameras 3 is located at the front end of the VR glasses, with adjustable spacing; when the camera spacing equals the user's interpupillary distance, the spatial impression presented by the VR glasses is closest to that of the naked eye.
The training cockpit 2 can be replaced according to the specific model task.
The mixed reality scene fusion process of the computing unit is as follows:
S1. Position calibration between the VR glasses and the virtual driving scene:
S11. Fixing the coordinate system of the virtual driving scene: calculating the movement direction, speed, position and other information of the training cockpit within the virtual driving scene;
S12. Position calibration between the VR glasses and the training cockpit: establishing a coordinate system with a chosen point in the training cockpit as origin, and converting the motion data monitored by the gyroscope module in the VR glasses into this cockpit coordinate system;
S13. Converting the VR glasses' motion data from the cockpit coordinate system to the virtual driving scene coordinate system, finally determining the corresponding position, angle and other information of the VR glasses in the virtual driving scene.
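The coordinate conversions in steps S11 to S13 amount to chaining two rigid transforms. A minimal Python sketch, in which all names and the rotation-matrix convention are illustrative assumptions rather than anything specified in the patent:

```python
import numpy as np

def compose_pose(R_cv, t_cv, R_gc, t_gc):
    """Express the glasses' pose in the virtual scene frame (step S13) by
    chaining glasses->cockpit (R_gc, t_gc, from the gyroscope, step S12)
    with cockpit->virtual scene (R_cv, t_cv, from the simulated cockpit
    motion, step S11). R_* are 3x3 rotation matrices, t_* are 3-vectors."""
    R_gv = R_cv @ R_gc            # compose orientations
    t_gv = R_cv @ t_gc + t_cv     # rotate the position, then offset it
    return R_gv, t_gv
```

The resulting pose selects which portion of the stored virtual driving scene falls within the display field of view in step S2.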
S2. Virtual driving scene image processing:
determining the display field of view of the virtual driving scene image, i.e. the virtual driving scene image to be displayed, according to the position and angle of the VR glasses calibrated in step S1;
rearview mirror imaging of the cockpit: from the size and position of the green screen on the rearview mirror surface captured by the VR glasses, and taking the mirror plane as the plane of symmetry, the virtual scene image in the storage unit that corresponds to the green-screen region of the rearview mirror within the virtual driving scene behind the cockpit is computed and retrieved in real time according to the planar reflection imaging principle and transmitted to the liquid crystal display in real time.
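The rearview-mirror step relies on the planar reflection principle: the rear virtual scene is rendered from the viewpoint mirrored across the mirror plane. A hedged sketch under that assumption, with the plane given as a unit normal and offset (all names are illustrative):

```python
import numpy as np

def reflect_across_plane(p, n, d):
    """Mirror-image position of point p across the plane n.x + d = 0.
    Rendering the stored rear virtual scene from the reflected eye position
    yields the image to paste into the mirror's green-screen region."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)            # ensure a unit normal
    p = np.asarray(p, dtype=float)
    return p - 2.0 * (np.dot(n, p) + d) * n
```

The same reflection applied to the viewing direction gives the mirrored camera orientation for that render.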
S3. Fused display of the mixed reality images:
the computing unit fuses the real in-cockpit scene image obtained by the cameras, the computed virtual driving scene image and the virtual scene image to be displayed in the rearview mirror: first the three frames are aligned; the computing unit then replaces the detected green pixels in the real scene frame obtained by the camera with the corresponding pixels of the virtual driving scene frame and of the rearview-mirror virtual scene frame, respectively; finally, the fused mixed reality scene image is transmitted to the liquid crystal display screens of the VR glasses. Each frame is processed in turn according to the above steps.
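The green-pixel replacement described above is ordinary chroma keying. A minimal sketch, assuming three aligned HxWx3 uint8 RGB frames and a boolean mask marking the rearview-mirror region; the green threshold and all names are illustrative assumptions:

```python
import numpy as np

def fuse_frames(real, virtual_scene, mirror_scene, mirror_mask):
    """Replace green-screen pixels in the camera frame: porthole green gets
    the outside virtual scene, mirror green gets the reflected rear scene."""
    r = real[..., 0].astype(int)
    g = real[..., 1].astype(int)
    b = real[..., 2].astype(int)
    green = (g > 120) & (g > r + 40) & (g > b + 40)   # detected green pixels
    out = real.copy()
    outside = green & ~mirror_mask
    mirror = green & mirror_mask
    out[outside] = virtual_scene[outside]
    out[mirror] = mirror_scene[mirror]
    return out
```

A production system would soften the key edges and handle green spill, but per-pixel replacement is the core of the fusion step.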
S4. Correction of gyroscope errors:
the gyroscope module integrated in the VR glasses monitors and tracks head rotation. To account for gyroscope drift, the VR glasses are repositioned at fixed time intervals: marker points are placed on the green screen or in the cockpit, and the drift accumulated by the gyroscope is corrected at each interval using the marker points captured by the camera.
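The marker-based correction can be sketched as periodically re-anchoring the gyro-integrated heading to the drift-free, camera-derived one. The blend factor and the toy simulation below are assumptions for illustration only, not the patent's method:

```python
def corrected_yaw(gyro_yaw_deg, marker_yaw_deg, alpha=0.9):
    """Complementary blend: keep the smooth gyro estimate short-term while
    pulling it toward the drift-free marker measurement at each correction."""
    return alpha * gyro_yaw_deg + (1.0 - alpha) * marker_yaw_deg

def run_drift_sim(steps=100, drift_per_step=0.1, correct_every=10):
    """Toy simulation: the head is stationary (true yaw 0), the gyro drifts
    by a constant amount per step, and marker corrections bound the error."""
    true_yaw, est = 0.0, 0.0
    for i in range(1, steps + 1):
        est += drift_per_step                 # gyro accumulates drift
        if i % correct_every == 0:
            est = corrected_yaw(est, true_yaw)  # marker fix from the camera
    return est
```

With corrections the error stays bounded (here below 9 degrees), whereas uncorrected drift grows without limit.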
The working principle of the present embodiment is briefly described below:
In operation, the cameras of the VR glasses first capture in real time the scene around the trainee in the training cockpit, including the hemispherical green screen, forming a real driving scene image that is transmitted to the computing unit of the VR glasses in real time. Meanwhile, the computing unit retrieves the virtual driving scene image data from the storage unit and fuses the real image with the virtual driving scene through position calibration, image fusion and related techniques. The fused mixed reality scene image is then displayed on the liquid crystal display screens and observed directly by the eyes after magnification by the convex lenses.
This embodiment realizes display of the virtual driving scene image on the basis of virtual reality technology. By placing two cameras (left and right), one gyroscope module and one computing unit at the front of the glasses, the real cockpit scene shot by the cameras is converted into a real image, fused with the virtual driving scene image pre-stored in the storage unit of the VR glasses to synthesize a mixed reality scene image, sent to the liquid crystal display screens of the VR glasses, and projected into the trainee's field of view after correction by the convex lenses.
With the cameras switched off, the VR glasses of this embodiment can be used as ordinary virtual reality glasses with all their functions; with the cameras on, they serve as mixed reality glasses: the real scene image and the virtual scene image are superimposed and fused by the computing unit and then presented on the left and right liquid crystal display screens. The design thus combines the advantages of virtual reality technology and traditional mixed reality technology.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (10)
1. A pilot cockpit training system, characterized by comprising a hemispherical green screen, a training cockpit and VR glasses, wherein a camera, a computing unit, a gyroscope module, a storage unit, a liquid crystal display screen and convex lenses are integrated in the VR glasses; the hemispherical green screen covers the periphery of the training cockpit; the trainee wears the VR glasses while training in the training cockpit, and the training cockpit, communicatively connected with the computing unit of the VR glasses, sends the trainee's control information in the cockpit to the computing unit; the camera, the gyroscope module, the storage unit and the liquid crystal display screen are each communicatively connected with the computing unit; the camera acquires real driving scene images in the training cockpit in real time and transmits the acquired image data to the computing unit; the gyroscope module monitors the real-time motion data of the VR glasses and transmits the motion data to the computing unit; virtual driving scene image data corresponding to different aircraft-model tasks are pre-stored in the storage unit; the computing unit processes the real driving scene image data from the training cockpit and the virtual driving scene image data in the storage unit, fuses them into a mixed reality scene image, and transmits the mixed reality scene image to the liquid crystal display screen for real-time display; the display picture of the liquid crystal display screen is projected into the trainee's field of view after correction by the convex lenses.
2. A pilot cockpit training system as in claim 1, wherein said training cockpit is provided with a control stick, a throttle, brakes, electrical switches, buttons and an instrument panel.
3. A pilot cockpit training system as in claim 2, wherein said trainee's control information in the cockpit includes the trainee's operations on the control stick, throttle, brakes, electrical switches and buttons.
4. The system of claim 1, wherein there are left and right liquid crystal display screens and left and right convex lenses; the left and right liquid crystal display screens are arranged in front of the left and right convex lenses respectively, and each convex lens sits directly in front of the corresponding eye.
5. A pilot cockpit training system as in claim 1, wherein the cameras form a pair located at the front end of the VR glasses, and the spacing between the cameras is adjustable.
6. The pilot cockpit training system of claim 1, wherein the computing unit retrieves the virtual driving scene data corresponding to the current aircraft-model task stored in the storage unit, comprehensively processes the real driving scene image data transmitted by the camera, the VR glasses motion data transmitted by the gyroscope module and the control information transmitted by the training cockpit, and obtains a mixed reality scene image through mixed reality scene fusion.
7. The pilot cockpit training system of claim 6, wherein the mixed reality scene fusion process of the computing unit comprises:
S1, position calibration of the VR glasses against the virtual driving scene;
S2, virtual driving scene image processing:
determining the display field of view of the virtual driving scene image, i.e. the virtual driving scene image to be displayed, from the position and angle of the VR glasses calibrated in step S1;
rear-view mirror imaging of the cockpit: the VR glasses capture the size and position of the green screen on the rear-view mirror surface; taking the plane of the rear-view mirror as the plane of symmetry, the computing unit calculates in real time, according to the plane reflection imaging principle, the virtual scene image behind the cockpit (held in the storage unit) that corresponds to the green-screen position of the rear-view mirror, and transmits that virtual scene image to the liquid crystal displays in real time;
S3, fusion display of the mixed reality images:
the computing unit fuses the real scene image of the cockpit interior captured by the cameras, the computed virtual driving scene image, and the virtual scene image to be displayed in the rear-view mirror;
S4, correction of gyroscope errors.
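The "plane reflection imaging principle" used for the rear-view mirror in step S2 amounts to mirroring the eye position across the mirror plane and rendering the virtual scene from that mirrored viewpoint. A minimal sketch, with illustrative names and coordinates not taken from the patent:

```python
# Hypothetical sketch of step S2's rear-view mirror imaging: the virtual
# camera used to render the mirror image is the real eye position reflected
# across the mirror plane. All names and example values are illustrative.

def reflect_point(p, plane_point, plane_normal):
    """Reflect point p across the plane given by a point on it and a unit normal."""
    # signed distance from p to the plane
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_point, plane_normal))
    # p' = p - 2 * d * n
    return tuple(pi - 2.0 * d * ni for pi, ni in zip(p, plane_normal))

# Example: mirror plane is x = 1 (unit normal along +x); an eye at x = 0.2
eye = (0.2, 0.0, 1.6)
mirror_point = (1.0, 0.0, 1.6)
mirror_normal = (1.0, 0.0, 0.0)

virtual_eye = reflect_point(eye, mirror_point, mirror_normal)
print(virtual_eye)  # the mirrored camera sits symmetrically at x = 1.8
```

Rendering the stored behind-cockpit scene from `virtual_eye`, clipped to the green-screen region detected on the mirror, yields the image streamed to the displays.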
8. The pilot cockpit training system of claim 7, wherein step S1 comprises:
S11, fixing the coordinate system of the virtual driving scene: calculating the movement direction, speed, and position of the training cockpit within the virtual driving scene;
S12, position calibration between the VR glasses and the training cockpit: a coordinate system is established with a chosen point in the training cockpit as the origin, and the motion data monitored by the gyroscope module in the VR glasses are converted into this cockpit coordinate system;
S13, converting the motion data of the VR glasses from the cockpit coordinate system into the virtual driving scene coordinate system, thereby determining the corresponding position and angle of the VR glasses within the virtual driving scene.
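The two-stage conversion of steps S12 and S13 is a chain of rigid transforms: glasses frame to cockpit frame, then cockpit frame to virtual scene frame. A minimal sketch, with rotations as 3x3 matrices and all names and values assumed for illustration:

```python
# Illustrative sketch of the S12/S13 coordinate chain: a point measured in
# the VR-glasses frame is mapped into the training-cockpit frame, then into
# the virtual driving-scene frame. Transform values are made-up examples.

def mat_vec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def to_scene(p_glasses, r_cockpit, t_cockpit, r_scene, t_scene):
    """Chain glasses->cockpit and cockpit->scene rigid transforms."""
    p_cockpit = tuple(a + b for a, b in zip(mat_vec(r_cockpit, p_glasses), t_cockpit))
    return tuple(a + b for a, b in zip(mat_vec(r_scene, p_cockpit), t_scene))

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# Example: glasses 0.1 m forward of the cockpit origin, cockpit located at
# (5, 0, 0) in the virtual scene, no rotation anywhere.
p = to_scene((0.1, 0.0, 0.0), IDENTITY, (0.0, 0.0, 0.0), IDENTITY, (5.0, 0.0, 0.0))
print(p)
```

The same chain applied to the glasses' orientation matrix gives the viewing angle used to pick the display field of view in step S2.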
9. The pilot cockpit training system of claim 7, wherein in step S3 the image fusion comprises: first aligning the three image frames; the computing unit then replaces the detected green pixel positions in the real scene image frame captured by the cameras with the corresponding pixels of the virtual driving scene image frame and of the rear-view mirror virtual scene image frame; finally, the fused mixed reality scene image is transmitted to the liquid crystal display screens of the VR glasses. Each frame is processed in turn according to these steps.
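The green-pixel replacement of step S3 is essentially chroma keying. A minimal per-pixel sketch, where the green test and the frame format (rows of RGB tuples) are simplified assumptions, not the patent's actual detector:

```python
# Simplified sketch of the step-S3 chroma-key fusion: pixels of the camera
# frame detected as green are replaced by the corresponding pixel of the
# aligned virtual frame. Threshold and frame layout are assumptions.

def is_green(px, threshold=100):
    """Crude green-screen test: green channel dominant and above a threshold."""
    r, g, b = px
    return g > threshold and g > r and g > b

def fuse(camera_frame, virtual_frame):
    """Replace green pixels of the camera frame with virtual-frame pixels."""
    return [
        [v if is_green(c) else c for c, v in zip(cam_row, virt_row)]
        for cam_row, virt_row in zip(camera_frame, virtual_frame)
    ]

camera = [[(30, 200, 40), (120, 110, 100)]]   # one green pixel, one cockpit pixel
virtual = [[(10, 20, 200), (0, 0, 0)]]        # virtual scene behind the screen
print(fuse(camera, virtual))  # [[(10, 20, 200), (120, 110, 100)]]
```

In practice the same pass would key against both the dome green screen (virtual driving scene) and the mirror green screen (mirrored scene), exactly as claim 9 describes for the two virtual frames.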
10. The pilot cockpit training system of claim 7, wherein step S4 comprises: marker points are set on the hemispherical green screen or in the training cockpit, and at fixed time intervals the deviation accumulated by the gyroscope is corrected using the marker points captured by the cameras.
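The periodic correction of step S4 can be pictured as: integrate the (drifting) gyroscope between marker sightings, then snap to the absolute pose derived from the camera-detected markers. A hedged sketch, with the interval, units, and single-axis simplification all assumed:

```python
# Illustrative single-axis sketch of step S4: gyroscope heading drifts as it
# integrates rate, and is reset from the marker-derived heading on a fixed
# interval. Interval, rates, and the marker maths are assumptions.

class DriftCorrectedGyro:
    def __init__(self, correction_interval=2.0):
        self.yaw = 0.0               # integrated gyro heading, degrees
        self.interval = correction_interval
        self.since_correction = 0.0  # seconds since last marker fix

    def integrate(self, yaw_rate, dt):
        """Accumulate gyroscope rate; any rate bias accumulates as drift."""
        self.yaw += yaw_rate * dt
        self.since_correction += dt

    def maybe_correct(self, marker_yaw):
        """Snap to the camera/marker-derived heading once per interval."""
        if self.since_correction >= self.interval:
            self.yaw = marker_yaw
            self.since_correction = 0.0

gyro = DriftCorrectedGyro(correction_interval=2.0)
for _ in range(20):                      # 2 s of slightly biased readings
    gyro.integrate(yaw_rate=10.5, dt=0.1)  # true rate is 10.0 deg/s
gyro.maybe_correct(marker_yaw=20.0)      # marker says true heading is 20 deg
print(round(gyro.yaw, 1))  # 20.0
```

Between corrections the drift stays bounded by (bias x interval), which is why the claim fixes the markers to the green screen or cockpit, objects that never move in the cockpit frame.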
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311424279.1A CN117542253A (en) | 2023-10-31 | 2023-10-31 | Pilot cockpit training system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311424279.1A CN117542253A (en) | 2023-10-31 | 2023-10-31 | Pilot cockpit training system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117542253A true CN117542253A (en) | 2024-02-09 |
Family
ID=89794938
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311424279.1A Pending CN117542253A (en) | 2023-10-31 | 2023-10-31 | Pilot cockpit training system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117542253A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118070554A (en) * | 2024-04-16 | 2024-05-24 | 北京天创凯睿科技有限公司 | Flight simulation mixed reality display system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240265650A1 (en) | Head-mounted display with pass-through imaging | |
WO2017173735A1 (en) | Video see-through-based smart eyeglasses system and see-through method thereof | |
Sauer et al. | Augmented workspace: Designing an AR testbed | |
US7782320B2 (en) | Information processing method and information processing apparatus | |
CN107392853B (en) | Method and system for video fusion distortion correction and viewpoint fine adjustment of double cameras | |
KR100911066B1 (en) | Image display system, image display method and recording medium | |
CN108616752B (en) | Head-mounted equipment supporting augmented reality interaction and control method | |
CN113035010B (en) | Virtual-real scene combined vision system and flight simulation device | |
CN108093244B (en) | Remote follow-up stereoscopic vision system | |
CN108398787B (en) | Augmented reality display device, method and augmented reality glasses | |
KR20120044461A (en) | The simulated training apparatus and method using mixed reality | |
CN203746012U (en) | Three-dimensional virtual scene human-computer interaction stereo display system | |
CN205195880U (en) | Watch equipment and watch system | |
CN117542253A (en) | Pilot cockpit training system | |
CN107810634A (en) | Display for three-dimensional augmented reality | |
CN106327583A (en) | Virtual reality equipment for realizing panoramic image photographing and realization method thereof | |
CN101581874A (en) | Tele-immersion teamwork device based on multi-camera acquisition | |
US11521297B2 (en) | Method and device for presenting AR information based on video communication technology | |
US10567744B1 (en) | Camera-based display method and system for simulators | |
KR101947372B1 (en) | Method of providing position corrected images to a head mount display and method of displaying position corrected images to a head mount display, and a head mount display for displaying the position corrected images | |
CN111047713A (en) | Augmented reality interaction system based on multi-view visual positioning | |
CN116205980A (en) | Method and device for positioning and tracking virtual reality in mobile space | |
JP2020031413A (en) | Display device, mobile body, mobile body control system, manufacturing method for them, and image display method | |
CN114742977A (en) | Video perspective method based on AR technology | |
Saraiji et al. | Real-time egocentric superimposition of operator's own body on telexistence avatar in virtual environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||