CN111123743A - Man-machine interaction system for simulating aircraft - Google Patents


Info

Publication number
CN111123743A
CN111123743A (application CN202010006653.6A)
Authority
CN
China
Prior art keywords
module
image
display screen
interaction system
simulated aircraft
Prior art date
Legal status
Pending
Application number
CN202010006653.6A
Other languages
Chinese (zh)
Inventor
梁琳 (Liang Lin)
叶亮 (Ye Liang)
Current Assignee
Cntech Co ltd
Original Assignee
Cntech Co ltd
Priority date
Filing date
Publication date
Application filed by Cntech Co ltd filed Critical Cntech Co ltd
Priority to CN202010006653.6A
Publication of CN111123743A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 17/00: Systems involving the use of models or simulators of said systems
    • G05B 17/02: Systems involving the use of models or simulators of said systems, electric

Abstract

The invention discloses a man-machine interaction system for a simulated aircraft, comprising: a camera, which acquires a video image of the hand and sends it to the image processing module; the image processing module, which segments the hand from the image and, after viewpoint conversion, sends it to the synthesis module; the modeling module, which builds models of the cabin and the external scene in three-dimensional modeling software, renders virtual images of the cabin interior and exterior in real time with a graphics engine, and sends them to the synthesis module; the synthesis module, which fuses the viewpoint-converted hand image with the virtual cabin images from the modeling module and sends the fused image to the wearable display module; the wearable display module, which provides visual feedback to the experiencer; and an instructor display, connected to the synthesis module, through which the instructor supervises the simulation process. The man-machine interaction system for the simulated aircraft provided by the invention seamlessly fuses the hand image with the virtual cabin environment image, provides more realistic visual feedback, and preserves the user's immersion.

Description

Man-machine interaction system for simulating aircraft
Technical Field
The invention relates to the technical field of simulated aircraft, and in particular to a man-machine interaction system for a simulated aircraft.
Background
With the continuous development of computer technology, simulation is applied ever more widely. The main hardware components of an aircraft simulator are a simulated cockpit and a visual display; depending on requirements, the aircraft itself can be simulated either semi-physically (hardware-in-the-loop) or entirely in software.
The development of flight simulators has accompanied that of the aviation industry for roughly a century, and simulators have consistently incorporated the most advanced technology of their day, so the state of flight simulation reflects the aviation technology of its time. Flight simulation has passed through three broad stages: electromechanical, electronic, and digital. The electromechanical stage, beginning in the early twentieth century, was the primary stage of the flight simulator. In the late 1920s the American Edwin Link built a mechanical flight training device, used mainly to demonstrate the working principle of aircraft controls, to teach pilots the structure of the airplane, and to improve piloting skill. The electronic stage, beginning in the late 1940s, was the intermediate stage; simulators of this period could reproduce an aircraft's aerodynamic characteristics, engine characteristics, and the behavior of its various mechanical systems, and could realistically simulate the instruments and equipment in the cabin. The digital stage, beginning in the early 1960s, is the advanced stage: the comprehensive adoption of digital computers and of new techniques such as computer graphics greatly enhanced simulator capability. A complete modern digital flight simulator integrates many technologies and consists mainly of five systems: the computer system, motion system, control system, instrument system, and visual system.
Modern flight simulators grow ever more capable and their training effect ever better; they have become standard equipment for pilot training and are widely used in both civil aviation and the military.
At present, the mainstream human-machine interaction scheme in existing flight simulators is a purely virtual scheme based on data gloves. Data gloves are mature and widely used and remain the preferred device, but their drawbacks are obvious: because of their complex hardware, the user's hand motion is constrained while wearing them, which breaks the sense of immersion during interaction. Their precise sensors also make them expensive to use and maintain. Moreover, driving a geometric hand model from gesture and position data alone cannot achieve a satisfactory visual effect.
Therefore, how to ensure the immersion of the user in a completely virtual cockpit environment is a problem that needs to be solved urgently by those skilled in the art.
Disclosure of Invention
In view of the above, the invention provides a man-machine interaction system for a simulated aircraft that seamlessly fuses the hand image with the virtual cabin environment image, provides more realistic visual feedback, and preserves the user's immersion.
In order to achieve the purpose, the invention adopts the following technical scheme:
a simulated aircraft human-computer interaction system comprising: the system comprises a camera, an image processing module, a modeling module, a synthesis module, a support structure, a wearing display module and an instructor display; wherein the support structure is mounted on a seat of the simulated aircraft and the wearable display module is mounted on the support structure;
the camera is used for acquiring a video image of the hand and sending the video image to the image processing module;
the image processing module is used for segmenting the hand from the image and, after viewpoint conversion, sending it to the synthesis module;
the modeling module is used for establishing a cabin and an external scene model by adopting three-dimensional modeling software, rendering virtual images inside and outside the cabin in real time by utilizing a graphic engine and sending the virtual images to the synthesis module;
the synthesis module is used for fusing the images subjected to viewpoint transformation and the cabin internal and external virtual images sent by the modeling module and sending the fused images to the wearable display module;
the wearable display module is used for providing visual feedback to an experiencer;
and the instructor display is connected with the synthesis module and is used for the instructor to supervise the simulation process.
Preferably, the wearable display module includes: a left display screen, a middle display screen and a right display screen; and the left display screen, the middle display screen and the right display screen are connected in sequence.
Preferably, the included angle between the left display screen and the middle display screen and the included angle between the middle display screen and the right display screen are both 120 °.
Preferably, the image processing module includes:
a color space conversion unit for performing color space conversion on the image of the hand;
the illumination compensation unit is used for carrying out illumination compensation on the image subjected to color space conversion;
the model detection unit is used for carrying out skin color detection on the image subjected to illumination compensation based on the skin color model;
the filtering unit is used for performing morphological filtering on the image subjected to the skin color detection;
and the contour processing unit is used for carrying out contour search and smoothing on the image subjected to the morphological filtering to obtain a hand segmentation result.
Preferably, the image processing module further comprises: a parameter updating unit for extracting model parameters from the segmentation result and updating the skin color model according to the model parameters.
Preferably, the instructor display is connected with the synthesis module through a video line.
Preferably, the system further comprises: a sensor, a signal conditioner and an external computer;
the sensor is unidirectionally connected with the signal conditioner;
the signal conditioner is connected with the external computer in a bidirectional way.
Preferably, the signal conditioner includes: an amplifying circuit, a filtering circuit, a processor and a communication circuit;
the amplifying circuit is sequentially connected with the filter circuit, the processor and the communication circuit; wherein the processor is bidirectionally coupled to the communication circuit;
the other end of the amplifying circuit is connected with the sensor, and the communication circuit is connected with the external computer in a bidirectional mode.
Compared with the prior art, the man-machine interaction system for the simulated aircraft provided here seamlessly fuses the hand image with the virtual cabin environment image, provides more realistic visual feedback, and preserves the user's immersion.
In addition, by updating the skin color model parameters, near-optimal skin color detection can be achieved on every frame, so hand image segmentation is more accurate and more robust, and gestures can be segmented effectively against a variety of complex backgrounds.
Drawings
FIG. 1 is a schematic structural diagram of a man-machine interaction system for simulating an aircraft according to the present invention;
FIG. 2 is a schematic view of a wearable display module according to the present invention;
FIG. 3 is a diagram of an image processing module provided in the present invention;
FIG. 4 is a schematic diagram of the sensor interaction portion provided by the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings.
Referring to the attached drawing 1, the embodiment of the invention discloses a man-machine interaction system for simulating an aircraft, which specifically comprises: the system comprises a camera 1, an image processing module 2, a modeling module 3, a synthesis module 4, a support structure 5, a wearable display module 6 and an instructor display 7; wherein the support structure 5 is mounted on a seat of the simulated aircraft and the wearable display module 6 is mounted on the support structure 5;
the camera 1 is used for acquiring a video image of a hand and sending the video image to the image processing module 2;
the image processing module 2 divides the image of the hand, and sends the image to the synthesis module 4 after viewpoint conversion;
the modeling module 3 is used for establishing a cabin and an external scene model by adopting three-dimensional modeling software, rendering virtual images inside and outside the cabin in real time by utilizing a graphic engine, and sending the virtual images to the synthesis module 4;
the synthesis module 4 is used for fusing the images subjected to viewpoint transformation and the cabin internal and external virtual images sent by the modeling module 3 and sending the fused images to the wearable display module 6;
the wearable display module 6 is used for providing visual feedback to the experiencer;
the instructor display 7 is connected to the synthesis module 4 for the instructor to supervise the simulation process.
The disclosed man-machine interaction system for a simulated aircraft seamlessly fuses the hand image with the virtual cabin environment image, providing more realistic visual feedback and preserving the user's immersion. Furthermore, during simulated training the instructor can monitor the trainee through the instructor display 7 and spot operations that do not comply with procedure, enabling better follow-up teaching and simulation. Preferably, a recording module can also be provided to record the whole training session, both for later review by the trainee and to help the instructor identify problems that arose during training.
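The fusion performed by the synthesis module can be sketched as a mask-based composite: wherever the segmentation mask marks a hand pixel, the viewpoint-converted camera pixel replaces the rendered cockpit pixel. The patent does not specify the fusion algorithm, so the function and array names below are purely illustrative.

```python
import numpy as np

def composite_hand(virtual_frame, hand_frame, hand_mask):
    """Overlay segmented hand pixels onto the rendered cockpit frame.

    virtual_frame, hand_frame: (H, W, 3) uint8 images.
    hand_mask: (H, W) uint8, 255 where the hand was segmented.
    """
    mask = (hand_mask > 0)[..., None]   # broadcast the mask over RGB channels
    return np.where(mask, hand_frame, virtual_frame)

# Toy 4x4 frames: a grey cockpit render and a brighter "hand" image
cockpit = np.full((4, 4, 3), 128, dtype=np.uint8)
hand = np.full((4, 4, 3), 200, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 255                    # segmented hand region
fused = composite_hand(cockpit, hand, mask)
```

In a real system the same composite would run per eye and per frame, with the mask produced by the image processing module's segmentation stage.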
In order to further optimize the above technical solution, the wearable display module 6 includes: a left display screen 61, a middle display screen 62, and a right display screen 63; and the left display screen 61, the middle display screen 62 and the right display screen 63 are connected in sequence.
In order to further optimize the above technical solution, an included angle between the left display screen 61 and the middle display screen 62, and an included angle between the middle display screen 62 and the right display screen 63 are both 120 °.
Preferably, the three display screens are liquid crystal displays; the 120° included angles suit the trainee's viewing angle and make the screens comfortable to watch.
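As a rough geometric check on why the 120° arrangement helps, the sketch below computes the horizontal field of view spanned by three equal flat panels joined at 120° interior angles. The panel width, viewing distance, and function name are all illustrative; the patent specifies only the angle.

```python
import math

def three_panel_fov_deg(width, distance):
    """Horizontal field of view (degrees) for three equal flat panels joined
    at 120-degree interior angles, viewer centred `distance` in front of the
    middle panel. Illustrative geometry only.
    """
    half_w = width / 2.0
    # Each side panel bends 60 degrees out of the centre panel's plane,
    # so its far edge sits at (x, z):
    far_x = half_w + width * math.cos(math.radians(60))
    far_z = distance - width * math.sin(math.radians(60))
    # Total FOV is twice the angle of the far edge off the forward axis.
    return 2.0 * math.degrees(math.atan2(far_x, far_z))

fov = three_panel_fov_deg(width=1.0, distance=1.0)
```

With panels as wide as the viewing distance, the bent side panels wrap the view to well over 160°, far beyond what a single flat panel of the same total width could cover.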
In order to further optimize the above technical solution, the image processing module 2 includes:
a color space conversion unit 21 for performing color space conversion on the image of the hand;
an illumination compensation unit 22 for performing illumination compensation on the color space-converted image;
the model detection unit 23 is configured to perform skin color detection on the image subjected to illumination compensation based on the skin color model;
specifically, the skin color model may be an elliptical model, which can be more balanced in detection accuracy, robustness and real-time performance.
A filtering unit 24, configured to perform morphological filtering on the image subjected to skin color detection;
and the contour processing unit 25 is used for performing contour search and smoothing on the image subjected to the morphological filtering to obtain a hand segmentation result.
The quality with which the hand is segmented from the video image directly affects the user's visual perception. Through color space conversion, illumination compensation, model detection, morphological filtering, and contour processing, the hand image is segmented cleanly from the video frame.
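The first two stages of that pipeline, color space conversion and elliptical skin color detection, can be sketched as below. The ellipse centre and axes here are illustrative placeholders, not the patent's fitted values; in practice morphological filtering and contour extraction (e.g. OpenCV's `cv2.morphologyEx` and `cv2.findContours`) would follow to clean up and outline the mask.

```python
import numpy as np

def rgb_to_ycrcb(img):
    """BT.601 RGB -> YCrCb; skin detection usually works in the Cr/Cb plane."""
    r = img[..., 0].astype(float)
    g = img[..., 1].astype(float)
    b = img[..., 2].astype(float)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return np.stack([y, cr, cb], axis=-1)

def skin_mask(ycrcb, center=(155.0, 105.0), axes=(30.0, 30.0)):
    """Classify a pixel as skin if its (Cr, Cb) lies inside an ellipse.

    `center` and `axes` are illustrative defaults, not values from the patent.
    """
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    d = ((cr - center[0]) / axes[0]) ** 2 + ((cb - center[1]) / axes[1]) ** 2
    return ((d <= 1.0) * 255).astype(np.uint8)

skin_px = np.array([[[200, 120, 90]]], dtype=np.uint8)   # skin-toned pixel
blue_px = np.array([[[0, 0, 255]]], dtype=np.uint8)      # non-skin pixel
m_skin = skin_mask(rgb_to_ycrcb(skin_px))
m_blue = skin_mask(rgb_to_ycrcb(blue_px))
```

The same per-pixel test runs over a whole frame at once thanks to NumPy broadcasting, which is what makes the elliptical model cheap enough for real-time use.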
In order to further optimize the above technical solution, the method further comprises: and the parameter updating unit 26 is configured to extract a model parameter according to the segmentation result, and update the skin color model according to the model parameter.
The existing ellipse model detects skin color well but has a shortcoming: its parameters form a static model fitted to a large number of skin color samples and do not change during detection. To improve the accuracy and robustness of the whole hand segmentation, the parameter updating unit adaptively updates the model parameters, so the model segments better and adapts to a wider range of environments.
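One simple form such an adaptive update could take, sketched below under assumptions of our own (the patent does not give the update rule), is to nudge the ellipse centre toward the mean (Cr, Cb) of the pixels segmented in the current frame, with exponential smoothing so one bad frame cannot corrupt the model.

```python
import numpy as np

def update_center(center, cr, cb, mask, alpha=0.1):
    """Blend the ellipse centre toward the mean (Cr, Cb) of this frame's
    segmented hand pixels. `alpha` is an illustrative smoothing factor."""
    sel = mask > 0
    if not sel.any():
        return center                    # nothing segmented: keep old model
    return ((1.0 - alpha) * center[0] + alpha * cr[sel].mean(),
            (1.0 - alpha) * center[1] + alpha * cb[sel].mean())

# Toy frame: two segmented pixels whose chroma sits near the old centre
cr = np.array([[160.0, 0.0], [158.0, 0.0]])
cb = np.array([[100.0, 0.0], [102.0, 0.0]])
mask = np.array([[255, 0], [255, 0]], dtype=np.uint8)
new_center = update_center((155.0, 105.0), cr, cb, mask)
```

Guarding the empty-mask case matters: if segmentation fails entirely on a frame, the model should stay put rather than drift toward garbage.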
In order to further optimize the above technical solution, the instructor display 7 is connected to the composition module 4 by a video line.
In order to further optimize the above technical solution, the method further comprises: a sensor 8, a signal conditioner 9 and an external computer 10;
the sensor 8 is connected with the signal conditioner 9 in a unidirectional way;
the signal conditioner 9 is connected with the external computer 10 in a bidirectional way.
In order to further optimize the above technical solution, the signal conditioner 9 includes: an amplification circuit 91, a filter circuit 92, a processor 93, and a communication circuit 94;
the amplifying circuit 91 is connected with the filtering circuit 92, the processor 93 and the communication circuit 94 in sequence; wherein the processor 93 is bidirectionally coupled to the communication circuit 94;
the other end of the amplifying circuit 91 is connected with the sensor 8, and the communication circuit 94 is connected with the external computer 10 in a bidirectional manner.
Sensors 8 are essential in a simulated aircraft; force sensors in particular provide timely feedback on the force the pilot applies to the control stick, control panel, and so on. At present, however, sensors 8 are inconvenient to mount and remove, differ between aircraft, and inevitably require recalibration of their data over time. The technical scheme provided by the invention therefore amplifies and filters the output signal of the sensor 8 to suppress interference, has the processor 93 compare the signal against preset sensor parameters, and allows the sensor 8 to be adjusted and used conveniently and comprehensively from the external computer 10, saving time and cost. The gain of the amplifying circuit 91 is adjustable to suit various sensors 8.
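The amplify-then-filter stage of the signal conditioner can be sketched in software terms as below. The gain and filter window are illustrative; the patent says only that the amplifier gain is adjustable, and a hardware conditioner would of course implement these stages in analog circuitry before digitization.

```python
def condition_signal(samples, gain=100.0, window=3):
    """Amplify a raw force-sensor signal, then low-pass it with a simple
    moving average to suppress interference.

    `gain` and `window` are illustrative parameters, not patent values.
    """
    amplified = [s * gain for s in samples]
    smoothed = []
    for i in range(len(amplified)):
        lo = max(0, i - window + 1)      # trailing window, shorter at start
        win = amplified[lo:i + 1]
        smoothed.append(sum(win) / len(win))
    return smoothed

out = condition_signal([0.01, 0.02, 0.01, 0.03])
```

The conditioned samples would then be handed to the communication circuit for transfer to the external computer, where the preset sensor parameters are compared and adjusted.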

Claims (8)

1. A simulated aircraft human-computer interaction system, comprising: a camera, an image processing module, a modeling module, a synthesis module, a support structure, a wearable display module and an instructor display; wherein the support structure is mounted on a seat of the simulated aircraft and the wearable display module is mounted on the support structure;
the camera is used for acquiring a video image of the hand and sending the video image to the image processing module;
the image processing module is used for segmenting the image of the hand part, and sending the image to the synthesis module after viewpoint conversion;
the modeling module is used for establishing a cabin and an external scene model by adopting three-dimensional modeling software, rendering virtual images inside and outside the cabin in real time by utilizing a graphic engine and sending the virtual images to the synthesis module;
the synthesis module is used for fusing the images subjected to viewpoint transformation and the cabin internal and external virtual images sent by the modeling module and sending the fused images to the wearable display module;
the wearable display module is used for providing visual feedback to an experiencer;
and the instructor display is connected with the synthesis module and is used for the instructor to supervise the simulation process.
2. A simulated aircraft human-computer interaction system as claimed in claim 1, wherein said wearable display module comprises: a left display screen, a middle display screen and a right display screen; and the left display screen, the middle display screen and the right display screen are connected in sequence.
3. A simulated aircraft human-computer interaction system as claimed in claim 2, wherein the angle between the left display screen and the middle display screen and the angle between the middle display screen and the right display screen are both 120 °.
4. A simulated aircraft human-computer interaction system according to claim 1, wherein said image processing module comprises:
a color space conversion unit for performing color space conversion on the image of the hand;
the illumination compensation unit is used for carrying out illumination compensation on the image subjected to color space conversion;
the model detection unit is used for carrying out skin color detection on the image subjected to illumination compensation based on the skin color model;
the filtering unit is used for performing morphological filtering on the image subjected to the skin color detection;
and the contour processing unit is used for carrying out contour search and smoothing on the image subjected to the morphological filtering to obtain a hand segmentation result.
5. A simulated aircraft human-computer interaction system as claimed in claim 4 further comprising: and the parameter updating unit is used for extracting model parameters according to the segmentation result and updating the skin color model according to the model parameters.
6. A simulated aircraft human-computer interaction system as claimed in claim 4 wherein said instructor display is connected to said compositing module by a video cable.
7. A simulated aircraft human-computer interaction system as claimed in claim 1 further comprising: the device comprises a sensor, a signal conditioner and an external computer;
the sensor is unidirectionally connected with the signal conditioner;
the signal conditioner is connected with the external computer in a bidirectional way.
8. A simulated aircraft human-computer interaction system according to claim 7, wherein said signal conditioner comprises: an amplifying circuit, a filtering circuit, a processor and a communication circuit;
the amplifying circuit is sequentially connected with the filter circuit, the processor and the communication circuit; wherein the processor is bidirectionally coupled to the communication circuit;
the other end of the amplifying circuit is connected with the sensor, and the communication circuit is connected with the external computer in a bidirectional mode.
CN202010006653.6A 2020-01-03 2020-01-03 Man-machine interaction system for simulating aircraft Pending CN111123743A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010006653.6A CN111123743A (en) 2020-01-03 2020-01-03 Man-machine interaction system for simulating aircraft


Publications (1)

Publication Number Publication Date
CN111123743A true CN111123743A (en) 2020-05-08

Family

ID=70486703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010006653.6A Pending CN111123743A (en) 2020-01-03 2020-01-03 Man-machine interaction system for simulating aircraft

Country Status (1)

Country Link
CN (1) CN111123743A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727182A (en) * 2010-01-28 2010-06-09 南京航空航天大学 Method and system for visualizing real hand of participant in helmet-type virtual reality environment
CN202632580U (en) * 2012-03-07 2012-12-26 北京欣远诚业科技有限公司 Integrated training simulator of flight and ejector of airplane
CN106530891A (en) * 2017-01-03 2017-03-22 刘晨 Driving simulation system based on VR technology
CN206249032U (en) * 2016-12-15 2017-06-13 中国航空工业集团公司沈阳飞机设计研究所 A kind of man-machine interactive system based on emulation aircraft
CN107154197A (en) * 2017-05-18 2017-09-12 河北中科恒运软件科技股份有限公司 Immersion flight simulator
CN107293177A (en) * 2017-08-15 2017-10-24 科盾科技股份有限公司北京分公司 A kind of analogue simulation trainer
CN207622926U (en) * 2017-12-22 2018-07-17 中航电测仪器股份有限公司 A kind of aircraft force snesor man-machine interactive system
US20200005738A1 (en) * 2018-07-02 2020-01-02 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Tang Yong (汤勇): "Research on Key Technologies of Augmented Semi-Virtual-Reality Aircraft Cockpits" (in Chinese) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination