CN111653148B - Flight attitude simulation method and device - Google Patents

Flight attitude simulation method and device

Info

Publication number
CN111653148B
Authority
CN
China
Prior art keywords
flight attitude
virtual
preset flight
mechanical arm
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010502399.9A
Other languages
Chinese (zh)
Other versions
CN111653148A (en)
Inventor
金占国
翟丽红
姚钦
贾辰龙
李鹏
梁赟
谭小蔓
李湘楠
杜青松
赵显亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Air Force Specialty Medical Center of PLA
Original Assignee
Air Force Specialty Medical Center of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Air Force Specialty Medical Center of PLA
Priority to CN202010502399.9A
Publication of CN111653148A
Application granted
Publication of CN111653148B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 Simulators for teaching or training purposes
    • G09B 9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/08 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B 9/16 Ambient or aircraft conditions simulated or indicated by instrument or alarm
    • G09B 9/20 Simulation or indication of aircraft attitude
    • G09B 9/30 Simulation of view from aircraft

Abstract

The application discloses a flight attitude simulation method and device, which are used for improving the simulation effect of the flight attitude and further improving the training quality. The method comprises the following steps: receiving a trigger event simulating a preset flight attitude; acquiring mechanical arm control parameters corresponding to the preset flight attitude and a picture corresponding to the preset flight attitude; and controlling the mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameters, and displaying the picture corresponding to the preset flight attitude through a visual interaction device, wherein a manned device is arranged at the tail end of the mechanical arm. With the scheme provided by the application, the simulated flight attitude is closer to the real flight attitude; in addition, displaying the picture corresponding to the flight attitude through the visual interaction device allows the trainee to receive combined visual stimulation on the basis of the simulated flight attitude, which improves the simulation effect of the flight attitude and in turn the training quality.

Description

Flight attitude simulation method and device
Technical Field
The application relates to the field of computers, in particular to a flight attitude simulation method and device.
Background
In the field of spatial orientation ability training and evaluation equipment, current vestibular function training and evaluation equipment with two or three rotation axes mostly consists of middle- and low-end devices such as rolling rings, four-column swings, suspension ladders, three-dimensional rolling rings and three-dimensional rotary chairs, which achieve vestibular training by rotating around one or more axes to stimulate the vestibular system. For example, a vestibular stimulation device with three rolling rings rotating about a common center in three axial directions cannot simulate a real flight attitude and provides only vestibular stimulation without visual stimulation; for another example, a vestibular training device with two linked shafts simulating three-dimensional rotation likewise provides only vestibular stimulation and cannot simulate the true flight attitude of an airplane relative to the ground. Some high-end equipment consists of spatial orientation simulators built from a plurality of hydraulic rods, offering multiple degrees of freedom but only low-frequency, low-speed proprioceptive stimulation. None of the above devices can simulate various flight attitudes well or combine them with visual stimulation to comprehensively stimulate the trainee, so the simulation effect is poor.
Therefore, it is desirable to provide a flight attitude simulation method, so as to improve the simulation effect of the flight attitude and further improve the training quality.
Disclosure of Invention
An object of the embodiments of the present application is to provide a flight attitude simulation method and device, so as to improve the simulation effect of the flight attitude and further improve the training quality.
In order to solve the technical problem, the embodiment of the application adopts the following technical scheme: a flight attitude simulation method, comprising:
receiving a trigger event simulating a preset flight attitude;
acquiring mechanical arm control parameters corresponding to the preset flight attitude and a picture corresponding to the preset flight attitude;
and controlling the mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameters, and displaying a picture corresponding to the preset flight attitude through a visual interaction device, wherein a manned device is arranged at the tail end of the mechanical arm.
The beneficial effect of this application lies in: mechanical arm control parameters corresponding to the preset flight attitude and a picture corresponding to the preset flight attitude are acquired; the mechanical arm is controlled to simulate the preset flight attitude based on the mechanical arm control parameters, and the picture corresponding to the preset flight attitude is displayed through a visual interaction device, wherein a manned device is arranged at the tail end of the mechanical arm. The simulation equipment of the prior art, composed of rolling rings, two-axis linkage equipment or hydraulic rods, is thereby replaced by a highly sensitive mechanical arm, so that the simulated flight attitude is closer to the real flight attitude. In addition, the picture corresponding to the flight attitude is displayed through the visual interaction device, so that the trainee receives combined visual stimulation on the basis of the simulated flight attitude; this improves the simulation effect of the flight attitude and in turn the training quality.
In one embodiment, the triggering event includes:
receiving an event of a trigger instruction of a button corresponding to a preset flight attitude;
or
an event that the current time reaches a preset time.
In one embodiment, the visual interaction device comprises a virtual eyeshade, and the displaying of the picture corresponding to the preset flight attitude through the visual interaction device comprises:
and displaying the picture corresponding to the preset flight attitude through the virtual eye patch.
In one embodiment, the displaying the picture corresponding to the preset flight attitude through the virtual eye shield includes:
and sending the picture corresponding to the preset flight attitude and the display time to the virtual eyeshade so that the virtual eyeshade displays the picture corresponding to the preset flight attitude based on the display time.
In one embodiment, the displaying the picture corresponding to the preset flight attitude through the virtual eye shield includes:
and sending an instruction for displaying the picture corresponding to the preset flight attitude to the virtual eye patch so as to enable the virtual eye patch to display the picture corresponding to the pre-stored preset flight attitude.
In one embodiment, the displaying the picture corresponding to the preset flight attitude through the virtual eye shield includes:
and sending a picture corresponding to a preset flight attitude to a virtual reality simulation control system corresponding to the virtual eyeshade, so that the virtual display simulation system corresponding to the virtual eyeshade controls the virtual eyeshade to display the picture corresponding to the preset flight attitude.
In one embodiment, the visual interaction device comprises a human eye detection apparatus disposed in the virtual eyeshade, the method further comprising:
in the process of simulating a preset flight attitude by the mechanical arm, detecting eye information of a user wearing the virtual eyeshade through eye detection equipment;
calculating a mental load and a vestibular function score of the user wearing the virtual eyeshade based on the eye information.
The beneficial effect of this embodiment lies in: in the process of simulating the preset flight attitude by the mechanical arm, the eye information of the user wearing the virtual eyeshade is detected through the eye detection equipment, and the mental load and the vestibular function score of the user are calculated based on the human eye information. Monitoring the mental load of the user makes it possible to stop training in time when the mental load becomes too large, preventing injury to the user; in addition, because the vestibular function score is calculated automatically, no manual scoring is required, which simplifies operation for the user.
The application also provides a flight attitude simulation device, including:
the receiving module is used for receiving a trigger event simulating a preset flight attitude;
the acquisition module is used for acquiring mechanical arm control parameters corresponding to the preset flight attitude and a picture corresponding to the preset flight attitude;
and the simulation module is used for controlling the mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameters and displaying a picture corresponding to the preset flight attitude through the visual interaction device, wherein the tail end of the mechanical arm is provided with a manned device.
In one embodiment, the simulation module includes:
and the display sub-module is used for displaying the picture corresponding to the preset flight attitude through the virtual eye patch.
In one embodiment, the apparatus further comprises:
the detection module is used for detecting eye information of a user wearing the virtual eyeshade through eye detection equipment in the process of simulating a preset flight attitude by the mechanical arm;
and the calculation module is used for calculating the mental load and the vestibular function score of the user wearing the virtual eye patch based on the human eye information.
Drawings
FIG. 1A is a flow chart of a flight attitude simulation method according to an embodiment of the present application;
FIG. 1B is a schematic view of a mechanical arm and a manned device according to an embodiment of the present application;
fig. 2A is a schematic diagram of connection and data interaction between a servo control system and a virtual reality simulation control system according to an embodiment of the present disclosure;
FIG. 2B is a flow chart of a method of flight attitude simulation in another embodiment of the present application;
FIG. 3 is a block diagram of a flight attitude simulation apparatus according to an embodiment of the present application;
fig. 4 is a block diagram of a flight attitude simulation apparatus according to another embodiment of the present application.
Detailed Description
Various aspects and features of the present application are described herein with reference to the drawings.
It will be understood that various modifications may be made to the embodiments of the present application. Accordingly, the foregoing description should not be construed as limiting, but merely as exemplifications of embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the application.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the application and, together with a general description of the application given above and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the present application will become apparent from the following description of preferred forms of embodiment, given as non-limiting examples, with reference to the attached drawings.
It should also be understood that, although the present application has been described with reference to some specific examples, a person of skill in the art shall certainly be able to achieve many other equivalent forms of application, having the characteristics as set forth in the claims and hence all coming within the field of protection defined thereby.
The above and other aspects, features and advantages of the present application will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the application, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail to avoid obscuring the application with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the application.
Fig. 1A is a flowchart of a flight attitude simulation method according to an embodiment of the present application, the method including the following steps S11-S13:
in step S11, a trigger event simulating a preset flight attitude is received;
in step S12, acquiring a mechanical arm control parameter corresponding to a preset flight attitude and a picture corresponding to the preset flight attitude;
in step S13, the robot arm is controlled to simulate a preset flight attitude based on the robot arm control parameters, and a picture corresponding to the preset flight attitude is displayed by the visual interaction device, wherein a manned device is disposed at the end of the robot arm.
In this embodiment, a trigger event simulating a preset flight attitude is received, wherein the triggering event comprises: an event of receiving a trigger instruction of a button corresponding to the preset flight attitude; or an event that the current time reaches a preset time. Specifically, the spatial orientation ability training and evaluation device is provided with buttons for simulating flight attitudes such as ground taxiing, takeoff, climbing, cruising and turning; when the user clicks one of the buttons, an event simulating the flight attitude corresponding to that button is triggered.
Alternatively, the spatial orientation ability training and evaluation device is configured with a training program in advance. The training program may be set based on the order of a real flight, which proceeds in sequence through ground taxiing, takeoff, climb, cruise, turn, descent, approach and landing. After the device is started, the simulated training items for ground taxiing, takeoff, climb, cruise, turn and so on are carried out in sequence; when the takeoff item finishes, the climb item starts automatically, and so on. In other words, the preset time may refer to the moment, within the simulated training flow, at which the training item for the flight attitude preceding the preset flight attitude ends. Of course, the simulated training flow may also be scheduled by natural time. For example, if the pilot's training starts at 8 a.m., the trainee may enter the manned device in advance; when 8 a.m. is reached, the device automatically runs the simulated training items for ground taxiing, takeoff, climb, cruise, turn and so on in sequence. Since the duration of each training item is fixed, the starting time of each item is constant, so the preset time may also refer to a specific natural time.
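For illustration only, the trigger logic described above might be sketched as follows in Python; the attitude names, durations, start hour, and the simulate callback are hypothetical and not taken from the disclosure.

```python
import time

# Hypothetical training sequence ordered like a real flight; the item names
# and durations are illustrative, not values from the disclosure.
TRAINING_SEQUENCE = [
    ("ground_taxi", 120), ("takeoff", 60), ("climb", 180),
    ("cruise", 300), ("turn", 90),
]

def on_button_pressed(attitude, simulate):
    """Active training: pressing a button triggers the matching attitude."""
    simulate(attitude)

def run_passive_training(simulate, start_hour=8):
    """Passive training: wait for the scheduled start time, then run each
    item in sequence; finishing one item is the trigger for the next."""
    while time.localtime().tm_hour < start_hour:
        time.sleep(1)
    for attitude, duration in TRAINING_SEQUENCE:
        simulate(attitude)      # trigger event for this preset flight attitude
        time.sleep(duration)    # fixed item duration keeps start times constant
```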
Acquiring mechanical arm control parameters corresponding to a preset flight attitude and a picture corresponding to the preset flight attitude;
for example, when the aircraft takes off, a driver pulls a rod backwards to lift the aircraft nose, the front wheel lifts off the ground, the aircraft lifts off and leaves the ground, at the moment, the aircraft has a certain elevation angle, and at the view angle of a pilot, a picture of the aircraft nose rising up can be seen through the front glass.
The mechanical arm is controlled to simulate the preset flight attitude based on the mechanical arm control parameters, and the picture corresponding to the preset flight attitude is displayed through the visual interaction device, wherein a manned device is arranged at the tail end of the mechanical arm. The mechanical arm and the manned device are shown in fig. 1B. As can be seen from fig. 1B, the mechanical arm 111 may be a six-axis heavy-duty intelligent robot with high flexibility, very large load capacity and high positioning precision that can move flexibly in multiple dimensions, which fully meets the requirements for simulating various flight attitudes. The manned device 112 may be a manned seat; a mainframe box 1121 is arranged below the manned seat 112, and various control buttons are arranged on the mainframe box 1121. The user can train actively in different flight attitudes by triggering the corresponding control buttons, and can of course also train passively through a training program with a preset sequence of training items.
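A minimal sketch of steps S12 and S13 under assumed interfaces: the joint angles, scene names, and the arm and goggles objects with move_to and show methods are invented for illustration and are not disclosed parameters.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AttitudeProfile:
    joint_angles: List[float]   # target angles for the six arm joints, degrees
    scene: str                  # identifier of the picture shown to the trainee

# Hypothetical lookup table mapping preset flight attitudes to control
# parameters and pictures; the real values are not given in the disclosure.
PROFILES = {
    "takeoff": AttitudeProfile([0, -15, 10, 0, 5, 0], "nose_up_runway"),
    "turn":    AttitudeProfile([20, 0, 0, 30, 0, 0], "banked_horizon"),
}

def simulate(attitude, arm, goggles):
    """Step S12: acquire the control parameters and the picture.
    Step S13: drive the arm and display the picture at the same time."""
    profile = PROFILES[attitude]
    arm.move_to(profile.joint_angles)   # assumed robot-arm API
    goggles.show(profile.scene)         # assumed visual-interaction-device API
```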
The beneficial effect of this application lies in: mechanical arm control parameters corresponding to the preset flight attitude and a picture corresponding to the preset flight attitude are acquired; the mechanical arm is controlled to simulate the preset flight attitude based on the mechanical arm control parameters, and the picture corresponding to the preset flight attitude is displayed through the visual interaction device, wherein a manned device is arranged at the tail end of the mechanical arm. In this way the simulation equipment of the prior art, composed of rolling rings, two-axis linkage equipment or hydraulic rods, is replaced by a highly sensitive mechanical arm, making the simulated flight attitude closer to the real flight attitude. Furthermore, displaying the picture corresponding to the flight attitude through the visual interaction device means the trainee can receive combined visual stimulation on the basis of the simulated flight attitude, which improves the simulation effect of the flight attitude and in turn the training quality.
In one embodiment, the triggering event includes:
receiving an event of a trigger instruction of a button corresponding to a preset flight attitude;
or
an event that the current time reaches a preset time.
The spatial orientation ability training and evaluation device is provided with buttons for simulating flight attitudes such as ground taxiing, takeoff, climbing, cruising and turning; when the user clicks one of the buttons to perform active training, an event simulating the flight attitude corresponding to that button is triggered.
Alternatively, the spatial orientation ability training and evaluation device is configured with a training program in advance. The training program may be set based on the order of a real flight, which proceeds in sequence through ground taxiing, takeoff, climb, cruise, turn, descent, approach and landing. After the device is started, the simulated training items for ground taxiing, takeoff, climb, cruise, turn and so on are carried out in sequence; when the takeoff item finishes, the climb item starts automatically, and so on. In other words, the preset time may refer to the moment, within the simulated training flow, at which the training item for the flight attitude preceding the preset flight attitude ends. Of course, the simulated training flow may also be scheduled by natural time. For example, if the pilot's training starts at 8 a.m., the trainee may enter the manned device in advance; when 8 a.m. is reached, the device automatically runs the simulated training items for ground taxiing, takeoff, climb, cruise, turn and so on in sequence. Since the duration of each training item is fixed, the starting time of each item is constant, so the preset time may also refer to a specific natural time.
In one embodiment, the visual interaction device may include a virtual eyeshade, and the step S13 may be implemented as the following steps:
and displaying the picture corresponding to the preset flight attitude through the virtual eye patch.
In this embodiment, the visual interaction device includes a virtual eye patch, and the virtual eye patch displays a picture corresponding to the preset flight attitude. The person of training can wear this virtual eye-shade overhead promptly to can watch the picture that corresponding flight gesture corresponds through virtual eye-shade, for example, when predetermineeing the gesture of flight for the gesture of aircraft take-off, the picture that this predetermine flight gesture corresponds is the picture that the aircraft nose rises, the person of training watches the picture that corresponding flight gesture corresponds through virtual eye-shade, can be on the basis of simulation flight gesture, combine the vision to synthesize amazing to the person of training, the simulation effect of flight gesture has been improved, and then training quality has been promoted.
In one embodiment, the displaying of the picture corresponding to the preset flight attitude through the virtual eyeshade may be implemented as the following steps:
and sending the picture corresponding to the preset flight attitude and the display time to the virtual eyeshade so that the virtual eyeshade displays the picture corresponding to the preset flight attitude based on the display time.
The execution main body of this embodiment may be an application or a platform that can control the mechanical arm and the virtual eyeshade at the same time and that is connected to the virtual eyeshade in a wired or wireless manner. A processor and a storage device are arranged inside the virtual eyeshade. The processor can receive the picture and the display time corresponding to the preset flight attitude sent by the application or platform, and the storage device stores the picture and the display time; when the processor detects that the current time reaches the display time, it controls the virtual eyeshade to display the picture corresponding to the preset flight attitude based on that display time.
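The hand-off described above could look roughly like this; the JSON-over-TCP transport and the render callback are assumptions made only for the sake of a concrete example.

```python
import json
import socket
import time

def send_scene_with_display_time(goggles_addr, scene_id, display_at):
    """Controller side: push the picture identifier and its display time
    to the eyeshade (transport and message format are assumptions)."""
    payload = json.dumps({"scene": scene_id, "display_at": display_at}).encode()
    with socket.create_connection(goggles_addr) as sock:
        sock.sendall(payload)

def display_when_due(stored_message, render):
    """Eyeshade side: keep the stored message, then render the picture once
    the current time reaches the stored display time."""
    while time.time() < stored_message["display_at"]:
        time.sleep(0.01)
    render(stored_message["scene"])   # `render` stands in for the eyeshade's display call
```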
In one embodiment, the displaying of the picture corresponding to the preset flight attitude through the virtual eyeshade may be implemented as the following steps:
And sending an instruction for displaying the picture corresponding to the preset flight attitude to the virtual eyeshade, so as to enable the virtual eyeshade to display the pre-stored picture corresponding to the preset flight attitude.
The execution main body of this embodiment may be an application or a platform that can control the mechanical arm and the virtual eyeshade at the same time and that is connected to the virtual eyeshade in a wired or wireless manner. A storage device is arranged in the virtual eyeshade, in which the picture and the display time corresponding to the preset flight attitude can be stored in advance. When the application or platform controls the mechanical arm to execute the corresponding flight instruction, it sends an instruction to the virtual eyeshade to display the picture corresponding to the preset flight attitude, so that the virtual eyeshade displays the picture stored in the storage device in advance.
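For contrast with the previous sketch, an instruction-only variant might look like this; again the transport and the pre-stored scene table are hypothetical.

```python
import json
import socket

# Scenes assumed to be stored on the eyeshade ahead of time.
PRESTORED_SCENES = {"takeoff": "nose_up_runway", "turn": "banked_horizon"}

def send_display_instruction(goggles_addr, attitude):
    """Controller side: send only the attitude name; the picture itself
    already resides in the eyeshade's storage device."""
    with socket.create_connection(goggles_addr) as sock:
        sock.sendall(json.dumps({"display": attitude}).encode())

def handle_instruction(raw_message, render):
    """Eyeshade side: look up the pre-stored picture and render it."""
    attitude = json.loads(raw_message)["display"]
    render(PRESTORED_SCENES[attitude])
```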
In one embodiment, the displaying of the picture corresponding to the preset flight attitude through the virtual eyeshade may be implemented as the following steps:
And sending the picture corresponding to the preset flight attitude to the virtual reality simulation control system corresponding to the virtual eyeshade, so that the virtual reality simulation control system corresponding to the virtual eyeshade controls the virtual eyeshade to display the picture corresponding to the preset flight attitude.
The executing body of this embodiment may be a system for controlling the mechanical arm, for example a servo control system. A servo control system is an automatic control system whose outputs, such as the position, orientation and state of the controlled object, follow arbitrary changes of the input (or given value). Specifically, when the mechanical arm is a six-axis heavy-duty intelligent robot, the servo system may consist of a single servo or of a plurality of servos; for example, it may consist of six servos that respectively control the six joints of the six-axis heavy-duty intelligent robot, that is, each servo controls one joint, which reduces the computation required of each servo, increases the fineness of control and reduces the probability of control errors.
The system can be connected with the virtual reality simulation control system that controls the virtual eyeshade, so that the virtual eyeshade can be controlled indirectly through the virtual reality simulation control system. While controlling the mechanical arm to simulate the preset flight attitude, the system can send the picture corresponding to the preset flight attitude to the virtual reality simulation control system corresponding to the virtual eyeshade, so that this system controls the virtual eyeshade to display the picture. Fig. 2A is a schematic diagram of the connection and data interaction between the servo control system and the virtual reality simulation control system in this embodiment, where the arrows indicate the direction of information flow. As can be seen from fig. 2A, the servo control system is used to control the mechanical arm, the virtual reality simulation control system is used to control the virtual eyeshade, and the servo control system can exchange data with the virtual reality simulation control system, so that the flight attitude simulated by the mechanical arm is consistent with the picture displayed by the virtual eyeshade.
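The one-servo-per-joint decomposition and the synchronisation with the virtual reality simulation control system could be organised roughly as below; the class layout and method names are assumptions, not the patented implementation.

```python
class ServoControlSystem:
    """Six single-joint servos plus a link to the VR simulation control
    system, so arm motion and the displayed picture stay consistent."""

    def __init__(self, servos, vr_control_system):
        assert len(servos) == 6         # one servo per joint of the six-axis arm
        self.servos = servos
        self.vr = vr_control_system    # assumed interface with a display() method

    def simulate_attitude(self, joint_angles, scene_id):
        # Each servo drives only its own joint, keeping per-servo computation small.
        for servo, angle in zip(self.servos, joint_angles):
            servo.move_to(angle)
        # Forward the picture to the VR simulation control system so the
        # displayed scene matches the simulated attitude.
        self.vr.display(scene_id)
```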
In one embodiment, as shown in fig. 2B, the visual interaction apparatus includes a human eye detection device disposed in the virtual eyeshade, and the method can be further implemented as the following steps S21-S22:
in step S21, in the process of simulating the preset flight attitude by the robot arm, eye information of the user wearing the virtual eyepatch is detected by the eye detection device;
in step S22, the mental load and vestibular function score of the user wearing the virtual eyewear are calculated based on the human eye information.
In this embodiment, in the process of simulating the preset flight attitude by the mechanical arm, the eye information of the user wearing the virtual eyeshade is detected by the human eye detection device. Specifically, the human eye detection device may be an eye tracker, an important instrument for basic physiological research that records the characteristics of a person's eye-movement trajectory while processing visual information. The human eye information may refer to the user's eye-movement information and/or pupil size. Taking pupil size as an example, in general the pupil dilates when a person is relaxed and constricts when the person is tense, so the user's mental load can be analysed from the pupil size; of course, the mental load can also be analysed from the eye-movement information combined with the pupil size.
The human vestibular organ is the receptor through which the body senses its own motion state and the spatial position of the head. During flight simulation training, when the user's body moves with the manned device at the tail end of the mechanical arm, the vestibular organ produces a corresponding sensation, and that sensation elicits a corresponding vestibulo-ocular reflex. The vestibulo-ocular reflex is the reflex pathway that drives the extraocular muscles when the peripheral vestibular receptors are stimulated, producing eye movements; the temporal and spatial characteristics of those eye movements are a physiological expression of visual information processing, and the pupil size reflects whether the person is in a tense state. The vestibular function state and mental load of the user can therefore be judged from the eye-movement information and the pupil size, and in turn whether the user's vestibular function is up to standard, so the mental load and the vestibular function score of the user wearing the virtual eyeshade can be calculated from the human eye information detected by the human eye detection device.
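The disclosure does not give formulas, so the following is only a toy illustration of how pupil size and eye-movement data might be turned into a load estimate and a score; the baselines and thresholds are invented.

```python
import statistics

def mental_load_index(pupil_diameters_mm, baseline_mm=4.0):
    """Toy load proxy: relative pupil constriction versus a relaxed baseline
    (the pupil tends to constrict under tension, per the description)."""
    mean_diameter = statistics.mean(pupil_diameters_mm)
    return max(0.0, (baseline_mm - mean_diameter) / baseline_mm)

def vestibular_score(eye_velocities_deg_s, expected_velocity_deg_s):
    """Toy vestibulo-ocular proxy: the score drops as measured eye-movement
    velocity departs from the velocity expected for the simulated motion."""
    mean_error = statistics.mean(
        abs(v - expected_velocity_deg_s) for v in eye_velocities_deg_s)
    return max(0.0, 100.0 - mean_error)

def should_stop_training(load, threshold=0.6):
    """Stop the training item when the estimated mental load is too high."""
    return load >= threshold
```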
The beneficial effect of this embodiment lies in: in the process of simulating the preset flight attitude by the mechanical arm, the eye information of the user wearing the virtual eyeshade is detected by the human eye detection device, and the mental load and the vestibular function score of the user are calculated based on the human eye information. Monitoring the mental load of the user makes it possible to stop training in time when the mental load becomes too large, preventing injury to the user; in addition, because the vestibular function score is calculated automatically, no manual scoring is required, which simplifies operation for the user.
Fig. 3 is a block diagram of a flight attitude simulation apparatus according to an embodiment of the present application, the apparatus including the following modules:
a receiving module 31, configured to receive a trigger event simulating a preset flight attitude;
the acquiring module 32 is configured to acquire a mechanical arm control parameter corresponding to a preset flight attitude and a picture corresponding to the preset flight attitude;
and the simulation module 33 is used for controlling the mechanical arm to simulate a preset flight attitude based on the mechanical arm control parameters, and displaying a picture corresponding to the preset flight attitude through the visual interaction device, wherein the tail end of the mechanical arm is provided with a manned device.
In one embodiment, as shown in fig. 4, the simulation module 33 includes:
and the display sub-module 41 is used for displaying the picture corresponding to the preset flight attitude through the virtual eye shield.
In one embodiment, the apparatus further comprises:
the detection module is used for detecting eye information of a user wearing the virtual eyeshade through eye detection equipment in the process of simulating the preset flight attitude by the mechanical arm;
and the calculation module is used for calculating the mental load and the vestibular function score of the user wearing the virtual eye patch based on the human eye information.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present application, and the protection scope of the present application is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered to be within the scope of the present application.

Claims (4)

1. A method of flight attitude simulation, comprising:
receiving a trigger event simulating a preset flight attitude, wherein the trigger event comprises an event of receiving a trigger instruction of a button corresponding to the preset flight attitude, or an event that the current time reaches a preset time in a training program set based on the sequence and the natural time of the real flight process of the airplane, and the preset time is the natural time of finishing a training project corresponding to the previous flight attitude of the preset flight attitude;
responding to the trigger event, acquiring mechanical arm control parameters corresponding to the preset flight attitude, and controlling a mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameters, wherein a manned device is arranged at the tail end of the mechanical arm;
responding to the trigger event, displaying a picture corresponding to the preset flight attitude through a visual interaction device while controlling the mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameter, wherein the visual interaction device comprises a virtual eyeshade and human eye detection equipment, a storage device is arranged in the virtual eyeshade, and the storage device stores the picture and display time corresponding to the preset flight attitude in advance, so that the picture corresponding to the preset flight attitude stored in the storage device in advance is displayed by the virtual eyeshade while controlling the mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameter, and the method comprises the following steps: sending the picture and the display time corresponding to the preset flight attitude to a virtual eyeshade so that the virtual eyeshade displays the picture corresponding to the preset flight attitude based on the display time;
the human eye detection device is disposed in the virtual eye patch, the method further comprising:
in the process of simulating a preset flight attitude by the mechanical arm, detecting eye information of a user wearing the virtual eyeshade through eye detection equipment, wherein the eye information comprises pupil size;
calculating a mental load and a vestibular function score of the user wearing the virtual eyeshade based on the pupil size, and controlling termination of training based on the mental load of the user.
2. The method of claim 1, wherein displaying the picture corresponding to the preset flight attitude through the virtual eyeshade comprises:
And sending an instruction for displaying the picture corresponding to the preset flight attitude to the virtual eyeshade, so as to enable the virtual eyeshade to display the pre-stored picture corresponding to the preset flight attitude.
3. The method of claim 1, wherein displaying the picture corresponding to the preset flight attitude through the virtual eyeshade comprises:
And sending the picture corresponding to the preset flight attitude to the virtual reality simulation control system corresponding to the virtual eyeshade, so that the virtual reality simulation control system corresponding to the virtual eyeshade controls the virtual eyeshade to display the picture corresponding to the preset flight attitude.
4. A flying attitude simulating apparatus comprising:
the receiving module is used for receiving a trigger event for simulating a preset flight attitude, wherein the trigger event comprises an event for receiving a trigger instruction of a button corresponding to the preset flight attitude, or an event that the current time reaches a preset time in a training program set based on the sequence and the natural time of the real flight process of the airplane, and the preset time is the natural time for finishing a training project corresponding to the previous flight attitude of the preset flight attitude;
the acquisition module is used for responding to the trigger event and acquiring mechanical arm control parameters corresponding to the preset flight attitude and a picture corresponding to the preset flight attitude;
the simulation module is used for controlling the mechanical arm to simulate the preset flight attitude based on the mechanical arm control parameters, wherein a manned device is arranged at the tail end of the mechanical arm;
the simulation module comprises a display submodule and is used for responding to the trigger event, displaying a picture corresponding to the preset flight attitude through a visual interaction device when the mechanical arm is controlled to simulate the preset flight attitude based on the mechanical arm control parameter, wherein the visual interaction device comprises a virtual eyeshade and human eye detection equipment, a storage device is arranged in the virtual eyeshade, and the storage device stores the picture and display time corresponding to the preset flight attitude in advance, so that the picture corresponding to the preset flight attitude stored in the storage device in advance is displayed by the virtual eyeshade when the mechanical arm is controlled to simulate the preset flight attitude based on the mechanical arm control parameter, and the simulation module comprises: sending the picture and the display time corresponding to the preset flight attitude to a virtual eyeshade so that the virtual eyeshade displays the picture corresponding to the preset flight attitude based on the display time;
the detection module is used for detecting human eye information of a user wearing the virtual eyeshade through human eye detection equipment in the process of simulating a preset flight attitude by the mechanical arm, wherein the human eye detection equipment is arranged in the virtual eyeshade, and the human eye information comprises pupil size;
and the calculation module is used for calculating the mental load and the vestibular function score of the user wearing the virtual eye patch based on the pupil size and controlling the training termination based on the mental load of the user.
CN202010502399.9A 2020-06-04 2020-06-04 Flight attitude simulation method and device Active CN111653148B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010502399.9A CN111653148B (en) 2020-06-04 2020-06-04 Flight attitude simulation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010502399.9A CN111653148B (en) 2020-06-04 2020-06-04 Flight attitude simulation method and device

Publications (2)

Publication Number Publication Date
CN111653148A CN111653148A (en) 2020-09-11
CN111653148B (en) 2022-02-01

Family

ID=72347153

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010502399.9A Active CN111653148B (en) 2020-06-04 2020-06-04 Flight attitude simulation method and device

Country Status (1)

Country Link
CN (1) CN111653148B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113470466B (en) * 2021-06-15 2023-04-14 华北科技学院(中国煤矿安全技术培训中心) Mixed reality tunneling machine operation training system
CN113375501B (en) * 2021-07-16 2023-05-12 重庆零壹空间科技集团有限公司 Rocket launching training system and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108133639A (en) * 2017-12-21 2018-06-08 信阳泰蓝仿真科技有限公司 A kind of aviation psychology simulation system and test method
CN110223565A (en) * 2019-06-25 2019-09-10 深圳市道通智能航空技术有限公司 A kind of flight simulation method, device, equipment and storage medium
CN110502103A (en) * 2019-05-29 2019-11-26 中国人民解放军军事科学院军事医学研究院 Brain control UAV system and its control method based on brain-computer interface

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020068641A1 (en) * 2000-12-06 2002-06-06 Dicicco Thomas Roller coaster based simulator for amusement and flight training
CN100579446C (en) * 2008-04-23 2010-01-13 中国人民解放军空军航空医学研究所 Portable atria function detecting equipment
US8718796B2 (en) * 2009-03-05 2014-05-06 Mayo Foundation For Medical Education Galvanic vestibular stimulation system and method of use for simulation, directional cueing, and alleviating motion-related sickness
CN102626304A (en) * 2012-04-19 2012-08-08 重庆大学 Head-mounted wireless video eye tracker
CN104146684B (en) * 2014-06-25 2016-03-30 兰州大学 The dizzy detector of a kind of eyeshield formula
CN204614276U (en) * 2015-03-31 2015-09-02 刘宛平 A kind of emulation omnidirectional simulated flight device with mixed reality function
US9694294B1 (en) * 2015-12-31 2017-07-04 Oculus Vr, Llc Navigation controller for virtual-reality systems
US10986992B2 (en) * 2016-08-17 2021-04-27 Massachusetts Institute Of Technology Dynamic display system and method for customizing a controller in a display system
CN107644566A (en) * 2017-08-17 2018-01-30 北京航空航天大学 A kind of brain electricity evaluation system of the simulated flight device based on brain electricity
CN109009173B (en) * 2018-08-30 2022-02-01 北京机械设备研究所 Fatigue detection and regulation method based on electroencephalogram-eye movement bimodal signals
CN209912211U (en) * 2019-01-24 2020-01-07 刘珈企 Three-axis three-dimensional motion simulation system
CN109938961A (en) * 2019-02-21 2019-06-28 中国民航大学 Closed three-dimensional vestibular exercise device
CN111063233A (en) * 2019-12-19 2020-04-24 上海航天信息研究所 VR flight driving simulation system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108133639A (en) * 2017-12-21 2018-06-08 信阳泰蓝仿真科技有限公司 A kind of aviation psychology simulation system and test method
CN110502103A (en) * 2019-05-29 2019-11-26 中国人民解放军军事科学院军事医学研究院 Brain control UAV system and its control method based on brain-computer interface
CN110223565A (en) * 2019-06-25 2019-09-10 深圳市道通智能航空技术有限公司 A kind of flight simulation method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111653148A (en) 2020-09-11

Similar Documents

Publication Publication Date Title
US9183756B2 (en) Vestibular stimulation systems and methods of use
Gillingham et al. Spatial orientation in flight
CN111653148B (en) Flight attitude simulation method and device
US9984586B2 (en) Method and device to improve the flying abilities of the airborne devices operator
Menda et al. Optical brain imaging to enhance UAV operator training, evaluation, and interface development
Heinen Evidence for the spotting hypothesis in gymnasts
CN103218929B (en) A kind of based on navigation simulation method and system in the station module of Head down tilt bed rest
Bloomberg et al. Risk of impaired control of spacecraft/associated systems and decreased mobility due to vestibular/sensorimotor alterations associated with space flight
Shebilske An ecological efference mediation theory of natural event perception
US20230334788A1 (en) Mixed-Reality Visor For In-Situ Vehicular Operations Training
CN111785126A (en) Body rotation illusion simulation method integrating visual information and motion perception
Labedan et al. Virtual Reality for Pilot Training: Study of Cardiac Activity.
CN113409648A (en) Flight pitching illusion simulation method and device and flight illusion simulator
Cheung et al. Eye tracking, point of gaze, and performance degradation during disorientation
CN207886596U (en) A kind of VR rehabilitation systems based on mirror neuron
CN113409649B (en) Vestibular inclination illusion simulation method and device and flight illusion simulator
Cheung et al. Physiological and behavioral responses to an exposure of pitch illusion in the simulator
McGrath Tactile instrument for aviation
CN109903636A (en) A kind of portable pitch tilt illusion rectificative training device of fighter-pilot
Lewkowicz et al. Flights with the risk of spatial disorientation in the measurements of oculomotor activity of pilots
CN111359159B (en) Cervical vertebra rehabilitation training method
Blaginin et al. Methods of eye tracking and prospects of using it for the training of aviation medicine specialists
McGrath et al. Tactile displays: from the cockpit to the clinic
RU2667211C1 (en) Method for estimating the accuracy of perception of distance by a person
WO2023037623A1 (en) Rehabilitation assistance device, method therefor, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant