CN113778113A - Pilot-assisted driving method and system based on multi-modal physiological signals


Info

Publication number
CN113778113A
Authority
CN
China
Prior art keywords
data
decision
pilot
aircraft
module
Prior art date
Legal status (assumption, not a legal conclusion)
Granted
Application number
CN202110962284.2A
Other languages
Chinese (zh)
Other versions
CN113778113B (en)
Inventor
马超
郝李子翼
郜夯
晏志超
Current Assignee
University of Science and Technology Beijing (USTB)
Original Assignee
University of Science and Technology Beijing (USTB)
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by University of Science and Technology Beijing (USTB)
Priority to CN202110962284.2A
Publication of CN113778113A
Application granted
Publication of CN113778113B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 Control of attitude specially adapted for aircraft
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A61B 5/372 Analysis of electroencephalograms
    • A61B 5/389 Electromyography [EMG]
    • A61B 5/397 Analysis of electromyograms

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a pilot-assisted driving method and system based on multi-modal physiological signals, relating to the technical field of aircraft assisted driving. The method comprises the following steps: a multi-modal physiological signal recognition system acquires physiological data and environment perception information of a pilot, derives a rational action decision suggestion from them, and sends the suggestion to a human-machine cooperative task allocation decision system; the human-machine cooperative task allocation decision system determines a motion instruction from the suggestion and sends it to the control system; the multi-modal physiological signal recognition system also sends the suggestion and the environment perception information to the hardware vision system, which derives flight environment data from them and displays the data to the pilot; and a data management system stores the data generated by each system. The invention can improve the stability, safety and efficiency of flight.

Description

Pilot-assisted driving method and system based on multi-modal physiological signals
Technical Field
The invention relates to the technical field of aircraft assisted driving, and in particular to a pilot-assisted driving method and system based on multi-modal physiological signals.
Background
With the ever-increasing level of technology, the demand for aircraft and highly trained pilots is growing rapidly. In recent years, with the development of unmanned aerial vehicle technology, UAVs have been used in combat for tasks such as reconnaissance, surveillance, decoy deployment, harassment and deception, and electronic jamming, and have achieved considerable results. At present, however, fully unmanned operation still faces technical problems that have yet to be overcome. Therefore, breaking through the key technologies of pilot behavior intention prediction based on multi-modal physiological features and of human-machine fused intelligent decision enhancement, so that an intelligent robotic system can assist the pilot in driving, is an urgent problem to be solved.
Disclosure of Invention
To solve the problems of using an intelligent robotic system to assist a pilot, of pilot behavior intention prediction based on multi-modal physiological features, and of human-machine fused intelligent decision enhancement, embodiments of the invention provide a pilot-assisted driving method and system based on multi-modal physiological signals.
To solve the above technical problems, the invention provides the following technical solutions:
In one aspect, the method is implemented by a pilot-assisted driving system based on multi-modal physiological signals, the system comprising a human-machine cooperative task allocation decision system, a multi-modal physiological signal recognition system, a control system, a hardware vision system and a data management system; the method comprises the following steps:
the multi-modal physiological signal recognition system acquires physiological data and environment perception information of a pilot, derives a rational action decision suggestion from the physiological data and the environment perception information, and sends the rational action decision suggestion to the human-machine cooperative task allocation decision system;
the human-machine cooperative task allocation decision system determines a motion instruction according to the rational action decision suggestion and sends the motion instruction to the control system, so that the control system controls the aircraft through the motion instruction;
the multi-modal physiological signal recognition system sends the rational action decision suggestion and the environment perception information to the hardware vision system, and the hardware vision system derives flight environment data from them and displays the flight environment data to the pilot;
and the data management system stores the data generated by each system.
Optionally, the multi-modal physiological signal recognition system comprises a behavior intention prediction module based on multi-modal physiological features, a human-machine fusion interaction decision enhancement module, and an autonomous environment perception and situation cognition module;
and the acquiring of the physiological data and the environment perception information, the deriving of the rational action decision suggestion, and the sending of the suggestion to the human-machine cooperative task allocation decision system comprise:
the behavior intention prediction module perceives the instruction state and cognitive feature model of the pilot in real time, determines a prediction of behavior and action decisions, transmits the instruction state and cognitive feature model to the human-machine fusion interaction decision enhancement module, and transmits the prediction of behavior and action decisions to the autonomous environment perception and situation cognition module, wherein the physiological data comprise the instruction state and the cognitive feature model;
the autonomous environment perception and situation cognition module transmits the acquired environment perception information to the human-machine fusion interaction decision enhancement module;
and the human-machine fusion interaction decision enhancement module obtains the rational action decision suggestion from the instruction state, the cognitive feature model and the environment perception information, and sends it to the human-machine cooperative task allocation decision system.
Optionally, the determining of the motion instruction by the human-machine cooperative task allocation decision system according to the rational action decision suggestion comprises:
the human-machine cooperative task allocation decision system acquires hardware state data of the aircraft and data on the aircraft's internal and external environment;
the decision system judges, on the basis of these data, whether the rational action decision suggestion is suitable for execution;
if it is suitable for execution, the motion instruction is determined according to the rational action decision suggestion;
and if it is not suitable for execution, the decision system determines the motion instruction from the hardware state data and the internal and external environment data of the aircraft.
Optionally, the flight environment data includes:
GPS data, flight trajectory data, aircraft attitude data, control rate, and simulation data of the aircraft's internal and external environment.
Optionally, the storing by the data management system of the data generated by each system comprises:
storing situation awareness module data, hardware vision module data, multi-modal physiological feature data, strategy generation module data, and multi-degree-of-freedom manipulator module data.
In another aspect, a system is provided for implementing the above pilot-assisted driving method based on multi-modal physiological signals, the system comprising a human-machine cooperative task allocation decision system, a multi-modal physiological signal recognition system, a control system, a hardware vision system and a data management system;
the multi-modal physiological signal recognition system is configured to acquire physiological data and environment perception information of a pilot, derive a rational action decision suggestion from them, and send the suggestion to the human-machine cooperative task allocation decision system;
the human-machine cooperative task allocation decision system is configured to determine a motion instruction according to the rational action decision suggestion and send it to the control system;
the control system is configured to control the aircraft through the motion instruction;
the multi-modal physiological signal recognition system is further configured to send the rational action decision suggestion and the environment perception information to the hardware vision system, which derives flight environment data from them and displays the data to the pilot;
and the data management system is configured to store the data generated by each system.
Optionally, the multi-modal physiological signal recognition system comprises a behavior intention prediction module based on multi-modal physiological features, a human-machine fusion interaction decision enhancement module, and an autonomous environment perception and situation cognition module, wherein:
the behavior intention prediction module perceives the instruction state and cognitive feature model of the pilot in real time, determines a prediction of behavior and action decisions, transmits the instruction state and cognitive feature model to the human-machine fusion interaction decision enhancement module, and transmits the prediction of behavior and action decisions to the autonomous environment perception and situation cognition module, the physiological data comprising the instruction state and the cognitive feature model;
the autonomous environment perception and situation cognition module transmits the acquired environment perception information to the human-machine fusion interaction decision enhancement module;
and the human-machine fusion interaction decision enhancement module obtains the rational action decision suggestion from the instruction state, the cognitive feature model and the environment perception information, and sends it to the human-machine cooperative task allocation decision system.
Optionally, the human-machine cooperative task allocation decision system is further configured to:
acquire hardware state data of the aircraft and data on the aircraft's internal and external environment;
judge, on the basis of these data, whether the rational action decision suggestion is suitable for execution;
if it is suitable for execution, determine the motion instruction according to the rational action decision suggestion;
and if it is not suitable for execution, determine the motion instruction from the hardware state data and the internal and external environment data of the aircraft.
Optionally, the flight environment data includes:
GPS data, flight trajectory data, aircraft attitude data, control rate, and simulation data of the aircraft's internal and external environment.
Optionally, the data management system is further configured to:
store situation awareness module data, hardware vision module data, multi-modal physiological feature data, strategy generation module data, and multi-degree-of-freedom manipulator module data.
The above technical solutions provided by embodiments of the invention have at least the following beneficial effects:
in the embodiments of the invention, the multi-degree-of-freedom manipulator can observe the physical environment inside and outside the cockpit and the physiological data of the pilot in real time through the perception system, analyze the current environmental conditions, plan the manipulator's operation strategy through the human-machine cooperative task allocation decision system, and control the manipulator to operate according to preset actions, realizing functions including but not limited to alarm clearing, throttle operation and control panel operation. In particular, in an emergency, ground-station personnel or the pilot can disengage the manipulator's assisted driving, thereby improving the stability, safety and efficiency of flight.
Drawings
To illustrate the technical solutions in the embodiments of the invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1a is a flowchart of a pilot-assisted driving method based on multi-modal physiological signals according to an embodiment of the present invention;
FIG. 1b is a schematic diagram of an architecture of a pilot-assisted driving method based on multi-modal physiological signals according to an embodiment of the present invention;
FIG. 2 is a flow chart of a behavior intent prediction module according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a robot control method according to an embodiment of the present invention;
FIG. 4 is a block diagram of a data management system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the architecture of a pilot-assisted driving system based on multi-modal physiological signals according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
An embodiment of the invention provides a pilot-assisted driving method based on multi-modal physiological signals, implemented by a pilot-assisted driving system based on multi-modal physiological signals, the system comprising a human-machine cooperative task allocation decision system, a multi-modal physiological signal recognition system, a control system, a hardware vision system and a data management system. With reference to the flow chart of the method shown in fig. 1a and the architecture diagram shown in fig. 1b, the processing flow of the method may include the following steps:
step 101, a multi-modal physiological signal recognition system acquires physiological data and environmental perception information of a pilot, acquires a rational action decision suggestion according to the physiological data and the environmental perception information, and sends the rational action decision suggestion to a man-machine cooperative task allocation decision system.
The multi-modal physiological signal recognition system comprises a behavior intention prediction module of multi-modal physiological characteristics, a man-machine fusion interaction decision enhancement module and an autonomous environment perception and situation cognition module.
The step 101 comprises the following steps 1011-:
step 1011, the behavior intention prediction module senses the instruction state and the cognitive characteristic model of the pilot in real time and determines the prediction of behavior and action decisions.
The behavior intention prediction module comprises but is not limited to an electroencephalogram acquisition subsystem and/or a myoelectricity acquisition subsystem, an electroencephalogram data display and recording subsystem, an electroencephalogram processing subsystem and a data transmission submodule. In addition, the behavior intent prediction module may also include a manual command collection module, in which case the pilot may manually input commands.
In a feasible implementation mode, the behavior intention prediction module acquires information such as physiological state, instruction, cognition, intention and the like of a pilot through an electroencephalogram acquisition subsystem or a myoelectricity acquisition subsystem, electroencephalogram is processed through an electroencephalogram processing subsystem, processing operations mainly comprise preprocessing, feature extraction and feature classification, prediction of instruction loading and cognition feature models and behavior and action decisions is generated, determined information is transmitted to other modules through a data transmission submodule, and electroencephalogram data of the pilot can be displayed through an electroencephalogram data display and recording subsystem.
Specifically, the processing procedure of the behavior intention prediction module may be as shown in fig. 2: loading an interface parameter configuration file, then performing parameter calibration through a parameter calibration submodule, acquiring electroencephalogram data of a pilot through an electroencephalogram acquisition subsystem, storing the electroencephalogram data in an electroencephalogram data display and recording subsystem, and displaying through the subsystem; and integrating the acquired electroencephalogram data through a behavior intention recognition module. The system can judge whether the pilot exits or not, and if the pilot does not exit, the electroencephalogram data of the pilot are repeatedly acquired through the electroencephalogram acquisition subsystem; and if the judgment result is exit, exiting the system.
Optionally, the EEG acquisition subsystem may be a commonly used EEG amplifier with external dimensions of 210 × 170 × 40 mm, powered over a USB interface, supporting hot plugging and impedance detection, with a resolution of 0.5 microvolts and a sampling rate of 1000 Hz. The time constant can be selected from 0.03 s, 0.1 s and 0.3 s, and the high-frequency filter from 15 Hz, 30 Hz, 45 Hz, 60 Hz and 120 Hz, which meets the requirements of the embodiments of the invention. Electrodes follow the international 10-20 standard, with ten channels (Fz, Cz, Pz, Oz, P3, P4, P7, P8, O1, O2) and the left and right ear electrodes A1 and A2 as references. The resulting twelve channels of EEG data are processed and classified.
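The acquisition parameters above (1000 Hz sampling, ten 10-20 scalp channels with two ear references, a selectable high-frequency filter) can be sketched as a minimal preprocessing step. This is an illustrative sketch only: the function names and the 0.5-45 Hz band are assumptions, not part of the disclosed embodiment.

```python
import numpy as np
from scipy.signal import butter, filtfilt

CHANNELS = ["Fz", "Cz", "Pz", "Oz", "P3", "P4", "P7", "P8", "O1", "O2", "A1", "A2"]
FS = 1000  # sampling rate (Hz), per the amplifier described above

def rereference_to_ears(eeg: np.ndarray) -> np.ndarray:
    """Subtract the mean of the two ear reference channels from the scalp channels."""
    ref = eeg[[CHANNELS.index("A1"), CHANNELS.index("A2")]].mean(axis=0)
    return eeg[:10] - ref  # keep the ten scalp channels

def bandpass(eeg: np.ndarray, lo=0.5, hi=45.0, fs=FS, order=4) -> np.ndarray:
    """Zero-phase band-pass; 45 Hz matches one of the amplifier's filter options."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

# Example: one second of synthetic 12-channel data
eeg = np.random.randn(12, FS)
scalp = bandpass(rereference_to_ears(eeg))
print(scalp.shape)  # (10, 1000)
```

The filtered ten-channel array would then feed the feature extraction and classification stages described above.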
Step 1012: the instruction state and cognitive feature model of the pilot are transmitted to the human-machine fusion interaction decision enhancement module.
Step 1013: the prediction of behavior and action decisions is transmitted to the autonomous environment perception and situation cognition module.
Step 1014: the autonomous environment perception and situation cognition module transmits the acquired environment perception information to the human-machine fusion interaction decision enhancement module.
Step 1015: the human-machine fusion interaction decision enhancement module obtains a rational action decision suggestion from the instruction state, the cognitive feature model and the environment perception information, and sends it to the human-machine cooperative task allocation decision system.
In one feasible implementation, after acquiring the environment perception information, the human-machine fusion interaction decision enhancement module combines it with the pilot's instruction state and cognitive feature model to obtain a rational action decision suggestion: for example, judging whether a target can be tracked, avoiding unfavourable terrain and controlled areas as far as possible, and, in a tracking task, prompting the pilot to keep the target well within the field of view so that it is not lost.
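A minimal sketch of how such a fusion might produce a rational action decision suggestion from the decoded intent and the perceived environment; all class names, fields and rules below are hypothetical illustrations, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class PilotIntent:
    command: str       # decoded from EEG/EMG or manual input, e.g. "track_target"
    confidence: float  # classifier confidence in [0, 1]

@dataclass
class Environment:
    target_visible: bool
    over_restricted_area: bool

def rational_suggestion(intent: PilotIntent, env: Environment) -> str:
    """Fuse the pilot's decoded intent with environment perception into a suggestion."""
    if intent.command == "track_target" and not env.target_visible:
        return "reacquire_target"       # keep the target within the field of view
    if env.over_restricted_area:
        return "leave_restricted_area"  # avoid controlled/unfavourable areas
    # pass the decoded command through only when the classifier is confident enough
    return intent.command if intent.confidence >= 0.6 else "hold_position"

print(rational_suggestion(PilotIntent("track_target", 0.9),
                          Environment(target_visible=False,
                                      over_restricted_area=False)))  # reacquire_target
```

The suggestion string would then be handed to the human-machine cooperative task allocation decision system in step 102.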
Step 102: the human-machine cooperative task allocation decision system determines a motion instruction according to the rational action decision suggestion and sends it to the control system.
In one possible implementation, step 102 comprises the following steps 1021-1024:
Step 1021: the human-machine cooperative task allocation decision system acquires hardware state data of the aircraft and data on the aircraft's internal and external environment.
Step 1022: on the basis of these data, the decision system judges whether the rational action decision suggestion is suitable for execution.
Step 1023: if it is suitable for execution, the motion instruction is determined according to the rational action decision suggestion.
In one possible embodiment, the decision system compares the rational action decision suggestion against the current environment; if the environment is suitable, the motion instruction is determined from the suggestion, that is, the aircraft flies according to the pilot's EEG instruction, EMG instruction or manually input instruction.
Step 1024: if it is not suitable for execution, the decision system determines the motion instruction from the hardware state data and the internal and external environment data of the aircraft.
In one possible implementation, if the decision system judges that the pilot's instruction does not match the current environmental conditions, it determines the motion instruction from the aircraft's hardware state data and internal and external environment data. In this way, erroneous commands by the pilot can be reduced. Moreover, when the pilot cannot issue a command, for example in certain emergencies, the decision system can determine a suitable motion instruction from the current environment and fly autonomously.
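The arbitration in steps 1021-1024 can be sketched as a simple gate: execute the pilot-derived suggestion only when the hardware state and environment checks pass, otherwise fall back to an autonomous command. The argument names and the fallback planner are illustrative assumptions.

```python
def choose_motion_command(suggestion, hardware_ok, env_compatible, autonomous_planner):
    """Sketch of steps 1021-1024: hardware_ok and env_compatible stand for the
    checks against the aircraft hardware state data and the internal/external
    environment data; autonomous_planner is a hypothetical fallback."""
    if hardware_ok and env_compatible:
        return suggestion            # fly on the EEG/EMG/manual command as-is
    return autonomous_planner()      # environment-driven command (e.g. in emergencies)

print(choose_motion_command("climb", True, True, lambda: "hold_altitude"))   # climb
print(choose_motion_command("climb", True, False, lambda: "hold_altitude"))  # hold_altitude
```

The chosen command is what step 103 below forwards to the control system.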
Step 103: the control system controls the aircraft through the motion instruction.
The control system comprises a multi-degree-of-freedom manipulator.
In one possible embodiment, the control system may include a manual control mode module and an automatic control mode module. In the manual control mode, the pilot manually moves the manipulator into position.
The automatic control mode module comprises an instruction analysis submodule, a path planning and trajectory generation submodule, a collision detection submodule and a manipulator control module. In this mode, the instruction analysis submodule parses the motion instruction from the decision system into a manipulator control instruction so that the manipulator can operate the aircraft; the path planning and trajectory generation submodule generates a feasible trajectory for the manipulator's end effector in the workspace; and the collision detection submodule detects whether the manipulator would collide with itself or with the environment during the motion.
Specifically, the manipulator control flow may be as shown in fig. 3: load configuration parameters and perform a system self-check; receive a decision instruction or manual operation instruction; parse the instruction; generate a path plan and trajectory; perform collision detection; and execute motion control of the manipulator. The system then judges whether to exit; if not, instruction parsing is repeated, and if so, the system exits.
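The automatic-mode flow of fig. 3 can be sketched as a loop over the submodules; every callable here is an illustrative stand-in for the corresponding submodule, not the disclosed implementation.

```python
def arm_control_loop(get_instruction, parse, plan, collides, execute, should_exit):
    """Sketch of the fig. 3 flow (configuration loading and the system
    self-check would precede this loop)."""
    while not should_exit():             # "exit system?" check in fig. 3
        cmd = parse(get_instruction())   # instruction analysis submodule
        path = plan(cmd)                 # path planning / trajectory generation
        if collides(path):               # self- or environment-collision detection
            continue                     # reject the trajectory and wait for the next
        execute(path)                    # motion control of the manipulator

# Drive the loop with three queued instructions; the second trajectory "collides".
instructions = ["open_throttle", "bad_move", "press_panel"]
executed = []
arm_control_loop(
    get_instruction=lambda: instructions.pop(0),
    parse=lambda raw: raw.upper(),
    plan=lambda cmd: [cmd],
    collides=lambda path: path == ["BAD_MOVE"],
    execute=executed.append,
    should_exit=lambda: not instructions,
)
print(executed)  # [['OPEN_THROTTLE'], ['PRESS_PANEL']]
```

Rejecting a colliding trajectory and looping back mirrors the repeat of instruction parsing described above.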
And step 104, the multi-mode physiological signal recognition system sends the rational action decision suggestion and the environment perception information to the hardware vision system, and the hardware vision system determines flight environment data through the rational action decision suggestion and the environment perception information and displays the flight environment data to a pilot.
In one possible embodiment, a hardware vision system is used to simulate the real physical environment of the aircraft cabin and the complex ground environment, and the hardware vision system may include but is not limited to: the system comprises a plurality of liquid crystal displays, a foot rudder, a throttle valve, an operating rod, an adjustable support, a voltage converter, a switch, a data storage computer, a robot control computer, a simulation system computer, a vision computer, an electroencephalogram signal processing computer and an electroencephalogram special amplifier, wherein the plurality of liquid crystal displays can comprise three liquid crystal displays for displaying a vision and a liquid crystal display for realizing the display of an independent instrument panel.
The hardware vision system receives the rational action decision suggestion and the environment perception information and generates flight environment data of the aircraft through the above equipment. The flight environment data comprise GPS data, flight trajectory data, aircraft attitude data, control rate, simulation data of the environment inside and outside the aircraft, and the like, and the corresponding flight environment data are shown on the liquid crystal displays.
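As a sketch only, the flight environment data enumerated above could be grouped into one record; every field name and value here is an assumption of this illustration, not part of the disclosure:

```python
from dataclasses import dataclass, field

# Illustrative container for the flight environment data listed above.
@dataclass
class FlightEnvironmentData:
    gps: tuple                 # (latitude, longitude, altitude) - assumed layout
    trajectory: list           # recorded flight path points
    attitude: tuple            # (roll, pitch, yaw), assumed in degrees
    control_rate: float        # operation-effectiveness metric
    simulation: dict = field(default_factory=dict)  # internal/external environment simulation data

frame = FlightEnvironmentData(
    gps=(39.99, 116.32, 1200.0),
    trajectory=[(39.98, 116.31), (39.99, 116.32)],
    attitude=(1.5, -0.8, 92.0),
    control_rate=0.93,
)
print(frame.attitude[2])   # yaw component → 92.0
```

Each display would then render one slice of such a record (trajectory on the visual-scene displays, attitude and control rate on the instrument panel).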
Step 105, the data management system stores the data generated by each system.
In one possible implementation, the data management system stores the data generated by each system; as shown in fig. 4, these may include situation awareness module data, hardware view module data, multi-modal physiological characteristic data, strategy generation module data, and multi-degree-of-freedom mechanical arm module data.
The situation awareness module data comprises a fuzzy parameter input sub-module, a parameter modification sub-module, a parameter query sub-module and a configuration file output sub-module. The fuzzy parameter input sub-module is used for importing data or configuration files containing all parameters into the situation awareness module; the parameter modification sub-module is used for modifying situation awareness parameters during operation; the parameter query sub-module is used by ground station personnel to quickly find needed data during operation; and the configuration file output sub-module is used for uniformly outputting all the situation awareness module parameters to generate a configuration file.
The hardware view module data comprises a GPS data submodule, a flight track data submodule, an aircraft attitude data submodule and a control rate submodule. The GPS data submodule is used for recording the position information of the aircraft in the flight process; the flight track data submodule is used for observing and recording the flight path track of the aircraft in the flight process; the aircraft attitude data submodule is used for observing and recording flight attitude data of the aircraft in the test process; the control rate sub-module is used to evaluate the effectiveness of the operation during flight.
The multi-degree-of-freedom mechanical arm module data comprises a mechanical arm data analysis submodule, a sensor data analysis submodule, a path data submodule, a parameter modification submodule, a history record query submodule and a pose data submodule. The mechanical arm data analysis submodule is used for selecting a required single-robot motion data packet for analysis; the sensor data analysis submodule is used for analyzing data acquired by the sensors during flight; the path data submodule is used for collecting motion data of the cooperative robot during flight; the parameter modification submodule is used for modifying, adding and deleting robot data during flight; the history record query submodule is used for checking mechanical arm data and sensor data from past flights according to event labels; and the pose data submodule is used for recording position and attitude data of the multi-degree-of-freedom mechanical arm during flight.
The strategy generation module data comprises a data analysis sub-module, a parameter modification sub-module, a parameter query sub-module and a history query sub-module.
The multi-modal physiological characteristic data comprises a data acquisition sub-module, an instruction recording sub-module, a parameter query sub-module and a history query sub-module.
The following describes, by way of example, how a pilot controls an aircraft through the pilot-assisted driving system based on multi-modal physiological signals:
Step 1: the pilot enters a cockpit equipped with a mechanical arm control platform, correctly wears a physiological signal acquisition system, such as an electroencephalogram acquisition system, and prepares to execute a specific battlefield task, such as reconnaissance, surveillance, or striking an enemy area.
Step 2: the man-machine cooperative task allocation decision system sends information such as the aircraft's own position and its positional relation to adjacent enemy or friendly aircraft to the display of the human-machine interaction system; where the communication band permits, live scene footage can also be transmitted back to facilitate the brain-controlled pilot's decision-making.
Step 3: the brain-controlled pilot makes the optimal judgment according to the battlefield information (the fed-back state and environment information, together with what the pilot can observe directly) and outputs the corresponding physiological signals by gazing at the corresponding stimulus on the screen.
Step 4: the brain-computer interface system acquires the decision maker's electroencephalogram signals, processes them to obtain the corresponding command, and switches the control mode of the mechanical arm accordingly.
Step 5: after selecting a control mode, the brain-controlled pilot can perform further control.
In step 3, the electroencephalogram signals output by gazing at the corresponding stimulus on the screen can be generated either by acquiring the eye-closing and eye-opening actions of the brain-controlled pilot's left and right eyes, or by acquiring electroencephalogram signals based on steady-state visual evoked potentials (SSVEPs).
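One common way to decode SSVEPs is to pick the stimulation frequency with the strongest spectral response in an EEG epoch. As a hedged sketch of that idea, the following uses the Goertzel algorithm on a synthetic signal; the sampling rate, candidate frequencies, and epoch length are assumptions, not values from the patent:

```python
import math

# SSVEP frequency detection via the Goertzel algorithm (stdlib only).
# All numeric parameters below are illustrative assumptions.

def goertzel_power(samples, fs, target_freq):
    # Spectral power of a single frequency bin (Goertzel recurrence).
    n = len(samples)
    k = round(n * target_freq / fs)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def classify_ssvep(samples, fs, stim_freqs):
    # Choose the stimulation frequency with the highest power.
    return max(stim_freqs, key=lambda f: goertzel_power(samples, fs, f))

fs = 250.0                                               # assumed sampling rate (Hz)
t = [i / fs for i in range(500)]                         # 2-second epoch
epoch = [math.sin(2 * math.pi * 10.0 * ti) for ti in t]  # gazing at a 10 Hz target
print(classify_ssvep(epoch, fs, [8.0, 10.0, 12.0]))      # → 10.0
```

Real EEG would of course be noisy and multi-channel; in practice methods such as canonical correlation analysis are often used instead of a single-channel power comparison.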
Further, step 4 includes a real-time control mode and a manual control mode. In the real-time control mode, tasks such as controlling the movement of the mechanical arm, operating buttons, and clearing alarms can be performed. The manual mode avoids interference with the pilot's operation in the event that the system is damaged.
In addition, within a specific control task the brain-controlled pilot only needs to repeat steps 3 to 5. If, after selecting a mode function, the brain-controlled pilot wants to return to the previous selection interface, the current movement mode can be terminated by, for example, clenching the teeth for 1 s, relaxing for 1 s, and repeating this process five times, after which the previous selection interface is displayed again.
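The clench/relax exit gesture above is essentially pattern matching on a stream of timed muscle-artifact events. A minimal sketch, assuming events arrive as (state, duration) pairs and allowing a timing tolerance of my own choosing:

```python
# Sketch of the mode-exit gesture: clench ~1 s, relax ~1 s, repeated five times.
# The event representation and the 0.3 s tolerance are assumptions of this sketch.

def is_exit_gesture(events, repeats=5, duration=1.0, tol=0.3):
    """events: list of (state, seconds) pairs, state in {'clench', 'relax'}."""
    if len(events) < 2 * repeats:
        return False
    tail = events[-2 * repeats:]                 # last `repeats` clench/relax pairs
    for i, (state, seconds) in enumerate(tail):
        expected = "clench" if i % 2 == 0 else "relax"
        if state != expected or abs(seconds - duration) > tol:
            return False
    return True

pattern = [("clench", 1.0), ("relax", 1.1)] * 5
print(is_exit_gesture(pattern))      # → True
print(is_exit_gesture(pattern[:6])) # → False (only three repetitions)
```

Checking only the tail of the event stream lets the detector run continuously while the pilot performs unrelated actions.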
In the embodiment of the invention, the multi-degree-of-freedom mechanical arm can observe the physical environment inside and outside the cockpit and the pilot's physiological data in real time through the perception system, analyze the current environmental conditions, plan the arm's operation strategy through the man-machine cooperative task allocation decision system, and control the arm to operate according to preset actions, realizing functions including but not limited to alarm clearing, throttle operation, and control panel operation. In particular, in emergency situations, the mechanical arm's assisted driving can be disengaged by ground station personnel or by the pilot, thereby improving the stability, safety, and efficiency of flight.
An embodiment of the invention further provides a pilot-assisted driving system based on multi-modal physiological signals, which is used to implement the above pilot-assisted driving method based on multi-modal physiological signals. As shown in the architecture diagram of fig. 5, the system includes a human-machine cooperative task allocation decision system, a multi-modal physiological signal recognition system, a control system, a hardware view system, and a data management system, wherein:
the multi-mode physiological signal identification system is used for acquiring physiological data and environment perception information of a pilot, acquiring a rational action decision suggestion according to the physiological data and the environment perception information, and sending the rational action decision suggestion to the man-machine cooperative task allocation decision-making system;
the human-computer cooperative task allocation decision system is used for determining a movement instruction according to the obtained rational action decision suggestion and sending the movement instruction to the control system;
the control system is used for controlling the aircraft to move through the motion command;
the multi-mode physiological signal identification system is used for sending the rational action decision suggestion and the environment perception information to the hardware vision system, and the hardware vision system determines flight environment data through the rational action decision suggestion and the environment perception information and displays the flight environment data to the pilot;
and the data management system is used for storing the data generated by each system.
Optionally, the multi-modal physiological signal recognition system includes a behavior intention prediction module for multi-modal physiological features, a human-computer fusion interaction decision enhancement module, and an autonomous environment perception and situation cognition module;
the multi-modal physiological signal recognition system acquires physiological data and environmental perception information of a pilot, acquires a rational action decision suggestion according to the physiological data and the environmental perception information, and sends the rational action decision suggestion to the man-machine cooperative task allocation decision-making system, wherein the method comprises the following steps:
the behavior intention prediction module senses an instruction state and cognitive characteristic model of a pilot in real time, determines the prediction of behavior and action decisions, transmits the instruction state and cognitive characteristic model of the pilot to the man-machine fusion interaction decision enhancement module, and transmits the prediction of the behavior and action decisions to the autonomous environment sensing and situation cognition module; wherein the physiological data comprises an instruction state and a cognitive feature model;
the autonomous environment perception and situation cognition module transmits the acquired environment perception information to a man-machine fusion interaction decision enhancement module;
and the human-computer fusion interaction decision enhancement module obtains a rational action decision suggestion according to the instruction state and cognitive characteristic model of the pilot and the environment perception information, and sends the rational action decision suggestion to a human-computer cooperative task allocation decision system.
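The data flow among the three modules can be sketched as a tiny pipeline. The payload fields, thresholds, and the veto-style fusion rule below are all illustrative assumptions, not the patented fusion method:

```python
# Toy sketch of the module data flow: intention prediction + environment
# perception feed the fusion module, which emits a rational decision suggestion.
# All payloads and the fusion rule are assumptions of this sketch.

def predict_intention(physiological_data):
    # Behavior-intention prediction from instruction state / cognitive features.
    return {"predicted_action": "turn_left",
            "confidence": physiological_data["attention"]}

def perceive_environment(raw_sensors):
    # Autonomous environment perception and situation cognition.
    return {"obstacle_left": raw_sensors["left_range"] < 50.0}

def fuse_decision(intention, perception):
    # Human-machine fusion: veto the predicted action if perception contradicts it.
    if intention["predicted_action"] == "turn_left" and perception["obstacle_left"]:
        return {"suggestion": "hold_course"}
    return {"suggestion": intention["predicted_action"]}

intention = predict_intention({"attention": 0.9})
perception = perceive_environment({"left_range": 120.0})
print(fuse_decision(intention, perception))  # → {'suggestion': 'turn_left'}
```

The veto rule is just one way to "enhance" a human decision with machine perception; the patent leaves the concrete fusion policy open.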
Optionally, the human-computer collaborative task allocation decision system is further configured to:
the man-machine cooperative task allocation decision system acquires hardware state data of the aircraft and internal and external environment data of the aircraft;
the man-machine cooperative task allocation decision system judges whether the acquired rational action decision suggestion is suitable for execution or not according to the hardware state data of the aircraft and the internal and external environment data of the aircraft;
if the judgment result is suitable for execution, determining the movement instruction according to the rational action decision suggestion;
and if the judgment result is that the suggestion is not suitable for execution, the man-machine cooperative task allocation decision system determines the motion instruction according to the hardware state data of the aircraft and the internal and external environment data of the aircraft.
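The arbitration above (execute the suggestion only if aircraft state and environment permit, otherwise fall back to an instruction derived from aircraft data alone) can be sketched as follows; the feasibility predicate and the "hold" fallback are assumptions of this sketch:

```python
# Sketch of the decision system's arbitration between the rational action
# decision suggestion and a fallback based on aircraft data only.
# The predicate and fallback action are illustrative assumptions.

def suitable_for_execution(hardware_state, environment):
    # Feasibility check on hardware status and internal/external environment data.
    return hardware_state["status"] == "nominal" and environment["risk"] < 0.5

def decide_motion(suggestion, hardware_state, environment):
    if suitable_for_execution(hardware_state, environment):
        return {"source": "suggestion", "action": suggestion["action"]}
    # Not suitable: derive a conservative instruction from aircraft data alone.
    return {"source": "fallback", "action": "hold"}

print(decide_motion({"action": "climb"},
                    {"status": "nominal"}, {"risk": 0.2}))  # suggestion accepted
print(decide_motion({"action": "climb"},
                    {"status": "fault"}, {"risk": 0.2}))    # fallback engaged
```

Keeping the feasibility check separate from the fallback makes each path independently testable, which matters when the suggestion originates from noisy physiological signals.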
Optionally, the flight environment data includes:
GPS data, flight trajectory data, aircraft attitude data, control rate and aircraft internal and external environment simulation data.
Optionally, the data management system is further configured to:
the data management system stores situation perception module data, hardware view module data, multi-mode physiological characteristic data, strategy generation module data and multi-degree-of-freedom mechanical arm module data.
In the embodiment of the invention, the multi-degree-of-freedom mechanical arm can observe the physical environment inside and outside the cockpit and the pilot's physiological data in real time through the perception system, analyze the current environmental conditions, plan the arm's operation strategy through the man-machine cooperative task allocation decision system, and control the arm to operate according to preset actions, realizing functions including but not limited to alarm clearing, throttle operation, and control panel operation. In particular, in emergency situations, the mechanical arm's assisted driving can be disengaged by ground station personnel or by the pilot, thereby improving the stability, safety, and efficiency of flight.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A pilot auxiliary driving method based on multi-modal physiological signals is characterized in that the method is realized by a pilot auxiliary driving system based on multi-modal physiological signals, and the system comprises a man-machine cooperative task allocation decision-making system, a multi-modal physiological signal recognition system, a control system, a hardware view system and a data management system; the method comprises the following steps:
the multi-modal physiological signal identification system acquires physiological data and environmental perception information of a pilot, acquires a rational action decision suggestion according to the physiological data and the environmental perception information, and sends the rational action decision suggestion to the man-machine cooperative task allocation decision system;
the man-machine cooperative task allocation decision-making system determines a motion instruction according to the obtained rational action decision suggestion, and sends the motion instruction to the control system, so that the control system controls the aircraft to move through the motion instruction;
the multi-mode physiological signal recognition system sends the rational action decision suggestion and the environment perception information to the hardware vision system, and the hardware vision system determines flight environment data through the rational action decision suggestion and the environment perception information and displays the flight environment data to the pilot;
the data management system stores data generated by each system.
2. The method according to claim 1, wherein the multi-modal physiological signal recognition system comprises a behavior intention prediction module for multi-modal physiological features, a man-machine fusion interaction decision enhancement module and an autonomous environment perception and situation cognition module;
the multi-modal physiological signal recognition system acquires physiological data and environmental perception information of a pilot, acquires a rational action decision suggestion according to the physiological data and the environmental perception information, and sends the rational action decision suggestion to the man-machine cooperative task allocation decision-making system, wherein the method comprises the following steps:
the behavior intention prediction module senses an instruction state and cognitive characteristic model of a pilot in real time, determines the prediction of behavior and action decisions, transmits the instruction state and cognitive characteristic model of the pilot to the man-machine fusion interaction decision enhancement module, and transmits the prediction of the behavior and action decisions to the autonomous environment sensing and situation cognition module; wherein the physiological data comprises an instruction state and a cognitive feature model;
the autonomous environment perception and situation cognition module transmits the acquired environment perception information to a man-machine fusion interaction decision enhancement module;
and the human-computer fusion interaction decision enhancement module obtains a rational action decision suggestion according to the instruction state and cognitive characteristic model of the pilot and the environment perception information, and sends the rational action decision suggestion to a human-computer cooperative task allocation decision system.
3. The method of claim 1, wherein the determining, by the human-machine collaborative task allocation decision system, a motion instruction according to the obtained rational action decision suggestion comprises:
the man-machine cooperative task allocation decision system acquires hardware state data of the aircraft and internal and external environment data of the aircraft;
the man-machine cooperative task allocation decision system judges whether the acquired rational action decision suggestion is suitable for execution or not according to the hardware state data of the aircraft and the internal and external environment data of the aircraft;
if the judgment result is suitable for execution, determining the movement instruction according to the rational action decision suggestion;
and if the judgment result is that the suggestion is not suitable for execution, the man-machine cooperative task allocation decision system determines the motion instruction according to the hardware state data of the aircraft and the internal and external environment data of the aircraft.
4. The method of claim 1, wherein the flight environment data comprises:
GPS data, flight trajectory data, aircraft attitude data, control rate and aircraft internal and external environment simulation data.
5. The method of claim 1, wherein the data management system stores data generated by each system, comprising:
the data management system stores situation perception module data, hardware view module data, multi-mode physiological characteristic data, strategy generation module data and multi-degree-of-freedom mechanical arm module data.
6. A pilot auxiliary driving system based on multi-modal physiological signals is characterized in that the system is used for realizing a pilot auxiliary driving method based on multi-modal physiological signals, and comprises a man-machine cooperative task allocation decision-making system, a multi-modal physiological signal recognition system, a control system, a hardware view system and a data management system;
the multi-mode physiological signal identification system is used for acquiring physiological data and environment perception information of a pilot, acquiring a rational action decision suggestion according to the physiological data and the environment perception information, and sending the rational action decision suggestion to the man-machine cooperative task allocation decision-making system;
the human-computer cooperative task allocation decision system is used for determining a movement instruction according to the obtained rational action decision suggestion and sending the movement instruction to the control system;
the control system is used for controlling the aircraft to move through the motion command;
the multi-mode physiological signal identification system is used for sending the rational action decision suggestion and the environment perception information to the hardware vision system, and the hardware vision system determines flight environment data through the rational action decision suggestion and the environment perception information and displays the flight environment data to the pilot;
and the data management system is used for storing the data generated by each system.
7. The system of claim 6, wherein the multi-modal physiological signal recognition system comprises a behavior intention prediction module for multi-modal physiological features, a human-computer fusion interaction decision enhancement module, and an autonomous environment perception and situation cognition module;
the multi-modal physiological signal recognition system acquires physiological data and environmental perception information of a pilot, acquires a rational action decision suggestion according to the physiological data and the environmental perception information, and sends the rational action decision suggestion to the man-machine cooperative task allocation decision-making system, wherein the method comprises the following steps:
the behavior intention prediction module senses an instruction state and cognitive characteristic model of a pilot in real time, determines the prediction of behavior and action decisions, transmits the instruction state and cognitive characteristic model of the pilot to the man-machine fusion interaction decision enhancement module, and transmits the prediction of the behavior and action decisions to the autonomous environment sensing and situation cognition module; wherein the physiological data comprises an instruction state and a cognitive feature model;
the autonomous environment perception and situation cognition module transmits the acquired environment perception information to a man-machine fusion interaction decision enhancement module;
and the human-computer fusion interaction decision enhancement module obtains a rational action decision suggestion according to the instruction state and cognitive characteristic model of the pilot and the environment perception information, and sends the rational action decision suggestion to a human-computer cooperative task allocation decision system.
8. The system of claim 6, wherein the human-machine collaborative task allocation decision system is further configured to:
the man-machine cooperative task allocation decision system acquires hardware state data of the aircraft and internal and external environment data of the aircraft;
the man-machine cooperative task allocation decision system judges whether the acquired rational action decision suggestion is suitable for execution or not according to the hardware state data of the aircraft and the internal and external environment data of the aircraft;
if the judgment result is suitable for execution, determining the movement instruction according to the rational action decision suggestion;
and if the judgment result is that the suggestion is not suitable for execution, the man-machine cooperative task allocation decision system determines the motion instruction according to the hardware state data of the aircraft and the internal and external environment data of the aircraft.
9. The system of claim 6, wherein the flight environment data comprises:
GPS data, flight trajectory data, aircraft attitude data, control rate and aircraft internal and external environment simulation data.
10. The system of claim 6, wherein the data management system is further configured to:
the data management system stores situation perception module data, hardware view module data, multi-mode physiological characteristic data, strategy generation module data and multi-degree-of-freedom mechanical arm module data.
CN202110962284.2A 2021-08-20 2021-08-20 Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals Active CN113778113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110962284.2A CN113778113B (en) 2021-08-20 2021-08-20 Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals


Publications (2)

Publication Number Publication Date
CN113778113A true CN113778113A (en) 2021-12-10
CN113778113B CN113778113B (en) 2024-03-26

Family

ID=78838372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110962284.2A Active CN113778113B (en) 2021-08-20 2021-08-20 Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals

Country Status (1)

Country Link
CN (1) CN113778113B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115089190A (en) * 2022-08-25 2022-09-23 上海华模科技有限公司 Pilot multi-mode physiological signal synchronous acquisition system based on simulator
CN116820379A (en) * 2023-08-31 2023-09-29 中国电子科技集团公司第十五研究所 Equipment display control method based on human engineering, server and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107516452A (en) * 2017-08-17 2017-12-26 北京航空航天大学 A kind of general flight simulation simulated environment evaluation system
US20180039271A1 (en) * 2016-08-08 2018-02-08 Parrot Drones Fixed-wing drone, in particular of the flying-wing type, with assisted manual piloting and automatic piloting
CN108196566A (en) * 2018-03-16 2018-06-22 西安科技大学 A kind of small drone cloud brain control system and its method
WO2018113392A1 (en) * 2016-12-20 2018-06-28 华南理工大学 Brain-computer interface-based robotic arm self-assisting system and method
CN108945397A (en) * 2018-04-24 2018-12-07 中国商用飞机有限责任公司北京民用飞机技术研究中心 A kind of flight management system
CN109308076A (en) * 2017-07-27 2019-02-05 极光飞行科学公司 Unit automation system and method with integration imaging and power sensing mode
CN109710063A (en) * 2018-12-11 2019-05-03 中国航空工业集团公司西安航空计算技术研究所 A kind of intelligent multi-modal human-computer intellectualization frame fought, method and apparatus
CN110502103A (en) * 2019-05-29 2019-11-26 中国人民解放军军事科学院军事医学研究院 Brain control UAV system and its control method based on brain-computer interface
WO2019231477A1 (en) * 2018-05-31 2019-12-05 Gillett Carla R Robot and drone array
US20210178603A1 (en) * 2019-12-12 2021-06-17 The Boeing Company Emotional intelligent robotic pilot


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
GHOSH, L.; KONAR, A.: "EEG-Induced Adaptation of Controller Parameter for Closed-Loop Position Control of the End-Effecter in a Robot Arm", 2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT) *
丁全心: "Tactical decision-aiding technology in modern air combat", 《光电与控制》, vol. 16, no. 12 *
刘爽; 朱国栋: "A robot teleoperation method based on operator performance", 《机器人》, vol. 40, no. 04 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115089190A (en) * 2022-08-25 2022-09-23 上海华模科技有限公司 Pilot multi-mode physiological signal synchronous acquisition system based on simulator
CN115089190B (en) * 2022-08-25 2023-02-28 上海华模科技有限公司 Pilot multi-mode physiological signal synchronous acquisition system based on simulator
CN116820379A (en) * 2023-08-31 2023-09-29 中国电子科技集团公司第十五研究所 Equipment display control method based on human engineering, server and storage medium

Also Published As

Publication number Publication date
CN113778113B (en) 2024-03-26

Similar Documents

Publication Publication Date Title
CN113778113B (en) Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals
EP3459807A1 (en) System for monitoring an operator
Scerbo Adaptive automation
CN108415554B (en) Brain-controlled robot system based on P300 and implementation method thereof
CN109528157A (en) System and method for monitoring pilot's health
CN106343977B (en) Unmanned plane operator's condition monitoring system based on Multi-sensor Fusion
CN112051780B (en) Brain-computer interface-based mobile robot formation control system and method
CN110221620B (en) MAS-based multi-unmanned system supervision control station
CN109540546A (en) A kind of test macro and test method of unsafe driving behavior monitoring device
CN109558005A (en) A kind of adaptive man-machine interface configuration method
CN109196437A (en) Intelligent driving method, apparatus and storage medium
Yang et al. Real-time driver cognitive workload recognition: Attention-enabled learning with multimodal information fusion
CN113589835B (en) Autonomous perception-based intelligent robot pilot flight method and device
CN115951599A (en) Unmanned aerial vehicle-based driving capability test system, method and device and storage medium
Havlikova et al. A man as the regulator in man-machine systems
Huo et al. A BCI-based motion control system for heterogeneous robot swarm
CN112936259B (en) Man-machine cooperation method suitable for underwater robot
Wu et al. Visual attention analysis for critical operations in maritime collision avoidance
US20220374808A1 (en) Task concatenation and prediction of operator and system response in socio-technical systems using artificial intelligence
Moczulski Autonomous systems control aided by Virtual Teleportation of remote operator
CN114460958A (en) Brain-computer fusion flight control system based on hierarchical architecture
Kari et al. Eeg application for human-centered experiments in remote ship operations
Wu et al. A camera-based deep-learning solution for visual attention zone recognition in maritime navigational operations
CN116512262A (en) Man-machine cooperation interactive operation system and method based on multi-BCI and MR fusion
CN114041874B (en) Interface display control method and device, computer equipment and system and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant