CN114036641A - Glidepath deviation determination system, method and aircraft


Info

Publication number
CN114036641A
CN114036641A (application CN202111317457.1A)
Authority
CN
China
Prior art keywords
data
deviation
image
model
aircraft
Prior art date
Legal status
Pending
Application number
CN202111317457.1A
Other languages
Chinese (zh)
Inventor
熊绎维
陈华
郭浩
张炯
孙廷婷
吴赫男
Current Assignee
Commercial Aircraft Corp of China Ltd
Beijing Aeronautic Science and Technology Research Institute of COMAC
Original Assignee
Commercial Aircraft Corp of China Ltd
Beijing Aeronautic Science and Technology Research Institute of COMAC
Priority date
Filing date
2021-11-08
Publication date
2022-02-11
Application filed by Commercial Aircraft Corp of China Ltd and Beijing Aeronautic Science and Technology Research Institute of COMAC
Priority to CN202111317457.1A
Publication of CN114036641A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/10 - Geometric CAD
    • G06F30/15 - Vehicle, aircraft or watercraft design
    • G06F30/20 - Design optimisation, verification or simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The patent discloses a glideslope deviation determination system, a glideslope deviation determination method and an aircraft, belonging to the fields of digital image processing and aviation, and provides an image-based approach and landing guidance solution different from ILS (instrument landing system) and GBAS (ground-based augmentation system) solutions. The glideslope deviation determination system is suitable for civil fixed-wing aircraft, forms part of a vision-aided piloting system, and addresses the problem of how to obtain aircraft glideslope deviation data with accuracy comparable to existing sensors through image recognition technology and how to apply those data on a real aircraft. The main technical scheme of the invention comprises: an image acquisition device, an image processing unit and a pose calculation unit.

Description

Glidepath deviation determination system, method and aircraft
Technical Field
The invention relates to the fields of digital image processing and aviation, and in particular to a glidepath deviation determination system and method and to an aircraft.
Background
Landing is one of the most critical phases of fixed-wing aircraft operation, and during landing the deviation between the aircraft and the glidepath is an indispensable parameter. Existing approach and landing guidance in China is mostly based on ILS, but ILS has the following problems: its signal is formed by ground reflection and is easily affected by the ground and surrounding objects, so accumulated snow, standing water and excessively tall weeds can distort the reflected signal; its operating channels lie close to FM broadcast frequencies and are easily interfered with by FM stations; each runway end requires one set of ILS equipment, one set of ILS provides only a single vertical approach path, and all aircraft land at the same touchdown point; and both the capital investment and the maintenance cost are high. Therefore, GBAS-based approach and landing guidance has been studied at home and abroad to replace and supplement ILS. However, GBAS-based approach and landing guidance is not yet fully mature, and its integrity and reliability still need to be improved.
In the field of civil aircraft, automatic take-off and landing systems based on machine vision have become a future development trend. Using onboard image recognition and vision-based fully automatic flight, a civil aircraft can achieve autonomous taxiing, take-off and landing. Machine-vision-based automatic take-off and landing systems help reduce the pilot's workload in operating the aircraft so that the pilot can focus more on decision-making and task management. Applying this technology to large civil aircraft strengthens the potential of future aircraft operations and further improves aircraft safety.
Disclosure of Invention
The invention provides a glideslope deviation determination system, a glideslope deviation determination method and an aircraft equipped with the system, and offers a machine-vision-based approach and landing guidance solution different from ILS and GBAS.

The glideslope deviation determination system is suitable for civil fixed-wing aircraft, forms part of a vision-aided piloting system, and addresses the problem of how to obtain aircraft glideslope deviation data with accuracy comparable to existing sensors through image recognition technology and how to apply those data on a real aircraft.
To solve the problem of applying a glideslope deviation determination system on a real aircraft, the invention provides a glideslope deviation determination system that specifically comprises:
an image acquisition device, comprising an optical image acquisition device and an infrared image acquisition device, installed on the aircraft and used to acquire image data containing key runway information;
an image processing unit that preprocesses the image data; and
a pose calculation unit that determines the glideslope deviation of the aircraft from the image data and comprises a machine-learning-based glideslope deviation prediction model.
in the aircraft attitude determination system, the technical problem of how to obtain the glide slope deviation data with the precision similar to that of the existing sensor from the image data by the pose calculation unit is particularly important.
Based on this general inventive concept, the invention provides an end-to-end model: by training a machine learning model with a suitable method, the aircraft glideslope deviation data can be estimated directly from an image. The computation of the glideslope deviation data relies neither on additional landmark information nor on features extracted from preceding and following frames, so the end-to-end model achieves both high accuracy and high computation speed, is better suited to practical application, and is more robust. In practical use, no additional ground-feature recognition and no multi-frame image sequences are needed, which reduces the computational load.
Based on the above general inventive concept, and to ensure the accuracy and robustness of the end-to-end model, the collection and processing of model training data are particularly important. Considering the design and practical application scenarios of a vision-aided piloting system from an engineering point of view, the invention provides two typical ways of collecting model training data: data collected from a simulation environment and data collected from a real scene.

The two kinds of training data each have advantages and disadvantages. Training the model only with data collected from a simulation environment provides a large amount of low-cost training data, and this large volume of simulation data improves model accuracy. The simulation data are collected in mature flight-simulation software already used in industry, data under different meteorological conditions can be collected, and this plays a key role in the early design and preliminary verification of a vision-aided piloting system. Moreover, a model trained on simulation data can be applied directly to a real aircraft after relatively simple adjustment.

When the training data come from a real environment, the trained model adapts better to real-world input data. The aircraft glideslope deviation and related data in a real flight scene can be read directly from the aircraft bus, and only the timestamps of the images and of the aircraft attitude and related data need to be aligned, so data collection is not difficult; however, since sufficient data can only be gathered through many real flights, the collection cost is obviously much higher than for data collected from a simulation environment.

To further balance the efficiency of collecting training data against the model's final glideslope deviation prediction performance, data from both sources can be used together during training.
On the basis of the above glideslope deviation determination system, the following features may further be included:
optionally, the machine learning-based glide slope deviation prediction model is an end-to-end convolutional neural network model.
Optionally, the machine learning-based glidepath deviation model is obtained by training through the following method:
step S1, collecting data from a simulation environment and/or a real environment, the data comprising image data from the pilot's viewpoint and onboard data;
step S2, preprocessing the data;
step S3, constructing a neural network that takes the image obtained by the image processing unit as input and outputs glideslope deviation data and heading deviation data; the neural network comprises 33 convolutional layers and 1 fully connected layer.
Optionally, when the data collected in step S1 come from a real environment, the onboard data include deviation data (ILS data and/or GBAS data), as well as attitude data, barometric altitude, wheel speed and IRS data collected by other onboard sensors; the onboard data are acquired from the aircraft bus.
optionally, in the step S2, the preprocessing includes data enhancement processing; the data enhancement processing does not include rotation and translation. Because the specific location of the runway in the image corresponds to the ILS data, random rotation or translation of the input image may make the model less than correctly trained.
Optionally, a glideslope deviation determination method comprises:
obtaining an image containing a runway;
preprocessing the image;
inputting the processed image into a machine learning model; and
obtaining the glideslope deviation value.
Optionally, the machine learning model is an end-to-end model, the input of the model is an image, and the output is a glide slope deviation value.
Optionally, the glideslope deviation value includes glideslope deviation data and heading deviation data.
The technique has been verified on offline data from the X-Plane simulation environment and from a real scene. In use, images are input into the trained machine learning model, which, through forward inference, outputs the heading deviation and glide deviation guidance values predicted from the images. The true ILS heading deviation and glide deviation guidance values were collected at the same time; comparing the values predicted from the images with the true ILS values, the average per-frame error is 0.04 for the heading deviation and 0.08 for the glide deviation.
The verification results on the real-scene offline data are shown in Fig. 4, displayed on an offline computer screen. The values labelled "real" on the right are the true values acquired from the ILS, and the values labelled "pre" are the machine learning model's estimates; the deviation between the two is small.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a schematic diagram of a glideslope deviation determination system in an embodiment of the present invention;
FIG. 2 is a flow chart of a method for determining glideslope deviation in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of a method for training a glidepath deviation model in accordance with an embodiment of the present invention;
FIG. 4 is a diagram of the verification results of the glideslope deviation determination method in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
In an embodiment provided by the invention, the glideslope deviation determination system comprises an image acquisition device, an image processing unit and a pose calculation unit.
The image acquisition device is installed on the aircraft and is used to acquire image data containing key runway information. The image acquisition device uses a CCD camera and must meet the specification requirements of GJB 7083. The images acquired by the device have a resolution of at least 680×680 pixels. In this embodiment, the image data are shot from the pilot's viewpoint, so that the field of view of each frame covers the runway throughout the approach and landing, and the focal length must ensure that the runway is clearly visible.
The image processing unit preprocesses the image data so that the images from the image acquisition device can be displayed in a suitable form on a cockpit instrument, and so that the image data acquired by the image acquisition device better suit the machine learning model in the pose calculation unit.
The pose calculation unit determines the glideslope deviation of the aircraft from the image data and comprises a machine-learning-based glideslope deviation prediction model.
The machine-learning-based glideslope deviation prediction model in the pose calculation unit is trained by the following method:
Step S1: collect data from the simulation environment. The data comprise image data from the pilot's viewpoint and onboard data. To achieve data diversity, simulated approaches and landings are flown from a position 3 nautical miles from the runway threshold under different simulated weather conditions, including clear weather, cloudy weather, low visibility and storms, and at different times of day (morning, noon and evening), and a large number of images are collected. Onboard data corresponding to the image data are then acquired; these onboard data include the ILS data output by professional flight-simulation software.
Because the time points of the onboard data output by the professional flight-simulation software do not correspond one-to-one to the time points of the image data, the timestamps of the image data and the onboard data need to be aligned, forming a data set that contains the image data and the corresponding onboard data. In general, the timestamp-alignment operation can be summarized as follows:
extract a frame shot by the image acquisition device and obtain its timestamp T; according to T, find in the onboard ILS data the glide deviation and heading deviation record whose timestamp is closest to, but not earlier than, T; name the image frame with an index and name the extracted ILS data with the same index, so that the onboard image data and the onboard ILS data correspond one-to-one.
Because the aircraft's attitude is stable during the approach, the change in the glide deviation and heading deviation within this small time difference is not enough to affect the overall training of the model; this method effectively reduces the workload of processing the training data and improves training efficiency.
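Purely as an illustration, the timestamp-alignment step described above could be sketched in Python roughly as follows; the container layout and the helper name `align_frames_to_ils` are assumptions made for this example and do not appear in the patent.

```python
from bisect import bisect_left

def align_frames_to_ils(frames, ils_records):
    """frames: sorted list of (timestamp, image); ils_records: sorted list of
    (timestamp, (glide_dev, heading_dev)). For each frame, pick the ILS record
    whose timestamp is closest to, but not earlier than, the frame timestamp."""
    ils_times = [t for t, _ in ils_records]
    dataset = []
    for index, (frame_time, image) in enumerate(frames):
        i = bisect_left(ils_times, frame_time)    # first ILS record with t >= frame_time
        if i == len(ils_records):
            continue                              # no later ILS record: drop this frame
        _, (glide_dev, heading_dev) = ils_records[i]
        # Name the image and the ILS label with the same index so they stay paired.
        dataset.append({"index": index, "image": image,
                        "glide_dev": glide_dev, "heading_dev": heading_dev})
    return dataset
```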
Step S2: preprocess the data. This specifically includes down-sampling the images, augmenting the data set, normalizing the data and reorganizing the data.
Specifically:
Down-sampling: to reduce the computational cost, the image resolution is down-sampled to 224×224, and the images are stored in BGR form, i.e. the channels represent blue, green and red respectively, with pixel values in the range [0, 255].
Data-set augmentation: the brightness, contrast, saturation and hue of the images are randomized to simulate data from different lighting conditions and camera grades. However, random rotation or translation cannot be used, because the specific location of the runway in the image corresponds to the ILS data, and randomly rotating or translating the input image would prevent the model from being trained correctly.
Data normalization: all image data are divided by 255 to normalize the values to [0, 1].
Data reorganization: to eliminate the time-series correlation of the data set, the labelled data are recombined to improve the model's adaptability and generalization performance.
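A minimal sketch of the step S2 preprocessing is given below, assuming OpenCV-style BGR input images; the jitter ranges and the helper name `preprocess` are illustrative choices, not values specified by the patent.

```python
import cv2
import numpy as np

def preprocess(image_bgr, augment=False, rng=None):
    rng = rng or np.random.default_rng()
    # Down-sample to 224x224 and keep BGR channel order.
    img = cv2.resize(image_bgr, (224, 224))
    if augment:
        # Randomize hue/saturation (in HSV space), then contrast and brightness.
        # Rotation and translation are deliberately excluded: the runway's
        # position in the frame encodes the ILS deviation label.
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float32)
        hsv[..., 0] = (hsv[..., 0] + rng.uniform(-5, 5)) % 180
        hsv[..., 1] = np.clip(hsv[..., 1] * rng.uniform(0.8, 1.2), 0, 255)
        img = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
        img = np.clip(img.astype(np.float32) * rng.uniform(0.9, 1.1)
                      + rng.uniform(-15, 15), 0, 255)
    # Normalize pixel values from [0, 255] to [0, 1].
    return np.asarray(img, dtype=np.float32) / 255.0
```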
Step S3: construct a neural network that takes the image obtained by the image processing unit as input and outputs the glide deviation data and heading deviation data; the neural network comprises 33 convolutional layers and 1 fully connected layer.
the concrete framework of the neural network comprises:
the convolutional neural network used in the present invention is a ResNet network and uses a 34-layer model, which contains 33 convolutional layers and 1 fully-connected layer, where the largest pooling layer is added after the first convolution and the average pooling layer is added before the last fully-connected layer. Wherein, after each convolution layer, a BatchNorm method and a Relu activation function are provided, and residual error processing is performed between each two convolution layers to form a residual error block. On the basis of ResNet, the output channel of the last full-connection layer is changed into 2, the one-dimensional channel corresponds to course deviation, and the one-dimensional channel corresponds to glide deviation;
in the training process of the model, the method also comprises the steps of comparing a guide estimation value output by the convolutional neural network with a true value based on ILS or GBAS to obtain a mean square error, and taking the mean square error as a loss function; performing back propagation according to the loss function, and updating the network parameters, wherein the updating aims to minimize the loss function; after training for a number of rounds, an end-to-end network model with a loss function reaching an expected value can be obtained.
The trained convolutional neural network model can then be put into use: an out-of-window image is input and the approach and landing guidance values are obtained directly.
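For completeness, a sketch of the inference step follows, reusing the `preprocess` and model sketches above; `camera_frame` stands for a single out-of-window BGR image and is an assumed variable name.

```python
import torch

model.eval()
with torch.no_grad():
    frame = preprocess(camera_frame)                           # HxWx3 float image in [0, 1]
    x = torch.from_numpy(frame).permute(2, 0, 1).unsqueeze(0)  # -> 1x3x224x224 tensor
    heading_dev, glide_dev = model(x)[0].tolist()              # approach guidance values
```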
To check the neural network model's prediction of the glideslope deviation value, newly acquired videos from the simulation environment and from the actual environment, which are completely independent of the training data and were acquired under various lighting and weather conditions, are input into the network for verification.
When a newly acquired onboard image from the simulation environment is input into the trained convolutional neural network model, forward inference outputs the heading deviation and glide deviation predicted from the image. The true ILS heading deviation and glide deviation were collected at the same time; comparing the predicted values with the true ILS values, the average per-frame error is 0.04 for the heading deviation and 0.08 for the glide deviation.
The verification results on real-scene offline data are shown in Fig. 4: images acquired during a test flight of the C919 airliner are input into the model of this embodiment, and the model output is displayed on the image. The values labelled "real" on the right are the true values acquired from the ILS, and the values labelled "pre" are the machine learning model's estimates; the deviation between the two is small.
Example 2
Example 2 has the same glideslope deviation determination system framework and composition as Example 1, except that the training method of the machine-learning-based glideslope deviation prediction model in the pose calculation unit differs. Example 2 uses data collected from a real environment for training, and the data collection before training proceeds as follows:
data collection from real environment: the data comprises image data of a driver visual angle and vehicle-mounted end data; in order to realize the diversity of data, video images of the aircraft approaching and landing are collected from a position 3 nautical miles away from the runway inlet in different weather, including fine weather, cloudy weather, low visibility, storm and the like, and at different moments in the morning, the noon and the evening. And then acquiring airborne data corresponding to the image data from the aircraft bus. The airborne data includes ILS data information (attitude deviation data) recognized and output by existing sensors in the aircraft, and flight attitude, barometric altitude, wheel speed, IRS data information, and the like are also collected to ensure training accuracy.
The invention also provides a glideslope deviation determination method comprising: obtaining an image containing a runway; preprocessing the image; inputting the processed image into a machine learning model; and obtaining the glideslope deviation value. The machine learning model is an end-to-end model whose input is an image and whose output is a glideslope deviation value. The glideslope deviation value comprises glideslope deviation data and heading deviation data.
The working principle of the glideslope deviation determination method is consistent with that of the glideslope deviation determination system and is not repeated here.
Furthermore, the invention provides an embodiment comprising an aircraft, in particular a fixed-wing aircraft, equipped with a glidepath deviation determination system as described above.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (9)

1. A glideslope deviation determination system, comprising:
an image acquisition device, comprising an optical image acquisition device and an infrared image acquisition device, installed on the aircraft and used to acquire image data containing key runway information;
an image processing unit that preprocesses the image data; and
a pose calculation unit that determines the glideslope deviation of the aircraft from the image data and comprises a machine-learning-based glideslope deviation prediction model.
2. The glideslope deviation determination system of claim 1, wherein the machine learning-based glideslope deviation prediction model is an end-to-end convolutional neural network model.
3. A glideslope deviation determination system according to claim 1 or 2, wherein the machine-learning-based glideslope deviation model is trained by:
step S1, collecting data from a simulation environment and/or a real environment, the data comprising image data from the pilot's viewpoint and onboard data;
step S2, preprocessing the data; and
step S3, constructing a neural network that takes the image obtained by the image processing unit as input and outputs glideslope deviation data and heading deviation data, the neural network comprising 33 convolutional layers and 1 fully connected layer.
4. A glideslope deviation determination system according to claim 3, wherein, when the data collected in step S1 come from a real environment, the onboard data comprise ILS deviation data and/or GBAS data, and attitude data, barometric altitude, wheel speed and/or IRS data collected by other onboard sensors; the onboard data are acquired from the aircraft bus.
5. A glideslope deviation determination system according to claim 3, wherein, in step S2, the preprocessing includes data enhancement; the data enhancement does not include rotation or translation.
6. A glidepath deviation determination method, comprising:
obtaining an image containing a runway;
preprocessing the image; and
inputting the processed image into a machine learning model and outputting a glideslope deviation value.
7. A method as claimed in claim 6, wherein the machine learning model is an end-to-end model, the input to the model being an image and the output being a glidepath deviation value.
8. A glideslope deviation determination method according to claim 7, wherein the glideslope deviation value comprises glideslope deviation data and heading deviation data.
9. An aircraft, characterized in that a glideslope deviation determination system according to any one of claims 1 to 5 is installed thereon.
CN202111317457.1A 2021-11-08 2021-11-08 Glidepath deviation determination system, method and aircraft Pending CN114036641A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111317457.1A CN114036641A (en) 2021-11-08 2021-11-08 Glidepath deviation determination system, method and aircraft

Publications (1)

Publication Number Publication Date
CN114036641A 2022-02-11

Family

ID=80136822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111317457.1A Pending CN114036641A (en) 2021-11-08 2021-11-08 Glidepath deviation determination system, method and aircraft

Country Status (1)

Country Link
CN (1) CN114036641A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination