CN111898444A - Aircraft landing gear state determination method based on image recognition


Publication number
CN111898444A
Authority
CN
China
Prior art keywords
image
airplane
undercarriage
landing gear
aircraft
Prior art date
Legal status (an assumption, not a legal conclusion)
Pending
Application number
CN202010610143.XA
Other languages
Chinese (zh)
Inventor
曾杰
汤本俊
刘高
刘连忠
黄�俊
赵国朋
杨东升
Current Assignee (listed assignees may be inaccurate)
Anhui Civio Information And Technology Co ltd
Original Assignee
Anhui Civio Information And Technology Co ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Anhui Civio Information And Technology Co ltd
Priority: CN202010610143.XA
Publication: CN111898444A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00624: Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K 9/00664: Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera
    • G06K 9/00671: Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera, for providing information about objects in the scene to a user, e.g. as in augmented reality applications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06N: COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computer systems based on biological models
    • G06N 3/02: Computer systems based on biological models using neural network models
    • G06N 3/08: Learning methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 2209/00: Indexing scheme relating to methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 2209/23: Detecting or categorising vehicles

Abstract

The invention provides an aircraft landing gear state determination method based on image recognition, which comprises the following steps: 1) continuously acquiring aerial target images using an image acquisition module erected beside an airstrip; 2) identifying the airplane and its position in the aerial target image using an airplane identification model; 3) obtaining the flight state of the airplane from the image sequence obtained in step 2); 4) extracting an airplane image from the airplane identification result; 5) identifying the undercarriage in the airplane image using an undercarriage identification model; 6) extracting an undercarriage image from the undercarriage identification result; 7) comprehensively judging whether the retraction state of the undercarriage is correct by combining the flight state of the airplane with the undercarriage image. By accurately identifying the undercarriage and combining the result with the flight state of the aircraft, the invention determines the landing gear state effectively and improves detection accuracy and stability.

Description

Aircraft landing gear state determination method based on image recognition
Technical Field
The invention relates to the technical field of airport safety protection, and in particular to a method for determining the retraction state of an aircraft landing gear based on image recognition technology.
Background
At present, the retraction state of an aircraft landing gear is mainly indicated by an instrument on board the aircraft; when that instrument fails, the pilot cannot learn the current state of the landing gear, which creates a serious safety risk. To safeguard flight, ground personnel therefore usually observe the aircraft through a telescope, confirm the gear's retraction state manually, and inform the pilot promptly when a gear fault is found. This method, however, is highly susceptible to environmental factors such as weather, light, and visibility, and demands sustained concentration from the observers, so lapses can lead to safety accidents.
With the development of image processing technology, image-based intelligent detection is gaining attention. It can monitor and judge the target state continuously, discover abnormal states in time, and alert the relevant personnel, thereby removing the influence of human factors. Although the landing gear state can be obtained through image detection, missed detections and false alarms occur easily because the position of the landing gear under different flight attitudes cannot be distinguished accurately. An image recognition technique that can recognise the landing gear accurately is therefore urgently needed to achieve stable and reliable detection of the retraction state of an aircraft's landing gear.
Disclosure of Invention
To overcome these defects in the prior art, the invention provides a method for automatically judging the state of an aircraft undercarriage based on image recognition. By automatically combining the flight attitude of the aircraft, the method judges accurately whether the undercarriage state is correct, yields reliable detection results, and thereby safeguards flight safety.
To this end, the invention provides an aircraft landing gear state determination method based on image recognition, comprising the following steps:
1) continuously acquiring aerial target images by using an image acquisition module erected on an airstrip;
2) adopting an airplane identification model to identify the airplane and the position of the airplane in the aerial target image;
3) acquiring the flight state of the airplane according to the group of image sequences obtained in the step 2);
4) extracting an airplane image from the airplane identification result;
5) identifying the undercarriage in the airplane image by adopting an undercarriage identification model;
6) extracting an undercarriage image from the undercarriage identification result;
7) comprehensively judging whether the retraction state of the undercarriage is correct by combining the flight state of the airplane with the undercarriage image.
Step 2) further includes identifying the size of the target airplane in the image and recording the time tag and the image acquisition module number at the moment the image is acquired.
Further, the flight state is determined from changes in the airplane's parameters in the images over a period of time, where the parameter changes include the change in position of the same airplane during image acquisition, or the change in speed calculated from position and a reference object. The flight state may also be determined from the change in size of the same airplane across successive images from the same image acquisition module.
The comprehensive judgment is specifically: when the aircraft is in a pull-up but the undercarriage is not retracted, the undercarriage state is judged to be wrong and an alarm is issued; when the aircraft is landing but the undercarriage is not down, the undercarriage state is likewise judged to be wrong and an alarm is issued.
In step 7), the undercarriage state can be judged from the position of the same undercarriage in the airplane images over a period of time, and the undercarriage action from the change in that position. The alarm is optical: a high-power yellow or red halogen lamp mounted on the image acquisition module flashes the warning.
Further, the airplane identification model and the undercarriage identification model are subjected to model training by using the same trainer.
Furthermore, an image processing step 4.1 is included after step 4) and before step 5): judge whether the airplane image is clear; if clear, proceed directly to step 5), otherwise go to step 4.2. Step 4.2 selects an image enhancement algorithm, according to the actual environment and hardware conditions, to increase image definition. Step 2) locates the position of the airplane using the Vibe algorithm and the Yolo algorithm; step 5) identifies the undercarriage using the SSD algorithm and marks its position in the image.
According to the invention, the landing gear state judgment is effectively realized by accurately identifying the landing gear and combining the flight state of the airplane, and the detection accuracy and stability are improved; the invention realizes real-time tracking of the target, can realize accurate positioning of the airplane and the undercarriage, and greatly improves the timeliness and the accuracy of undercarriage detection; the invention also optimizes the identification and extraction mode of the target, and the airplane identification model and the undercarriage identification model are trained by the same trainer, thereby reducing the complexity of the system.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and, together with the description, serve to explain its principles without limiting it.
FIG. 1 is a schematic illustration of a landing gear state determination device of the present invention;
FIG. 2 is a flowchart illustrating the operation of identifying a model according to an embodiment of the present invention;
FIG. 3 is a flowchart of an algorithm of a method for determining the status of an aircraft landing gear based on image recognition according to the present invention;
FIG. 4 is a schematic diagram of the actual detection effect of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It should be noted that the embodiments and features of the embodiments in the present application may be arbitrarily combined with each other without conflict.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
To address the defects of existing undercarriage state judgment, the invention provides an airplane undercarriage state determination device and method based on image recognition. Referring to fig. 1, the undercarriage state determination device comprises an image acquisition module, an airplane identification module, an airplane image extraction module, an undercarriage identification module, an undercarriage image extraction module, and an undercarriage state determination module.
The image acquisition module continuously acquires aerial target images with a camera erected beside the runway. The airplane identification module identifies the airplane and its position in the image with a deep learning algorithm. The airplane image extraction module extracts the target airplane image from the airplane identification result. The undercarriage identification module identifies the undercarriage and its position in the airplane image with a deep learning algorithm. The undercarriage image extraction module extracts the target undercarriage image from the undercarriage identification result. The undercarriage state determination module analyses the target undercarriage image in combination with the target airplane image to judge the retraction state of the undercarriage accurately. These functional modules are connected in sequence.
The method for judging the state of the aircraft landing gear based on image recognition comprises the following steps:
(1) the image acquisition module is used for continuously acquiring aerial target images by using a camera erected on an airplane runway;
(2) the airplane identification module identifies the airplane and its position in the aerial target image using an airplane identification model; preferably, it also identifies the size of the target airplane in the image and records the time tag, camera number, and so on at the moment of acquisition;
(3) the airplane identification module also obtains the flight state of the airplane from the image sequence obtained in step (2);
Step (3) may specifically comprise determining the flight state from changes in the aircraft's parameters in the images over a period of time, where the parameter changes include the change in position of the same aircraft during image acquisition, or the change in speed calculated from position and a reference object. Alternatively, the flight state may be determined from the change in size of the same aircraft across successive images from the same camera, and so on.
The flight state can include, but is not limited to, take-off, landing, pull-up, and hovering, and each state can be subdivided further, e.g. an early pull-up stage and a late pull-up stage, as specific requirements dictate.
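The size-based heuristic above can be sketched as follows; the function name, state labels, and thresholds are illustrative assumptions, not values taken from the patent:

```python
def classify_flight_state(boxes, grow=1.2, shrink=0.8):
    """Rough flight-state estimate from a time-ordered list of bounding
    boxes (x, y, w, h) of the same aircraft seen by one fixed camera.

    A growing apparent size suggests the aircraft is approaching the
    camera (consistent with landing); a shrinking one suggests it is
    departing (consistent with take-off or pull-up).
    """
    if len(boxes) < 2:
        return "unknown"
    first_area = boxes[0][2] * boxes[0][3]
    last_area = boxes[-1][2] * boxes[-1][3]
    ratio = last_area / first_area
    if ratio >= grow:
        return "approaching"   # e.g. a landing run
    if ratio <= shrink:
        return "departing"     # e.g. take-off or pull-up
    return "level"             # no clear size trend
```

A real deployment would smooth the box sequence and combine it with vertical position and runway geometry before mapping "approaching"/"departing" onto the landing, take-off, or pull-up states named above.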
(4) The airplane image extraction module extracts an airplane image from the airplane identification result;
(5) the undercarriage identification module identifies the undercarriage in the airplane image by adopting an undercarriage identification model;
(6) the undercarriage image extraction module extracts an undercarriage image from the undercarriage identification result;
(7) the undercarriage state determination module judges whether the retraction state of the undercarriage is correct from the undercarriage image combined with the flight state of the airplane.
Specifically, when the aircraft is in a pull-up but the undercarriage is not retracted, the undercarriage state is judged to be wrong and an alarm is sent to the tower and the operator; when the aircraft is landing but the undercarriage is not down, the undercarriage state is judged to be wrong and an alarm is likewise sent to the tower and the operator.
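The two alarm rules can be expressed as a minimal decision function; the string labels are illustrative, and a real system would take both inputs from the recognition modules:

```python
def check_gear_state(flight_state, gear_state):
    """Apply the two alarm rules: gear must be retracted during a
    pull-up and down during a landing."""
    if flight_state == "pull_up" and gear_state != "retracted":
        return "alarm: gear not retracted during pull-up"
    if flight_state == "landing" and gear_state != "down":
        return "alarm: gear not down during landing"
    return "ok"
```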
Additionally, in step (7) the undercarriage state determination module may also derive the undercarriage state and action from the obtained set of undercarriage images and the corresponding airplane image sequence; that is, the undercarriage state can be judged from the position of the same undercarriage in the airplane images over a period of time, and the undercarriage action from the change in that position. For example, when the landing gear emerges from the wheel well and the exposed portion keeps growing, the gear is extending; when the exposed portion keeps shrinking, the gear is retracting; and when the gear remains only partially exposed, unchanged for more than a predetermined time, a gear fault is indicated.
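A rough sketch of inferring the gear action from the exposed portion over time; the numeric thresholds are illustrative, since the patent does not specify values:

```python
def gear_action(exposed, timestamps, eps=0.05, stuck_after=10.0):
    """Infer gear action from the fraction of the gear visible outside
    the wheel well, per frame (0.0 = fully stowed, 1.0 = fully out).

    timestamps are in seconds; eps and stuck_after are illustrative.
    """
    delta = exposed[-1] - exposed[0]
    if delta > eps:
        return "extending"
    if delta < -eps:
        return "retracting"
    # partially exposed and unchanged beyond the timeout: likely a fault
    partially_out = 0.0 < exposed[-1] < 1.0
    if partially_out and timestamps[-1] - timestamps[0] > stuck_after:
        return "fault"
    return "steady"
```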
The alarm can be optical: a high-power yellow or red halogen lamp is mounted on the camera and flashed in a preset pattern, such as three long and one short, so that the relevant personnel notice the alarm.
Preferably, step (2) locates the specific position of the airplane using an improved Vibe algorithm together with the Yolo algorithm;
an image processing step (4.1) is further included after step (4) and before step (5): judge whether the airplane image is clear; if clear, proceed directly to step (5), otherwise go to step (4.2). Step (4.2) selects either traditional image enhancement or super-resolution enhancement, according to the actual environment and hardware conditions, to increase image definition.
In the step (5), the landing gear can be identified by adopting an SSD algorithm, and the position of the landing gear is marked in the image.
The identification models involve two processes, training and recognition. First, a training sample set of airplane and undercarriage images is collected, and a model file is obtained by network training. An image to be detected is then fed to the recognition model, which outputs the class and position of the target. The airplane identification model and the undercarriage identification model are trained with the same trainer, which reduces system complexity; they differ only in the training sample set used: the airplane training/test set for the airplane identification model and the undercarriage training/test set for the undercarriage identification model.
The process for establishing the airplane identification model and the undercarriage identification model is shown in fig. 2 and comprises the following sequential steps:
(1) acquiring airplane images under various conditions (different weather, different time periods, different airplane types and different angles);
(2) marking the airplanes in the images to produce an airplane training/test set, then further marking the undercarriages in the airplane-marked images to produce an undercarriage training/test set;
(3) defining a deep learning network used;
(4) training the networks with the respective training/test sets to obtain the airplane identification model file and the undercarriage identification model file.
With reference to fig. 3, a flowchart of a specific algorithm implementation of the method for determining the landing gear state of an aircraft based on image recognition is shown, and the specific steps are as follows:
a. input the image to be recognised; b. locate the specific position of the airplane with the improved Vibe algorithm and the Yolo algorithm; c. judge whether the image is clear; if so, go directly to step e, otherwise to step d; d. select traditional or super-resolution image enhancement, according to the actual environment and hardware conditions, to increase image definition; e. identify the landing gear with the SSD algorithm and mark its position in the image.
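Step c does not name a particular sharpness test; one common choice, shown here as an assumption, is the variance of the image's Laplacian response:

```python
import numpy as np

def laplacian_variance(gray):
    """Focus measure: variance of the 4-neighbour Laplacian response,
    computed on interior pixels with shifted array views."""
    g = gray.astype(np.float64)
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return lap.var()

def is_clear(gray, threshold=100.0):
    """Illustrative threshold; tune for the actual camera and scene."""
    return laplacian_variance(gray) > threshold
```

A blurred frame suppresses high-frequency detail, so its Laplacian variance drops; frames below the threshold would be routed to the enhancement step d.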
The Vibe algorithm is a pixel-level background modeling algorithm that detects the foreground by comparing the background model with the current input pixel value, thereby separating the airplane from the background. It proceeds as follows:
First, a background model is initialised for each pixel from a single frame, on the assumption that a pixel and its neighbours have similar distributions in the spatial domain. For the first frame, i.e. t = 0, the background model of pixel (x, y) is

    M0(x, y) = { f0(xi, yi) | (xi, yi) ∈ NG(x, y) }

where NG(x, y) denotes the spatially adjacent pixel positions and f0(x, y) the pixel value of the current point. During initialisation a neighbour (xi, yi) of NG(x, y) may be chosen more than once, L = 1, 2, 3, ….

Second, foreground objects are segmented in the subsequent image sequence. At t = k, the background model of pixel (x, y) is Mk-1(x, y) and its pixel value is fk(x, y). Whether the pixel is foreground is judged as follows:

    fk(x, y) is foreground if #{ b ∈ Mk-1(x, y) : |fk(x, y) - b| < τ } < T

where τ is a distance threshold and T is a set threshold.
Third, the background model is updated, by one of three methods:
1) Memoryless update. Each time a pixel's background model is to be updated, one sample in its sample set is replaced at random by the new pixel value.
2) Time-sampled update. The background model is updated at a certain frequency: when a pixel is judged background, it updates its background model with probability 1/rate, where rate is the time sampling factor, typically 16.
3) Spatial-neighbourhood update. For a pixel to be updated, the background model of a random neighbour is selected, and that model is updated with the new pixel value.
The Yolo algorithm is an image recognition algorithm based on a convolutional neural network (CNN). It comprises three steps: first, input an image and scale it to a fixed size; second, feed the image to the convolutional neural network for recognition; third, obtain the prediction result for the target to be detected.
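The fixed-size scaling in the first step is commonly done with aspect-preserving "letterbox" padding; a minimal NumPy sketch (nearest-neighbour resize; the 416-pixel size and grey fill value are illustrative conventions, not from the patent):

```python
import numpy as np

def letterbox(img, size=416, fill=128):
    """Scale img to fit a size x size canvas without distorting its
    aspect ratio, padding the remainder with a constant fill value."""
    h, w = img.shape[:2]
    scale = size / max(h, w)
    nh, nw = int(round(h * scale)), int(round(w * scale))
    # nearest-neighbour resize via integer index maps
    ys = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[ys][:, xs]
    canvas = np.full((size, size) + img.shape[2:], fill, dtype=img.dtype)
    top, left = (size - nh) // 2, (size - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas
```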
The SSD algorithm is likewise an image recognition algorithm based on a convolutional neural network, and it recognises small targets accurately. It comprises the following steps:
First, input a picture, extract features through the convolutional neural network, and generate a feature map.
Second, extract feature maps at multiple scales and generate multiple prediction boxes at each point of each feature map.
Third, gather all default boxes and filter heavily overlapping prediction boxes by non-maximum suppression; the remaining boxes are the detection result.
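The non-maximum suppression in the third step can be sketched in a few lines of plain Python (greedy NMS; the IoU threshold of 0.5 is a common convention, used here as an assumption):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop boxes that
    overlap it by more than iou_thresh, and repeat on the rest."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) <= iou_thresh]
    return keep
```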
Fig. 4 is a schematic diagram of the actual detection effect of the invention: it shows an aircraft landing with the position of the landing gear marked; the landing gear is in the down state and working normally.
Although the embodiments of the present invention have been described above, the above description is only for the convenience of understanding the present invention, and is not intended to limit the present invention. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the scope of the present invention should be determined by the following claims.

Claims (10)

1. An aircraft landing gear state determination method based on image recognition, the method comprising the steps of:
1) continuously acquiring aerial target images by using an image acquisition module erected on an airstrip;
2) adopting an airplane identification model to identify the airplane and the position of the airplane in the aerial target image;
3) acquiring the flight state of the airplane according to the group of image sequences obtained in the step 2);
4) extracting an airplane image from the airplane identification result;
5) identifying the undercarriage in the airplane image by adopting an undercarriage identification model;
6) extracting an undercarriage image from the undercarriage identification result;
7) comprehensively judging whether the retraction state of the undercarriage is correct by combining the flight state of the airplane with the undercarriage image.
2. The method of claim 1, wherein step 2) further comprises identifying the size of the target aircraft in the image, and recording the time stamp and the image capture module number at the time the image was captured.
3. The method of claim 1, wherein step 3) determines the flight status based on changes in parameters of the aircraft in the images over a period of time.
4. A method according to claim 3, wherein the parameter changes comprise changes in position of the same aircraft during image acquisition or changes in velocity calculated on the basis of position and reference.
5. The method of claim 2, wherein step 3) determines the flight status based on the change in size of the same aircraft across successive images captured by the same image capture module.
6. A method according to any one of claims 1 to 5, wherein when the aircraft is in a pull-up procedure but the landing gear is not in a stowed condition, a landing gear condition error is determined and an alarm is issued; when the aircraft is in the landing process but the undercarriage is not in the down state, the undercarriage state error is judged and an alarm is given.
7. The method of claim 6, wherein the landing gear status is determined in step 7) based on the position of the same landing gear in the image of the aircraft over a period of time, and landing gear action is determined based on a change in landing gear position.
8. A method according to any one of claims 1 to 5, wherein the aircraft identification model and the landing gear identification model are model trained using the same trainer.
9. The method according to any of claims 1 to 5, further comprising, after step 4) and before step 5), an image processing step 4.1: judging whether the airplane image is clear, if the airplane image is clear, directly entering the step 5), and if not, entering the step 4.2; and 4.2, selecting an image enhancement algorithm to increase the image definition according to the actual environment and hardware conditions.
10. The method according to any one of claims 1 to 5, wherein the Vibe algorithm and the yolo algorithm are used in step 2) to locate the aircraft; and 5) identifying the undercarriage by adopting an SSD algorithm, and marking the position of the undercarriage in the image.
CN202010610143.XA (filed 2020-06-30) Aircraft landing gear state determination method based on image recognition; published as CN111898444A, Pending

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202010610143.XA | 2020-06-30 | 2020-06-30 | Aircraft landing gear state determination method based on image recognition


Publications (1)

Publication Number | Publication Date
CN111898444A | 2020-11-06

Family

ID=73208012

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
CN202010610143.XA | Aircraft landing gear state determination method based on image recognition | 2020-06-30 | 2020-06-30 | Pending

Country Status (1)

Country Link
CN (1) CN111898444A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN112530205A * | 2020-11-23 | 2021-03-19 | 北京正安维视科技股份有限公司 | Airport parking apron airplane state detection method and device



Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination