CN114049580A - Airport apron aircraft positioning system

Airport apron aircraft positioning system

Info

Publication number
CN114049580A
Authority
CN
China
Prior art keywords
aircraft
image
apron
front wheel
image acquisition
Prior art date
Legal status
Pending
Application number
CN202110898343.4A
Other languages
Chinese (zh)
Inventor
张晓海
赵辛
李颖
魏楷臻
赵士瑄
舒远军
Current Assignee
Beijing Capital International Airport Co ltd
Second Research Institute of CAAC
Original Assignee
Beijing Capital International Airport Co ltd
Second Research Institute of CAAC
Priority date
Filing date
Publication date
Application filed by Beijing Capital International Airport Co ltd, Second Research Institute of CAAC filed Critical Beijing Capital International Airport Co ltd
Priority to CN202110898343.4A
Publication of CN114049580A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods

Abstract

The invention discloses an apron aircraft positioning system, comprising: a plurality of image acquisition devices fixedly arranged in the apron, each acquiring images within the apron; a multi-channel video stream processing unit connected with each image acquisition device, which tags the image acquired by each image acquisition device to obtain an identification image; an aircraft detection unit for determining the type of the aircraft in the identification image and labeling the salient region of the aircraft in the identification image; an aircraft front wheel detection unit for identifying the complete front wheel of the aircraft in the salient region of the aircraft; and a position information calculation unit for calculating the position of the aircraft in the apron based on the identified complete front wheel, the identification image where the complete front wheel is located, and the position of the image acquisition device corresponding to the identification image. The system avoids false detection, missed detection and repeated detection, and improves the accuracy of the position calculation while accelerating the calculation of the aircraft position information.

Description

Airport apron aircraft positioning system
Technical Field
The invention belongs to the technical field of apron aircraft positioning, and particularly relates to an apron aircraft positioning system.
Background
In the daily operation of an airport, the management of aircraft on the apron has long been one of the important tasks in maintaining normal commercial operation. With the rapid development of the Chinese economy, the transportation and logistics industries have met unprecedented development opportunities, and both the number of passenger flights and the number of freight flights at airports have grown explosively. The apron is the main activity area of various aircraft and has become the most important operating area for passenger and freight transport at civil airports. In recent years, aircraft collision accidents in airport areas have occurred frequently; they not only seriously affect the normal commercial operation of the airport but also cause huge economic losses to airlines. Effective management of the various aircraft on the apron has therefore become an important means of guaranteeing safe and orderly airport operation.
Aircraft management in a complex airport environment requires acquiring aircraft position information in real time, so that accidents such as collisions can be avoided and management efficiency improved. However, real-time acquisition of aircraft position information still faces many difficulties. First, the large number of apron devices already deployed within an airport cannot accurately and quickly provide aircraft position information, so manual observation is still needed to determine the position of the aircraft and to manage it. Second, although three-dimensional point-cloud positioning equipment such as lidar can position the aircraft, it is expensive, and the deployment and installation of new equipment are strictly limited by airport regulations.
Therefore, there is a particular need for an apron aircraft positioning system that is inexpensive and positions aircraft accurately.
Disclosure of Invention
The invention aims to provide an apron aircraft positioning system that is low in cost and positions aircraft accurately.
To achieve the above object, the present invention provides an apron aircraft positioning system, comprising: a plurality of image acquisition devices, wherein the image acquisition devices are all fixedly arranged in the apron, and each image acquisition device acquires an image of a fixed area in the apron; a multi-channel video stream processing unit, which is connected with each image acquisition device and tags the image acquired by each image acquisition device to obtain an identification image, the tag corresponding to the image acquisition device that acquired the image; an aircraft detection unit for determining the type of the aircraft in the identification image and marking a salient region of the aircraft in the identification image; an aircraft front wheel detection unit for identifying the complete front wheel of the aircraft in the salient region of the aircraft; and a position information calculation unit for calculating the position of the aircraft in the apron based on the identified complete front wheel, the identification image where the complete front wheel is located, and the position of the image acquisition device corresponding to the identification image.
Optionally, the aircraft detection unit determines, through a first convolutional neural network model, an aircraft type in the identification image and marks a salient region of the aircraft in the image; the aircraft nose wheel detection unit identifies a complete nose wheel of an aircraft in a salient region of the aircraft by a second convolutional neural network model.
Optionally, the first convolutional neural network model is obtained by: acquiring a plurality of images of each type of aircraft in actual scenes; for each image, manually labeling the type of the aircraft and marking the area where the aircraft is located in the image; taking the images with the labeled aircraft types and labeled areas as a first training set; and training a first initial convolutional neural network model based on the first training set to obtain the first convolutional neural network model.
Optionally, the second convolutional neural network model is obtained by: acquiring a plurality of images of each type of aircraft in actual scenes; for each image, manually labeling the area where the complete front wheel of the aircraft is located; taking the images with the labeled front wheel areas as a second training set; and training a second initial convolutional neural network model based on the second training set to obtain the second convolutional neural network model.
Optionally, calculating the position of the aircraft in the apron based on the identified complete front wheel, the identification image where the complete front wheel is located and the position of the image acquisition device corresponding to the identification image includes: obtaining the size of the complete front wheel in the identification image; comparing this size with the standard imaging size of the front wheel, and obtaining the distance d between the actual front wheel of the aircraft in the identification image and the image acquisition device corresponding to the identification image based on the optical parameters of that image acquisition device; calculating the position of the actual front wheel of the aircraft in a first coordinate system based on the distance d, wherein the first coordinate system is a world coordinate system; and calculating the position of the aircraft in the apron based on the position of the actual front wheel of the aircraft in the first coordinate system and the position, within the apron, of the image acquisition device corresponding to the identification image.
Optionally, the first coordinate system takes the projection of the image acquisition device corresponding to the identification image on the ground as the origin, the due north direction as the y-axis and the due east direction as the x-axis, and calculating the position of the actual front wheel of the aircraft in the first coordinate system based on the distance includes: calculating the distance L between the projection of the image acquisition device corresponding to the identification image on the ground and the actual front wheel of the aircraft, based on the distance d and the height H of the image acquisition device corresponding to the identification image above the ground; and calculating the y-axis coordinate and the x-axis coordinate of the actual front wheel of the aircraft in the first coordinate system, based on the distance L and the included angle of the image acquisition device corresponding to the identification image relative to the x-axis of the first coordinate system.
Alternatively, the distance L is calculated by the following formula:
L = √(d² - H²)
Optionally, the position of the aircraft within the apron is calculated by: obtaining the position of the projection on the ground of the image acquisition device corresponding to the identification image, based on the position of that image acquisition device in the apron, wherein the projected position comprises an x-axis coordinate in the apron and a y-axis coordinate in the apron; adding the x-axis coordinate of the projection in the apron and the x-axis coordinate of the actual front wheel of the aircraft in the first coordinate system to obtain the x-axis coordinate of the aircraft in the apron; and adding the y-axis coordinate of the projection in the apron and the y-axis coordinate of the actual front wheel of the aircraft in the first coordinate system to obtain the y-axis coordinate of the aircraft in the apron.
Optionally, the apron aircraft positioning system further includes a display terminal connected with the position information calculation unit and used to display the position information of each aircraft in the apron.
Optionally, the image acquisition device is a fixed monocular camera.
The invention has the following beneficial effects: based on the rapid detection of the aircraft front wheel within the salient region of the aircraft, the apron aircraft positioning system of the invention avoids false detection, missed detection and repeated detection, accelerates the calculation of aircraft position information while improving the accuracy of the position calculation, and is inexpensive and therefore convenient to deploy widely.
The present invention has other features and advantages which will be apparent from or are set forth in detail in the accompanying drawings and the following detailed description, which are incorporated herein, and which together serve to explain certain principles of the invention.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 shows a block diagram of an apron aircraft positioning system according to one embodiment of the present invention.
Fig. 2 shows a schematic view of calculating the distance between the aircraft front wheel in an identification image and the image acquisition device corresponding to that image, in the apron aircraft positioning system according to one embodiment of the invention.
Description of reference numerals:
102. an image acquisition device; 104. a multi-channel video stream processing unit; 106. an aircraft detection unit; 108. an aircraft front wheel detection unit; 110. a position information calculation unit; 202. an aircraft front wheel.
Detailed Description
Preferred embodiments of the present invention will be described in more detail below. While the following describes preferred embodiments of the present invention, it should be understood that the present invention may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
An apron aircraft positioning system according to the present invention comprises: a plurality of image acquisition devices, wherein the image acquisition devices are all fixedly arranged in the apron, and each image acquisition device acquires an image of a fixed area in the apron; a multi-channel video stream processing unit, which is connected with each image acquisition device and tags the image acquired by each image acquisition device to obtain an identification image, the tag corresponding to the image acquisition device that acquired the image; an aircraft detection unit for determining the type of the aircraft in the identification image and marking a salient region of the aircraft in the identification image; an aircraft front wheel detection unit for identifying the complete front wheel of the aircraft in the salient region of the aircraft; and a position information calculation unit for calculating the position of the aircraft in the apron based on the identified complete front wheel, the identification image where the complete front wheel is located, and the position of the image acquisition device corresponding to the identification image.
Specifically, the video images acquired in real time by each image acquisition device are encoded and compressed, and then uploaded over a real-time streaming protocol to the multi-channel video stream processing unit of the management control center for processing, storage and display. The multi-channel video stream processing unit mainly decodes the uploaded video streams and tags the decoded video, stream by stream, according to the deployment position of the corresponding camera on the apron, so that the display terminal of the management control center can reconstruct complete scene monitoring of the apron space.
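As a non-limiting illustration, the following minimal Python sketch shows one way such multi-channel ingestion and per-camera tagging could be organized, assuming the cameras publish streams over RTSP; the stream URLs, camera identifiers and deployment coordinates below are hypothetical placeholders, not values taken from this disclosure.

```python
# Minimal sketch: decode several camera streams and tag every frame with the
# identity (and apron deployment data) of the camera that produced it.
# All URLs, IDs and coordinates below are hypothetical illustration values.
import cv2
from dataclasses import dataclass

@dataclass
class ApronCamera:
    cam_id: str        # tag attached to every decoded frame
    rtsp_url: str      # hypothetical stream address
    apron_xy: tuple    # ground projection of the camera in apron coordinates (m)
    height_m: float    # mounting height H above the ground (m)

CAMERAS = [
    ApronCamera("cam_A01", "rtsp://10.0.0.11/stream1", (120.0, 45.0), 18.0),
    ApronCamera("cam_A02", "rtsp://10.0.0.12/stream1", (180.0, 45.0), 18.0),
]

def read_tagged_frames(cameras):
    """Yield (camera, frame) pairs so downstream units know each frame's origin."""
    captures = {c.cam_id: cv2.VideoCapture(c.rtsp_url) for c in cameras}
    while True:
        for cam in cameras:
            ok, frame = captures[cam.cam_id].read()
            if not ok:
                continue  # dropped frame; a real deployment would reconnect
            yield cam, frame
```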
One of the keys to aircraft positioning is fast detection of the aircraft front wheel, but a stand-alone front wheel detection unit suffers from false detection, missed detection and repeated detection in practical applications. False detections arise mainly because the tires of other operating equipment on the apron interfere with the target detection algorithm for the aircraft front wheel. Missed detections occur because obstacles in the complex apron environment often occlude the front wheel. The large number of monocular cameras deployed on the apron also leads to repeated detection of the front wheel of the same aircraft; because of differing viewing angles, environmental interference and other factors, a large amount of computing time and server resources would be consumed to determine whether the detections from different cameras belong to the same object. To address these difficulties, the rapid detection of the aircraft and of the aircraft front wheel is carried out in two separate stages by two independent target detection units.
A target detection method based on a convolutional neural network model is deployed in the aircraft detection unit; it completes aircraft detection and classification in real time and marks the salient region occupied by the aircraft in the image. The detection result can be used to annotate the aircraft in real time and to render its salient region on the display terminal of the management control center. Based on the classification result, the detection unit can also determine the aircraft model and look up the relevant aircraft information in a database.
After the aircraft is detected and its position in the image and model information are obtained, the aircraft front wheel detection unit rapidly detects the front wheel within the salient region of the aircraft. Aircraft detection is performed on each image and the salient region of the aircraft is marked; front wheel detection is then performed only on the image region marked as salient. If several monocular cameras detect the front wheel of the same aircraft at the same time, the aircraft detection unit applies a unified annotation, and the position information calculation unit uses the video images of the several cameras together to accelerate the position calculation and improve its accuracy. The position information calculation unit performs fast ranging by comparing the size of the aircraft front wheel in the image with its size on a standard reference plane, using the relevant aircraft parameters and the perspective principle of the camera; it then establishes a first coordinate system with the ground projection of the fixed monocular camera as the origin and immediately calculates the aircraft position from the distance between the aircraft front wheel and the monocular camera.
The accurate position of the aircraft in the apron and the detection box of the aircraft in the video image are displayed on the display terminal of the management control center, which provides the position of every aircraft in the apron in real time.
According to the above embodiment, based on the rapid detection of the aircraft front wheel within the salient region of the aircraft, the apron aircraft positioning system avoids false detection, missed detection and repeated detection, accelerates the calculation of aircraft position information while improving the accuracy of the position calculation, and is inexpensive and therefore convenient to deploy widely.
Alternatively, the aircraft detection unit determines the aircraft type in the identification image through the first convolution neural network model and marks the salient region of the aircraft in the image; the aircraft nose wheel detection unit identifies a complete nose wheel of the aircraft in the region of significance of the aircraft by means of a second convolutional neural network model.
The identification image is used as the input of the first convolutional neural network model; through the classification and detection of this model, the type of the aircraft in the identification image is determined and the salient region of the aircraft is marked in the image. The image region marked as salient is then used as the input of the second convolutional neural network model, which identifies the complete front wheel of the aircraft.
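As a non-limiting illustration, the following sketch outlines the two-stage flow described above; the two detector functions are hypothetical stand-ins for the trained first and second convolutional neural network models, whose architectures this disclosure does not fix.

```python
# Minimal sketch of the two-stage detection flow: the first model finds the
# aircraft and its salient region, the second model searches for the front
# wheel only inside that region. detect_aircraft / detect_front_wheel are
# placeholders for the trained CNN models (hypothetical interfaces).
import numpy as np

def detect_aircraft(frame: np.ndarray):
    """First CNN: return (aircraft_type, (x, y, w, h)) for the salient region, or None."""
    raise NotImplementedError  # placeholder for the trained first model

def detect_front_wheel(region: np.ndarray):
    """Second CNN: return the front-wheel box (x, y, w, h) inside the region, or None."""
    raise NotImplementedError  # placeholder for the trained second model

def two_stage_detection(frame: np.ndarray):
    result = detect_aircraft(frame)
    if result is None:
        return None
    aircraft_type, (x, y, w, h) = result
    wheel = detect_front_wheel(frame[y:y + h, x:x + w])  # search only the salient region
    if wheel is None:
        return None
    wx, wy, ww, wh = wheel
    # return the front-wheel box in full-frame pixel coordinates
    return aircraft_type, (x + wx, y + wy, ww, wh)
```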
Alternatively, the first convolutional neural network model is obtained as follows: a plurality of images of each type of aircraft are acquired in actual scenes; for each image, the aircraft type is labeled manually and the area where the aircraft is located is marked in the image; the images with the labeled aircraft types and labeled areas form the first training set; and a first initial convolutional neural network model is trained on the first training set to obtain the first convolutional neural network model.
The main purpose of the aircraft detection unit is to detect the aircraft in the video image in real time, classify its type and label its salient region. In the apron operating area the aircraft is large and has distinctive features, for example a long fuselage, wings at mid-body and an empennage at the tail, which clearly distinguish it from other apron equipment. Therefore, images of aircraft are collected in actual scenes, the aircraft type is labeled manually and the area where the aircraft is located is marked in each image; the images with known aircraft types and labeled areas form the first training set, on which the first initial convolutional neural network model is trained so that it recognizes the aircraft type and the area where the aircraft is located from the aircraft's features. The probability of false or missed detection of the aircraft itself is extremely low, and compared with real-time detection of the aircraft front wheel, real-time detection of the whole aircraft is simpler and faster, which greatly improves the performance of the aircraft positioning system.
Classifying and identifying the aircraft type also avoids the problem of repeated detection. The features of an individual aircraft help the target detection algorithm identify and classify it, allow the relevant aircraft parameters to be retrieved, and at the same time let the system quickly judge whether the aircraft in different images is the same object. Based on the annotation result, a salient region can be generated on the video image, within which the aircraft front wheel can be detected rapidly.
Alternatively, the second convolutional neural network model is obtained as follows: a plurality of images of each type of aircraft are acquired in actual scenes; for each image, the area where the aircraft is located is marked manually, and the area where the complete front wheel of the aircraft is located is marked manually within the aircraft area; the images with the labeled front wheel areas form the second training set; and a second initial convolutional neural network model is trained on the second training set to obtain the second convolutional neural network model.
Specifically, images of aircraft are collected in actual scenes; the area where the aircraft is located is marked manually in each image, and the area where the complete front wheel is located is marked manually within the aircraft area; the images with the labeled front wheel areas form the second training set, on which the second initial convolutional neural network model is trained so that it identifies the complete aircraft front wheel from its features, yielding the second convolutional neural network model.
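As a non-limiting illustration, the following sketch shows one way the two manually labeled training sets could be organized; the annotation file layout and field names are hypothetical, since the disclosure does not prescribe an annotation format or training framework.

```python
# Minimal sketch: split manual annotations into the first training set
# (aircraft type + aircraft area, for model 1) and the second training set
# (complete front-wheel area inside the aircraft area, for model 2).
# The JSON layout and field names are hypothetical illustrations.
import json

def load_annotations(annotation_file: str):
    with open(annotation_file, "r", encoding="utf-8") as f:
        return json.load(f)  # list of records, one per labelled image

def build_training_sets(records):
    first_set = []   # (image_path, aircraft_type, aircraft_box)
    second_set = []  # (image_path, aircraft_box, front_wheel_box)
    for r in records:
        first_set.append((r["image"], r["aircraft_type"], tuple(r["aircraft_box"])))
        if r.get("front_wheel_box"):  # keep only images with a complete, visible front wheel
            second_set.append((r["image"], tuple(r["aircraft_box"]),
                               tuple(r["front_wheel_box"])))
    return first_set, second_set
```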
Rapid detection of the aircraft front wheel serves two main purposes: first, to determine the position of the front wheel in the image, from whose two-dimensional image coordinates the spatial position of the aircraft is then determined; second, to obtain the size of the front wheel in the image from the salient region of the aircraft, which the subsequent position calculation unit uses to measure the distance between the front wheel and the camera.
Alternatively, calculating the position of the aircraft in the apron based on the identified complete front wheel and the identification image where the complete front wheel is located and the position of the image acquisition device corresponding to the identification image comprises: obtaining the size of the complete front wheel in the identification image; comparing the size with the standard imaging size of the front wheel, and obtaining the distance d between the actual front wheel of the aircraft in the identification image and the image acquisition equipment corresponding to the identification image based on the optical parameters of the image acquisition equipment corresponding to the identification image; calculating position information of an actual front wheel of the aircraft in a first coordinate system based on the distance d, wherein the first coordinate system is a world coordinate system; and calculating the position of the aircraft in the airport apron based on the position of the actual front wheel of the aircraft in the first coordinate system and the position of the image acquisition device corresponding to the identification image in the airport apron.
Specifically, an apron coordinate system covering the whole airport area is established with the longitude and latitude of the airport as reference, and the longitude and latitude of each image acquisition device in this airport coordinate system is determined. Then, taking an image in which the complete front wheel of the aircraft has been identified as an example, the position of the aircraft in the apron is calculated: a first coordinate system is established with the ground projection of the image acquisition device corresponding to that image as origin, the due north direction as the y-axis and the due east direction as the x-axis. Aircraft of a given model are equipped with a front wheel of uniform size, and aircraft tires are subject to strict profile and dimensional standards.
Based on the aircraft type determined by the aircraft detection unit, the aircraft model and the front wheel parameters of that model are obtained. Using the imaging size of the aircraft front wheel on a reference plane as the detection reference, and based on the accurate optical parameters of the image acquisition devices deployed on the apron, a regression calculation using the near-far relationship of the camera perspective principle directly yields the distance between the aircraft front wheel and the image acquisition device. The distance between the front wheel and the origin of the first coordinate system is then calculated according to the Pythagorean theorem; the y-axis and x-axis coordinates of the front wheel in the first coordinate system are obtained from the rotation angle of the monocular camera; and the position of the aircraft in the apron is finally obtained through a coordinate-system conversion formula.
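As a non-limiting illustration, the simple pinhole relation d = f·D/w is one concrete instance of the near-far perspective regression referred to above; in the sketch below the focal length (in pixels) and the front-wheel tire diameter are hypothetical illustration values, not parameters fixed by this disclosure.

```python
# Minimal sketch of monocular ranging from the front-wheel image size,
# assuming a pinhole camera model: d = focal_length_px * real_diameter / pixel_width.
def front_wheel_distance(wheel_px_width: float,
                         wheel_real_diameter_m: float,
                         focal_length_px: float) -> float:
    """Straight-line distance d (m) from the camera to the aircraft front wheel."""
    return focal_length_px * wheel_real_diameter_m / wheel_px_width

# Example with hypothetical values: a 1.27 m diameter nose-wheel tire imaged
# 40 px wide by a camera whose focal length corresponds to 1800 px.
d = front_wheel_distance(wheel_px_width=40.0,
                         wheel_real_diameter_m=1.27,
                         focal_length_px=1800.0)
print(f"d = {d:.1f} m")  # about 57.2 m
```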
Alternatively, the first coordinate system uses the projection of the image acquisition device corresponding to the identification image on the ground as an origin, uses the due north direction as a y-axis, and uses the due east direction as an x-axis, and the calculating the position of the actual front wheel of the aircraft in the first coordinate system based on the distance includes: calculating the distance L between the projection of the image acquisition equipment corresponding to the identification image on the ground and the actual front wheel of the aircraft based on the distance d and the height H between the image acquisition equipment corresponding to the identification image and the ground; and calculating the y-axis coordinate and the x-axis coordinate of the actual aircraft nose wheel in the first coordinate system based on the distance L and the included angle of the image acquisition equipment corresponding to the identification image relative to the x-axis in the first coordinate system.
Specifically, the distance d between the image acquisition device and the aircraft front wheel is obtained through the perspective principle and the accurate optical parameters of the device; the height H of the fixed image acquisition device and the rotation angle of the camera are known parameters, from which the y-axis and x-axis coordinates of the aircraft front wheel in the first coordinate system are then calculated.
Alternatively, the distance L is calculated by the following formula:
L = √(d² - H²)
specifically, the distance L from the front wheel of the aircraft to the origin coordinate in the first coordinate system can be obtained by a simple pythagorean theorem.
Alternatively, the position of the aircraft within the apron is calculated by: obtaining the projected position of the image acquisition equipment corresponding to the identification image on the ground based on the position of the image acquisition equipment corresponding to the identification image in the apron, wherein the projected position comprises an x-axis coordinate in the apron and a y-axis coordinate in the apron; adding the x-axis coordinate projected in the apron and the x-axis coordinate of the actual front wheel of the aircraft in a first coordinate system to obtain the x-axis coordinate information of the aircraft in the apron; and adding the y-axis coordinate projected in the apron and the y-axis coordinate of the actual front wheel of the aircraft in the first coordinate system to obtain the y-axis coordinate information of the aircraft in the apron.
Specifically, based on the position within the apron of the image acquisition device corresponding to the identification image, the coordinates of the aircraft in the first coordinate system are converted directly into coordinates in the airport-wide apron coordinate system; the coordinates of all aircraft in the apron are then integrated and displayed on the display terminal of the management control center.
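As a non-limiting illustration, the following sketch chains the steps described above: the Pythagorean relation gives the ground distance L from d and H, the camera rotation angle gives the front-wheel coordinates in the first coordinate system, and adding the camera's projected apron coordinates gives the aircraft position in the apron. All numeric values, and the convention that the angle is measured from the x-axis (due east), are illustrative assumptions.

```python
# Minimal sketch of the coordinate chain: d, H -> L -> (x, y) in the first
# coordinate system -> (x, y) in the apron coordinate system.
import math

def wheel_position_in_apron(d, H, theta_deg, cam_apron_xy):
    L = math.sqrt(d * d - H * H)       # ground distance: camera projection -> front wheel
    theta = math.radians(theta_deg)    # camera angle relative to the x-axis (due east)
    x_local = L * math.cos(theta)      # first-coordinate-system x (east)
    y_local = L * math.sin(theta)      # first-coordinate-system y (north)
    cam_x, cam_y = cam_apron_xy        # ground projection of the camera in the apron
    return cam_x + x_local, cam_y + y_local

# Example with hypothetical values: d = 57.2 m, camera height H = 18 m,
# camera looking 30 degrees north of due east, camera projected at (120, 45) m.
print(wheel_position_in_apron(57.2, 18.0, 30.0, (120.0, 45.0)))  # ~ (167.0, 72.1)
```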
As an alternative, the apron aircraft positioning system further comprises a display terminal connected with the position information calculation unit and used to display the position information of each aircraft in the apron.
Specifically, the above method is used to calculate the position of each aircraft in the apron, and the display terminal displays the position of each aircraft so that staff can check it conveniently.
Alternatively, the image acquisition device is a stationary monocular camera.
Specifically, fixed monocular cameras are already widely deployed across the airport and collect, around the clock, video images of active aircraft and of the associated operating equipment and personnel. Such fixed monocular cameras are inexpensive, which makes the positioning system easy to roll out widely.
Examples
Fig. 1 shows a block diagram of an apron aircraft positioning system according to one embodiment of the present invention. Fig. 2 shows a schematic view of calculating the distance between the aircraft front wheel in an identification image and the image acquisition device corresponding to that image, according to one embodiment of the invention.
As shown in fig. 1 and 2, the apron aircraft positioning system includes: a plurality of image acquisition devices 102, which are all fixedly arranged in the apron, each image acquisition device 102 acquiring an image of a fixed area in the apron; a multi-channel video stream processing unit 104, which is connected with each image acquisition device 102 and tags the image acquired by each image acquisition device 102, the tag corresponding to the image acquisition device that acquired the image; an aircraft detection unit 106 for determining the type of the aircraft in the identification image and marking the salient region of the aircraft in the identification image; an aircraft front wheel detection unit 108 for identifying the complete front wheel of the aircraft in the salient region of the aircraft; and a position information calculation unit 110, which calculates the position of the aircraft within the apron based on the identified complete front wheel, the identification image where the complete front wheel is located, and the position of the image acquisition device 102 corresponding to the identification image.
The aircraft detection unit determines the type of the aircraft in the identification image through a first convolutional neural network model and marks a salient region of the aircraft in the image; the aircraft nose wheel detection unit identifies a complete nose wheel of the aircraft in the region of significance of the aircraft by means of a second convolutional neural network model.
Wherein the first convolutional neural network model is obtained by the following steps: acquiring a plurality of images of each type of aircraft in actual scenes; for each image, manually labeling the type of the aircraft and marking the area where the aircraft is located in the image; taking the images with the labeled aircraft types and labeled areas as a first training set; and training the first initial convolutional neural network model based on the first training set to obtain the first convolutional neural network model.
Wherein the second convolutional neural network model is obtained by: acquiring a plurality of images of each type of aircraft in actual scenes; for each image, manually labeling the area where the complete front wheel of the aircraft is located in the image; taking the images with the labeled front wheel areas as a second training set; and training the second initial convolutional neural network model based on the second training set to obtain the second convolutional neural network model.
Wherein, based on the recognized complete front wheel, the identification image where the complete front wheel is located and the position of the image acquisition device corresponding to the identification image, calculating the position of the aircraft in the apron comprises: obtaining the size of the complete front wheel in the identification image; comparing the size with the standard imaging size of the front wheel, and obtaining the distance d between the actual front wheel 202 of the aircraft in the identification image and the image acquisition equipment 102 corresponding to the identification image based on the optical parameters of the image acquisition equipment 102 corresponding to the identification image; calculating the position information of the actual front wheel 202 of the aircraft in the first coordinate system based on the distance d; the position of the aircraft within the tarmac is calculated based on the position of the actual front wheel 202 of the aircraft in the first coordinate system and the position of the image capturing device 102 within the tarmac corresponding to the identification image.
Wherein the first coordinate system uses the projection of the image acquisition device 102 corresponding to the identification image on the ground as an origin, uses the due north direction as a y-axis, and uses the due east direction as an x-axis, and based on the distance, calculating the position of the actual front wheel 202 of the aircraft in the first coordinate system includes: calculating the distance L between the projection of the image acquisition device 102 corresponding to the identification image on the ground and the actual front wheel 202 of the aircraft based on the distance and the height H of the image acquisition device 102 corresponding to the identification image from the ground; based on the distance L and the included angle of the image acquisition device corresponding to the identification image with respect to the x-axis in the first coordinate system, the y-axis coordinate and the x-axis coordinate of the aircraft nose wheel 202 in the first coordinate system are calculated.
Wherein the distance L is calculated by the following formula:
L = √(d² - H²)
wherein the position of the aircraft within the apron is calculated by: obtaining a projected position of the image acquisition device 102 corresponding to the identification image on the ground based on the position of the image acquisition device 102 corresponding to the identification image in the apron, wherein the projected position comprises an x-axis coordinate in the apron and a y-axis coordinate in the apron; adding the x-axis coordinate projected in the apron and the x-axis coordinate of the actual front wheel 202 of the aircraft in the first coordinate system to obtain the x-axis coordinate information of the aircraft in the apron; the y-axis coordinates projected into the apron are added to the y-axis coordinates of the actual front wheel 202 of the aircraft in the first coordinate system to obtain y-axis coordinate information of the aircraft in the apron.
Wherein the apron aircraft positioning system further comprises: and the display terminal is connected with the position information calculation unit and is used for displaying the position information of each aircraft in the apron.
Wherein, image capture device 102 is a fixed monocular camera.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.

Claims (10)

1. An apron aircraft positioning system, comprising:
a plurality of image acquisition devices, wherein the image acquisition devices are all fixedly arranged in the apron, and each image acquisition device acquires an image of a fixed area in the apron;
a multi-channel video stream processing unit, which is connected with each image acquisition device and tags the image acquired by each image acquisition device to obtain an identification image, the tag corresponding to the image acquisition device that acquired the image;
the aircraft detection unit is used for determining the type of the aircraft in the identification image and marking a salient region of the aircraft in the identification image;
an aircraft front wheel detection unit for identifying a complete front wheel of the aircraft in the salient region of the aircraft;
and the position information calculation unit is used for calculating the position of the aircraft in the apron based on the identified complete front wheel, the identification image where the complete front wheel is located and the position of the image acquisition equipment corresponding to the identification image.
2. The apron aircraft positioning system of claim 1, wherein the aircraft detection unit determines, through a first convolutional neural network model, the aircraft type in the identification image and marks the salient region of the aircraft in the image; and the aircraft front wheel detection unit identifies the complete front wheel of the aircraft in the salient region of the aircraft through a second convolutional neural network model.
3. The apron aircraft positioning system of claim 2, wherein the first convolutional neural network model is obtained by:
acquiring a plurality of images of each type of aircraft in an actual scene;
for each image, manually labeling the type of the aircraft and marking the area where the aircraft is located in the image;
taking the type of the marked aircraft and the image of the area where the marked aircraft is located as a first training set;
and training the first initial convolutional neural network model based on the first training set to obtain a first convolutional neural network model.
4. The apron aircraft positioning system of claim 3, wherein the second convolutional neural network model is obtained by:
acquiring a plurality of images of each type of aircraft in an actual scene;
for each image, manually marking the area of the complete front wheel of the aircraft in the image;
taking the images in which the area of the complete front wheel of the aircraft has been labeled as a second training set;
and training a second initial convolutional neural network model based on the second training set to obtain a second convolutional neural network model.
5. The apron aircraft positioning system of claim 4, wherein calculating the position of the aircraft within the apron based on the identified complete front wheel, the identification image where the complete front wheel is located, and the position of the image acquisition device corresponding to the identification image comprises:
obtaining the size of the complete front wheel in the identification image;
comparing the size with a standard imaging size of a front wheel, and obtaining a distance d between an actual front wheel of the aircraft in the identification image and the image acquisition equipment corresponding to the identification image based on optical parameters of the image acquisition equipment corresponding to the identification image;
calculating the position of the actual front wheel of the aircraft in a first coordinate system based on the distance d, wherein the first coordinate system is a world coordinate system;
and calculating the position of the aircraft in the airport apron based on the position of the actual front wheel of the aircraft in the first coordinate system and the position of the image acquisition device corresponding to the identification image in the airport apron.
6. The apron aircraft positioning system of claim 5, wherein the first coordinate system takes the projection of the image acquisition device corresponding to the identification image on the ground as origin, the due north direction as the y-axis and the due east direction as the x-axis, and wherein calculating the position of the actual front wheel of the aircraft in the first coordinate system based on the distance comprises:
calculating a distance L between the projection of the image acquisition equipment corresponding to the identification image on the ground and an actual front wheel of the aircraft based on the distance d and the height H of the image acquisition equipment corresponding to the identification image from the ground;
and calculating the y-axis coordinate and the x-axis coordinate of the actual front wheel of the aircraft in the first coordinate system based on the distance L and the included angle of the image acquisition equipment corresponding to the identification image relative to the x-axis in the first coordinate system.
7. The apron aircraft positioning system of claim 6, wherein the distance L is calculated by the formula:
L = √(d² - H²)
8. The apron aircraft positioning system of claim 5, wherein the position of the aircraft within the apron is calculated by:
obtaining the projected position of the image acquisition equipment corresponding to the identification image on the ground based on the position of the image acquisition equipment corresponding to the identification image in the apron, wherein the projected position comprises an x-axis coordinate in the apron and a y-axis coordinate in the apron;
adding the x-axis coordinate projected in the airport apron and the x-axis coordinate of the actual front wheel of the aircraft in a first coordinate system to obtain the x-axis coordinate information of the aircraft in the airport apron; and adding the y-axis coordinate projected in the apron and the y-axis coordinate of the actual front wheel of the aircraft in the first coordinate system to obtain the y-axis coordinate information of the aircraft in the apron.
9. The apron aircraft positioning system of claim 8, further comprising: and the display terminal is connected with the position information calculation unit and is used for displaying the position information of each aircraft in the apron.
10. The apron aircraft positioning system of claim 1, wherein the image acquisition device is a fixed monocular camera.
Application CN202110898343.4A, filed 2021-08-05 (priority date 2021-08-05): Airport apron aircraft positioning system. Status: Pending. Published as CN114049580A (en).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110898343.4A 2021-08-05 2021-08-05 Airport apron aircraft positioning system (published as CN114049580A)

Publications (1)

Publication Number Publication Date
CN114049580A (en) 2022-02-15

Family

ID=80204397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110898343.4A Pending CN114049580A (en) 2021-08-05 2021-08-05 Airport apron aircraft positioning system

Country Status (1)

Country Link
CN (1) CN114049580A (en)

Cited By (2)

* Cited by examiner, † Cited by third party

Publication number Priority date Publication date Assignee Title
CN115826464A * 2022-11-29 2023-03-21 航科院中宇(北京)新技术发展有限公司 Remote machine position node acquisition system and acquisition method thereof
CN115826464B * 2022-11-29 2024-03-22 航科院中宇(北京)新技术发展有限公司 Acquisition method of remote site node acquisition system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination