CN116343534A - Airplane berth guiding system and method - Google Patents


Info

Publication number
CN116343534A
CN116343534A (application CN202310330305.8A)
Authority
CN
China
Prior art keywords
aircraft
image
model
area
berth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310330305.8A
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Zhihui Xinchuang Information Technology Co ltd
Original Assignee
Nanjing Zhihui Xinchuang Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Zhihui Xinchuang Information Technology Co ltd filed Critical Nanjing Zhihui Xinchuang Information Technology Co ltd
Priority to CN202310330305.8A priority Critical patent/CN116343534A/en
Publication of CN116343534A publication Critical patent/CN116343534A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/06: Traffic control systems for aircraft for control when on the ground
    • G08G5/065: Navigation or guidance aids, e.g. for taxiing or rolling
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an aircraft berth guiding system and method. The system comprises a control terminal together with a camera device, a laser device, a buried coil and a display device, all connected to the control terminal. The camera device, laser device and display device are mounted on the terminal building directly in front of the berth; the camera device and laser device collect image data and laser point cloud data respectively and send them to the control terminal, and the display device shows the aircraft model and distance data. A stopping area and a guiding area are laid out on the berth, and a detection area is laid out on the runway adjoining one side of the berth; a buried coil is installed at the start of each of the stopping, guiding and detection areas, and a signpost is placed outside the runway directly behind the berth. The invention enhances the anti-interference capability and environmental adaptability of the system, provides correct and effective berthing guidance to pilots, realizes safe and efficient berthing, and improves airport operation efficiency.

Description

Airplane berth guiding system and method
Technical Field
The invention relates to the technical field of automatic berthing of an aircraft, in particular to an aircraft berthing guiding system and method.
Background
The aircraft berthing guidance system can provide accurate, standardized guidance for safely and efficiently parking aircraft on the apron, guiding pilots to berth the aircraft along an optimal procedure. Aircraft berthing is one component of flight operations, and the use of auxiliary equipment to improve the efficiency and safety of flight operations is likewise part of airport modernization.
At present, aircraft berth guiding systems are mainly realized in three ways. (1) Automatic berth guidance via buried coils installed at multiple positions: this method is cheap to build, immune to ambient illumination and structurally simple, but the coils are easily damaged by surface pressure, are difficult to maintain and replace, cannot identify the aircraft model, and offer low detection precision. (2) Visual detection with a camera: the berthing aircraft is monitored in real time in the airport berth area, artificial-intelligence recognition and image-processing techniques rapidly detect the target in the captured images and identify the aircraft model, and the recognized data are used to guide the aircraft to berth. However, image techniques adapt poorly to low-light, low-visibility environments, and the camera must remain active continuously or periodically, which shortens its service life. (3) Lidar detection, with the lidar mounted on the terminal building or boarding bridge: the emitted signal measures the distance of the aircraft accurately, with strong anti-interference capability and environmental adaptability, but the attitude of the aircraft is difficult to measure accurately, and keeping the lidar active continuously or periodically shortens its service life.
Disclosure of Invention
The invention aims to provide an aircraft berth guiding system and method that solve the problems of existing berth guiding methods, which cannot identify the aircraft model and suffer from low detection precision and poor environmental adaptability.
To solve these technical problems, the invention adopts the following technical scheme: an aircraft berth guiding system comprising a control terminal and, connected to it, a camera device, a laser device, a buried coil and a display device. The camera device, laser device and display device are mounted on the terminal building directly in front of the berth. The camera device and laser device collect image data and laser point cloud data respectively and send them to the control terminal, and the display device shows the aircraft model and distance data. A stopping area and a guiding area are laid out on the berth, and a detection area is laid out on the runway adjoining one side of the berth; a buried coil at the start of each of the stopping, guiding and detection areas detects which area the aircraft occupies. A signpost outside the runway directly behind the berth cooperates with the camera device to measure the weather visibility value.
Preferably, the aircraft has features thereon, the features including one or more of a nose, an engine, landing gear, a door, and a wing of the aircraft.
The invention also discloses an aircraft berth guiding method applied to the above aircraft berth guiding system, comprising the following steps: when the buried coil detects that the aircraft has entered the detection area, start the camera device to collect an image of the signpost; preprocess the signpost image, then perform feature extraction to obtain a signpost area image; calculate the transmissivity from the signpost area image and invert it to obtain a visibility value; acquire an aircraft characteristic-part image with the camera device; preprocess the aircraft characteristic-part image with an image denoising algorithm and extract a target characteristic value; build a neural network model and input the target characteristic value to obtain the aircraft model; when the buried coil detects that the aircraft has entered the guiding area, judge whether the visibility value is below a first threshold; if so, control the camera device and laser device to detect the aircraft attitude synchronously, obtain laser point cloud and image data, and load them into a joint attitude guidance model; if not, control the camera device to detect the aircraft attitude, obtain image data, and load them into a visual attitude guidance model; guide the attitude of the aircraft according to the output of the guidance model; and, after the buried coil detects that the aircraft has entered the stopping area, control the laser device to detect the aircraft characteristic part, obtain the remaining forward distance of the aircraft, and stop the aircraft according to that distance.
As a preferred solution, the signpost image includes a background sky area and a signpost area, and preprocessing the signpost image followed by feature extraction to obtain the signpost area image includes: combining preset signpost-area shape parameters, iteratively scanning all pixel points of the signpost image, marking similar pixel points among them, and recording them in a connectivity matrix to obtain the signpost area image.
Preferably, calculating the transmissivity from the signpost area image and inverting to obtain the visibility value includes: let the visibility value be noted as V,

t(x) = 1 − μ · min_{y∈Ω(x)} min_{c∈{R,G,B}} ( I^c(y) / A )

V = d · ln(ε) / ln( t(x) )

In the above formulas, t(x) is the transmissivity at coordinate point x of the signpost area image; μ is the defogging parameter, with 0 < μ < 1; c is one of the RGB color channels; Ω(x) is the local region centered on x, i.e. the signpost region; I^c(y) is the signpost area image on channel c; A is the sunlight intensity; V is the visibility value; d is the distance from the signpost to the camera device; and ε is the contrast threshold, taken as 0.05.
As a preferred scheme, preprocessing the aircraft characteristic-part image with an image denoising algorithm and extracting the target characteristic value includes: performing denoising and edge detection on the aircraft characteristic-part image with a weight-adaptive morphology algorithm, and extracting the target characteristic value.
Preferably, before the edge detection, the method further comprises: judging whether the visibility value is below a second threshold; if so, inputting the denoised image into an edge-preserving model for edge-preserving processing. The edge-preserving model is calculated as follows:
q_i = (1 / |O|) · Σ_{k : i ∈ O_k} ( a_k · W_i + b_k )

a_k = [ (1 / |O|) · Σ_{i ∈ O_k} W_i · p_i − μ_k · p̄_k ] / ( σ_k² + κ )

b_k = p̄_k − a_k · μ_k

In the above formulas, q_i is the output pixel value; |O| is the number of pixels inside region O_k; k and i are pixel indexes; a_k and b_k are the coefficients of the linear function when the window center is at k; W_i is the value of the guide image; p_i is the value of the input pixel; p̄_k is the mean of the input image p in region O_k; μ_k and σ_k² are respectively the mean and variance of the guide image W in region O_k; and κ is a regularization coefficient.
Preferably, building the neural network model includes: acquiring system data and dividing it into a training set, a test set and a validation set, where the system data comprises characteristic values of aircraft characteristic parts and the corresponding aircraft models; inputting the training set into an initial model for training; setting the number of training epochs of the initial model and, every N epochs, outputting the model and saving it as an intermediate model; inputting the validation set into the intermediate models, evaluating all of them with the F1 score, and selecting the best-performing one as the final model; and testing the final model with the test set to obtain the model's judgment accuracy.
Preferably, the processing of the joint attitude guidance model includes: performing feature positioning and target segmentation on the image data with a Mask R-CNN network to obtain a characteristic-part image; establishing a unified coordinate system from the relative installation positions of the camera device and the laser device; analyzing the laser point cloud in the unified coordinate system to obtain the characteristic-part coordinates corresponding to the characteristic-part image, and constructing the linear equation of the guide line; and judging, from the characteristic-part coordinates and the aircraft model's dimension data with reference to the guide line, whether the attitude of the aircraft deviates from the guide line.
Preferably, the visual attitude guidance model includes: preprocessing the image data and identifying the characteristic parts of the aircraft; continuously tracking each characteristic part to obtain a characteristic-part frame sequence; establishing a geographic coordinate system with the camera device as origin, constructing the linear equation of the guide line, and marking the frame-sequence coordinates of each characteristic part; and judging, from the characteristic-part frame-sequence coordinates and the aircraft model's dimension data with reference to the guide line, whether the attitude of the aircraft deviates from the guide line.
Compared with the prior art, the invention has the following beneficial effects. The berth and runway are divided into a stopping area, a guiding area and a detection area: model identification is completed in the detection area, attitude correction in the guiding area, and stop-line distance calculation in the stopping area. Matching different devices and algorithms to the characteristics of each area fully combines the advantages of the three guidance methods (camera device, laser device and buried coil) while avoiding their respective shortcomings, improving the reliability and accuracy of model identification, attitude detection and distance judgment. A signpost placed directly behind the berth lets the camera device measure the current visibility value, and different algorithm models are selected according to that value, enhancing the system's anti-interference capability and environmental adaptability; the system can therefore provide correct and effective berthing guidance to pilots, realize safe and efficient berthing, and improve airport operation efficiency. The camera device and laser device start working only when the aircraft enters the designated area, avoiding a permanent detection state and greatly extending the service life of the equipment.
Drawings
The disclosure of the present invention is described with reference to the accompanying drawings. It is to be understood that the drawings are designed solely for the purposes of illustration and not as a definition of the limits of the invention. In the drawings, like reference numerals are used to refer to like parts. Wherein:
FIG. 1 is a schematic diagram of an aircraft berth guidance system according to an embodiment of the invention;
fig. 2 is a flow chart of an aircraft berth guiding method according to an embodiment of the invention.
Detailed Description
It is to be understood that, according to the technical solution of the present invention, those skilled in the art may propose various alternative structural modes and implementation modes without changing the true spirit of the present invention. Accordingly, the following detailed description and drawings are merely illustrative of the invention and are not intended to be exhaustive or to limit the invention to the precise form disclosed.
An embodiment according to the invention is shown in connection with fig. 1. An aircraft berth guiding system comprises a control terminal, and an imaging device, a laser device, a buried coil and a display device which are connected with the control terminal. The aircraft has features thereon, including one or more of a nose, an engine, a landing gear, a door, and a wing of the aircraft.
The camera device, laser device and display device are mounted on the terminal building directly in front of the berth. The camera device and laser device collect image data and laser point cloud data respectively and send them to the control terminal, and the display device shows the aircraft model and distance data. A stopping area and a guiding area are laid out on the berth, and a detection area is laid out on the runway adjoining one side of the berth. Buried coils are installed at the start of each of the stopping, guiding and detection areas to detect which area the aircraft occupies, and a signpost placed outside the runway directly behind the berth cooperates with the camera device to measure the weather visibility value. The signpost defaults to a red circle.
In one embodiment, the camera device is a C-style CCD area-array camera with a resolution of 640×480, and the laser device is a Delfurs ESR millimeter-wave radar sensor. The buried coil is an oscillating circuit: when an aircraft passes over it, the change in the surrounding medium changes the oscillation frequency, which realizes aircraft detection. To improve durability, multiple protective plates cover the buried coil, extending its service life. The control terminal can be any computing facility, such as a computer, server or cloud platform, that analyzes and processes the collected data and shows the results on the display device.
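As an illustration, the coil-based presence check described above can be sketched as a frequency-shift threshold test. This is a minimal sketch; the baseline frequency and threshold values are illustrative assumptions, not figures from the patent:

```python
def aircraft_present(measured_freq_hz: float,
                     baseline_freq_hz: float = 100_000.0,
                     shift_threshold_hz: float = 500.0) -> bool:
    """A metal airframe over the coil changes the surrounding medium and
    shifts the oscillator frequency; flag presence when the absolute shift
    exceeds a threshold."""
    return abs(measured_freq_hz - baseline_freq_hz) > shift_threshold_hz
```

In practice the baseline would be calibrated per coil and the threshold chosen to reject environmental drift.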
Referring to fig. 2, the invention discloses an aircraft berth guiding method, which is applied to the above aircraft berth guiding system and comprises the following steps:
s101, after the buried coil detects that the airplane enters the detection area, starting the camera equipment to collect the image of the signpost. The imaging device adjusts the shooting angle, focal length, brightness and the like according to the azimuth coordinates of the signpost and preset imaging parameters, and shoots multiple frames of images.
S102, after preprocessing the signpost image, extracting features to obtain a signpost area image. The sign image includes a background sky region and a sign region.
Specifically, preprocessing the signpost image includes: after the signpost image is converted to grayscale, histogram equalization widens the gray levels that contain many pixels and merges those that contain few, enhancing the contrast between the background sky area and the signpost area and increasing the dynamic range of the pixel gray values.
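The equalization step above can be sketched in a few lines of NumPy (a standard histogram-equalization routine given for illustration, not code from the patent):

```python
import numpy as np

def equalize_hist(gray: np.ndarray) -> np.ndarray:
    """Histogram equalization: spread heavily populated gray levels apart
    and merge sparse ones, raising contrast and dynamic range."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    if cdf[-1] == cdf_min:          # constant image: nothing to equalize
        return gray.copy()
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0),
                  0, 255).astype(np.uint8)
    return lut[gray]                # map each pixel through the lookup table
```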
The feature extraction includes: combining the preset signpost-area shape parameters, iteratively scanning all pixel points of the signpost image, marking similar pixel points among them, and recording them in a connectivity matrix to obtain the signpost area image. For example, in the present embodiment the signpost is circular, so the signpost area image is also circular.
The signpost area image obtained this way is a grayscale image, so inverse-grayscale processing is needed: the region of the original signpost image corresponding to the grayscale signpost area image is selected as the final signpost area image.
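A minimal illustration of the iterative pixel scan and connectivity recording might look as follows: a plain breadth-first labeling pass that keeps the largest 4-connected region as the signpost candidate. The shape-parameter check against the preset signpost shape (e.g. circularity) is omitted for brevity and would be an additional filter:

```python
import numpy as np
from collections import deque

def largest_connected_region(binary: np.ndarray) -> np.ndarray:
    """Scan all pixels, group 4-connected foreground pixels into regions,
    and return a boolean mask of the largest region."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    sizes, current = {}, 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                current += 1
                labels[sy, sx] = current
                q, size = deque([(sy, sx)]), 0
                while q:                      # flood-fill one region
                    y, x = q.popleft()
                    size += 1
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            q.append((ny, nx))
                sizes[current] = size
    if not sizes:
        return np.zeros((h, w), dtype=bool)
    best = max(sizes, key=sizes.get)
    return labels == best
```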
S103, calculating the transmissivity according to the area image of the signpost, and inverting to obtain a visibility value.
Calculating the transmissivity from the signpost area image and inverting to obtain the visibility value includes: let the visibility value be noted as V,

t(x) = 1 − μ · min_{y∈Ω(x)} min_{c∈{R,G,B}} ( I^c(y) / A )

V = d · ln(ε) / ln( t(x) )

In the above formulas, t(x) is the transmissivity at coordinate point x of the signpost area image; μ is the defogging parameter, with 0 < μ < 1; c is one of the RGB color channels; Ω(x) is the local region centered on x, i.e. the signpost region; I^c(y) is the signpost area image on channel c; A is the sunlight intensity; V is the visibility value; d is the distance from the signpost to the camera device; and ε is the contrast threshold, taken as 0.05.
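Under this reading (a dark-channel transmissivity estimate, then inversion of Koschmieder's law t = e^(−βd) with contrast threshold ε), the computation can be sketched as follows. The function names and default parameter values are illustrative assumptions:

```python
import numpy as np

def transmissivity(region_rgb: np.ndarray, A: float, mu: float = 0.9) -> float:
    """Dark-channel estimate over the signpost region Omega(x):
    t = 1 - mu * min over pixels and channels of I_c / A."""
    return 1.0 - mu * float((region_rgb / A).min())

def visibility(t: float, d: float, eps: float = 0.05) -> float:
    """Invert Koschmieder's law t = exp(-beta * d): beta = -ln(t) / d,
    then V = -ln(eps) / beta = d * ln(eps) / ln(t)."""
    return d * np.log(eps) / np.log(t)
```

With a hazier (darker-channel) signpost region, t drops and the inverted visibility falls accordingly.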
S104, acquiring an aircraft characteristic part image through the camera equipment, preprocessing the aircraft characteristic part image by adopting an image denoising algorithm, and extracting a target characteristic value.
Specifically: a weight-adaptive morphology algorithm performs denoising and edge detection on the aircraft characteristic-part image under conditions of translation and rotation, and the target characteristic value is extracted.
It should be understood that the weight-adaptive morphology algorithm consists of three parts: morphological filtering, multi-directional edge extraction, and calculation of direction-adaptive weights. Edge detection is performed separately with several structuring elements at the same scale to obtain an edge image at that scale. Using the morphological structuring-element probe principle, the weights are determined from the number of fillings of each structuring element and the noise resistance of structuring elements of different scales during erosion. The edge images at different scales are then weighted and summed, and binarization and denoising yield the target characteristic value.
Before edge detection, the method further comprises: judging whether the visibility value is below the second threshold and, if so, inputting the denoised image into the edge-preserving model for edge-preserving processing. This avoids excessive detection error caused by severe loss of image edge information when the visibility is too low.
The calculation formula of the edge preservation model is as follows:
q_i = (1 / |O|) · Σ_{k : i ∈ O_k} ( a_k · W_i + b_k )

a_k = [ (1 / |O|) · Σ_{i ∈ O_k} W_i · p_i − μ_k · p̄_k ] / ( σ_k² + κ )

b_k = p̄_k − a_k · μ_k

In the above formulas, q_i is the output pixel value; |O| is the number of pixels inside region O_k; k and i are pixel indexes; a_k and b_k are the coefficients of the linear function when the window center is at k; W_i is the value of the guide image; p_i is the value of the input pixel; p̄_k is the mean of the input image p in region O_k; μ_k and σ_k² are respectively the mean and variance of the guide image W in region O_k; and κ is a regularization coefficient.
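These three formulas are those of a guided filter, which can be sketched with box-filtered window means. The box-filter radius and the κ default below are illustrative assumptions:

```python
import numpy as np

def box_mean(img: np.ndarray, r: int) -> np.ndarray:
    """Mean over a (2r+1)x(2r+1) window, computed via an integral image
    on an edge-padded copy."""
    pad = np.pad(img, r, mode='edge')
    c = pad.cumsum(0).cumsum(1)
    c = np.pad(c, ((1, 0), (1, 0)))          # zero row/column for the integral image
    k = 2 * r + 1
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_filter(W: np.ndarray, p: np.ndarray,
                  r: int = 2, kappa: float = 1e-3) -> np.ndarray:
    """Edge-preserving filter with guide W and input p:
    a_k = (mean(W*p) - mu_k * pbar_k) / (sigma_k^2 + kappa),
    b_k = pbar_k - a_k * mu_k, q_i = mean(a) * W_i + mean(b)."""
    mu = box_mean(W, r)                       # guide mean per window
    pbar = box_mean(p, r)                     # input mean per window
    cov = box_mean(W * p, r) - mu * pbar
    var = box_mean(W * W, r) - mu * mu
    a = cov / (var + kappa)
    b = pbar - a * mu
    return box_mean(a, r) * W + box_mean(b, r)
```

Filtering a constant image with itself as guide returns the constant, and near strong guide edges a_k approaches 1, which is what preserves the edges.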
S105, building a neural network model, and inputting a target characteristic value to obtain the airplane model.
Establishing a neural network model, including:
1) And acquiring system data, and dividing the system data into a training set, a testing set and a verification set, wherein the system data comprises characteristic values of aircraft characteristic parts and corresponding machine types.
2) The training set is input into the initial model for training. The number of training epochs is set, and every N epochs the model is output and saved as an intermediate model.
3) The validation set is input into the intermediate models, all intermediate models are evaluated with the F1 score, and the best-performing one is selected as the final model.
The F1 score is calculated as:

F1 = 2 · P · R / ( P + R )

where P is the precision and R is the recall.
4) The final model is tested with the test set to obtain its judgment accuracy, and the model is put into use once the accuracy meets the requirement.
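Steps 2) to 4), periodic intermediate checkpoints scored by F1 on the validation set, can be sketched as follows; the checkpoint dictionaries are placeholder stand-ins for saved models:

```python
def f1_score(p: float, r: float) -> float:
    """F1 = 2PR / (P + R), with the degenerate case guarded."""
    return 0.0 if p + r == 0 else 2 * p * r / (p + r)

def select_best_model(checkpoints):
    """Pick the intermediate model whose validation (precision, recall)
    yields the highest F1 score."""
    return max(checkpoints, key=lambda c: f1_score(c['precision'], c['recall']))
```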
S106, after the buried coil detects that the aircraft has entered the guiding area, judge whether the visibility value is below the first threshold. If so, control the camera device and the laser device to detect the aircraft attitude synchronously, obtain laser point cloud and image data, and load them into the joint attitude guidance model. If not, control the camera device to detect the aircraft attitude, obtain image data, and load them into the visual attitude guidance model.
The processing of the joint attitude guidance model comprises the following steps:
1) Feature positioning and target segmentation are performed on the image data with a Mask R-CNN network to obtain a characteristic-part image.
2) And establishing a unified coordinate system according to the relative installation position relation of the image pickup equipment and the laser equipment.
3) And analyzing the laser point cloud according to the unified coordinate system to obtain the characteristic part coordinates corresponding to the characteristic part image, and constructing a linear equation of the guide line.
Because the laser point cloud is unordered on the image, in order to compute the positions of the characteristic parts from the point cloud's depth information, an R-Tree algorithm is used to further fuse the image and the characteristic-part coordinates. The R-Tree is a dynamically balanced tree that organizes data by the spatial position of objects and offers strong flexibility and adjustability.
4) From the characteristic-part coordinates and the aircraft model's dimension data, and with reference to the guide line, judge whether the attitude of the aircraft deviates from the guide line. For example: if the distance from point A to the guide line is greater than the distance from point B to the guide line, the pilot is prompted that the aircraft needs to turn right.
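The deviation judgment can be sketched as a comparison of point-to-line distances for two symmetric airframe points. Points A and B follow the example in the text; the tolerance value is an illustrative assumption:

```python
import math

def distance_to_line(pt, a, b, c):
    """Perpendicular distance from pt = (x, y) to the guide line ax + by + c = 0."""
    x, y = pt
    return abs(a * x + b * y + c) / math.hypot(a, b)

def steering_hint(point_a, point_b, line, tol=0.2):
    """Compare the distances of two symmetric airframe points to the guide
    line; a larger distance on one side means the aircraft should turn
    toward the other."""
    da = distance_to_line(point_a, *line)
    db = distance_to_line(point_b, *line)
    if da - db > tol:
        return 'turn right'
    if db - da > tol:
        return 'turn left'
    return 'on course'
```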
The processing of the visual attitude guidance model comprises:
1) The image data are preprocessed and the characteristic parts of the aircraft are identified. The preprocessing can also use a Mask R-CNN network.
2) And continuously tracking the characteristic part to obtain a characteristic part frame sequence.
3) A geographic coordinate system is established with the camera device as origin, the linear equation of the guide line is constructed, and the frame-sequence coordinates of the characteristic parts are marked.
4) From the characteristic-part frame-sequence coordinates and the aircraft model's dimension data, and with reference to the guide line, judge whether the attitude of the aircraft deviates from the guide line.
S107, guiding the attitude of the aircraft according to the output result of the guidance model.
S108, after the buried coil detects that the aircraft has entered the stopping area, control the laser device to detect the aircraft's characteristic part, obtain the remaining forward distance of the aircraft, and stop the aircraft according to that distance. After the aircraft enters the stopping area, the laser device emits a laser beam toward the aircraft nose with period T; a photoelectric element receives the beam reflected from the target, and the laser device's timer measures the time from emission to reception, yielding the remaining forward distance of the aircraft. The camera device and laser device in this application start working only when the aircraft enters the designated area, avoiding a permanent detection state and greatly extending the service life of the equipment.
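The time-of-flight measurement described above reduces to halving the round-trip path. A minimal sketch; the period-T scheduling and the photoelectric reception are outside its scope:

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def remaining_distance_m(round_trip_time_s: float) -> float:
    """Laser time-of-flight: the beam travels to the nose and back, so the
    one-way (remaining forward) distance is c * t / 2."""
    return C_M_PER_S * round_trip_time_s / 2.0
```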
In summary, the beneficial effects of the invention include the following. The berth and runway are divided into a stopping area, a guiding area and a detection area, with model identification completed in the detection area, attitude correction in the guiding area, and stop-line distance calculation in the stopping area. Matching different devices and algorithms to the characteristics of each area fully combines the advantages of the three guidance methods (camera device, laser device and buried coil) while avoiding their respective shortcomings, improving the reliability and accuracy of model identification, attitude detection and distance judgment. A signpost placed directly behind the berth lets the camera device measure the current visibility value, and different algorithm models are selected according to that value, enhancing the system's anti-interference capability and environmental adaptability, providing correct and effective berthing guidance to pilots, realizing safe and efficient berthing, and improving airport operation efficiency.
It should be appreciated that the integrated units, if implemented as software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. On this understanding, the technical solution of the invention, in essence or in the part contributing to the prior art, in whole or in part, may be embodied as a software product stored in a storage medium and comprising several instructions that cause a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The technical scope of the present invention is not limited to the above description, and those skilled in the art may make various changes and modifications to the above-described embodiments without departing from the technical spirit of the present invention, and these changes and modifications should be included in the scope of the present invention.

Claims (10)

1. An aircraft berth guiding system, characterized in that it comprises a control terminal and, connected to the control terminal, a camera device, a laser device, a buried coil and a display device; the camera device, laser device and display device are mounted on the terminal building directly in front of the berth; the camera device and laser device are used to collect image data and laser point cloud data respectively and send them to the control terminal; the display device is used to show the aircraft model and distance data; a stopping area and a guiding area are laid out on the berth, and a detection area is laid out on the runway adjoining one side of the berth; a buried coil is installed at the start of each of the stopping, guiding and detection areas to detect the area where the aircraft is located; and a signpost is arranged outside the runway directly behind the berth to cooperate with the camera device in detecting the weather visibility value.
2. The aircraft berth guiding system of claim 1, wherein the aircraft has characteristic parts thereon, the characteristic parts comprising one or more of the nose, an engine, the landing gear, a door, and a wing of the aircraft.
3. An aircraft berth guiding method applied to the aircraft berth guiding system according to claim 1 or 2, comprising the steps of:
when the buried coil detects that the aircraft has entered the detection area, starting the camera device to collect an image of the signboard;
preprocessing the signboard image and performing feature extraction to obtain a signboard area image;
calculating the transmittance from the signboard area image and inverting it to obtain a visibility value;
acquiring an image of the aircraft's characteristic parts through the camera device, preprocessing it with an image denoising algorithm, and extracting target feature values;
establishing a neural network model and inputting the target feature values to obtain the aircraft model;
when the buried coil detects that the aircraft has entered the guide area, judging whether the visibility value is below a first threshold; if so, controlling the camera device and the laser device to detect the aircraft attitude synchronously, obtaining laser point cloud and image data and loading them into a combined attitude guidance model; if not, controlling the camera device to detect the aircraft attitude, obtaining image data and loading it into a visual attitude guidance model;
guiding the aircraft attitude according to the output of the guidance model; and
when the buried coil detects that the aircraft has entered the stop area, controlling the laser device to detect the aircraft's characteristic parts to obtain the aircraft's remaining forward distance, and controlling the aircraft to stop according to the remaining forward distance.
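The mode switching and stop logic in claim 3 can be sketched as follows. This is an illustrative sketch only: the function names and the `first_threshold` / `stop_margin_m` values are assumptions for demonstration, not figures from the patent.

```python
def guidance_mode(visibility_value, first_threshold):
    """Claim 3 branching: below the first visibility threshold, fuse
    camera imagery with the laser point cloud; otherwise use vision only."""
    return "combined" if visibility_value < first_threshold else "visual"

def should_stop(remaining_distance_m, stop_margin_m=0.5):
    """Issue the stop command once the laser-measured remaining forward
    distance falls within the stop margin (the margin value is assumed)."""
    return remaining_distance_m <= stop_margin_m
```

In use, the control terminal would re-evaluate `guidance_mode` when the guide-area coil fires, and poll `should_stop` against the laser ranging output once the stop-area coil fires.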
4. The aircraft berth guiding method of claim 3, wherein the signboard image includes a background sky area and a signboard area, and preprocessing the signboard image and performing feature extraction to obtain a signboard area image comprises: combining preset signboard area shape parameters, iteratively scanning all pixel points of the signboard image, marking similar pixel points in the signboard image, and recording them in a connectivity matrix to obtain the signboard area image.
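A minimal sketch of the similar-pixel marking and connectivity recording described in claim 4, assuming the "similar pixel points" have already been reduced to a binary mask; the BFS labeling and the largest-region selection are illustrative assumptions, not the patent's exact procedure.

```python
from collections import deque
import numpy as np

def sign_region(mask):
    """Label connected regions in a binary similarity mask (4-connectivity
    BFS) and return the largest one as the candidate signboard region."""
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=int)
    best, best_size, nxt = 0, 0, 1
    for si in range(h):
        for sj in range(w):
            if mask[si, sj] and labels[si, sj] == 0:
                # flood-fill one connected component, counting its pixels
                q, size = deque([(si, sj)]), 0
                labels[si, sj] = nxt
                while q:
                    i, j = q.popleft()
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and mask[ni, nj] and labels[ni, nj] == 0:
                            labels[ni, nj] = nxt
                            q.append((ni, nj))
                if size > best_size:
                    best, best_size = nxt, size
                nxt += 1
    if best_size == 0:
        return np.zeros_like(mask)
    return labels == best
```

The preset shape parameters of the claim would then be checked against the returned region (e.g. its bounding-box aspect ratio) before accepting it as the signboard.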
5. The aircraft berth guiding method according to claim 3, wherein calculating the transmittance from the signboard area image and inverting it to obtain a visibility value comprises: denoting the visibility value as V,

$$t(x) = 1 - \mu \min_{y \in \Omega(x)} \min_{c} \frac{I_c(y)}{A}$$

$$V = \frac{\ln \varepsilon}{\ln t(x)} \cdot d$$

where t(x) is the transmittance at coordinate point x of the signboard area image; μ is the defogging parameter, 0 < μ < 1; c is one of the RGB color channels; Ω(x) is the local area centered on x, i.e. the signboard area; I_c is the signboard area image on channel c; A is the sunlight intensity; V is the visibility value; d is the distance from the signboard to the camera device; and ε is the contrast threshold, taken as 0.05.
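Claim 5's transmittance estimate and visibility inversion can be sketched as below. The dark-channel-style patch minimum and the Koschmieder-law inversion (t = e^(−βd), V = −ln ε / β) are standard readings of the quantities named in the claim; the patch size is an assumption.

```python
import numpy as np

def transmittance(image, airlight, mu=0.9, patch=3):
    """Estimate t(x) = 1 - mu * min over the patch Omega(x) of the
    channel-wise minimum of I_c / A.  image: H x W x 3 in [0, 1]."""
    dark = (image / airlight).min(axis=2)   # min over color channels
    h, w = dark.shape
    pad = patch // 2
    padded = np.pad(dark, pad, mode="edge")
    local_min = np.empty_like(dark)
    for i in range(h):                      # min over the local patch
        for j in range(w):
            local_min[i, j] = padded[i:i + patch, j:j + patch].min()
    return 1.0 - mu * local_min

def visibility(t, d, eps=0.05):
    """Invert Koschmieder's law: t = exp(-beta*d), V = -ln(eps)/beta,
    i.e. V = d * ln(eps) / ln(t)."""
    beta = -np.log(t) / d
    return -np.log(eps) / beta
```

With ε = 0.05 this is the usual meteorological-optical-range convention (V ≈ 3/β).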
6. An aircraft berth guidance method according to claim 3, wherein preprocessing the aircraft feature images using an image denoising algorithm and extracting target feature values comprises: and denoising and edge detection are carried out on the aircraft characteristic part image by adopting a self-adaptive weight morphology algorithm, and target characteristic values are extracted.
7. The aircraft berth guiding method of claim 6, further comprising, prior to the edge detection: judging whether the visibility value is below a second threshold, and if so, inputting the denoised image into an edge-preserving model for edge-preserving processing;
the calculation formula of the edge-preserving model is as follows:

$$q_i = \frac{1}{|O|} \sum_{k:\, i \in O_k} \left( a_k W_i + b_k \right)$$

$$a_k = \frac{\dfrac{1}{|O|} \sum_{i \in O_k} W_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \kappa}$$

$$b_k = \bar{p}_k - a_k \mu_k$$

where q_i is the output pixel value; |O| is the number of pixels in region O_k; k and i are pixel indexes; a_k and b_k are the coefficients of the linear function when the window center is at k; W_i is the value of the guide image; p_i is the value of the input pixel; \bar{p}_k is the mean of the input image p in region O_k; μ_k and σ_k² are respectively the mean and variance of the guide image W in region O_k; and κ is the coefficient.
8. The aircraft berth guiding method according to claim 3, wherein establishing the neural network model comprises:
acquiring system data and dividing it into a training set, a test set, and a validation set, the system data comprising feature values of aircraft characteristic parts and the corresponding aircraft models;
inputting the training set into an initial model for training;
setting the number of training epochs for the initial model, and saving the model output every N epochs as an intermediate model;
inputting the validation set into the intermediate models, evaluating all intermediate models with the F1 metric, and selecting the best-performing one as the final model; and
testing the final model with the test set to obtain the model's classification accuracy.
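The every-N-epochs checkpointing and F1-based selection of claim 8 can be sketched framework-agnostically. The `(epoch, predict_fn)` checkpoint representation is an assumption for illustration; in practice each entry would be a saved model snapshot.

```python
import numpy as np

def f1_score(y_true, y_pred):
    """Binary F1 = 2 * precision * recall / (precision + recall)."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def select_best_checkpoint(checkpoints, X_val, y_val):
    """checkpoints: list of (epoch, predict_fn) saved every N epochs.
    Returns the pair whose predictions maximize validation F1."""
    return max(checkpoints, key=lambda c: f1_score(y_val, c[1](X_val)))
```

The winning checkpoint becomes the final model and is then scored once on the held-out test set, as the claim describes.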
9. The aircraft berth guiding method according to claim 3, wherein the combined attitude guidance model process comprises:
performing feature localization and target segmentation on the image data with a mask-CNN network to obtain characteristic part images;
establishing a unified coordinate system from the relative installation positions of the camera device and the laser device;
resolving the laser point cloud in the unified coordinate system to obtain the characteristic part coordinates corresponding to the characteristic part images, and constructing the linear equation of the guide line; and
judging, from the characteristic part coordinates and the aircraft model's size data with reference to the guide line, whether the aircraft's attitude deviates from the guide line.
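The deviation judgment that closes claims 9 and 10 reduces to a signed point-to-line distance against the guide line ax + by + c = 0. A sketch; the tolerance `tol` is an assumed parameter that would in practice come from the aircraft model's size data.

```python
import math

def deviation_from_guide_line(point, a, b, c):
    """Signed perpendicular distance of a feature-part coordinate (x, y)
    from the guide line a*x + b*y + c = 0; the sign tells left vs. right."""
    x, y = point
    return (a * x + b * y + c) / math.hypot(a, b)

def attitude_deviates(points, a, b, c, tol):
    """True if any tracked feature point lies farther than tol from the line."""
    return any(abs(deviation_from_guide_line(p, a, b, c)) > tol for p in points)
```

With the coil-triggered tracking of the earlier claims, `points` would be the per-frame characteristic-part coordinates, and a True result would prompt a corrective guidance cue on the display device.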
10. The aircraft berth guiding method of claim 3, wherein the visual attitude guidance model comprises:
preprocessing the image data and identifying the aircraft's characteristic parts;
continuously tracking the characteristic parts to obtain a characteristic part frame sequence;
establishing a geographic coordinate system with the camera device as the origin, constructing the linear equation of the guide line, and marking the frame-sequence coordinates of the characteristic parts; and
judging, from the characteristic part frame-sequence coordinates and the aircraft model's size data with reference to the guide line, whether the aircraft's attitude deviates from the guide line.
CN202310330305.8A 2023-03-30 2023-03-30 Airplane berth guiding system and method Pending CN116343534A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310330305.8A CN116343534A (en) 2023-03-30 2023-03-30 Airplane berth guiding system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310330305.8A CN116343534A (en) 2023-03-30 2023-03-30 Airplane berth guiding system and method

Publications (1)

Publication Number Publication Date
CN116343534A true CN116343534A (en) 2023-06-27

Family

ID=86885467

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310330305.8A Pending CN116343534A (en) 2023-03-30 2023-03-30 Airplane berth guiding system and method

Country Status (1)

Country Link
CN (1) CN116343534A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117876650A (en) * 2024-03-07 2024-04-12 中航西安飞机工业集团股份有限公司 Intelligent identifying and positioning method and system for airplane berthing in wide-area scene
CN117876650B (en) * 2024-03-07 2024-05-17 中航西安飞机工业集团股份有限公司 Intelligent identifying and positioning method and system for airplane berthing in wide-area scene

Similar Documents

Publication Publication Date Title
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
CN111326023B (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
CN106356757B (en) A kind of power circuit unmanned plane method for inspecting based on human-eye visual characteristic
KR102661954B1 (en) A method of processing an image, and apparatuses performing the same
Li et al. Towards automatic power line detection for a UAV surveillance system using pulse coupled neural filter and an improved Hough transform
CN105373135A (en) Method and system for guiding airplane docking and identifying airplane type based on machine vision
CN112101092A (en) Automatic driving environment sensing method and system
CN110246130B (en) Airport pavement crack detection method based on infrared and visible light image data fusion
CN108764082A (en) A kind of Aircraft Targets detection method, electronic equipment, storage medium and system
CN106327474B (en) A kind of automatic on-line blind pixel detection method
CN104034733A (en) Service life prediction method based on binocular vision monitoring and surface crack image recognition
US10861172B2 (en) Sensors and methods for monitoring flying objects
CN111340951A (en) Ocean environment automatic identification method based on deep learning
CN112528979B (en) Transformer substation inspection robot obstacle distinguishing method and system
CN116343534A (en) Airplane berth guiding system and method
CN116978009A (en) Dynamic object filtering method based on 4D millimeter wave radar
Li et al. Bionic Vision‐Based Intelligent Power Line Inspection System
CN105447431B (en) A kind of docking aircraft method for tracking and positioning and system based on machine vision
CN114359865A (en) Obstacle detection method and related device
CN108563986B (en) Method and system for judging posture of telegraph pole in jolt area based on long-distance shooting image
CN110287957B (en) Low-slow small target positioning method and positioning device
CN108181313A (en) A kind of device and method suitable for the detection of contact net running environment safe condition
CN110796677A (en) Cirrus cloud false alarm source detection method based on multiband characteristics
CN113359847B (en) Unmanned aerial vehicle counter-braking method and system based on radio remote sensing technology and storage medium
CN114092522A (en) Intelligent capture tracking method for take-off and landing of airport airplane

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination