CN115527397A - Air traffic control situation feature extraction method and device based on multimode neural network - Google Patents

Air traffic control situation feature extraction method and device based on multimode neural network

Info

Publication number
CN115527397A
CN115527397A (application CN202211237546.XA)
Authority
CN
China
Prior art keywords
neural network
air traffic
traffic control
data
situation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211237546.XA
Other languages
Chinese (zh)
Other versions
CN115527397B (en)
Inventor
王壮
潘卫军
周少武
王泆棣
王梓璇
邓蕾蕾
潘璇
何沁悦
陈志远
韩博源
高健伟
唐灵弢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Civil Aviation Flight University of China
Original Assignee
Civil Aviation Flight University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Civil Aviation Flight University of China filed Critical Civil Aviation Flight University of China
Priority to CN202211237546.XA
Publication of CN115527397A
Application granted
Publication of CN115527397B
Legal status: Active

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0043: Traffic management of multiple aircrafts from the ground
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0073: Surveillance aids
    • G08G 5/0082: Surveillance aids for monitoring traffic from a ground station
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/06: Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/06: Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
    • G08G 5/065: Navigation or guidance aids, e.g. for taxiing or rolling

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a method and a device for extracting air traffic control situation features based on a multimode neural network. The method comprises the following steps: first, air traffic control monitoring data are acquired and preprocessed; then the monitoring data are classified according to the two modes of the network, a convolutional neural network and a fully-connected neural network, with longitude, latitude, heading and horizontal speed assigned to the two-dimensional attitude data category and the aircraft identification number, altitude and vertical speed assigned to the remaining valid data category; next, the two-dimensional attitude data are drawn into an air traffic control two-dimensional situation map that serves as the input of the convolutional neural network, which extracts the air traffic control horizontal-plane situation features, while the remaining valid data are arranged into two-dimensional structured data from which the aircraft-count and altitude situation features are extracted; finally, the extracted air traffic control situation features are fused and output. The method can be applied in the air traffic management process to extract comprehensive air traffic situation features and improve the automation and intelligence of air traffic management.

Description

Air traffic control situation feature extraction method and device based on multimode neural network
Technical Field
The invention relates to the field of air traffic management, and in particular to a method and device for extracting air traffic control situation features based on a multimode neural network.
Background
In the air traffic control process, a controller makes reasonable control decisions based on the observed air traffic situation and commands and guides the aircraft within the controlled airspace. Air traffic management is developing towards automation and intelligence, and comprehensive, accurate and efficient extraction of air traffic situation features can effectively reduce controller workload and improve aviation safety and operational efficiency.
The air traffic control situation is characterized by a large and dynamically changing number of aircraft and high-dimensional aircraft parameters, which makes it difficult for existing methods to extract its features comprehensively. Methods that can handle a dynamic number of aircraft restrict the dimensionality of the aircraft parameters, which indirectly limits their applicable scenarios. Methods that can handle high-dimensional aircraft parameters are limited in input scale and can only be applied to scenarios with few aircraft whose number does not change. Therefore, it is necessary to study a more capable method for extracting air traffic control situation features.
Deep neural networks have strong feature-extraction capability and have produced breakthrough results in many fields. There are also many studies in the field of air traffic management, such as arrival and departure flight sequencing, air traffic flow prediction, and aircraft conflict resolution. However, a single type of neural network is not sufficient to extract air traffic control situation features in today's complex airspace environment, and the network structure needs to be adapted specifically to the characteristics of the air traffic control situation. A deep neural network with a multimode structure is a novel network structure: situation data are classified according to the feature-extraction preferences of different neural networks, the corresponding features are extracted separately by those networks and then fused, and comprehensive situation features are finally output.
Disclosure of Invention
The method and device for extracting air traffic control situation features provided by the invention classify the air traffic control situation data, use neural networks with different characteristics to extract different air traffic control situation features, and output fused air traffic control situation feature data, thereby extracting comprehensive situation features in airspace scenarios where the number of aircraft is large and changes dynamically.
In order to achieve this purpose, the invention adopts the following technical scheme:
a method for extracting air traffic control situation features based on a multimode neural network comprises the following steps:
step one, monitoring data input; acquiring air traffic control monitoring data acquired by air traffic control monitoring equipment;
step two, monitoring data preprocessing; preprocessing the acquired air traffic control monitoring data to obtain preprocessed air traffic control monitoring data;
step three, monitoring data classification; classifying the preprocessed air traffic control monitoring data according to different modes of the neural network, and dividing the preprocessed air traffic control monitoring data into categories consistent with the modes of the neural network to obtain classified air traffic control monitoring data;
step four, data item processing; converting the classified air traffic control monitoring data into input data of a multimode neural network, and respectively extracting features by the multimode neural network;
step five, feature fusion processing; fusing the features extracted by the multimode neural network, and outputting the fused air traffic control situation features.
Further, in the second step, the preprocessing of the acquired air traffic control monitoring data includes deleting invalid repeated points in the air traffic control monitoring data, interpolating to supplement the missing points in the air traffic control monitoring data, and converting the coordinates of the aircraft in the longitude-latitude coordinate system into coordinates in a Cartesian coordinate system.
Further, in the third step, the modes of the neural network include a convolutional neural network and a fully-connected neural network; the monitoring data are divided into two categories, a two-dimensional attitude data category and a remaining valid data category, wherein the two-dimensional attitude data category includes longitude, latitude, heading and horizontal speed, and the remaining valid data category includes aircraft identification number, altitude and vertical speed.
Further, in the fourth step, converting the classified air traffic control monitoring data into input data of the multimode neural network and extracting features with the multimode neural network specifically comprises the following steps:
drawing an air traffic control two-dimensional situation map from the two-dimensional attitude data;
constructing two-dimensional structured data from the remaining valid data, wherein each row represents one aircraft and the columns contain, in order, the aircraft identification number, altitude and vertical speed;
taking the air traffic control two-dimensional situation map as the input of the convolutional neural network, whose output is the air traffic control horizontal-plane situation feature in one-dimensional vector format;
taking the two-dimensional structured data as the input of the fully-connected neural network, whose output is the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format.
Further, in the fifth step, one neural network is used to fuse the features extracted by the multimode neural network, which specifically comprises the following steps:
taking the air traffic control horizontal-plane situation feature in one-dimensional vector format output by the convolutional neural network and the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format output by the fully-connected neural network simultaneously as the input of this neural network;
the output of this neural network is the air traffic control situation feature in one-dimensional vector format.
An air traffic control situation feature extraction device applying the multimode-neural-network-based air traffic control situation feature extraction method comprises one or more processors and a storage device, wherein the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 5.
Compared with the prior art, the invention has the following advantages and effects:
(1) The method utilizes a convolutional neural network to extract the air traffic control horizontal-plane situation features and a fully-connected neural network to extract the air traffic control aircraft-count and altitude situation features, and outputs comprehensive air traffic control situation features after fusion.
(2) The invention completes a basic link of intelligent air traffic control; the comprehensive air traffic control situation features facilitate the research and use of automated and intelligent methods for the other links of the air traffic control process.
Drawings
FIG. 1 is a flow chart of the method for extracting air traffic control situation features based on a multimode neural network;
FIG. 2 is a schematic illustration of aircraft surveillance data according to an embodiment;
FIG. 3 is the air traffic control two-dimensional situation map of this embodiment;
FIG. 4 is a schematic diagram of the multimode neural network structure of this embodiment.
Detailed Description
The technical solutions of the present invention are further described in detail below with reference to the accompanying drawings, but the scope of the present invention is not limited to the following descriptions.
In order to make the objects, technical solutions and advantages of the present invention more clearly understood, the present invention is further described in detail with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention. It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The features and properties of the present invention are described in further detail below with reference to examples.
As shown in fig. 1, a method for extracting characteristics of an air traffic control situation based on a multimode neural network includes the following steps:
step one, monitoring data input; acquiring air traffic control monitoring data acquired by air traffic control monitoring equipment;
step two, monitoring data preprocessing; preprocessing the acquired air traffic control monitoring data to obtain preprocessed air traffic control monitoring data;
step three, monitoring data classification; classifying the preprocessed air traffic control monitoring data according to different modes of the neural network, dividing the preprocessed air traffic control monitoring data into categories consistent with the modes of the neural network, and obtaining the classified air traffic control monitoring data;
step four, data item processing; converting the classified air traffic control monitoring data into input data of a multimode neural network, and respectively extracting features by the multimode neural network;
step five, feature fusion processing; fusing the features extracted by the multimode neural network, and outputting the fused air traffic control situation features.
In the second step, the acquired air traffic control monitoring data are preprocessed, and the preprocessing comprises deleting invalid repeated points in the air traffic control monitoring data, interpolating to supplement the missing points in the air traffic control monitoring data, and converting the coordinates of the aircraft from the longitude-latitude coordinate system into coordinates in a Cartesian coordinate system.
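As a brief illustration of this preprocessing, the sketch below drops invalid repeated points and linearly interpolates missing samples in one aircraft track; the pandas-based approach and the column names are assumptions made for illustration, not the implementation prescribed by the invention.

```python
import pandas as pd

def preprocess_track(track: pd.DataFrame) -> pd.DataFrame:
    """Clean one aircraft track (assumed columns: 'time', 'lon', 'lat', 'alt',
    'h_speed', 'v_speed', 'heading'; 'time' is an integer sample index)."""
    # Delete invalid repetition points: keep only the first record per time stamp.
    track = track.drop_duplicates(subset="time", keep="first")
    # Interpolation supplement for missing points: re-index onto a full time grid
    # and fill the gaps linearly.
    track = track.set_index("time").sort_index()
    full_grid = range(int(track.index.min()), int(track.index.max()) + 1)
    track = track.reindex(full_grid).interpolate(method="linear")
    return track.rename_axis("time").reset_index()
```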
In the third step, the modes of the neural network comprise a convolutional neural network and a fully-connected neural network; the monitoring data are divided into two categories, a two-dimensional attitude data category and a remaining valid data category, wherein the two-dimensional attitude data category includes longitude, latitude, heading and horizontal speed, and the remaining valid data category includes aircraft identification number, altitude and vertical speed.
In the fourth step, converting the classified air traffic control monitoring data into input data of the multimode neural network and extracting features with the multimode neural network specifically comprises the following steps:
drawing an air traffic control two-dimensional situation map from the two-dimensional attitude data;
constructing two-dimensional structured data from the remaining valid data, wherein each row represents one aircraft and the columns contain, in order, the aircraft identification number, altitude and vertical speed;
taking the air traffic control two-dimensional situation map as the input of the convolutional neural network, whose output is the air traffic control horizontal-plane situation feature in one-dimensional vector format;
taking the two-dimensional structured data as the input of the fully-connected neural network, whose output is the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format.
In the fifth step, one neural network is used to fuse the features extracted by the multimode neural network, which specifically comprises the following steps:
taking the air traffic control horizontal-plane situation feature in one-dimensional vector format output by the convolutional neural network and the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format output by the fully-connected neural network simultaneously as the input of this neural network;
the output of this neural network is the air traffic control situation feature in one-dimensional vector format.
An air traffic control situation feature extraction device applying the multimode-neural-network-based air traffic control situation feature extraction method comprises one or more processors and a storage device, wherein the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 5.
Specifically, (1) monitoring data input is carried out, and air traffic control monitoring data collected by air traffic control monitoring equipment is obtained;
in the present embodiment, there are three aircraft, each aircraft monitoring data represents one piece of monitoring information as shown in fig. 2, and the time number, the transponder code for identifying the aircraft, the longitude, the latitude, the horizontal velocity, the vertical velocity, the altitude, and the heading are sequentially provided from left to right.
(2) Preprocessing the monitoring data, namely preprocessing the acquired air traffic control monitoring data to obtain preprocessed air traffic control monitoring data;
in this embodiment, coordinate conversion is performed to convert longitude and latitude information into cartesian coordinates, with the area center as the origin of coordinates, the due north direction as the y-axis direction, and the due east direction as the x-axis direction.
(3) Classifying the monitoring data, classifying the preprocessed air traffic control monitoring data according to different modes of the neural network, and dividing the preprocessed air traffic control monitoring data into categories consistent with the modes of the neural network to obtain the classified air traffic control monitoring data;
in the embodiment, the modes of the neural network include a convolutional neural network and a fully-connected neural network; the monitoring data is divided into two types, longitude, latitude, heading and horizontal speed are two-dimensional attitude data types, and transponder codes, altitude and vertical speed for identifying the aircraft are other effective data types.
(4) Data item processing is performed: the classified air traffic control monitoring data are converted into input data of the multimode neural network, and the multimode neural network extracts features respectively;
in this embodiment, a two-dimensional attitude map of an air traffic control system is drawn by using two-dimensional attitude data, as shown in fig. 3, a solid circle is used to represent an aircraft, the position of the solid circle in the map represents the horizontal position of the aircraft in the airspace, a solid straight line represents the heading and speed of the aircraft, the tail of the solid straight line has no arrow, the solid straight line is connected with the center of the solid circle representing the aircraft, the head of the solid straight line has an arrow representing the horizontal direction of the aircraft, and the length of the solid straight line represents the horizontal speed of the aircraft, in this embodiment, the positions, relative positions, headings and horizontal speeds of three aircraft are all included in one two-dimensional map;
two-dimensional structured data are constructed from the remaining valid data, wherein each row represents one aircraft and the columns contain, in order, the transponder code identifying the aircraft, the altitude and the vertical speed (a sketch of both network inputs follows this step);
in the embodiment, two kinds of neural networks are used for respectively extracting the empty pipe horizontal plane situation characteristics and the empty pipe aircraft number and height situation characteristics, as shown in fig. 4;
the air traffic control two-dimensional situation map is taken as the input of the convolutional neural network, whose output is the air traffic control horizontal-plane situation feature in one-dimensional vector format;
the two-dimensional structured data are taken as the input of the fully-connected neural network, whose output is the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format.
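A hedged sketch of how the two network inputs might be built for this embodiment: the situation map is rendered with matplotlib as filled circles plus arrow-headed heading/speed lines, and the structured data are packed into a matrix with one row per aircraft. The image extent, speed scaling, dictionary keys and the numeric encoding of the transponder code are illustrative assumptions, not values stated in the invention.

```python
import numpy as np
import matplotlib.pyplot as plt

def draw_situation_map(aircraft, extent_m=100_000.0, speed_scale=60.0, out_path="situation_map.png"):
    """Render the air traffic control two-dimensional situation map.
    `aircraft` is a list of dicts with keys 'x', 'y' (metres), 'heading' (deg) and 'h_speed'."""
    fig, ax = plt.subplots(figsize=(4, 4))
    for a in aircraft:
        ax.plot(a["x"], a["y"], "o", color="black")  # solid circle: horizontal position
        # Solid line from the circle centre; the arrow head gives the horizontal
        # direction and the line length is proportional to the horizontal speed.
        dx = a["h_speed"] * speed_scale * np.sin(np.radians(a["heading"]))
        dy = a["h_speed"] * speed_scale * np.cos(np.radians(a["heading"]))
        ax.annotate("", xy=(a["x"] + dx, a["y"] + dy), xytext=(a["x"], a["y"]),
                    arrowprops=dict(arrowstyle="->", color="black"))
    ax.set_xlim(-extent_m, extent_m)
    ax.set_ylim(-extent_m, extent_m)
    ax.set_aspect("equal")
    ax.axis("off")
    fig.savefig(out_path, dpi=64)
    plt.close(fig)

def build_structured_data(aircraft):
    """One row per aircraft: [transponder code (as a number), altitude, vertical speed]."""
    return np.array([[float(a["squawk"]), a["alt"], a["v_speed"]] for a in aircraft],
                    dtype=np.float32)
```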
(5) Feature fusion processing is performed: the features extracted by the multimode neural network are fused, and the fused air traffic control situation features are output;
in this embodiment, the air traffic control horizontal plane situation features in the one-dimensional vector format output by the convolutional neural network and the number and height situation features in the one-dimensional vector format output by the fully-connected neural network are used as inputs of the neural network, and a fully-connected neural network is used for fusion, as shown in fig. 4. The output of the neural network is the empty pipe situation characteristic in a one-dimensional vector format.
Based on the same inventive concept, the invention also provides an air traffic control situation feature extraction device based on a multimode neural network, which comprises one or more processors and a storage device, wherein the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors implement the method of the invention.
Specifically, for example, the air traffic control situation feature extraction device includes a monitoring data acquisition module, a monitoring data preprocessing module, a two-dimensional attitude data processing module, a structured data processing module, and an air traffic control situation feature fusion output module. Each module can read data from the storage device, and each module has one or more processors.
The above-mentioned embodiments only express one embodiment of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention, and these changes and modifications are all within the scope of the invention. Therefore, the protection scope of the present patent shall be subject to the claims.
The foregoing is illustrative of the preferred embodiments of this invention, and it is to be understood that the invention is not limited to the precise form disclosed herein and that various other combinations, modifications, and environments may be resorted to, falling within the scope of the concept as disclosed herein, either as described above or as apparent to those skilled in the relevant art. And that modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (6)

1. A method for extracting air traffic control situation features based on a multimode neural network is characterized by comprising the following steps:
step one, monitoring data input; acquiring air traffic control monitoring data acquired by air traffic control monitoring equipment;
step two, monitoring data preprocessing; preprocessing the acquired air traffic control monitoring data to obtain preprocessed air traffic control monitoring data;
step three, monitoring data classification; classifying the preprocessed air traffic control monitoring data according to different modes of the neural network, and dividing the preprocessed air traffic control monitoring data into categories consistent with the modes of the neural network to obtain classified air traffic control monitoring data;
step four, data item processing; converting the classified air traffic control monitoring data into input data of a multimode neural network, and respectively extracting features by the multimode neural network;
step five, feature fusion processing; fusing the features extracted by the multimode neural network, and outputting the fused air traffic control situation features.
2. The method according to claim 1, wherein in the second step, the preprocessing of the acquired air traffic control monitoring data comprises deleting invalid repeated points in the air traffic control monitoring data, interpolating to supplement the missing points in the air traffic control monitoring data, and converting the coordinates of the aircraft in the longitude-latitude coordinate system into coordinates in a Cartesian coordinate system.
3. The method for extracting air traffic control situation features based on the multimode neural network according to claim 1, wherein in the third step, the modes of the neural network comprise a convolutional neural network and a fully-connected neural network; the monitoring data are divided into two categories, a two-dimensional attitude data category and a remaining valid data category, wherein the two-dimensional attitude data category comprises longitude, latitude, heading and horizontal speed, and the remaining valid data category comprises aircraft identification number, altitude and vertical speed.
4. The method for extracting air traffic control situation features based on the multimode neural network according to claim 3, wherein in the fourth step, converting the classified air traffic control monitoring data into input data of the multimode neural network and extracting features with the multimode neural network specifically comprises the following steps:
drawing an air traffic control two-dimensional situation map from the two-dimensional attitude data;
constructing two-dimensional structured data from the remaining valid data, wherein each row represents one aircraft and the columns contain, in order, the aircraft identification number, altitude and vertical speed;
taking the air traffic control two-dimensional situation map as the input of the convolutional neural network, whose output is the air traffic control horizontal-plane situation feature in one-dimensional vector format;
taking the two-dimensional structured data as the input of the fully-connected neural network, whose output is the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format.
5. The method for extracting air traffic control situation features based on the multimode neural network according to claim 4, wherein in the fifth step, one neural network is used to fuse the features extracted by the multimode neural network, specifically comprising the following steps:
taking the air traffic control horizontal-plane situation feature in one-dimensional vector format output by the convolutional neural network and the air traffic control aircraft-count and altitude situation feature in one-dimensional vector format output by the fully-connected neural network simultaneously as the input of this neural network;
the output of this neural network is the air traffic control situation feature in one-dimensional vector format.
6. An air traffic control situation feature extraction device applying the method for extracting air traffic control situation features based on the multimode neural network as claimed in any one of claims 1 to 5, characterized by comprising one or more processors and a storage device, wherein the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors are enabled to implement the method as claimed in any one of claims 1 to 5.
CN202211237546.XA 2022-09-30 2022-09-30 Air traffic control situation feature extraction method and device based on multimode neural network Active CN115527397B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211237546.XA CN115527397B (en) 2022-09-30 2022-09-30 Air traffic control situation feature extraction method and device based on multimode neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211237546.XA CN115527397B (en) 2022-09-30 2022-09-30 Air traffic control situation feature extraction method and device based on multimode neural network

Publications (2)

Publication Number Publication Date
CN115527397A 2022-12-27
CN115527397B 2023-06-02

Family

ID=84702320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211237546.XA Active CN115527397B (en) 2022-09-30 2022-09-30 Air traffic control situation feature extraction method and device based on multimode neural network

Country Status (1)

Country Link
CN (1) CN115527397B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105160201A (en) * 2015-09-30 2015-12-16 成都民航空管科技发展有限公司 Genetic algorithm back propagation (GABP) neural network based controller workload prediction method and system
CN110648561A (en) * 2019-11-04 2020-01-03 中国民航大学 Air traffic situation risk measurement method based on double-layer multi-level network model
CN112465199A (en) * 2020-11-18 2021-03-09 南京航空航天大学 Airspace situation evaluation system
CN112489497A (en) * 2020-11-18 2021-03-12 南京航空航天大学 Airspace operation complexity evaluation method based on deep convolutional neural network
CN113611158A (en) * 2021-06-30 2021-11-05 四川大学 Aircraft trajectory prediction and altitude deployment method based on airspace situation
US20220122470A1 (en) * 2020-10-15 2022-04-21 Beihang University 4-dimensional trajectory regulatory decision-making method for air traffic


Also Published As

Publication number Publication date
CN115527397B (en) 2023-06-02

Similar Documents

Publication Publication Date Title
CN103196430B (en) Based on the flight path of unmanned plane and the mapping navigation method and system of visual information
Siddiqui et al. A drone based transmission line components inspection system with deep learning technique
CN111291697A (en) Method and device for recognizing obstacle
CN113129284B (en) Appearance detection method based on 5G cloud edge cooperation and implementation system
AU2020316538A1 (en) Meteorological parameter-based high-speed train positioning method and system in navigation blind zone
CN113050122A (en) Method and system for sensing speed of dynamic obstacle based on convolutional neural network
CN111958595B (en) Multi-sensor asynchronous information fusion system and method for transformer substation inspection robot
CN103575279A (en) Flight path correlating method and system based on fuzzy information
CN112966555A (en) Remote sensing image airplane identification method based on deep learning and component prior
Lapušinskij et al. The application of Hough transform and Canny edge detector methods for the visual detection of cumuliform clouds
CN113942521B (en) Method for identifying style of driver under intelligent vehicle road system
CN116152611A (en) Multistage multi-scale point cloud completion method, system, equipment and storage medium
CN115131246A (en) Method and device for denoising point cloud data, computer equipment and storage medium
CN114241448A (en) Method and device for obtaining heading angle of obstacle, electronic equipment and vehicle
CN114049362A (en) Transform-based point cloud instance segmentation method
CN115527397B (en) Air traffic control situation feature extraction method and device based on multimode neural network
CN114550107A (en) Bridge linkage intelligent inspection method and system based on unmanned aerial vehicle cluster and cloud platform
CN112651986B (en) Environment recognition method, recognition device, recognition system, electronic equipment and medium
CN110826432B (en) Power transmission line identification method based on aviation picture
CN114740901A (en) Unmanned aerial vehicle cluster flight method and system and cloud platform
CN114708616A (en) Obstacle avoidance method, device, equipment and storage medium
Wang et al. Information Extraction of the Vehicle from High-Resolution Remote Sensing Image Based on Convolution Neural Network
CN117392572B (en) Transmission tower bird nest detection method based on unmanned aerial vehicle inspection
CN116883880B (en) Crane identification method and device based on AR technology and electronic equipment
CN116299288A (en) System and method for identifying air target attribute

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant