CN116994463B - Motion state aircraft information unbinding method, electronic equipment and storage medium - Google Patents

Motion state aircraft information unbinding method, electronic equipment and storage medium

Info

Publication number
CN116994463B
Authority
CN
China
Prior art keywords
aircraft
information
target area
standard
flt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311231605.7A
Other languages
Chinese (zh)
Other versions
CN116994463A (en)
Inventor
章凡寿
唐红武
籍焱
王仲候
王殿胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Travelsky Mobile Technology Co Ltd
Original Assignee
China Travelsky Mobile Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Travelsky Mobile Technology Co Ltd filed Critical China Travelsky Mobile Technology Co Ltd
Priority to CN202311231605.7A priority Critical patent/CN116994463B/en
Publication of CN116994463A publication Critical patent/CN116994463A/en
Application granted granted Critical
Publication of CN116994463B publication Critical patent/CN116994463B/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073 Surveillance aids
    • G08G5/0082 Surveillance aids for monitoring traffic from a ground station
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a motion state aircraft information unbinding method, an electronic device and a storage medium, and relates to the field of aircraft information matching. The method comprises the following steps: acquiring the aircraft identifier corresponding to each aircraft in a motion state in the target area data of a target area; acquiring the historical estimated position information corresponding to G_j within a sliding time window T; determining the estimated speed curve SL_j,1 of the aircraft corresponding to G_j according to MG_j; acquiring the historical standard aircraft information corresponding to G_j for each piece of historical target area data within T; determining the true speed curve SL_j,2 of the aircraft corresponding to G_j according to LG_j; and if βSL_j < βSL_0, unbinding the standard aircraft information currently bound to G_j, thereby realizing the unbinding of motion state aircraft information.

Description

Motion state aircraft information unbinding method, electronic equipment and storage medium
Technical Field
The present invention relates to the field of aircraft information matching, and in particular, to a method for unbinding motion state aircraft information, an electronic device, and a storage medium.
Background
With the rapid development of video technology, airports in the civil aviation field are now equipped with cameras that monitor the condition of aircraft in the airport in real time; as long as an aircraft appears in the airport camera monitoring picture, monitoring personnel can clearly see the running state of each aircraft. When a moving aircraft enters the camera monitoring picture, it is usually matched with a piece of corresponding detailed information. However, a departing aircraft and a landing aircraft may appear in the monitoring picture at the same time, and the distance between them may be relatively small; in such cases the detailed information corresponding to an aircraft is easily matched incorrectly, and the matched aircraft information seen by the monitoring personnel in the monitoring picture is erroneous.
Disclosure of Invention
In view of the above technical problem, the present application adopts the following technical solutions:
according to a first aspect of the present application, there is provided a method of unbinding information about a moving aircraft, comprising the steps of:
s100, acquiring an aircraft identifier corresponding to each aircraft in a motion state in target area data of a target area to obtain a motion aircraft identifier set G= (G) 1 ,G 2 ,…,G j ,…,G k ) J=1, 2, …, k; wherein G is j The j-th moving aircraft identification in the target area data of the target area is used, and k is the number of the moving aircraft identifications in the target area data of the target area;
s200, obtaining G in the sliding time window T j Corresponding historical estimated position information to obtain G j Corresponding historical estimated position information set MG j =(MG j,1 ,MG j,2 ,…,MG j,i ,…,MG j,r ) I=1, 2, …, r; wherein, MG j,i Is G in T j Corresponding ith historical estimated position information, r is G in T j The number of corresponding historical estimated position information; MG (media g) j,i =(WG j,i ,tG j,i ),WG j,i To generate MG j,i Time G j Corresponding estimated position coordinates, tG, of the aircraft j,i To generate MG j,i Time of (2); WG (Crystal growth promoting) j,i According to tG j,i Obtaining corresponding historical target area data; the ending time of T is the current time; the historical target area data are target area data of a target area in a preset historical time period;
S300 according to MG j Determining G j Corresponding estimated speed profile SL of an aircraft j,1
S400, obtaining G in T j Historical standard aircraft information corresponding to each historical target area data to obtain G j Corresponding historical standard aircraft information set LG j =(LG j,1 ,LG j,2 ,…,LG j,i1 ,…,LG j,r1 ) I1=1, 2, …, r1; wherein LG (glass fiber reinforced plastic) j,i1 Is G in T j Corresponding i1 th historical standardAircraft information, r1 is G in T j The number of corresponding historical standard aircraft information; LG (light emitting diode) j,i1 =(W’G j,i1 ,t’G j,i1 ),W’G j,i1 To generate LG j,i1 Time G j Corresponding real position coordinates of the aircraft, t' G j,i1 To generate LG j,i1 Time of (2);
s500 according to LG j Determining G j Corresponding true speed profile SL of an aircraft j,2
S600, if beta SL j <βSL 0 Will currently be with G j Unbinding the bound standard aircraft information; wherein, beta SL j Is SL (subscriber line) j,1 With SL (subscriber line) j,2 Is of the order of beta SL 0 Is a preset similarity threshold.
According to another aspect of the present application, there is also provided a non-transitory computer readable storage medium having stored therein at least one instruction or at least one program, the at least one instruction or the at least one program being loaded and executed by a processor to implement the above-described motion state aircraft information unbinding method.
According to another aspect of the present application, there is also provided an electronic device comprising a processor and the above-described non-transitory computer-readable storage medium.
The invention has at least the following beneficial effects:
the invention relates to a moving state aircraft information unbinding method, which is based on G j Each piece of historical estimated position information of the corresponding aircraft image in the target area data of the target area is used for determining an estimated speed curve SL of each aircraft in motion j,1 The method comprises the steps of carrying out a first treatment on the surface of the Meanwhile, according to G j Determining G for each piece of historical standard aircraft information bound by the corresponding aircraft image j Corresponding true speed profile SL of an aircraft j,2 By comparison of SL j,1 With SL (subscriber line) j,2 To determine G j Whether the standard aircraft information currently bound by the corresponding aircraft image is correct or not, so that the wrong standard aircraft information bound by the motion state aircraft image is unbinding.
Further, there are two motion states of the aircraft in the target area, one is a takeoff motion state, in which the speed of the aircraft is gradually increased; the other is a landing motion state in which the speed of the aircraft is gradually reduced; the estimated speed curve can determine whether the aircraft is in a take-off motion state or a landing motion state in the target area, and the real speed curve can determine whether the aircraft corresponding to the standard aircraft information bound by the aircraft identification is in the take-off state or the landing state; therefore, even if the estimated speed is equal to the real speed at a certain moment, if the motion state determined by the estimated speed curve and the real speed curve is different, the standard aircraft information currently bound by the aircraft identifier is not judged to be the correct standard aircraft information, so that the accuracy of judging the standard aircraft information of the motion state aircraft is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for unbinding information of a motion state aircraft according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
It is noted that various aspects of the embodiments are described below within the scope of the following claims. It should be apparent that the aspects described herein may be embodied in a wide variety of forms and that any specific structure and/or function described herein is merely illustrative. Based on the present disclosure, one skilled in the art will appreciate that one aspect described herein may be implemented independently of any other aspect, and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method practiced using any number of the aspects set forth herein. In addition, such apparatus may be implemented and/or such methods practiced using other structure and/or functionality in addition to one or more of the aspects set forth herein.
The motion state aircraft information unbinding method is described below with reference to the flowchart shown in Fig. 1.
The method for unbinding the motion state aircraft information comprises the following steps:
s100, acquiring an aircraft identifier corresponding to each aircraft in a motion state in target area data of a target area to obtain a motion aircraft identifier set G= (G) 1 ,G 2 ,…,G j ,…,G k ) J=1, 2, …, k; wherein G is j For the j-th moving aircraft identification in the target area data of the target area, k is the number of moving aircraft identifications in the target area data of the target area.
In this embodiment, the target area may be understood as an airport, and the target area data is the target area panorama. Because the area of an airport is large, a single camera generally cannot cover the whole airport; therefore, in the present application, the target area panorama is formed by stitching the monitoring pictures of a plurality of cameras, and a person skilled in the art can stitch the video images as needed by using an existing image stitching technique to obtain the target area panorama.
In the target area data of the target area there may be aircraft images corresponding to a plurality of aircraft; a person skilled in the art can recognize each aircraft as needed by using an existing image recognition technique and assign an identifier to each aircraft in the target area panorama. It will be appreciated that the aircraft identifiers are used to distinguish the aircraft from one another, and the aircraft identifiers corresponding to any two aircraft images are different.
Among all the aircraft corresponding to the aircraft identifiers in the target area data of the target area, some are in a stationary state and some are in a motion state. Whether the aircraft corresponding to each aircraft identifier is in a motion state can be judged by determining whether its aircraft image lies within a preset first region of the target area panorama, for example the airport runway region, thereby obtaining G.
S200, acquiring the historical estimated position information corresponding to G_j within a sliding time window T, to obtain the historical estimated position information set MG_j = (MG_j,1, MG_j,2, …, MG_j,i, …, MG_j,r) corresponding to G_j, i = 1, 2, …, r; wherein MG_j,i is the i-th piece of historical estimated position information corresponding to G_j within T, and r is the number of pieces of historical estimated position information corresponding to G_j within T; MG_j,i = (WG_j,i, tG_j,i), where WG_j,i is the estimated position coordinate of the aircraft corresponding to G_j at the time MG_j,i was generated, and tG_j,i is the time at which MG_j,i was generated; WG_j,i is obtained from the historical target area data corresponding to tG_j,i; the end time of T is the current time; the historical target area data is the target area data of the target area within a preset historical time period.
In this embodiment, when the aircraft image corresponding to G_j appears in the target area panorama, the estimated position information of that aircraft image can be determined according to a preset position estimation algorithm; the time interval between the generation times of two adjacent pieces of estimated position information corresponding to the aircraft image of G_j is a preset duration; the estimated position coordinates include longitude and latitude.
S300, determining the estimated speed curve SL_j,1 of the aircraft corresponding to G_j according to MG_j.
In this embodiment, according to MG_j, the estimated distance between the estimated positions corresponding to two adjacent pieces of historical estimated position information can be obtained, and the time interval between their generation times can also be obtained, so that the estimated speed corresponding to the two adjacent pieces of historical estimated position information is determined and the estimated speed curve of the aircraft corresponding to G_j is generated.
S400, acquiring the historical standard aircraft information corresponding to G_j for each piece of historical target area data within T, to obtain the historical standard aircraft information set LG_j = (LG_j,1, LG_j,2, …, LG_j,i1, …, LG_j,r1) corresponding to G_j, i1 = 1, 2, …, r1; wherein LG_j,i1 is the i1-th piece of historical standard aircraft information corresponding to G_j within T, and r1 is the number of pieces of historical standard aircraft information corresponding to G_j within T; LG_j,i1 = (W'G_j,i1, t'G_j,i1), where W'G_j,i1 is the real position coordinate of the aircraft corresponding to G_j at the time LG_j,i1 was generated, and t'G_j,i1 is the time at which LG_j,i1 was generated.
In this embodiment, each aircraft image in the target area panorama is matched with corresponding standard aircraft information, and the standard aircraft information is updated once every preset time interval; each piece of standard aircraft information bound to the aircraft image corresponding to G_j within T is acquired, to obtain the historical standard aircraft information set LG_j corresponding to G_j. The standard aircraft information is the flight information of the corresponding aircraft acquired by the ADS-B system, and is therefore the real flight information of the corresponding aircraft.
S500, determining the true speed curve SL_j,2 of the aircraft corresponding to G_j according to LG_j.
In this embodiment, according to LG_j, the real distance between the real position coordinates corresponding to two adjacent pieces of standard aircraft information can be obtained, and the time interval between their generation times can also be obtained, so that the true speed corresponding to the two adjacent pieces of standard aircraft information is determined and the true speed curve of the aircraft corresponding to G_j is generated.
S600, if βSL_j < βSL_0, unbinding the standard aircraft information currently bound to G_j; wherein βSL_j is the similarity between SL_j,1 and SL_j,2, and βSL_0 is a preset similarity threshold.
In this embodiment, if βSL_j < βSL_0, the standard aircraft information currently bound to the aircraft image corresponding to G_j does not match that aircraft image and needs to be unbound, so that the correct standard aircraft information can be re-determined at a later time.
In the motion state aircraft information unbinding method of this embodiment, the estimated speed curve SL_j,1 of each aircraft in motion is determined according to each piece of historical estimated position information of the aircraft image corresponding to G_j in the target area panorama; meanwhile, the true speed curve SL_j,2 of the aircraft corresponding to G_j is determined according to each piece of historical standard aircraft information bound to the aircraft image corresponding to G_j; by comparing SL_j,1 with SL_j,2, it is determined whether the standard aircraft information currently bound to the aircraft image corresponding to G_j is correct, so that incorrect standard aircraft information bound to a motion state aircraft image is unbound.
Further, there are two motion states of the aircraft in the target area, one is a takeoff motion state, in which the speed of the aircraft is gradually increased; the other is a landing motion state in which the speed of the aircraft is gradually reduced; the estimated speed curve can determine whether the aircraft is in a take-off motion state or a landing motion state in the target area, and the real speed curve can determine whether the aircraft corresponding to the standard aircraft information bound by the aircraft image is in the take-off state or the landing state; therefore, even if the estimated speed is equal to the real speed at a certain moment, if the motion state determined by the estimated speed curve and the real speed curve is different, the standard aircraft information currently bound with the aircraft image is not judged to be the correct standard aircraft information, so that the accuracy of judging the standard aircraft information of the motion state aircraft is improved.
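By way of illustration, the following sketch shows how the S600 decision could be carried out in Python. The patent does not fix how the similarity βSL_j between SL_j,1 and SL_j,2 is computed, so the measure below (resampling both curves onto a common time grid and mapping the mean absolute speed difference into (0, 1]) is only an assumption, as are the function names.

```python
import numpy as np

def curve_similarity(sl_est, sl_true, n_samples=50):
    """sl_est, sl_true: lists of (t, speed) points sorted by time.
    Returns an assumed similarity measure in (0, 1]."""
    t_est, v_est = zip(*sl_est)
    t_true, v_true = zip(*sl_true)
    t_lo = max(min(t_est), min(t_true))
    t_hi = min(max(t_est), max(t_true))
    grid = np.linspace(t_lo, t_hi, n_samples)        # common time grid over the overlap
    v1 = np.interp(grid, t_est, v_est)
    v2 = np.interp(grid, t_true, v_true)
    return 1.0 / (1.0 + float(np.mean(np.abs(v1 - v2))))   # βSL_j (assumed form)

def maybe_unbind(sl_est, sl_true, beta_sl_0, unbind_fn):
    """Step S600: unbind the currently bound standard aircraft information
    when the similarity falls below the preset threshold βSL_0."""
    if curve_similarity(sl_est, sl_true) < beta_sl_0:
        unbind_fn()
```

Because the similarity is computed over the whole window T rather than at a single instant, a take-off curve and a landing curve compare poorly even where they momentarily cross, which is the behaviour described above.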
Optionally, step S100 includes the steps of:
s110, determining a first area in the target area according to preset conditions; wherein the aircraft located in the first region is in motion.
In this embodiment, the target area is any airport, and a runway region and stand regions are preset in the airport; the first region corresponds to the non-stand area of the airport. It will be appreciated that an aircraft is theoretically in a motion state when it is not at a stand.
S120, acquiring the target position information corresponding to each aircraft identifier, to obtain a target position information set HG = (HG_1, HG_2, …, HG_j, …, HG_k); wherein HG_j is the target position information of the aircraft corresponding to G_j in the target area data.
In this embodiment, the target position information corresponding to the aircraft identifier is central pixel position information of an aircraft image corresponding to the aircraft identifier, and the central pixel position information of the aircraft image may be obtained by the following method:
s121, acquiring surrounding frame identifiers of surrounding frames corresponding to aircraft images corresponding to each aircraft identifier in the target area panorama corresponding to the target area data.
In this embodiment, a bounding box corresponding to each aircraft image can be set in the panoramic view of the target area, for example, a rectangular box corresponding to each aircraft image is set so that the corresponding aircraft image is within the rectangular box, and it is understood that each bounding box can represent the corresponding aircraft image.
S122, acquiring a central pixel point of a bounding box of each aircraft image.
In this embodiment, if the bounding box is a rectangular box, an intersection point of two diagonal lines of the rectangular box may be used as a center pixel point of the bounding box; if the bounding boxes are irregularly shaped, a center pixel point may be preset for each bounding box.
S130, if the position corresponding to HG_j is located in the first region, determining the aircraft identifier corresponding to HG_j as a moving aircraft identifier.
In this embodiment, when the center pixel point of any bounding box is located in the first region box, it indicates that the aircraft corresponding to the bounding box is located in the first region, that is, the aircraft is in a motion state.
In this embodiment, because the area occupied by an aircraft image in the target area panorama is relatively large, the aircraft image may lie in the first region frame and outside it at the same time, in which case it is difficult to determine the state of the corresponding aircraft; by instead judging whether the center pixel point of the bounding box lies within the first region frame, the motion state of the aircraft corresponding to the aircraft image becomes simpler to judge.
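A minimal sketch of steps S110 to S130, assuming axis-aligned rectangular bounding boxes and a rectangular first region; any polygon containment test could be substituted for the rectangle check:

```python
def bbox_center(x_min, y_min, x_max, y_max):
    """Center pixel of a rectangular bounding box (intersection of its diagonals)."""
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

def moving_aircraft_ids(bboxes, first_region):
    """bboxes: {aircraft_id: (x_min, y_min, x_max, y_max)} in the panorama.
    first_region: (rx_min, ry_min, rx_max, ry_max). Returns the set G."""
    rx_min, ry_min, rx_max, ry_max = first_region
    g = []
    for aircraft_id, box in bboxes.items():
        cx, cy = bbox_center(*box)                 # HG_j: target position information
        if rx_min <= cx <= rx_max and ry_min <= cy <= ry_max:
            g.append(aircraft_id)                  # center inside first region => moving
    return g
```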
Optionally, step S300 includes the steps of:
s310, obtaining MG j The estimated distance between the estimated position coordinates corresponding to any two adjacent historical estimated position information in the range to obtain an estimated distance list BG j =(BG j,1 ,BG j,2 ,…,BG j,i2 ,…,BG j,r-1 ) I2=1, 2, …, r-1; wherein BG j,i2 For WG of j,i2 With WG j,i2+1 A predicted distance between the two.
In this embodiment, the estimated position coordinates include longitude and latitude, and BG_j,i2 can be determined from the longitude and latitude corresponding to WG_j,i2 and to WG_j,i2+1 respectively. It should be noted that a person skilled in the art can, as actually needed, determine the estimated distance between WG_j,i2 and WG_j,i2+1 by using an existing method for determining the distance between two points from their respective longitudes and latitudes, which is not described in detail herein.
S320, obtaining the time interval between the times at which any two adjacent pieces of historical estimated position information in MG_j were generated, to obtain a time interval list TG_j = (TG_j,1, TG_j,2, …, TG_j,i2, …, TG_j,r-1); wherein TG_j,i2 is the time interval between tG_j,i2 and tG_j,i2+1.
In this embodiment, each piece of estimated position information has a corresponding generation time; any two time intervals in TG_j may be equal or unequal.
S330, determining the estimated speed list VG_j = (VG_j,1, VG_j,2, …, VG_j,i2, …, VG_j,r-1) corresponding to MG_j according to BG_j and TG_j; wherein VG_j,i2 is the estimated speed corresponding to TG_j,i2, VG_j,i2 = BG_j,i2 / TG_j,i2.
S340, determining the estimated speed curve SL_j,1 of the aircraft corresponding to G_j according to VG_j and MG_j; wherein the abscissa of SL_j,1 is the time at which each piece of historical estimated position information in MG_j was generated, and the ordinate is the estimated speed corresponding to each time interval in TG_j.
In this embodiment, there are two motion states of the aircraft in the panoramic view of the target area, one is a takeoff motion state, in which the speed of the aircraft is gradually increased; the other is a landing motion state in which the speed of the aircraft is gradually reduced; the estimated speed curve can determine the estimated speed of the aircraft at a certain moment, and can also determine whether the aircraft is in a take-off motion state or a landing motion state in a target area, so that the accuracy of determining the motion state aircraft information is improved.
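The following sketch illustrates steps S310 to S340. The patent leaves the distance computation to existing longitude/latitude methods; the haversine formula used here is one common choice and, like the function names, is an assumption of this sketch:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2, earth_radius_m=6371000.0):
    """Approximate ground distance in metres between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * earth_radius_m * asin(sqrt(a))

def estimated_speed_curve(mg_j):
    """mg_j: list of ((lat, lon), t_seconds) = (WG_j,i, tG_j,i), ordered by time.
    Returns SL_j,1 as a list of (t, speed_m_per_s) points, one per interval."""
    curve = []
    for (w1, t1), (w2, t2) in zip(mg_j, mg_j[1:]):
        bg = haversine_m(w1[0], w1[1], w2[0], w2[1])   # BG_j,i2
        tg = t2 - t1                                   # TG_j,i2
        if tg > 0:
            curve.append((t2, bg / tg))                # VG_j,i2 = BG_j,i2 / TG_j,i2
    return curve
```

The resulting list of (time, speed) points is the estimated speed curve SL_j,1; the true speed curve SL_j,2 can be built in the same way from the real position coordinates and generation times in LG_j.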
Optionally, the method further comprises the steps of:
s010, obtaining a standard aircraft information list Lt= (Lt) which corresponds to the current time point t and is not bound by the aircraft 1 ,Lt 2 ,…,Lt i3 ,…,Lt y(t) ) I3=1, 2, …, y (t); wherein Lt is i3 Is the i3 rd unbound aircraft within LtStandard aircraft information, y (t) is the number of standard aircraft information corresponding to t and not bound by the aircraft; lt (Lt) i3 Including the corresponding real position coordinates of the aircraft.
In this embodiment, in a specific airport scene, the flight information of an aircraft is sent to the background management system corresponding to the airport before the aircraft enters the airport, for example the flight information of inbound aircraft received by the ADS-B system; this information is the standard aircraft information. After any piece of standard aircraft information is bound to its corresponding aircraft identifier, its second flag bit is set to the bound state; by acquiring the standard aircraft information whose second flag bit is not in the bound state, Lt can be obtained.
It should be noted that, the standard aircraft information includes the real position coordinates of the corresponding aircraft, and the real position coordinates include the longitude and latitude where the corresponding aircraft is currently located.
S020, obtaining the motion state judgment vector corresponding to each piece of standard aircraft information in Lt that is not bound to an aircraft, to obtain the motion state judgment vector set FLt = (FLt_1, FLt_2, …, FLt_i3, …, FLt_y(t)) corresponding to Lt; wherein FLt_i3 is the motion state judgment vector corresponding to Lt_i3; FLt_i3 = (FLt_i3,1, FLt_i3,2, FLt_i3,3), where FLt_i3,1 is a position change flag, FLt_i3,2 is an air-ground state change flag, and FLt_i3,3 is a horizontal speed flag.
In this embodiment, each piece of standard aircraft information in Lt is updated at a certain time interval, for example an update is requested from the ADS-B system every 3 seconds, and the newly acquired data overwrites the corresponding original data. For an aircraft in a motion state, the real position coordinates and horizontal speed will in theory change, and the air-ground state may change; for an aircraft in a stationary state, the real position coordinates, horizontal speed and air-ground state do not change. On this basis, a three-dimensional vector corresponding to the standard aircraft information is established: the state of the position change flag indicates whether the real position coordinates have changed, the state of the air-ground flag indicates whether the air-ground state has changed, and the horizontal speed flag indicates whether the horizontal speed is zero.
It should be noted that, the position change and the horizontal velocity can be obtained through the real position coordinates corresponding to the standard aircraft information acquired in two adjacent times, and the air-ground state can be directly acquired through the ADS-B system.
S030, if the states of FLt_i3,1, FLt_i3,2 and FLt_i3,3 within FLt_i3 are all the first state, determining Lt_i3 as stationary standard aircraft information corresponding to a stationary aircraft; otherwise, determining Lt_i3 as motion standard aircraft information corresponding to an aircraft in a motion state.
In this embodiment, when the state of FLt_i3,1, the state of FLt_i3,2 and the state of FLt_i3,3 within FLt_i3 are all the first state, the real position coordinates corresponding to FLt_i3 are unchanged, the air-ground state is unchanged and the horizontal speed is zero, that is, the aircraft corresponding to FLt_i3 is in a stationary state; if the states of FLt_i3,1, FLt_i3,2 and FLt_i3,3 are not all the first state, the aircraft corresponding to FLt_i3 is determined to be in a motion state.
S040, acquiring each piece of motion standard aircraft information in Lt, to obtain a motion standard aircraft information list.
In this embodiment, all the motion standard aircraft information is acquired, so that when standard aircraft information is matched to an aircraft image in a motion state, only the motion standard aircraft information list needs to be searched; on the one hand this reduces the amount of calculation during matching, and on the other hand it prevents stationary standard aircraft information from being matched to an aircraft image in a motion state, thereby improving the accuracy of matching motion state aircraft information.
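An illustrative sketch of steps S010 to S040 is given below. The ADS-B record layout and field names are assumptions; the text only requires that position change, air-ground state change and horizontal speed be observable between two consecutive updates:

```python
def judgment_vector(prev, curr):
    """prev, curr: two consecutive updates of one standard aircraft record,
    each a dict with keys 'lat', 'lon', 'on_ground', 'h_speed' (assumed names)."""
    position_changed = (prev['lat'], prev['lon']) != (curr['lat'], curr['lon'])
    air_ground_changed = prev['on_ground'] != curr['on_ground']
    speed_nonzero = curr['h_speed'] != 0
    return (position_changed, air_ground_changed, speed_nonzero)   # FLt_i3

def split_by_motion_state(records):
    """records: {flight_id: (prev_update, curr_update)} for the unbound list Lt.
    Returns the motion standard aircraft information list (S040) and the stationary one."""
    moving, stationary = [], []
    for flight_id, (prev, curr) in records.items():
        flt = judgment_vector(prev, curr)
        # all three flags in the 'first state' (no change, zero speed) => stationary
        (stationary if flt == (False, False, False) else moving).append(flight_id)
    return moving, stationary
```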
In an exemplary embodiment, an aircraft information unbinding method is provided, the method comprising the steps of:
s700, obtaining the target areaThe current target area panorama determines a number of aircraft identifications of the bound standard aircraft information to obtain a second set of aircraft identifications a '= (a' 1 ,A’ 2 ,…,A’ m1 ,…, A’ n1 ) M1=1, 2, …, n1; wherein A 'is' m1 And (3) the aircraft identification of the m1 st bound standard aircraft information in the current target area panorama of the target area, wherein n1 is the number of the aircraft identifications of the bound standard aircraft information in the current target area panorama of the target area.
In this embodiment, in the current target area panorama of the target area, a part of the aircraft identifications are bound with standard aircraft information, and another part of the aircraft identifications are not bound with standard aircraft information, so as to obtain the aircraft identifications of each bound standard aircraft information, and obtain a'.
S710, acquiring the aircraft identification corresponding to each aircraft in the static state from the A' and determining the aircraft identification as the static aircraft identification to obtain a static aircraft identification set Z= (Z) 1 ,Z 2 ,…,Z j1 ,...,Z k1 ) J1=1, 2, …, k1; wherein Z is j1 For the j1 st stationary aircraft identification in the target area data, k1 is the number of stationary aircraft identifications in Z.
In this embodiment, each aircraft identifier in a' corresponds to an aircraft, one part is in a moving state, and the other part is in a stationary state; acquiring an aircraft identifier corresponding to each aircraft in a static state, and obtaining Z; in this embodiment, the current target area panorama of the target area and the target area panorama of the previous time may be compared to determine the aircraft identifier in the static state in the current target area panorama of the target area.
S720, obtaining the sub-region data KZ_j1 corresponding to the sub-region in which the aircraft corresponding to Z_j1 is parked in the target area; the sub-region is a region of preset size within the target area.
In this embodiment, the sub-regions are preset stand positions in the airport, each sub-region is provided with a unique identifier, and the sub-region data includes identifiers corresponding to the sub-regions.
S730, determining the standard aircraft information corresponding to KZ_j1 from the standard aircraft information list corresponding to the current time point t, according to KZ_j1.
In this embodiment, each stand at the current time corresponds to an aircraft planned to be parked on it, that is, each sub-region on which an aircraft is planned to be parked at the current time corresponds to a piece of standard aircraft information in the background; the standard aircraft information list corresponding to the current time point t is obtained from the background through KZ_j1, to determine the standard aircraft information corresponding to KZ_j1.
S740, if the standard aircraft information corresponding to KZ_j1 and the standard aircraft information currently bound to Z_j1 do not meet a preset second matching condition, unbinding the standard aircraft information currently bound to Z_j1.
In this embodiment, Z_j1 is bound to a piece of standard aircraft information in the target area panorama, and the sub-region on which the aircraft corresponding to Z_j1 is parked corresponds to a piece of standard aircraft information in the background; the sub-region on which the aircraft corresponding to Z_j1 is parked is generally correct, so if the standard aircraft information bound to Z_j1 is correct, it should be the same as the standard aircraft information corresponding to KZ_j1; otherwise, it can be determined that the standard aircraft information currently bound to Z_j1 is wrong and needs to be unbound so that the correct standard aircraft information can be rebound.
In this embodiment, whether the current binding aircraft information of the aircraft identifier is correct or not is verified through the standard aircraft information corresponding to the sub-region where the aircraft corresponding to the aircraft identifier is parked, and because the standard aircraft information corresponding to the sub-region is correct, the accuracy of the verification result of the current binding aircraft information of the aircraft identifier can be ensured.
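A minimal sketch of the stand-based check in steps S700 to S740, assuming the bindings, parking assignments and stand schedule are available as dictionaries and that the second matching condition is simple equality of the associated flight records:

```python
def verify_stationary_bindings(bound_info, parked_stand, stand_schedule, unbind_fn):
    """bound_info:  {aircraft_id: flight_no currently bound in the panorama}
    parked_stand:   {aircraft_id: stand_id the aircraft is parked on (KZ_j1)}
    stand_schedule: {stand_id: flight_no planned for that stand at time t}
    unbind_fn(aircraft_id): callback that clears the binding."""
    for aircraft_id, bound_flight in bound_info.items():
        stand_id = parked_stand.get(aircraft_id)
        expected = stand_schedule.get(stand_id)
        if expected is not None and expected != bound_flight:
            unbind_fn(aircraft_id)     # S740: second matching condition not met
```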
Optionally, the method further comprises the steps of:
s800, determining an aircraft identification of each unbound standard aircraft information from a current target area panorama of the target area to obtain a first aircraft identification set A= (A) 1 ,A 2 ,…,A m ,…,A n ) M=1, 2, …, n; wherein A is m And (3) identifying the aircraft with the m-th unbound standard aircraft information in the current target area panorama of the target area, wherein n is the number of the aircraft identifications with the unbound standard aircraft information in the current target area panorama of the target area.
In this embodiment, the target area may be understood as an airport, and the panoramic view of the target area is a panoramic view of the airport, and because the area of the airport is large, it is generally difficult for a single camera to cover the whole airport completely; therefore, in the application, the target area panorama is formed by splicing the monitoring pictures of a plurality of cameras, and a person skilled in the art can splice a plurality of video pictures by adopting the existing image splicing technology according to the need to obtain the target area panorama.
In the current target area panorama of the target area, there are a plurality of aircraft images corresponding to the aircraft, and a person skilled in the art can identify each aircraft by adopting the existing image identification technology according to the need, and perform identification setting on each aircraft in the target area panorama, wherein it is understood that the aircraft identifications are used for distinguishing each aircraft, and the aircraft identifications corresponding to any two aircraft images are different; in the current target area panorama of the target area, a part of aircraft identifications are bound with standard aircraft information, another part of aircraft identifications are not bound with the standard aircraft information, and the aircraft identifications of each unbound standard aircraft information are obtained to obtain A.
S810, obtaining the state of the first flag bit lab1_A_m corresponding to A_m; wherein lab1_A_m is used to record whether A_m has unbound standard aircraft information that was previously bound to it.
In this embodiment, the position coordinates of any aircraft in the target area are estimated by the position of the corresponding aircraft image in the top view of the target area, and when estimating the position coordinates of the aircraft in the target area, the position coordinates are based on a static target area panorama, in which it is possible that a certain aircraft passes through a sub-area, is not parked in the sub-area, and is bound with a piece of standard aircraft information by mistake.
Based on the above scenario, the standard aircraft information bound to an aircraft identifier in a stationary state in the current target area panorama of the target area can be verified by the method of steps S700 to S740, so there may be aircraft identifiers without bound standard aircraft information that have unbound standard aircraft information previously bound to them; a corresponding first flag bit is therefore set for each aircraft identifier to record whether the corresponding aircraft identifier has unbound previously bound standard aircraft information.
S820, if the state of lab1_A_m is the unbound state, acquiring the sub-region data KA_m corresponding to the sub-region in which the aircraft corresponding to A_m is parked in the target area.
In this embodiment, if lab1_A_m is in the unbound state, A_m has unbound the standard aircraft information that was previously bound; in this case, because the aircraft corresponding to A_m is in a stationary state, that is, parked in the corresponding sub-region, the sub-region data KA_m corresponding to the sub-region in which the aircraft corresponding to A_m is parked in the target area can be acquired.
S830, determining the standard aircraft information corresponding to KA_m from the standard aircraft information list corresponding to the current time point t, according to KA_m.
S840, binding the standard aircraft information corresponding to KA_m with A_m.
In this embodiment, because the standard aircraft information corresponding to a sub-region is correct, binding the standard aircraft information corresponding to KA_m with A_m ensures that the aircraft information currently bound to A_m is correct, thereby further improving the accuracy of aircraft information matching.
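The rebinding flow of steps S800 to S840 could look like the sketch below; the data shapes and the callback are assumptions for illustration:

```python
def rebind_from_stand(unbound_ids, lab1, parked_stand, stand_schedule, bind_fn):
    """unbound_ids: identifiers with no bound standard aircraft information (set A).
    lab1:           {aircraft_id: True if a previous binding was unbound (lab1_A_m)}
    parked_stand:   {aircraft_id: stand_id (KA_m)}
    stand_schedule: {stand_id: standard aircraft information at time t}
    bind_fn(aircraft_id, info): callback that records the binding."""
    for aircraft_id in unbound_ids:
        if lab1.get(aircraft_id) and aircraft_id in parked_stand:
            info = stand_schedule.get(parked_stand[aircraft_id])
            if info is not None:
                bind_fn(aircraft_id, info)     # S840
```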
In an exemplary embodiment, an aircraft information matching method is provided, the method comprising the steps of:
s910, determining an aircraft identification of each unbound standard aircraft information from a current target area panorama of the target area to obtain a first aircraft identification set A= (A) 1 ,A 2 ,…,A m ,…,A n ) M=1, 2, …, n; wherein A is m And (3) identifying the aircraft with the m-th unbound standard aircraft information in the current target area panorama of the target area, wherein n is the number of the aircraft identifications with the unbound standard aircraft information in the current target area panorama of the target area.
In this embodiment, the target area may be understood as an airport, and the panoramic view of the target area is a panoramic view of the airport, and because the area of the airport is large, it is generally difficult for a single camera to cover the whole airport completely; therefore, in the application, the target area panorama is formed by splicing the monitoring pictures of a plurality of cameras, and a person skilled in the art can splice a plurality of video pictures by adopting the existing image splicing technology according to the need to obtain the target area panorama.
In the current target area panorama of the target area, there are a plurality of aircraft images corresponding to the aircraft, and a person skilled in the art can identify each aircraft by adopting the existing image identification technology according to the need, and perform identification setting on each aircraft in the target area panorama, wherein it is understood that the aircraft identifications are used for distinguishing each aircraft, and the aircraft identifications corresponding to any two aircraft images are different; in the current target area panorama of the target area, a part of aircraft identifications are bound with standard aircraft information, another part of aircraft identifications are not bound with the standard aircraft information, and the aircraft identifications of each unbound standard aircraft information are obtained to obtain A.
S920, mapping the aircraft image corresponding to A_m into a preset target area top view corresponding to the target area through a preset mapping algorithm; wherein a plurality of reference points are arranged in the preset target area top view, and each reference point corresponds to a real position coordinate.
In this embodiment, the target area panorama can be converted into the target area top view according to a preset mapping algorithm, for example a perspective transformation matrix; a plurality of reference points are arranged uniformly in the target area top view, each reference point being a pixel point. For example, the target area top view is a top view of an airport, and a plurality of reference points are arranged uniformly on the runway and outside the stands in the airport top view. Each reference point is arranged at a preset position in the target area top view, and by comparing the target area top view with an existing map, for example Google Maps, the real position coordinates of each reference point in the target area top view, namely its longitude and latitude, can be obtained. In this embodiment, the target area panorama is converted only once to obtain the target area top view; in the subsequent processing, for example during subsequent video monitoring, when an aircraft enters the target area panorama only the aircraft image is mapped into the target area top view, and because the image area of the aircraft image is small compared with the whole target area panorama, the computing power required for image conversion can be greatly reduced and the efficiency of aircraft information matching improved.
And then mapping the aircraft image corresponding to each aircraft identifier in the A to a target area top view through the same preset mapping algorithm so as to facilitate the estimation of the position of the subsequent aircraft.
S930, acquiring, in the coordinate system corresponding to the target area top view, the position information of the first pixel point B_m,1 that is nearest to the center pixel point of the aircraft image corresponding to A_m in the target area top view, and of the second pixel point B_m,2 that is second nearest to the center pixel point of the aircraft image corresponding to A_m in the target area top view.
In this embodiment, a coordinate system corresponding to the target area top view can be established, in which the position information of the first pixel point B_m,1 nearest to the center pixel point of the aircraft image corresponding to A_m and of the second pixel point B_m,2 second nearest to that center pixel point can be obtained. The center pixel point of the aircraft image corresponding to A_m can be determined as follows: first, the rectangular frame corresponding to A_m is set so that the aircraft image corresponding to A_m lies entirely within it, with the edge pixel points of the aircraft image touching the rectangular frame; then the pixel coordinates of the four vertices of the rectangular frame are obtained, and from them the position coordinates of the center pixel point of the aircraft image corresponding to A_m are determined.
S940, determining the estimated position coordinates of the aircraft corresponding to A_m according to the pixel distance between B_m,1 and the center pixel point of the aircraft image corresponding to A_m, the pixel distance between B_m,2 and that center pixel point, and the real position coordinates corresponding to B_m,1 and B_m,2, to obtain an estimated position coordinate set WA = (WA_1, WA_2, …, WA_m, …, WA_n), wherein WA_m is the estimated position coordinate of the aircraft corresponding to A_m.
In this embodiment, for the aircraft image corresponding to A_m in the target area top view, the weights corresponding to B_m,1 and B_m,2 can be set according to the relative positional relationship, determined by the pixel distances, between the center pixel point of the aircraft image corresponding to A_m and B_m,1 and B_m,2; the estimated position coordinates of the aircraft corresponding to A_m are then predicted in combination with the real position coordinates corresponding to B_m,1 and B_m,2. It will be appreciated that the estimated position coordinates of the aircraft corresponding to A_m are position coordinates estimated from the real position coordinates of the two reference points in the target area top view, not real position coordinates.
S950, obtaining the standard aircraft information list Wt = (Wt_1, Wt_2, …, Wt_i, …, Wt_f(t)) corresponding to the current time point t, i = 1, 2, …, f(t); wherein Wt_i is the i-th piece of standard aircraft information in the standard aircraft information list corresponding to t, and f(t) is the number of pieces of standard aircraft information corresponding to t; Wt_i includes the real position coordinates of the corresponding i-th aircraft.
In this embodiment, in a specific airport scene, before an aircraft enters an airport, its flight information, i.e., standard aircraft information, is already sent to a background management system corresponding to the airport, for example, the flight information of the aircraft to be entered, which is received based on the ADS-B system, is the standard aircraft information; it should be noted that, the standard aircraft information includes the corresponding real position coordinates of the aircraft, and the real position coordinates include the longitude and latitude where the corresponding aircraft is currently located; each piece of aircraft information corresponding to the current time point t can be acquired to obtain Wt.
S960, binding, with A_m, the standard aircraft information in Wt that meets a preset first matching condition with WA_m.
The real position coordinates corresponding to each piece of standard aircraft information in Wt are matched against WA_m; if the real position coordinates corresponding to a certain piece of standard aircraft information in Wt and WA_m meet the preset first matching condition, for example the distance between them is within a preset range, the standard aircraft information is taken as the standard aircraft information of the aircraft corresponding to A_m and is bound with A_m, thereby completing the matching of the standard aircraft information corresponding to A_m.
In the aircraft information matching method of this embodiment, the aircraft identifier of each piece of unbound standard aircraft information in the target area panorama is obtained, and the aircraft image corresponding to A_m is mapped into a preset target area top view in which a plurality of reference points are preset, each reference point corresponding to a real position coordinate; the first reference point B_m,1 nearest to the center pixel point of the aircraft image corresponding to A_m in the target area top view and the second-nearest reference point B_m,2, together with their position information, are acquired; the estimated position coordinates of the aircraft corresponding to A_m are thereby determined according to the real position coordinates corresponding to B_m,1 and B_m,2; the estimated position coordinates of the aircraft corresponding to A_m are matched against the real position coordinates contained in each piece of standard aircraft information in the standard aircraft information list, and the standard aircraft information in Wt that meets the preset first matching condition with WA_m is bound with A_m, thereby realizing the matching of the aircraft information.
Furthermore, the invention converts the target area panorama only once to obtain the target area top view; in the subsequent processing only the aircraft image corresponding to an aircraft identifier without bound standard aircraft information is converted, and because the image area of the aircraft image is small compared with the whole target area panorama, the computing power required for image conversion is greatly reduced and the efficiency of aircraft information matching is improved.
Optionally, the real position coordinates include latitude and longitude, and WA_m is determined by the following steps:
S941, obtaining the first pixel distance d1 between B_m,1 and the center pixel point of the aircraft image corresponding to A_m, and the second pixel distance d2 between B_m,2 and the center pixel point of the aircraft image corresponding to A_m.
S942, determining the first weight α1 = d2/(d1+d2) and the second weight α2 = d1/(d1+d2) according to d1 and d2.
In this embodiment, after the target area panorama is converted into the target area top view, the image in the top view may have some distortion and is not a standard top view; at this time, if the estimated position coordinates of the aircraft corresponding to A_m were determined with reference to a single reference point in the target area top view, the error of the determined estimated position coordinates would be relatively large. Therefore, in this embodiment the two reference points B_m,1 and B_m,2 are selected and their respective weights are determined according to the ratio of the pixel distances between the center pixel point of the aircraft image corresponding to A_m and the pixel points corresponding to the two reference points: the nearer reference point is given the larger weight and the farther reference point the smaller weight.
S943, determining WA_m = (latA_m, lonA_m) according to α1, α2, the latitude latB_m,1 and longitude lonB_m,1 corresponding to B_m,1, and the latitude latB_m,2 and longitude lonB_m,2 corresponding to B_m,2: latA_m = α1 × latB_m,1 + α2 × latB_m,2, lonA_m = α1 × lonB_m,1 + α2 × lonB_m,2; wherein latA_m is the latitude corresponding to WA_m and lonA_m is the longitude corresponding to WA_m.
In this embodiment, the latitude and longitude of WA_m are calculated by weighting, with the nearer reference point given the larger weight and the farther reference point the smaller weight; this weakens the effect of distortion in the target area top view and improves the accuracy with which WA_m is determined.
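The weighting of steps S941 to S943 is sketched below; how the two nearest reference points B_m,1 and B_m,2 are found is not prescribed, so the plain nearest-two scan over pixel distances is an assumption:

```python
from math import hypot

def estimate_position(center_px, reference_points):
    """center_px: (x, y) of the aircraft image center in the top view.
    reference_points: list of ((x, y), (lat, lon)). Returns (latA_m, lonA_m)."""
    ranked = sorted(reference_points,
                    key=lambda rp: hypot(rp[0][0] - center_px[0], rp[0][1] - center_px[1]))
    (p1, (lat1, lon1)), (p2, (lat2, lon2)) = ranked[0], ranked[1]     # B_m,1, B_m,2
    d1 = hypot(p1[0] - center_px[0], p1[1] - center_px[1])
    d2 = hypot(p2[0] - center_px[0], p2[1] - center_px[1])
    a1, a2 = d2 / (d1 + d2), d1 / (d1 + d2)        # α1, α2: nearer point weighs more
    return a1 * lat1 + a2 * lat2, a1 * lon1 + a2 * lon2
```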
Optionally, step S960 includes the steps of:
s961, obtain Wt i Corresponding latitude latWt i Longitude lonWt i
S962 if |latA m -latWt i I < eta 1 and I lon A m -lonWt i I < eta 2, then Wt i As A m Corresponding first position coordinates; wherein, eta 1 is a preset latitude difference threshold value, eta 2 is a preset longitude differenceA threshold value.
In the present embodiment, if |latA m -latWt i I < eta 1 and I lon A m -lonWt i I < eta 2, denote Wt i Corresponding real position coordinates and A m The distance between the estimated position coordinates of the corresponding aircraft is relatively short, and the estimated position coordinates are basically the same position coordinates, so that Wt can be judged i The corresponding real position coordinate is A m Corresponding real position coordinates of the aircraft; due to A m The estimated position coordinates of the corresponding aircraft are estimated position coordinates, and a certain error exists between the estimated position coordinates and the corresponding real position coordinates, so that the setting of eta 1 and eta 2 in the embodiment can weaken A m The corresponding error of the estimated position coordinates ensures that the generalization capability is stronger when the position coordinates are matched, and the standard aircraft information is easier to be matched; in this embodiment, η1 ranges from 0.0003 to 0.0005 and η2 ranges from 0.0002 to 0.004.
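A sketch of the threshold test in steps S961 and S962; the record layout is an assumption, and the default thresholds are simply values chosen from within the η1 and η2 ranges stated above:

```python
def match_standard_info(wa_m, standard_list, eta1=0.0004, eta2=0.0003):
    """wa_m: (latA_m, lonA_m). standard_list: list of dicts with 'lat' and 'lon'.
    Returns the first matching standard aircraft record, or None."""
    lat_a, lon_a = wa_m
    for wt in standard_list:
        if abs(lat_a - wt['lat']) < eta1 and abs(lon_a - wt['lon']) < eta2:
            return wt        # first matching condition satisfied => bind to A_m
    return None
```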
Optionally, the center pixel point of the aircraft image corresponding to A_m is obtained by the following steps:
S931, setting the bounding box corresponding to A_m in the target area panorama, so that the aircraft image corresponding to A_m lies within the bounding box corresponding to A_m.
In this embodiment, the bounding box may be a rectangular box, so that the pixel points on the outermost side of the aircraft image are in contact with four sides of the rectangular box.
S932, obtaining the center pixel point QA_m of the bounding box corresponding to A_m in the target area panorama.
In this embodiment, a person skilled in the art can obtain the center pixel point QA_m of the bounding box corresponding to A_m in the target area panorama according to an existing method for determining the center pixel point of a bounding box, which is not described in detail herein.
S933, mapping QA_m into the target area top view through a preset perspective transformation matrix, to obtain the mapped pixel point corresponding to QA_m in the target area top view.
It should be noted that a person skilled in the art can, as needed, map QA_m into the target area top view by using an existing perspective transformation matrix to obtain the mapped pixel point corresponding to QA_m in the target area top view, which is not described in detail herein.
S934, taking the mapped pixel point corresponding to QA_m in the target area top view as the center pixel point of the aircraft image corresponding to A_m.
In the present embodiment, QA is only required m Mapping into the target area plan view is unnecessary, and the whole aircraft image is not required to be mapped into the target area plan view, so that the calculated amount can be greatly reduced, and the execution efficiency can be improved.
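A minimal sketch of S931–S934, assuming OpenCV is used for the mapping (the patent does not name a library) and that the 3×3 perspective transformation matrix from panorama to top view has been calibrated beforehand; names are illustrative.
```python
import numpy as np
import cv2

def center_in_top_view(bbox, homography):
    """Map the center of an aircraft bounding box from the panorama
    into the target-area top view.

    bbox:       (x_min, y_min, x_max, y_max) in panorama pixel coordinates.
    homography: 3x3 perspective transformation matrix (panorama -> top view).
    """
    x_min, y_min, x_max, y_max = bbox
    cx, cy = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0   # QA_m
    pts = np.array([[[cx, cy]]], dtype=np.float32)          # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(pts, homography)      # map only the center point
    return float(mapped[0, 0, 0]), float(mapped[0, 0, 1])
```
Mapping only the single center point keeps the per-aircraft cost constant regardless of how many pixels the aircraft image occupies.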
In an exemplary embodiment, an aircraft position determination method is provided, the method comprising the steps of:
S1, acquire the sub-region frame identifier of the sub-region frame corresponding to each preset sub-region in the target area panorama, to obtain a sub-region frame identifier set E=(E_1, E_2, …, E_a, …, E_b), a=1, 2, …, b; wherein E_a is the a-th sub-region frame identifier in the target area panorama, and b is the number of sub-region frame identifiers corresponding to the target area panorama; an auxiliary pixel point is arranged at a preset position inside each sub-region frame.
In this embodiment, the target area may be understood as an airport, a sub-region as a preset aircraft stand (parking position) in the airport, and the aircraft as aircraft parked at the airport; the sub-region frame corresponding to a sub-region can be obtained by either of the following two methods:
The first method: set a sub-region frame of a preset color, for example red or yellow, at the position corresponding to each sub-region in the target area, and then obtain the sub-region frame corresponding to each sub-region in the target area panorama.
The second method: first obtain the target area panorama, and then enclose each sub-region image in the target area panorama by adding line segments, thereby obtaining the sub-region frame corresponding to each sub-region.
It should be noted that, the auxiliary pixel point in each sub-area frame can also be obtained by the two methods; the auxiliary pixel points within each sub-region frame may be disposed within the corresponding sub-region frame at a location where the aircraft nose is parked.
S2, acquire the aircraft identifier corresponding to each aircraft in a stationary state in the target area panorama, to obtain a stationary aircraft identifier set C=(C_1, C_2, …, C_p, …, C_q), p=1, 2, …, q; wherein C_p is the p-th stationary aircraft identifier within the target area panorama, and q is the number of stationary aircraft identifiers within the target area panorama.
In this embodiment, the target area panorama may contain several aircraft images corresponding to stationary aircraft. Those skilled in the art can identify each aircraft using existing image recognition technology as needed and assign an identifier to each stationary aircraft in the target area panorama. It will be understood that the aircraft image identifiers are used to distinguish the aircraft, and the identifiers corresponding to any two aircraft images are different.
S3, acquire the bounding box corresponding to each stationary aircraft identifier in C, to obtain the bounding box identifier set LC=(LC_1, LC_2, …, LC_p, …, LC_q) corresponding to C; wherein LC_p is the bounding box identifier of the bounding box corresponding to C_p, and the aircraft image corresponding to C_p lies entirely within LC_p.
In this embodiment, the target area panorama is a plan view presented on the display screen, and a bounding box, for example, a rectangular box, corresponding to each aircraft image can be automatically set on the target area panorama; such that each aircraft image is entirely within a corresponding bounding box; thereby, the aircraft image is converted to be represented by the corresponding bounding box.
S4, take each sub-region frame in E that has an overlapping region with LC_p as an intermediate sub-region frame, to obtain the intermediate sub-region frame identifier set EC_p=(EC_p,1, EC_p,2, …, EC_p,x, …, EC_p,y(p)) corresponding to LC_p, x=1, 2, …, y(p); wherein EC_p,x is the identifier of the x-th intermediate sub-region frame having an overlapping region with LC_p, and y(p) is the number of intermediate sub-region frame identifiers in EC_p.
In this embodiment, because the viewing angle of the target area panorama is not a vertical (top-down) angle, the bounding box corresponding to a stationary aircraft image may, when the panorama is visualized, overlap several sub-region frames; the aircraft is actually parked in only one of the corresponding sub-regions, and some of the overlapping sub-region frames do not correspond to the sub-region where the aircraft is parked. Only the intermediate sub-region frames identified in EC_p therefore need to be analyzed, rather than every sub-region frame in the target area panorama, which improves the efficiency of judging the aircraft position.
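A minimal sketch of the overlap filter in S4, assuming both the aircraft bounding boxes and the sub-region frames are available as axis-aligned rectangles (x_min, y_min, x_max, y_max) in panorama pixel coordinates; names are illustrative.
```python
def overlaps(rect_a, rect_b):
    """Axis-aligned rectangle overlap test."""
    ax1, ay1, ax2, ay2 = rect_a
    bx1, by1, bx2, by2 = rect_b
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

def intermediate_frames(aircraft_box, subregion_frames):
    """Keep only the sub-region frames that overlap the aircraft bounding box.

    subregion_frames: dict mapping sub-region frame identifier -> rectangle.
    """
    return {fid: rect for fid, rect in subregion_frames.items()
            if overlaps(aircraft_box, rect)}
```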
S5, obtain the center pixel point WC_p of the bounding box corresponding to LC_p.
In this embodiment, if LC_p is a rectangular box, the intersection of its two diagonals can be used as WC_p; if LC_p is an irregularly shaped bounding box, the center pixel point WC_p corresponding to LC_p can be obtained by presetting a center pixel point.
S6, if WC_p lies within the intermediate sub-region frame corresponding to EC_p,x, or the auxiliary pixel point of the intermediate sub-region frame corresponding to EC_p,x lies within the bounding box corresponding to LC_p, judge that the aircraft corresponding to C_p is located within the sub-region corresponding to EC_p,x in the target area.
In this embodiment, in most cases the deviation of the position at which an aircraft is parked within its sub-region is not large. Based on this, an OR condition is adopted to judge whether the aircraft is located in the corresponding sub-region, which improves the generalization capability of the aircraft position judgment and allows a quick decision on whether the aircraft is within the corresponding sub-region.
It should be noted that, to improve the accuracy of the judgment, an AND condition can also be adopted: if WC_p lies within the intermediate sub-region frame corresponding to EC_p,x and the auxiliary pixel point of the intermediate sub-region frame corresponding to EC_p,x lies within the bounding box corresponding to LC_p, judge that the aircraft corresponding to C_p is located within the sub-region corresponding to EC_p,x in the target area. The sub-region corresponding to the aircraft position is then determined only when both conditions are met, which improves the accuracy of the aircraft position judgment.
According to this aircraft position judging method, the sub-region frame identifier of the sub-region frame corresponding to each sub-region in the target area panorama is acquired, and an auxiliary pixel point is arranged at a preset position of each sub-region frame; at the same time, a corresponding bounding box is set for each stationary aircraft image in the target area panorama so that the stationary aircraft image lies entirely within it. For any aircraft image, each intermediate sub-region frame that overlaps the bounding box of that aircraft image is acquired, which yields every sub-region in which the corresponding aircraft may be parked. The center pixel point of the bounding box corresponding to the aircraft image is then obtained: when that center pixel point lies within a certain intermediate sub-region frame, the aircraft is judged to be located in the sub-region corresponding to that frame; and when the auxiliary pixel point corresponding to a certain intermediate sub-region frame lies within the bounding box corresponding to the aircraft image, the aircraft is likewise judged to be located in the sub-region corresponding to that frame. The position of the aircraft can thus be determined.
Further, because the shooting angle of the camera is not a vertical angle, the sub-region frames have some distortion in the target area panorama, whereas the bounding boxes corresponding to the aircraft are set under the screen's plan view; the two are therefore drawn under different viewing geometries, and the position of the aircraft cannot be determined simply by comparing sub-region frames with bounding boxes. In the present invention, auxiliary pixel points are arranged at preset positions of the sub-region frames: regardless of the viewing angle, an auxiliary pixel point always lies within its sub-region frame in the target area panorama, and the center pixel point of the bounding box corresponding to an aircraft always lies within that bounding box and can therefore represent the position of the corresponding aircraft in the target area panorama. Whether the aircraft is in the corresponding sub-region is determined by judging whether the center pixel point of the bounding box lies within a certain sub-region frame, or whether the auxiliary pixel point of a certain sub-region frame lies within the bounding box. This eliminates the influence of sub-region frame distortion in the target area panorama and improves the accuracy of the aircraft position judgment.
Optionally, the bounding box is a rectangular box, and step S6 includes the following steps:
S61, obtain the pixel position information TWC_p=(TWC_p,1, TWC_p,2) of WC_p in the coordinate system corresponding to the target area panorama; wherein TWC_p,1 is the X-axis coordinate parameter of WC_p in the coordinate system corresponding to the target area panorama, and TWC_p,2 is the Y-axis coordinate parameter of WC_p in that coordinate system.
In this embodiment, a coordinate system corresponding to the target area panorama is established; within this coordinate system the X-axis and Y-axis coordinate parameters of WC_p can be obtained, yielding TWC_p.
S62, obtain the minimum X-axis coordinate parameter SX1, maximum X-axis coordinate parameter SX2, minimum Y-axis coordinate parameter SY1, and maximum Y-axis coordinate parameter SY2 of the bounding box corresponding to EC_p,x in the coordinate system corresponding to the target area panorama.
In this embodiment, the bounding box corresponding to EC_p,x is rectangular; the coordinates of the pixel points corresponding to its four vertices can be acquired, thereby obtaining its minimum X-axis coordinate parameter SX1, maximum X-axis coordinate parameter SX2, minimum Y-axis coordinate parameter SY1, and maximum Y-axis coordinate parameter SY2 in the coordinate system corresponding to the target area panorama.
S63, if SX1≤TWC_p,1≤SX2 and SY1≤TWC_p,2≤SY2, judge that WC_p lies within the intermediate sub-region frame corresponding to EC_p,x. Optionally, step S6 further includes the following steps:
S64, obtain the pixel position information TEC_p,x=(TEC_p,x,1, TEC_p,x,2) of the auxiliary pixel point of the intermediate sub-region frame corresponding to EC_p,x in the coordinate system corresponding to the target area panorama; wherein TEC_p,x,1 is the X-axis coordinate parameter of that auxiliary pixel point in the coordinate system corresponding to the target area panorama, and TEC_p,x,2 is its Y-axis coordinate parameter in that coordinate system.
S65, obtain the minimum X-axis coordinate parameter SX'1, maximum X-axis coordinate parameter SX'2, minimum Y-axis coordinate parameter SY'1, and maximum Y-axis coordinate parameter SY'2 of the bounding box corresponding to LC_p in the coordinate system corresponding to the target area panorama.
S66, if SX'1≤TEC_p,x,1≤SX'2 and SY'1≤TEC_p,x,2≤SY'2, determine that the auxiliary pixel point of the intermediate sub-region frame corresponding to EC_p,x lies within the bounding box corresponding to LC_p. In this embodiment, whether the aircraft is in the corresponding sub-region is determined by judging whether the center pixel point of the bounding box lies within a certain sub-region frame, or whether the auxiliary pixel point of a certain sub-region frame lies within the bounding box; this eliminates the influence of sub-region frame distortion in the target area panorama and improves the accuracy of the aircraft position judgment.
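A minimal sketch combining S61–S66 with the decision in step S6; the `strict` flag switches between the default OR condition and the stricter AND variant noted above (rectangle format and names are illustrative).
```python
def in_rect(x, y, rect):
    """rect = (x_min, y_min, x_max, y_max) in panorama pixel coordinates."""
    x_min, y_min, x_max, y_max = rect
    return x_min <= x <= x_max and y_min <= y <= y_max

def aircraft_in_subregion(center_px, aircraft_box, aux_px, subregion_rect, strict=False):
    """Decide whether a stationary aircraft is parked in one candidate sub-region.

    center_px:      WC_p, center pixel of the aircraft bounding box.
    aircraft_box:   LC_p bounding box as (x_min, y_min, x_max, y_max).
    aux_px:         auxiliary pixel point of the candidate sub-region frame.
    subregion_rect: rectangle of the candidate (intermediate) sub-region frame.
    strict:         False -> OR condition (default), True -> AND condition.
    """
    center_inside = in_rect(center_px[0], center_px[1], subregion_rect)   # S61-S63
    aux_inside = in_rect(aux_px[0], aux_px[1], aircraft_box)              # S64-S66
    return (center_inside and aux_inside) if strict else (center_inside or aux_inside)
```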
Optionally, EC_p,x has two opposing first sides and two opposing second sides, with any first side perpendicular to any second side; the distances from the auxiliary pixel point of the intermediate sub-region frame corresponding to EC_p,x to the two first sides are the same, while its distances to the two second sides are different.
In this embodiment, the auxiliary pixel point of the intermediate sub-region frame corresponding to EC_p,x is arranged at a non-central position of that frame, preferably at the position where the aircraft nose is parked within the frame. In this way, only when the aircraft is essentially completely parked in the sub-region corresponding to the intermediate sub-region frame will WC_p lie within the frame corresponding to EC_p,x or the auxiliary pixel point of that frame lie within the bounding box corresponding to LC_p, which improves the accuracy of the position judgment.
Optionally, step S3 includes the steps of:
S31, obtain the first pixel point and second pixel point at the two horizontally opposite outermost sides of the aircraft image corresponding to C_p, and the third pixel point and fourth pixel point at the two vertically opposite outermost sides.
In this embodiment, after the aircraft image is determined, those skilled in the art can use existing image boundary pixel determination methods as needed to determine the first and second pixel points at the two horizontally opposite outermost sides of the aircraft image corresponding to C_p and the third and fourth pixel points at the two vertically opposite outermost sides; this is not described in detail here.
S32, taking the first pixel point and the second pixel point as reference points, set a first straight line through the first pixel point and a second straight line through the second pixel point, both in the vertical direction; taking the third pixel point and the fourth pixel point as reference points, set a third straight line through the third pixel point and a fourth straight line through the fourth pixel point, both in the horizontal direction.
S33, take the rectangular frame formed by the intersections of the first, second, third, and fourth straight lines as LC_p.
In this embodiment, the above steps ensure that the aircraft image corresponding to C_p lies entirely within LC_p while the boundary pixel points of that image touch LC_p, so that LC_p covers the aircraft image corresponding to C_p more precisely.
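A minimal sketch of S31–S33, assuming the aircraft image is available as a set of its pixel coordinates (for example, from a segmentation mask); names are illustrative.
```python
def tight_bounding_box(aircraft_pixels):
    """Build the rectangle LC_p from the outermost pixels of an aircraft image.

    aircraft_pixels: iterable of (x, y) pixel coordinates belonging to the aircraft.
    Returns (x_min, y_min, x_max, y_max): the vertical lines x = x_min / x = x_max
    and the horizontal lines y = y_min / y = y_max enclose the image and touch it.
    """
    xs = [p[0] for p in aircraft_pixels]
    ys = [p[1] for p in aircraft_pixels]
    return min(xs), min(ys), max(xs), max(ys)
```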
Optionally, WC_p is obtained through the following steps:
S51, obtain the fifth pixel point located at the middle of the nose of the aircraft image corresponding to C_p, and the sixth pixel point located at the middle of the tail of the aircraft image corresponding to C_p.
S52, obtain the seventh pixel point and the eighth pixel point located at the wingtips of the aircraft image corresponding to C_p.
S53, connect the fifth pixel point and the sixth pixel point to obtain a first line segment, and connect the seventh pixel point and the eighth pixel point to obtain a second line segment.
S54, take the pixel point corresponding to the intersection of the first line segment and the second line segment as WC_p.
In this embodiment, WC_p is obtained from the first line segment, corresponding to the fuselage, and the second line segment, corresponding to the wings, of the aircraft image corresponding to C_p. When the angle at which the target area panorama is shot relative to the aircraft changes, the angles of the first and second line segments relative to the camera change correspondingly, yet the WC_p obtained from the intersection of the two line segments always accurately characterizes the center pixel point of the aircraft image corresponding to C_p; this further improves the accuracy of the aircraft position determination.
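A minimal sketch of S53–S54, treating the two segments as infinite lines and returning their intersection; names are illustrative, and a None result would indicate degenerate input.
```python
def segment_intersection(p5, p6, p7, p8):
    """Intersection of line(p5, p6) (fuselage) with line(p7, p8) (wings).

    Each argument is an (x, y) pixel coordinate. Returns (x, y), or None if
    the two lines are parallel.
    """
    x1, y1 = p5
    x2, y2 = p6
    x3, y3 = p7
    x4, y4 = p8
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # fuselage and wing lines are parallel (should not happen)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return x1 + t * (x2 - x1), y1 + t * (y2 - y1)
```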
Furthermore, although the steps of the methods in the present disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order or that all illustrated steps be performed in order to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
Embodiments of the present invention also provide a non-transitory computer readable storage medium that may be disposed in an electronic device to store at least one instruction or at least one program for implementing one of the method embodiments; the at least one instruction or the at least one program is loaded and executed by a processor to implement the methods provided by the embodiments described above.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Embodiments of the present application also provide an electronic device comprising a processor and the aforementioned non-transitory computer-readable storage medium.
An electronic device according to this embodiment of the application is described below. The electronic device is merely an example and should not impose any limitation on the functionality and scope of use of embodiments of the present application.
The electronic device is in the form of a general purpose computing device. Components of an electronic device may include, but are not limited to: the at least one processor, the at least one memory, and a bus connecting the various system components, including the memory and the processor.
Wherein the memory stores program code that is executable by the processor to cause the processor to perform steps in various embodiments described herein.
The storage may include readable media in the form of volatile storage, such as Random Access Memory (RAM) and/or cache memory, and may further include Read Only Memory (ROM).
The storage may also include a program/utility having a set (at least one) of program modules including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The bus may be one or more of several types of bus structures including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device may also communicate with one or more external devices (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device, and/or with any device (e.g., router, modem, etc.) that enables the electronic device to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface. And, the electronic device may also communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through a network adapter. The network adapter communicates with other modules of the electronic device via a bus. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with an electronic device, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Embodiments of the present invention also provide a computer program product comprising program code for causing an electronic device to carry out the steps of the method according to the various exemplary embodiments of the invention as described in the specification, when said program product is run on the electronic device.
While certain specific embodiments of the invention have been described in detail by way of example, it will be appreciated by those skilled in the art that the above examples are for illustration only and are not intended to limit the scope of the invention. Those skilled in the art will also appreciate that many modifications may be made to the embodiments without departing from the scope and spirit of the invention. The scope of the present disclosure is defined by the appended claims.

Claims (9)

1. A method of unbinding motion state aircraft information, the method comprising the steps of:
S100, acquiring the aircraft identifier corresponding to each aircraft in a motion state in target area data of a target area, to obtain a moving aircraft identifier set G=(G_1, G_2, …, G_j, …, G_k), j=1, 2, …, k; wherein G_j is the j-th moving aircraft identifier in the target area data of the target area, and k is the number of moving aircraft identifiers in the target area data of the target area;
S200, acquiring the historical estimated position information corresponding to G_j within a sliding time window T, to obtain a historical estimated position information set MG_j=(MG_j,1, MG_j,2, …, MG_j,i, …, MG_j,r) corresponding to G_j, i=1, 2, …, r; wherein MG_j,i is the i-th historical estimated position information corresponding to G_j within T, and r is the number of historical estimated position information items corresponding to G_j within T; MG_j,i=(WG_j,i, tG_j,i), where WG_j,i is the estimated position coordinates of the aircraft corresponding to G_j at the time MG_j,i is generated, and tG_j,i is the time at which MG_j,i is generated; WG_j,i is obtained from the historical target area data corresponding to tG_j,i; the ending time of T is the current time; the historical target area data is target area data of the target area within a preset historical time period;
S300, determining, according to MG_j, the estimated speed profile SL_j,1 of the aircraft corresponding to G_j;
S400, acquiring the historical standard aircraft information corresponding to G_j in each historical target area data within T, to obtain a historical standard aircraft information set LG_j=(LG_j,1, LG_j,2, …, LG_j,i1, …, LG_j,r1) corresponding to G_j, i1=1, 2, …, r1; wherein LG_j,i1 is the i1-th historical standard aircraft information corresponding to G_j within T, and r1 is the number of historical standard aircraft information items corresponding to G_j within T; LG_j,i1=(W'G_j,i1, t'G_j,i1), where W'G_j,i1 is the real position coordinates of the aircraft corresponding to G_j at the time LG_j,i1 is generated, and t'G_j,i1 is the time at which LG_j,i1 is generated; the standard aircraft information is the flight information corresponding to the aircraft obtained through an ADS-B system;
S500, determining, according to LG_j, the real speed profile SL_j,2 of the aircraft corresponding to G_j;
S600, if βSL_j<βSL_0, unbinding the standard aircraft information currently bound to G_j; wherein βSL_j is the similarity between SL_j,1 and SL_j,2, and βSL_0 is a preset similarity threshold.
2. The method of unbinding motion state aircraft information according to claim 1, wherein step S100 comprises the steps of:
S110, determining a first region in the target area according to preset conditions; wherein an aircraft located in the first region is in a motion state;
S120, acquiring a target position information set HG=(HG_1, HG_2, …, HG_j, …, HG_k) corresponding to the aircraft identifiers; wherein HG_j is the target position information, in the target area data, of the aircraft corresponding to G_j;
S130, if the position corresponding to HG_j is located in the first region, determining the aircraft identifier corresponding to HG_j to be a moving aircraft identifier.
3. The method of unbinding motion state aircraft information according to claim 1, wherein step S300 comprises the steps of:
S310, obtaining the estimated distance between the estimated position coordinates corresponding to any two adjacent historical estimated position information items in MG_j, to obtain an estimated distance list BG_j=(BG_j,1, BG_j,2, …, BG_j,i2, …, BG_j,r-1), i2=1, 2, …, r-1; wherein BG_j,i2 is the estimated distance between WG_j,i2 and WG_j,i2+1;
S320, obtaining the time interval between the generation times corresponding to any two adjacent historical estimated position information items in MG_j, to obtain a time interval list TG_j=(TG_j,1, TG_j,2, …, TG_j,i2, …, TG_j,r-1); wherein TG_j,i2 is the time interval between tG_j,i2 and tG_j,i2+1;
S330, determining, according to BG_j and TG_j, the estimated speed list VG_j=(VG_j,1, VG_j,2, …, VG_j,i2, …, VG_j,r-1) corresponding to MG_j; VG_j,i2 is the estimated speed corresponding to TG_j,i2, VG_j,i2=BG_j,i2/TG_j,i2;
S340, determining, according to VG_j and MG_j, the estimated speed profile SL_j,1 of the aircraft corresponding to G_j; wherein the abscissa of SL_j,1 is the generation time corresponding to each historical estimated position information item in MG_j, and the ordinate is the estimated speed corresponding to each time interval in TG_j.
4. The method of unbinding motion state aircraft information according to claim 1, further comprising the steps of:
S010, obtaining the standard aircraft information list Lt=(Lt_1, Lt_2, …, Lt_i3, …, Lt_y(t)) that corresponds to the current time point t and is not bound to any aircraft, i3=1, 2, …, y(t); wherein Lt_i3 is the i3-th standard aircraft information not bound to an aircraft within Lt, and y(t) is the number of standard aircraft information items, corresponding to t, that are not bound to an aircraft; Lt_i3 includes the real position coordinates of the corresponding aircraft;
S020, obtaining the motion state judgment vector corresponding to each standard aircraft information not bound to an aircraft in Lt, to obtain the motion state judgment vector set FLt=(FLt_1, FLt_2, …, FLt_i3, …, FLt_y(t)) corresponding to Lt; wherein FLt_i3 is the motion state judgment vector corresponding to Lt_i3; FLt_i3=(FLt_i3,1, FLt_i3,2, FLt_i3,3), where FLt_i3,1 is a position change flag, FLt_i3,2 is an air-ground state change flag, and FLt_i3,3 is a horizontal speed flag;
S030, if the states of FLt_i3,1, FLt_i3,2, and FLt_i3,3 within FLt_i3 are all a first state, determining Lt_i3 to be stationary standard aircraft information corresponding to a stationary aircraft; otherwise, determining Lt_i3 to be motion standard aircraft information corresponding to an aircraft in a motion state;
S040, acquiring each motion standard aircraft information item in Lt to obtain a motion standard aircraft information list.
5. The method of unbinding motion state aircraft information according to claim 1, further comprising the steps of:
S700, acquiring, from the current target area data of the target area, the aircraft identifier of each piece of bound standard aircraft information, to obtain a second aircraft identifier set A'=(A'_1, A'_2, …, A'_m1, …, A'_n1), m1=1, 2, …, n1; wherein A'_m1 is the aircraft identifier of the m1-th piece of bound standard aircraft information in the current target area data of the target area, and n1 is the number of aircraft identifiers of bound standard aircraft information in the current target area data of the target area;
S710, acquiring, from A', the aircraft identifier corresponding to each aircraft in a stationary state and determining it to be a stationary aircraft identifier, to obtain a stationary aircraft identifier set Z=(Z_1, Z_2, …, Z_j1, …, Z_k1), j1=1, 2, …, k1; wherein Z_j1 is the j1-th stationary aircraft identifier in the target area data, and k1 is the number of stationary aircraft identifiers in Z;
S720, obtaining the sub-region data KZ_j1 corresponding to the sub-region in which the aircraft corresponding to Z_j1 is parked in the target area; the sub-region is a region of preset size within the target area;
S730, determining, according to KZ_j1, the standard aircraft information corresponding to KZ_j1 from the standard aircraft information list corresponding to the current time point t;
S740, if the standard aircraft information corresponding to KZ_j1 and the standard aircraft information currently bound to Z_j1 do not meet a preset second matching condition, unbinding the standard aircraft information currently bound to Z_j1.
6. The method of unbinding motion state aircraft information of claim 5, further comprising the steps of:
S800, determining, from the current target area data of the target area, the aircraft identifier of each piece of unbound standard aircraft information, to obtain a first aircraft identifier set A=(A_1, A_2, …, A_m, …, A_n), m=1, 2, …, n; wherein A_m is the aircraft identifier of the m-th piece of unbound standard aircraft information in the current target area data of the target area, and n is the number of aircraft identifiers of unbound standard aircraft information in the current target area data of the target area;
S810, obtaining the state of the first flag bit lab1_A_m corresponding to A_m; wherein lab1_A_m is used to record whether A_m has been unbound from previously bound standard aircraft information;
S820, if the state of lab1_A_m is the unbound state and the aircraft corresponding to A_m is in a stationary state, obtaining the sub-region data KA_m corresponding to the sub-region in which the aircraft corresponding to A_m is parked within the target area;
S830, determining, according to KA_m, the standard aircraft information corresponding to KA_m from the standard aircraft information list corresponding to the current time point t;
S840, binding the standard aircraft information corresponding to KA_m with A_m.
7. The method of unbinding motion state aircraft information according to claim 1, wherein the value range of βSL_0 is [0.95, 1].
8. A non-transitory computer readable storage medium having stored therein at least one instruction or at least one program, wherein the at least one instruction or the at least one program is loaded and executed by a processor to implement the method of unbinding motion state aircraft information of any one of claims 1-7.
9. An electronic device comprising a processor and the non-transitory computer readable storage medium of claim 8.
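For orientation, a minimal sketch of the unbinding decision described in claims 1, 3, and 7: it rebuilds the two speed profiles from time-stamped positions and unbinds when their similarity drops below βSL_0. The distance approximation and the cosine-similarity measure are assumptions (the claims only specify speed = estimated distance / time interval and a "similarity" threshold in [0.95, 1]); all names are illustrative.
```python
import numpy as np

def speed_profile(positions):
    """Build (times, speeds) from a time-ordered list of (lat, lon, t) samples,
    as in claim 3: speed_i = dist_i / dt_i. Distances use an equirectangular
    approximation (illustrative; the patent does not fix a distance formula)."""
    earth_r = 6371000.0
    times, speeds = [], []
    for (lat1, lon1, t1), (lat2, lon2, t2) in zip(positions, positions[1:]):
        if t2 == t1:
            continue  # skip duplicate timestamps
        dlat = np.radians(lat2 - lat1)
        dlon = np.radians(lon2 - lon1) * np.cos(np.radians((lat1 + lat2) / 2))
        dist = earth_r * np.hypot(dlat, dlon)
        times.append(t2)
        speeds.append(dist / (t2 - t1))
    return np.array(times), np.array(speeds)

def should_unbind(est_positions, real_positions, beta_sl0=0.95):
    """Unbind when the similarity of the estimated and real speed profiles
    falls below the threshold beta_SL_0 (claim 7 places it in [0.95, 1])."""
    t1, v1 = speed_profile(est_positions)
    t2, v2 = speed_profile(real_positions)
    common_t = np.union1d(t1, t2)
    v1i = np.interp(common_t, t1, v1)     # resample both profiles to common times
    v2i = np.interp(common_t, t2, v2)
    # Assumed similarity measure: cosine similarity (the claims only say "similarity").
    beta = float(np.dot(v1i, v2i) / (np.linalg.norm(v1i) * np.linalg.norm(v2i) + 1e-12))
    return beta < beta_sl0
```
In practice the estimated positions would come from the image pipeline described in the embodiments and the real positions from the ADS-B standard aircraft information within the same sliding window T.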
CN202311231605.7A 2023-09-22 2023-09-22 Motion state aircraft information unbinding method, electronic equipment and storage medium Active CN116994463B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311231605.7A CN116994463B (en) 2023-09-22 2023-09-22 Motion state aircraft information unbinding method, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311231605.7A CN116994463B (en) 2023-09-22 2023-09-22 Motion state aircraft information unbinding method, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116994463A CN116994463A (en) 2023-11-03
CN116994463B true CN116994463B (en) 2023-12-08

Family

ID=88528635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311231605.7A Active CN116994463B (en) 2023-09-22 2023-09-22 Motion state aircraft information unbinding method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116994463B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130027226A1 (en) * 2010-04-12 2013-01-31 Flight Focus Pte. Ltd. Moving map display
US11521503B2 (en) * 2019-06-28 2022-12-06 The Boeing Company Methods and systems for authenticating an automatic dependent surveillance-broadcast (ADS-B) signal

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001266300A (en) * 2000-03-17 2001-09-28 Toshiba Corp System for approach control into runway or the like
FR2936079A1 (en) * 2008-09-16 2010-03-19 Thales Sa METHOD FOR MONITORING THE LANDING PHASE OF AN AIRCRAFT
CN111511643A (en) * 2017-12-22 2020-08-07 Wing航空有限责任公司 Payload coupling device and payload delivery method for unmanned aerial vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of a collaborative model in a flight arrival and departure scheduling system; Shen Limin et al.; Computer Engineering and Design; Vol. 30, No. 15; full text *

Also Published As

Publication number Publication date
CN116994463A (en) 2023-11-03

Similar Documents

Publication Publication Date Title
US11835561B2 (en) Unmanned aerial vehicle electromagnetic avoidance and utilization system
US11709491B2 (en) Dynamically adjusting UAV flight operations based on radio frequency signal data
CN110069071B (en) Unmanned aerial vehicle navigation method and device, storage medium and electronic equipment
KR102661954B1 (en) A method of processing an image, and apparatuses performing the same
US10990836B2 (en) Method and apparatus for recognizing object, device, vehicle and medium
JP6321570B2 (en) Indoor position information positioning system and indoor position information positioning method
US20230039293A1 (en) Method of processing image, electronic device, and storage medium
US11244164B2 (en) Augmentation of unmanned-vehicle line-of-sight
CN113286081B (en) Target identification method, device, equipment and medium for airport panoramic video
US20220044574A1 (en) System and method for remote viewing of vehicles using drones
CN115719436A (en) Model training method, target detection method, device, equipment and storage medium
CN113286080A (en) Scene monitoring system and video linkage tracking and enhanced display method and device
KR20220100768A (en) Apparatus and method for controlling landing based on image learning
CN114967731A (en) Unmanned aerial vehicle-based automatic field personnel searching method
WO2012066642A1 (en) Field-of-view video information generating apparatus
CN113742440B (en) Road image data processing method and device, electronic equipment and cloud computing platform
CN113286121A (en) Enhanced monitoring method, device, equipment and medium for airport scene video
CN106910358A (en) For the attitude determination method and device of unmanned vehicle
CN116994463B (en) Motion state aircraft information unbinding method, electronic equipment and storage medium
CN117218198A (en) Aircraft information matching method, electronic equipment and storage medium
CN117218199A (en) Aircraft position judging method, electronic equipment and storage medium
JP2021004936A (en) Map data management device and map data management method
US20240177337A1 (en) Spatial calibration method
US20240140437A1 (en) Object based vehicle localization
US20220121850A1 (en) Above-horizon target tracking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant