CN115123571A - Intelligent airport and intelligent airport runway vehicle real-time tracking rendering method and system


Info

Publication number: CN115123571A
Application number: CN202210795863.7A
Authority: CN (China)
Other languages: Chinese (zh)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Inventors: 柳星, 陈湘儒, 邹承明
Assignee (current and original): Wuhan University of Technology WUT
Priority / filing date: 2022-07-06
Publication date: 2022-09-30
Family ID: 83381059
Prior art keywords: data, vehicle, runway, airport, real
Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64F: GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F1/00: Ground or aircraft-carrier-deck installations
    • B64F1/36: Other airport installations

Abstract

The invention discloses a real-time tracking and rendering method and system for intelligent airport runway vehicles, and an intelligent airport. The intelligent airport comprises N runways, each runway comprises M lanes, and each lane comprises X plates (pavement slabs) of identical size. Optical fiber sensors are buried under the runway and collect runway vibration data in real time, and a front-end controller determines from the vibration data the number of the plate a vehicle is passing over. The method loads an equal-scale airport panoramic model in the Cesium map engine according to the real longitude and latitude coordinates of the airport, and then tracks and renders the vehicles on the airport runway on a three-dimensional map in real time according to the data received in real time. The invention solves the problem that vehicles are inconvenient to track and render over the large area of an airport runway, accelerates the processing from data reception to vehicle rendering and tracking, and reduces position errors.

Description

Intelligent airport and intelligent airport runway vehicle real-time tracking rendering method and system
1 Technical field
The invention belongs to the technical field of airport safety, and relates to a method and system for real-time detection, tracking and rendering of intelligent airport runway vehicles, and to an intelligent airport; in particular, it relates to a Cesium-based method and system for real-time tracking and rendering of intelligent airport runway vehicles.
2 Background of the invention
In recent years, smart airports have been increasing in number; real-time tracking and rendering of the airplanes and guide vehicles on airport runways has therefore become an important research topic.
Existing vehicle tracking systems fall into two types. The first receives satellite signals through a GPS or Beidou chip, determines the vehicle's own position, and transmits it back to a control center, which displays the position on an electronic map produced by an enterprise qualified for navigation electronic mapping, so that the vehicle position is acquired in real time. Tracking and locating in this way has several drawbacks: the network resources on which GPS or Beidou depends are limited, and the network is an indispensable link in the overall system, so that without it data cannot be sent or received; real-time tracking of vehicles on airport runways therefore suffers significant errors and even failures. The second type covers the road surface area with cameras and tracks and locates the vehicles inside the covered area. Its drawbacks are that the video data volume is too large, real-time processing cannot be guaranteed under such a load and data accumulate, so the resulting delay prevents real-time tracking; furthermore, because an airport runway is very wide, cameras can only be erected along its two sides and cannot clearly cover and monitor the whole runway area.
Therefore, if the entire airport runway area can be monitored, the vehicle information on the runway processed in time, and the vehicles rendered on the BIM model of the airport, then intelligent monitoring of the whole airport and real-time tracking and rendering of the vehicles on the airport runway can be realized.
Summary of the invention
The invention aims to provide an intelligent airport, and a method and system that receive data at the front end through WebSocket and use the Cesium map engine to track and render the vehicles on the airport runway pavement in real time, thereby solving the problem that the runway area is too large to be conveniently monitored by cameras in its entirety, and reducing delay and error.
The technical scheme adopted by the intelligent airport of the invention is as follows: an intelligent airport comprising N runways, each runway comprising M lanes, and each lane comprising X plates (pavement slabs); wherein N, M and X are preset values;
the X plates have the same size, and the longitude and latitude differences between adjacent plates (front, rear, left and right) are consistent;
optical fiber sensors are buried under the runway and collect runway vibration data in real time; a front-end controller determines from the vibration data the number of the plate a vehicle is passing over, detects the safety of the airport runway in real time, and gives timely alarms according to the different access authorities assigned to the plates in different areas of the runway.
The technical scheme adopted by the method of the invention is as follows: a real-time tracking and rendering method for intelligent airport runway vehicles, comprising the following steps:
step 1: constructing an equal-proportion airport panoramic model;
step 2: converting the airport panoramic model in obj format into the 3D Tiles format supported by Cesium;
step 3: loading the model in the Cesium map engine, in the corresponding coordinate loading mode, according to the real longitude and latitude coordinates of the airport, to generate a live-action three-dimensional map;
step 4: tracking and rendering the vehicles on the airport runway on the three-dimensional map in real time according to the received data;
the optical fiber sensors and the front-end controller transmit data through the WebSocket communication protocol, and the agreed format of the received data is as follows:
(1) when no vehicle is running on the runway, the front end receives "nodata", meaning that no vehicle currently needs to be tracked and rendered;
(2) when only one vehicle is running, the front end receives one vehicle data record: the first character is the identification of the vehicle, which is destroyed once the vehicle leaves the runway, and the identification of the next vehicle is incremented in sequence; the second character indicates forward or reverse travel; the last 6 characters give the number of the plate the vehicle is currently on;
(3) when records of several vehicles exist at the same time, the records of different vehicles are separated by a preset delimiter; their leading vehicle identifications are incremented in sequence, and the meaning of each record is the same as in (2);
(4) when a vehicle passes over a faulted (misaligned) slab, the front end receives slab-fault alarm data: the first two characters identify the fault event, and the last 6 characters give the number of the plate where the fault is located.
The technical scheme adopted by the system of the invention is as follows: an intelligent airport runway vehicle real-time tracking and rendering system, comprising the following modules:
module 1, configured to construct an equal-proportion airport panoramic model;
module 2, configured to convert the obj-format airport panoramic model into the 3D Tiles format that Cesium supports loading;
module 3, configured to load the model in the Cesium map engine, in the corresponding coordinate loading mode, according to the real longitude and latitude coordinates of the airport, to generate a live-action three-dimensional map;
module 4, configured to track and render the vehicles on the airport runway on the three-dimensional map in real time according to the data received in real time;
the optical fiber sensors and the front-end controller transmit data through the WebSocket communication protocol, and the agreed format of the received data is as follows:
(1) when no vehicle is running on the runway, the front end receives "nodata", meaning that no vehicle currently needs to be tracked and rendered;
(2) when only one vehicle is running, the front end receives one vehicle data record: the first character is the identification of the vehicle, which is destroyed once the vehicle leaves the runway, and the identification of the next vehicle is incremented in sequence; the second character indicates forward or reverse travel; the last 6 characters give the number of the plate the vehicle is currently on;
(3) when records of several vehicles exist at the same time, the records of different vehicles are separated by a preset delimiter; their leading vehicle identifications are incremented in sequence, and the meaning of each record is the same as in (2);
(4) when a vehicle passes over a faulted (misaligned) slab, the front end receives slab-fault alarm data: the first two characters identify the fault event, and the last 6 characters give the number of the plate where the fault is located.
The invention solves the problem that vehicles are inconvenient to track and render over the large area of an airport runway, accelerates the processing from data reception to vehicle rendering and tracking, and reduces position errors.
Drawings
FIG. 1 is a flow chart of constructing the equal-scale airport model and loading it into the Cesium engine according to an embodiment of the present invention;
FIG. 2 is a flow chart of the method of an embodiment of the invention;
FIG. 3 is a schematic diagram of the linear interpolation algorithm for two closely adjacent points A and B in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the Lagrange interpolation algorithm for two closely adjacent points A and B in an embodiment of the present invention;
FIG. 5 is a schematic diagram of the linear interpolation algorithm for two distantly adjacent points A and B in an embodiment of the present invention;
FIG. 6 is a schematic diagram of the Lagrange interpolation algorithm for two distantly adjacent points A and B in an embodiment of the present invention.
Detailed description of the preferred embodiments
In order to facilitate the understanding and implementation of the present invention by persons of ordinary skill in the art, the invention is further described in detail below with reference to the drawings and embodiments. It is to be understood that the embodiments described herein are only for illustrating and explaining the present invention and are not intended to limit it.
In the intelligent airport provided by this embodiment, the airport comprises N runways, each runway comprises M lanes, and each lane comprises X plates; N, M and X are preset values, set according to the actual construction of the runway; the X plates are all of the same size, and the longitude and latitude differences between adjacent plates (front, rear, left and right) are consistent;
optical fiber sensors are buried under the runway and collect runway vibration data in real time; a front-end controller determines from the vibration data the number of the plate a vehicle is passing over, detects the safety of the airport runway in real time, and alarms intrusions in time according to the different access authorities assigned to the plates in different areas of the runway.
The real-time tracking and rendering method for intelligent airport runway vehicles provided by this embodiment comprises the following steps:
step 1: constructing an equal-proportion airport panoramic model;
Referring to fig. 1, in this embodiment the designed BIM model of the airport is loaded into the Cesium map engine according to the true longitude, latitude and construction orientation to realize fusion; the constructed runways comprise an east (E) runway and a west (W) runway, each runway comprises 10 lanes, and each lane comprises 1200 plates.
Taking the west runway as an example, the west runway of the BIM model is likewise divided into ten lanes A to J according to the real construction; each lane is divided into 1200 plates, which are numbered. Every plate has the same size, and the longitude and latitude differences between adjacent plates front and back (south-north direction) and left and right (east-west direction) are consistent. For example, on lane A the starting plate is numbered WA0000 and the last plate WA1199, where W (west) denotes the west runway, A denotes lane A, and the last 4 digits give the number of the plate, increasing sequentially from the starting position.
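A minimal JavaScript sketch of this numbering scheme (the function name plateId is illustrative, not taken from the original description):

```javascript
// Compose a plate number such as "WA0000" or "WD0016" from runway, lane and plate
// index, following the numbering scheme described above.
function plateId(runway /* "W" or "E" */, lane /* "A".."J" */, index /* 0..1199 */) {
  return runway + lane + String(index).padStart(4, "0");
}

// plateId("W", "A", 0)    -> "WA0000"
// plateId("W", "A", 1199) -> "WA1199"
```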
From the coordinates Wwest, Weast, Wnorth and Wsouth of the westernmost, easternmost, northernmost and southernmost boundary points of the west runway, the rectangular area Wrunway of the whole runway is obtained; the rectangle Wrunway is then divided into 10 × 1200 runway plates, i.e. the plates of lanes A to J. To facilitate subsequent calculation, the longitude and latitude of the center point of each plate are used as the longitude and latitude of the whole plate.
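A minimal sketch of this subdivision, assuming a runway aligned roughly north-south; the boundary values and step names below are placeholders, not figures from the patent:

```javascript
// Split the west-runway rectangle Wrunway, given by its boundary coordinates, into
// 10 lanes x 1200 plates and return the center coordinate of one plate.
const LANES = 10;              // lanes A..J (west-east direction)
const PLATES_PER_LANE = 1200;  // plates 0000..1199 (south-north direction)

const Wwest = 114.193, Weast = 114.199;   // placeholder boundary longitudes
const Wsouth = 30.770, Wnorth = 30.800;   // placeholder boundary latitudes

const laneLonStep = (Weast - Wwest) / LANES;              // lane width in longitude
const plateLatStep = (Wnorth - Wsouth) / PLATES_PER_LANE; // plate length in latitude

// Center of plate `plateIndex` (0..1199) on lane `laneIndex` (0 = A, 9 = J).
function plateCenter(laneIndex, plateIndex) {
  return {
    longitude: Wwest + (laneIndex + 0.5) * laneLonStep,
    latitude: Wsouth + (plateIndex + 0.5) * plateLatStep,
  };
}

// Example: center of plate WD0016 (lane D = index 3, plate 16).
// console.log(plateCenter(3, 16));
```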
step 2: converting the airport panoramic model in obj format into the 3D Tiles format supported by Cesium;
step 3: loading the model in the Cesium map engine, in the corresponding coordinate loading mode, according to the real longitude and latitude coordinates of the airport, to generate a live-action three-dimensional map;
step 4: tracking and rendering the vehicles on the airport runway on the three-dimensional map in real time according to the received data;
the optical fiber sensors and the front-end controller transmit data through the WebSocket communication protocol; one data packet is received every 0.25 s, and the agreed format of the received data is as follows:
(1) when no vehicle is running on the runway, the front end receives "nodata", meaning that no vehicle currently needs to be tracked and rendered;
(2) when only one vehicle is running, the front end receives one vehicle data record, for example: 0XWD0016. The first character "0" is the identification of the vehicle, which is destroyed once the vehicle leaves the runway, and the identification of the next vehicle is incremented in sequence; the second character "X" denotes forward travel (i.e. from south to north), while "Y" would denote reverse travel (i.e. from north to south); the last 6 characters give the number of the plate the vehicle is currently on;
(3) when records of several vehicles exist at the same time, the records are separated by ",", and their leading vehicle identifications are incremented in sequence, for example: 0XWD0016,1XWC0163,2YWH0007; the meaning of each record is the same as in (2);
(4) when a vehicle passes over a faulted (misaligned) slab, the front end receives slab-fault alarm data, for example: CTWD0091. The first two characters "CT" identify the fault event, and the last 6 characters give the number of the plate where the fault is located, with the same meaning as described in step 1. The reception of fault alarm data and the reception of normal vehicle data do not affect each other.
Take the reception of one vehicle record carData = "0XWD0016" as an example. First it is checked whether the record is a fault alarm: carData.substring(0, 2) != "CT", so the data is not slab-fault data; carData[1] == "X", so the vehicle is judged to be travelling forward; carData.substring(2, 4) == "WD", so the vehicle is judged to be travelling on lane D of the west runway; carData.substring(4) == "0016", so its position is judged to be on the 16th plate of the lane.
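A minimal front-end sketch of receiving and parsing such packets over WebSocket; the server URL and the handler names handleVehicle and handleFaultAlarm are illustrative assumptions:

```javascript
// Receive one packet every 0.25 s over WebSocket and parse it according to the
// agreed format. The URL and the two handlers are assumed for illustration.
const socket = new WebSocket("ws://airport-front-end.example:8080/runway"); // assumed URL

socket.onmessage = (event) => {
  const msg = event.data;
  if (msg === "nodata") return;              // no vehicle to track at the moment
  for (const record of msg.split(",")) {     // several vehicles are comma-separated
    if (record.substring(0, 2) === "CT") {
      // slab-fault alarm, e.g. "CTWD0091": fault located on plate WD0091
      handleFaultAlarm(record.substring(2));
    } else {
      // normal vehicle record, e.g. "0XWD0016"
      handleVehicle({
        id: record[0],                            // vehicle identification
        forward: record[1] === "X",               // "X" forward, "Y" reverse
        runway: record[2],                        // "W" west / "E" east
        lane: record[3],                          // lane A..J
        plate: parseInt(record.substring(4), 10), // plate number, e.g. 16
      });
    }
  }
};
```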
Referring to fig. 2, the specific implementation of step 4 includes the following sub-steps:
step 4.1: according to the number of the plate the vehicle has reached, calculating the longitude, latitude and height of the vehicle so as to determine its actual motion track, where the height is the fixed value 2 and the longitude and latitude are calculated as follows:
longitude=lonlat[i].longitude+PlateNumber×LonDifferenceValue;
latitude=lonlat[i].latitude+PlateNumber×LatDifferenceValue;
where lonlat[i].longitude is the longitude of the starting position of the i-th lane (for example, lonlat[0] stores the longitude and latitude of the starting position WA0000 of lane A) and lonlat[i].latitude is the latitude of the starting position of the i-th lane; PlateNumber is the plate number carried in the received record (for example, for 1XWC0163 the PlateNumber value obtained is 163); LonDifferenceValue is the longitude difference between adjacent plates along the plate-numbering direction; LatDifferenceValue is the latitude difference between adjacent plates along the plate-numbering direction.
In this embodiment the longitude and latitude coordinates of the center point of each divided plate are uniformly used as the coordinates of the whole plate, and this point is also taken as the detected vehicle coordinate, so the error range is 0 to 0.5 plate.
Because a dynamic model in Cesium can only be rendered normally when its internal coordinate array contains at least 2 points, the coordinates of two points are added to the Cesium internal coordinate array when the first vehicle record is received: the first point is the longitude and latitude of the plate half a plate before the plate in the normally received record, and the second point is the longitude and latitude of the plate in the normally received record. After the first record is received, Cesium can track and render the vehicle model normally, and the time error is the interval of receiving one record, i.e. the 0.25 s described in step 4.
In this embodiment the real-time speed of the moving vehicle is also calculated and updated. Let predata record the plate number of the previous record; the real-time speed Speed of the vehicle is then calculated as follows:
Speed = (Math.abs(data - predata) × distance) / TimeDifference;
TimeDifference = timenow - timepre;
where data is the plate number in the currently received record, distance is the actual distance between two adjacent plates, predata is the plate number of the previous record, and Math.abs(data - predata) is the absolute difference in plate number between the previous and the current record, used to calculate the distance travelled; TimeDifference is the time difference between the two data transmissions, timenow is the timestamp of the currently received record, obtained from new Date().getTime(), and timepre records the timestamp of the previously received record.
step 4.2: the longitude, latitude and height of the vehicle calculated in step 4.1 are appended in real time to the position array position in the CZML. It is first judged whether this is the first record of the vehicle: a variable flag is defined and initialised to 0 for identification; if flag equals 0 the record is judged to be the first one, and the coordinates of two points are added to reduce the time delay; otherwise only the current point is added, and flag is incremented. The vehicle track is then tracked by real-time rendering.
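A minimal sketch of this per-packet update, covering both the speed formulas above and the step 4.2 append logic; the CZML packet layout (czml[1] being the vehicle packet) and the plate length value are assumptions:

```javascript
// Per-packet update: compute the real-time speed from the previous plate number and
// timestamp, then append the new sample to the CZML position property. On the first
// packet (flag === 0) an extra half-plate point is pushed so Cesium has the two
// samples it needs to interpolate and render the moving model.
let flag = 0;
let predata = null;     // plate number of the previous packet
let timepre = null;     // timestamp (ms) of the previous packet
const plateLength = 5;  // assumed distance between adjacent plates, in metres

function onVehiclePacket(czml, data, coord, halfPlateCoord) {
  const timenow = new Date().getTime();
  let speed = 0;
  if (predata !== null) {
    const distance = Math.abs(data - predata) * plateLength;    // metres
    const timeDifference = (timenow - timepre) / 1000;          // seconds
    speed = timeDifference > 0 ? distance / timeDifference : 0; // m/s
  }
  predata = data;
  timepre = timenow;

  // czml[1] is assumed to be the vehicle packet; its position property uses an
  // epoch plus [secondsFromEpoch, longitude, latitude, height] samples.
  const samples = czml[1].position.cartographicDegrees;
  const t = flag * 0.25; // one packet every 0.25 s after the epoch (assumption)
  if (flag === 0) {
    samples.push(0, halfPlateCoord.longitude, halfPlateCoord.latitude, 2);
    samples.push(0.25, coord.longitude, coord.latitude, 2);
  } else {
    samples.push(t + 0.25, coord.longitude, coord.latitude, 2);
  }
  flag++;
  return speed;
}
```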
In this embodiment, interpolation is performed in the CZML according to the timestamps of two successive data receptions. A distance difference threshold is preset; when the current record is received it is compared with the previous one: when the distance difference between the two records is less than or equal to the threshold a linear interpolation algorithm is used, and when it is greater than the threshold a Lagrange interpolation algorithm is used.
Referring to fig. 3, for two closely adjacent points A and B, the linear interpolation algorithm behaves as follows: two intermediate points C1 and C2 are generated by linear interpolation on the route from A to B, and the segments A to C1, C1 to C2 and C2 to B keep the original left-to-right direction unchanged.
Referring to fig. 4, for two closely adjacent points A and B, the Lagrange interpolation algorithm generates two intermediate points C1 and C2 (or C3 and C4) on the route from A to B; the segments A to C1, C1 to C2 and C2 to B are then no longer strictly left-to-right, which makes the model vehicle keep rotating.
Referring to fig. 5, for two distantly adjacent points A and B, the linear interpolation algorithm generates the intermediate points C1 and C2 on the route from A to B, and linear interpolation produces a broken-line-shaped motion track, so the motion route of the model vehicle is not smooth.
Referring to fig. 6, for two distantly adjacent points A and B, the Lagrange interpolation algorithm generates the intermediate points C1 and C2 on the route from A to B and produces a smooth motion track.
Based on the above comparison and analysis, this embodiment sets a distance difference threshold (determined according to actual conditions); when the current record is received it is compared with the previous one, a linear interpolation algorithm is used when the distance difference between the two records is less than or equal to the threshold, and a Lagrange interpolation algorithm is used when it is greater than the threshold.
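A minimal sketch of this selection on the CZML position property; the threshold value and the interpretation of the distance difference as a plate-number gap are assumptions:

```javascript
// Switch the CZML interpolation algorithm depending on how far apart two successive
// samples are. CZML position properties accept "LINEAR" and "LAGRANGE" algorithms.
const PLATE_GAP_THRESHOLD = 3; // assumed threshold, in plates

function chooseInterpolation(czml, data, predata) {
  const gap = Math.abs(data - predata);
  if (gap <= PLATE_GAP_THRESHOLD) {
    czml[1].position.interpolationAlgorithm = "LINEAR";   // close points: straight segments
  } else {
    czml[1].position.interpolationAlgorithm = "LAGRANGE"; // far points: smooth curve
    czml[1].position.interpolationDegree = 2;
  }
}
```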
This embodiment also sets the plate area through which vehicles may legally enter the runway. If the first record of a vehicle lies inside the legal area, the vehicle is a normal vehicle and the vehicle model rendered in Cesium is displayed in green; otherwise it is an intruding vehicle, the vehicle model rendered in Cesium is displayed in red, and alarm information is first displayed in the system.
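A minimal sketch of building the CZML packet for a vehicle with this colouring rule; the legal plate range, the model URL and the showAlarm helper are placeholders:

```javascript
// Build a CZML packet for one vehicle; an intruder (first plate outside the legal
// entry area) is rendered red, a normal vehicle green.
const LEGAL_ENTRY_PLATES = { min: 0, max: 20 }; // assumed legal entry area

function vehiclePacket(id, firstPlate) {
  const legal = firstPlate >= LEGAL_ENTRY_PLATES.min && firstPlate <= LEGAL_ENTRY_PLATES.max;
  if (!legal) showAlarm(id); // assumed helper that shows the alarm on the front-end page
  return {
    id: "vehicle-" + id,
    model: {
      gltf: "models/guide_vehicle.glb",                             // placeholder model file
      color: { rgba: legal ? [0, 255, 0, 255] : [255, 0, 0, 255] }, // green / red
    },
    position: { epoch: new Date().toISOString(), cartographicDegrees: [] },
  };
}
```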
When no more vehicle records carrying a given vehicle identification are received, the position coordinate information of that vehicle stored in Cesium is cleared so as to preserve the working performance of the system. After new data are appended to the array, the current time in the CZML is modified so that playback does not restart from the beginning: czml[0].clock.currentTime = currentTime.toString(); the CZML data are then re-added for rendering with viewer.dataSources.add(Cesium.CzmlDataSource.load(czml)), so that the viewer renders the updated track.
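A minimal sketch of this refresh step; removing the previous data source before re-adding the CZML is an assumption, the patent only states that the CZML data are re-added:

```javascript
// Update the CZML clock to the viewer's current time and reload the data source so
// that rendering continues from "now" instead of restarting from the beginning.
async function refreshTrack(viewer, czml) {
  czml[0].clock.currentTime = Cesium.JulianDate.toIso8601(viewer.clock.currentTime);
  viewer.dataSources.removeAll();                                 // assumed cleanup
  await viewer.dataSources.add(Cesium.CzmlDataSource.load(czml)); // re-add for rendering
}
```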
When the vehicle stops moving, data also stop being received and the position array in the CZML is no longer extended. After the run is finished, the front-end page displays the relevant information of this run, including the running time, the maximum speed, the average speed and the running track.
It should be understood that the above description of the preferred embodiments is illustrative rather than restrictive, and that various changes and modifications may be made by those skilled in the art without departing from the scope of the invention as defined by the appended claims.

Claims (10)

1. An intelligent airport, characterized in that: the airport comprises N runways, each runway comprises M lanes, and each lane comprises X plates; wherein N, M and X are preset values;
the X plates have the same size, and the longitude and latitude differences between adjacent plates (front, rear, left and right) are consistent;
optical fiber sensors are buried under the runway and collect runway vibration data in real time; a front-end controller determines from the vibration data the number of the plate a vehicle is passing over, detects the safety of the airport runway in real time, and gives timely alarms according to the different access authorities assigned to the plates in different areas of the runway.
2. A real-time tracking and rendering method for intelligent airport runway vehicles, characterized by comprising the following steps:
step 1: constructing an equal-proportion airport panoramic model;
step 2: converting the airport panoramic model in obj format into the 3D Tiles format supported by Cesium;
step 3: loading the model in the Cesium map engine, in the corresponding coordinate loading mode, according to the real longitude and latitude coordinates of the airport, to generate a live-action three-dimensional map;
step 4: tracking and rendering the vehicles on the airport runway on the three-dimensional map in real time according to the received data;
the optical fiber sensors and the front-end controller transmit data through the WebSocket communication protocol, and the agreed format of the received data is as follows:
(1) when no vehicle is running on the runway, the front end receives "nodata", meaning that no vehicle currently needs to be tracked and rendered;
(2) when only one vehicle is running, the front end receives one vehicle data record: the first character is the identification of the vehicle, which is destroyed once the vehicle leaves the runway, and the identification of the next vehicle is incremented in sequence; the second character indicates forward or reverse travel; the last 6 characters give the number of the plate the vehicle is currently on;
(3) when records of several vehicles exist at the same time, the records of different vehicles are separated by a preset delimiter; their leading vehicle identifications are incremented in sequence, and the meaning of each record is the same as in (2);
(4) when a vehicle passes over a faulted (misaligned) slab, the front end receives slab-fault alarm data: the first two characters identify the fault event, and the last 6 characters give the number of the plate where the fault is located.
3. The real-time tracking and rendering method for intelligent airport runway vehicles according to claim 2, characterized in that the specific implementation of step 4 comprises the following sub-steps:
step 4.1: according to the number of the plate the vehicle has reached, calculating the longitude, latitude and height of the vehicle so as to determine its actual motion track, where the height is the fixed value 2 and the longitude and latitude are calculated as follows:
longitude=lonlat[i].longitude+PlateNumber×LonDifferenceValue;
latitude=lonlat[i].latitude+PlateNumber×LatDifferenceValue;
wherein lonlat[i].longitude is the longitude of the starting position of the i-th lane and lonlat[i].latitude is the latitude of the starting position of the i-th lane; PlateNumber is the plate number of the received record; LonDifferenceValue is the longitude difference between adjacent plates along the plate-numbering direction; LatDifferenceValue is the latitude difference between adjacent plates along the plate-numbering direction;
step 4.2: appending the longitude, latitude and height of the vehicle calculated in step 4.1 in real time to the position array position defined in the CZML, and tracking the track of the vehicle by real-time rendering.
4. The real-time tracking and rendering method for intelligent airport runway vehicles according to claim 3, characterized in that in step 4.2 interpolation is performed in the CZML according to the timestamps of two successive data receptions; a distance difference threshold is preset, the current record is compared with the previous one when it is received, a linear interpolation algorithm is used when the distance difference between the two records is less than or equal to the threshold, and a Lagrange interpolation algorithm is used when it is greater than the threshold.
5. The real-time tracking and rendering method for intelligent airport runway vehicles according to claim 3, characterized in that: in step 4.1, the longitude and latitude coordinates of the center point of each plate are taken as the coordinates of the whole plate, and this point is also taken as the detected vehicle coordinate.
6. The real-time tracking and rendering method for intelligent airport runway vehicles according to claim 3, characterized in that: in step 4.1, when the first vehicle record is received, the coordinates of two points are added to the Cesium internal coordinate array: the first point is the longitude and latitude of the plate half a plate before the plate in the normally received record, and the second point is the longitude and latitude of the plate in the normally received record.
7. The real-time tracking and rendering method for intelligent airport runway vehicles according to claim 3, characterized in that: in step 4.1, the real-time speed of the moving vehicle is calculated and updated; predata records the plate number of the previous record, and the real-time speed Speed of the vehicle is calculated as follows:
Speed = (Math.abs(data - predata) × distance) / TimeDifference;
TimeDifference = timenow - timepre;
where data is the plate number in the currently received record, distance is the actual distance between two adjacent plates, predata is the plate number of the previous record, and Math.abs(data - predata) is the absolute difference in plate number between the previous and the current record, used to calculate the distance travelled; TimeDifference is the time difference between the two data transmissions, timenow is the timestamp of the currently received record, obtained from new Date().getTime(), and timepre records the timestamp of the previously received record.
8. The real-time tracking and rendering method for intelligent airport runway vehicles according to claim 3, characterized in that: in step 4.2, when no more vehicle records carrying a given vehicle identification are received, the position coordinate information of that vehicle stored in Cesium is cleared.
9. The real-time tracking and rendering method for intelligent airport runway vehicles according to any one of claims 2 to 8, characterized in that: a plate area through which vehicles may legally enter the runway is set; if the first record of a vehicle lies inside the legal area, the vehicle is a normal vehicle and the vehicle model rendered in Cesium is displayed in green; otherwise it is an intruding vehicle, the vehicle model rendered in Cesium is displayed in red, and an alarm is given.
10. An intelligent airport runway vehicle real-time tracking and rendering system, characterized by comprising the following modules:
module 1, configured to construct an equal-proportion airport panoramic model;
module 2, configured to convert the obj-format airport panoramic model into the 3D Tiles format that Cesium supports loading;
module 3, configured to load the model in the Cesium map engine, in the corresponding coordinate loading mode, according to the real longitude and latitude coordinates of the airport, to generate a live-action three-dimensional map;
module 4, configured to track and render the vehicles on the airport runway on the three-dimensional map in real time according to the received data;
the optical fiber sensors and the front-end controller transmit data through the WebSocket communication protocol, and the agreed format of the received data is as follows:
(1) when no vehicle is running on the runway, the front end receives "nodata", meaning that no vehicle currently needs to be tracked and rendered;
(2) when only one vehicle is running, the front end receives one vehicle data record: the first character is the identification of the vehicle, which is destroyed once the vehicle leaves the runway, and the identification of the next vehicle is incremented in sequence; the second character indicates forward or reverse travel; the last 6 characters give the number of the plate the vehicle is currently on;
(3) when records of several vehicles exist at the same time, the records of different vehicles are separated by a preset delimiter; their leading vehicle identifications are incremented in sequence, and the meaning of each record is the same as in (2);
(4) when a vehicle passes over a faulted (misaligned) slab, the front end receives slab-fault alarm data: the first two characters identify the fault event, and the last 6 characters give the number of the plate where the fault is located.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116610644A (en) * 2023-07-19 2023-08-18 青岛民航凯亚系统集成有限公司 Airport pavement system data storage method and system
CN116610644B (en) * 2023-07-19 2023-12-01 青岛民航凯亚系统集成有限公司 Airport pavement system data storage method and system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination