CN114739381A - Airport vehicle navigation system and method - Google Patents

Airport vehicle navigation system and method

Info

Publication number
CN114739381A
Authority
CN
China
Prior art keywords
data
vehicle
matched
time
objects
Prior art date
Legal status
Pending
Application number
CN202210194521.XA
Other languages
Chinese (zh)
Inventor
宋志伟 (Song Zhiwei)
Current Assignee
Cangqing Intelligent Technology Shanghai Co ltd
Original Assignee
Cangqing Intelligent Technology Shanghai Co ltd
Priority date
Filing date
Publication date
Application filed by Cangqing Intelligent Technology Shanghai Co ltd
Priority to CN202210194521.XA
Publication of CN114739381A
Legal status: Pending

Classifications

    • G01C21/005: Navigation; navigational instruments not provided for in groups G01C1/00-G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20: Instruments for performing navigational calculations
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867: Combination of radar systems with cameras
    • G01S5/0264: Hybrid positioning by combining or switching between positions derived from two or more separate positioning systems, at least one of the systems being a non-radio-wave positioning system
    • B60W60/001: Drive control systems specially adapted for autonomous road vehicles; planning or execution of driving tasks
    • B60W2556/50: External transmission of positioning data to or from the vehicle, e.g. GPS [Global Positioning System] data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides an airport vehicle navigation system and method. The system comprises a first server, an A-SMGCS system, a vehicle dispatching system and a vehicle-end system. The first server is connected with the A-SMGCS system and the vehicle dispatching system, obtains reference data through the A-SMGCS system and dispatching data through the vehicle dispatching system, and fuses the reference data and the dispatching data by a first processing method to obtain and store first fusion data. The vehicle-end system is connected with the first server and comprises a data fusion module, a wireless communication module, a sensor module and an automatic driving module. The data fusion module accesses the first server through the wireless communication module to obtain the first fusion data; it then acquires the sensing data collected by the sensor module and fuses it with the first fusion data according to a second processing method to obtain navigation correction data. The automatic driving module obtains the navigation correction data through the data fusion module to plan a navigation path.

Description

Airport vehicle navigation system and method
Technical Field
The invention relates to the technical field of automatic vehicle navigation, and in particular to a system and method for navigating vehicles in an airport environment in cooperation with the airport's A-SMGCS system.
Background
Autonomous vehicles typically acquire information about their surroundings through their own onboard sensors (e.g., lidar, cameras). The acquired information includes the positions of the vehicle and of surrounding obstacles (such as other vehicles and pedestrians). Current vehicle-mounted sensors achieve an accurate detection range of about 100-200 meters, which satisfies the safety margin for detecting obstacles and braking in ordinary environments, including pedestrians crossing and cross traffic at intersections.
In an airport environment, however, the safety-distance requirements around aircraft are higher. A detection range of 200 meters does not meet them; the position and speed of an aircraft must be detected much earlier so that path planning and avoidance can be carried out in advance.
Therefore, how to integrate the airport's existing surface-surveillance radar and transponder equipment, and fuse the information obtained from that equipment with the perception data of the autonomous vehicle, so as to achieve more accurate position detection of vehicles and aircraft in the airport environment, ensure safety and improve operating efficiency, is one of the technical problems urgently awaiting a solution in this field.
Disclosure of Invention
The main object of the invention is to provide an airport vehicle navigation system and method that use the A-SMGCS system to correct navigation path planning while the vehicle drives autonomously.
In order to achieve the above object, according to one aspect of the present invention, there is provided an airport vehicle navigation method, comprising the steps of:
the first server is connected with the A-SMGCS system and the vehicle dispatching system, reference data are obtained through the A-SMGCS system, dispatching data are obtained through the vehicle dispatching system, the reference data and the dispatching data are fused through a first processing method, and first fusion data are obtained and stored;
the vehicle data fusion module accesses the first server through a wireless communication module to obtain the first fusion data;
the data fusion module acquires the sensing data collected by the vehicle sensor module and fuses it with the first fusion data according to a second processing method to obtain navigation correction data;
and the vehicle automatic driving module acquires navigation correction data through the data fusion module so as to plan a navigation path.
Wherein in a preferred embodiment, the reference data comprises:
id: the object id;
t: the time at which the object is detected;
pose: the center pose of the object;
size: the size of the object;
v: the speed of the object; and
c: the covariance of the detection result.
Wherein in a preferred embodiment, the object id comprises:
a global id, which is a fixed unique number of an airport device; and
a local id, which is the id assigned to a local object detected by the vehicle-mounted sensors before that object has been fused with the first fusion data.
Wherein in a preferred embodiment, the first processing method step comprises:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set.
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j; denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object global id; if o'_j and an object p_k in O' are matched successfully, merge o'_j and p_k and put the result into O'; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
Wherein in a preferred embodiment, the second processing method comprises the steps of:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set (the navigation correction data).
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j (the prediction method may be an EKF or the like); denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object position; if o'_j and an object p_k in O' are matched successfully: when the ids of the two objects are the same, or only one of them is a global id, merge o'_j and p_k and put the result into O'; when both ids are global ids and differ, put o'_j directly into O' as a separate object; otherwise merge o'_j and p_k into O', taking the id of p_k; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
In order to achieve the above object, according to another aspect of the present invention, there is provided an airport vehicle navigation system comprising: a first server, an A-SMGCS system, a vehicle dispatching system and a vehicle-end system, wherein the first server is connected with the A-SMGCS system and the vehicle dispatching system, obtains reference data through the A-SMGCS system and dispatching data through the vehicle dispatching system, and fuses the reference data and the dispatching data through a first processing method to obtain and store first fusion data;
the vehicle-end system is connected with the first server and comprises: a data fusion module, a wireless communication module, a sensor module and an automatic driving module, wherein the data fusion module accesses the first server through the wireless communication module to obtain the first fusion data; the data fusion module acquires the sensing data collected by the sensor module and fuses it with the first fusion data according to a second processing method to obtain navigation correction data;
and the automatic driving module acquires navigation correction data through the data fusion module so as to plan a navigation path.
Wherein in a preferred embodiment, the reference data comprises:
id: the object id;
t: the time at which the object is detected;
pose: the center pose of the object;
size: the size of the object;
v: the speed of the object; and
c: the covariance of the detection result.
Wherein in a preferred embodiment, the object id comprises:
a global id, which is a fixed unique number of an airport device; and
a local id, which is the id assigned to a local object detected by the vehicle-mounted sensors before that object has been fused with the first fusion data.
Wherein in a preferred embodiment, the first processing method step comprises:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set.
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j; denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object global id; if o'_j and an object p_k in O' are matched successfully, merge o'_j and p_k and put the result into O'; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
Wherein in a preferred embodiment, the second processing method comprises the steps of:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set (the navigation correction data).
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j (the prediction method may be an EKF or the like); denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object position; if o'_j and an object p_k in O' are matched successfully: when the ids of the two objects are the same, or only one of them is a global id, merge o'_j and p_k and put the result into O'; when both ids are global ids and differ, put o'_j directly into O' as a separate object; otherwise merge o'_j and p_k into O', taking the id of p_k; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
The airport vehicle navigation system and method provided by the invention can effectively use the A-SMGCS system to correct navigation path planning while the vehicle drives autonomously, greatly extending the range and precision with which an unmanned vehicle detects obstacles and allowing it to obtain in time information that its onboard sensors cannot acquire and to respond in advance. Compared with traditional detection that relies only on vehicle-mounted sensors, the method is therefore safer and more efficient. Because the surface-surveillance radar has a long three-dimensional sensing range and wide coverage in an airport environment, it can collect information that an autonomous vehicle on the ground cannot obtain (due to occlusion or a limited sensing range). Furthermore, the transponder systems pre-installed on individual vehicles and aircraft can report their respective positions even before they have been scanned by the surveillance radar. As for deployment cost, since the surveillance radar and transponder systems are existing equipment in the airport environment, no additional deployment cost or engineering is needed; the corresponding information only has to be imported into the system provided by the invention for fusion, which greatly reduces the implementation cost of the technique.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of an airport vehicle navigation system of the present invention.
Detailed Description
The following describes embodiments of the present invention in detail. The examples below will help those skilled in the art to further understand the invention, but they do not limit it in any way. It should be noted that persons skilled in the art can make variations and modifications without departing from the spirit of the invention, all of which fall within the scope of the present invention.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
In a typical airport environment, to ensure the safety of the airport and particularly of aircraft, the existing Advanced Surface Movement Guidance and Control System (A-SMGCS) already integrates various field-side and vehicle-mounted devices to detect the position of every vehicle and aircraft at the airport in real time. These devices include:
(1) one or more S-band surface-surveillance radars (primary radars);
(2) a multilateration system (MLAT): transponders are installed on aircraft and vehicles, and ground stations (transmitters and receivers) are deployed in the blind areas of the surveillance radar; accurate positioning is achieved from the time difference of arrival (TDOA) of a reply signal at the receivers, and targets are identified by the address code in the reply. The multilateration transponders are compatible with Mode A/C, Mode S, ADS-B and similar reply signals;
(3) ADS-B (Automatic Dependent Surveillance-Broadcast), generally used on aircraft; the onboard ADS-B device broadcasts information including its own position.
The A-SMGCS combines these three surveillance technologies, correlates and comprehensively processes their information, and identifies and tracks aircraft and vehicles by number.
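As a numerical illustration of the TDOA principle used by the multilateration ground stations, the following Python sketch locates a transponder by brute-force grid search over assumed station coordinates; none of the names or values come from the patent or from a real MLAT implementation.

```python
import numpy as np

C = 299_792_458.0  # signal propagation speed (speed of light), m/s

def tdoa_locate(stations, tdoas, area, step=2.0):
    """Brute-force 2D multilateration: find the grid point whose predicted
    TDOAs (arrival-time differences relative to station 0) best match the
    measured ones."""
    xs = np.arange(area[0], area[1], step)
    ys = np.arange(area[2], area[3], step)
    gx, gy = np.meshgrid(xs, ys)
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1)              # candidate positions
    d = np.linalg.norm(grid[:, None, :] - stations[None], axis=2)  # distance to each station
    pred = (d[:, 1:] - d[:, :1]) / C                               # predicted TDOAs
    err = ((pred - tdoas) ** 2).sum(axis=1)                        # squared residual per point
    return grid[err.argmin()]

# Hypothetical ground stations at the corners of a 1000 m x 800 m apron
# and a transponder actually located at (400, 250).
stations = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 800.0], [1000.0, 800.0]])
truth = np.array([400.0, 250.0])
dist = np.linalg.norm(stations - truth, axis=1)
tdoas = (dist[1:] - dist[0]) / C            # simulated measured arrival-time differences
print(tdoa_locate(stations, tdoas, area=(0, 1000, 0, 800)))  # -> [400. 250.]
```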
On the other hand, autonomous vehicles generally rely on onboard sensors (e.g., lidar, cameras) for localization and obstacle detection. This approach has several technical drawbacks:
1) Onboard sensors have large blind areas. In general, an onboard sensor detects position or obstacles using the straight-line propagation of light, electromagnetic waves or sound waves, and must be mounted on the vehicle body. It is therefore easily blocked by obstacles and cannot perceive objects behind them.
This drawback is not obvious on open roads, but in a complex airport environment with many occluding structures it raises significant safety concerns; the detection efficiency of onboard sensors at an airport is much lower than on an open road.
2) Onboard sensors have limited capability for detecting dynamic objects, being constrained by their detection frequency and processing power. When a dynamic object is present in the environment, an onboard sensor may fail to detect it in time and a safety accident may occur.
3) Onboard sensors are costly. When position and obstacle detection rely only on onboard sensors, every vehicle must carry its own sensor suite; since onboard sensors are expensive, the total cost grows with each additional autonomous vehicle. Moreover, many onboard sensors operating in adjacent areas can interfere with one another, making the problem even harder to solve.
The invention provides an airport vehicle navigation system and method that reduce the blind areas of autonomous vehicles. First, because the surface-surveillance radar is mounted on a tall tower, there are few occluders at that height, especially as the airport environment has no tall buildings, so vehicles and aircraft in the area around the radar can be continuously observed; and the detection range of the radar's electromagnetic waves far exceeds that of onboard sensors.
Second, when a vehicle or aircraft is in an area the surveillance radar cannot detect, the multilateration system and the ADS-B system automatically report its position through the vehicle's or aircraft's transponder. There is therefore essentially no blind spot for vehicles and aircraft.
In addition, because the position of a vehicle or aircraft is continuously tracked by the surveillance radar, the multilateration system or the ADS-B system, its direction of travel can be predicted even when it lies in the blind area of the onboard sensors. The prediction results let the autonomous vehicle respond in advance, improving its safety and traffic efficiency.
In the airport environment, the surveillance radar, the multilateration system and the ADS-B system are ready-made; no redeployment is needed, and the same signals can serve many autonomous vehicles. With this system there is no need to install excessive sensors on each autonomous vehicle, so the cost can be far lower than that of an autonomous system relying entirely on onboard sensors.
To achieve the above object, according to one aspect of the present invention, as shown in FIG. 1, there is provided an airport vehicle navigation system comprising: a first server, an A-SMGCS system, a vehicle dispatching system and a vehicle-end system. The first server is connected with the A-SMGCS system and the vehicle dispatching system, obtains reference data through the A-SMGCS system and dispatching data through the vehicle dispatching system, and fuses the reference data and the dispatching data through a first processing method to obtain and store first fusion data. The vehicle-end system is connected with the first server and comprises: a data fusion module, a wireless communication module, a sensor module and an automatic driving module. The data fusion module accesses the first server through the wireless communication module to obtain the first fusion data; it also acquires the sensing data collected by the sensor module and fuses it with the first fusion data according to a second processing method to obtain navigation correction data. The automatic driving module obtains the navigation correction data through the data fusion module to plan a navigation path.
In this embodiment, the vehicle-end system may subscribe to the first fusion data on the first server through the wireless communication module so as to keep the data up to date, and may perform advance route planning and route adjustment, thereby avoiding the safety hazards of coming too close to aircraft and other vehicles.
Specifically, in this embodiment, in order to fuse the detection results of different sensors, the invention preprocesses every sensing detection result into a unified object detection result data structure; that is, the reference data comprises the tuple (id, t, pose, size, v, c), wherein:
id: the object id, divided into two types whose values do not overlap:
a global id, a fixed unique number of an airport device; this type of id is contained in the detection results of the surface-surveillance radar, the multilateration system and ADS-B;
a local id, the id assigned to a local object detected by the vehicle-mounted sensors before that object has been fused with the server's detection results;
t: the time at which the object is detected;
pose: the object center pose (position and orientation): (latitude, longitude, altitude, heading);
size: the size of the object: (length, width, height);
v: the object velocity, as a vector;
c: the covariance of the detection result (c_pose: the pose covariance; c_v: the velocity covariance); the magnitude of the covariance is determined by the characteristics of the different sensors, e.g. position detection by lidar is highly reliable, and velocity detection by millimeter-wave radar is highly reliable.
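As a concrete rendering of this unified data structure, the sketch below expresses it as a Python dataclass. The patent fixes only the logical fields (id, t, pose, size, v, c) and the two id types; the class name, field types and example values are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple
import numpy as np

@dataclass
class Detection:
    """Unified object detection result: (id, t, pose, size, v, c)."""
    global_id: Optional[int]   # fixed unique number of an airport device, if known
    local_id: Optional[int]    # id assigned by the vehicle before fusion
    t: float                   # time at which the object was detected (s)
    pose: Tuple[float, float, float, float]  # latitude, longitude, altitude, heading
    size: Tuple[float, float, float]         # length, width, height (m)
    v: np.ndarray              # velocity vector
    c_pose: np.ndarray         # pose covariance
    c_v: np.ndarray            # velocity covariance

# Example: a lidar detection carries only a local id, and its pose covariance
# is small because lidar position measurements are highly reliable.
det = Detection(global_id=None, local_id=17, t=1_700_000_000.0,
                pose=(31.19, 121.33, 4.0, 90.0), size=(12.0, 3.0, 3.5),
                v=np.array([5.0, 0.0]),
                c_pose=np.eye(4) * 0.01, c_v=np.eye(2) * 0.5)
```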
Wherein the first processing method executed at the first server includes:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set.
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j; denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object global id; if o'_j and an object p_k in O' are matched successfully, merge o'_j and p_k and put the result into O'; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
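For concreteness, a minimal Python sketch of this server-side loop follows. It assumes a constant-velocity motion model in a local metric frame and a newest-wins merge, since the patent leaves the prediction and combination methods open; all function and field names are illustrative, not taken from the patent.

```python
import time

def predict(obj, t):
    """Constant-velocity prediction of an object's planar position to time t
    (a stand-in for whatever motion model the server actually uses)."""
    dt = t - obj["t"]
    x, y = obj["pos"]
    vx, vy = obj["v"]
    return {**obj, "t": t, "pos": (x + vx * dt, y + vy * dt)}

def merge(a, b):
    """Merge two reports of the same object; newest-wins here, whereas a
    covariance-weighted combination would be used in practice."""
    return a if a["t"] >= b["t"] else b

def server_fusion_loop(detections_stream, N=10):
    """S1-S6 of the first processing method, keyed by global id."""
    O, missed = {}, {}                       # fused set O and per-id miss counters
    for D_i in detections_stream:            # S1: receive the i-th detection result
        t = time.time()                      # S2: current time <- system time
        Dp = [predict(o, t) for o in D_i]    # S3: predict each detected object
        Op = {g: predict(o, t) for g, o in O.items()}   # S4: predict the set O
        seen = set()
        for o in Dp:                         # S5: match by global id and merge
            g = o["global_id"]
            seen.add(g)
            Op[g] = merge(o, Op[g]) if g in Op else o
            missed[g] = 0
        for g in list(Op):                   # drop ids unmatched for N cycles
            if g not in seen:
                missed[g] = missed.get(g, 0) + 1
                if missed[g] >= N:
                    del Op[g], missed[g]
        O = Op                               # S6: O <- O', output O, loop to S1
        yield O
```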
Wherein, executed in the vehicle-end system, the second processing method comprises the following steps:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set (the navigation correction data).
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j (the prediction method may be an EKF or the like); denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object position (the matching algorithm may be ICP or the like); if o'_j and an object p_k in O' are matched successfully: when the ids of the two objects are the same, or only one of them is a global id, merge o'_j and p_k into O' according to the minimum covariance error and set the id to the global id; when both ids are global ids and differ, put o'_j directly into O' as a separate object; otherwise merge o'_j and p_k into O' according to the minimum covariance error, taking the id of p_k; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
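For concreteness, the id rules and the "minimum covariance error" combination of S5 can be sketched as below. Nearest-neighbour gating stands in for the ICP-style position matching, and the merge uses the standard covariance-weighted fusion x = (P1^-1 + P2^-1)^-1 (P1^-1 x1 + P2^-1 x2); the function names and the gate value are assumptions made for the example.

```python
import numpy as np

def fuse_states(x1, P1, x2, P2):
    """Covariance-weighted fusion of two estimates of the same quantity:
    x = (P1^-1 + P2^-1)^-1 (P1^-1 x1 + P2^-1 x2)."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)
    return P @ (I1 @ x1 + I2 @ x2), P

def match_and_merge(o, O_pred, gate=5.0):
    """Vehicle-side S5 for one predicted local detection `o` against the
    predicted set O_pred; nearest neighbour within `gate` metres stands in
    for the ICP-style position matching."""
    best, best_d = None, gate
    for p in O_pred:
        d = np.linalg.norm(o["pos"] - p["pos"])
        if d < best_d:
            best, best_d = p, d
    if best is None:                          # no position match: new object
        O_pred.append(o)
        return
    same_id = o["id"] == best["id"]
    if o["is_global"] and best["is_global"] and not same_id:
        O_pred.append(o)                      # two distinct airport devices:
        return                                # keep o as a separate object
    # merge by minimum covariance error
    best["pos"], best["c_pose"] = fuse_states(o["pos"], o["c_pose"],
                                              best["pos"], best["c_pose"])
    if o["is_global"] and not best["is_global"]:
        best["id"], best["is_global"] = o["id"], True  # prefer the global id
```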
In this way, the airport's existing A-SMGCS system is used to perform advance route planning and route adjustment for the autonomous vehicle in real time, avoiding the safety hazards of coming too close to aircraft and other vehicles.
Meanwhile, the fused first fusion data is stored in the multi-mode hybrid vehicle dispatching system. The dispatching system is responsible for communicating with the autonomous vehicles, assigning tasks and distributing the first fusion data. It optimizes the task allocation of the autonomous vehicles based on the first fusion data, and then sends the optimized allocation scheme together with the vehicle and aircraft information to the corresponding autonomous vehicles over the airport's private LTE network. The system may run on a locally deployed cloud server. Each autonomous vehicle can subscribe to the first fusion data from the dispatching system and fuse the received data a second time with the data collected by its onboard sensors to obtain navigation correction data, thereby further correcting the information about other vehicles and aircraft relevant to it.
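One way the subscription and secondary fusion could fit together is sketched below, with an in-process queue standing in for the airport LTE link; the callables passed in (sensor reader, second-fusion routine, autopilot planner) are placeholders for the modules described above, not interfaces defined by the patent.

```python
import queue, time

first_fusion_bus = queue.Queue()      # stands in for the airport LTE private network

def dispatch_publisher(server_fusion):
    """Dispatch side: push each new first-fusion snapshot to subscribers."""
    for O in server_fusion:           # e.g. the server_fusion_loop sketched above
        first_fusion_bus.put((O, time.time()))

def vehicle_subscriber(read_sensors, second_fusion, autopilot_plan):
    """Vehicle side: fuse the latest subscribed snapshot with local sensing
    and hand the navigation correction data to the autopilot."""
    latest = {}
    while True:
        try:                          # keep only the newest snapshot, if any
            latest, _ = first_fusion_bus.get(timeout=0.1)
        except queue.Empty:
            pass
        local = read_sensors()        # detections from the on-board sensors
        corrected = second_fusion(local, latest)   # second processing method
        autopilot_plan(corrected)     # input for navigation path planning
```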
The fused information is sent to the automatic driving module of the autonomous vehicle, where it is used to optimize and adjust the driving path and to perform necessary obstacle avoidance, thereby improving the safety and driving efficiency of the autonomous vehicle.
In addition, the data fusion algorithms, scheduling algorithms, path planning and navigation obstacle-avoidance algorithms used in the above modules are not specially limited, nor are the data transmission modes between the modules or the hardware on which each system runs; any scheme that uses the above approach to realize vehicle-road cooperative position detection in an airport environment is considered within the disclosure of the present invention.
In one aspect of the present invention, there is also provided an airport vehicle navigation method, comprising the steps of:
the first server is connected with the A-SMGCS system and the vehicle dispatching system, reference data are obtained through the A-SMGCS system, dispatching data are obtained through the vehicle dispatching system, the reference data and the dispatching data are fused through a first processing method, and first fused data are obtained and stored;
the vehicle data fusion module accesses the first server through a wireless communication module to obtain the first fusion data; the data fusion module acquires the sensing data collected by the vehicle sensor module and fuses it with the first fusion data according to a second processing method to obtain navigation correction data;
and the vehicle automatic driving module acquires navigation correction data through the data fusion module so as to plan a navigation path.
In order to fuse the detection results of different sensors, the invention preprocesses every sensing detection result into a unified object detection result data structure; that is, the reference data comprises the tuple (id, t, pose, size, v, c), wherein:
id: the object id, divided into two types whose values do not overlap:
a global id, a fixed unique number of an airport device; this type of id is contained in the detection results of the surface-surveillance radar, the multilateration system and ADS-B;
a local id, the id assigned to a local object detected by the vehicle-mounted sensors before that object has been fused with the server's detection results;
t: the time at which the object is detected;
pose: the object center pose (position and orientation): (latitude, longitude, altitude, heading);
size: the size of the object: (length, width, height);
v: the object velocity, as a vector;
c: the covariance of the detection result (c_pose: the pose covariance; c_v: the velocity covariance); the magnitude of the covariance is determined by the characteristics of the different sensors, e.g. position detection by lidar is highly reliable, and velocity detection by millimeter-wave radar is highly reliable.
Wherein the first processing method executed at the first server includes:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set.
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j; denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object global id; if o'_j and an object p_k in O' are matched successfully, merge o'_j and p_k and put the result into O'; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
Wherein, executed in the vehicle-end system, the second processing method comprises the steps of:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set (the navigation correction data).
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j (the prediction method may be an EKF or the like); denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object position (the matching algorithm may be ICP or the like); if o'_j and an object p_k in O' are matched successfully: when the ids of the two objects are the same, or only one of them is a global id, merge o'_j and p_k into O' according to the minimum covariance error and set the id to the global id; when both ids are global ids and differ, put o'_j directly into O' as a separate object; otherwise merge o'_j and p_k into O' according to the minimum covariance error, taking the id of p_k; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
In conclusion, the airport vehicle navigation system and method provided by the invention can effectively use the A-SMGCS system to correct navigation path planning while the vehicle drives autonomously, greatly extending the range and precision with which an unmanned vehicle detects obstacles and allowing it to obtain in time information that its onboard sensors cannot acquire and to respond in advance. Compared with traditional detection that relies only on vehicle-mounted sensors, the method is therefore safer and more efficient. Because the surface-surveillance radar has a long three-dimensional sensing range and wide coverage in an airport environment, it can collect information that an autonomous vehicle on the ground cannot obtain (due to occlusion or a limited sensing range). Furthermore, the transponder systems pre-installed on individual vehicles and aircraft can report their respective positions even before they have been scanned by the surveillance radar. As for deployment cost, since the surveillance radar and transponder systems are existing equipment in the airport environment, no additional deployment cost or engineering is needed; the corresponding information only has to be imported into the system provided by the invention for fusion, which greatly reduces the implementation cost of the technique.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is to be limited only by the following claims, and their full scope and equivalents, and any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the invention are intended to be included within the scope of the invention.
It will be appreciated by those skilled in the art that, in addition to implementing the system, apparatus and various modules thereof provided by the present invention in the form of pure computer readable program code, the same procedures may be implemented entirely by logically programming method steps such that the system, apparatus and various modules thereof provided by the present invention are implemented in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system, the device and the modules thereof provided by the present invention can be considered as a hardware component, and the modules included in the system, the device and the modules thereof for implementing various programs can also be considered as structures in the hardware component; modules for performing various functions may also be considered to be both software programs for performing the methods and structures within hardware components.
In addition, all or part of the steps of the methods of the above embodiments may be implemented by a program instructing the related hardware; the program is stored in a storage medium and includes several instructions that enable a single-chip microcomputer, a chip or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
In addition, any combination of various different implementation manners of the embodiments of the present invention can be made, and the embodiments of the present invention should also be regarded as the disclosure of the embodiments of the present invention as long as the combination does not depart from the spirit of the embodiments of the present invention.

Claims (10)

1. An airport vehicle navigation method characterized by the steps of:
the first server is connected with the A-SMGCS system and the vehicle dispatching system, reference data are obtained through the A-SMGCS system, dispatching data are obtained through the vehicle dispatching system, the reference data and the dispatching data are fused through a first processing method, and first fused data are obtained and stored;
the vehicle data fusion module accesses the first server through a wireless communication module to obtain the first fusion data;
the data fusion module acquires the sensing data collected by the vehicle sensor module and fuses it with the first fusion data according to a second processing method to obtain navigation correction data;
and the vehicle automatic driving module acquires navigation correction data through the data fusion module so as to plan a navigation path.
2. The airport vehicle navigation method of claim 1, wherein said reference data comprises:
id: the object id;
t: the time at which the object is detected;
pose: the center pose of the object;
size: the size of the object;
v: the speed of the object; and
c: the covariance of the detection result.
3. The airport vehicle navigation method of claim 2, wherein said object id comprises:
a global id, which is a fixed unique number of an airport device; and
a local id, which is the id assigned to a local object detected by the vehicle-mounted sensors before that object has been fused with the first fusion data.
4. The airport vehicle navigation method of claim 3, wherein said first processing method step comprises:
Setting:
Input: D, the stream of detection results.
Output: O, the fused object set.
Begin:
t ← current system time.
Execute:
S1: receive the i-th detection result D_i;
S2: t ← current system time;
S3: for each detected object o_j in D_i, predict as follows: from o_j, predict the state of the object at time t, obtaining the result o'_j; denote the set of predicted detections by D'_i;
S4: predict each object in O according to the method of S3, obtaining the predicted set O';
S5: match D'_i against O' by object global id; if o'_j and an object p_k in O' are matched successfully, merge o'_j and p_k and put the result into O'; if an o'_j cannot be matched with any object in O', put o'_j directly into O'; if an object p_k in O' cannot be matched with any object in D'_i for N consecutive cycles, i.e. when the threshold is reached, remove p_k from O';
S6: O ← O'; output O; return to S1 and loop.
5. The airport vehicle navigation method of claim 3, wherein said second processing method step comprises:
setting:
input: the detection result stream D;
output: the navigation correction data F;
beginning: the current time t_now ← the system time;
executing:
S1, receiving the i-th detection result D_i;
S2, current time t_now ← the system time;
S3, for each detected object d_j in D_i, predicting as follows: according to d_j, predicting the state of the object at time t_now to obtain the predicted result d_j′; the prediction method may be an EKF or the like;
S4, predicting each object in F according to the method of S3 to obtain the predicted set F′;
S5, matching the predicted detections D_i′ against F′ according to the object position; if d_j′ and f_k′ are matched successfully and the ids of the two objects are the same, or only one of the two ids is a global id, merging d_j′ and f_k′ into F′ and adopting the id of f_k′; if both ids are global ids and they are not the same, putting d_j′ into F′ directly as a separate object; if d_j′ cannot be matched with any object in F′, putting d_j′ into F′ directly; if f_k′ cannot be matched with any object in D_i′ for N consecutive cycles, removing f_k′ from F′ when the count threshold N is reached;
S6, F ← F′; outputting F and returning to S1 to loop.
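The second processing method differs from the first mainly in S5: matching is by position rather than by global id, with rules for reconciling ids. A sketch of that step follows, assuming Python, Euclidean nearest-neighbour gating with a hypothetical MATCH_RADIUS, and a "local:" prefix convention for local ids; none of these choices is fixed by the claim.

```python
import math

MATCH_RADIUS = 2.0  # metres; an assumed gating distance, not fixed by the claim

def is_global(obj_id):
    """Assumed convention: ids without the 'local:' prefix are fixed global ids."""
    return not str(obj_id).startswith("local:")

def match_by_position(d, F_hat):
    """Return the key of the nearest object in F_hat within the gate, else None."""
    best, best_dist = None, MATCH_RADIUS
    for fid, f in F_hat.items():
        dist = math.dist(d["p"], f["p"])
        if dist < best_dist:
            best, best_dist = fid, dist
    return best

def fuse_one_detection(d, F_hat):
    """Apply the S5 id rules of claim 5 to one predicted detection d."""
    fid = match_by_position(d, F_hat)
    if fid is None:
        F_hat[d["id"]] = d                          # no positional match: insert directly
        return
    f = F_hat[fid]
    both_global = is_global(d["id"]) and is_global(f["id"])
    if d["id"] == f["id"] or not both_global:
        F_hat[f["id"]] = {**f, **d, "id": f["id"]}  # merge, adopting the id of f
    else:
        F_hat[d["id"]] = d                          # two distinct global ids: keep apart
```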
6. An airport vehicle navigation system, comprising: a first server, an A-SMGCS system, a vehicle dispatching system and a vehicle-end system, wherein the first server is connected with the A-SMGCS system and the vehicle dispatching system, obtains reference data through the A-SMGCS system and dispatch data through the vehicle dispatching system, and fuses the reference data and the dispatch data by a first processing method to obtain and store first fused data;
the vehicle-end system is connected with the first server and comprises: a data fusion module, a wireless communication module, a sensor module and an automatic driving module, wherein the data fusion module accesses the first server through the wireless communication module to obtain the first fused data; the data fusion module acquires the sensing data collected by the sensor module and fuses the sensing data with the first fused data according to a second processing method to obtain navigation correction data;
and the automatic driving module acquires the navigation correction data through the data fusion module so as to plan a navigation path.
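To make the claim-6 data flow concrete, a minimal wiring sketch follows. Every class, method, and parameter name here is illustrative, since the claim names modules rather than APIs, and the fusion callables stand in for the first and second processing methods of the later claims.

```python
# Data-flow sketch of the claim-6 architecture; all names are illustrative.

class FirstServer:
    """Fuses A-SMGCS reference data with dispatch data (first processing method)."""

    def __init__(self, asmgcs, dispatcher, first_fuse):
        self.asmgcs = asmgcs            # source of reference data
        self.dispatcher = dispatcher    # source of dispatch data
        self.first_fuse = first_fuse    # the first processing method, as a callable
        self.first_fused = {}           # stored first fused data

    def update(self):
        reference = self.asmgcs.poll()
        schedule = self.dispatcher.poll()
        self.first_fused = self.first_fuse(reference, schedule, self.first_fused)


class VehicleEndSystem:
    """Fetches first fused data over the wireless link and fuses in local sensing."""

    def __init__(self, radio, sensors, autopilot, second_fuse):
        self.radio = radio              # wireless communication module
        self.sensors = sensors          # sensor module
        self.autopilot = autopilot      # automatic driving module
        self.second_fuse = second_fuse  # the second processing method, as a callable

    def step(self):
        first_fused = self.radio.fetch_first_fused()
        sensed = self.sensors.detect()
        correction = self.second_fuse(sensed, first_fused)  # navigation correction data
        self.autopilot.plan(correction)                     # plan the navigation path
```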
7. The airport vehicle navigation system of claim 6, wherein said reference data comprises:
id, the object id;
t, the time at which the object was detected;
p, the central pose of the object;
s, the size of the object;
v, the speed of the object; and
c, the covariance of the detection result.
8. The airport vehicle navigation system of claim 7, wherein said object id comprises:
a global id, which is the fixed unique number of a piece of airport equipment; and
a local id, which is the id allocated to a local object detected by the vehicle-mounted sensor before that object has been fused with the first fused data.
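A minimal sketch of the resulting id scheme, assuming string ids with a hypothetical "local:" prefix to distinguish provisional local ids from the fixed global equipment numbers:

```python
import itertools

# Illustrative id scheme for claim 8: airport equipment carries fixed global
# ids, while fresh on-board detections receive provisional local ids until
# they are fused with the first fused data.
_local_ids = itertools.count()

def allocate_local_id() -> str:
    """Assign a provisional id to a not-yet-fused sensor detection."""
    return f"local:{next(_local_ids)}"
```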
9. The airport vehicle navigation system of claim 8, wherein said first processing method step comprises:
setting:
input: the detection result stream D;
output: the first fused data F;
beginning: the current time t_now ← the system time;
executing:
S1, receiving the i-th detection result D_i;
S2, current time t_now ← the system time;
S3, for each detected object d_j in D_i, predicting as follows: according to d_j, predicting the state of the object at time t_now to obtain the predicted result d_j′;
S4, predicting each object in F according to the method of S3 to obtain the predicted set F′;
S5, matching the predicted detections D_i′ against F′ according to the object global id; if d_j′ and f_k′ are matched successfully, merging d_j′ and f_k′ into F′; if d_j′ cannot be matched with any object in F′, putting d_j′ into F′ directly; if f_k′ cannot be matched with any object in D_i′ for N consecutive cycles, removing f_k′ from F′ when the count threshold N is reached;
S6, F ← F′; outputting F and returning to S1 to loop.
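Claim 9 restates the first processing method on the system side. The outer S1-S6 loop that would drive the per-cycle step sketched after claim 4 could look like this; receive_detection and publish are hypothetical transport hooks, since the claims do not specify how detection results arrive or where F is stored.

```python
def run_processing_loop(receive_detection, fuse_step, publish):
    """Outer S1-S6 loop of the first processing method.

    receive_detection yields the next detection result D_i (S1); fuse_step is
    a per-cycle fusion routine such as first_processing_step in the sketch
    after claim 4 (S2-S5); publish outputs the fused data F (S6) each cycle.
    """
    F, misses = {}, {}
    while True:
        D_i = receive_detection()      # S1: receive the i-th detection result
        F = fuse_step(D_i, F, misses)  # S2-S5: predict, match, merge, prune
        publish(F)                     # S6: output F, then return to S1
```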
10. The airport vehicle navigation system of claim 8, wherein said second processing method step comprises:
setting:
input: the detection result stream D;
output: the navigation correction data F;
beginning: the current time t_now ← the system time;
executing:
S1, receiving the i-th detection result D_i;
S2, current time t_now ← the system time;
S3, for each detected object d_j in D_i, predicting as follows: according to d_j, predicting the state of the object at time t_now to obtain the predicted result d_j′; the prediction method may be an EKF or the like;
S4, predicting each object in F according to the method of S3 to obtain the predicted set F′;
S5, matching the predicted detections D_i′ against F′ according to the object position; if d_j′ and f_k′ are matched successfully and the ids of the two objects are the same, or only one of the two ids is a global id, merging d_j′ and f_k′ into F′ and adopting the id of f_k′; if both ids are global ids and they are not the same, putting d_j′ into F′ directly as a separate object; if d_j′ cannot be matched with any object in F′, putting d_j′ into F′ directly; if f_k′ cannot be matched with any object in D_i′ for N consecutive cycles, removing f_k′ from F′ when the count threshold N is reached;
S6, F ← F′; outputting F and returning to S1 to loop.
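S3 here permits "EKF or the like" as the prediction method. A minimal predict step for a constant-velocity state [x, y, vx, vy] follows, using numpy; with this linear model the EKF predict step coincides with the ordinary Kalman predict step, and the process-noise intensity q is an assumed tuning parameter, not a value from the patent.

```python
import numpy as np

def ekf_predict(x, P, dt, q=0.1):
    """Predict step for a constant-velocity state [x, y, vx, vy].

    x: state mean (4,); P: state covariance (4, 4), corresponding to the
    detection covariance c of claim 7; dt: seconds since the detection time;
    q: assumed white-acceleration process-noise intensity.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    G = np.array([[0.5 * dt**2, 0],
                  [0, 0.5 * dt**2],
                  [dt, 0],
                  [0, dt]], dtype=float)
    Q = q * G @ G.T            # process noise injected by unknown acceleration
    x_pred = F @ x             # propagate the mean to the current time t_now
    P_pred = F @ P @ F.T + Q   # propagate the covariance
    return x_pred, P_pred
```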
CN202210194521.XA 2022-03-01 2022-03-01 Airport vehicle navigation system and method Pending CN114739381A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210194521.XA CN114739381A (en) 2022-03-01 2022-03-01 Airport vehicle navigation system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210194521.XA CN114739381A (en) 2022-03-01 2022-03-01 Airport vehicle navigation system and method

Publications (1)

Publication Number Publication Date
CN114739381A true CN114739381A (en) 2022-07-12

Family

ID=82275065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210194521.XA Pending CN114739381A (en) 2022-03-01 2022-03-01 Airport vehicle navigation system and method

Country Status (1)

Country Link
CN (1) CN114739381A (en)

Similar Documents

Publication Publication Date Title
US20210358311A1 (en) Automated system of air traffic control (atc) for at least one unmanned aerial vehicle (uav)
CN102566581B (en) Sensing based on track is with evading
US10073456B2 (en) Automated co-pilot control for autonomous vehicles
US10730521B2 (en) System for autonomous lane merging
CN110099831B (en) Vehicle control system, vehicle control method, and storage medium
US20100100325A1 (en) Site map interface for vehicular application
US20150127249A1 (en) Method and system for creating a current situation depiction
CN109765909B (en) Method for applying V2X system in port
US20220032955A1 (en) Vehicle control device and vehicle control method
GB2524384A (en) Autonomous control in a dense vehicle environment
EP3822140B1 (en) Operational design domain validation coverage for road and lane type
Liu et al. Cooperation of V2I/P2I communication and roadside radar perception for the safety of vulnerable road users
WO2018141613A1 (en) Positioning of unmanned aerial vehicles using millimeter-wave beam infrastructure
CN113866758B (en) Scene monitoring method, system, device and readable storage medium
CN111508281B (en) Method for classifying and guiding ADS-B target by satellite-borne platform
US11423780B2 (en) Traffic control system
JP2019164698A (en) Vehicle control apparatus and vehicle control system
WO2022003343A1 (en) Systems and methods for interactive vehicle transport networks
CN104539906A (en) Image/laser ranging/ABS-B monitoring integrated system
WO2020080995A1 (en) Traffic management system and an unmanned aerial vehicle compatible with such a system
CN114739381A (en) Airport vehicle navigation system and method
CN116095270A (en) Method, device, system and storage medium for infrastructure-supported assistance of a motor vehicle
CN114882717B (en) Object detection system and method based on vehicle-road cooperation
CN113771845A (en) Method, device, vehicle and storage medium for predicting vehicle track
CN114973667B (en) Communication perception calculation integrated road infrastructure system and processing method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination