CN117315581A - Airport apron safety management method, device, equipment and computer storage medium - Google Patents

Airport apron safety management method, device, equipment and computer storage medium

Info

Publication number
CN117315581A
CN117315581A (application number CN202311263116.XA)
Authority
CN
China
Prior art keywords: positioning, coordinates, airport, preset, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311263116.XA
Other languages
Chinese (zh)
Inventor
谢海涛
樊治国
周江涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Gaozhong Information Technology Co ltd
Original Assignee
Qingdao Gaozhong Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Gaozhong Information Technology Co ltd filed Critical Qingdao Gaozhong Information Technology Co ltd
Priority: CN202311263116.XA
Publication: CN117315581A
Legal status: Pending

Classifications

    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06N 3/045: Combinations of networks
    • G06N 3/0464: Convolutional networks [CNN, ConvNet]
    • G06T 7/20: Image analysis; analysis of motion
    • G06T 7/70: Image analysis; determining position or orientation of objects or cameras
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 10/82: Image or video recognition or understanding using neural networks
    • G06V 20/40: Scenes; scene-specific elements in video content
    • G06T 2207/10016: Image acquisition modality: video; image sequence
    • G06T 2207/30241: Subject of image: trajectory
    • G06V 2201/07: Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention relates to the technical field of computer data processing, and discloses an airport apron safety management method, device and equipment and a computer-readable storage medium, wherein the method comprises the following steps: acquiring real-time video image data of an airport apron area; performing target detection on the real-time video image data to obtain the intra-image coordinates of at least one object to be verified included in the airport apron area; detecting whether positioning data transmitted by at least one preset positioning device exists in the airport apron area; when the existence of the positioning data is determined, acquiring the positioning data, wherein a preset positioning device is preset on a preset authorized object corresponding to the airport apron area; matching the positioning data with the coordinates in the image to obtain a matching result; and determining an unauthorized object detection result in the objects to be verified according to the matching result. In this way, real-time, efficient and accurate airport apron safety management is achieved.

Description

Airport apron safety management method, device, equipment and computer storage medium
Technical Field
The embodiment of the invention relates to the technical field of computer data processing, in particular to a method, a device and equipment for airport apron safety management and a computer storage medium.
Background
After an aircraft lands, it enters the stand area via the runway exit and the taxiway; ground staff dock a movable boarding bridge with the cabin door, the door is opened, and passengers reach the terminal building directly through the bridge. The airport apron refers to the designated stand area beside the boarding bridge, where ground staff can maintain the aircraft.
In order to ensure normal and safe operation of the aircraft, it is necessary to perform authorization management on the personnel, vehicles and other equipment, such as power supply units, present on the airport apron, and to prevent unauthorized objects from appearing on the airport apron, thereby ensuring the safety and smoothness of airport operation.
In implementing the prior art, the inventors found that the conventional ways of verifying the authorization of objects on the airport apron have the following drawbacks: setting access-control gates on the approach roads to the apron to authenticate personnel and vehicles restricts verification to fixed locations and is not flexible enough, while patrols of the apron by security staff take a long time per round, cannot detect the activities of unauthorized personnel and vehicles on the apron in real time, and incur high labor costs.
Thus, there is a need for an airport apron safety management solution that is real-time, efficient and accurate.
Disclosure of Invention
In view of the above problems, the embodiments of the present invention provide an airport apron safety management method, apparatus, device, and computer storage medium, which are used to solve the problems of low efficiency and low accuracy of airport apron safety management in the prior art.
According to an aspect of the embodiment of the present invention, there is provided an airport apron safety management method, including:
acquiring real-time video image data of an airport apron area;
performing target detection on the real-time video image data to obtain the intra-image coordinates of at least one object to be verified, which is included in the airport apron area;
detecting whether positioning data transmitted by at least one preset positioning device exist in the airport apron area or not;
when the existence of the positioning data is determined, acquiring the positioning data; one of the preset positioning devices is preset on one preset authorized object corresponding to the airport apron area;
matching the positioning data with the coordinates in the image to obtain a matching result;
and determining an unauthorized object detection result in the object to be verified according to the matching result.
In an alternative, the positioning data comprises geospatial coordinates of the positioning device; the method further comprises the steps of:
Unifying the geospatial coordinates of the positioning equipment and the intra-image coordinates into a target coordinate system according to a preset mapping relation between the intra-image coordinates and the geospatial coordinates, so as to obtain a first coordinate of at least one positioning equipment in the target coordinate system and a second coordinate of at least one object to be verified in the target coordinate system; the target coordinate system is a geographic space coordinate system or a camera coordinate system corresponding to the acquisition camera of the real-time video image data;
calculating according to at least one first coordinate and at least one second coordinate to obtain the matching result; the matching result is used for representing whether the positioning equipment corresponding to each first coordinate and the object to be verified corresponding to the second coordinate are the same moving target or not.
In an alternative, the method further comprises:
respectively calculating the distance between the first coordinate and each second coordinate;
and determining the matching result according to a comparison result of the distance and a preset positioning error threshold value.
In an alternative, the method further comprises:
determining a first motion trail of at least one positioning device according to the positioning data respectively;
Respectively determining a second motion trail of at least one object to be verified according to the coordinates in the image;
respectively carrying out track association on at least one first motion track and at least one second motion track according to a preset joint probability interconnection algorithm;
and determining the matching result according to whether the corresponding motion trail is associated or not.
In an alternative, the method further comprises:
the object to be verified, which does not correspond to the same moving object as any of the positioning apparatuses, is determined as the unauthorized object.
In an alternative, the method further comprises:
acquiring radar detection data of the airport apron area;
and calibrating the matching result according to the radar detection data.
In an alternative, the method further comprises:
analyzing the radar detection data to obtain geographic space coordinates of radar identification objects included in the airport apron area;
matching the geospatial coordinates of the radar identification object with the geospatial coordinates of the positioning equipment to obtain a target radar identification object corresponding to the same moving target with the target positioning equipment;
Calibrating the geospatial coordinates of the target positioning device according to the geospatial coordinates of the target radar identification object;
and calibrating the matching result according to the calibrated geospatial coordinates of the positioning equipment.
According to another aspect of the embodiment of the present invention, there is provided an airport apron safety management apparatus, including:
the first acquisition module is used for acquiring real-time video image data of the airport apron area;
the first detection module is used for carrying out target detection on the real-time video image data to obtain the intra-image coordinates of at least one object to be verified, which is included in the airport apron area;
the second detection module is used for detecting whether positioning data transmitted by at least one preset positioning device exist in the airport apron area or not;
the second acquisition module is used for acquiring the positioning data when the existence of the positioning data is determined; one of the preset positioning devices is preset on one preset authorized object corresponding to the airport apron area;
the matching module is used for matching the positioning data with the coordinates in the image to obtain a matching result;
and the determining module is used for determining an unauthorized object detection result in the object to be verified according to the matching result.
According to another aspect of the embodiment of the present invention, there is provided an airport apron safety management device, including:
the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform the operations of the airport apron safety management method embodiment of any one of the preceding claims.
According to yet another aspect of the embodiments of the present invention, there is provided a computer-readable storage medium having stored therein at least one executable instruction for causing an airport apron safety management device to perform the operations of any one of the foregoing airport apron safety management method embodiments.
The embodiment of the invention acquires real-time video image data of the airport apron area; performs target detection on the real-time video image data to obtain the intra-image coordinates, in the real-time video image, of at least one object to be verified in the airport apron area; detects whether positioning data transmitted by at least one preset positioning device exists in the airport apron area; and, when the existence of the positioning data is determined, acquires the positioning data. One preset positioning device is preset on one preset authorized object corresponding to the airport apron area, that is, only when an authorized object appears in the airport apron area does the positioning device carried by the authorized object transmit positioning data back in real time. The positioning data is matched with the coordinates in the image to obtain a matching result, and the matching result is used for representing whether the object to be verified corresponding to the coordinates in the image and the positioning device corresponding to the positioning data correspond to the same moving target. The unauthorized object detection result in the objects to be verified is determined according to the matching result, and the objects that are not covered by the positioning data are screened out of the objects to be verified as unauthorized objects. In the embodiment of the invention, the preset positioning devices are arranged on authorized objects one by one, the positioning data transmitted back by the preset positioning devices and received in the airport apron area are used as a positioning reference of the objects that legally appear in the airport apron area, and the detected positioning data is matched with the target detection result based on the real-time video image data, so that unauthorized objects not covered by the positioning data can be screened out of the objects to be verified, and safety management of the airport apron area can be realized efficiently, accurately and economically.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention, and may be implemented according to the content of the specification, so that the technical means of the embodiments of the present invention can be more clearly understood, and the following specific embodiments of the present invention are given for clarity and understanding.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
fig. 1 shows a flow diagram of an airport apron safety management method according to an embodiment of the present invention;
fig. 2 shows a schematic structural diagram of an airport apron safety management apparatus according to an embodiment of the present invention;
fig. 3 shows a schematic structural diagram of an airport apron safety management device according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein.
Fig. 1 shows a flowchart of an airport apron safety management method according to an embodiment of the present invention, which is executed by a computer processing device. The computer processing device may include a cell phone, a notebook computer, etc. As shown in fig. 1, the method comprises the steps of:
step 10: real-time video image data of the airport apron area is acquired.
The airport apron area refers to the designated area beside the boarding bridge where ground staff can maintain the aircraft. The real-time video image data may be a real-time RTSP (Real Time Streaming Protocol) data stream acquired through preset multi-channel cameras. In order to ensure normal and safe operation of the aircraft, it is necessary to perform authorization management on the personnel, vehicles and other equipment, such as power supply units, present on the airport apron, and to prevent unauthorized objects from appearing on the airport apron, thereby ensuring the safety and smoothness of airport operation.
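As an illustration only (the patent does not include code), the following Python sketch shows how real-time RTSP streams from the apron cameras might be decoded and sampled with OpenCV; the camera URLs and the frame-sampling interval are assumptions introduced for the example.

```python
# A minimal sketch of pulling frames from multiple apron cameras over RTSP
# with OpenCV; the URLs and the sampling interval are illustrative assumptions.
import cv2

CAMERA_URLS = [  # hypothetical camera addresses
    "rtsp://192.0.2.10:554/apron_cam_01",
    "rtsp://192.0.2.11:554/apron_cam_02",
]

def frame_generator(url, sample_every=5):
    """Decode an RTSP stream and yield every N-th frame for detection."""
    cap = cv2.VideoCapture(url)
    idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        if idx % sample_every == 0:
            yield idx, frame
        idx += 1
    cap.release()

if __name__ == "__main__":
    for frame_idx, frame in frame_generator(CAMERA_URLS[0]):
        print(f"frame {frame_idx}: {frame.shape}")  # hand the frame to the detector
        break
```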
It should be noted that, considering that the airport apron covers a large area and that camera placement must not interfere with the normal entry, exit and operation of aircraft, the real-time monitoring video of an existing airport apron area is generally captured at a distance of 50-150 meters from the targets, whereas cameras deployed in smaller areas elsewhere can capture images at 5-10 meters. As a result, common identity verification methods such as face recognition and license plate recognition cannot be performed directly on the real-time video image data.
Step 20: and performing target detection on the real-time video image data to obtain the intra-image coordinates of at least one object to be verified, which is included in the airport apron area.
Target detection is performed on the real-time video image data according to the object features of preset types, so as to obtain the two-dimensional coordinates, in the real-time video image, of at least one object to be verified of a preset type in the airport apron area. The preset types can be determined according to the requirements of airport apron safety management, such as people, vehicles, charging equipment and other objects that may affect the safety and normal operation of aircraft on the airport apron.
The target detection may adopt a MultiNet (multi-task network) deep learning network structure, and the detection process is as follows: real-time video image data of the airport apron area is obtained in real time, the real-time video image data is decoded and frames are extracted, and the obtained frame pictures are input into the neural network to obtain the detection results of various targets. The neural network is composed of two parts, a CNN (Convolutional Neural Network) and an FCN (Fully Convolutional Network). The CNN is the first 13-layer convolutional part of vgg-16 (which comprises 13 convolutional layers and 3 fully connected layers) and outputs encoded features with a size of 39 x 12 x 512, to which a classification network, a detection network and a segmentation network are then connected. In the classification network, the encoded features are first passed through a 1x1 convolution structure, and a fully connected hidden layer and a classification layer are then cascaded to obtain the classification prediction output. The first step of the detection network is to pass the encoded features through a 1x1 convolutional layer, producing a rough estimate of the bounding boxes; another 1x1 convolutional layer then outputs 6 channels, each with a different semantic meaning. The first two channels represent a coarse-scale segmentation of the image, i.e. the output values are the probabilities that an object appears within a certain range of each center point, and the last four channels represent the coordinates of the bounding box at that position. Further, considering that this prediction is too coarse, the embodiment of the present invention adds a rezoom (rescale) unit to predict the residual around the bounding box by analyzing higher-resolution features; this unit concatenates the high-resolution features with the low-resolution hidden-layer features and applies a 1x1 convolution layer to output the prediction result. The segmentation network uses a 1x1 convolutional structure to generate a low-resolution segmentation map, which is then up-sampled with three deconvolution layers to predict the segmented image. Meanwhile, before the decoding networks, the encoded features can be passed through additional convolution structures so that features of different scales are extracted and the three networks can complement one another.
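For illustration, a minimal PyTorch sketch of the encoder-plus-detection-head structure described above is given below (a shared VGG-16 convolutional encoder producing 512 x 12 x 39 encoded features for a 3 x 384 x 1248 input, followed by a 1x1 convolution head with 2 confidence and 4 bounding-box channels). The classification, segmentation and rezoom branches are omitted, and the module and parameter names are assumptions; this is not the applicant's actual network or weights.

```python
# Illustrative reconstruction of the shared encoder + 1x1-conv detection head.
import torch
import torch.nn as nn
import torchvision

class ApronDetector(nn.Module):
    def __init__(self):
        super().__init__()
        # The convolutional part of VGG-16 acts as the shared encoder;
        # for a 3x384x1248 input it produces 512x12x39 encoded features.
        self.encoder = torchvision.models.vgg16(weights=None).features
        self.hidden = nn.Conv2d(512, 512, kernel_size=1)   # 1x1 "hidden" conv
        self.head = nn.Conv2d(512, 6, kernel_size=1)       # 2 confidence + 4 box channels

    def forward(self, x):
        feats = self.encoder(x)
        h = torch.relu(self.hidden(feats))
        out = self.head(h)
        confidence = torch.softmax(out[:, :2], dim=1)  # object-presence probability per cell
        boxes = out[:, 2:]                             # coarse bounding-box coordinates per cell
        return confidence, boxes

if __name__ == "__main__":
    model = ApronDetector()
    conf, boxes = model(torch.randn(1, 3, 384, 1248))
    print(conf.shape, boxes.shape)  # (1, 2, 12, 39) (1, 4, 12, 39)
```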
The object recognition task is to determine the type, content and attributes of an object from the image. The target recognition module completes the recognition of the various targets of concern in airport business based on a deep learning model: bottom-layer features are extracted for target recognition and then input into several classification networks to classify and judge the different features. The input picture is the detected picture; all convolutional layer structures of vgg-16 are first used to extract the bottom-layer features, and only 2 pooling layers are used in the algorithm, since a higher resolution is helpful to detection and recognition. The algorithm then adopts a fully convolutional RPN (Region Proposal Network) to detect objects and find candidate objects in the image. The RPN takes the bottom-layer convolutional features as input and outputs a series of possible bounding boxes; because of the end-to-end character of the RPN, the proposed object regions have very high confidence. 256 target regions of the same sample are input into the RPN, the network adopts an ROI (Region of Interest) pooling layer to extract feature maps of fixed size for the 256 samples, and the extracted features are input into the detection network and the recognition network respectively. Finally, the objects to be verified obtained by target detection are tracked in real time to obtain their real-time image coordinates. The target tracking task is, given the size and position of an object in a frame of a video sequence, to predict the size and position of the object in subsequent frames. The target tracking module tracks the objects of concern to the apron business on the basis of the target detection module.
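The patent leaves the tracking algorithm open (multi-target tracking or optical flow are mentioned later). As a simplified stand-in, the sketch below associates detections across frames by greedy IoU matching to maintain per-object image coordinates; the threshold and the bounding-box format are assumptions.

```python
# A minimal IoU-based greedy tracker sketch; boxes are (x1, y1, x2, y2) pixels.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

class SimpleTracker:
    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}          # track_id -> last box
        self.next_id = 0

    def update(self, detections):
        """Associate this frame's detections with existing tracks by best IoU."""
        assigned = {}
        unmatched = list(detections)
        for tid, prev_box in list(self.tracks.items()):
            if not unmatched:
                break
            best = max(unmatched, key=lambda d: iou(prev_box, d))
            if iou(prev_box, best) >= self.iou_threshold:
                assigned[tid] = best
                unmatched.remove(best)
        for det in unmatched:                  # start new tracks for leftovers
            assigned[self.next_id] = det
            self.next_id += 1
        self.tracks = assigned
        return assigned                        # track_id -> box for this frame
```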
Step 30: and detecting whether at least one piece of positioning data transmitted by preset positioning equipment exists in the airport apron area.
The authorized objects may be objects that are allowed in advance to appear in the airport apron area, such as ground staff, aircraft operation staff, operation and maintenance equipment, operation and maintenance vehicles, and the like. The preset positioning device sends positioning data of its current position at a preset frequency. The preset positioning device may be a GPS device. The positioning data includes the geospatial coordinates of the positioning device, which represent coordinate information of the positioning device in three-dimensional geographic space, such as longitude, latitude and altitude.
The preset positioning device may be carried by the authorized object in various forms, such as a tablet, a locator, etc. The airport apron area is monitored in real time through a preset receiving apparatus corresponding to the positioning devices, so as to acquire the positioning data transmitted back by any preset positioning device present.
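A hedged sketch of how a returned positioning fix might be represented and filtered to the apron area is shown below; the field names, the rectangular apron boundary and the coordinate values are illustrative assumptions only.

```python
# Illustrative representation of a positioning fix and an apron-area check.
from dataclasses import dataclass

@dataclass
class PositioningFix:
    device_id: str        # identifier of the preset positioning device
    timestamp: float      # acquisition time (Unix seconds)
    lon: float            # longitude, degrees
    lat: float            # latitude, degrees
    alt: float            # altitude, metres

# Illustrative apron bounding box: (lon_min, lat_min, lon_max, lat_max).
APRON_BBOX = (120.370, 36.260, 120.380, 36.270)

def within_apron(fix: PositioningFix, bbox=APRON_BBOX) -> bool:
    lon_min, lat_min, lon_max, lat_max = bbox
    return lon_min <= fix.lon <= lon_max and lat_min <= fix.lat <= lat_max
```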
Step 40: when the existence of the positioning data is determined, acquiring the positioning data; one of the preset positioning devices is preset on one preset authorized object corresponding to the airport apron area.
When the return of positioning data is detected, the positioning data is acquired. Optionally, the identification information of the positioning device corresponding to the positioning data may be recorded synchronously. One preset positioning device is preset on one preset authorized object corresponding to the airport apron area, that is, the preset positioning devices are only arranged on objects that are allowed in advance to appear in the airport apron area.
Step 50: and matching the positioning data with the coordinates in the image to obtain a matching result.
And respectively matching the positioning data corresponding to the at least one positioning device with the coordinates in the image corresponding to the at least one object to be verified, and determining whether the object to be verified corresponding to the same moving target with the positioning device exists. The matching result is used for representing whether the positioning equipment corresponding to the positioning data and the object to be verified corresponding to the coordinates in the image actually correspond to the same moving target.
Specifically, considering that the position of the same moving target at a specific time is determinate, when the distance between the positioning data and the coordinates in the image is smaller than a preset threshold, it can be determined that the two are observations of the same moving target obtained through different acquisition modes, namely target detection on the monitoring video data and detection by the positioning device, so the positioning data and the coordinates in the image can be considered matched at this time. Optionally, considering that the motion trajectory of the same moving target over a period of time is also determinate, the positioning device may be tracked according to the positioning data to obtain the motion trajectory of the positioning device, and the object to be verified may correspondingly be tracked to obtain its motion trajectory; when the motion trajectory of the positioning device and the motion trajectory of the object to be verified are associated, for example when most of the two trajectories coincide, it is determined that the positioning data matches the coordinates in the image.
Further, considering that the coordinates of the object to be verified detected from the real-time video image are two-dimensional coordinates in the monitoring image coordinate system, while the positioning data includes three-dimensional coordinates of the positioning device in the geospatial coordinate system, to facilitate matching the positioning data with the coordinates in the image, the coordinates in the image and the geospatial coordinates in the positioning data may first be converted into the same coordinate system, based on the mapping relationships among the monitoring image coordinate system corresponding to the real-time video image data, the camera coordinate system corresponding to the camera collecting the real-time video image data, and the geospatial coordinate system corresponding to the positioning data; the matching is then performed in that common coordinate system.
Thus, in yet another embodiment of the present invention, step 50 further comprises:
step 501: unifying the geospatial coordinates of the positioning equipment and the intra-image coordinates into a target coordinate system according to a preset mapping relation between the intra-image coordinates and the geospatial coordinates, so as to obtain a first coordinate of at least one positioning equipment in the target coordinate system and a second coordinate of at least one object to be verified in the target coordinate system; the target coordinate system is a geospatial coordinate system or a camera coordinate system corresponding to the acquisition camera of the real-time video image data.
The method includes determining the intrinsic and extrinsic parameters of the camera collecting the real-time video image data, respectively determining a first transformation matrix between the camera coordinate system and the geospatial coordinate system and a second transformation matrix between the camera coordinate system and the two-dimensional image coordinate system of the video image according to the intrinsic and extrinsic parameters, and determining the mapping relationship between the coordinates in the image and the geospatial coordinates according to the first transformation matrix and the second transformation matrix. Specifically, the video space mapping comprises camera intrinsic and extrinsic parameter calibration and video image mapping. The intrinsic parameters of the camera include the focal length, the principal point coordinates, the distortion coefficients and the like; the calibration process is that the camera shoots a series of images at different angles, feature points of each image are extracted and matched, the homography matrices describing the correspondence between the images are solved by a random sample consensus (RANSAC) method, the intrinsic parameter matrix is then solved from the simultaneous equations of the multiple homography matrices, and nonlinear optimization is carried out with the homography matrices as initial values;
the extrinsic parameters of the camera are the transformation matrix from the geospatial coordinate system to the camera coordinate system; the calibration procedure is to automatically match the target correspondences between the detected intra-image coordinates in the video image in the camera's default state and the positioning coordinates in geographic space by adopting random sample consensus and an iterative closest point method, to solve the homography matrix between them, and to solve the extrinsic parameters of the camera's default state according to the camera model. After the calibration of the intrinsic and extrinsic parameters of the camera is completed, video image mapping calculates the camera parameters at the current moment according to the specific rotation angle of the camera, so as to obtain the transformation matrix from the geospatial coordinate system to the video image coordinate system and realize the mapping from the coordinates of a target in geographic space to points in the video image.
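The following OpenCV sketch illustrates the resulting mapping step: once the intrinsic matrix, distortion coefficients and extrinsic rotation/translation have been calibrated as described above, a target's coordinates in a local metric world frame can be projected into the video image. The numeric calibration values, and the conversion of longitude/latitude to a local metric (e.g. ENU) frame, are assumptions made for the example.

```python
# Projecting a local-metric world point into pixel coordinates with calibrated
# intrinsics/extrinsics. All numeric values below are illustrative assumptions.
import numpy as np
import cv2

K = np.array([[1500.0, 0.0, 960.0],     # assumed focal lengths / principal point
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                       # assumed negligible distortion
rvec = np.zeros(3)                       # extrinsic rotation (Rodrigues), from calibration
tvec = np.array([0.0, 5.0, 80.0])        # extrinsic translation in metres, from calibration

def geo_to_image(points_enu):
    """Project Nx3 local-metric world points into pixel coordinates (u, v)."""
    pts = np.asarray(points_enu, dtype=np.float64).reshape(-1, 1, 3)
    pixels, _ = cv2.projectPoints(pts, rvec, tvec, K, dist)
    return pixels.reshape(-1, 2)

if __name__ == "__main__":
    print(geo_to_image([[10.0, 0.0, 30.0]]))
```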
Step 502: calculating according to at least one first coordinate and at least one second coordinate to obtain the matching result; the matching result is used for representing whether the positioning equipment corresponding to each first coordinate and the object to be verified corresponding to the second coordinate are the same moving target or not.
Considering that the position of a moving object at a specific moment is determined, further, the moving track of the moving object is also determined, so that whether the first coordinate and the second coordinate actually correspond to the same moving object or not can be determined according to the distance between the coordinates, the moving track of the positioning device and the moving track of the object to be verified can be respectively constructed according to the first coordinate and the second coordinate, and when the moving track of the positioning device and the moving track of the object to be verified are associated, if the contact ratio is greater than a preset threshold, the positioning device corresponding to the first coordinate and the object to be verified corresponding to the second coordinate are determined to be the same moving object.
Optionally, in consideration of a certain time delay in the acquisition and return of the positioning data, and to avoid matching positioning data from some time ago (for example, one minute earlier) against the real-time video image data at the current moment, before the matching of at least one first coordinate with at least one second coordinate, the method may further include:
And according to the acquisition time of the positioning data and the time stamp of the real-time video image data, performing time stamp alignment on the positioning data and the real-time video image data.
The first and second coordinates are determined from the positioning data and the real-time video image data corresponding to the same time stamp.
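A minimal sketch of the timestamp alignment, assuming the positioning fixes and video frames share a common clock and the frame timestamps are sorted in ascending order, could look as follows; the tolerance value is an assumption.

```python
# Pair each positioning fix with the nearest video frame by timestamp,
# discarding pairs further apart than `tolerance` seconds.
import bisect

def align_by_timestamp(fix_times, frame_times, tolerance=0.5):
    """Return (fix_index, frame_index) pairs; frame_times must be sorted ascending."""
    pairs = []
    for i, t in enumerate(fix_times):
        j = bisect.bisect_left(frame_times, t)
        candidates = [k for k in (j - 1, j) if 0 <= k < len(frame_times)]
        if not candidates:
            continue
        best = min(candidates, key=lambda k: abs(frame_times[k] - t))
        if abs(frame_times[best] - t) <= tolerance:
            pairs.append((i, best))
    return pairs

# Example: fixes matched against 25 fps frame timestamps.
print(align_by_timestamp([10.02, 10.11], [10.00, 10.04, 10.08, 10.12]))
```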
Wherein step 502 further comprises: step 5021: and respectively calculating the distance between the first coordinates and each second coordinate.
Wherein, for each first coordinate, a distance between each second coordinate and the first coordinate is calculated, and the distance may be a euclidean distance.
Step 5022: and determining the matching result according to a comparison result of the distance and a preset positioning error threshold value.
The positioning error threshold is used for representing the error caused by the acquisition equipment, environmental interference, detection algorithm performance and the like; when the distance is not greater than the positioning error threshold, the positioning device corresponding to the first coordinate is considered to be matched with the object to be verified corresponding to the second coordinate.
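Steps 5021-5022 can be sketched as follows: pairwise Euclidean distances are computed between the first coordinates (positioning devices) and the second coordinates (objects to be verified) in the unified target coordinate system, and a device is matched to its nearest unmatched object only if the distance does not exceed the positioning error threshold. The greedy matching order and the 3-metre threshold are assumptions.

```python
# Greedy nearest-neighbour matching within a positioning error threshold.
import numpy as np

def match_by_distance(first_coords, second_coords, error_threshold=3.0):
    """Return {device_index: object_index} matches within the threshold (metres assumed)."""
    first = np.asarray(first_coords, dtype=float)    # shape (M, d)
    second = np.asarray(second_coords, dtype=float)  # shape (N, d)
    if first.size == 0 or second.size == 0:
        return {}
    dists = np.linalg.norm(first[:, None, :] - second[None, :, :], axis=2)  # (M, N)
    matches, used = {}, set()
    for i in np.argsort(dists.min(axis=1)):          # devices with the closest object first
        for j in np.argsort(dists[i]):
            if j in used:
                continue
            if dists[i, j] <= error_threshold:
                matches[int(i)] = int(j)
                used.add(int(j))
            break
    return matches

print(match_by_distance([[0, 0], [10, 10]], [[0.5, 0.2], [9.7, 10.1], [50, 50]]))
```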
Optionally, in addition to matching based on static position coordinates of the object to be verified and the positioning device, the step 502 may further include:
Step 5121: and respectively determining first motion tracks of at least one positioning device according to the positioning data.
And carrying out real-time motion tracking on the positioning equipment according to the positioning data to obtain a first motion trail of the positioning equipment.
Step 5122: and respectively determining the second motion trail of at least one object to be verified according to the coordinates in the image.
After target detection is performed to obtain coordinates in an image of the object to be verified, the object to be verified can be tracked based on a target tracking algorithm to obtain a second motion trail. The algorithms for target tracking may include multi-target tracking algorithms or optical flow methods.
Step 5123: and respectively carrying out track association on at least one first motion track and at least one second motion track according to a preset joint probability interconnection algorithm.
Specifically, through the joint probabilistic data association (JPDA) algorithm, the measurements (i.e., coordinate information) and the trajectories that share measurement values are put into the same confirmation matrix, constrained by the following two assumptions: 1) each measurement value has a unique source; 2) for a given target, at most one measurement takes that target as its source. The confirmation matrix is then split into a number of feasible events, the joint probability that a measurement originates from a target is calculated over these feasible events, and finally the trajectories whose joint probability is greater than a preset probability value are determined to be associated.
Step 5124: and determining the matching result according to whether the corresponding motion trail is associated or not.
The object to be verified and the positioning device whose first motion trajectory and second motion trajectory are associated are determined to correspond to the same moving target.
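As an illustrative, heavily simplified sketch of the JPDA-style association above, the brute-force example below enumerates the feasible joint assignment events between the latest states of the first trajectories (positioning devices) and the second trajectories (objects to be verified), weights them with Gaussian position likelihoods, and keeps the pairs whose marginal association probability exceeds a preset value. The noise, miss-detection and false-alarm parameters are assumptions, and a production JPDA would run recursively frame by frame.

```python
# Brute-force joint probabilistic association over feasible assignment events.
import itertools
import math

def gaussian(z, x, sigma):
    d2 = sum((a - b) ** 2 for a, b in zip(z, x))
    return math.exp(-d2 / (2 * sigma ** 2)) / (2 * math.pi * sigma ** 2)

def jpda_associate(track_states, measurements, sigma=2.0, p_miss=0.1,
                   lam_fa=1e-4, prob_threshold=0.5):
    n_t, n_m = len(track_states), len(measurements)
    # Feasible events: each track gets one measurement or None, no measurement reused.
    events = [choice for choice in itertools.product([None] + list(range(n_m)), repeat=n_t)
              if len([m for m in choice if m is not None]) ==
                 len({m for m in choice if m is not None})]
    beta = [[0.0] * n_m for _ in range(n_t)]
    total = 0.0
    for ev in events:
        w = 1.0
        for t, m in enumerate(ev):
            w *= p_miss if m is None else gaussian(measurements[m], track_states[t], sigma)
        w *= lam_fa ** (n_m - sum(m is not None for m in ev))  # unassigned measurements
        total += w
        for t, m in enumerate(ev):
            if m is not None:
                beta[t][m] += w
    return {(t, m): beta[t][m] / total
            for t in range(n_t) for m in range(n_m)
            if beta[t][m] / total > prob_threshold}

print(jpda_associate([(0.0, 0.0), (20.0, 5.0)],
                     [(0.4, -0.3), (19.5, 5.2), (60.0, 60.0)]))
```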
Furthermore, in order to improve the accuracy of safety management of the airport apron area, on the basis of matching the positioning data with the real-time video data, other types of real-time positioning data can be introduced to calibrate the positioning data, so as to realize airport apron safety management based on comprehensive analysis of multi-modal data. The other types of real-time positioning data may include radar detection data.
Specifically, before step 502, the method further includes:
step 5221: and acquiring radar detection data of the airport apron area.
The airport apron area is monitored in real time through a plurality of radar devices preset in the airport apron area to obtain the radar detection data. The radar detection data includes the three-dimensional coordinates, in geographic space, of the objects within the airport apron area.
Step 5222: and calibrating the matching result according to the radar detection data.
Since the radar detects all objects in the airport apron area, the moving targets carrying positioning devices are necessarily included in the radar detection data. Therefore, in a process similar to the matching of the positioning data with the coordinates in the image, the coordinates of the moving targets in the radar detection data can be matched with the coordinates of the positioning devices, and the radar coordinates of a matched moving target can be used to calibrate the coordinates of the corresponding positioning device; the matching result is then calibrated according to the calibrated coordinates of the positioning device.
Thus, in one embodiment of the invention, step 5222 further comprises:
step 531: and analyzing the radar detection data to obtain the geospatial coordinates of the radar identification objects included in the airport apron area.
It will be appreciated that the radar identification objects are the preset types of objects to be detected, consistent with the preset types described in the foregoing step 20, and may include people, vehicles, charging equipment and other objects that may affect the safety and normal operation of aircraft on the airport apron. Optionally, considering that, besides occasional passers-by, fixed facilities and other environmental interference such as animals (e.g., birds or small stray animals) may exist in the airport apron area, the radar detection data may first be denoised, and position information may then be extracted from the denoised radar detection data according to the radar detection data characteristics of the preset types of objects to be verified, so as to obtain the geospatial coordinates of the radar identification objects in the airport apron area.
Step 532: and matching the geospatial coordinates of the radar identification object with the geospatial coordinates of the positioning equipment to obtain a target radar identification object corresponding to the same moving target with the target positioning equipment.
Specifically, similar to the foregoing step 502, whether the geospatial coordinates of a radar identification object and the geospatial coordinates of a positioning device match is determined based on the distance between them and/or on whether the motion trajectories composed of those geospatial coordinates are associated. When the distance between the geospatial coordinates is smaller than the preset positioning error threshold, the radar identification object is determined to match the geospatial coordinates of the positioning device. Optionally, a real-time motion trajectory of the radar identification object is constructed from its geospatial coordinates, a real-time motion trajectory of the positioning device is constructed from its geospatial coordinates, correlation analysis is performed on the two real-time motion trajectories, and when the trajectory association probability of the two is greater than the preset probability value, the radar identification object is determined to match the geospatial coordinates of the positioning device. When the geospatial coordinates of a radar identification object match those of a positioning device, the matched radar identification object and positioning device are determined to correspond to the same moving target.
Step 533: and calibrating the geospatial coordinates of the target positioning equipment according to the geospatial coordinates of the target radar identification object.
Specifically, the geospatial coordinates of the target radar identification object and the geospatial coordinates of the corresponding target positioning device are fused, for example by taking their average, so as to obtain the calibrated geospatial coordinates of the target positioning device.
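A small sketch of this fusion step, assuming simple (optionally weighted) averaging of the matched radar and positioning-device coordinates, is given below; the weight and the coordinate values are illustrative.

```python
# Fuse matched device and radar geospatial coordinates; radar_weight=0.5
# reduces to the plain average suggested above.
import numpy as np

def calibrate_device_coordinate(device_xyz, radar_xyz, radar_weight=0.5):
    device = np.asarray(device_xyz, dtype=float)
    radar = np.asarray(radar_xyz, dtype=float)
    return (1.0 - radar_weight) * device + radar_weight * radar

print(calibrate_device_coordinate([120.3751, 36.2643, 12.0],
                                  [120.3752, 36.2641, 11.6]))
```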
Step 534: and calibrating the matching result according to the calibrated geospatial coordinates of the positioning equipment.
Specifically, the geographical space coordinates of the positioning equipment after calibration are matched with the coordinates in the image of the object to be verified again, and a calibrated matching result is obtained.
Step 60: and determining an unauthorized object detection result in the object to be verified according to the matching result.
In order to improve the granularity of unauthorized object detection and locate the specific unauthorized object to be verified, it is determined for each object to be verified whether some positioning device corresponds to the same moving target as that object; when no positioning device corresponds to the same moving target as the object to be verified, the object to be verified is determined to be an unauthorized object.
Specifically, step 60 further includes:
step 601: the object to be verified, which does not correspond to the same moving object as any of the positioning apparatuses, is determined as the unauthorized object.
When no positioning device corresponds to the same moving target as a certain object to be verified, this indicates that no preset positioning device is arranged on the object to be verified; since the positioning devices are preset only on the authorized objects allowed to appear in the airport apron area, the object to be verified is determined to be an unauthorized object at this time.
Alternatively, the number of positioning devices sending back positioning data may be compared with the number of objects to be verified obtained by target detection on the real-time video data. When the two numbers are consistent, every object to be verified has a corresponding positioning device, so no unauthorized object exists among the objects to be verified. Correspondingly, when the number of positioning devices sending back positioning data is inconsistent with the number of objects to be verified, it is determined that an abnormality exists and an alarm is raised: when the number of positioning devices sending back positioning data is smaller than the number of objects to be verified, it is determined that unauthorized objects have appeared in the airport apron area, and when the number of positioning devices sending back positioning data is larger than the number of objects to be verified, it is determined that an error has occurred in data acquisition and the positioning data and real-time video data acquisition devices should be checked.
It will be appreciated that when no positioning data returned by any preset positioning device is detected within the airport apron area, it is determined that no authorized object is present within the airport apron area, and therefore all the objects to be verified are determined to be unauthorized objects.
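The decision logic of step 60 can be sketched as follows: any object to be verified that is not matched to some positioning device is reported as unauthorized, and the device/object counts are compared to raise the corresponding alarms. The function and message wording are illustrative assumptions.

```python
# Flag unmatched objects as unauthorized and compare counts for alarming.
def detect_unauthorized(num_devices, object_ids, matches):
    """matches: {device_index: object_id} produced by the matching step."""
    matched_objects = set(matches.values())
    unauthorized = [oid for oid in object_ids if oid not in matched_objects]
    if num_devices == 0 and object_ids:
        return object_ids, "alarm: no authorized object present, all objects unauthorized"
    if num_devices < len(object_ids):
        return unauthorized, "alarm: unauthorized objects detected on the apron"
    if num_devices > len(object_ids):
        return unauthorized, "warning: possible acquisition error, check positioning and video devices"
    return unauthorized, "ok" if not unauthorized else "alarm: unauthorized objects detected"

print(detect_unauthorized(2, ["person_1", "vehicle_7", "person_9"],
                          {0: "person_1", 1: "vehicle_7"}))
```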
In summary, the embodiment of the invention presets positioning devices on authorized objects one by one, uses the positioning data received in the airport apron area as the positioning reference of the objects that legally appear there, and matches the detected positioning data against the target detection results based on the real-time video image data, so that unauthorized objects not covered by the positioning data are screened out of the objects to be verified and safety management of the airport apron area is realized efficiently, accurately and economically.
Fig. 2 shows a schematic structural diagram of an airport apron safety management apparatus according to an embodiment of the present invention. As shown in fig. 2, the apparatus 70 includes: a first acquisition module 701, a first detection module 702, a second detection module 703, a second acquisition module 704, a matching module 705 and a determination module 706.
The first acquiring module 701 is configured to acquire real-time video image data of an airport apron area;
the first detection module 702 is configured to perform target detection on the real-time video image data to obtain an intra-image coordinate of at least one object to be verified included in the airport apron area;
a second detection module 703, configured to detect whether there is positioning data returned by at least one preset positioning device in the airport apron area;
a second obtaining module 704, configured to obtain the positioning data when it is determined that the positioning data exists; one of the preset positioning devices is preset on one preset authorized object corresponding to the airport apron area;
the matching module 705 is configured to match the positioning data with coordinates in the image to obtain a matching result;
a determining module 706, configured to determine an unauthorized object detection result in the object to be verified according to the matching result.
The operation process of the airport apron safety management apparatus provided by the embodiment of the invention is substantially the same as that of the foregoing method embodiment and is not repeated.
The airport apron safety management apparatus provided by the embodiment of the invention achieves the same beneficial effects as the foregoing method embodiment: by presetting positioning devices on authorized objects one by one and matching the returned positioning data against the target detection results of the real-time video image data, unauthorized objects not covered by the positioning data are screened out of the objects to be verified, so that safety management of the airport apron area is realized efficiently, accurately and economically.
Fig. 3 is a schematic structural diagram of an airport apron safety management device according to an embodiment of the present invention, and the specific embodiment of the present invention does not limit the specific implementation of the airport apron safety management device.
As shown in fig. 3, the airport apron safety management device may include: a processor (processor) 802, a communication interface (Communications Interface) 804, a memory (memory) 806, and a communication bus 808.
Wherein: processor 802, communication interface 804, and memory 806 communicate with each other via a communication bus 808. A communication interface 804 for communicating with network elements of other devices, such as clients or other servers. The processor 802 is configured to execute the program 810, and may specifically perform the relevant steps in the above-described embodiment of the method for airport apron security management.
In particular, program 810 may include program code including computer-executable instructions.
The processor 802 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention. The one or more processors included in the airport apron safety management device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
Memory 806 for storing a program 810. The memory 806 may include high-speed RAM memory or may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 810 may be specifically invoked by the processor 802 to cause the airport apron safety management device to:
acquiring real-time video image data of an airport apron area;
performing target detection on the real-time video image data to obtain the intra-image coordinates of at least one object to be verified, which is included in the airport apron area;
detecting whether positioning data transmitted by at least one preset positioning device exist in the airport apron area or not;
when the existence of the positioning data is determined, acquiring the positioning data; one of the preset positioning devices is preset on one preset authorized object corresponding to the airport apron area;
matching the positioning data with the coordinates in the image to obtain a matching result;
and determining an unauthorized object detection result in the object to be verified according to the matching result.
The operation process of the airport apron safety management device provided by the embodiment of the invention is approximately the same as that of the embodiment of the method, and is not repeated.
The airport apron safety management device provided by the embodiment of the invention achieves the same beneficial effects as the foregoing method embodiment: unauthorized objects not covered by the positioning data transmitted back by the preset positioning devices are screened out of the objects to be verified by matching the positioning data against the target detection results of the real-time video image data, so that safety management of the airport apron area is realized efficiently, accurately and economically.
An embodiment of the invention provides a computer-readable storage medium storing at least one executable instruction which, when run on an airport apron safety management device, causes the device to execute the airport apron safety management method of any of the foregoing method embodiments.
The executable instructions may specifically cause the airport apron safety management device to:
acquire real-time video image data of an airport apron area;
perform target detection on the real-time video image data to obtain intra-image coordinates of at least one object to be verified included in the airport apron area;
detect whether positioning data transmitted by at least one preset positioning device exists in the airport apron area;
acquire the positioning data when it is determined that the positioning data exists, wherein each preset positioning device is preset on one preset authorized object corresponding to the airport apron area;
match the positioning data with the intra-image coordinates to obtain a matching result;
and determine an unauthorized object detection result among the objects to be verified according to the matching result.
The operation process of the executable instructions stored in the computer storage medium provided by the embodiment of the present invention is substantially the same as the operation process of the foregoing method embodiment, and will not be described in detail.
When executed, the instructions stored by the computer storage medium provided by this embodiment of the invention cause the device to acquire real-time video image data of an airport apron area and perform target detection on it to obtain the intra-image coordinates of at least one object to be verified in the airport apron area; to detect whether positioning data transmitted by at least one preset positioning device exists in the airport apron area and, if so, to acquire the positioning data, each preset positioning device being preset on one preset authorized object corresponding to the airport apron area, so that only when an authorized object appears in the airport apron area does the positioning device it carries transmit positioning data back in real time; to match the positioning data with the intra-image coordinates to obtain a matching result indicating whether the object to be verified corresponding to given intra-image coordinates and the positioning device corresponding to given positioning data belong to the same moving target; and to determine an unauthorized object detection result among the objects to be verified according to the matching result, screening out any object not covered by the positioning data as an unauthorized object. Because the preset positioning devices are assigned to authorized objects one by one, the positioning data received in the airport apron area serves as a positioning reference for the objects legitimately present there, and matching the detected positioning data against the target detection results based on the real-time video image data screens out unauthorized objects that are not covered by the positioning data, so that safety management of the airport apron area can be realized efficiently, accurately and economically.
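Before any distance- or trajectory-based matching, the positioning data and the intra-image coordinates must be expressed in one target coordinate system, as recited in claim 2 below via a preset mapping relation between intra-image coordinates and geospatial coordinates. The patent does not fix the form of that mapping; the sketch below assumes a planar homography calibrated for a fixed apron camera, which is one common choice. The matrix values in `H_IMG_TO_GROUND` and the helper name `image_to_ground` are hypothetical.

```python
import numpy as np

# Hypothetical 3x3 homography mapping pixel coordinates (u, v) on the apron
# ground plane to local planar ground coordinates (x, y) in metres. In
# practice it would be calibrated from surveyed reference points on the apron.
H_IMG_TO_GROUND = np.array([
    [ 0.05,  0.001, -12.0],
    [-0.002, 0.06,   -3.5],
    [ 0.0,   0.0005,  1.0],
])

def image_to_ground(points_uv: np.ndarray, H: np.ndarray = H_IMG_TO_GROUND) -> np.ndarray:
    """Map an (N, 2) array of intra-image coordinates to ground coordinates."""
    pts = np.hstack([points_uv, np.ones((len(points_uv), 1))])  # homogeneous
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                       # dehomogenise

# Detected bounding-box bottom-centre points (pixels) for objects to be verified.
detections_uv = np.array([[640.0, 520.0], [1210.0, 480.0]])
detections_xy = image_to_ground(detections_uv)
print(detections_xy)  # ground-plane coordinates, comparable with positioning fixes
```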
An embodiment of the invention provides an airport apron safety management apparatus for executing the above airport apron safety management method.
An embodiment of the invention provides a computer program which can be invoked by a processor to cause an airport apron safety management device to execute the airport apron safety management method of any of the foregoing method embodiments.
An embodiment of the present invention provides a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when run on a computer, cause the computer to perform the airport apron safety management method of any of the foregoing method embodiments.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The structure required to construct such a system is apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It will be appreciated that the teachings of the invention described herein may be implemented in a variety of programming languages, and the descriptions of specific languages above are provided to disclose the enablement and best mode of the invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, this method of disclosure should not be construed as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules in the apparatus of an embodiment may be adaptively changed and disposed in one or more apparatuses different from that embodiment. The modules, units or components of the embodiments may be combined into one module, unit or component, and they may also be divided into a plurality of sub-modules, sub-units or sub-components. All features disclosed in this specification (including any accompanying claims, abstract and drawings), and all processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose, unless expressly stated otherwise.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.

Claims (10)

1. An airport apron safety management method, the method comprising:
acquiring real-time video image data of an airport apron area;
performing target detection on the real-time video image data to obtain intra-image coordinates of at least one object to be verified included in the airport apron area;
detecting whether positioning data transmitted by at least one preset positioning device exists in the airport apron area;
acquiring the positioning data when it is determined that the positioning data exists, wherein each preset positioning device is preset on one preset authorized object corresponding to the airport apron area;
matching the positioning data with the intra-image coordinates to obtain a matching result;
and determining an unauthorized object detection result among the objects to be verified according to the matching result.
2. The method of claim 1, wherein the positioning data comprises geospatial coordinates of the positioning device, and wherein matching the positioning data with the intra-image coordinates to obtain a matching result comprises:
unifying the geospatial coordinates of the positioning device and the intra-image coordinates into a target coordinate system according to a preset mapping relation between the intra-image coordinates and the geospatial coordinates, so as to obtain a first coordinate of at least one positioning device in the target coordinate system and a second coordinate of at least one object to be verified in the target coordinate system, the target coordinate system being a geospatial coordinate system or a camera coordinate system corresponding to the camera that acquires the real-time video image data;
calculating, according to at least one of the first coordinates and at least one of the second coordinates, the matching result, wherein the matching result indicates whether the positioning device corresponding to each first coordinate and the object to be verified corresponding to each second coordinate are the same moving target.
3. The method according to claim 2, wherein said calculating based on at least one of said first coordinates and at least one of said second coordinates to obtain said matching result comprises:
calculating a distance between the first coordinate and each of the second coordinates, respectively;
and determining the matching result according to a comparison of the distance with a preset positioning error threshold.
4. The method according to claim 2, wherein said calculating based on at least one of said first coordinates and at least one of said second coordinates to obtain said matching result comprises:
determining a first motion trajectory of at least one positioning device according to the positioning data;
determining a second motion trajectory of at least one object to be verified according to the intra-image coordinates;
performing trajectory association between at least one first motion trajectory and at least one second motion trajectory according to a preset joint probabilistic data association algorithm;
and determining the matching result according to whether the corresponding motion trajectories are associated.
5. The method of claim 2, wherein determining an unauthorized object detection result among the objects to be verified according to the matching result comprises:
determining, as the unauthorized object, any object to be verified that does not correspond to the same moving target as any of the positioning devices.
6. The method of claim 2, further comprising, prior to said calculating based on at least one of said first coordinates and at least one of said second coordinates:
acquiring radar detection data of the airport apron area;
and calibrating the matching result according to the radar detection data.
7. The method of claim 6, wherein calibrating the matching result according to the radar detection data comprises:
analyzing the radar detection data to obtain geospatial coordinates of radar-identified objects included in the airport apron area;
matching the geospatial coordinates of the radar-identified objects with the geospatial coordinates of the positioning devices to obtain a target radar-identified object corresponding to the same moving target as a target positioning device;
calibrating the geospatial coordinates of the target positioning device according to the geospatial coordinates of the target radar-identified object;
and calibrating the matching result according to the calibrated geospatial coordinates of the positioning device.
8. An airport apron safety management apparatus, the apparatus comprising:
a first acquisition module configured to acquire real-time video image data of an airport apron area;
a first detection module configured to perform target detection on the real-time video image data to obtain intra-image coordinates of at least one object to be verified included in the airport apron area;
a second detection module configured to detect whether positioning data transmitted by at least one preset positioning device exists in the airport apron area;
a second acquisition module configured to acquire the positioning data when it is determined that the positioning data exists, wherein each preset positioning device is preset on one preset authorized object corresponding to the airport apron area;
a matching module configured to match the positioning data with the intra-image coordinates to obtain a matching result;
and a determining module configured to determine an unauthorized object detection result among the objects to be verified according to the matching result.
9. An airport apron safety management device, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with each other through the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform the operations of the airport apron safety management method of any one of claims 1-7.
10. A computer readable storage medium having stored therein at least one executable instruction that, when executed on an airport apron safety management device, causes the airport apron safety management device to perform the operations of the airport apron safety management method of any one of claims 1-7.
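Claim 4 above recites matching by associating the motion trajectory of each positioning device with the motion trajectory of each object to be verified using a joint probabilistic data association algorithm. Full JPDA maintains probabilities over all feasible association hypotheses; as a much simpler stand-in that only illustrates the final assignment decision, the sketch below associates trajectories by minimising their average point-to-point distance with the Hungarian algorithm and treats any unassigned detection trajectory as belonging to an unauthorized object. The distance gate `max_mean_dist_m` and all function names are hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate_trajectories(tag_tracks, det_tracks, max_mean_dist_m=5.0):
    """Associate positioning-device tracks with detection tracks.

    tag_tracks: list of (T, 2) arrays, ground-plane tracks from positioning data.
    det_tracks: list of (T, 2) arrays, ground-plane tracks from video detections,
                sampled at the same T time steps as the tag tracks.
    Returns (pairs, unauthorized): matched (tag_idx, det_idx) pairs and the
    indices of detection tracks left without an associated positioning device.
    """
    cost = np.zeros((len(tag_tracks), len(det_tracks)))
    for i, tag in enumerate(tag_tracks):
        for j, det in enumerate(det_tracks):
            cost[i, j] = np.linalg.norm(tag - det, axis=1).mean()

    rows, cols = linear_sum_assignment(cost)          # Hungarian assignment
    pairs = [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_mean_dist_m]
    matched_dets = {j for _, j in pairs}
    unauthorized = [j for j in range(len(det_tracks)) if j not in matched_dets]
    return pairs, unauthorized

# Hypothetical example: one tagged vehicle and one untagged intruder.
t = np.linspace(0.0, 10.0, 11)
tag = np.stack([t, 0.5 * t], axis=1)                       # authorized vehicle track
det_match = tag + np.random.normal(0.0, 0.3, tag.shape)    # its video-derived track
det_intruder = np.stack([50.0 - t, np.full_like(t, 20.0)], axis=1)

pairs, unauthorized = associate_trajectories([tag], [det_match, det_intruder])
print(pairs)         # [(0, 0)] -> the tagged vehicle is associated
print(unauthorized)  # [1]      -> the second track is flagged as unauthorized
```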
CN202311263116.XA 2023-09-27 2023-09-27 Airport station level safety management method, device, equipment and computer storage medium Pending CN117315581A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311263116.XA CN117315581A (en) 2023-09-27 2023-09-27 Airport station level safety management method, device, equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311263116.XA CN117315581A (en) 2023-09-27 2023-09-27 Airport station level safety management method, device, equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN117315581A true CN117315581A (en) 2023-12-29

Family

ID=89286118

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311263116.XA Pending CN117315581A (en) 2023-09-27 2023-09-27 Airport station level safety management method, device, equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN117315581A (en)

Similar Documents

Publication Publication Date Title
Zhao et al. Vision-based anti-uav detection and tracking
US10699125B2 (en) Systems and methods for object tracking and classification
CN108470332B (en) Multi-target tracking method and device
CN110264495B (en) Target tracking method and device
CN108319926A (en) A kind of the safety cap wearing detecting system and detection method of building-site
CN112068111A (en) Unmanned aerial vehicle target detection method based on multi-sensor information fusion
KR102261880B1 (en) Method, appratus and system for providing deep learning based facial recognition service
US11048917B2 (en) Method, electronic device, and computer readable medium for image identification
CN109697499A (en) Pedestrian's flow funnel generation method and device, storage medium, electronic equipment
CN109255286A (en) A kind of quick detection recognition method of unmanned plane optics based on YOLO deep learning network frame
CN110189355A (en) Safe escape channel occupies detection method, device, electronic equipment and storage medium
Cheng et al. Uncertainty‐aware convolutional neural network for explainable artificial intelligence‐assisted disaster damage assessment
CN112836683B (en) License plate recognition method, device, equipment and medium for portable camera equipment
Xing et al. Improved yolov5-based uav pavement crack detection
CN112800918A (en) Identity recognition method and device for illegal moving target
CN116189052A (en) Security method, system, intelligent terminal and storage medium based on video stream analysis
CN116740833A (en) Line inspection and card punching method based on unmanned aerial vehicle
US20220083774A1 (en) Method and system for asset inspection using unmanned aerial vehicles
Bakirci et al. Transforming aircraft detection through LEO satellite imagery and YOLOv9 for improved aviation safety
Zhang et al. Critical Infrastructure Security Using Computer Vision Technologies
Arrahmah et al. Evaluation of SSD Architecture for Small Size Object Detection: A Case Study on UAV Oil Pipeline Monitoring
CN117315581A (en) Airport station level safety management method, device, equipment and computer storage medium
CN111860419A (en) Method for compliance detection in power overhaul process, electronic equipment and storage medium
Ye et al. Unmanned Aerial Vehicle Target Detection Algorithm Based on Infrared Visible Light Feature Level Fusion
US20230410545A1 (en) Lidar-based Alert System

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination