CN117523500B - Monitoring system, method and storage medium of flight guarantee node - Google Patents

Monitoring system, method and storage medium of flight guarantee node

Info

Publication number
CN117523500B
Authority
CN
China
Prior art keywords
aircraft
state
real-time
node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410008026.4A
Other languages
Chinese (zh)
Other versions
CN117523500A (en)
Inventor
张家宝
吴磊
陈南燕
余文青
杨玥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XIAMEN ZHAOXIANG INTELLIGENT TECHNOLOGY CO LTD
Original Assignee
XIAMEN ZHAOXIANG INTELLIGENT TECHNOLOGY CO LTD
Application filed by XIAMEN ZHAOXIANG INTELLIGENT TECHNOLOGY CO LTD
Priority: CN202410008026.4A
Publication of CN117523500A
Application granted
Publication of CN117523500B
Legal status: Active

Classifications

    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 2201/07 Target detection (indexing scheme relating to image or video recognition or understanding)

Abstract

The application provides a monitoring system, method and storage medium for a flight guarantee node. A video acquisition module acquires real-time video streams of viewing angles such as the stand and the aircraft cabin door in a flight guarantee area; a positioning module collects real-time positioning information on workers, aircraft and ground moving targets in the flight guarantee area; a real-time video analysis module inputs the real-time video frames of each viewing angle at the current moment into a trained machine learning model to obtain the guarantee node states corresponding to each viewing angle at the current moment, such as the aircraft in/out state, the cabin door open/close state and the ground service state, and calibrates the guarantee node states according to the real-time positioning information acquired at the current moment to obtain high-precision guarantee node states; a data integration module generates and reports an XML message for the aircraft at the current moment according to the high-precision guarantee node states, the flight identification and the current moment. The application can effectively improve the accuracy and real-time performance of flight guarantee node monitoring.

Description

Monitoring system, method and storage medium of flight guarantee node
Technical Field
The present disclosure relates to the field of flight guarantee technologies, and in particular, to a monitoring system and method for a flight guarantee node, and a storage medium.
Background
Traditional acquisition of flight guarantee node information relies mainly on manual operation; it is inefficient, error-prone, and cannot collect or update flight guarantee node information in real time.
With the development of video technology, cameras can collect real-time video that is accessed through the Real Time Streaming Protocol (RTSP), so that flight guarantee node information can be collected and updated in real time.
However, for flight guarantee node monitoring, video content alone is not comprehensive enough, and the analysis and processing efficiency of traditional video analysis techniques is low, so the accuracy and real-time requirements of flight guarantee node monitoring are difficult to meet.
Disclosure of Invention
To solve the above problems, the application provides a monitoring system, method and storage medium for a flight guarantee node, which can effectively improve the accuracy and real-time performance of flight guarantee node monitoring, improve working efficiency, reduce the monitoring error rate, and provide reliable data support for airport operation and management.
In a first aspect, the present application provides a monitoring system for a flight support node, the system comprising:
the video acquisition module is used for: collecting real-time video streams of a plurality of viewing angles in a flight guarantee area, wherein the plurality of viewing angles at least comprise: the stand and the aircraft cabin door;
a positioning module, used for: collecting real-time positioning information in the flight guarantee area, wherein the real-time positioning information comprises: at least one of worker positioning information, aircraft positioning information and ground moving target positioning information;
the real-time video analysis module is used for: acquiring the real-time video frames transmitted by the video acquisition module at the current moment, and inputting the real-time video frame of each viewing angle into a trained machine learning model to obtain the guarantee node state corresponding to each viewing angle at the current moment, wherein the guarantee node state comprises: an aircraft in/out state, a cabin door open/close state and a ground service state;
the real-time video analysis module is further configured to: calibrate the guarantee node state according to the real-time positioning information acquired at the current moment to obtain a high-precision guarantee node state;
the data integration module is used for: generating an XML message for the aircraft at the current moment according to the high-precision guarantee node state, the flight identification of the corresponding aircraft and the current moment, encapsulating the XML message in a preset format, and uploading the XML message to a flight guarantee management platform through a data bus.
In one possible implementation, the positioning module is configured to:
Acquiring positioning information of mobile terminal equipment in the flight guarantee area, and determining one or more staff positions according to a binding relationship between equipment identification of the mobile terminal equipment and staff identity;
determining the position of the aircraft according to satellite positioning signals, range finder signals or high-frequency omni-directional beacons of the aircraft in the flight guarantee area;
the ground moving target comprises a vehicle, and the positioning module is used for determining the position of the vehicle according to satellite positioning signals, radar signals or internet of vehicles signals of the vehicle detected in the flight guarantee area.
In one possible implementation, the real-time video analysis module is configured to:
acquiring a real-time video frame of the stand viewing angle/the aircraft cabin door viewing angle transmitted at the current moment, and inputting the real-time video frame into a trained machine learning model;
the machine learning model performs target detection on the real-time video frame, determines the real-time aircraft state in the real-time video frame, compares the previous aircraft state corresponding to the moment immediately preceding the current moment with the real-time aircraft state, and determines the first guarantee node state corresponding to the stand viewing angle/the second guarantee node state and the third guarantee node state corresponding to the aircraft cabin door viewing angle;
the first guarantee node state is: aircraft in position, aircraft pushed out, wheel chocks placed, wheel chocks removed, boarding bridge docked, boarding bridge departed, fueling started or fueling completed;
the second guarantee node state comprises: passenger cabin door opened/closed, cargo compartment door opened/closed;
the third guarantee node state is: catering started/completed.
In one possible implementation, for the stand view, the real-time video analysis module is configured to:
A. when the aircraft in the previous aircraft state is in motion, and the aircraft in the real-time aircraft state has stopped and is located in a preset aircraft stand area, determining that the first guarantee node state is aircraft in position, and recording the current moment as the aircraft in-position time;
B. when the aircraft in the previous aircraft state is located in the preset aircraft stand area and stationary, and the aircraft in the real-time aircraft state is in motion, determining that the first guarantee node state is aircraft pushed out, and recording the current moment as the aircraft push-out time;
C. when the aircraft in the previous aircraft state is located in the preset aircraft stand area and stationary, if the first wheel chock is placed on the aircraft in the real-time aircraft state, determining that the first guarantee node state is wheel chocks placed, and recording the current moment as the aircraft arrival time;
D. when the aircraft in the previous aircraft state is located in the preset aircraft stand area and stationary, if the last wheel chock is removed from the aircraft in the real-time aircraft state, determining that the first guarantee node state is wheel chocks removed, and recording the current moment as the aircraft departure time;
E. when the boarding bridge in the previous aircraft state is in motion, and the boarding bridge in the real-time aircraft state has stopped and is fully docked with the aircraft, determining that the first guarantee node state is boarding bridge docked, and recording the current moment as the bridge docking completion time;
F. when the boarding bridge in the previous aircraft state has stopped and is fully docked with the aircraft, and the boarding bridge in the real-time aircraft state is in motion and fully separated from the aircraft, determining that the first guarantee node state is boarding bridge departed, and recording the current moment as the bridge departure completion time;
G. when the wing fuel port in the previous aircraft state is not connected to a fuel hose, and the wing fuel port in the real-time aircraft state is fully connected to the fuel hose, determining that the first guarantee node state is fueling started, and recording the current moment as the fueling start time;
H. when the wing fuel port in the previous aircraft state is fully connected to the fuel hose, and the wing fuel port in the real-time aircraft state is disconnected from the fuel hose, determining that the first guarantee node state is fueling completed, and recording the current moment as the fueling completion time.
In one possible implementation, the real-time video analysis module is configured to:
determining, according to the aircraft position in the real-time positioning information, whether the aircraft is located in the preset aircraft stand area at the current moment;
in the case where the aircraft is determined to be located in the stand area, marking the first guarantee node state judged in combination with the stand area as a high-precision guarantee node state, wherein the first guarantee node state judged in combination with the stand area comprises: aircraft in position, aircraft pushed out, wheel chocks placed and wheel chocks removed;
in the case where the aircraft is determined to be located outside the stand area while the first guarantee node state is aircraft in position, aircraft pushed out, wheel chocks placed or wheel chocks removed, re-determining the first guarantee node state.
In one possible implementation, the real-time video analysis module is configured to:
determining, according to the aircraft position in the real-time positioning information, whether the aircraft is located in the preset aircraft stand area at the current moment; determining, according to the vehicle position in the real-time positioning information, whether the vehicle used for placing/removing the wheel chocks at the current moment is located in the stand area; and determining, according to the worker position in the real-time positioning information, whether the worker performing the chock placement/removal at the current moment is located in the stand area;
in the case where the aircraft, the vehicle and the worker performing the chock placement/removal are all located in the stand area, marking the first guarantee node state judged to be wheel chocks placed/removed as a high-precision guarantee node state.
In one possible implementation, for the aircraft cabin door viewing angle, the real-time video analysis module is configured to:
I. when all passenger/cargo doors in the previous aircraft state are closed, and one passenger/cargo door in the real-time aircraft state is open, determining that the second guarantee node state comprises: passenger cabin door/cargo compartment door opened, and recording the current moment as the passenger cabin door opening time/cargo compartment door opening time;
J. when any passenger/cargo door in the previous aircraft state is open, and all passenger/cargo doors in the real-time aircraft state are closed, determining that the second guarantee node state comprises: passenger cabin door/cargo compartment door closed, and recording the current moment as the passenger cabin door closing time/cargo compartment door closing time;
K. when the catering truck in the previous aircraft state is parked in a preset working area and its body is rising, and the catering truck in the real-time aircraft state is stationary and fully docked with the aircraft cargo compartment door, determining that the third guarantee node state is catering started, and recording the current moment as the catering start time;
L. when the catering truck in the previous aircraft state is stationary and fully docked with the aircraft cargo compartment door, and the catering truck in the real-time aircraft state is separated from the aircraft cargo compartment door and its body is descending, determining that the third guarantee node state is catering completed, and recording the current moment as the catering completion time.
In one possible implementation, the real-time video analysis module is configured to:
determining, according to the vehicle position in the real-time positioning information, whether the catering truck is located in the preset working area at the current moment; and determining, according to the worker position in the real-time positioning information, whether the worker performing the catering task at the current moment is located in the preset working area;
in the case where the catering truck and the worker performing the catering task are both located in the working area, marking the third guarantee node state judged to be catering started/completed as a high-precision guarantee node state.
In a second aspect, a method for monitoring a flight guarantee node is provided, the method comprising:
S1, a video acquisition module collects real-time video streams of a plurality of viewing angles in a flight guarantee area, wherein the plurality of viewing angles at least comprise: the stand and the aircraft cabin door;
S2, a positioning module collects real-time positioning information in the flight guarantee area, wherein the real-time positioning information comprises: at least one of worker positioning information, aircraft positioning information and ground moving target positioning information;
S3, a real-time video analysis module acquires the real-time video frames transmitted by the video acquisition module at the current moment, and inputs the real-time video frame of each viewing angle into a trained machine learning model to obtain the guarantee node state corresponding to each viewing angle at the current moment, wherein the guarantee node state comprises: an aircraft in/out state, a cabin door open/close state and a ground service state;
S4, the real-time video analysis module calibrates the guarantee node state according to the real-time positioning information acquired at the current moment to obtain a high-precision guarantee node state;
S5, the data integration module generates an XML message for the aircraft at the current moment according to the high-precision guarantee node state, the flight identification of the corresponding aircraft and the current moment, encapsulates the XML message in a preset format, and uploads the XML message to the flight guarantee management platform through a data bus.
In one possible implementation, the method for monitoring the flight guarantee node includes the steps described in any possible implementation of the monitoring system of the flight guarantee node described in the first aspect.
In a third aspect, a computing device is provided, comprising a memory and a processor, the memory storing at least one program, the at least one program being executed by the processor to implement the method for monitoring a flight guarantee node provided in the second aspect.
In a fourth aspect, a computer-readable storage medium is provided, in which at least one program is stored, the at least one program being executed by a processor to implement the method for monitoring a flight guarantee node provided in the second aspect.
The technical scheme provided by the application at least comprises the following technical effects:
the application provides a monitoring system and method for a flight guarantee node: a video acquisition module acquires real-time video streams of viewing angles such as the stand and the aircraft cabin door in a flight guarantee area; a positioning module collects real-time positioning information on workers, aircraft and ground moving targets in the flight guarantee area; a real-time video analysis module inputs the real-time video frames of each viewing angle at the current moment into a trained machine learning model to obtain the guarantee node states corresponding to each viewing angle at the current moment, such as the aircraft in/out state, the cabin door open/close state and the ground service state, and calibrates the guarantee node states according to the real-time positioning information acquired at the current moment to obtain high-precision guarantee node states; a data integration module generates and reports an XML message for the aircraft at the current moment according to the high-precision guarantee node states, the flight identification and the current moment.
Accordingly, the present application can effectively improve the accuracy and real-time performance of flight guarantee node monitoring, reduce the monitoring error rate while improving working efficiency, and provide reliable data support for airport operation and management.
Drawings
Fig. 1 is a schematic diagram of a monitoring system of a flight guarantee node according to an embodiment of the present application;
fig. 2 is a flowchart of a method for monitoring a flight guarantee node according to an embodiment of the present application;
fig. 3 is a schematic hardware structure of a computing device according to an embodiment of the present application.
Detailed Description
To further illustrate the embodiments, the present application provides the accompanying drawings. The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments and together with the description, serve to explain the principles of the embodiments. With reference to these matters, one of ordinary skill in the art would understand other possible embodiments and the advantages of the present application. The components in the figures are not drawn to scale and like reference numerals are generally used to designate like components. The term "at least one" in this application means one or more, the term "plurality" in this application means two or more, for example, a plurality of nodes means two or more.
The present application will now be further described with reference to the drawings and detailed description.
The embodiment of the application provides a monitoring system of a flight guarantee node, which can realize high-precision real-time monitoring of flight guarantee node states. Fig. 1 is a schematic diagram of a monitoring system of a flight guarantee node according to an embodiment of the present application; referring to fig. 1, the system includes: a video acquisition module, a positioning module, a real-time video analysis module and a data integration module.
In this embodiment of the present application, the video acquisition module is configured to acquire real-time video streams of a plurality of viewing angles in a flight guarantee area, where the plurality of viewing angles at least includes: stand and aircraft door.
The positioning module is used for acquiring real-time positioning information in the flight guarantee area, and the real-time positioning information comprises: at least one of the staff positioning information, the aircraft positioning information and the ground moving target positioning information.
The real-time video analysis module is configured to acquire the real-time video frames transmitted by the video acquisition module at the current moment, and input the real-time video frame of each viewing angle into the trained machine learning model to obtain the guarantee node state corresponding to each viewing angle at the current moment, wherein the guarantee node state comprises: an aircraft in/out state, a cabin door open/close state and a ground service state.
The real-time video analysis module is further configured to calibrate the guarantee node state according to the real-time positioning information acquired at the current moment to obtain a high-precision guarantee node state.
The data integration module is used for generating an extensible markup language (XML) message of the aircraft at the current moment according to the high-precision guarantee node state, the flight identification of the corresponding aircraft and the current moment, packaging the XML message according to a preset format, and uploading the XML message to the flight guarantee management platform through a data bus.
The principles of the steps executed by each module in the monitoring system of the flight guarantee node are described in detail in the following method embodiments and are not repeated here.
The aircraft described herein may be a passenger aircraft, a cargo aircraft, or other type of aircraft, to which the present application is not limited. Illustratively, the video acquisition module includes a plurality of cameras deployed at various points within the flight support area. Wherein, the flight guarantee area includes: the flight area for the aircraft to move (take-off, land, taxi and park) and the station area for passengers and/or cargo to get on and off. Further, the flight area includes a stand area in which the aircraft may take off and land, park, and receive ground services.
Illustratively, transmission of the real-time video stream between the video acquisition module and the real-time video analysis module is based on the Real Time Streaming Protocol (RTSP).
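The patent does not prescribe a particular client implementation; the following is a minimal sketch, assuming an OpenCV-based RTSP client and an illustrative one-frame-per-second analysis rate, of how real-time frames could be pulled from such a stream (the camera URL and credentials are hypothetical).

```python
# Minimal sketch, not the patented implementation: pulling frames from an RTSP
# camera with OpenCV. The URL/credentials and the one-frame-per-second sampling
# rate are illustrative assumptions.
import time

import cv2

RTSP_URL = "rtsp://user:pass@192.0.2.10:554/stand_left"  # hypothetical camera address


def sampled_frames(url: str, interval_s: float = 1.0):
    """Yield (timestamp, frame) roughly every `interval_s` seconds from an RTSP stream."""
    cap = cv2.VideoCapture(url)
    if not cap.isOpened():
        raise RuntimeError(f"cannot open stream: {url}")
    last = 0.0
    try:
        while True:
            ok, frame = cap.read()            # grab and decode the next frame
            if not ok:
                break                         # stream dropped or ended
            now = time.time()
            if now - last >= interval_s:      # downsample to the analysis rate
                last = now
                yield now, frame
    finally:
        cap.release()


if __name__ == "__main__":
    for ts, frame in sampled_frames(RTSP_URL):
        print(ts, frame.shape)                # hand the frame to the analysis module here
```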
The monitoring system provided by the embodiment of the application integrates multiple functional modules, can mine and process multi-view information, and calibrates the information extracted from the video streams with the high-precision real-time positioning information provided by the positioning module, so that the monitoring result is more reliable and the system can operate stably in complex environments. Therefore, the monitoring system of the flight guarantee node can effectively improve the accuracy and real-time performance of flight guarantee node monitoring, improve working efficiency, reduce the monitoring error rate, and provide convenient and reliable data support for airport operation and management.
The following describes the principles of the steps executed by each module in the monitoring system of the flight guarantee node by introducing the monitoring method of the flight guarantee node provided by the embodiment of the application. Fig. 2 is a flowchart of a method for monitoring a flight guarantee node according to an embodiment of the present application; referring to fig. 2, the method includes the following steps S1 to S5.
S1, the video acquisition module collects real-time video streams of a plurality of viewing angles in the flight guarantee area.
In this embodiment of the present application, the plurality of viewing angles at least includes: stand and aircraft door. In particular, the stand view angle may comprise a left side view angle and a right side view angle, and the aircraft door view angle may comprise a passenger door view angle and a cargo door view angle. Of course, more view angles can be expanded according to requirements to monitor the states of more guaranteed nodes, and the application is not limited to the above.
S2, the positioning module collects real-time positioning information in the flight guarantee area.
In this embodiment of the present application, the real-time positioning information includes: at least one of the staff positioning information, the aircraft positioning information and the ground moving target positioning information.
In one possible implementation, this step S2 includes:
s21, the positioning module collects positioning information of the mobile terminal equipment in the flight guarantee area, and determines one or more staff positions according to the binding relationship between the equipment identification of the mobile terminal equipment and the staff identity.
The mobile terminal device is, for example, a worker's smart phone, a portable positioning device (e.g., a smart electronic tablet), or any portable terminal device that supports positioning.
S22, the positioning module determines the position of the aircraft according to satellite positioning signals, range finder signals or high-frequency omni-directional beacons of the aircraft in the flight guarantee area.
The satellite positioning signals are provided by, for example, a satellite positioning system such as GPS, beidou, galileo, etc., which is not limited in this application.
S23, the positioning module determines the vehicle position according to satellite positioning signals, radar signals or Internet-of-Vehicles signals of a vehicle detected in the flight guarantee area.
High-precision positioning information can be acquired through the above process and used to calibrate the guarantee node states identified from the video streams, thereby improving the accuracy and real-time performance of guarantee node state monitoring.
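As an illustration only (the patent does not specify data structures), one way to consolidate the three positioning sources of steps S21 to S23 into a single per-moment snapshot, including the device-to-worker binding of S21, might look like the sketch below; all field names, the coordinate convention and the binding table are assumptions.

```python
# Illustrative only: consolidating the three positioning sources (S21 to S23)
# into one per-moment snapshot. The coordinate convention, dataclass fields and
# the device-to-worker binding table are assumptions, not the patent's format.
from dataclasses import dataclass
from typing import Dict, Tuple

Position = Tuple[float, float]  # (x, y) in an apron-local coordinate frame (assumed)


@dataclass
class PositioningSnapshot:
    workers: Dict[str, Position]    # worker identity -> position (from bound mobile devices)
    aircraft: Dict[str, Position]   # flight identification -> position
    vehicles: Dict[str, Position]   # vehicle identification -> position


DEVICE_TO_WORKER = {"dev-0001": "worker-A", "dev-0002": "worker-B"}  # hypothetical binding table


def build_snapshot(device_fixes: Dict[str, Position],
                   aircraft_fixes: Dict[str, Position],
                   vehicle_fixes: Dict[str, Position]) -> PositioningSnapshot:
    """Map device positions to worker identities via the binding table and bundle all sources."""
    workers = {DEVICE_TO_WORKER[dev]: pos
               for dev, pos in device_fixes.items() if dev in DEVICE_TO_WORKER}
    return PositioningSnapshot(workers=workers, aircraft=aircraft_fixes, vehicles=vehicle_fixes)
```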
S3, the real-time video analysis module acquires the real-time video frames transmitted by the video acquisition module at the current moment, and inputs the real-time video frame of each viewing angle into the trained machine learning model to obtain the guarantee node state corresponding to each viewing angle at the current moment.
In this embodiment, the guarantee node state comprises: an aircraft in/out state, a cabin door open/close state and a ground service state. Specifically, the aircraft in/out state includes: aircraft in position, aircraft pushed out, wheel chocks placed, wheel chocks removed, boarding bridge docked, boarding bridge departed, and the like; the cabin door open/close state includes: passenger/cargo door opened or closed, and the like; the ground service state includes: fueling started, fueling completed, catering started, catering completed, and the like. Of course, more guarantee nodes can be added as required, which is not limited in this application.
In one possible implementation, this step S3 includes:
S31, the real-time video analysis module acquires a real-time video frame of the stand viewing angle/the aircraft cabin door viewing angle transmitted at the current moment, and inputs the real-time video frame into the trained machine learning model;
S32, the real-time video analysis module performs target detection on the real-time video frame through the machine learning model, determines the real-time aircraft state in the real-time video frame, compares the previous aircraft state corresponding to the moment immediately preceding the current moment with the real-time aircraft state, and determines the first guarantee node state corresponding to the stand viewing angle/the second guarantee node state and the third guarantee node state corresponding to the aircraft cabin door viewing angle.
The first guarantee node state is: aircraft in position, aircraft pushed out, wheel chocks placed, wheel chocks removed, boarding bridge docked, boarding bridge departed, fueling started or fueling completed; the second guarantee node state comprises: passenger cabin door opened/closed, cargo compartment door opened/closed; the third guarantee node state is: catering started/completed.
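The patent only states that a "trained machine learning model" performs the target detection in S32. As a hedged illustration, the sketch below stands in an off-the-shelf YOLO-style detector (the ultralytics package) for that model and maps its detections to a coarse per-frame state; the weights file and class names are hypothetical. Successive per-frame states produced this way feed the comparison logic described in the examples that follow.

```python
# Sketch under assumptions: the patent only says "a trained machine learning model".
# Here an off-the-shelf YOLO-style detector (ultralytics) stands in for that model;
# the weights file and class names are hypothetical.
from ultralytics import YOLO

model = YOLO("guarantee_nodes.pt")  # hypothetical weights trained on apron imagery


def detect_frame_state(frame) -> dict:
    """Run target detection on one video frame and return a coarse per-frame state."""
    result = model(frame, verbose=False)[0]
    state = {"aircraft_box": None, "chocks": 0, "bridge_docked": False, "open_doors": set()}
    for box, cls in zip(result.boxes.xyxy.tolist(), result.boxes.cls.tolist()):
        name = result.names[int(cls)]
        if name == "aircraft":
            state["aircraft_box"] = box       # [x1, y1, x2, y2] in pixels
        elif name == "wheel_chock":
            state["chocks"] += 1
        elif name == "bridge_docked":
            state["bridge_docked"] = True
        elif name.startswith("open_door_"):   # e.g. open_door_L1, open_door_cargo_fwd (assumed labels)
            state["open_doors"].add(name)
    return state
```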
In one possible implementation, the process by which the real-time video analysis module determines the guarantee node state for the stand viewing angle includes examples A to H below.
A. When the aircraft in the previous aircraft state is in motion, and the aircraft in the real-time aircraft state has stopped and is located in the preset aircraft stand area, the first guarantee node state is determined to be aircraft in position, and the current moment is recorded as the aircraft in-position time.
The aircraft in-position node is defined as the moment at which an arriving flight reaches its stand. The aircraft states identified at the preceding and current moments are compared; when the aircraft has entered the preset stand area and changed from motion to a stop, the aircraft is judged to be in position, and the corresponding moment is recorded.
B. When the aircraft in the previous aircraft state is located in the preset aircraft stand area and stationary, and the aircraft in the real-time aircraft state is in motion, the first guarantee node state is determined to be aircraft pushed out, and the current moment is recorded as the aircraft push-out time.
The aircraft push-out node is defined as the moment at which a departing flight leaves its stand. The aircraft states identified at the preceding and current moments are compared; when the aircraft changes from rest to motion, push-out is judged to be completed, and the corresponding moment is recorded.
C. When the aircraft in the previous aircraft state is located in the preset aircraft stand area and stationary, if the first wheel chock is placed on the aircraft in the real-time aircraft state, the first guarantee node state is determined to be wheel chocks placed, and the current moment is recorded as the aircraft arrival time.
The chocks-placed node is defined as the moment at which ground crew place the first wheel chock after the aircraft has stopped (arrival time). The aircraft states identified at the preceding and current moments are compared; when the first wheel chock is placed against the aircraft, chock placement is judged to be completed, and the corresponding moment is recorded.
D. When the aircraft in the previous aircraft state is located in the preset aircraft stand area and stationary, if the last wheel chock is removed from the aircraft in the real-time aircraft state, the first guarantee node state is determined to be wheel chocks removed, and the current moment is recorded as the aircraft departure time.
The chocks-removed node is defined as the moment at which ground crew remove the last wheel chock from the aircraft after the crew has received air traffic control clearance to push back or taxi out (departure time). The aircraft states identified at the preceding and current moments are compared; when the last wheel chock is removed, chock removal is judged to be completed, and the corresponding moment is recorded.
E. When the boarding bridge in the previous aircraft state is in motion, and the boarding bridge in the real-time aircraft state has stopped and is fully docked with the aircraft, the first guarantee node state is determined to be boarding bridge docked, and the current moment is recorded as the bridge docking completion time.
The bridge docking node is defined as the moment at which the boarding bridge docks with the aircraft. The aircraft states identified at the preceding and current moments are compared; when the bridge changes from motion to a stop and is fully docked with the aircraft, bridge docking is judged to be completed, and the corresponding moment is recorded.
F. When the boarding bridge in the previous aircraft state has stopped and is fully docked with the aircraft, and the boarding bridge in the real-time aircraft state is in motion and fully separated from the aircraft, the first guarantee node state is determined to be boarding bridge departed, and the current moment is recorded as the bridge departure completion time;
the bridge departure node is defined as the moment at which the boarding bridge separates from the aircraft. The aircraft states identified at the preceding and current moments are compared; when the bridge changes from a stop to motion and separates from the aircraft, bridge departure is judged to be completed, and the corresponding moment is recorded.
G. When the wing fuel port in the previous aircraft state is not connected to a fuel hose, and the wing fuel port in the real-time aircraft state is fully connected to the fuel hose, the first guarantee node state is determined to be fueling started, and the current moment is recorded as the fueling start time;
the fueling start node is defined as the moment at which the fuel hose is connected to the wing fuel port (start of fuel supply). The aircraft states identified at the preceding and current moments are compared; when the hose-to-port connection occurs, fueling is judged to have started, and the corresponding moment is recorded.
H. When the wing fuel port in the previous aircraft state is fully connected to the fuel hose, and the wing fuel port in the real-time aircraft state is disconnected from the fuel hose, the first guarantee node state is determined to be fueling completed, and the current moment is recorded as the fueling completion time.
The fueling completion node is defined as the moment at which the fuel hose is disconnected from the wing fuel port (end of fuel supply). The aircraft states identified at the preceding and current moments are compared; when the hose-to-port disconnection occurs, fueling is judged to be completed, and the corresponding moment is recorded.
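For illustration, a minimal sketch of the previous-state versus real-time-state comparison described in examples A to H follows. Only transitions A (aircraft in position) and C (wheel chocks placed) are shown; the state keys ("moving", "chocks") are assumed to be provided by the upstream detection/tracking step and are not the patent's representation.

```python
# Minimal sketch of the previous-state vs. real-time-state comparison in examples
# A to H above. Only transitions A and C are shown; the state dictionary keys
# ("moving", "chocks") and the event names are assumptions.
from datetime import datetime, timezone


def stand_view_events(prev: dict, curr: dict, in_stand_area: bool) -> list:
    """Compare two consecutive frame states and emit stand-view guarantee node events."""
    events = []
    now = datetime.now(timezone.utc).isoformat()
    # A: aircraft changes from moving to stopped inside the preset stand area -> in position
    if prev.get("moving") and not curr.get("moving") and in_stand_area:
        events.append(("aircraft_in_position", now))
    # C: aircraft stationary in the stand area and the first wheel chock appears -> chocks placed
    if (not prev.get("moving") and in_stand_area
            and prev.get("chocks", 0) == 0 and curr.get("chocks", 0) > 0):
        events.append(("wheel_chocks_placed", now))
    return events
```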
In one possible implementation, the process by which the real-time video analysis module determines the guarantee node state for the aircraft cabin door viewing angle includes examples I to L below.
I. When all passenger/cargo doors in the previous aircraft state are closed, and one passenger/cargo door in the real-time aircraft state is open, the second guarantee node state is determined to comprise: passenger cabin door/cargo compartment door opened, and the current moment is recorded as the passenger cabin door opening time/cargo compartment door opening time.
The door-opening node is defined as the moment at which the first passenger/cargo door of the arriving flight is opened. The aircraft states identified at the preceding and current moments are compared; when the first passenger/cargo door changes from closed to open, door opening is judged to be completed, and the corresponding moment is recorded.
J. When any passenger/cargo door in the previous aircraft state is open, and all passenger/cargo doors in the real-time aircraft state are closed, the second guarantee node state is determined to comprise: passenger cabin door/cargo compartment door closed, and the current moment is recorded as the passenger cabin door closing time/cargo compartment door closing time.
The door-closing node is defined as the moment at which the last passenger/cargo door of the flight is closed. The aircraft states identified at the preceding and current moments are compared; when the last open passenger/cargo door changes from open to closed, door closing is judged to be completed, and the corresponding moment is recorded.
K. When the catering truck in the previous aircraft state is parked in a preset working area and its body is rising, and the catering truck in the real-time aircraft state is stationary and fully docked with the aircraft cargo compartment door, the third guarantee node state is determined to be catering started, and the current moment is recorded as the catering start time.
The catering start node is defined as the moment at which the catering truck rises and begins to dock with the flight's cargo compartment door. The aircraft states identified at the preceding and current moments are compared; when the catering truck has parked in the working area and its body has risen and come to rest, catering is judged to have started, and the corresponding moment is recorded. The working area is set within the flight guarantee area according to the docking position of the catering truck and the aircraft.
L. When the catering truck in the previous aircraft state is stationary and fully docked with the aircraft cargo compartment door, and the catering truck in the real-time aircraft state is separated from the aircraft cargo compartment door and its body is descending, the third guarantee node state is determined to be catering completed, and the current moment is recorded as the catering completion time.
The catering completion node is defined as the moment at which the catering truck separates from the flight's cargo compartment door and begins to descend. The aircraft states identified at the preceding and current moments are compared; when the body of the catering truck starts to descend from rest, catering is judged to be completed, and the corresponding moment is recorded.
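A corresponding sketch for the cabin door viewing angle (examples I and J) is given below, assuming the upstream detector reports the set of currently open passenger/cargo doors per frame; this is purely illustrative.

```python
# Illustrative sketch of the door open/close transitions in examples I and J above,
# assuming a per-frame set of open passenger/cargo doors from the detector.
def door_view_events(prev_open: set, curr_open: set, timestamp: str) -> list:
    """Emit door open/close guarantee node events from two consecutive frame states."""
    events = []
    if not prev_open and curr_open:       # I: all doors closed before, one door open now
        events.append(("door_opened", timestamp))
    if prev_open and not curr_open:       # J: at least one door open before, all closed now
        events.append(("door_closed", timestamp))
    return events
```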
Through the above process, the multi-view video data can be mined and processed, and the real-time states of the various guarantee nodes can be obtained accurately.
S4, the real-time video analysis module calibrates the guarantee node state according to the real-time positioning information acquired at the current moment to obtain a high-precision guarantee node state.
The principle of calibrating the states of different guaranteed nodes is described below in conjunction with the various different real-time positioning information described in step S2.
In one possible implementation, the process by which the real-time video analysis module calibrates a first guarantee node state that is identified under the stand viewing angle and executed within the preset stand area comprises:
1a, the real-time video analysis module determines, according to the aircraft position in the real-time positioning information, whether the aircraft is located in the preset aircraft stand area at the current moment;
1b, in the case where the aircraft is determined to be located in the stand area, the real-time video analysis module marks the first guarantee node state judged in combination with the stand area as a high-precision guarantee node state, wherein the first guarantee node state judged in combination with the stand area comprises: aircraft in position, aircraft pushed out, wheel chocks placed and wheel chocks removed;
1c, in the case where the aircraft is determined to be located outside the stand area while the first guarantee node state is aircraft in position, aircraft pushed out, wheel chocks placed or wheel chocks removed, the real-time video analysis module re-determines the first guarantee node state.
Through the above process, the real-time positioning information is used to accurately confirm the real-time position of the aircraft, and guarantee activities executed within the preset stand area, such as aircraft in position, aircraft push-out, chock placement and chock removal, are calibrated, so that errors caused by misrecognition or transmission delay of the video streams are avoided, and the accuracy and real-time performance of guarantee node state recognition are improved.
In one possible implementation, for a first guarantee node that involves the coordination of a ground vehicle and a worker, the calibration process of the real-time video analysis module comprises:
2a, the real-time video analysis module determines, according to the aircraft position in the real-time positioning information, whether the aircraft is located in the preset aircraft stand area at the current moment; determines, according to the vehicle position in the real-time positioning information, whether the vehicle used for placing/removing the wheel chocks at the current moment is located in the stand area; and determines, according to the worker position in the real-time positioning information, whether the worker performing the chock placement/removal at the current moment is located in the stand area;
2b, in the case where the aircraft, the vehicle and the worker performing the chock placement/removal are all located in the stand area, the real-time video analysis module marks the first guarantee node state judged to be wheel chocks placed/removed as a high-precision guarantee node state.
Through the above process, the real-time positioning information is used to accurately confirm the real-time positions of the aircraft, the worker and the ground vehicle, and guarantee activities such as chock placement and chock removal, which are executed within the preset stand area and performed cooperatively by the ground vehicle and the worker, are calibrated, so that errors caused by misrecognition or transmission delay of the video streams are avoided, and the accuracy and real-time performance of guarantee node state recognition are improved.
In one possible implementation, for a third guarantee node that involves the coordination of a ground vehicle and a worker, the calibration process of the real-time video analysis module comprises:
3a, the real-time video analysis module determines, according to the vehicle position in the real-time positioning information, whether the catering truck is located in the preset working area at the current moment, and determines, according to the worker position in the real-time positioning information, whether the worker performing the catering task at the current moment is located in the preset working area;
3b, in the case where the catering truck and the worker performing the catering task are both located in the working area, the real-time video analysis module marks the third guarantee node state judged to be catering started/completed as a high-precision guarantee node state.
Through the above process, the real-time positioning information is used to accurately confirm the positions of the worker and the ground vehicle, and guarantee activities that are executed within the preset working area and performed cooperatively by the ground vehicle and the worker are calibrated, so that errors caused by misrecognition or transmission delay of the video streams are avoided, and the accuracy and real-time performance of guarantee node state recognition are improved.
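The patent does not fix a geometric representation for the stand area or working area. The sketch below assumes simple axis-aligned rectangles in apron coordinates and shows how a chocks placed/removed event might be marked high precision only when the aircraft, the chock vehicle and the worker are all inside the stand area; it reuses the PositioningSnapshot sketch above, and the event field names are likewise assumptions.

```python
# Illustrative calibration sketch: rectangular areas and event/snapshot field names
# are assumptions, not the patent's representation.
def in_area(pos, area) -> bool:
    """pos = (x, y); area = (x_min, y_min, x_max, y_max)."""
    x, y = pos
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1


def calibrate_chock_event(event: dict, snapshot, stand_area) -> dict:
    """Mark a chocks placed/removed event as high precision only when the aircraft,
    the chock vehicle and the worker are all inside the stand area."""
    event["high_precision"] = (
        in_area(snapshot.aircraft[event["flight"]], stand_area)
        and in_area(snapshot.vehicles[event["vehicle"]], stand_area)
        and in_area(snapshot.workers[event["worker"]], stand_area)
    )
    return event
```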
And S5, the data integration module generates an XML message of the aircraft at the current moment according to the high-precision guarantee node state, the flight identification of the corresponding aircraft and the current moment, encapsulates the XML message according to a preset format, and uploads the XML message to the flight guarantee management platform through the data bus.
In the embodiment of the application, the format of the XML message is predefined according to the reporting requirements of the flight guarantee nodes by taking advantage of the extensibility of XML. For example, the XML message comprises a plurality of elements: a first element stores the guarantee node state of the first guarantee node, a second element stores the guarantee node state of the second guarantee node, and a third element stores the guarantee node state of the third guarantee node. In addition to its data field, each element comprises a plurality of attribute fields: the first attribute field of an element stores the moment corresponding to the guarantee node state, the second attribute field stores the data length corresponding to the guarantee node state, and the third attribute field indicates whether the guarantee node state is high-precision.
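Since the patent defines the format only abstractly (per-node elements with time, data-length and high-precision attributes), the following standard-library sketch is one possible rendering; the element and attribute names are assumptions.

```python
# One possible rendering of the XML report described above, using the standard
# library; element and attribute names are assumptions.
import xml.etree.ElementTree as ET


def build_report(flight_id: str, moment: str, nodes: list) -> bytes:
    """nodes: dicts like {"state": "wheel_chocks_placed", "time": ..., "high_precision": True}."""
    root = ET.Element("FlightGuaranteeReport", flight=flight_id, time=moment)
    for node in nodes:
        element = ET.SubElement(
            root, "GuaranteeNode",
            time=node["time"],
            length=str(len(node["state"])),              # data length attribute
            highPrecision=str(node["high_precision"]).lower(),
        )
        element.text = node["state"]                      # the guarantee node state itself
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)
```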
The monitoring method of the flight guarantee node provided by the application can mine and process multi-view information, and calibrates the information extracted from the video streams with the high-precision real-time positioning information provided by the positioning module, so that the monitoring result is more reliable and the method can operate stably in complex environments; therefore, the accuracy and real-time performance of flight guarantee node monitoring can be effectively improved, and the monitoring error rate is reduced while working efficiency is improved.
Furthermore, the positioning module and the video acquisition module are used to realize automatic identification and tracking, which greatly reduces the input of human resources and ensures the accuracy and timeliness of the data; data from different sources and the guarantee node states can be aggregated to generate complete and detailed guarantee node reports for the flight guarantee management platform, providing convenient and reliable data support for airport operation and management.
The application provides a computing device, which can be implemented as any functional module in the monitoring system of the flight guarantee node described above, so as to execute all or part of the steps of the monitoring method of the flight guarantee node. Fig. 3 is a schematic diagram of the hardware structure of a computing device provided in an embodiment of the present application. As shown in fig. 3, the computing device includes a processor 301, a memory 302, a bus 303, and a computer program stored in the memory 302 and executable on the processor 301. The processor 301 includes one or more processing cores, the memory 302 is connected to the processor 301 through the bus 303, and the memory 302 is used to store program instructions; when executing the computer program, the processor 301 implements all or part of the steps of the foregoing method embodiments provided in the present application.
Further, as an executable scheme, the computing device may be a computer unit, and the computer unit may be a computing device such as a desktop computer, a notebook computer, a palm computer, and a cloud server. The computer unit may include, but is not limited to, a processor, a memory. It will be appreciated by those skilled in the art that the constituent structures of the computer unit described above are merely examples of the computer unit and are not limiting, and may include more or fewer components than those described above, or may combine certain components, or different components. For example, the computer unit may further include an input/output device, a network access device, a bus, etc., which is not limited in this embodiment of the present application.
Further, as an implementation, the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor; it is the control center of the computer unit and connects the various parts of the entire computer unit using various interfaces and lines.
The memory may be used to store the computer program and/or modules, and the processor implements the various functions of the computer unit by running or executing the computer program and/or modules stored in the memory and invoking data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and at least one application program required by a function, and the data storage area may store data created according to use of the device. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The present application also provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the methods described above in the embodiments of the present application.
The modules/units integrated in the computer unit may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as independent products. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), a software distribution medium, and so forth. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction.
While this application has been particularly shown and described with references to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the application as defined by the appended claims.

Claims (8)

1. A monitoring system for a flight guarantee node, the system comprising:
the video acquisition module is used for: collecting real-time video streams of a plurality of viewing angles in a flight guarantee area, wherein the plurality of viewing angles at least comprise: the stand and the aircraft cabin door;
a positioning module, used for: collecting real-time positioning information in the flight guarantee area, wherein the real-time positioning information comprises: worker positions, aircraft positions and vehicle positions;
the real-time video analysis module is used for: acquiring a real-time video frame of the stand viewing angle/the aircraft cabin door viewing angle transmitted at the current moment, and inputting the real-time video frame into a trained machine learning model; the machine learning model performs target detection on the real-time video frame, determines the real-time aircraft state in the real-time video frame, compares the previous aircraft state corresponding to the moment immediately preceding the current moment with the real-time aircraft state, and determines the first guarantee node state corresponding to the stand viewing angle/the second guarantee node state and the third guarantee node state corresponding to the aircraft cabin door viewing angle; the first guarantee node state is: aircraft in position, aircraft pushed out, wheel chocks placed, wheel chocks removed, boarding bridge docked, boarding bridge departed, fueling started or fueling completed; the second guarantee node state comprises: passenger cabin door opened/closed, cargo compartment door opened/closed; the third guarantee node state is: catering started/completed;
the real-time video analysis module is further configured to: calibrate the first, second and/or third guarantee node states according to the real-time positioning information acquired at the current moment to obtain high-precision guarantee node states;
the process by which the real-time video analysis module calibrates the first guarantee node state comprises:
determining, according to the aircraft position in the real-time positioning information, whether the aircraft is located in a preset aircraft stand area at the current moment; in the case where the aircraft is determined to be located in the stand area, marking the first guarantee node state judged in combination with the stand area as a high-precision guarantee node state, wherein the first guarantee node state judged in combination with the stand area comprises: aircraft in position, aircraft pushed out, wheel chocks placed and wheel chocks removed; determining, according to the vehicle position in the real-time positioning information, whether the vehicle used for placing/removing the wheel chocks at the current moment is located in the stand area; determining, according to the worker position in the real-time positioning information, whether the worker performing the chock placement/removal at the current moment is located in the stand area; and in the case where the aircraft, the vehicle used for placing/removing the wheel chocks and the worker are all located in the stand area, marking the first guarantee node state judged to be wheel chocks placed/removed as a high-precision guarantee node state;
The data integration module is used for: and generating an XML message of the aircraft at the current moment according to the high-precision guarantee node state, the flight identification of the corresponding aircraft and the current moment, packaging the XML message according to a preset format, and uploading the XML message to a flight guarantee management platform through a data bus.
2. The monitoring system of claim 1, wherein the positioning module is configured to:
acquire positioning information of mobile terminal devices in the flight guarantee area, and determine one or more staff positions according to a binding relationship between the device identifiers of the mobile terminal devices and staff identities;
determine the aircraft position according to satellite positioning signals, rangefinder signals or high-frequency omnidirectional beacon signals of the aircraft in the flight guarantee area;
wherein the ground moving target comprises a vehicle, and the positioning module is configured to determine the vehicle position according to satellite positioning signals, radar signals or Internet-of-Vehicles signals of the vehicle detected in the flight guarantee area.
3. The monitoring system of claim 1, wherein, for the stand viewing angle, the real-time video analysis module is configured to:
A. when the aircraft is in a moving state in the previous aircraft state, and in the real-time aircraft state the aircraft is in a stopped state and located in a preset aircraft stand area, determine that the first guarantee node state is aircraft in position, and record the current moment as the aircraft in-position time;
B. when the aircraft is located in the preset aircraft stand area and stationary in the previous aircraft state, and in the real-time aircraft state the aircraft is in a moving state, determine that the first guarantee node state is aircraft pushback, and record the current moment as the aircraft pushback time;
C. when the aircraft is located in the preset aircraft stand area and stationary in the previous aircraft state, and in the real-time aircraft state a first wheel chock is placed under the aircraft, determine that the first guarantee node state is chocks on, and record the current moment as the aircraft arrival time;
D. when the aircraft is located in the preset aircraft stand area and stationary in the previous aircraft state, and in the real-time aircraft state the last wheel chock is removed, determine that the first guarantee node state is chocks off, and record the current moment as the aircraft departure time;
E. when the boarding bridge is in a moving state in the previous aircraft state, and in the real-time aircraft state the boarding bridge is stopped and fully docked with the aircraft, determine that the first guarantee node state is bridge docked, and record the current moment as the bridge docking completion time;
F. when the boarding bridge is stopped and fully docked with the aircraft in the previous aircraft state, and in the real-time aircraft state the boarding bridge is in a moving state and fully separated from the aircraft, determine that the first guarantee node state is bridge undocked, and record the current moment as the bridge undocking completion time;
G. when the wing fueling port is not connected to a fuel hose in the previous aircraft state, and in the real-time aircraft state the wing fueling port is fully connected to the fuel hose, determine that the first guarantee node state is fueling started, and record the current moment as the fueling start time;
H. when the wing fueling port is fully connected to the fuel hose in the previous aircraft state, and in the real-time aircraft state the wing fueling port is disconnected from the fuel hose, determine that the first guarantee node state is fueling completed, and record the current moment as the fueling completion time.
4. The monitoring system of claim 1, wherein the real-time video analysis module is further configured to:
re-determine the first guarantee node state in the case that the aircraft is determined to be located outside the stand area and the first guarantee node state is aircraft in position, aircraft pushback, chocks on or chocks off.
5. The monitoring system of claim 1, wherein, for the aircraft cabin door viewing angle, the real-time video analysis module is configured to:
I. when all passenger cabin doors/cargo compartment doors are in a closed state in the previous aircraft state, and in the real-time aircraft state one passenger cabin door/cargo compartment door is in an open state, determine that the second guarantee node state comprises: passenger cabin door opened/cargo compartment door opened, and record the current moment as the passenger cabin door opening time/cargo compartment door opening time;
J. when any passenger cabin door/cargo compartment door is in an open state in the previous aircraft state, and in the real-time aircraft state all passenger cabin doors/cargo compartment doors are in a closed state, determine that the second guarantee node state comprises: passenger cabin door closed/cargo compartment door closed, and record the current moment as the passenger cabin door closing time/cargo compartment door closing time;
K. when the catering truck is parked in a preset working area and in a lifting state in the previous aircraft state, and in the real-time aircraft state the catering truck is stationary and fully docked with the aircraft cargo compartment door, determine that the third guarantee node state is catering started, and record the current moment as the catering start time;
L. when the catering truck is stationary and fully docked with the aircraft cargo compartment door in the previous aircraft state, and in the real-time aircraft state the catering truck is separated from the aircraft cargo compartment door and in a lowering state, determine that the third guarantee node state is catering completed, and record the current moment as the catering completion time.
6. The monitoring system of claim 5, wherein the real-time video analysis module is configured to:
determine, according to the vehicle position in the real-time positioning information, whether the catering truck is located in the preset working area at the current moment; determine, according to the staff position in the real-time positioning information, whether the staff member performing the catering task at the current moment is located in the preset working area;
and, in the case that the catering truck and the staff member performing the catering task are both located in the working area, mark the third guarantee node state judged as catering started/catering completed as a high-precision guarantee node state.
7. A method for monitoring a flight guarantee node, the method comprising:
S1, a video acquisition module collects real-time video streams from a plurality of viewing angles in a flight guarantee area, the viewing angles comprising at least: an aircraft stand viewing angle and an aircraft cabin door viewing angle;
S2, a positioning module collects real-time positioning information in the flight guarantee area, the real-time positioning information comprising: staff positions, aircraft positions and vehicle positions;
S31, a real-time video analysis module acquires the real-time video frame of the stand viewing angle and/or the aircraft cabin door viewing angle transmitted at the current moment, and inputs the real-time video frame into a trained machine learning model;
S32, the real-time video analysis module performs target detection on the real-time video frame through the machine learning model, determines the real-time aircraft state in the real-time video frame, compares the previous aircraft state corresponding to the moment preceding the current moment with the real-time aircraft state, and determines the first guarantee node state corresponding to the stand viewing angle, or the second and third guarantee node states corresponding to the aircraft cabin door viewing angle;
wherein the first guarantee node state is: aircraft in position, aircraft pushback, chocks on, chocks off, bridge docked, bridge undocked, fueling started or fueling completed; the second guarantee node state comprises: passenger cabin door opened/closed, cargo compartment door opened/closed; and the third guarantee node state is: catering started/catering completed;
S4, the real-time video analysis module calibrates the first, second and/or third guarantee node states according to the real-time positioning information acquired at the current moment, to obtain high-precision guarantee node states;
wherein the calibration performed by the real-time video analysis module for the first guarantee node state comprises:
determining, according to the aircraft position in the real-time positioning information, whether the aircraft is located in a preset aircraft stand area at the current moment; in the case that the aircraft is determined to be located in the stand area, marking the first guarantee node state judged in combination with the stand area as a high-precision guarantee node state, the first guarantee node state judged in combination with the stand area comprising: aircraft in position, aircraft pushback, chocks on and chocks off; determining, according to the vehicle position in the real-time positioning information, whether the staff vehicle used for placing/removing the wheel chocks at the current moment is located in the stand area; determining, according to the staff position in the real-time positioning information, whether the staff member placing/removing the wheel chocks at the current moment is located in the stand area; and, in the case that the aircraft, the staff vehicle used for placing/removing the wheel chocks and the staff member are all located in the stand area, marking the first guarantee node state judged as chocks on/chocks off as a high-precision guarantee node state;
S5, a data integration module generates an XML message of the aircraft at the current moment according to the high-precision guarantee node state, the flight identifier of the corresponding aircraft and the current moment, encapsulates the XML message in a preset format, and uploads the XML message to the flight guarantee management platform through a data bus.
8. A computer-readable storage medium having at least one program stored therein, the at least one program being executed by a processor to implement the method for monitoring a flight guarantee node of claim 7.
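To make the comparison of the previous and real-time aircraft states concrete (claims 1, 3 and 5), the following is a minimal Python sketch, not the patent's implementation: it assumes the machine learning model's per-frame detections have already been reduced to a simplified `AircraftState` record, and every field name (`is_moving`, `chock_count`, `bridge_docked`, ...) is an assumption introduced here for illustration.

```python
# Illustrative sketch only: deriving guarantee-node events by comparing the
# previous frame's detected aircraft state with the real-time one.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple


@dataclass
class AircraftState:
    is_moving: bool = False          # aircraft moving vs. stationary
    in_stand_area: bool = False      # inside the preset stand region
    chock_count: int = 0             # number of wheel chocks detected
    bridge_docked: bool = False      # boarding bridge fully docked
    fuel_hose_connected: bool = False
    passenger_doors_open: int = 0    # number of open passenger cabin doors


def detect_node_events(prev: AircraftState, curr: AircraftState,
                       now: datetime) -> List[Tuple[str, datetime]]:
    """Return (guarantee-node state, timestamp) pairs triggered at `now`.

    The conditions follow claim 3 (A-H) and claim 5 (I-J) in simplified form.
    """
    events = []
    # First guarantee node states (stand viewing angle).
    if prev.is_moving and not curr.is_moving and curr.in_stand_area:
        events.append(("aircraft_in_position", now))
    if not prev.is_moving and prev.in_stand_area and curr.is_moving:
        events.append(("aircraft_pushback", now))
    if prev.chock_count == 0 and curr.chock_count > 0 and curr.in_stand_area:
        events.append(("chocks_on", now))
    if prev.chock_count > 0 and curr.chock_count == 0 and curr.in_stand_area:
        events.append(("chocks_off", now))
    if not prev.bridge_docked and curr.bridge_docked:
        events.append(("bridge_docked", now))
    if prev.bridge_docked and not curr.bridge_docked:
        events.append(("bridge_undocked", now))
    if not prev.fuel_hose_connected and curr.fuel_hose_connected:
        events.append(("fueling_started", now))
    if prev.fuel_hose_connected and not curr.fuel_hose_connected:
        events.append(("fueling_completed", now))
    # Second guarantee node states (cabin-door viewing angle).
    if prev.passenger_doors_open == 0 and curr.passenger_doors_open > 0:
        events.append(("passenger_door_opened", now))
    if prev.passenger_doors_open > 0 and curr.passenger_doors_open == 0:
        events.append(("passenger_door_closed", now))
    return events
```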
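The positioning-based calibration of claims 1 and 6 can be pictured as a region-membership check: a video-derived node state is only marked high precision when the relevant positions fall inside the preset stand or working area. The sketch below is an assumption-laden illustration, not the claimed implementation; rectangular regions, an apron-local coordinate frame and the state names from the previous sketch are all introduced here for convenience.

```python
# Illustrative sketch only: mark chocks_on/chocks_off as high precision when
# the aircraft, the chock staff vehicle and the staff member are all inside
# the preset stand region (region shape and coordinates are assumptions).
from typing import Tuple

Point = Tuple[float, float]                 # (x, y) in an apron-local frame
Rect = Tuple[float, float, float, float]    # (x_min, y_min, x_max, y_max)


def inside(p: Point, region: Rect) -> bool:
    x, y = p
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1


def calibrate_chock_event(node_state: str,
                          aircraft_pos: Point,
                          vehicle_pos: Point,
                          worker_pos: Point,
                          stand_region: Rect) -> bool:
    """Return True if the chock event may be marked as high precision."""
    if node_state not in ("chocks_on", "chocks_off"):
        return False
    return all(inside(p, stand_region)
               for p in (aircraft_pos, vehicle_pos, worker_pos))


# Example: a 60 m x 40 m stand region with all three targets inside it.
stand = (0.0, 0.0, 60.0, 40.0)
print(calibrate_chock_event("chocks_on",
                            (30.0, 20.0), (28.5, 18.0), (29.0, 19.0), stand))
```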
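The data integration step of claims 1 and 7 only states that an XML message is built from the high-precision node state, the flight identifier and the current moment, encapsulated in a preset format and pushed over a data bus to the flight guarantee management platform. The element names, schema and sample values below are therefore assumptions for illustration; the transport to the data bus is deployment-specific and omitted.

```python
# Illustrative sketch only: building a node-state XML message with a
# hypothetical schema (GuaranteeNodeMessage / FlightId / NodeState / Timestamp).
import xml.etree.ElementTree as ET
from datetime import datetime, timezone


def build_node_message(flight_id: str, node_state: str, ts: datetime) -> bytes:
    root = ET.Element("GuaranteeNodeMessage")
    ET.SubElement(root, "FlightId").text = flight_id
    ET.SubElement(root, "NodeState").text = node_state
    ET.SubElement(root, "Timestamp").text = ts.isoformat()
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)


if __name__ == "__main__":
    msg = build_node_message("MF8501", "chocks_on",
                             datetime(2024, 1, 4, 8, 30, tzinfo=timezone.utc))
    print(msg.decode("utf-8"))
    # Uploading to the flight guarantee management platform over the data bus
    # (e.g. a message queue producer or HTTP POST) is intentionally left out.
```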
CN202410008026.4A 2024-01-04 2024-01-04 Monitoring system, method and storage medium of flight guarantee node Active CN117523500B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410008026.4A CN117523500B (en) 2024-01-04 2024-01-04 Monitoring system, method and storage medium of flight guarantee node

Publications (2)

Publication Number Publication Date
CN117523500A (en) 2024-02-06
CN117523500B (en) 2024-03-19

Family

ID=89744196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410008026.4A Active CN117523500B (en) 2024-01-04 2024-01-04 Monitoring system, method and storage medium of flight guarantee node

Country Status (1)

Country Link
CN (1) CN117523500B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109871786A (en) * 2019-01-30 2019-06-11 浙江大学 A kind of flight ground safeguard job specification process detection system
CN110610592A (en) * 2019-09-25 2019-12-24 捻果科技(深圳)有限公司 Airport apron safe operation monitoring method based on video analysis and deep learning
CN111814687A (en) * 2020-07-10 2020-10-23 苏州数智源信息技术有限公司 Flight support node intelligent identification system
CN112101253A (en) * 2020-09-18 2020-12-18 广东机场白云信息科技有限公司 Civil airport ground guarantee state identification method based on video action identification
CN115550611A (en) * 2022-09-23 2022-12-30 广东机场白云信息科技有限公司 Intelligent monitoring method, device and system for flight guarantee node

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230138718A1 (en) * 2021-10-29 2023-05-04 Nvidia Corporation Illumination resampling using temporal gradients in light transport simulation systems and applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant