CN116049296B - Rapid event tracing prototype method and device based on space-time stream input - Google Patents


Info

Publication number: CN116049296B (application CN202211575778.6A; other versions: CN116049296A)
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: event, space, information, time, feet
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Inventors: Zheng Ying'er (郑颖尔), Wang Haofeng (王浩锋), Tian Zhi (田智), Wang Cong (王聪)
Current and original assignee: China Academy of Civil Aviation Science and Technology (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Application filed by China Academy of Civil Aviation Science and Technology
Priority to CN202211575778.6A; publication of CN116049296A; application granted; publication of CN116049296B


Classifications

    • G06F16/26 Visual data mining; Browsing structured data
    • G06F16/24568 Data stream processing; Continuous queries
    • G06F16/254 Extract, transform and load [ETL] procedures, e.g. ETL data flows in data warehouses
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G06F16/9538 Presentation of query results
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The application discloses a rapid event tracing prototype method and device based on space-time stream input. The device first receives an event information stream and passes it to a standardization module, which, according to the type of each piece of information, normalizes the stream into two classes of features, objects and flows, using the rules in a standard library. Next, a logic organization module reads the attribute states of the standardized object information and retrieves the corresponding model configuration files from a model library until every object and state in the whole flow has been matched. Then, according to the flow information of the event, the space-time information of all objects is organized into a configuration file for the complete flow within a specified spatial range. Finally, the complete flow configuration and the model configuration files at their space-time positions are input to a rendering module for visual rendering, and a visualized event stream is output; when the hardware configuration requirements are met, the process can run in real time as required.

Description

Rapid event tracing prototype method and device based on space-time stream input
Technical Field
The application relates to the fields of civil aviation technology and computer modeling, and in particular to a rapid event tracing prototype method and device based on space-time stream input.
Background
Civil aviation operational safety covers front-end product safety, the safety of every assurance link in the operation process, post-accident investigation and emergency response, and government safety supervision. The related data include QAR data, sensor data, fault cases, maintenance records, unsafe event reports, aeronautical information, laws and regulations, and so on. Given the complicated workflow of current civil aviation event reporting and analysis, and the lack of standard, unified bases for event reconstruction and visualization, this application proposes an event tracing method that takes space-time information as the key clue, greatly improving the dynamic expression and rapid visualization of events and thereby addressing related problems in civil aviation operations.
Disclosure of Invention
In order to overcome the defects and shortcomings of the prior art, the application provides a rapid event tracing prototype method and device based on space-time stream input.
The technical scheme adopted by the application is a rapid event tracing prototype method based on space-time stream input, comprising the following steps:
step S1: continuously receiving event information: the device receives the event information that needs to be visualized;
step S2: event data format normalization: performing a many-to-one mapping conversion of the input event information according to the underlying object model and the space-time relation model;
step S3: object model state matching: completing the matching of all objects in the event and their attribute states;
step S4: event space-time relationship organization: completing the organization of the flow relationships in the event. The core elements are the space-time position relationships, together with the space-time position information, sequential logic, object relations, and interval information of all objects and their attributes; space-time clues string all the single-point models into a complete flow. The emphasis is on understanding space-time position changes, finding the position-change points and the event order at identical space-time positions, and forming the corresponding configuration files;
step S5: event rendering: based on the event flow configuration and the object model configuration information formed above, calling the models at their designated space-time positions to complete the automatic rendering of the event process, fully restoring its key elements;
step S6: continuous event information output: outputting the final complete flow visualization effect in a specified form, completing the automatic visualization process.
Further, the event information includes, but is not limited to, event information containing object IDs and space-time relations, encodings, and databases; the scheme is not limited to any particular format in which the event information is received.
Further, the objects cover five dimensions: people, things, places, events, and conditions. Each object visually presented in these five dimensions has a one-to-one corresponding model in the model library, and the attribute states mainly comprise static attributes and dynamic relations.
A rapid event tracing prototype apparatus based on space-time stream input comprises a standardization module, a logic organization module, a rendering module, and a storage module.
The device first receives event information and inputs it to the standardization module, which, according to the type of the information, normalizes it into two classes of features, objects and flows, using the rules in a standard library. Next, the logic organization module reads the attribute states of the standardized object information and reads the corresponding model configuration information from a model library until all the objects and states in the whole flow have been matched. Then, according to the flow information of the event, the space-time information of all objects is organized, forming a configuration file for the whole flow within a specified spatial range. Finally, the complete flow configuration and the model configuration information at its space-time positions are input together to the rendering module for visual rendering, and visualized event information is output; when the hardware configuration requirements are met, the process can run in real time as required.
Further, the apparatus comprises:
a power supply unit: supplies power to the whole hardware device and to each stage of the circuitry;
a storage unit: caches the input data, intermediate computations, and output results, and stores the normalization standard requirements and the object models; it comprises internal memory, main memory, hard disks, and external storage devices;
a logic reasoning unit: performs the computing processes on the input events, such as normalization, object state matching, and space-time relation indexing;
a graphics computing unit: renders the objects and their states and performs visual organization according to the space-time relationships;
an input/output unit: handles data input and visual output.
Further, the power supply unit, storage unit, logic reasoning unit, graphics computing unit, and input/output unit are integrated into a general-purpose device that interacts with external power and data.
The beneficial effects are as follows:
(1) Event compatibility: aviation event flows are complex and involve diverse information; different channels and data sources express it differently, and although each has its own data formats and standards, integrated expression is inconvenient. With the standardization module, the scheme is compatible with various structured and unstructured information inputs, integrates the complete event process, and visualizes it through the input/output unit.
(2) Rapid prototyping: relying on space-time stream input, the scheme automatically completes the organization and rendering of the event information flow with the logic reasoning unit and the graphics computing unit, outputting a visualized event process, expressing space-time relationships and event flows efficiently, and producing real-time prototype output from the stream information as required.
Drawings
FIG. 1 is a flow chart of the general steps of the present application;
FIG. 2 is a framework diagram of the device of the present application;
fig. 3 is a block diagram of the hardware of the present application.
Detailed Description
It should be noted that, in the absence of conflict, the embodiments of the present application and the features of the embodiments may be combined with each other. The application is further described in detail below with reference to the drawings and specific embodiments.
As shown in fig. 1, a rapid event tracing prototype method based on space-time stream input comprises the following steps:
Step1: (continuously) receiving event information: the device receives the event information to be visualized. The related information includes, but is not limited to, event information containing object IDs and space-time relations, encodings, databases, and the like; the technical scheme is not limited to any particular format for receiving event information.
Step2: event data format normalization: the step carries out many-to-one mapping conversion on the input event information according to the bottom object model and the space-time relation model, thereby realizing the standardization of the input information format, and for frequently used data sources, direct mapping can be carried out according to the bottom data interface standards of both parties, thereby realizing the automation of the process.
Step3: object model state matching: the matching of all objects and attribute states thereof in the event is completed, wherein the objects mainly comprise five dimensions of people, things, places, things and conditions, the objects which can be visually presented in the five dimensions are in one-to-one correspondence in a model library, the attribute states mainly comprise static attributes and dynamic relations, and the static attributes and the dynamic relations are presented in a visual model in a popup window, bubble and arrow mark mode; the latter, such as calls, commands, etc., often involve specific interactions between two objects, with specific visual effects, and eventually these information will be stored in the form of a configuration file for the graphic computing unit to render the visual effects.
Step4: event spatiotemporal relationship organization: the method comprises the steps of completing organization of flow relations in events, wherein core elements are space-time position relations, and space-time position information, front-back sequence logic, object relations and interval information corresponding to all objects and attributes thereof are also formed, so that all single-point models are connected into a complete flow by space-time clues, the importance is that understanding of space-time position change is focused, space-time position change points and event sequence relations under the same space-time position are found, and corresponding configuration files are formed.
Step5: event rendering: according to the event information program configuration and object model configuration information formed before, the automatic rendering of the event process is completed by referring to the designated space-time position relation calling model, and key elements in the event process are fully restored.
Step6: (continuous) event information output: this step outputs the final complete flow visualization effect in a specified form, completing the automatic visualization process.
As shown in fig. 2, a rapid event tracing prototype apparatus based on space-time stream input comprises a standardization module, a logic reasoning module, a rendering module, and a storage module.
The device first receives event information and inputs it to the standardization module, which, according to the type of the information, normalizes it into two classes of features, objects and flows, using the rules in a standard library. Next, the logic reasoning module reads the attribute states of the standardized object information and reads the corresponding model configuration information from a model library until all the objects and states in the whole flow have been matched. Then, according to the flow information of the event, the space-time information of all objects is organized, forming a configuration file for the whole flow within a specified spatial range. Finally, the complete flow configuration and the model configuration information at its space-time positions are input together to the rendering module for visual rendering, and visualized event information is output; when the hardware configuration requirements are met, the process can run in real time as required.
As shown in fig. 3, the rapid event tracing prototype apparatus based on space-time stream input further comprises:
a power supply unit: supplies power to the whole hardware device and to each stage of the circuitry;
a storage unit: caches the input data, intermediate computations, and output results, and stores the normalization standard requirements and the object models; it comprises internal memory, main memory, hard disks, and external storage devices;
a logic reasoning unit: performs the computing processes on the input events, such as normalization, object state matching, and space-time relation indexing;
a graphics computing unit: renders the objects and their states and performs visual organization according to the space-time relationships;
an input/output unit: handles data input and visual output.
The power supply unit, storage unit, logic reasoning unit, graphics computing unit, and input/output unit are integrated into a general-purpose device that interacts with external power and data.
Embodiment 1: a CFIT (controlled flight into terrain) case
This embodiment is illustrated with an actual aviation event, summarized as follows:
A B737-800 aircraft was operating flight A-B. While executing the NDB/DME instrument approach to runway 04 at airport B, the aircraft struck the antenna of the south locator station during low-altitude flight and diverted to airport C; all personnel were safe. Inspection revealed obvious damage to the left main landing gear door and tire.
The reference implementation steps are specifically as follows:
Step1: (continuously) receiving event information: the device is powered on, the software function is started, and the report information of the event is continuously fed in through the input interface, specifically:
22:31:07, the aircraft descends normally on final toward the station. The crew contacts the tower, and the tower instructs them to continue the approach to runway 04.
22:31:40, DME 10.7 nm, altitude 2306 feet.
22:32:24, altitude 2139 feet, landing configuration completed (flaps 30).
22:33:15, DME 6.75 nm, descending to maintain 1700 feet.
22:33:36, the crew presets the altitude window to 500 feet and sets the descent rate to 800 feet per minute.
22:33:40, DME 5.75 nm, altitude 1698 feet, the copilot reports station passage (the station was out of service for flight calibration, so the crew confirmed passing it on the map display), and the aircraft begins to descend.
22:34:19, the tower issues landing clearance to the crew.
22:34:41, DME 3.25 nm, altitude 823 feet, the copilot reports "3 miles, 820, our altitude is good now".
22:34:46, altitude 745 feet, the captain sets the altitude window to the go-around altitude of 3000 feet, and the copilot confirms.
22:35:01, DME 2.75 nm, altitude 567 feet, descent rate 780 feet per minute.
22:35:06, DME 2.5 nm, altitude 503 feet, cockpit voice prompt "APPROACH MINIMUMS", descent rate 780 feet per minute, the copilot resets the flight director.
22:35:10, altitude 450 feet, the captain says it is completely invisible.
22:35:12, DME 2.25 nm, altitude 425 feet, cockpit voice prompt "MINIMUMS", the captain disconnects the autopilot; descent rate 765 feet per minute, engine N1 58%.
22:35:14, altitude 399 feet, descent rate 742 feet per minute, engine N1 58%, the copilot calls "decision altitude, level off to the go-around point", and the captain answers "good".
22:35:17, DME 2 nm, altitude 362 feet (radio altitude 305 feet), descent rate 742 feet per minute, engine N1 58%, the copilot calls "lead-in lights in sight".
22:35:32, altitude 167 feet, the aircraft triggers a "TOO LOW, TERRAIN" aural alert (a PULL UP warning is displayed on the primary flight display, PFD). The crew begins to pull up; engine N1 increases from 57% to 60%, the pitch attitude increases from 1.58 degrees to 3.52 degrees, and the descent rate decreases from 607 to 503 feet per minute.
22:35:35, a second "TOO LOW, TERRAIN" aural alert sounds; engine N1 increases from 60% to 65%, and the descent rate decreases from 418 to 295 feet per minute.
22:35:37, altitude 118 feet (radio altitude 54 feet), a third "TOO LOW, TERRAIN" aural alert sounds; engine N1 increases from 65% to 73%, the pitch attitude increases to 5.45 degrees, and the descent rate decreases from 158 to 31 feet per minute.
22:35:38, altitude 114 feet (radio altitude 28 feet), the aircraft strikes the south locator NDB antenna and the middle marker antenna (the duty officer at the south locator station reported that at about 22:35 he heard a loud sound and found the marker antenna broken, with fragments scattered on the ground in front of the station door).
22:35:42, altitude 140 feet (radio altitude 51 feet), the copilot calls "missed approach", the throttles are pushed forward, and engine N1 increases to 79%.
22:35:45, altitude 168 feet (radio altitude 71 feet), the captain issues the "go-around" command and presses the TO/GA switch.
22:35:49, the copilot reports "going around" to the tower, and the captain issues a retraction command.
22:35:51, the captain issues the "flaps 15" instruction and then the "gear up" instruction.
22:36:01, the captain says "we should have opened that (the terrain map) just now; it seems we hit something".
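Feeding such a report into the device presupposes parsing free-text entries into timestamped records; a minimal sketch is shown below, assuming an invented regular expression and field names rather than any real rule from the standard library.

```python
# Sketch of the input interface: parse report entries of the general shape
# "HH:MM:SS, DME x nm, altitude y feet, ..." into structured records.
# The pattern and field names are illustrative only.
import re

ENTRY = re.compile(
    r"(?P<time>\d{2}:\d{2}:\d{2}),\s*"
    r"(?:DME\s*(?P<dme>[\d.]+)\s*nm,\s*)?"      # DME distance is optional
    r"(?:altitude\s*(?P<alt>\d+)\s*feet)?"       # so is the altitude figure
)

def parse_entry(line: str) -> dict:
    m = ENTRY.match(line)
    return {
        "time": m.group("time"),
        "dme_nm": float(m.group("dme")) if m.group("dme") else None,
        "altitude_ft": int(m.group("alt")) if m.group("alt") else None,
        "text": line,  # keep the raw entry for later object identification
    }

rec = parse_entry("22:31:40, DME 10.7 nm, altitude 2306 feet.")
print(rec["time"], rec["dme_nm"], rec["altitude_ft"])
```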
Step2: event data format normalization: event information is to be recorded and sorted and mapped from there according to object type, first five-dimensional object semantics with object model are to be identified and recorded, the identification process of these 5 classes of objects is synchronous, but for clarity of description, three steps are presented here, first "people", "objects":
22:31: and 07, the aircraft normally descends, and flies towards the platform from five sides. The tower is contacted, and the tower commands the No. 04 runway to continue approaching.
22:31:40, dme10.7 sea, 2306 feet in height.
22:32:24, height 2139 feet, completing the landing configuration (flap 30).
22:33:15, dme6.75 sea, down to 1700 feet hold.
22:33:36, the unit presets the height window to 500 feet, setting the drop rate to 800 feet/minute.
22:33:40, dme5.75 sea, height 1698 feet, co-pilot reports "out of finger" (the station is not in use because of the school fly off, the crew confirms the station is passing by the map display), and the aircraft begins to descend.
22:34: and 19, the tower issues a floor permission to the unit.
22:34:41, dme3.25 sea, 823 feet in height, co-pilot report "3 sea, 820, we are highly now good).
22:34:46, 745 feet in height, the captain adjusts the height window to a fly-away height of 3000 feet, and the co-pilot confirms.
22:35:01, dme2.75 sea, height 567 feet, drop rate 780 feet/min.
22:35:06, dme2.5 sea, height 503 feet, cockpit voice prompt "APPROACH Minimums", descent rate 780 feet/min, copilot reset flight director.
22:35:10, 450 feet in height, the captain says "this is completely invisible to the operator.
22:35:12, dme2.25, height 425 feet, cockpit voice prompt "Minimums", captain off autopilot, at a rate of descent 765 feet per minute, engine N1 of 58%.
22:35:14, 399 feet in height, 742 feet/min drop rate, 58% engine N1, shout by copilot "resolution high, fly flat to the point of departure", and the captain answers "good".
22:35:17, dme2 sea, 362 feet high (305 feet radio height), 742 feet/min drop rate, engine N1 58%, and the copilot shouting "see led in lights".
22:35:32, 167 feet in altitude, the aircraft triggered a "TOO LOW, TERRAIN" audio alert (a PULL UP alert is displayed on the main flight display PFD), the crew began with a filler neck with the engine N1 increasing from 57% to 60%, the aircraft elevation increasing from 1.58 degrees to 3.52 degrees, and the descent rate decreasing from 607 feet per minute to 503 feet per minute.
22:35: a second sound "TOO LOW, TERRAIN" audio alert occurs 35. The engine N1 increased from 60% to 65% and the rate of drop decreased from 418 feet per minute to 295 feet per minute.
22:35:37, altitude 118 feet (radio altitude 54 feet), a third sound "TOO LOW, TERRAIN" audio warning was raised, engine N1 increased from 65% to 73%, aircraft elevation increased to 5.45 degrees, and descent rate decreased from 158 feet per minute to 31 feet per minute.
22:35:38, height 114 feet (radio height 28 feet), with the south near field NDB antenna and the middle pointer antenna (south near field attendant reflecting: 22:35, hearing a loud sound, finding the pointer antenna broken, and fragments scattered on the ground in front of the room door).
22:35:42, 140 feet high (51 feet radio), the copilot shoutes "missed approach", the airplane pushes the throttle, and the engine N1 increases to 79%.
22:35:45, 168 feet in height (71 feet in radio height), the captain issues a "fly-away" password, turning on the TO/GA switch.
22:35:49, the copilot reports the tower "go-around", and the captain issues a "take-up" command.
22:35:51, the captain issues a "flap 15" instruction and then issues a "wheel-retracting" instruction again.
22:36:01, the captain says "just should open that (the topographic map) as if something was encountered".
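The first identification pass could be sketched as keyword tagging of a timeline entry against the five dimensions; the toy lexicon below stands in for the rules of the standard library.

```python
# Sketch of the five-dimension identification pass of Step2: tag "people",
# "things", "places", "events", and "conditions" in a timeline entry with a
# toy keyword lexicon (a stand-in for the standard library's rules).

LEXICON = {
    "people":     ["captain", "copilot", "crew", "duty officer"],
    "things":     ["aircraft", "engine", "antenna", "flaps", "autopilot"],
    "places":     ["runway", "tower", "airport", "station"],
    "events":     ["descend", "go-around", "alert", "command", "report"],
    "conditions": ["altitude", "descent rate", "DME", "N1"],
}

def tag_entry(text: str) -> dict:
    """Return, per dimension, the lexicon terms found in the entry."""
    low = text.lower()
    return {dim: [w for w in words if w.lower() in low]
            for dim, words in LEXICON.items()}

tags = tag_entry("22:35:12, altitude 425 feet, captain disconnects the autopilot.")
print(tags["people"], tags["things"])
```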
Second comes the identification of the attributes and relations of the objects, i.e. the "events" and "conditions" among the five dimensions: each timeline entry yields attribute states such as altitude, DME distance, descent rate, and engine N1, and relations such as reports, commands, confirmations, and alerts.
Finally, it is the identification of "ground", the immediate free space:
22:31:07, the aircraft descended normally on final approach toward the station; the crew contacted the tower, and the tower cleared them to continue the approach to runway 04.
22:31:40, DME 10.7 nautical miles, altitude 2306 feet.
22:32:24, altitude 2139 feet, landing configuration complete (flaps 30).
22:33:15, DME 6.75 nautical miles, the aircraft descended to and held 1700 feet.
22:33:36, the crew preset the altitude window to 500 feet and set the descent rate to 800 feet per minute.
22:33:40, DME 5.75 nautical miles, altitude 1698 feet; the copilot reported station passage (the station was out of service for flight calibration, so the crew confirmed station passage from the map display), and the aircraft began to descend.
22:34:19, the tower issued landing clearance to the crew.
22:34:41, DME 3.25 nautical miles, altitude 823 feet; the copilot reported "3 miles, 820, our altitude is good now".
22:34:46, altitude 745 feet; the captain set the altitude window to the go-around altitude of 3000 feet, and the copilot confirmed.
22:35:01, DME 2.75 nautical miles, altitude 567 feet, descent rate 780 feet per minute.
22:35:06, DME 2.5 nautical miles, altitude 503 feet, cockpit aural prompt "APPROACH Minimums", descent rate 780 feet per minute; the copilot reset the flight director.
22:35:10, altitude 450 feet, the captain remarked that nothing could be seen at all.
22:35:12, DME 2.25 nautical miles, altitude 425 feet, cockpit aural prompt "Minimums"; the captain disengaged the autopilot, descent rate 765 feet per minute, engine N1 58%.
22:35:14, altitude 399 feet, descent rate 742 feet per minute, engine N1 58%; the copilot called "decision height, level off to the go-around point", and the captain answered "okay".
22:35:17, DME 2 nautical miles, altitude 362 feet (radio altitude 305 feet), descent rate 742 feet per minute, engine N1 58%; the copilot called "lead-in lights in sight".
22:35:32, altitude 167 feet, the aircraft triggered a "TOO LOW, TERRAIN" aural warning (a PULL UP alert was displayed on the primary flight display, PFD); the crew began to pull back on the control column, engine N1 increased from 57% to 60%, pitch attitude increased from 1.58 degrees to 3.52 degrees, and the descent rate decreased from 607 feet per minute to 503 feet per minute.
22:35:35, a second "TOO LOW, TERRAIN" aural warning sounded; engine N1 increased from 60% to 65%, and the descent rate decreased from 418 feet per minute to 295 feet per minute.
22:35:37, altitude 118 feet (radio altitude 54 feet): a third "TOO LOW, TERRAIN" aural warning sounded; engine N1 increased from 65% to 73%, pitch attitude increased to 5.45 degrees, and the descent rate decreased from 158 feet per minute to 31 feet per minute.
22:35:38, altitude 114 feet (radio altitude 28 feet): the aircraft struck the south near-field NDB antenna and the middle marker antenna (the south near-field duty officer reported that at 22:35 he heard a loud noise and found the marker antenna broken, with fragments scattered on the ground in front of the building door).
22:35:42, altitude 140 feet (radio altitude 51 feet): the copilot called "go around"; the throttles were pushed forward and engine N1 increased to 79%.
22:35:45, altitude 168 feet (radio altitude 71 feet): the captain issued the "go around" command and pressed the TO/GA switch.
22:35:49, the copilot reported "going around" to the tower, and the captain issued the "gear up" command.
22:35:51, the captain issued the "flaps 15" command, and then issued the "gear up" command again.
22:36:01, the captain said, "we should have had that (the terrain map) on; it seems we hit something".
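Step 2 of the method (event data format normalization) maps free-text records such as the timeline above onto the structured object and spatio-temporal models. A minimal sketch of such normalization in Python; the record layout, field names and regular expression are illustrative assumptions, not the patent's actual implementation:

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventRecord:
    time: str                    # HH:MM:SS timestamp of the record
    altitude_ft: Optional[int]   # barometric altitude in feet, if present
    radio_alt_ft: Optional[int]  # radio altitude in feet, if present
    text: str                    # remaining free-text description

# Illustrative pattern for records shaped like
# "22:35:37, altitude 118 feet (radio altitude 54 feet), ..."
LINE_RE = re.compile(
    r"(?P<t>\d{2}:\d{2}:\d{2}),\s*"
    r"(?:altitude\s+(?P<alt>\d+)\s+feet)?"
    r"(?:\s*\(radio altitude\s+(?P<ralt>\d+)\s+feet\))?,?\s*"
    r"(?P<rest>.*)"
)

def normalize(line: str) -> EventRecord:
    """Map one raw record to the structured form (a many-to-one mapping:
    several surface layouts collapse onto the same field set)."""
    m = LINE_RE.match(line)
    if m is None:
        # Fall back: keep the whole line as unstructured text.
        return EventRecord(time="", altitude_ft=None, radio_alt_ft=None, text=line)
    return EventRecord(
        time=m.group("t"),
        altitude_ft=int(m.group("alt")) if m.group("alt") else None,
        radio_alt_ft=int(m.group("ralt")) if m.group("ralt") else None,
        text=m.group("rest"),
    )

rec = normalize('22:35:37, altitude 118 feet (radio altitude 54 feet), '
                'a third "TOO LOW, TERRAIN" aural warning sounded')
```

A production normalizer would additionally extract DME distance, engine parameters and crew callouts into typed fields.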
Step3: object model state matching:
The following objects are recorded in the above process:
In this step, the logical reasoning unit matches each attribute and relation to its object. In general, the raw data already contain an explicit subject-object relationship; when the subject is missing and the object of a state is ambiguous, the unit must infer it from the current state of the model, i.e., search the candidate objects for the one to which the attribute or relation belongs. For the above inputs, the matching attributes are configured as follows:
The matching relations are configured as follows:
Object 1 | Object 2 | Matching relation state
Tower | Aircraft, crew, flight operations | Command and dialogue
Captain | Copilot | Dialogue
Aircraft | Airport antenna | Collision
Captain | Aircraft, onboard equipment | Control
Crew | Aircraft, onboard equipment | Control
Copilot | Onboard equipment | Control
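The matching step can be pictured as a lookup over a registry of allowed object pairs and relation states. A minimal sketch under that assumption; the object names and registry layout are illustrative English stand-ins, not taken from the patent:

```python
# Illustrative registry mirroring the matching-relation table above.
RELATIONS = {
    ("tower", "aircraft"): "command and dialogue",
    ("captain", "copilot"): "dialogue",
    ("aircraft", "airport antenna"): "collision",
    ("captain", "aircraft"): "control",
    ("crew", "onboard equipment"): "control",
    ("copilot", "onboard equipment"): "control",
}

def match_relation(subject, candidates):
    """Return (object, relation-state) pairs the subject can be bound to.
    When a record's subject is missing, the caller can try each known
    object as `subject` and keep whichever yields a non-empty result."""
    return [(obj, state) for (s, obj), state in RELATIONS.items()
            if s == subject and obj in candidates]

pairs = match_relation("captain", ["copilot", "aircraft"])
```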
Step4: event spatio-temporal relationship organization:
The logical reasoning unit binds spatio-temporal positions to object attributes and relation changes. By default, events are organized in chronological order, and filtering by spatio-temporal position is supported; changes of spatio-temporal position are detected through a semantic detection process. Here the accident period 22:35:12-22:35:38 is selected, forming the following configuration table:
(configuration table rendered as an image in the original document)
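The spatio-temporal organization of Step4 — default chronological ordering plus filtering to a selected window such as 22:35:12-22:35:38 — can be sketched as follows (the event field names are illustrative assumptions):

```python
from datetime import time

def parse_t(s):
    """'HH:MM:SS' -> datetime.time, for ordering and window tests."""
    h, m, sec = (int(x) for x in s.split(":"))
    return time(h, m, sec)

def organize(events, start, end):
    """Order events chronologically (the default temporal organization)
    and keep only those inside the selected time window."""
    lo, hi = parse_t(start), parse_t(end)
    in_window = [e for e in events if lo <= parse_t(e["t"]) <= hi]
    return sorted(in_window, key=lambda e: parse_t(e["t"]))

events = [
    {"t": "22:35:38", "what": "struck the NDB antenna"},
    {"t": "22:35:12", "what": "autopilot disengaged"},
    {"t": "22:35:01", "what": "descent rate 780 ft/min"},
]
selected = organize(events, "22:35:12", "22:35:38")
```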
Step5: event rendering: taking the spatio-temporal position range as the scope, the graphics computing unit schedules models from the storage unit according to the input configuration file, renders the whole course of the event as configured, and outputs it in visual form.
Step6: continuous event information output: the rendered visual result is output externally through the output interface.
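Chained together, Steps 1-6 form a single pipeline from raw records to a visual trace. The following toy sketch shows how data flows through the stages; every function body is a deliberately trivial stand-in, not the patent's actual implementation:

```python
def normalize(line):
    """Step 2: split a raw record into timestamp and text."""
    t, _, rest = line.partition(", ")
    return {"t": t, "text": rest}

def match_objects(recs):
    """Step 3: bind each record to known objects (toy keyword rule)."""
    known = ("captain", "copilot", "tower", "aircraft")
    for r in recs:
        r["objects"] = [o for o in known if o in r["text"]]
    return recs

def organize_by_spacetime(recs):
    """Step 4: the default organization is chronological order."""
    return sorted(recs, key=lambda r: r["t"])

def render(flow):
    """Step 5: stand-in for the graphics computing unit."""
    return [f'{r["t"]} {", ".join(r["objects"]) or "-"}: {r["text"]}' for r in flow]

def trace_event(stream):
    """Steps 1-6 chained: receive, normalize, match, organize, render, output."""
    return render(organize_by_spacetime(match_objects([normalize(l) for l in stream])))

out = trace_event([
    "22:35:42, copilot called go-around",
    "22:35:12, captain disengaged the autopilot",
])
```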
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "disposed", "mounted", "connected" and "fixed" are to be construed broadly: a connection may, for example, be fixed, detachable or integral; mechanical or electrical; direct, or indirect through an intermediate medium, or a communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art on a case-by-case basis.
Although embodiments of the present application have been shown and described, it will be understood by those skilled in the art that various equivalent changes, modifications, substitutions and alterations can be made to these embodiments without departing from the principles and spirit of the application, the scope of which is defined in the appended claims and their equivalents.

Claims (2)

1. A rapid event tracing prototype method based on spatio-temporal stream input, characterized in that the method comprises the following steps:
step S1: continuous event information reception: receiving, by the device, the event information to be visualized;
step S2: event data format normalization: performing a many-to-one mapping conversion of the input event information according to the underlying object model and the spatio-temporal relation model;
step S3: object model state matching: completing the matching of all objects in the event and of their attribute states;
step S4: event spatio-temporal relationship organization: completing the organization of the flow relationships within the event, in which the core element is the spatio-temporal position relationship; using the spatio-temporal position information, sequential logic, object relationships and interval information corresponding to all objects and their attributes, all single-point models are strung into a complete flow along spatio-temporal clues; the key is to understand spatio-temporal position changes, to find spatio-temporal position change points and the event order at the same spatio-temporal position, and to form the corresponding configuration files;
step S5: event rendering: according to the previously formed event information flow configuration and object model configuration information, invoking models by the specified spatio-temporal position relationship, completing automatic rendering of the event course and fully restoring the key elements of the event process;
step S6: continuous event information output: outputting the final complete flow visualization effect in a specified form, completing the automatic visualization process;
the event information comprises event text, codes, and databases containing object IDs and spatio-temporal relations, and the specific format of the received event information is not limited;
the objects cover five dimensions: people, events, places, objects and situations; the attribute states comprise static attributes and dynamic relationships.
2. A rapid event tracing prototype device based on spatio-temporal stream input, characterized in that the device comprises: a normalization module, a logic organization module, a rendering module and a storage module;
the device first receives event information and feeds it into the normalization module, which, according to the information type and the rules in a standard library, normalizes the event information into the two feature classes of objects and flow; second, the logic organization module reads the attribute states of the normalized object information and reads the corresponding model configuration information from a model library until all objects and states in the whole flow are matched; third, according to the flow information of the event, the spatio-temporal information of all objects is organized, forming the configuration file of the whole flow within a specified spatial range; finally, the complete flow configuration and the model configuration information at each spatio-temporal position are input together into the rendering module for visual rendering, and visualized event information is output; provided the hardware configuration requirements are met, event tracing can be processed in real time as required;
the device further comprises:
a power supply unit: for supplying power to the whole hardware device and to each stage of the circuit;
a storage unit: for buffering input data, intermediate computation and output results;
a logic reasoning unit: for the normalization, object state matching and spatio-temporal relation index computation of input events;
a graphics computing unit: for rendering objects and their states and organizing them visually according to the spatio-temporal relationship;
an input/output unit: for data input and visual output;
the power supply unit, the storage unit, the logic reasoning unit, the graphics computing unit and the input/output unit are integrated to form a general-purpose device, which interacts with an external power supply and with external data.
CN202211575778.6A 2022-12-08 2022-12-08 Rapid event tracing prototype method and device based on space-time stream input Active CN116049296B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211575778.6A CN116049296B (en) 2022-12-08 2022-12-08 Rapid event tracing prototype method and device based on space-time stream input


Publications (2)

Publication Number Publication Date
CN116049296A CN116049296A (en) 2023-05-02
CN116049296B true CN116049296B (en) 2023-08-29

Family

ID=86124585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211575778.6A Active CN116049296B (en) 2022-12-08 2022-12-08 Rapid event tracing prototype method and device based on space-time stream input

Country Status (1)

Country Link
CN (1) CN116049296B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102446227A (en) * 2012-01-17 2012-05-09 杭州安恒信息技术有限公司 Interactive semi-automatic security accident tracing method and system
CN113407652A (en) * 2021-05-24 2021-09-17 北京建筑大学 Space-time data model based on 3DPS

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9059895B2 (en) * 2009-12-08 2015-06-16 Cisco Technology, Inc. Configurable network management system event processing using simple network management table indices



Similar Documents

Publication Publication Date Title
CN108460995B (en) Display system and method for runway intrusion prevention
US10023324B2 (en) Methods and apparatus for providing real-time flight safety advisory data and analytics
Degani et al. Human factors of flight-deck checklists: the normal checklist
Wiener Controlled flight into terrain accidents: System-induced errors
US8665121B2 (en) Systems and methods for aircraft flight tracking and display
CN108630019B (en) System and method for rendering aircraft cockpit displays for use by ATC conditional approval instructions
CN106927056A (en) The display of the meteorological data in aircraft
Johnson Analysis of Top of Descent (TOD) uncertainty
US20070120708A1 (en) Methods and systems for monitoring aircraft approach between approach gates
Puranik A methodology for quantitative data-driven safety assessment for general aviation
CN116049296B (en) Rapid event tracing prototype method and device based on space-time stream input
Palmer Altitude deviations: Breakdowns of an error-tolerant system
Baugh Predicting general aviation accidents using machine learning algorithms
CN114282811B (en) Cross-machine type SOPs (System on Board) based standardized official aircraft flight risk monitoring system and method
Blake The NASA Advanced Concepts Flight Simulator-A unique transport aircraft research environment
US20230392954A1 (en) Vehicle systems and related message prioritization methods
Kalagher et al. Situational Awareness and General Aviation Accidents
CN109733626A (en) A kind of alarm of amphibious aircraft hatch door and instruction system
Majumdar et al. Analysis of General Aviation fixed-wing aircraft accidents involving inflight loss of control using a state-based approach
CN110481804B (en) Flight auxiliary system and aircraft
Thompson Commercial air crew detection of system failures: state of the art and future trends
Leiden et al. Context of human error in commercial aviation
Li et al. Integrated design and verification of civil aircraft cockpit crew alerting system based on system engineering
CN112000333B (en) Avionics interface design reconstruction method based on pilot functional state
Damos et al. The effect of interruptions on flight crew performance: ASRS reports

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant