CN112578984B - Processing method and system for man-machine interaction event of synthetic vision system - Google Patents


Info

Publication number
CN112578984B
CN112578984B
Authority
CN
China
Prior art keywords
event
control
events
interactive
callback function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011467717.9A
Other languages
Chinese (zh)
Other versions
CN112578984A (en)
Inventor
张松
肖永红
罗涛
汪坤
江彦
唐太虎
Current Assignee
Chengdu Hermes Technology Co ltd
Chengdu Zirui Qingyun Aeronautical And Astronautical Technology Co ltd
Original Assignee
Chengdu Hermes Technology Co ltd
Chengdu Zirui Qingyun Aeronautical And Astronautical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Hermes Technology Co ltd and Chengdu Zirui Qingyun Aeronautical And Astronautical Technology Co ltd
Priority to CN202011467717.9A
Publication of CN112578984A
Application granted
Publication of CN112578984B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0488: Interaction techniques using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser for input of commands through traced gestures
    • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a processing method and system for man-machine interaction events of a synthetic vision system, wherein the method comprises the following steps: S1, a multifunctional display sends original message data; S2, the original message data are converted into to-be-switched state data; S3, if the to-be-switched state data change relative to the current state data, an event is generated and stored; S4, the event is distributed to a graphical interface interaction control of the multifunctional display, and the control executes the callback function corresponding to the event and responds to it; S5, the responded event is cleared from the event cache. The method and system let the main drawing thread or task of the graphical interface distribute the events in the event cache while event generation does not occupy the main drawing thread, which greatly improves event-processing efficiency, increases the reusability of the limited interaction elements, and improves the interface responsiveness of interactive operations.

Description

Processing method and system for man-machine interaction event of synthetic vision system
Technical Field
The invention relates to the technical field of synthetic vision for avionic equipment, and in particular to a processing method and system for man-machine interaction events of a synthetic vision system.
Background
A Synthetic Vision System (SVS) is a novel airborne system. Using data such as elevation terrain, map tiles and a navigation database, a graphics processing computer generates a three-dimensional visual scene, overlays images of the main flight parameters such as speed, altitude and attitude, and outputs the result to a multifunctional display, thereby improving pilots' situational awareness in environments with poor visibility or complex terrain.
Existing synthetic vision systems are mainly used for displaying three-dimensional scenes and flight parameters and offer few man-machine interaction operations and little interaction logic; the interaction mode is therefore simple, and all interactive operations and interaction logic are completed through a small number of interaction elements.
However, as computer graphics processing capability keeps growing, synthetic vision systems are becoming increasingly powerful, integrating functions such as route management, flight-plan management and system maintenance. These richer functions place ever higher demands on the graphical interaction interface, and the man-machine interaction logic of the graphical controls grows ever more complex. Yet, to save space in the cockpit, most onboard multifunctional displays in service provide very limited interaction elements. How to realize increasingly complex interactive operations with these limited elements, and how to respond to them efficiently, has become a critical problem to be solved urgently.
Disclosure of Invention
The invention aims to solve the problem of realizing increasingly complex interactive operations with limited interaction elements while improving response efficiency, and provides a processing method for man-machine interaction events of a synthetic vision system.
In order to achieve the above object, the present invention adopts the following aspects.
A processing method for man-machine interaction events of a synthetic vision system comprises the following steps:
S1, sending original message data by a multifunctional display;
S2, converting the original message data into to-be-switched state data;
S3, if the to-be-switched state data change relative to the current state data, generating an event and storing the event;
S4, distributing the event to a graphical interface interaction control of the multifunctional display; the graphical interface interaction control executes a callback function corresponding to the event and responds to the event;
S5, clearing the responded event from the event cache.
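As a non-authoritative sketch, steps S1 to S5 can be condensed into a single frame-processing routine; all names (process_frame, the tuple-shaped events, the callback table) are illustrative and not from the patent:

```python
# Minimal sketch of steps S1-S5. Decoded element states arrive from the
# display; a state change produces an event, which is buffered, dispatched
# to a registered callback, and finally cleared from the cache.

def process_frame(raw_states, current, event_buffer, callbacks):
    # S2-S3: compare to-be-switched state against the current state
    for element, new_state in raw_states.items():
        if current.get(element) != new_state:
            kind = "press" if new_state else "release"
            event_buffer.append((element, kind))   # store the generated event
            current[element] = new_state
    # S4: distribute each buffered event to its callback
    responded = []
    for event in event_buffer:
        cb = callbacks.get(event)
        if cb is not None:
            cb()                                   # interaction control responds
        responded.append(event)
    # S5: clear responded events from the cache
    for event in responded:
        event_buffer.remove(event)

log = []
buffer, current = [], {"L1": 0}
callbacks = {("L1", "press"): lambda: log.append("L1 pressed")}
process_frame({"L1": 1}, current, buffer, callbacks)
```

In the patent's design the comparison loop and the dispatch loop run in different threads, joined by the event cache; they are shown here in one function only for brevity.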
As a preferred embodiment of the present invention, the original message data in step S1 includes a data header, a data length, a cycle counter, state information, and a checksum.
As a preferred embodiment of the present invention, step S2 specifically means that the state information of the interactive element is extracted from the state information field in the original message data, and the state information of the interactive element is the to-be-switched state data corresponding to the original message data.
As a preferred aspect of the present invention, in step S3 the events include, but are not limited to, press, release, left-turn, right-turn, touch-start, touch-in-progress and touch-end events.
As a preferred embodiment of the present invention, step S4 specifically includes the following steps:
S41, traversing the event cache;
S42, reading out an event;
S43, searching for the callback function in the event-callback-function mapping table according to the event;
S44, if the callback function is not empty, extracting and executing the callback function;
S45, judging whether the traversal of the events in the event cache is finished; if so, ending; otherwise returning to step S41 and executing steps S41-S45 cyclically.
As a preferred scheme of the present invention, in step S4 the graphical interface interaction controls are organized into a graphical interface system with a tree structure, comprising a root node control, child controls and leaf node controls.
As a preferred embodiment of the present invention, in step S44, executing the callback function specifically includes the following steps:
S441, finding the root node control of the graphical interface interaction control corresponding to the callback function;
S442, judging level by level from the root node control whether each control of the tree structure is visible and usable, and, when a control is visible and usable, judging whether its next-level child controls are visible and usable;
S443, traversing each level of the tree structure until the graphical interface interaction control corresponding to the callback function is found; that control executes the callback function and returns the execution parameters.
Based on the same conception, a processing system for man-machine interaction events of a synthetic vision system is also provided, comprising an event processor unit, an event manager unit, an event cache unit and an interaction control unit.
the event processor unit is used for receiving original message data sent by the multifunctional display and converting the original message data into to-be-switched state data of the interactive element, and is also used for generating an event according to the change of the to-be-switched state data relative to the current state data and storing the event into the event cache unit;
the event manager unit is used for managing events in the event cache, including distributing events that have not yet been responded to and clearing responded events from the event cache; it provides event-response-function registration and removal interfaces for the interaction control unit, and pre-stores the mapping among events, interaction controls and interaction-control event response functions;
the event cache unit is used for storing events, so that the generation, distribution and elimination of the events can be operated in different threads;
and the interactive control unit is used for receiving the event distributed by the event manager unit and enabling the generated event to be responded on the interactive software according to the corresponding interactive control event response function.
As a preferred scheme of the invention, the controls in the interactive control unit all define the data format according to the EventWidget class.
As a preferred scheme of the invention, the controls in the interactive control unit adopt a tree organization structure, and each control's class is an EventWidget subclass such as EventPage, MenuItem or SpinBox.
In summary, due to the adoption of the technical scheme, the invention at least has the following beneficial effects:
1. The method and system convert the state information of the interaction elements into events, let the interaction controls respond to events through callback functions, and save generated events with an event cache mechanism. The main drawing thread or task of the graphical interface distributes the events in the event cache, so event generation does not occupy the main drawing thread; this greatly improves event-processing efficiency, increases the reusability of the limited interaction elements, and improves the interface responsiveness of interactive operations.
2. Following an object-oriented design, the system is divided into several modules whose functions are relatively independent, reducing system coupling and increasing module cohesion. The graphical interaction control that responds to events is abstracted into a base class from which all other event-responding controls inherit, so they quickly obtain the parent-class behavior for event processing; this improves development efficiency and enhances the extensibility of the application system.
3. The graphical interface interaction controls are organized in a tree structure. An EventRootPage object is defined as the root node of a graphical interface page; it manages all events distributed from the event manager on that page and can further process them, for example combining press and release events into click events, which simplifies the event handling of the interaction controls.
Drawings
FIG. 1 is a flowchart of the method for processing man-machine interaction events of a synthetic vision system according to embodiment 1;
FIG. 2 shows the format of a data packet in embodiment 1;
FIG. 3 is a schematic diagram of the event distribution flow in embodiment 1;
FIG. 4 is a schematic diagram of the event response flow in embodiment 1;
FIG. 5 is a block diagram of the system for processing man-machine interaction events of a synthetic vision system according to embodiment 2;
FIG. 6 is the system class diagram of event processing in embodiment 2;
FIG. 7 is a diagram illustrating the organization of the interaction controls in embodiment 2.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and embodiments, so that the objects, technical solutions and advantages of the present invention will be more clearly understood. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the processing method and system for man-machine interaction events of a synthetic vision system, a data processing module acquires the state information of the interaction elements sent by the multifunctional display and processes it into events that the man-machine interaction interface can recognize. An event here is an operation that the software recognizes when the state of an interaction element on the multifunctional display changes: for example, when the state of a key changes from invalid to valid, a key-press event is recognized, and when it changes from valid to invalid, a key-release event is recognized. Events include key press, key release, knob press, knob release, knob rotation and the like; the man-machine interaction interface processes the events it needs so as to achieve man-machine interaction.
Embodiment 1
A processing method for man-machine interaction events of a synthetic vision system, as shown in FIG. 1, mainly comprises the following steps:
s1, the multifunctional display sends original message data to a graphic processing board;
s2, the graph processing board receives the original message data and processes the data to form data of a state to be switched;
s3, the graphic processing board generates an event according to the current state data and the state data to be switched, and stores the event into an event cache;
s4, taking out the event from the event cache by each frame of the graphical interface frame of the graphical processing board to distribute the event outwards, and circularly executing the step;
s5, executing a callback function and responding to an event by a graphical interface interactive control of the multifunctional display;
and S6, the graphics processing board clears the responded events in the event cache. And distributing all events in the event cache, and clearing the responded events in the event cache after the interactive control finishes the event response.
Preferably, the step S1 specifically includes the following steps:
and S11, acquiring the electric signal information of the electronic element by a processor of the multifunctional display. Since the multi-function display contains a variety of interactive electronic components including, but not limited to, touch screens, buttons, knobs, the processor of the multi-function display collects electrical signal information of these interactive electronic components.
And S12, converting the collected electric signal information of the electronic element into a corresponding binary data message according to a communication protocol established by the graphic processing board by a processor of the multifunctional display to obtain original message data, wherein the message format of the original message data is as shown in figure 2, and the original message data is sent to the graphic processing board through a hardware interface, wherein the hardware communication interface comprises but is not limited to RS422, RS232, TCP, UDP and SPI interfaces.
The message comprises a data header, a data length, a cycle counter, state information and a checksum; the meaning of each field is as follows:
Data header: the identifier of the message data within the system; different header values identify different messages, e.g. the two bytes 5A as the data header;
Data length: the total byte length of the message data;
Cycle counter: a packet counter, generally one byte counting from 0 to 255 and wrapping back to 0, which can be used to judge whether a packet has been lost;
State information: the state of each interaction element of the device, such as key pressed/released, knob rotation, touch pressed/released and touch position; these fields are flexibly added or removed according to the states the multifunctional display collects;
Checksum: used to guarantee the integrity and accuracy of the data in communication; the transmitted bytes are accumulated bytewise, and when the sum exceeds 255 it is wrapped within one byte (i.e. taken modulo 256).
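A hypothetical encoding of this message layout might look as follows. The patent only names the fields, so the concrete byte values (a 0x5A 0x5A header, one-byte length and counter fields) and the modulo-256 checksum are assumptions for illustration:

```python
# Hypothetical builder for the message format described above:
# header | length | cycle counter | state bytes | checksum.
# Field widths and the 0x5A header value are assumed, not from the patent.

def build_message(counter, state_bytes):
    header = bytes([0x5A, 0x5A])            # data header: message identifier
    payload = bytes([counter % 256]) + bytes(state_bytes)
    length = len(header) + 1 + len(payload) + 1   # total byte length of the message
    frame = header + bytes([length]) + payload
    checksum = sum(frame) % 256             # byte-wise sum wrapped to one byte
    return frame + bytes([checksum])

msg = build_message(counter=3, state_bytes=[1, 0, 0])
```

The cycle counter wraps at 255 via the modulo, matching the wrap-around behavior described for the counter field.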
As a preferred scheme, in step S2 the graphics processing board exchanges data with the multifunctional display through a hardware communication interface, and the graphics processing board software receives the original message data sent by the multifunctional display through a communication interface provided by a system API call or an embedded low-level driver. After receiving the original message data, it processes them; the processing includes data verification and data conversion.
The data verification steps are as follows:
First, the data header bytes are taken from the message data to judge whether the data were sent by the multifunctional display; if so, the data length byte is read to obtain the transmitted byte length, and the whole packet is taken out accordingly.
Second, the checksum of all bytes of the packet except the check byte is computed; the result is compared with the value in the check field to determine whether the received packet is correct.
Data conversion means that, after successful verification, the bytes representing the state information are taken out of the packet, converted into the state information of each interaction element, and stored in memory to await subsequent event distribution and use. The graphics processing board continuously receives and converts the original message data sent by the multifunctional display to obtain the state information of each interaction element.
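The verification and conversion steps can be sketched as a parser. The field layout and the 0x5A 0x5A header value are assumptions for illustration, not taken from the patent:

```python
# Sketch of data verification and conversion: check the header, check the
# declared length, recompute the checksum over all bytes except the check
# byte, then extract the state-information bytes. Layout is assumed.

def parse_message(msg):
    if len(msg) < 5 or msg[0] != 0x5A or msg[1] != 0x5A:
        return None                     # not data from the multifunctional display
    if msg[2] != len(msg):
        return None                     # declared length does not match the packet
    if sum(msg[:-1]) % 256 != msg[-1]:
        return None                     # checksum mismatch: packet corrupted
    counter, states = msg[3], list(msg[4:-1])
    return counter, states              # state info of each interaction element

result = parse_message(bytes([0x5A, 0x5A, 8, 3, 1, 0, 0, 192]))
```

A failed check returns None, modeling the patent's behavior of discarding packets that fail verification instead of converting them.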
Preferably, step S3 specifically comprises: comparing the state information of an interaction element obtained by the current conversion with the element's previous state; if the state has changed, an event is generated accordingly, otherwise no event is generated. For example, with 0 and 1 representing the state of key L1 (0 for released, 1 for pressed), if the state of key L1 was released at the previous moment and the state received at this moment is pressed, an L1 press event is generated; if the state is still released at this moment, no event is generated.
Various events are generated by constantly comparing the states of the interaction elements at the current and previous moments; the events include but are not limited to press, release, left-turn, right-turn, touch-start, touch-in-progress and touch-end events. The generated events are stored in the event cache.
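This state comparison amounts to simple edge detection; the 0/1 encoding follows the key example above, while function and event names are illustrative:

```python
# Edge-detection sketch for event generation: comparing each element's
# previous state with its newly received state yields press/release events;
# an unchanged state yields no event.

def generate_events(previous, received):
    events = []
    for element, state in received.items():
        old = previous.get(element, 0)
        if old == 0 and state == 1:
            events.append((element, "press"))
        elif old == 1 and state == 0:
            events.append((element, "release"))
        previous[element] = state       # new state becomes the current state
    return events

prev = {"L1": 0, "L2": 1}
events = generate_events(prev, {"L1": 1, "L2": 1})
```

Only the transitions produce events; an element whose state is resent unchanged (L2 above) generates nothing, which keeps the event cache free of duplicates.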
To generate the state data and events of the multifunctional display more efficiently, data processing and event generation are handled in a separate thread or task, so that state-data reception, processing and event generation are completed without affecting the main drawing thread or task.
As a preferred scheme, in step S4 the graphical interface distributes events before each frame is drawn: events are taken out of the event buffer, the corresponding callback function is fetched from the event-callback-function mapping table according to the event, and the callback is called to execute the event response. The event distribution flow, shown in FIG. 3, mainly comprises the following steps:
S41, traversing the event cache;
S42, reading out an event;
S43, searching for the callback function in the event-callback-function mapping table according to the event;
S44, if the callback function is empty, no callback is registered for the event; if it is not empty, the callback function exists and is extracted and executed;
S45, judging whether the traversal of the events in the event cache is finished; if so, ending; otherwise returning to step S41 and executing steps S41-S45 cyclically.
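The S41-S45 loop can be sketched as follows; the shape of the mapping table (a dict from event to callable) is an assumption, since the patent does not specify its data structure:

```python
# Sketch of the S41-S45 dispatch loop: walk the event cache, look each
# event up in the event->callback mapping table, and invoke the callback
# when one is registered.

def dispatch(event_cache, callback_map):
    handled = []
    for event in event_cache:           # S41/S42: traverse cache, read event
        cb = callback_map.get(event)    # S43: look up the mapped callback
        if cb is not None:              # S44: non-empty -> extract and execute
            cb(event)
        handled.append(event)
    return handled                      # S45: traversal of the cache finished

log = []
cache = [("L1", "press"), ("K1", "left_turn")]
cb_map = {("L1", "press"): lambda e: log.append(e)}
handled = dispatch(cache, cb_map)
```

Events with no registered callback are simply skipped, as in step S44; the caller can then clear the returned, already-distributed events from the cache.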
As a preferred scheme, in step S5, when a callback function registered on a graphical interface interaction control is called, executing the callback function realizes the event response. Most controls in a graphical interface are organized into a tree-structured system: a control contains its child controls, which may in turn contain their own children, and so on recursively. Event response is therefore also a recursive process: the event response function of the root node control is called first, the event is handed to the child controls, which hand it further down, and the event is actually responded to when a leaf node control is reached. With reference to the event response flow chart of FIG. 4, the steps when the root node control's event response function is called are:
s51, firstly judging the visibility and the availability of the control on the root node, executing the step S52 when the control on the root node is visible or available, otherwise, directly returning to FALSE, and ending the event response.
S52, traversing the child controls of the current control, obtaining each child control, judging the visibility and the availability of each child control, judging whether the child controls have the child controls when the child controls are visible and available, and recursively executing the step 52 if the child controls include the child controls until the child controls do not have the child controls any more, and executing the event response function of the child controls. And after executing the event response function, judging whether the event response result is TRUE, if so, ending the event response, otherwise, traversing other child controls, executing the corresponding event response function, and setting the event response result as TRUE event response end until all the child controls end the event response.
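The S51-S52 recursion can be sketched with a minimal widget class; the class and attribute names are illustrative, not the patent's EventWidget API:

```python
# Recursive response sketch matching S51-S52: a control responds only if it
# is visible and usable; children are tried first (depth-first), and a TRUE
# result stops further propagation.

class Widget:
    def __init__(self, name, visible=True, usable=True, handler=None):
        self.name, self.visible, self.usable = name, visible, usable
        self.handler = handler          # this control's own response function
        self.children = []

    def respond(self, event):
        if not (self.visible and self.usable):
            return False                # S51: hidden/disabled controls never respond
        for child in self.children:     # S52: give child controls the first chance
            if child.respond(event):
                return True             # TRUE ends the event response
        if self.handler is not None:
            return self.handler(event)  # leaf (or fallback) response
        return False

root = Widget("root")
hidden = Widget("menu", visible=False, handler=lambda e: True)
button = Widget("ok", handler=lambda e: e == "click")
root.children = [hidden, button]
```

Hiding or disabling a subtree (the "menu" control above) silently removes it from event routing, which is how a parent control decides which child responds.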
Embodiment 2
The invention also provides a man-machine interaction event processing system of the synthetic vision system, and the modules of the system comprise: the system comprises an event processor unit, an event manager unit, an event cache unit and an interaction control unit. The schematic diagram of the event processing system structure is shown in fig. 5, and the corresponding diagram of the event processing system class is shown in fig. 6.
The design of each unit and type in the system and the implementation dependency relationship thereof are explained with reference to an event processing system structure diagram of fig. 5 and an event processing system class diagram of fig. 6.
Event processor unit
The event processor unit receives the original message data of the various interaction elements sent by the multifunctional display, verifies and parses them, generates events and stores them into the event cache unit; in the overall event-processing design this flow is implemented by the EventHandler class. EventHandler is the event processor; its main processing function runs in a separate thread or task, so it does not occupy CPU time slices of the graphical interface's main drawing thread or task and does not affect the main drawing flow.
EventHandler depends on EventBuffer and stores the generated events into the cache that class provides.
Event manager unit
The event manager unit is used for managing events in the event cache, including distribution and destruction of the events in the event cache, providing registration and removal event response function interfaces for the interactive control unit, and forming mapping of the events, the interactive controls and the interactive control event response functions. When the event manager distributes the event, the event generated by the event processor unit is sequentially taken out from the event cache, the interactive control object registered with the event response function is found, and the event response function of the interactive control is sequentially called to execute the event response.
EventManager is the event manager class, mainly comprising a distribution interface and an event-response registration interface for the interaction controls. The interaction controls' responses to events are executed when the distribution interface is called; since an event response is likely to redraw a control, and redrawing must happen in the main drawing thread, the distribution interface should be executed in each frame cycle of the main drawing thread or task.
EventManager relies on EventBuffer to fetch and distribute events generated by the event handler unit in sequence.
Event buffer unit
The event buffer unit serves as an intermediate layer between the event processor unit and the event manager unit, so that event generation and distribution can run in different threads, enhancing the system's operating efficiency. It manages events with a queue data structure, so the event generated first is distributed first.
The EventBuffer is an event buffer class, provides an event storage interface for the event processor unit, and provides an event acquisition interface for the event manager unit.
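A minimal sketch of the event cache unit as a thread-safe FIFO follows; the patent does not specify the synchronization mechanism, so the lock and the store/fetch method names are assumptions:

```python
# Sketch of the event buffer: the event processor thread stores events,
# the main drawing thread fetches them, and first-generated events are
# distributed first. A lock stands in for whatever synchronization the
# real system uses.

import threading
from collections import deque

class EventBuffer:
    def __init__(self):
        self._queue = deque()
        self._lock = threading.Lock()

    def store(self, event):             # called by the event processor thread
        with self._lock:
            self._queue.append(event)

    def fetch(self):                    # called by the main drawing thread
        with self._lock:
            return self._queue.popleft() if self._queue else None

buf = EventBuffer()
buf.store(("L1", "press"))
buf.store(("L1", "release"))
```

The queue decouples the producer and consumer threads exactly as the intermediate-layer role above describes: neither side blocks the other beyond the brief lock hold.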
Interactive control unit
The interaction control unit serves as the interaction medium between the application system and the user: it converts the internal representation of the application system into a form the user can perceive and feeds back the user's operations. The interaction control is the carrier of event response: when a hardware interaction element on the multifunctional display is operated, the application system receives and processes the data, generates an event, and the event is finally responded to on the interaction control.
According to the class diagram of the event processing system shown in FIG. 6, all controls in the interaction control unit inherit from the EventWidget class, which serves as the base class of the interaction controls and implements the most basic functions, including management of child controls and the setting of control visibility and usability. It also exposes response-callback interfaces for the various events, so the application can assign different response callbacks and thereby obtain different behaviors of the same or different control types for the same event. For example, different button controls process click events differently: some send a data request when responding to the press of an interaction element, others may open a new page, while a text box may activate itself into an input state when it responds.
EventRootPage is the root page control class; it also inherits from EventWidget and possesses all the parent-class characteristics. When an object of this class is created, it registers with the EventManager and acquires all events sent by it, manages all of those events, and distributes them to its child controls for processing.
EventPage, MenuItem and SpinBox inherit from EventWidget and add their own characteristics. EventPage is a full page with a unified page background, a menu bar, a page title and the like; it can continue to distribute events to its child controls, and event response functions may also be assigned to it so it processes events itself. MenuItem is a menu item with the appearance of a menu; it shows a pressed effect while its bound hardware key is pressed and released, and it only processes the click event of the hardware key bound to it. SpinBox is a numeric fine-tuning box that enters or leaves the editing state when its hardware key is clicked; while it is in the editing state, the knob can input values. Because the interaction controls inherit EventWidget, they quickly obtain the parent-class characteristics, and the application system can be conveniently extended with more interaction controls.
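The inheritance relationship can be sketched as follows; the method names (on_click) and the attributes are illustrative stand-ins for the patent's EventWidget interface, not its actual API:

```python
# Sketch of the class hierarchy: EventWidget is the base class; MenuItem
# and SpinBox inherit from it and override only the behavior that differs.

class EventWidget:
    def __init__(self):
        self.visible, self.usable = True, True
        self.children = []              # child-control management in the base class

    def on_click(self, key):
        return False                    # base class ignores events by default

class MenuItem(EventWidget):
    def __init__(self, bound_key):
        super().__init__()
        self.bound_key = bound_key      # only handles its own hardware key

    def on_click(self, key):
        return key == self.bound_key

class SpinBox(EventWidget):
    def __init__(self):
        super().__init__()
        self.editing = False

    def on_click(self, key):
        self.editing = not self.editing  # click toggles the editing state
        return True

item = MenuItem("K1")
spin = SpinBox()
```

Each subclass overrides one method while inheriting child management and the visibility/usability flags, which is the reuse benefit the base-class design aims at.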
The interactive control unit is organized as a tree structure, with an object of type EventRootPage as the root node. Its child nodes are interactive controls that inherit from the EventWidget class, such as EventPage and MenuItem, and those child nodes in turn contain further interactive control child nodes. This organization is shown in fig. 7.
The EventRootPage distributes the events acquired from the EventManager event manager unit to its child controls. Each user interface comprises a plurality of interactive controls, so when several controls could respond to the event sent by some interactive element, it must be determined which interactive control is allowed to execute the event response. If the event response function specified by a child control is executed and returns TRUE, the event should not be responded to by any other control, and the response to that event is complete. If the event response function of a child control is executed but returns FALSE, the other controls may still reprocess the event. If no child control responds to the event, or every response function that does respond returns FALSE, the event response function specified by the parent control is executed.
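The TRUE/FALSE propagation rule above can be modelled in a few lines of C++. This is a hypothetical minimal sketch, not the patent's implementation: a handler returning true consumes the event, false lets the next sibling try, and if no child consumes it the root's own handler runs.

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Hypothetical minimal model of the dispatch rule: the root offers the
// event to each visible-and-available child in turn; a handler returning
// true (TRUE) consumes the event, false (FALSE) lets the next control
// try, and if no child consumes it the root's own handler runs.
struct Control {
    bool visible = true;
    bool available = true;
    std::function<bool(int)> onEvent;  // returns true when event consumed
};

struct RootPage {
    std::vector<Control*> children;
    std::function<bool(int)> onEvent;  // the root's own response function

    void dispatch(int event) {
        for (Control* c : children) {
            if (!c->visible || !c->available || !c->onEvent) continue;
            if (c->onEvent(event)) return;  // TRUE: response complete
            // FALSE: fall through so other controls may reprocess the event
        }
        if (onEvent) onEvent(event);  // no child consumed it: root responds
    }
};
```

Hiding or disabling a child is then sufficient to determine which control responds, without changing any dispatch code.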
Based on this organization structure and processing principle of the interactive controls, the parent control can determine which interactive control responds to an event simply by controlling whether its child controls are shown or hidden, available or unavailable, which effectively reduces the complexity of the system and makes the event processing control logic clearer.
The units and modules described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they can be located in one place or distributed over a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the units (for example, each functional unit, processor, memory and the like) in each embodiment of the present invention may all be integrated into one unit, each unit may exist separately as its own unit, or two or more units may be integrated into one unit; the integrated unit can be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those skilled in the art will understand that all or part of the steps for implementing the method embodiments can be completed by hardware under the control of program instructions; the program can be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable memory device, a read-only memory (ROM), a magnetic disk, or an optical disk.
When the integrated unit of the present invention is implemented in the form of a software functional unit and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present invention, or the part contributing to the prior art, may essentially be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as removable storage devices, ROMs, magnetic disks, or optical disks.
The foregoing is merely a detailed description of specific embodiments of the invention and is not intended to limit the invention. Various alterations, modifications and improvements will occur to those skilled in the relevant art without departing from the spirit and scope of the invention.

Claims (5)

1. A man-machine interaction event processing method for a synthetic vision system, characterized by comprising the following steps:
s1, a multifunctional display sends original message data;
s2, converting the original message data into state data to be switched;
s3, if the state data to be switched changes relative to the current state data, generating an event, and storing the event;
s4, distributing the event to a graphical interface interaction control of a multifunctional display; the graphical interface interactive control executes a callback function corresponding to the event and responds to the event;
s5, clearing the responded event in the event cache;
step S4 specifically includes the following steps:
s41, traversing event cache;
s42, reading out an event;
s43, searching a callback function in an event callback function mapping table according to the event;
s44, if the callback function is not empty, extracting the callback function and executing the callback function;
s45, judging whether the traversal of the event in the event cache is finished, if so, ending, otherwise, returning to the step S41, and circularly executing S41-S45;
the graphical interface interactive controls are organized into a graphical interface system in a tree structure and comprise root node controls, child controls and leaf node controls;
in step S44, executing the callback function specifically includes the following steps:
s441, finding a root node control of the graphical interface interaction control corresponding to the callback function;
s442, starting from the root node control, sequentially judging whether each level of the tree structure control is visible and available, and when the control is visible and available, judging whether the next level of the sub-control of the control is visible and available;
s443, traversing each level of the tree structure control until finding the visible and available graphical interface interactive control of the lowest level corresponding to the callback function, executing the callback function by the visible and available graphical interface interactive control of the lowest level corresponding to the callback function, and returning to the execution parameters;
the control in the interactive control defines a data format according to an eventWidget class, the class is used as a base class of the interactive control to realize the most basic function, including the management of the sub-control, and simultaneously, response callback function interfaces of various events are externally provided, so that different response callbacks are specified in application to achieve the diversity of the processing performance of the same event by the interactive controls of the same or different types;
the EventRootPage is a root page control class; this class also inherits from the EventWidget and has all the characteristics of the parent class; when an object of this class is created, it registers with the EventManager and acquires all events sent by the EventManager, manages all the events, and distributes the events to the child controls for processing;
the controls in the interactive control unit adopt a tree-shaped organization structure, and the EventWidget class of each control comprises EventPage, menuItem or SpinBox;
the EventPage class is used to continue distributing events to the child controls, and is also used to process events through a designated event response function;
the MenuItem is a menu item; it has the appearance of a menu, shows a pressed effect when its hardware key is pressed and released, and processes only the click event of the hardware key bound to it; the SpinBox is a numerical fine-adjustment box; when a hardware key is clicked, the SpinBox enters or cancels the editing state, and numerical values can be input with the knob when the SpinBox is in the editing state.
2. The method as claimed in claim 1, wherein the original message data in step S1 includes a header, a data length, a cycle counter, status information, and a checksum.
3. The method for processing a human-computer interaction event of a synthetic vision system according to claim 2, wherein step S2 specifically refers to extracting the state information of an interactive element from the state information field in the original message data, the state information of the interactive element being the state data to be switched corresponding to the original message data.
4. The method for processing a human-computer interaction event of a synthetic vision system according to claim 3, wherein in step S3 the event includes, but is not limited to, a press, release, left-turn, right-turn, touch start, touch in or touch end event.
5. A man-machine interaction event processing system of a synthetic vision system is characterized by comprising an event processor unit, an event manager unit, an event cache unit and an interaction control unit,
the event processor unit is used for receiving original message data sent by the multifunctional display and converting the original message data into state data to be switched of the interactive element, and the event processor unit is also used for generating an event according to the change of the state data to be switched relative to the current state data and storing the event into the event cache unit;
the event manager unit is used for managing the events in the event cache, including distributing the unresponded events and clearing the responded events in the event cache, providing event response function interfaces for the interactive control unit to register and remove, and pre-storing the mapping relations among the events, the interactive controls and the interactive control event response functions;
the event cache unit is used for storing events, so that the generation, distribution and elimination of the events can be operated in different threads;
the interactive control unit is used for receiving the events distributed by the event manager unit and causing a generated event to be responded to in the interactive software according to the corresponding interactive control event response function;
the interactive control unit is configured to receive the event distributed by the event manager unit and cause the generated event to be responded to in the interactive software according to the corresponding interactive control event response function, which specifically includes the following steps:
s41, traversing event cache;
s42, reading out an event;
s43, searching a callback function in an event callback function mapping table according to the event;
s44, if the callback function is not empty, extracting the callback function and executing the callback function;
s45, judging whether the traversal of the event in the event cache is finished, if so, ending, otherwise, returning to the step S41, and circularly executing S41-S45;
the graphical interface interactive controls are organized into a graphical interface system in a tree structure and comprise root node controls, child controls and leaf node controls;
in step S44, executing the callback function specifically includes the following steps:
s441, finding a root node control of the graphical interface interaction control corresponding to the callback function;
s442, starting from the root node control, sequentially judging whether each level of the tree structure control is visible and available, and when the control is visible and available, judging whether the next level of the sub-control of the control is visible and available;
s443, traversing each level of the tree structure control until finding the visible and available graphical interface interactive control of the lowest level corresponding to the callback function, executing the callback function by the visible and available graphical interface interactive control of the lowest level corresponding to the callback function, and returning to the execution parameters; the controls in the interactive control unit define data formats according to an eventWidget class; the type is used as a base class of the interactive control to realize the most basic functions, including the management of the sub-controls, and response callback function interfaces of various events are provided externally, so that different response callbacks are specified in the application to achieve the diversity of the processing performance of the same event by the interactive controls of the same or different types;
the EventRootPage is a root page control class; this class also inherits from the EventWidget and has all the characteristics of the parent class; when an object of this class is created, it registers with the EventManager and acquires all events sent by the EventManager, manages all the events, and distributes the events to the child controls for processing;
the controls in the interactive control unit adopt a tree-shaped organization structure, and the EventWidget class of each control comprises EventPage, menuItem or SpinBox;
the EventPage class is used to continue distributing events to the child controls, and is also used to process events through a designated event response function;
the MenuItem is a menu item; it has the appearance of a menu, shows a pressed effect when its hardware key is pressed and released, and processes only the click event of the hardware key bound to it; the SpinBox is a numerical fine-adjustment box; when a hardware key is clicked, the SpinBox enters or cancels the editing state, and numerical values can be input with the knob when the SpinBox is in the editing state.
CN202011467717.9A 2020-12-14 2020-12-14 Processing method and system for man-machine interaction event of synthetic vision system Active CN112578984B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011467717.9A CN112578984B (en) 2020-12-14 2020-12-14 Processing method and system for man-machine interaction event of synthetic vision system


Publications (2)

Publication Number Publication Date
CN112578984A CN112578984A (en) 2021-03-30
CN112578984B true CN112578984B (en) 2022-11-29

Family

ID=75134836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011467717.9A Active CN112578984B (en) 2020-12-14 2020-12-14 Processing method and system for man-machine interaction event of synthetic vision system

Country Status (1)

Country Link
CN (1) CN112578984B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334728B (en) * 2008-07-28 2011-10-19 北京航空航天大学 Interface creating method and platform based on XML document description
US10162514B2 (en) * 2015-09-15 2018-12-25 Rockwell Collins, Inc. Large display format touch gesture interface
CN107391276B (en) * 2017-07-05 2018-09-28 腾讯科技(深圳)有限公司 Distributed monitor method, interception control device and system
CN108153600B (en) * 2017-12-26 2021-09-28 深圳Tcl数字技术有限公司 Panel key response method, television equipment and computer readable storage medium
CN110674025B (en) * 2018-07-03 2023-08-11 百度在线网络技术(北京)有限公司 Interactive behavior monitoring method and device and computer equipment
CN109901916A (en) * 2019-02-26 2019-06-18 北京小米移动软件有限公司 The call back function of event executes method, apparatus, storage medium and mobile terminal
CN111232232B (en) * 2019-12-30 2023-08-29 成都赫尔墨斯科技股份有限公司 Device and method for comprehensive display control of avionics equipment
CN111210516B (en) * 2019-12-30 2023-04-18 成都赫尔墨斯科技股份有限公司 Software platform for integrated display control of avionics equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant