CN112578984A - Processing method and system for man-machine interaction event of synthetic vision system - Google Patents


Info

Publication number
CN112578984A
CN112578984A
Authority
CN
China
Prior art keywords
event
control
events
interactive
data
Prior art date
Legal status
Granted
Application number
CN202011467717.9A
Other languages
Chinese (zh)
Other versions
CN112578984B (en)
Inventor
张松
肖永红
罗涛
汪坤
江彦
唐太虎
Current Assignee
Chengdu Hermes Technology Co., Ltd.
Chengdu Zirui Qingyun Aeronautical and Astronautical Technology Co., Ltd.
Original Assignee
Chengdu Hermes Technology Co., Ltd.
Chengdu Zirui Qingyun Aeronautical and Astronautical Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Chengdu Hermes Technology Co., Ltd. and Chengdu Zirui Qingyun Aeronautical and Astronautical Technology Co., Ltd.
Priority to CN202011467717.9A
Publication of CN112578984A
Application granted
Publication of CN112578984B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and system for processing man-machine interaction events of a synthetic vision system. The method comprises the following steps: S1, the multifunction display sends raw message data; S2, the raw message data is converted into to-be-switched state data; S3, if the to-be-switched state data differs from the current state data, an event is generated and stored; S4, the event is distributed to a graphical-interface interactive control of the multifunction display, and the control executes the callback function corresponding to the event and responds to it; S5, the responded event is cleared from the event cache. With this method and system, the main drawing thread or task of the graphical interface only distributes the events held in the event cache, so event generation does not occupy the main drawing thread; this greatly improves the efficiency of event-data processing, the reusability of the limited interactive elements, and the interface responsiveness of interactive operations.

Description

Processing method and system for man-machine interaction event of synthetic vision system
Technical Field
The invention relates to the technical field of synthetic vision for avionics equipment, and in particular to a method and system for processing man-machine interaction events of a synthetic vision system.
Background
A synthetic vision system (SVS) is a new type of airborne system. Using data such as terrain elevation, map tiles and a navigation database, a graphics processing computer generates a three-dimensional visual scene, overlays on it images of the main flight parameters such as speed, altitude and attitude, and outputs the result to a multifunction display. This improves the pilot's situational awareness in conditions of poor visibility or complex terrain.
Existing synthetic vision systems are mainly used to display three-dimensional scenes and flight parameters and offer little man-machine interaction; the interaction mode is simple, and all interactive operations and interaction logic are completed through a small number of interactive elements.
However, as computer graphics processing capability keeps increasing, synthetic vision systems are becoming more and more powerful, integrating functions such as route management, flight plan management and system maintenance. These richer functions place ever higher demands on the graphical interaction interface, and the man-machine interaction logic of the graphical controls grows correspondingly complex. Yet to save cockpit space, most airborne multifunction displays currently in use provide very few interactive elements for interactive operation. How to realize increasingly complex interactive operations with these limited interactive elements, and how to respond to those operations efficiently, has become an urgent problem.
Disclosure of Invention
The invention aims to solve the problem of how to realize increasingly complex interactive operations with limited interactive elements while improving response efficiency, and provides a method for processing man-machine interaction events of a synthetic vision system.
In order to achieve the above object, the invention adopts the following technical solutions.
A man-machine interaction event processing method for a synthetic vision system comprises the following steps:
S1, the multifunction display sends raw message data;
S2, the raw message data is converted into to-be-switched state data;
S3, if the to-be-switched state data differs from the current state data, an event is generated and stored;
S4, the event is distributed to a graphical-interface interactive control of the multifunction display; the control executes the callback function corresponding to the event and responds to the event;
S5, the responded event is cleared from the event cache.
As a preferred embodiment of the invention, the raw message data in step S1 comprises a data header, a data length, a cycle counter, state information and a checksum.
As a preferred embodiment of the invention, step S2 specifically means that the state information of the interactive elements is extracted from the state information field of the raw message data; this state information constitutes the to-be-switched state data corresponding to the raw message data.
As a preferred embodiment of the invention, the events in step S3 include, but are not limited to, press, release, left-turn, right-turn, touch-start, touch-in-progress and touch-end events.
As a preferred embodiment of the invention, step S4 specifically comprises the following steps:
S41, traversing the event cache;
S42, reading out an event;
S43, looking up the callback function in the event callback-function mapping table according to the event;
S44, if the callback function is not empty, extracting and executing it;
S45, judging whether all events in the event cache have been traversed; if so, ending; otherwise returning to step S41 and executing S41-S45 in a loop.
As a preferred embodiment of the invention, in step S4 the graphical-interface interactive controls are organized in a tree structure to form a graphical interface system comprising a root-node control, child controls and leaf-node controls.
As a preferred embodiment of the invention, executing the callback function in step S44 specifically comprises the following steps:
S441, finding the root-node control of the graphical-interface interactive control corresponding to the callback function;
S442, starting from the root-node control, judging level by level whether each control in the tree is visible and available, and, when a control is visible and available, judging whether its child controls at the next level are visible and available;
S443, traversing each level of the tree until the graphical-interface interactive control corresponding to the callback function is found; that control executes the callback function and returns the execution parameters.
Based on the same conception, a man-machine interaction event processing system for a synthetic vision system is also provided, comprising an event processor unit, an event manager unit, an event cache unit and an interactive control unit, wherein:
the event processor unit is used for receiving the raw message data sent by the multifunction display and converting it into the to-be-switched state data of the interactive elements; it is also used for generating an event when the to-be-switched state data changes relative to the current state data, and storing the event in the event cache unit;
the event manager unit is used for managing the events in the event cache, including distributing unresponded events and clearing responded events; it provides the interactive control unit with interfaces for registering and removing event response functions, and pre-stores the mapping among events, interactive controls and interactive-control event response functions;
the event cache unit is used for storing events, so that event generation, distribution and removal can run in different threads;
the interactive control unit is used for receiving the events distributed by the event manager unit, so that each generated event is responded to in the interactive software according to the corresponding interactive-control event response function.
As a preferred scheme of the invention, the controls in the interactive control unit all define their data format according to the EventWidget class.
As a preferred scheme of the invention, the controls in the interactive control unit adopt a tree-shaped organization, and the EventWidget subclass of each control includes EventPage, MenuItem or SpinBox.
In summary, owing to the adoption of the above technical scheme, the invention has at least the following beneficial effects:
1. The method and system convert the state information of the interactive elements into events, let the interactive controls respond to events through callback functions, and save generated events in an event cache. The main drawing thread or task of the graphical interface merely distributes the events held in the cache, so event generation does not occupy the main drawing thread; this greatly improves event-data processing efficiency, the reusability of the limited interactive elements, and the interface responsiveness of interactive operations.
2. Using an object-oriented design, the system is divided into several modules whose functions are relatively independent, reducing coupling and increasing cohesion. The graphical interactive control that responds to events is abstracted into a base class, and the other event-responding interactive controls inherit from it, quickly obtaining the parent class's event handling characteristics; this improves development efficiency and enhances the extensibility of the application system.
3. The graphical-interface interactive controls are organized in a tree structure. An EventRootPage object is defined as the root node of a graphical-interface page; it manages all the events distributed from the event manager to the page and further processes them, for example combining press and release events into a click event, which simplifies event handling in the interactive controls. While displayed, each EventRootPage object takes over all the events distributed from the event manager, and it only needs to show or hide, enable or disable its child interactive controls to decide which of them respond to the events. This reduces the complexity of the event response flow, improves the reusability of interactive elements such as hardware keys and knobs, and strengthens the control these hardware elements exert over different pages and different interactive response controls.
Drawings
FIG. 1 is a flowchart of the method for processing man-machine interaction events of a synthetic vision system in embodiment 1;
FIG. 2 shows the format of a data message in embodiment 1;
FIG. 3 is a schematic diagram of the event distribution flow in embodiment 1;
FIG. 4 is a schematic diagram of the event response flow in embodiment 1;
FIG. 5 is a block diagram of the system for processing man-machine interaction events of a synthetic vision system in embodiment 2;
FIG. 6 is the class diagram of event processing in embodiment 2;
FIG. 7 is a diagram illustrating the organization of the interactive controls in embodiment 2.
Detailed Description
The present invention is described in further detail below with reference to the accompanying drawings and embodiments, so that the objects, technical solutions and advantages of the invention are understood more clearly. It should be understood that the specific embodiments described here merely illustrate the invention and are not intended to limit it.
In the processing method and system for man-machine interaction events of a synthetic vision system, a data processing module acquires the state information of the interactive elements sent by the multifunction display and processes it into events that the man-machine interaction interface can recognize. An event is an operation recognizable by software when the state of an interactive element on the multifunction display changes: for example, if the state of a key changes from invalid to valid, a key-press event is recognized, and when the state changes from valid to invalid, a key-release event is recognized. Events include key press, key release, knob press, knob release, knob rotation and so on; the man-machine interaction interface processes the events it requires, thereby achieving man-machine interaction.
Embodiment 1
A man-machine interaction event processing method for a synthetic vision system, shown in FIG. 1, mainly comprises the following steps:
S1, the multifunction display sends the raw message data to the graphics processing board;
S2, the graphics processing board receives the raw message data and processes it into to-be-switched state data;
S3, the graphics processing board generates an event by comparing the current state data with the to-be-switched state data, and stores the event in an event cache;
S4, in every frame, the graphical interface of the graphics processing board takes events out of the event cache and distributes them; this step executes cyclically;
S5, the graphical-interface interactive controls of the multifunction display execute the callback functions and respond to the events;
S6, the graphics processing board clears the responded events from the event cache: after all events in the cache have been distributed and the interactive controls have finished responding, the responded events are removed from the cache.
Preferably, step S1 specifically comprises the following steps:
S11, the processor of the multifunction display collects the electrical signal information of the electronic elements. The multifunction display contains a variety of interactive electronic components, including but not limited to touch screens, buttons and knobs, and the processor collects the electrical signal information of all of them.
S12, the processor of the multifunction display converts the collected electrical signals into a binary data message according to the communication protocol defined by the graphics processing board, obtaining the raw message data, whose format is shown in FIG. 2, and sends it to the graphics processing board through a hardware interface; the hardware communication interface includes but is not limited to RS422, RS232, TCP, UDP and SPI.
The message comprises a data header, a data length, a cycle counter, state information and a checksum; the fields have the following meanings:
Data header: the identifier of the message data within the system; different header values identify different kinds of message data, for example the two bytes 5A5A as header;
Data length: the total byte length of the message data;
Cycle counter: a packet counter, generally one byte running from 0 to 255 and wrapping back to 0, which can be used to judge whether packets have been lost;
State information: the state of each interactive element on the device, such as key pressed or released, knob rotation, touch pressed or released, and touch position; its content is flexibly increased or decreased according to the states the multifunction display collects;
Checksum: the byte-wise accumulation of the transmitted data, taken modulo 256 when the sum exceeds 255; it is used to ensure the integrity and accuracy of the data in communication.
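As an illustration only, the following minimal C++ sketch shows one possible in-memory layout for such a message together with the modulo-256 checksum. The 0x5A5A header and one-byte cycle counter come from the description above; the two-byte length field, the fixed 16-byte state field and the plain modulo-256 interpretation of the checksum are assumptions made for the sketch.

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical on-wire layout of the raw message described above.
#pragma pack(push, 1)
struct RawMessage {
    uint16_t header;     // message identifier, e.g. the two bytes 5A5A
    uint16_t length;     // total byte length of the message (assumed 2 bytes)
    uint8_t  counter;    // cycle counter, 0-255, wraps back to 0
    uint8_t  state[16];  // interactive-element states (size is an assumption)
    uint8_t  checksum;   // byte-wise sum of all preceding bytes, modulo 256
};
#pragma pack(pop)

// Byte-wise accumulation; a uint8_t sum wraps modulo 256 automatically.
inline uint8_t computeChecksum(const uint8_t* data, size_t len) {
    uint8_t sum = 0;
    for (size_t i = 0; i < len; ++i) sum += data[i];
    return sum;
}
```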
Preferably, in step S2 the graphics processing board exchanges data with the multifunction display through the hardware communication interface, and the graphics processing board software receives the raw message data sent by the multifunction display through a communication interface provided by a system API or an embedded low-level driver. After the raw message data is received it is processed; the processing comprises data verification and data conversion.
The data verification steps are as follows:
First, the data header bytes are taken from the message data and used to judge whether the data was sent by the multifunction display; if so, the data length byte is read to obtain the transmitted byte length, and the whole packet is extracted according to that length.
Second, the checksum of all bytes of the whole packet except the check byte is computed, and the result is compared with the value in the check field to determine whether the received packet is correct.
Data conversion means: after successful verification, the bytes representing state information are taken from the packet, converted into the state information of each interactive element, and stored in memory to await subsequent event distribution and use. The graphics processing board continuously receives the raw message data sent by the multifunction display and converts it into the state information of each interactive element.
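A minimal sketch of this check-then-convert step, reusing the hypothetical RawMessage layout and computeChecksum() from the previous sketch; the function name and return type are illustrative, not the patent's interface.

```cpp
#include <cstring>
#include <optional>
#include <vector>

// Returns the state bytes only when the header and checksum both match;
// otherwise the packet is dropped.
std::optional<std::vector<uint8_t>> validateAndExtract(const uint8_t* buf,
                                                       size_t len) {
    if (len < sizeof(RawMessage)) return std::nullopt;    // incomplete packet
    RawMessage msg{};
    std::memcpy(&msg, buf, sizeof(RawMessage));           // avoid unaligned reads
    if (msg.header != 0x5A5A) return std::nullopt;        // not from the display
    if (msg.length > len || msg.length < sizeof(RawMessage))
        return std::nullopt;                              // bad declared length
    // The checksum covers every byte of the packet except the check byte.
    if (computeChecksum(buf, msg.length - 1) != buf[msg.length - 1])
        return std::nullopt;                              // corrupted in transit
    return std::vector<uint8_t>(msg.state, msg.state + sizeof(msg.state));
}
```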
Preferably, step S3 specifically comprises the following: the newly converted state information of an interactive element is compared with the element's previous state; if the state has changed, a corresponding event is generated, otherwise no event is generated. For example, let 0 and 1 denote the state of key L1, with 0 meaning released and 1 meaning pressed: if the previous state of key L1 was released and the state received this time is pressed, an L1 press event is generated; if the state is still released, no event is generated.
Events of all kinds are generated by continually comparing the current and previous states of the interactive elements; they include but are not limited to press, release, left-turn, right-turn, touch-start, touch-in-progress and touch-end events. Generated events are stored in the event cache.
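The comparison can be sketched as follows in C++; the Event structure, the event-type list and the assumption of one state byte per key are illustrative.

```cpp
#include <cstdint>
#include <vector>

enum class EventType { Press, Release, TurnLeft, TurnRight,
                       TouchBegin, TouchMove, TouchEnd };

struct Event {
    int       elementId;  // which key, knob or touch element
    EventType type;
};

// Diff the freshly decoded state against the previous one and emit an event
// only for elements whose state changed (0 = released, 1 = pressed).
std::vector<Event> diffKeyStates(const std::vector<uint8_t>& prev,
                                 const std::vector<uint8_t>& curr) {
    std::vector<Event> events;
    for (size_t i = 0; i < curr.size() && i < prev.size(); ++i) {
        if (prev[i] == curr[i]) continue;  // unchanged state, no event
        events.push_back({static_cast<int>(i),
                          curr[i] ? EventType::Press : EventType::Release});
    }
    return events;
}
```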
To generate the state data and events of the multifunction display more efficiently, data processing and event generation are handled in a separate thread or task, so that state-data reception, processing and event generation complete without affecting the main drawing thread or task.
As a preferred scheme, in step S4 the graphical interface distributes events before each frame is drawn: events are fetched from the event cache according to their type, the corresponding callback function is fetched from the event callback-function mapping table, and the callback is called to execute the event response. As shown in the event distribution flow chart of FIG. 3, the process mainly comprises the following steps:
S41, traversing the event cache;
S42, reading out an event;
S43, looking up the callback function in the event callback-function mapping table according to the event;
S44, if the callback function is empty, no callback exists for this event; if it is not empty, the callback exists and is extracted and executed;
S45, judging whether all events in the event cache have been traversed; if so, ending; otherwise returning to step S41 and executing S41-S45 in a loop.
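A minimal sketch of this per-frame dispatch pass, reusing the Event type from the earlier sketch. Keying the mapping table on the element id alone, and the names dispatchEvents and callbackMap, are simplifying assumptions.

```cpp
#include <functional>
#include <unordered_map>
#include <vector>

using Callback = std::function<bool(const Event&)>;

// One dispatch pass, run once per frame before drawing (steps S41-S45):
// drain the cache, look each event up in the event -> callback mapping
// table, and invoke the callback when one is registered.
void dispatchEvents(std::vector<Event>& cache,
                    const std::unordered_map<int, Callback>& callbackMap) {
    for (const Event& ev : cache) {                 // S41/S42: traverse and read
        auto it = callbackMap.find(ev.elementId);   // S43: look up the callback
        if (it != callbackMap.end() && it->second)  // S44: skip empty entries
            it->second(ev);                         //       execute the response
    }
    cache.clear();                                  // responded events are cleared
}
```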
As a preferred scheme, in step S5, when a callback function registered on a graphical-interface interactive control is called, executing it realizes the event response. The controls of a graphical interface are mostly organized into a tree: a control contains its child controls, which may in turn contain children of their own, recursively. Event response is therefore also a recursive process: the event response function of the root-node control is called first, the event is handed to the child controls, those children hand it further down, and the event is truly responded to when a leaf-node control is reached. The event response steps taken when the root node's response function is called are described with reference to the event response flow chart of FIG. 4:
S51, first the visibility and availability of the root-node control are judged; when the root control is visible and available, step S52 is executed, otherwise FALSE is returned directly and the event response ends.
S52, the child controls of the current control are traversed. For each child control its visibility and availability are judged; when it is visible and available, it is checked whether it has child controls of its own, and if so step S52 is executed recursively until a control without children is reached. After a child's event response function executes, the response result is checked: if it is TRUE, the event response ends; otherwise the remaining child controls are traversed and their response functions executed, until either one of them returns TRUE and the response ends, or all child controls have finished responding.
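The recursion can be sketched as below, reusing the Event and Callback types from the earlier sketches; the Widget structure and its respond() method are illustrative stand-ins for the patent's control classes.

```cpp
#include <vector>

// A node forwards the event to its children first; the event counts as
// handled as soon as some descendant's response function returns true.
struct Widget {
    bool visible = true;
    bool enabled = true;
    std::vector<Widget*> children;
    Callback onEvent;  // registered response callback, may be empty

    bool respond(const Event& ev) {
        if (!visible || !enabled) return false;   // S51: skip hidden/disabled subtree
        for (Widget* child : children)            // S52: children get the first chance
            if (child->respond(ev)) return true;  // consumed by a descendant
        return onEvent ? onEvent(ev) : false;     // leaf (or fallback) response
    }
};
```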
Embodiment 2
The invention also provides a man-machine interaction event processing system for a synthetic vision system, whose modules comprise: an event processor unit, an event manager unit, an event cache unit and an interactive control unit. The structure of the event processing system is shown schematically in FIG. 5, and the corresponding class diagram in FIG. 6.
The design of each unit and class in the system, and the dependencies among their implementations, are explained with reference to the structure diagram of FIG. 5 and the class diagram of FIG. 6.
Event processor unit
The event processor unit is used for receiving the raw message data of the various interactive elements sent by the multifunction display, verifying and parsing it, generating events and storing them in the event cache unit. In the overall event processing design, the EventHandler class implements the flow this unit requires. EventHandler is the event handler class; its main processing function runs in a separate thread or task, occupies no CPU time slice of the graphical interface's main drawing thread or task, and does not affect the main drawing flow.
EventHandler depends on EventBuffer to store the events it generates into the cache that class provides.
Event manager unit
The event manager unit is used for managing the events in the event cache, including their distribution and destruction; it provides the interactive control unit with interfaces for registering and removing event response functions, forming the mapping among events, interactive controls and interactive-control event response functions. When distributing events, the event manager takes the events generated by the event processor unit out of the event cache in order, finds the interactive control objects whose response functions are registered for them, and calls those response functions in turn to execute the event responses.
EventManager is the event manager class; it mainly provides a distribution interface and an interface through which interactive controls register event responses. An interactive control's response to an event executes when the manager's distribution interface is called; since executing a response is likely to redraw the control, and that must happen in the main drawing thread, the manager's distribution interface should be executed in every frame cycle of the main drawing thread or task.
EventManager depends on EventBuffer to fetch and distribute, in order, the events generated by the event handler unit.
Event buffer unit
The event cache unit serves as an intermediate layer between the event processor unit and the event manager unit, allowing event generation and distribution to run in different threads and improving the system's operating efficiency. The unit manages events with a queue data structure, so that the event generated first is distributed first.
EventBuffer is the event cache class; it provides an event storage interface to the event processor unit and an event retrieval interface to the event manager unit.
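A minimal sketch of such a buffer, reusing the Event type from the earlier sketches: a mutex-guarded FIFO queue lets the handler thread push while the drawing thread pops, preserving first-generated, first-distributed order. The method names are illustrative.

```cpp
#include <mutex>
#include <queue>

class EventBuffer {
public:
    // Called from the EventHandler thread.
    void push(const Event& ev) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.push(ev);
    }
    // Called from the main drawing thread; returns false when empty.
    bool pop(Event& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (queue_.empty()) return false;
        out = queue_.front();
        queue_.pop();
        return true;
    }
private:
    std::mutex mutex_;
    std::queue<Event> queue_;  // FIFO: the oldest event is distributed first
};
```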
Interactive control unit
The interactive control unit is the interaction medium between the application system and the user: it converts the application's internal representation into a form the user can perceive, and feeds back the operations the user makes. The interactive controls are the carriers of event response: when a hardware interactive element on the multifunction display is operated, the application system receives and processes the data, generates an event, and the event is finally responded to on an interactive control.
According to the class diagram of the event processing system in FIG. 6, all controls in the interactive control unit inherit from the EventWidget class. As the base class of the interactive controls, EventWidget implements the most basic functions, including management of child controls and the setting of control visibility, usability and so on, and externally provides response callback interfaces for the various events. The application can therefore specify different response callbacks, so that the same or different kinds of interactive controls handle the same event differently: for example, different button controls handle click events in different ways, some sending out a data request instruction when the interactive element is pressed and others opening a new page, while a text box responding to the event may activate itself into the input state.
EventRootPage is the root page control class. It also inherits from EventWidget and therefore has all the parent class's characteristics; when an object of this class is created, it registers with the EventManager so as to obtain all the events the manager sends, manages all of them, and distributes them to its child controls for processing.
EventPage, MenuItem and SpinBox likewise inherit from EventWidget, each adding its own characteristics. EventPage is a full page with a uniform page background, menu bar, page title and so on; it can continue distributing events to its child controls, and either it or its children may specify the event response functions that process the events. MenuItem is a menu item: it has the appearance of a menu entry, shows a pressed effect when its hardware key is pressed and released, and handles only the click event of the hardware key bound to it. SpinBox is a numeric fine-tuning box: a hardware key click toggles it between the editing and non-editing states, and while it is in the editing state the knob can input values. Because the interactive controls inherit from EventWidget, they quickly acquire the parent class's characteristics, and the application system is easily extended with more interactive controls.
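The inheritance pattern can be sketched as follows, reusing the Event type from the earlier sketches. The member and method names are illustrative; only the class names EventWidget and MenuItem come from the description above.

```cpp
#include <vector>

// Base class: child management, visibility/usability flags, and overridable
// event callbacks with a "not handled" default.
class EventWidget {
public:
    virtual ~EventWidget() = default;
    void addChild(EventWidget* w) { children_.push_back(w); }
    void setVisible(bool v) { visible_ = v; }
    void setEnabled(bool e) { enabled_ = e; }
    virtual bool onClick(const Event&) { return false; }  // default: not handled
protected:
    bool visible_ = true;
    bool enabled_ = true;
    std::vector<EventWidget*> children_;
};

// A menu item responds only to the click of the hardware key bound to it.
class MenuItem : public EventWidget {
public:
    explicit MenuItem(int boundKeyId) : boundKeyId_(boundKeyId) {}
    bool onClick(const Event& ev) override {
        if (ev.elementId != boundKeyId_) return false;  // not our key
        // Application-specific response, e.g. send a request or open a page.
        return true;                                    // consume the event
    }
private:
    int boundKeyId_;
};
```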
The interactive control unit is organized as a tree structure: an object of the EventRootPage class serves as the root node, its nodes contain interactive-control child nodes such as EventPage and MenuItem, all inheriting from the EventWidget class, and those child nodes in turn contain interactive-control children of their own. The organization is shown schematically in FIG. 7.
EventRootPage distributes the events it obtains from the EventManager event manager unit to its child controls. Each user interface contains many interactive controls, so when several controls need to respond to the events sent by some interactive element, it must be determined which interactive control is allowed to execute the event response. If a child control's specified event response function executes and returns TRUE, the event is not to be responded to by any other control, and the response of the event is complete. If a child control's event response function executes but returns FALSE, other controls may still process the event. If no child control responds to the event, or every response function that does respond returns FALSE, the event response function specified by the control itself is executed.
Given this organization and processing principle, a parent control can determine which interactive control responds to an event simply by showing or hiding, enabling or disabling its child controls; this effectively reduces the complexity of the system and makes the event processing control logic clearer.
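As a brief illustration, reusing the Widget sketch from embodiment 1: because respond() skips invisible or disabled subtrees, a parent can route the shared hardware keys to one page or another purely by toggling visibility. The function and page names are hypothetical.

```cpp
// The visible page receives the key and knob events; the hidden page is
// skipped by respond(), so no per-event routing logic is needed.
void switchPage(Widget& routePage, Widget& planPage, bool showRoute) {
    routePage.visible = showRoute;
    planPage.visible  = !showRoute;
}
```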
The units and modules described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over several network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment's solution.
In addition, the units in each embodiment of the invention (for example, each functional unit, processor, memory and the like) may all be integrated into one unit, each unit may exist separately, or two or more units may be integrated into one unit; an integrated unit may be realized in the form of hardware, or in the form of hardware plus a software functional unit.
Those skilled in the art will understand that all or part of the steps of the method embodiments may be implemented by hardware controlled by program instructions; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage media include removable memory devices, read-only memory (ROM), magnetic disks, optical disks and other media that can store program code.
When an integrated unit of the invention is implemented in the form of a software functional unit and sold or used as an independent product, it may also be stored in a computer-readable storage medium. With this understanding, the technical solutions of the embodiments of the invention, or the part that contributes to the prior art, may essentially be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server or a network device) to execute all or part of the methods described in the embodiments. The aforementioned storage media include removable storage devices, ROM, magnetic or optical disks, and other media that can store program code.
The foregoing is merely a detailed description of specific embodiments of the invention and is not intended to limit it; various alterations, modifications and improvements may be made by those skilled in the art without departing from the spirit and scope of the invention.

Claims (10)

1. A man-machine interaction event processing method for a synthetic vision system, characterized by comprising the following steps:
S1, the multifunction display sends raw message data;
S2, the raw message data is converted into to-be-switched state data;
S3, if the to-be-switched state data differs from the current state data, an event is generated and stored;
S4, the event is distributed to a graphical-interface interactive control of the multifunction display; the control executes the callback function corresponding to the event and responds to the event;
S5, the responded event is cleared from the event cache.
2. The method for processing man-machine interaction events of a synthetic vision system according to claim 1, wherein the raw message data in step S1 comprises a data header, a data length, a cycle counter, state information and a checksum.
3. The method for processing man-machine interaction events of a synthetic vision system according to claim 2, wherein step S2 specifically comprises extracting the state information of the interactive elements from the state information field of the raw message data, this state information being the to-be-switched state data corresponding to the raw message data.
4. The method for processing man-machine interaction events of a synthetic vision system according to claim 3, wherein the events in step S3 include, but are not limited to, press, release, left-turn, right-turn, touch-start, touch-in-progress and touch-end events.
5. The method for processing man-machine interaction events of a synthetic vision system according to claim 1, wherein step S4 comprises the following steps:
S41, traversing the event cache;
S42, reading out an event;
S43, looking up the callback function in the event callback-function mapping table according to the event;
S44, if the callback function is not empty, extracting and executing it;
S45, judging whether all events in the event cache have been traversed; if so, ending; otherwise returning to step S41 and executing S41-S45 in a loop.
6. The method for processing man-machine interaction events of a synthetic vision system according to claim 5, wherein in step S4 the graphical-interface interactive controls are organized in a tree structure as a graphical interface system comprising a root-node control, child controls and leaf-node controls.
7. The method for processing man-machine interaction events of a synthetic vision system according to claim 6, wherein executing the callback function in step S44 specifically comprises the following steps:
S441, finding the root-node control of the graphical-interface interactive control corresponding to the callback function;
S442, starting from the root-node control, judging level by level whether each control in the tree is visible and available, and, when a control is visible and available, judging whether its child controls at the next level are visible and available;
S443, traversing each level of the tree until the graphical-interface interactive control corresponding to the callback function is found; that control executes the callback function and returns the execution parameters.
8. A man-machine interaction event processing system for a synthetic vision system, characterized by comprising an event processor unit, an event manager unit, an event cache unit and an interactive control unit, wherein:
the event processor unit is used for receiving the raw message data sent by the multifunction display and converting it into the to-be-switched state data of the interactive elements; it is also used for generating an event when the to-be-switched state data changes relative to the current state data, and storing the event in the event cache unit;
the event manager unit is used for managing the events in the event cache, including distributing unresponded events and clearing responded events; it provides the interactive control unit with interfaces for registering and removing event response functions, and pre-stores the mapping among events, interactive controls and interactive-control event response functions;
the event cache unit is used for storing events, so that event generation, distribution and removal can run in different threads;
the interactive control unit is used for receiving the events distributed by the event manager unit, so that each generated event is responded to in the interactive software according to the corresponding interactive-control event response function.
9. The man-machine interaction event processing system for a synthetic vision system according to claim 8, wherein the controls in the interactive control unit all define their data format according to the EventWidget class.
10. The man-machine interaction event processing system for a synthetic vision system according to claim 9, wherein the controls in the interactive control unit adopt a tree-shaped organization, and the EventWidget subclass of each control includes EventPage, MenuItem or SpinBox.
CN202011467717.9A 2020-12-14 2020-12-14 Processing method and system for man-machine interaction event of synthetic vision system Active CN112578984B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202011467717.9A | 2020-12-14 | 2020-12-14 | Processing method and system for man-machine interaction event of synthetic vision system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202011467717.9A | 2020-12-14 | 2020-12-14 | Processing method and system for man-machine interaction event of synthetic vision system

Publications (2)

Publication Number | Publication Date
CN112578984A | 2021-03-30
CN112578984B | 2022-11-29

Family

ID=75134836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011467717.9A Active CN112578984B (en) 2020-12-14 2020-12-14 Processing method and system for man-machine interaction event of synthetic vision system

Country Status (1)

Country Link
CN (1) CN112578984B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334728A (en) * 2008-07-28 2008-12-31 北京航空航天大学 Interface creating method and platform based on XML document description
US20170075558A1 (en) * 2015-09-15 2017-03-16 Rockwell Collins, Inc. Large Display Format Touch Gesture Interface
CN107391276A (en) * 2017-07-05 2017-11-24 腾讯科技(深圳)有限公司 Distributed monitor method, interception control device and system
CN108153600A (en) * 2017-12-26 2018-06-12 深圳Tcl数字技术有限公司 A kind of panel button response method, television equipment and computer readable storage medium
CN109901916A (en) * 2019-02-26 2019-06-18 北京小米移动软件有限公司 The call back function of event executes method, apparatus, storage medium and mobile terminal
CN110674025A (en) * 2018-07-03 2020-01-10 百度在线网络技术(北京)有限公司 Interactive behavior monitoring method and device and computer equipment
CN111210516A (en) * 2019-12-30 2020-05-29 成都赫尔墨斯科技股份有限公司 Software platform for comprehensive display control of avionics equipment
CN111232232A (en) * 2019-12-30 2020-06-05 成都赫尔墨斯科技股份有限公司 Device and method for comprehensive display control of avionics equipment

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334728A (en) * 2008-07-28 2008-12-31 北京航空航天大学 Interface creating method and platform based on XML document description
US20170075558A1 (en) * 2015-09-15 2017-03-16 Rockwell Collins, Inc. Large Display Format Touch Gesture Interface
CN106527676A (en) * 2015-09-15 2017-03-22 罗克韦尔柯林斯公司 Large display format touch gesture interface
CN107391276A (en) * 2017-07-05 2017-11-24 腾讯科技(深圳)有限公司 Distributed monitor method, interception control device and system
CN108153600A (en) * 2017-12-26 2018-06-12 深圳Tcl数字技术有限公司 A kind of panel button response method, television equipment and computer readable storage medium
CN110674025A (en) * 2018-07-03 2020-01-10 百度在线网络技术(北京)有限公司 Interactive behavior monitoring method and device and computer equipment
CN109901916A (en) * 2019-02-26 2019-06-18 北京小米移动软件有限公司 The call back function of event executes method, apparatus, storage medium and mobile terminal
CN111210516A (en) * 2019-12-30 2020-05-29 成都赫尔墨斯科技股份有限公司 Software platform for comprehensive display control of avionics equipment
CN111232232A (en) * 2019-12-30 2020-06-05 成都赫尔墨斯科技股份有限公司 Device and method for comprehensive display control of avionics equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wu Aihua (吴爱华): "Computer Fundamentals and Computational Thinking" (《计算机基础与计算思维》), 31 August 2018 *
Lao Meng (老孟): "Getting Started with Flutter in Practice" (《Flutter实战入门》), 30 June 2020 *

Also Published As

Publication number Publication date
CN112578984B (en) 2022-11-29

Similar Documents

Publication Publication Date Title
US11659020B2 (en) Method and system for real-time modeling of communication, virtualization and transaction execution related topological aspects of monitored software applications and hardware entities
US5557723A (en) Method and system for customizing forms in an electronic mail system
US7051273B1 (en) Customizing forms in an electronic mail system utilizing custom field behaviors and user defined operations
KR100965708B1 (en) System and method for providing access to user interface information
US9542241B2 (en) Navigation application interface
US7644367B2 (en) User interface automation framework classes and interfaces
CN112051993B (en) Method, device, medium and equipment for generating state machine template and processing task
JPH09297697A (en) Three-dimensional real-time monitoring system and method for process attribute
WO2023093414A1 (en) Micro-application development method and apparatus, and device, storage medium and program product
US20090193363A1 (en) Representing Multiple Computing Resources Within A Predefined Region Of A Graphical User Interface For Displaying A Single Icon
US20150160835A1 (en) Pluggable Layouts for Data Visualization Components
CN111580912A (en) Display method and storage medium for multi-level structure resource group
US6839723B2 (en) Information management system
CN112578984B (en) Processing method and system for man-machine interaction event of synthetic vision system
EP0479785A1 (en) Method for building a hypermedia information management tool
WO2023193633A1 (en) Image analysis methods and apparatuses, computer device and storage medium
CN112581589A (en) View list layout method, device, equipment and storage medium
US20180143747A1 (en) User interface device and method for displaying screen of user interface device
US7936356B2 (en) Information processing method for information registration, and information processing method for information retrieval
US20100077288A1 (en) Displaying a Form
US11543945B1 (en) Accurate local depiction of preview of a program window included in a remote graphical desktop
CN112929717B (en) Focus management method and display device
CN110933455B (en) Video screening method and device, electronic equipment and storage medium
CN111913711A (en) Video rendering method and device
JP2007041962A (en) Menu display device, menu display method and menu display program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant