CN110658963A - Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface - Google Patents


Info

Publication number
CN110658963A
CN110658963A (application CN201910943241.2A)
Authority
CN
China
Prior art keywords
screen
screens
framework
interaction
class
Prior art date
Legal status
Granted
Application number
CN201910943241.2A
Other languages
Chinese (zh)
Other versions
CN110658963B (en)
Inventor
Wang Feng (王丰)
Current Assignee
Shanghai Fengyuanxing Technology Co Ltd
Original Assignee
Shanghai Fengyuanxing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Fengyuanxing Technology Co Ltd
Priority to CN201910943241.2A
Publication of CN110658963A
Application granted
Publication of CN110658963B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02: Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R11/0264: Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for control means

Abstract

The multi-screen interaction system, device, medium, and interactive system of the human-computer interaction interface comprise: a UI framework module for constructing, through a plurality of classes, a UI framework system for mixed 2D/3D use within each screen; a multi-screen application module for implementing the application of the UI framework system on multiple screens; and a multi-screen interaction module for realizing interaction among multiple screens applying the UI framework system. The present application simplifies and lightens the implementation of multi-screen interaction in human-computer interfaces, giving product designers ample room for imagination. This strongly enhances the market competitiveness of automotive electronics, creates value for vehicle manufacturers, and satisfies consumers' aesthetic requirements. Moreover, the framework technology and implementation steps reduce the difficulty of picture production, improve efficiency, and save labor costs.

Description

Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface
Technical Field
The application relates to the technical field of software development, in particular to a multi-screen interaction system, equipment, a medium and an interaction system of a human-computer interaction interface.
Background
In traditional implementations of in-vehicle human-computer interaction interfaces, the central control entertainment system runs on an operating system such as Android, QNX, or Linux; that operating system drives one display output terminal and hosts a human-computer interaction interface program. The liquid crystal instrument system runs on an operating system such as Linux or QNX, which drives another display output terminal and hosts its own human-computer interaction interface program. The in-vehicle CAN bus serves as the bridge between the interface programs on the two systems, carrying the communication of data and service logic.
With the development of the times, the concept of the intelligent automobile cockpit has become increasingly popular in recent years. From the user's perspective, automobiles carry more and more display output terminals, and these terminals keep growing larger. The screens may be deployed on separate display devices or combined arbitrarily into different areas of the same display; for example, a liquid crystal meter screen and a central control entertainment screen may be merged into one large display, with meter content shown on the left and entertainment content on the right. In addition, the content in each screen is not an independent, unrelated picture presentation; various interactive relationships exist among related pictures. For example, when automobile alarm content is displayed, a reminder image and a brief text appear on the instrument screen while more detailed text content appears on the central control entertainment screen. As another example, local picture content on the central control screen can be pulled to the instrument screen by user operation; during the operation the content appears on both screens simultaneously, presenting a one-out, one-in transition animation effect. From the internal hardware architecture perspective, two or more operating systems run on a single CPU chip via hypervisor technology: content on the central control entertainment screen and the front-passenger entertainment screen is generally displayed by an Android system, while content on the liquid crystal instrument screen, head-up display screen, and virtual control screen is displayed by a QNX/Linux system.
From the software implementation perspective, the current industry practice, or the only practice possible given the technical limitations of existing implementation frameworks, is that the content on one display screen is implemented by at least one human-computer interaction interface program (process). For example, to improve display efficiency, the content of a liquid crystal instrument screen is drawn with a layered drawing technique, and each picture layer must be driven by its own interface program. Consequently, if an intelligent cockpit project requires n screens, it needs n or more human-computer interaction interface programs. The inter-picture logic and functional service logic between these programs are implemented through inter-process communication or cross-operating-system communication, which makes implementation difficult, produces overly complex program code, and makes it hard to guarantee the temporal consistency of transition animations across screens. Moreover, in the current state of the art, inter-screen interaction logic can only be realized by writing program code.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, the present application is directed to a multi-screen interactive system, device, medium, and interactive system with human-computer interactive interface, which solves at least one problem in the prior art.
In order to achieve the above and other related objects, the present application provides a multi-screen interaction system of a human-computer interaction interface, the system comprising: a UI framework module for constructing, through a plurality of classes, a UI framework system for mixed 2D/3D use within each screen; a multi-screen application module for implementing the application of the UI framework system on multiple screens; and a multi-screen interaction module for realizing interaction among multiple screens applying the UI framework system.
In one or more embodiments of the present application, the classes include any one or more of the following: a parent class of all 2D/3D UI elements; an encapsulation class of the camera object in the 3D framework system; an encapsulation class of the light source object in the 3D framework system; a parent class of physical node objects in the 3D framework system; a parent class of virtual node objects in the 3D framework system; an encapsulation class of physical node objects in the 3D framework system; an encapsulation class of virtual node objects in the 3D framework system; an encapsulation class of object surface objects in the 3D framework system; an encapsulation class of a 3D scene in the 3D framework system; a basic parent class of all controls in the 2D framework system; a basic parent class of container controls with a scrolling characteristic in the 2D framework system; 2D controls with various characteristics in the 2D framework system; and a basic parent class encapsulating a 2D scene in the 2D framework system.
In one or more embodiments of the present application, the screen includes: an independent screen corresponding to the entire display area of a display; and/or tile screens, each corresponding to a certain display area of a display, where multiple tile screens may be superimposed within one display.
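To make the independent-screen/tile-screen distinction concrete, the following is a minimal, hypothetical Python sketch; all class, attribute, and value names are illustrative and not taken from the patent:

```python
class Screen:
    """A logical screen: either a whole display or a rectangular tile of it."""
    def __init__(self, name, x, y, width, height):
        self.name = name
        self.x, self.y = x, y                     # position inside the display
        self.width, self.height = width, height

class Display:
    """A physical display hosting one independent screen or several tile screens."""
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.screens = []                         # tiles may overlap (superimpose)

    def add_screen(self, screen):
        self.screens.append(screen)
        return screen

# One large display split into a meter tile (left) and an entertainment tile (right),
# as in the merged-display example from the Background section.
display = Display(2880, 1080)
meter = display.add_screen(Screen("meter", 0, 0, 1440, 1080))
entertainment = display.add_screen(Screen("entertainment", 1440, 0, 1440, 1080))
```

An independent screen would simply be a single `Screen` covering the whole display area.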
In one or more embodiments of the present application, the interaction between the screens includes any one or more of the following: interaction between multiple tile screens within one display; interaction between independent screens corresponding to multiple displays; and interaction between an independent screen corresponding to one or more displays and a tile screen included in one or more displays.
In one or more embodiments of the present application, the displays corresponding to the screens all run the same operating system, or they run different operating systems.
In one or more embodiments of the present application, the multi-screen application module includes: a resource storage management system for driving the UI framework system; an input framework system for implementing any one or more of screen touch input recognition, behavior refinement and distribution control, gesture input recognition and control, and voice input recognition and control; the UI framework system, which detects changes in each UI control object and drives each UI control object; and a rendering framework system for converting the UI control object tree into a render tree and drawing on the single rendering window associated with each screen.
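The control-tree-to-render-tree conversion performed by the rendering framework can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the `Control` class and `build_render_tree` function are hypothetical names:

```python
class Control:
    """Minimal UI control node with a visibility flag and child controls."""
    def __init__(self, name, visible=True):
        self.name, self.visible = name, visible
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

def build_render_tree(control):
    """Convert a UI control tree into a render tree, pruning invisible subtrees
    so that only drawable content reaches the screen's rendering window."""
    if not control.visible:
        return None
    node = {"name": control.name, "children": []}
    for child in control.children:
        rendered = build_render_tree(child)
        if rendered is not None:
            node["children"].append(rendered)
    return node

root = Control("page")
root.add(Control("speedometer"))
root.add(Control("debug_overlay", visible=False))  # pruned from the render tree
render_tree = build_render_tree(root)
```

A real rendering framework would carry geometry, layering, and dirty-region information in each render node; the pruning structure is the point here.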
In one or more embodiments of the present disclosure, the multi-screen interaction module includes: the method comprises the steps of constructing a first frame system for linkage migration among pictures in each screen, constructing a second frame system for flow of UI elements among the screens, and constructing a third frame system for multi-screen interaction among the screens running on different operating systems.
In one or more embodiments of the present application, the first framework system implements the linked migration between pictures within each screen by constructing migration triggers with different definitions, where a migration trigger is defined by any one or combination of: events corresponding to each 2D/3D control, system hard-key events, system logic events, voice control command events, and external gesture events.
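A migration trigger of this kind can be sketched as a small event-matching object. This is a hypothetical Python illustration of the concept; the names `MigrationTrigger` and the event-type strings are not from the patent:

```python
class MigrationTrigger:
    """Fires a picture migration when any of its configured event types occurs."""
    def __init__(self, event_types, migrate):
        self.event_types = set(event_types)
        self.migrate = migrate              # callback performing the migration

    def on_event(self, event_type):
        if event_type in self.event_types:
            self.migrate(event_type)
            return True
        return False

log = []
# A trigger defined as a combination of a hard-key event, a voice command
# event, and an external gesture event.
trigger = MigrationTrigger(
    {"hard_key", "voice_command", "gesture"},
    lambda ev: log.append(f"migrated by {ev}"),
)
trigger.on_event("touch")           # not in the definition: ignored
trigger.on_event("voice_command")   # in the definition: migration fires
```

Because the trigger is pure configuration (a set of event types plus a target), it can be authored in a visual tool rather than hand-written code, which matches the code-elimination goal stated later in the description.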
In one or more embodiments of the present application, the second framework system defines a data attribute and establishes binding relationships between it and the position attributes of a UI element added to different screens, defines a segment of numeric animation on the bound value, and sets an animation trigger; once triggered, the synchronization mechanism of the binding relationships makes the UI element flow between the screens. The UI element includes any one or more of: pictures, text, controls, partial picture content, and full-screen picture content.
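The binding-plus-numeric-animation mechanism can be sketched as follows: a single data attribute is bound to the element's x position in two screens, so animating the attribute from 0 to 1 slides the element out of screen A and into screen B in lockstep. This is an illustrative Python sketch under assumed names (`DataAttribute`, `animate`, a 1440-pixel screen width), not the patent's API:

```python
class DataAttribute:
    """A value whose changes are pushed to every bound position attribute."""
    def __init__(self, value=0.0):
        self.value = value
        self.bindings = []                  # callbacks: new value -> None

    def bind(self, callback):
        self.bindings.append(callback)
        callback(self.value)                # initial synchronization

    def set(self, value):
        self.value = value
        for callback in self.bindings:      # synchronization mechanism
            callback(value)

SCREEN_WIDTH = 1440
positions = {"A": 0, "B": -SCREEN_WIDTH}

t = DataAttribute(0.0)
# In screen A the element slides out to the right; in screen B it slides in
# from the left.  Both positions are driven by the one bound data attribute,
# so the element appears on both screens simultaneously mid-animation.
t.bind(lambda v: positions.__setitem__("A", int(v * SCREEN_WIDTH)))
t.bind(lambda v: positions.__setitem__("B", int((v - 1.0) * SCREEN_WIDTH)))

def animate(attr, steps=10):
    """A simple segment of numeric animation over the bound data attribute."""
    for i in range(1, steps + 1):
        attr.set(i / steps)

animate(t)
```

At the halfway point the element straddles both screens, producing the one-out, one-in transition effect described in the Background section.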
In one or more embodiments of the present application, the displays corresponding to the screens all run the same operating system, or they run different operating systems; the third framework system transmits interaction logic data among the screens through communication mechanisms based on the different operating systems.
In one or more embodiments of the present disclosure, the multi-screen interaction module includes: the system comprises a first framework system, a second framework system and a third framework system, wherein the first framework system is used for constructing linkage migration among all pictures in all screens, the second framework system is used for constructing flow of UI elements among all screens, and the third framework system is used for constructing multi-screen interaction among all screens running on different operating systems; the third framework system comprises a plurality of APA interfaces.
To achieve the above and other related objects, the present application provides a computer device, including: one or more memories and one or more processors, the one or more memories storing a computer program and the one or more processors running the computer program to perform the functions of the multi-screen interaction system of the human-computer interaction interface.
To achieve the above and other related objects, the present application provides a computer storage medium storing a computer program which, when executed, performs the functions of the multi-screen interaction system of the human-computer interaction interface.
To achieve the above and other related objects, the present application provides an interactive system including one or more displays for applying the functions of the multi-screen interaction system of the human-computer interaction interface, wherein the displays include any one or more of: a liquid crystal instrument screen, a head-up display screen, a virtual control screen, a central control entertainment screen, a front-passenger entertainment screen, a rear-seat entertainment screen, a desktop computer, a notebook computer, a smartphone, a tablet computer, a smart watch, a smart band, and smart glasses.
As described above, the multi-screen interaction system, device, medium, and interactive system of the human-computer interaction interface of the present application comprise: a UI framework module for constructing, through a plurality of classes, a UI framework system for mixed 2D/3D use within each screen; a multi-screen application module for implementing the application of the UI framework system on multiple screens; and a multi-screen interaction module for realizing interaction among multiple screens applying the UI framework system.
The present application simplifies and lightens the implementation of multi-screen interaction in human-computer interfaces, giving product designers ample room for imagination. This strongly enhances the market competitiveness of automotive electronics, creates value for vehicle manufacturers, and satisfies consumers' aesthetic requirements. Moreover, the framework technology and implementation steps reduce the difficulty of picture production, improve efficiency, and save labor costs.
Drawings
Fig. 1 is a schematic block diagram of a multi-screen interaction system of a human-computer interaction interface according to an embodiment of the present disclosure.
Fig. 2 is a schematic block diagram of a UI framework system in an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a picture management framework system according to an embodiment of the present application.
Fig. 4 is a schematic view illustrating a scene of a multi-screen application module according to an embodiment of the present application.
Fig. 5 is a schematic view of a first framework system in an embodiment of the present application.
Fig. 6 is a schematic view of a second framework system in an embodiment of the present application.
Fig. 7 is a schematic view of a third framework system in the embodiment of the present application.
Fig. 8 is a schematic structural diagram of a computer device in an embodiment of the present application.
Fig. 9 is a schematic structural diagram of an interactive system in an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present application. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings so that those skilled in the art to which the present application pertains can easily carry out the present application. The present application may be embodied in many different forms and is not limited to the embodiments described herein.
In order to clearly explain the present application, circuit components not related to the description are omitted, and the same or similar components are denoted by the same reference numerals throughout the specification.
Throughout the specification, when a circuit part is referred to as being "connected" to another circuit part, this includes not only the case of being "directly connected" but also the case of being "indirectly connected" with other elements interposed in between. In addition, when a circuit component is said to "include" a certain constituent element, unless otherwise stated, this means that other constituent elements may also be included rather than excluded.
When a circuit element is said to be "on" another circuit element, it may be directly on the other circuit element, or other circuit elements may be present in between. When a circuit component is said to be "directly on" another circuit component, there are no other circuit components in between.
Although the terms first, second, etc. may be used herein to describe various elements in some instances, these elements should not be limited by these terms. These terms are only used to distinguish one element from another; for example, a first interface and a second interface. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition will occur only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" include plural forms as long as the words do not expressly indicate a contrary meaning. The term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of other features, regions, integers, steps, operations, elements, and/or components.
Terms representing relative spatial relationships, such as "lower" and "upper", may be used to more readily describe one circuit component's relationship to another circuit component as illustrated in the figures. Such terms are intended to include not only the meanings indicated in the drawings but also other meanings or orientations of the device in use. For example, if the device in the figures is turned over, circuit elements described as "below" other circuit elements would then be described as "above" them. Thus, the exemplary terms "under" and "beneath" can encompass both above and below. The device may be rotated 90 degrees or by other angles, and the terminology representing relative space is to be interpreted accordingly.
Aiming at these pain points of multi-screen interaction software, the present invention constructs a systematic implementation method based on a newly designed 2D/3D human-computer interaction interface framework, drives the content of multiple display screens simultaneously with a single human-computer interaction interface program (process), and implements the human-computer interaction interface on the technical principle of eliminating hand-written code.
It should be noted that the applicable scenarios of the multi-screen interaction system, device, medium, and interactive system of the human-computer interaction interface described in the present application are not limited to human-computer interaction in automobiles; they also include, but are not limited to, scenarios involving multiple human-computer interaction devices with interaction requirements or applications in settings such as laboratories, scientific research, teaching, shopping malls, and exhibitions.
As shown in fig. 1, a schematic block diagram of a multi-screen interaction system of a human-computer interaction interface in the embodiment of the present application is shown.
Optionally, the system may be applied within a framework for implementing human-computer interaction interfaces, or within a (possibly visual) authoring tool for human-computer interaction interfaces.
The system 100 includes: the UI framework module 110, the multi-screen application module 120, and the multi-screen interaction module 130.
UI framework module 110
In one or more embodiments of the present application, the UI framework module 110 is configured to construct a UI framework system applied to 2D/3D blending in each screen through a plurality of classes.
Generally speaking, a class is a construct in an object-oriented programming language: a blueprint for creating objects that describes the properties and methods common to the objects created from it.
A more rigorous definition of a class is a cohesive package consisting of specific metadata. It describes the behavior rules of certain objects, which are called instances of the class. A class has an interface and a structure: the interface describes how to interoperate with the class and its instances through methods, while the structure describes how the data in an instance is divided into attributes. A class is the most specific type of an object in relation to a specific layer. Classes may also have runtime representations (meta-objects), which provide runtime support for manipulating the metadata associated with the class.
Programming languages that support classes differ subtly in how fully they support class-related features. Most support some form of class inheritance. Many also provide encapsulation features such as access modifiers. Classes provide the means of implementing the three most important properties of object-oriented programming: encapsulation, inheritance, and polymorphism.
In one or more embodiments of the present application, the classes include, but are not limited to, any one or more of the following: a parent class of all 2D/3D UI elements; an encapsulation class of the camera object in the 3D framework system; an encapsulation class of the light source object in the 3D framework system; a parent class of physical node objects in the 3D framework system; a parent class of virtual node objects in the 3D framework system; an encapsulation class of physical node objects in the 3D framework system; an encapsulation class of virtual node objects in the 3D framework system; an encapsulation class of object surface objects in the 3D framework system; an encapsulation class of a 3D scene in the 3D framework system; a basic parent class of all controls in the 2D framework system; a basic parent class of container controls with a scrolling characteristic in the 2D framework system; 2D controls with various characteristics in the 2D framework system; and a basic parent class encapsulating a 2D scene in the 2D framework system.
In the present embodiment, reference may be made to the block diagram of the UI framework system shown in fig. 2. It should be noted that the class names are merely naming choices; what matters is the meaning of each class and the inheritance relationships between them. For drawing convenience, common English abbreviations used in software programs appear in the figure in place of full names; their specific meanings are as follows:
UIElement: is a parent class of all 2D/3D UI elements, defines the properties and methods of commonality of all UI elements, and acts as a bridge between the UI control system and the resource storage management system.
C3 DCamera: encapsulation class of camera objects in a 3D framework system.
C3DLight: the encapsulation class of a light-source object in the 3D framework system.
C3DNode: the parent class of node objects (entity or virtual nodes) in the 3D framework system.
C3DGroup: the encapsulation class of a virtual node object in the 3D framework system; it has container semantics and may hold C3DNode child nodes.
C3DEntity: the encapsulation class of an entity node object in the 3D framework system; it represents a concrete object model and is composed of C3DSubEntity objects.
C3DSubEntity: the encapsulation class of an object-surface object in the 3D framework system.
C3DDocScene: the encapsulation class of a 3D scene in the 3D framework system; both the static structure and the dynamically changing data of the scene can be serialized into the resource storage management system.
C2DComponent: the base parent class of all controls in the 2D framework system.
C2DContainer: the base parent class of container controls in the 2D framework system; it may hold child controls.
C2DScrollContainer: the base parent class of container controls with scrolling behavior in the 2D framework system.
C2DCtrlXxx: concrete 2D controls encapsulating different presentation styles and behavior logic. A control that inherits directly from C2DComponent is an atomic control, such as CCtrlImage or CCtrlText; one that inherits from C2DContainer is a simple composite control, such as CCtrlButton; one that inherits from C2DScrollContainer is a composite control with scrolling capability, such as CCtrlList.
C2DDocContainer: the base parent class encapsulating a 2D scene in the 2D framework system; both the static structure and the dynamically changing data of the scene can be serialized into the resource storage management system. Its internal structure and presentation are designed in the visual interface authoring tool associated with the framework.
In addition, C2DDocWidget: a reusable component object, which can be used inside other C2DDocWidgets, C2DDocPopups and C2DDocPages.
C2DDocPopup: a popup object that floats over some local region of the screen.
C2DDocPage: a page object within the screen, generally the same size as the screen.
CMainFrame: the screen form object, which is the root container of all elements on the screen.
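The inheritance rules above (atomic controls derive from C2DComponent, composite controls from C2DContainer, scrolling composites from C2DScrollContainer) can be sketched in C++. This is only an illustrative sketch; the class names follow the document, but every member, method and the string-based `kind()` tag are assumptions for demonstration.

```cpp
#include <memory>
#include <string>
#include <vector>

// Hypothetical sketch of the 2D control hierarchy described above.
class C2DComponent {                        // base parent of all 2D controls
public:
    virtual ~C2DComponent() = default;
    virtual std::string kind() const { return "component"; }
};

class C2DContainer : public C2DComponent {  // may hold child controls
public:
    void addChild(std::unique_ptr<C2DComponent> c) {
        children_.push_back(std::move(c));
    }
    std::size_t childCount() const { return children_.size(); }
    std::string kind() const override { return "container"; }
private:
    std::vector<std::unique_ptr<C2DComponent>> children_;
};

class C2DScrollContainer : public C2DContainer {  // container with scrolling
public:
    std::string kind() const override { return "scroll-container"; }
};

// Atomic control: inherits directly from C2DComponent.
class CCtrlText : public C2DComponent {
public:
    std::string kind() const override { return "text"; }
};

// Composite control with scrolling capability: inherits from C2DScrollContainer.
class CCtrlList : public C2DScrollContainer {
public:
    std::string kind() const override { return "list"; }
};
```

A CCtrlList can therefore hold atomic controls such as CCtrlText as children while still being usable wherever a C2DComponent is expected.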
In the present application, the meaning of "screen" needs to be clarified. A screen may be: an independent screen corresponding to the entire display area of a display; and/or a tile screen corresponding to a certain display area of a display, where different tile screens may be superimposed within one display.
In other words, one display does not necessarily carry only a single screen; it may host multiple virtual screens, with overlapping relationships, migration relationships and so on between them.
In the present application, in order to manage and operate the tile screens, a corresponding picture management system is constructed; see the picture management framework shown in fig. 3. As shown in the figure, the picture management framework comprises: a background layer (Background) for the whole screen, and a number of stack management layers that manage the stacks under C2DDocContainer in the embodiment of fig. 2.
For example, the page stack (Page Stack) manages the C2DDocPage stack of the embodiment of fig. 2; the overlay stack (Overlay Stack) manages the C2DDocWidget stack; and the popup stack (Popup Stack) manages the C2DDocPopup stack.
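The per-screen layering above can be sketched as a root frame holding the three stacks over a background layer. A minimal sketch, assuming string page identifiers and method names that are not in the document:

```cpp
#include <stack>
#include <string>

// Hypothetical sketch of the picture management layers of one screen:
// a page stack, an overlay stack and a popup stack under the root frame.
struct CMainFrame {
    std::stack<std::string> pageStack;     // manages C2DDocPage objects
    std::stack<std::string> overlayStack;  // manages C2DDocWidget objects
    std::stack<std::string> popupStack;    // manages C2DDocPopup objects

    void pushPage(const std::string& page) { pageStack.push(page); }
    void popPage() {
        if (!pageStack.empty()) pageStack.pop();
    }
    // When every stack is empty, only the background layer is visible.
    std::string currentPage() const {
        return pageStack.empty() ? "<background>" : pageStack.top();
    }
};
```

Pushing and popping pages on this stack is the mechanism the migration triggers described later drive to change the picture in a screen.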
Multi-screen application module 120
In one or more embodiments of the present application, the multi-screen application module 120 is configured to drive one or more screens applying the UI framework system.
Similarly, the screen herein includes both an independent screen corresponding to an entire display area on the display and a tile screen corresponding to a certain display area on the display, and different tile screens may be superimposed on one of the displays.
In this application, after the UI framework system has been constructed by the UI framework module 110, the multi-screen application module 120 is further provided in the system 100, so that the main control terminal of the multi-screen interaction system 100 (the host that loads the human-computer interaction interface of this application) can run the UI framework system on different displays, that is, to achieve a quick and simple multi-screen application; the system architecture built inside this module provides the operating environment of the UI framework system.
In one or more embodiments of the present application, the multi-screen application module 120 includes:
a resource storage management system 121 for driving the UI framework system;
an input framework system 122 for implementing any one or more of screen touch input recognition with behavior refinement and distribution control, gesture input recognition and control, and voice input recognition and control;
a UI framework system 123 for detecting changes of each UI control object therein and driving each UI control object, wherein the UI framework system 123 is the one constructed by the UI framework module 110;
a rendering framework system 124 for converting the UI control object tree into a rendering tree and drawing on the unique rendering window associated with each screen;
and a rendering window system 125 for loading the rendering window.
Fig. 4 shows a scene diagram of the multi-screen application module 120. As shown, the input framework system 122, the UI framework system 123, the rendering framework system 124 and the rendering window system 125 provided by the multi-screen application module 120 can be instantiated for a plurality of screen systems (hardware systems), so that each screen system has its own independent input framework system 122, UI framework system 123, rendering framework system 124 and rendering window system 125. Further, with the aid of the resource storage management system 121, serialization and deserialization of the data in the UI framework system 123 of each screen system can be supported uniformly.
In particular, the input framework system 122 contains the basic logic for screen touch input recognition, behavior refinement and distribution control, and also includes logic for the recognition and control of gesture inputs and even voice inputs.
The UI framework system 123 controls the change of each UI control object and drives its logic behavior. It is the 2D/3D hybrid UI framework system constructed by the UI framework module 110 through the classes above and applied to each screen. By means of the resource storage management system 121, the data in the UI framework system 123 of each screen system can be managed.
The rendering framework system 124 is configured to convert the UI control object tree into a rendering tree and draw on the unique rendering window associated with each screen system.
The rendering window is a Window in the Window system on Windows OS; a Surface in SurfaceFlinger on Android OS; a Screen in the Screen system on QNX OS; and a window in the X11 system or a surface in the Wayland system on Linux. The rendering window system 125 is configured to load this rendering window.
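The per-OS mapping above can be expressed as a small platform abstraction. The sketch below is purely illustrative: the enum, function name and returned description strings are assumptions, and a real implementation would return native handles (HWND, ANativeWindow, and so on) rather than strings.

```cpp
#include <string>

// Hypothetical sketch: the rendering-window abstraction maps to a
// different native object on each operating system, as listed above.
enum class Platform { Windows, Android, QNX, LinuxX11, LinuxWayland };

std::string nativeRenderWindowKind(Platform p) {
    switch (p) {
        case Platform::Windows:      return "Window (Window system)";
        case Platform::Android:      return "Surface (SurfaceFlinger)";
        case Platform::QNX:          return "Screen (Screen system)";
        case Platform::LinuxX11:     return "window (X11)";
        case Platform::LinuxWayland: return "surface (Wayland)";
    }
    return "unknown";
}
```

Hiding the native window type behind one abstraction is what lets the same rendering framework system draw on every screen system regardless of OS.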
Multi-screen interaction module 130
In one or more embodiments of the present application, the multi-screen interaction module 130 is configured to implement interaction between multiple screens applying the UI framework system.
It should be noted that the interaction among the screens includes any one or more of the following, that is, including but not limited to the following embodiments:
In a first embodiment, the interaction between a plurality of tile screens included in one display. In this embodiment, the display presents several pieces of display content, for example on the left and right sides, or on the upper and lower halves, of one display, and interactive operations such as screen transition and content synchronization can be realized between these contents.
In a second embodiment, the interaction between the independent screens corresponding to a plurality of displays. In this embodiment, each display presents one complete display content, and interactive operations such as screen transition and content synchronization can be implemented between these displays.
In a third embodiment, the interaction between an independent screen corresponding to one or more displays and a tile screen included in one or more displays. In this embodiment, some displays present several pieces of display content while others present one complete display content, and interactive operations such as screen transition and content synchronization can be realized across these different situations.
In one or more embodiments of the present application, the multi-screen interaction module 130 includes: a first framework system 131 for constructing linked migration between the pictures within each screen, a second framework system 132 for constructing the flow of UI elements between the screens, and a third framework system 133 for constructing multi-screen interaction between screens running on different operating systems.
In one or more embodiments of the present application, the first framework system 131 implements the linked migration between the pictures in each screen by constructing differently defined migration triggers; the definition of a migration trigger comprises any one or a combination of: events of each 2D/3D control, system hard-key events, system logic events, voice control command events and external gesture events.
In the above embodiment, a screen transition may occur within one screen or between multiple screens. In the example of fig. 5, within screen 1, migration trigger 1 changes the picture from page 1 to page 2, and migration trigger 2 directs screen 2 to display page 3; if migration trigger 1 and migration trigger 2 are identical, the same trigger mechanism will cause page changes on both screen 1 and screen 2. Similarly, in screen 2, trigger 3 causes popup 1 to be displayed in screen 1, and trigger 4 causes popup 2 to be displayed in screen 2; if trigger 3 is the same as trigger 4, the same trigger mechanism will display different popups in screen 1 and screen 2 at the same time.
The triggers fall into several categories, such as the various events of the different types of 2D/3D controls, system hard-key events, system logic events and external gesture events; a voice control command can be converted into a system logic event. The definitions of the migration triggers between pages, across screens, and between pages and popups are designed in the visual interface authoring tool associated with the framework, and support serialization and deserialization through the resource storage management system 121 of fig. 4. The first framework system 131 monitors the running state of the whole system; when a trigger of some type fires, it drives the change of the picture in each screen through the stack management mechanism of CMainFrame of the embodiment of fig. 2, according to the migration configuration information in the resource storage management system.
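The trigger dispatch just described, where one trigger can drive transitions on several screens at once, can be sketched as a registry from trigger identifiers to per-screen transitions. All class, member and method names below are assumptions; the real framework would push pages onto the CMainFrame stacks rather than overwrite a string.

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch of migration triggers: firing a trigger changes the
// current page of every screen that registered a transition for it.
struct Screen {
    std::string currentPage;
};

class CMigrationTriggerSystem {
public:
    // Bind a trigger id to a "screen -> target page" transition.
    void addTransition(const std::string& trigger, Screen* screen,
                       const std::string& targetPage) {
        transitions_[trigger].push_back({screen, targetPage});
    }
    // Fire a trigger (control event, hard key, logic event, voice, gesture).
    void fire(const std::string& trigger) {
        for (auto& t : transitions_[trigger]) t.first->currentPage = t.second;
    }
private:
    std::map<std::string,
             std::vector<std::pair<Screen*, std::string>>> transitions_;
};
```

Registering the same trigger id for two screens reproduces the fig. 5 behavior: one trigger mechanism changes the pictures on both screens simultaneously.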
In one or more embodiments of the present application, the second framework system 132 establishes a binding relationship between a defined data attribute and the position attribute of a UI element added to different screens, defines a segment of numerical animation for the data attribute, and sets an animation trigger; after the trigger fires, the flow of the UI element between the screens is realized through the synchronization mechanism of the binding relationship. The UI element comprises any one or a combination of pictures, text, controls, partial picture content and whole-screen picture content.
In this embodiment, the multi-screen interaction modes also include the flow of a UI element between screens. The UI element may be a picture, a piece of text, a control, part of the picture, or even the entire picture content. In the example shown in fig. 6, while UI element A moves from screen 1 to screen 2, part of its content is displayed on screen 1 and the other part on screen 2.
In the above embodiment, a data attribute position_x is defined in the second framework system 132 of the 2D/3D hybrid framework system. UI element A is added to screen 1, and its position attribute x is bound to the data attribute: x = position_x. UI element A is also added to screen 2, with its position attribute bound as x = position_x - screen width. In the animation-driven system, a segment of numerical animation can be defined for position_x and an animation trigger set. At runtime, when the trigger fires and drives the animation, the value of position_x changes continuously, and the changed value is synchronized to the two copies of UI element A in screen 1 and screen 2 through the synchronization mechanism of the second framework system 132, so the effect presented on the screens is that UI element A flows from screen 1 to screen 2. Based on the same principle, designers can create more diversified flow patterns of UI elements between screens. The binding relationships between UI elements and data attributes, and the driving animations designed for the data attributes, can all be created in the visual interface authoring tool associated with the framework.
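The binding mechanism above can be sketched in a few lines: one data attribute fans its value out to every bound setter, so animating position_x moves both copies of element A in lockstep. The class names and callback-based binding are assumptions; only the bindings x = position_x and x = position_x - screen width come from the document.

```cpp
#include <functional>
#include <vector>

// Hypothetical sketch of the cross-screen flow: one data attribute is
// bound to the x position of a copy of element A on each screen, with
// screen 2's copy offset by the screen width.
struct UIElement {
    double x = 0.0;
};

class DataAttribute {
public:
    void bind(std::function<void(double)> setter) {
        setters_.push_back(std::move(setter));
    }
    // Synchronization mechanism: every bound element sees the new value.
    void set(double v) {
        for (auto& s : setters_) s(v);
    }
private:
    std::vector<std::function<void(double)>> setters_;
};
```

Driving position_x from 0 to the screen width then makes element A leave screen 1 exactly as it arrives at the origin of screen 2.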
In one or more embodiments of the present application, the third framework system 133 is used for transferring interaction logic data between the screens through a communication mechanism that works across different operating systems.
It should be noted that the displays corresponding to the multiple screens may run the same operating system, or may run different operating systems.
Specifically, the third framework system 133 exposes a number of API interfaces. In one or more embodiments, these API interfaces take the parameters BYTE pbtData, UINT uiDataSize and LLONG llTimeStamp, where LLONG llTimeStamp is used to build a time synchronization mechanism, so that the linked migration of the pictures within each screen and the flow of UI elements between the screens join seamlessly.
In this embodiment, screen transitions among multiple screens and the flow of UI elements among multiple screens may also occur between human-machine interface programs running on different operating systems. In the example shown in fig. 7, human-machine interface program 1 runs on operating system 1 and drives two screens; human-machine interface program 2 runs on operating system 2 and drives one screen.
In this application scenario, a third framework system 133, such as a Data Transfer module, needs to be constructed in the multi-screen interaction system 100 of the 2D/3D human-computer interaction interface as a bridge that transfers interaction logic data between the two human-machine interface programs. Its internal implementation can be based on an inter-operating-system communication mechanism, such as the common socket-based communication principle. The two core interface APIs of this module are sendData(BYTE pbtData, UINT uiDataSize, LLONG llTimeStamp) and onDataAvailable(BYTE pbtData, UINT uiDataSize, LLONG llTimeStamp), where the llTimeStamp parameter is used to build a time synchronization mechanism so that the picture migration process and the flow of UI elements appear seamlessly joined between the two screens.
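The sendData/onDataAvailable pair can be sketched as follows. This is a sketch under stated assumptions: the class shape and callback registration are inventions for illustration, and a direct in-process callback stands in for the socket transport that would actually carry the bytes between the two operating systems; only the two API names and their parameter list come from the document.

```cpp
#include <cstdint>
#include <functional>
#include <vector>

using BYTE  = std::uint8_t;
using UINT  = std::uint32_t;
using LLONG = std::int64_t;

// Hypothetical sketch of the Data Transfer bridge: sendData() on one side
// delivers the payload and its timestamp to onDataAvailable() on the other.
// The llTimeStamp parameter carries the time-synchronization information.
class DataTransfer {
public:
    using Handler = std::function<void(const BYTE*, UINT, LLONG)>;

    // Register the receiving side's onDataAvailable handler.
    void setOnDataAvailable(Handler h) { onDataAvailable_ = std::move(h); }

    // Sending side: in a real system this would write to a socket.
    void sendData(const BYTE* pbtData, UINT uiDataSize, LLONG llTimeStamp) {
        if (onDataAvailable_) onDataAvailable_(pbtData, uiDataSize, llTimeStamp);
    }

private:
    Handler onDataAvailable_;
};
```

The receiving program compares llTimeStamp against its own clock to schedule when the migrated picture or flowing UI element should appear, so the hand-off between the two screens looks seamless.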
According to the design concept and implementation steps constructed above, the realization of multi-screen interaction for human-machine interfaces in the field of intelligent automobile cockpits finally becomes simple and lightweight and leaves product designers ample room for imagination, thereby strongly improving the market competitiveness of automotive electronic products, creating value for car manufacturers and satisfying the aesthetic demands of consumers. Based on the above framework technology and implementation steps, the difficulty of picture making is reduced, efficiency is improved, and labor cost is saved.
It should be noted that the division into modules in the system embodiment of fig. 1 is only a logical division; in an actual implementation the modules may be wholly or partially integrated into one physical entity, or may be physically separate. These modules may all be realized as software invoked by a processing element, all in hardware, or partly as software invoked by a processing element and partly in hardware. For example, the UI framework module 110 may be a separately arranged processing element, may be integrated into a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code, with a processing element of the apparatus calling and executing the functions of the UI framework module 110; the other modules are implemented similarly. In addition, all or some of these modules can be integrated together or realized independently. The processing element described here may be an integrated circuit with signal processing capability; in implementation, each step of the above method, or each of the above modules, can be completed by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. As yet another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Of course, the system may also be implemented by software combined with a hardware circuit, for example by running on the hardware circuit a computer program loaded in a computer device such as a liquid-crystal instrument panel, a head-up display panel, a virtual control panel, a central-control entertainment panel, a front-passenger entertainment panel, a rear-seat entertainment panel, a desktop computer, a notebook computer, a smartphone, a tablet computer, a smart watch, a smart band or smart glasses.
Fig. 8 is a schematic structural diagram of a computer device in the embodiment of the present application.
In this embodiment, the computer apparatus 800 includes: one or more memories 801, and one or more processors 808.
The one or more memories 801 storing computer programs;
the one or more processors 808 are configured to execute the computer program to implement the functions of a multi-screen interactive system such as the human-machine interface shown in fig. 1.
In a possible implementation, the one or more memories 801 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) and/or cache memory, and may also include one or more non-transitory computer readable storage media such as ROM, EEPROM, EPROM, flash memory devices, disks, etc., and/or combinations thereof.
In possible implementations, the one or more processors 808 can be any suitable processing element (e.g., processor core, microprocessor, ASIC, FPGA, controller, microcontroller, etc.) and can be a single processor or a plurality of operatively connected processors.
It should be noted that, in the implementations of the multi-screen interaction system of the human-computer interaction interface, the computer device, and the like in the above embodiments, all the related computer programs may be loaded on a computer-readable storage medium, which may be a tangible device that can hold and store the instructions used by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic or semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, and a mechanical coding device such as punch cards or in-groove raised structures having instructions stored thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., light pulses through a fiber-optic cable), or an electrical signal transmitted through a wire.
Fig. 9 is a schematic structural diagram of an interaction system in an embodiment of the present application. As shown, it includes one or more displays 901 to which the functions of the system of fig. 1 are applied; the display 901 comprises any one or more of a liquid-crystal instrument screen, a head-up display screen, a virtual control screen, a central-control entertainment screen, a front-passenger entertainment screen, a rear-seat entertainment screen, a desktop computer, a notebook computer, a smartphone, a tablet computer, a smart watch, a smart band and smart glasses.
In this embodiment, the displays 901 can be communicatively connected to each other through a wireless or wired network, such as any one or more of the Internet, a CAN bus, an intranet, a wide area network (WAN), a local area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), and/or any other suitable communication network, for example any one or a combination of WiFi, Bluetooth, NFC, GPRS, GSM and Ethernet.
To sum up, in the multi-screen interaction system, device, medium and interaction system of the human-machine interface of this application, the system includes: a UI framework module for constructing, through a plurality of classes, a 2D/3D hybrid UI framework system applied within each screen; a multi-screen application module for implementing the application of the UI framework system on a plurality of screens; and a multi-screen interaction module for realizing the interaction among the plurality of screens to which the UI framework system is applied.
According to the design concept and implementation steps constructed above, an end user can use each small picture according to normal logical habits of thought, setting picture-related attributes on the various UI controls, while the internal mechanism of the UI framework achieves efficient picture rendering. The process is imperceptible to the user, so the difficulty of picture making is reduced, efficiency is improved, and labor cost is saved.
The above embodiments merely illustrate the principles and effects of the present application and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present application. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed in the present application shall still be covered by the claims of the present application.

Claims (14)

1. A multi-screen interaction system of a human-computer interaction interface, wherein the system comprises:
the UI framework module is used for constructing a UI framework system applied to 2D/3D mixing in each screen through a plurality of classes;
a multi-screen application module for implementing an application of the UI framework system on a plurality of the screens;
and the multi-screen interaction module is used for realizing interaction among a plurality of screens applying the UI framework system.
2. The system of claim 1, wherein the classes comprise any one or more of: a parent class of all 2D/3D UI elements, an encapsulation class of a camera object in the 3D framework system, an encapsulation class of a light-source object in the 3D framework system, a parent class of node objects (entity or virtual nodes) in the 3D framework system, an encapsulation class of an entity node object in the 3D framework system, an encapsulation class of a virtual node object in the 3D framework system, an encapsulation class of an object-surface object in the 3D framework system, an encapsulation class of a 3D scene in the 3D framework system, a base parent class of all controls in the 2D framework system, a base parent class of container controls in the 2D framework system, a base parent class of container controls with scrolling behavior in the 2D framework system, 2D controls with various characteristics in the 2D framework system, and a base parent class encapsulating a 2D scene in the 2D framework system.
3. The system of claim 1, wherein the screen comprises: an independent screen corresponding to the entire display area of a display; and/or a tile screen corresponding to a certain display area of a display, wherein different tile screens may be superimposed within one display.
4. The system of claim 3, wherein the interaction between the plurality of screens comprises any one or more of:
interaction between a plurality of the tile screens included in one of the displays;
interaction between the independent screens corresponding to the plurality of displays;
interaction between the independent screen corresponding to one or more of the displays and the tile screen included in one or more of the displays.
5. The system of claim 4, wherein the plurality of displays corresponding to the plurality of screens are of a same operating system, or the plurality of displays corresponding to the plurality of screens are of different operating systems.
6. The system of claim 1, wherein the multi-screen application module comprises:
the resource storage management system is used for driving the UI framework system;
the input framework system is used for realizing any one or more of screen touch input recognition, behavior refinement and distribution control, gesture input recognition and control and voice input recognition and control;
the UI frame system is used for detecting the change of each UI control object in the UI frame system and driving each UI control object;
the rendering frame system is used for converting the UI control object tree into a rendering tree and drawing on a unique rendering window associated with each screen;
and the rendering window system is used for loading the rendering window.
7. The system of claim 1, wherein the multi-screen interaction module comprises: a first framework system for constructing linked migration between the pictures within each screen, a second framework system for constructing the flow of UI elements between the screens, and a third framework system for constructing multi-screen interaction between screens running on different operating systems.
8. The system of claim 7, wherein the first framework system implements the linked migration between the pictures within each screen by constructing differently defined migration triggers; wherein the definition of a migration trigger comprises any one or a combination of: events of each 2D/3D control, system hard-key events, system logic events, voice control command events and external gesture events.
9. The system of claim 7, wherein the second framework system establishes a binding relationship between a defined data attribute and the position attribute of a UI element added to different screens, defines a segment of numerical animation for the data attribute, and sets an animation trigger; after the trigger fires, the flow of the UI element between the screens is realized through the synchronization mechanism of the binding relationship;
wherein the UI element comprises: any one or a combination of pictures, text, controls, partial picture content and whole-screen picture content.
10. The system of claim 1, wherein the displays corresponding to the plurality of screens run a same operating system, or run different operating systems respectively; and the third framework system is used for transferring interaction logic data among the plurality of screens through a communication mechanism based on the different operating systems.
11. The system of claim 10, wherein the multi-screen interaction module comprises: a first framework system for constructing linked migration between the pictures within each screen, a second framework system for constructing the flow of UI elements between the screens, and a third framework system for constructing multi-screen interaction between screens running on different operating systems; and the third framework system comprises a plurality of API interfaces.
12. A computer device, comprising: one or more memories, and one or more processors;
the one or more memories for storing a computer program;
the one or more processors are configured to execute the computer program to perform the functions of the system according to any one of claims 1 to 11.
13. A computer storage medium, in which a computer program is stored which, when executed, performs the functions of a system according to any one of claims 1 to 11.
14. An interaction system, comprising one or more displays to which the functions of the system of any one of claims 1 to 11 are applied, the displays comprising: any one or more of a liquid-crystal instrument screen, a head-up display screen, a virtual control screen, a central-control entertainment screen, a front-passenger entertainment screen, a rear-seat entertainment screen, a desktop computer, a notebook computer, a smartphone, a tablet computer, a smart watch, a smart band and smart glasses.
CN201910943241.2A 2019-09-30 2019-09-30 Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface Active CN110658963B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910943241.2A CN110658963B (en) 2019-09-30 2019-09-30 Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910943241.2A CN110658963B (en) 2019-09-30 2019-09-30 Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface

Publications (2)

Publication Number Publication Date
CN110658963A true CN110658963A (en) 2020-01-07
CN110658963B CN110658963B (en) 2021-04-06

Family

ID=69038745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910943241.2A Active CN110658963B (en) 2019-09-30 2019-09-30 Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface

Country Status (1)

Country Link
CN (1) CN110658963B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111414205A (en) * 2020-03-31 2020-07-14 惠州华阳通用电子有限公司 Multi-screen linkage starting-up animation display system and implementation method
CN111988382A (en) * 2020-08-07 2020-11-24 浙江讯盟科技有限公司 Method and system for performing application interface interaction across terminals
CN115202554A (en) * 2022-06-10 2022-10-18 重庆长安汽车股份有限公司 Interaction system and method of instrument and vehicle machine
CN117130573A (en) * 2023-10-26 2023-11-28 北京世冠金洋科技发展有限公司 Multi-screen control method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104035813A (en) * 2013-03-06 2014-09-10 腾讯科技(深圳)有限公司 Cross-platform interaction detection method and device
US9317128B2 (en) * 2009-04-02 2016-04-19 Oblong Industries, Inc. Remote devices used in a markerless installation of a spatial operating environment incorporating gestural control
CN105577889A (en) * 2014-10-10 2016-05-11 广州杰赛科技股份有限公司 Multi-screen interaction operation system
CN107003818A (en) * 2014-10-17 2017-08-01 三星电子株式会社 Method for sharing a screen between devices and device using the method
CN107589900A (en) * 2017-09-06 2018-01-16 广东欧珀移动通信有限公司 Multi-screen display method, device, terminal and storage medium
CN109828791A (en) * 2018-12-28 2019-05-31 北京奇艺世纪科技有限公司 Animation playing method, terminal and computer-readable storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111414205A (en) * 2020-03-31 2020-07-14 惠州华阳通用电子有限公司 Multi-screen linkage boot animation display system and implementation method
CN111414205B (en) * 2020-03-31 2022-05-17 惠州华阳通用电子有限公司 Multi-screen linkage boot animation display system and implementation method
CN111988382A (en) * 2020-08-07 2020-11-24 浙江讯盟科技有限公司 Method and system for performing application interface interaction across terminals
CN115202554A (en) * 2022-06-10 2022-10-18 重庆长安汽车股份有限公司 Interaction system and method for instrument cluster and in-vehicle infotainment unit
CN115202554B (en) * 2022-06-10 2023-06-06 重庆长安汽车股份有限公司 Interaction system and method for instrument cluster and in-vehicle infotainment unit
CN117130573A (en) * 2023-10-26 2023-11-28 北京世冠金洋科技发展有限公司 Multi-screen control method, device, equipment and storage medium
CN117130573B (en) * 2023-10-26 2024-02-20 北京世冠金洋科技发展有限公司 Multi-screen control method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110658963B (en) 2021-04-06

Similar Documents

Publication Publication Date Title
CN110658963B (en) Multi-screen interaction system, equipment, medium and interaction system of human-computer interaction interface
US9684434B2 (en) System and method for displaying a user interface across multiple electronic devices
US8756044B2 (en) Graphical partitioning for parallel execution of executable block diagram models
EP2632131A1 (en) Method, apparatus, and system for providing a shared user interface
US20130219303A1 (en) Method, apparatus, and system for providing a shared user interface
US20110258534A1 (en) Declarative definition of complex user interface state changes
EP2584445A1 (en) Method of animating a rearrangement of ui elements on a display screen of an eletronic device
CA2798979A1 (en) Method of rendering a user interface
US20120124492A1 (en) Display and Resolution of Incompatible Layout Constraints
JP2012521041A (en) Smooth layout animation for continuous and discontinuous properties
US20080163081A1 (en) Graphical User Interface Using a Document Object Model
CN110569096B (en) System, method, medium, and apparatus for decoding human-computer interaction interface
US20170352174A1 (en) Method and system for visual data management
Nathan WPF 4.5 Unleashed
CA2806906C (en) System and method for displaying a user interface across multiple electronic devices
US20140325404A1 (en) Generating Screen Data
US10984170B2 (en) Systems and/or methods for dynamic layout design
CN110554900B (en) Method, system, device and medium for presenting human-computer interface effect based on GPU
Kammer et al. The eleventh finger: levels of manipulation in multi-touch interaction
US11803292B2 (en) User interface component and region layout control
CN109783100B (en) Method and device for checking user interface element attribute and electronic equipment
Nekrasov UIKit and Storyboards
CN109634498B (en) Efficient processing method for external events of cockpit display system based on ARINC661
Chatty Supporting multidisciplinary software composition for interactive applications
Liberty et al. Pro Windows 8.1 Development with XAML and C#

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant