CN113220180A - Method, apparatus, and device for managing an embedded device

Method, apparatus, and device for managing an embedded device

Info

Publication number
CN113220180A
Authority
CN
China
Prior art keywords
window
type
virtual
operation type
layer
Prior art date
Legal status
Granted
Application number
CN202110496449.1A
Other languages
Chinese (zh)
Other versions
CN113220180B (en)
Inventor
王凡
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202110496449.1A
Publication of CN113220180A
Application granted
Publication of CN113220180B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a method, an apparatus, and a device for managing an embedded device. The method includes: receiving an operation command for a target control on a user interface, where the user interface includes at least two virtual layers, each virtual layer includes at least one window, and each window includes at least one control; determining the operation type of the operation command, where, if the operation command is generated when a user performs a second operation, the operation type is a second operation type; determining whether the operation types supported by the target window to which the target control belongs include the second operation type, where, if the target window is located in a first virtual layer, the operation types supported by the target window include the second operation type, and if the target window is located in a second virtual layer, the operation types supported by the target window include a first operation type; and if so, triggering the target control to execute the operation command. With the technical solution of the present application, an embedded device that supports keys can also be controlled by touch, which improves the user experience.

Description

Method, apparatus, and device for managing an embedded device
Technical Field
The present application relates to the field of internet technologies, and in particular, to a method, an apparatus, and a device for managing an embedded device.
Background
An embedded device is a device that uses an embedded system. An embedded system is a special-purpose computer system that is application-centered, built on computer technology, and whose software and hardware can be flexibly tailored to user requirements such as function, reliability, cost, size, power consumption, and operating environment. An embedded system consists of hardware and software: the software includes an operating system and its software operating environment, the hardware includes components such as a signal processor, a memory, and a communication module, and an embedded system generally does not provide a large-capacity storage function.
Embedded devices include embedded key devices, that is, embedded devices that use keys: the user operates the keys to control the device. For example, for embedded devices such as televisions and refrigerators, the channel and volume of the television and the temperature of the refrigerator can be adjusted with keys.
With the rapid development of technology, expectations for embedded devices keep rising: users want the device to be smooth to operate and pleasant to use, which puts key-only embedded devices under considerable pressure. For example, when the user can control the embedded device only through keys, the user experience is poor, which hurts the market competitiveness of the embedded device.
Disclosure of Invention
The present application provides a management method for an embedded device, applied to the embedded device. If the embedded device supports operations of a first operation type, the method includes the following steps:
receiving an operation command aiming at a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers;
and if so, triggering the target control to execute the operation command.
Illustratively, if the operation command is an operation command generated when a user executes a first operation, the operation type is a first operation type; after determining the operation type of the operation command, the method further includes: judging whether the operation types supported by the target window to which the target control belongs comprise a first operation type or not; and if so, triggering the target control to execute the operation command.
In a possible implementation manner, if the first operation is a key operation, the second operation is a touch operation, the first operation type is a key type, and the second operation type is a touch type; or if the first operation is a touch operation, the second operation is a key operation, the first operation type is a touch type, and the second operation type is a key type.
For example, before receiving an operation command for a target control on a user interface, the method further includes: determining the number M of virtual layers to be created, and creating M virtual layers, where M is a positive integer greater than 1, each virtual layer corresponds to a layer identifier and a layer attribute, and the layer identifiers corresponding to the M virtual layers are determined based on the display sequence of the M virtual layers;
determining a display sequence of the M virtual layers based on layer identifiers corresponding to the M virtual layers, and fusing the M virtual layers based on the display sequence to obtain fused layers, wherein the fused layers comprise the M virtual layers, and the M virtual layers are arranged according to the display sequence;
and displaying the fused layer on a user interface of the embedded equipment.
In a possible implementation manner, before receiving an operation command for a target control on a user interface, the method further includes: determining a virtual layer to which a window to be created belongs; determining window attributes corresponding to a window to be created; creating the window on the virtual layer based on the window attribute, wherein the window corresponds to the window attribute, the window attribute at least comprises an operation type supported by the window, and the operation type supported by the window comprises a first operation type and/or a second operation type; wherein the window attributes further include at least one of: window size, window background, window rendering effect.
Illustratively, the determining the window attribute corresponding to the window to be created includes: determining window attributes corresponding to the window to be created based on the layer attributes of the virtual layer to which the window to be created belongs;
if the layer attribute indicates that the virtual layer supports a first operation type, the window attribute indicates that a window supports the first operation type; if the layer attribute represents that the virtual layer supports a second operation type, the window attribute represents that a window supports the second operation type; and if the layer attribute indicates that the virtual layer supports a first operation type and a second operation type, the window attribute indicates that the window supports the first operation type and/or the second operation type.
In a possible implementation manner, before receiving an operation command for a target control on a user interface, the method further includes: determining a window to which a control to be created belongs;
determining a control type corresponding to a control to be created;
creating a control matched with the control type on the window, wherein the control corresponds to a control attribute and a control identifier, and the control identifier has uniqueness; wherein the control type is one of the following types: button type, text type, information type, progress bar type, graphic interchange format type.
The present application provides a management apparatus for an embedded device, applied to the embedded device. If the embedded device supports operations of a first operation type, the apparatus includes: the receiving module is used for receiving an operation command for a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
the determining module is used for determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
the processing module is used for judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers; and if so, triggering the target control to execute the operation command.
Illustratively, if the operation command is an operation command generated when a user executes a first operation, the operation type is a first operation type; the processing module is further configured to: judging whether the operation types supported by the target window to which the target control belongs comprise a first operation type or not; if yes, triggering the target control to execute the operation command; if the first operation is a key operation, the second operation is a touch operation, the first operation type is a key type, and the second operation type is a touch type; or if the first operation is a touch operation, the second operation is a key operation, the first operation type is a touch type, and the second operation type is a key type.
The application proposes an embedded device, the embedded device comprising: a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor; the processor is configured to execute machine executable instructions to perform the steps of:
if the embedded device is an embedded device supporting operation of a first operation type, then:
receiving an operation command aiming at a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers;
and if so, triggering the target control to execute the operation command.
According to the above technical solution, an embedded device that supports keys can also be controlled by touch, which improves the user experience, provides the embedded device with a complete implementation path from keys to touch, and effectively improves the market competitiveness of the embedded device. At least two virtual layers can be displayed on the user interface, that is, windows of various types are displayed through the at least two virtual layers; for example, one virtual layer displays a state window and another virtual layer displays a menu window. This avoids the limitation that a single layer can hardly present a good graphical user interface, so a good graphical user interface can be displayed.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments of the present application or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments described in the present application, and those skilled in the art can obtain other drawings according to the drawings of the embodiments of the present application.
FIG. 1 is a flowchart illustrating a method for managing an embedded device according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for managing an embedded device according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for managing an embedded device according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a method for managing an embedded device according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating a method for managing an embedded device according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a management apparatus of an embedded device according to an embodiment of the present application;
fig. 7 is a hardware configuration diagram of an embedded device according to an embodiment of the present application.
Detailed Description
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein is meant to encompass any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the embodiments of the present application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. In addition, depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
The embedded device may include an embedded key device supporting key operation, an embedded touch device supporting touch operation, and an embedded key touch device supporting both key operation and touch operation.
The embedded key equipment is embedded equipment adopting keys, namely, the embedded equipment is controlled by the keys. The embedded touch device is an embedded device adopting touch control, namely the control of the embedded device is realized through touch control, and the touch control is operated by touching a function diagram on a screen of the device with a finger or a pen point instead of a key. The embedded key touch device is an embedded device which adopts keys and touch at the same time, namely the control of the embedded device can be realized through the keys and the control of the embedded device can also be realized through the touch.
Because an embedded key device can be controlled only through keys, the user experience of the embedded key device is poor, which hurts the market competitiveness of the embedded key device.
Because an embedded touch device can be controlled only through touch, the user experience of the embedded touch device is likewise poor, which hurts the market competitiveness of the embedded touch device.
In view of the above discovery, an embodiment of the present application provides a management method for an embedded device, where the method may be applied to an embedded device, and the embedded device may include, but is not limited to, an embedded key device, an embedded touch device, and an embedded key touch device. For example, the embedded device may be an embedded key device, or the embedded device may also be an embedded touch device.
For an embedded key device that supports keys, the embedded device can be controlled not only through keys but also through touch, which improves the user experience, provides the embedded key device with a complete implementation path from keys to touch, and effectively improves the market competitiveness of the embedded key device.
For an embedded touch device that supports touch, the embedded device can be controlled not only through touch but also through keys, which improves the user experience, provides the embedded touch device with a complete implementation path from touch to keys, and effectively improves the market competitiveness of the embedded touch device.
In the embodiments of the present application, at least two virtual layers can be displayed on the user interface, that is, windows of various types are displayed through the at least two virtual layers. This avoids the limitation that a single layer can hardly present a good graphical user interface, so a good graphical user interface can be displayed and the user experience is further improved.
The user interface (UI) is the medium through which the system and the user interact and exchange information; it is the overall design of a piece of software's human-machine interaction, operation logic, and visual appearance.
The following describes a management method of an embedded device according to an embodiment of the present application with reference to specific embodiments.
Referring to fig. 1, a flowchart of a management method for an embedded device is shown, where the method is applied to an embedded device, and if the embedded device is an embedded device that supports an operation of a first operation type, the method may include:
step 101, receiving an operation command for a target control on a user interface.
For example, the user interface may include at least two virtual layers, each virtual layer may include at least one window, and each window may include at least one control, that is, the user interface may display at least two virtual layers, at least one window on each virtual layer, and at least one control on each window, so that the user may operate the controls displayed on the windows.
When a user operates the controls displayed on the window, the controls are called target controls, the number of the target controls can be at least one, and the window to which the target controls belong is called a target window.
For example, when a user operates a target control displayed on a window displayed on a user interface, the embedded device may receive an operation command for the target control on the user interface.
In a possible embodiment, before step 101, at least two virtual layers may be displayed on the user interface, at least one window may be displayed on each virtual layer, and at least one control may be displayed on each window. This involves the construction of the virtual layers, the windows, and the controls, and these construction processes are described below.
For example, the building process for the virtual layer may include: determining the number M of virtual layers to be created, and creating M virtual layers; m is a positive integer larger than 1, each virtual layer corresponds to a layer identifier and a layer attribute, and the layer identifiers corresponding to the M virtual layers are determined based on the display sequence of the M virtual layers. Then, determining a display sequence of the M virtual layers based on layer identifiers corresponding to the M virtual layers, and fusing the M virtual layers based on the display sequence of the M virtual layers to obtain a fused layer, where the fused layer may include the M virtual layers, and the M virtual layers are arranged according to the display sequence. And then, displaying the fused layer on a user interface of the embedded device.
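As an illustrative sketch of the layer construction described above, the following C fragment shows one possible way to keep a descriptor per virtual layer and to assign layer identifiers in creation order, so that the identifier also encodes the display order from the bottom-most to the top-most layer. All type and function names here (virtual_layer_t, create_layers, and so on) are hypothetical and are not taken from the patent.

    #include <stdlib.h>

    /* Hypothetical operation types a layer or window may support. */
    typedef enum { OP_KEY = 1 << 0, OP_TOUCH = 1 << 1 } op_type_t;

    /* Hypothetical descriptor for one virtual layer. */
    typedef struct {
        int       layer_id;     /* unique; assigned in creation order */
        op_type_t supported;    /* key, touch, or both (layer attribute) */
        void     *buffer;       /* dynamically allocated drawing buffer */
        size_t    buffer_size;
    } virtual_layer_t;

    /* Create M virtual layers; the i-th layer gets identifier i, so the
     * smallest identifier is the bottom-most layer and the largest the
     * top-most layer. */
    static virtual_layer_t *create_layers(int m, const op_type_t *attrs,
                                          size_t buf_size)
    {
        virtual_layer_t *layers = calloc((size_t)m, sizeof(*layers));
        if (layers == NULL)
            return NULL;
        for (int i = 0; i < m; i++) {
            layers[i].layer_id    = i;               /* display order == id */
            layers[i].supported   = attrs[i];        /* per-layer attribute */
            layers[i].buffer      = malloc(buf_size);/* per-layer memory    */
            layers[i].buffer_size = buf_size;
        }
        return layers;
    }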
For example, the construction process for the window may include: determining a virtual layer to which a window to be created belongs, and determining a window attribute corresponding to the window to be created; and creating a window on the virtual layer based on the window attribute, wherein the window corresponds to the window attribute, the window attribute at least comprises an operation type supported by the window, and the operation type supported by the window can comprise a first operation type and/or a second operation type. Illustratively, in addition to the types of operations supported by the window, the window attributes may include, but are not limited to, at least one of the following: window size, window background, window rendering effect.
Determining window attributes corresponding to a window to be created may include, but is not limited to: and determining window attributes corresponding to the window to be created based on the layer attributes of the virtual layer to which the window to be created belongs.
For example, if the layer attribute indicates that the virtual layer supports the first operation type (the virtual layer supporting the first operation type may be referred to as a second virtual layer, and the number of the second virtual layer may be at least one), the window attribute may indicate that the window supports the first operation type. If the layer attribute indicates that the virtual layer supports the second operation type (the virtual layer supporting the second operation type may be referred to as a first virtual layer, and the number of the first virtual layer may be at least one), the window attribute may indicate that the window supports the second operation type. If the layer attribute indicates that the virtual layer supports the first operation type and the second operation type (the virtual layer that supports both the first operation type and the second operation type may be referred to as a third virtual layer, and the number of the third virtual layer may be at least one, or the third virtual layer may not exist), the window attribute may indicate that the window supports the first operation type and/or the second operation type.
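Continuing the hypothetical types from the sketch above, a window's supported operation types could be derived from the layer attribute of the virtual layer it is created on, for example as follows; window_ops_from_layer is an illustrative name, not an implementation taken from the patent.

    /* Derive the operation types a new window supports from the attribute
     * of the virtual layer it is created on.  A layer that supports both
     * key and touch leaves the choice to the caller ("requested"). */
    static op_type_t window_ops_from_layer(op_type_t layer_ops,
                                           op_type_t requested)
    {
        if (layer_ops == OP_KEY)        /* "second virtual layer" */
            return OP_KEY;
        if (layer_ops == OP_TOUCH)      /* "first virtual layer" */
            return OP_TOUCH;
        /* "third virtual layer": key and/or touch, as requested. */
        return requested != 0 ? requested : (op_type_t)(OP_KEY | OP_TOUCH);
    }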
For example, the construction process for a control may include: determining the window to which the control to be created belongs, and determining the control type corresponding to the control to be created. Then, a control matching the control type is created on the window; the control has control attributes and a control identifier, and the control identifier is unique (that is, unique within the window). The control type may include, but is not limited to, one of the following types: button type, text type, information type, progress bar type, graphic interchange format type.
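A minimal sketch of this control creation is shown below: controls carry a type from the list above and a control identifier that is unique within the window. The control_t structure, window_add_control helper, and the caller-provided slot array are illustrative assumptions, not the patent's implementation.

    /* Hypothetical control types, matching the list above. */
    typedef enum {
        CTRL_BUTTON, CTRL_TEXT, CTRL_INFO, CTRL_PROGRESS_BAR, CTRL_GIF
    } control_type_t;

    typedef struct {
        int            control_id;   /* unique within its window */
        control_type_t type;
        /* control attributes (position, caption, ...) would follow here */
    } control_t;

    /* Create a control on a window; the identifier is simply the next
     * unused index, which keeps it unique within that window. */
    static control_t *window_add_control(control_t *slots, int *count,
                                         control_type_t type)
    {
        control_t *c = &slots[*count];
        c->control_id = (*count)++;   /* unique per window */
        c->type       = type;
        return c;
    }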
In step 102, an operation type of the operation command is determined, where the operation type may be a first operation type or a second operation type. For example, if the operation command is an operation command generated when the user performs the second operation, the operation type may be the second operation type; if the operation command is an operation command generated when the user performs the first operation, the operation type may be the first operation type.
For example, if the operation command is an operation command generated by a user performing a first operation on the embedded device, it may be determined that the operation type of the operation command is the first operation type. For example, when the user performs a first operation on the embedded device, the operation command may carry information related to the first operation, and therefore, the embedded device may determine that the operation type of the operation command is the first operation type based on the information.
For example, if the operation command is an operation command generated by a user performing a second operation on the embedded device, it may be determined that the operation type of the operation command is the second operation type. For example, when the user performs the second operation on the embedded device, the operation command may carry information related to the second operation, and therefore, the embedded device may determine that the operation type of the operation command is the second operation type based on the information.
Step 103, if the operation command is an operation command generated when the user performs the second operation, that is, the operation type of the operation command is the second operation type, determining whether the operation type supported by the target window to which the target control belongs includes the second operation type, and if so, performing step 104.
For example, if the target window is located in the first virtual layer, the operation types supported by the target window may include a second operation type; if the target window is located in the second virtual layer, the operation types supported by the target window may include a first operation type, and the first operation type is different from the second operation type. The first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting the first operation type in the at least two virtual layers.
For example, a target window to which the target control belongs, that is, a window in which the target control is located, may be determined. Because each window has a window attribute, after the target window to which the target control belongs is determined, the window attribute of the target window, that is, the operation type supported by the target window, can be determined. The operation type supported by the target window may be a first operation type, a second operation type, a first operation type and a second operation type. Obviously, if the operation type supported by the target window is the second operation type, or the operation types supported by the target window are the first operation type and the second operation type, it is indicated that the operation types supported by the target window include the second operation type. If the operation type supported by the target window is the first operation type, the operation type supported by the target window does not comprise the second operation type.
In a possible implementation, the virtual layer supporting the first operation type is referred to as a second virtual layer, and therefore, if the target window is located in the second virtual layer, the operation type supported by the target window is the first operation type, that is, the window attribute of the target window indicates that the target window supports the first operation type, that is, if the target window is located in the second virtual layer, the operation type supported by the target window includes the first operation type.
The virtual layer supporting the second operation type may be referred to as a first virtual layer, and therefore, if the target window is located in the first virtual layer, the operation type supported by the target window is the second operation type, that is, the window attribute of the target window indicates that the target window supports the second operation type, that is, if the target window is located in the first virtual layer, the operation type supported by the target window may include the second operation type.
The virtual layer that supports the first operation type and the second operation type simultaneously may be referred to as a third virtual layer, and therefore, if the target window is located in the third virtual layer, the operation type supported by the target window is the first operation type and/or the second operation type, that is, the window attribute of the target window indicates that the target window supports the first operation type and/or the second operation type, that is, if the target window is located in the third virtual layer, the operation type supported by the target window may include the first operation type, or may include the second operation type, or may include both the first operation type and the second operation type, which is not limited herein.
Step 104, triggering the target control to execute the operation command, that is, if the operation type of the operation command is the second operation type and the operation type supported by the target window to which the target control belongs includes the second operation type, triggering the target control to execute the operation command, that is, the target control executes the operation matched with the operation command. Obviously, for the target control, the control of the embedded device may be implemented through the second operation, so that the embedded device supporting the operation of the first operation type is controlled through the second operation.
For example, if the operation type of the operation command is the second operation type, and the operation type supported by the target window to which the target control belongs does not include the second operation type, the target control is prohibited from being triggered to execute the operation command, that is, the target control does not execute the operation matched with the operation command.
In a possible implementation manner, if the operation command is an operation command generated when the user performs the first operation, the operation type of the operation command is the first operation type, on this basis, it is necessary to determine whether the operation type supported by the target window to which the target control belongs includes the first operation type, if so, the target control is triggered to perform the operation command, and if not, the target control is prohibited from performing the operation command.
For example, the operation type supported by the target window may be a first operation type, may be a second operation type, and may also be the first operation type and the second operation type. If the operation type supported by the target window is the first operation type, or the operation type supported by the target window is the first operation type and the second operation type, the operation type supported by the target window includes the first operation type. If the operation type supported by the target window is the second operation type, the operation type supported by the target window does not include the first operation type.
And if the operation type of the operation command is the first operation type and the operation types supported by the target window to which the target control belongs comprise the first operation type, triggering the target control to execute the operation command, namely, the target control executes the operation matched with the operation command. Obviously, for the target control, the control of the embedded device may be implemented through the first operation, so that the embedded device is controlled through the first operation.
For example, if the operation type of the operation command is the first operation type, and the operation type supported by the target window to which the target control belongs does not include the first operation type, the target control is prohibited from being triggered to execute the operation command, that is, the target control does not execute the operation matched with the operation command.
In the above embodiment, if the first operation is a key operation, the second operation is a touch operation, the first operation type is a key type, and the second operation type is a touch type; in this case, if the operation type of the operation command is a touch type and the operation type supported by the target window to which the target control belongs includes the touch type, the control of the embedded device may be implemented through the touch operation, so that the embedded device supporting the key operation is controlled through the touch operation. Or if the first operation is a touch operation, the second operation is a key operation, the first operation type is a touch type, and the second operation type is a key type; in this case, if the operation type of the operation command is a key type and the operation type supported by the target window to which the target control belongs includes the key type, the control of the embedded device may be implemented through the key operation, so that the embedded device supporting the touch operation is controlled through the key operation.
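To make the dispatch of steps 101 to 104 concrete, the following sketch checks an incoming command's operation type (key or touch) against the operation types supported by the target window and triggers the target control only on a match. The structures and the dispatch_command function are hypothetical and build on the op_type_t flags introduced in the earlier sketches.

    /* Hypothetical window and command descriptors. */
    typedef struct {
        op_type_t supported;    /* window attribute: key and/or touch */
    } window_t;

    typedef struct {
        op_type_t type;         /* OP_KEY for a key press, OP_TOUCH for a tap */
        int       control_id;   /* target control on the user interface */
    } op_command_t;

    /* Trigger the target control only if its target window supports the
     * command's operation type; otherwise the command is ignored. */
    static int dispatch_command(const window_t *target_window,
                                const op_command_t *cmd,
                                void (*trigger)(int control_id))
    {
        if (target_window->supported & cmd->type) {
            trigger(cmd->control_id);   /* step 104: execute the command */
            return 0;
        }
        return -1;                      /* unsupported operation type */
    }

With this check in place, a touch command that reaches a key-only window is simply dropped, while a window on the touch layer, or on a layer supporting both types, executes it.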
As can be seen from the foregoing technical solutions, in the embodiments of the present application, for a window displayed on a user interface, an operation type supported by the window may be a key type and/or a touch type. For a window supporting a key type, for a control created on the window, control of the control can be realized through key operation. For a window supporting a touch type, for a control created on the window, control of the control may be achieved through a touch operation. For a window supporting a key type and a touch type, for a control created on the window, the control of the control can be realized through key operation, and the control of the control can also be realized through touch operation. In summary, for the embedded device supporting key operation, the control of the embedded device can also be realized through touch operation, so that the user experience is better, a complete implementation scheme from key to touch is provided for the embedded device, and the market competitiveness of the embedded device is effectively improved. For the embedded equipment supporting touch operation, the control of the embedded equipment can be realized through key operation, the user experience is better, a complete realization scheme from touch to key is provided for the embedded equipment, and the market competitiveness of the embedded equipment is effectively improved. At least two virtual layers can be displayed on the user interface, namely, windows of various types are displayed through the at least two virtual layers, for example, one virtual layer displays a state window, and the other virtual layer displays a menu window, so that the problem that one layer is difficult to realize a good graphical user interface is avoided, the good graphical user interface can be displayed, and the user experience is further improved.
The above technical solution of the embodiment of the present application is described below with reference to specific application scenarios.
Referring to fig. 2, a schematic diagram of a management method of an embedded device is shown, where the embedded device may be an embedded key device (i.e., through key control) or an embedded touch device (i.e., through touch control).
First, the user interface is initialized and two virtual layers are created on it. Of course, three or more virtual layers may also be created; the number of virtual layers is not limited. The following takes the creation of two virtual layers as an example. In this embodiment, the two virtual layers may be referred to as a state virtual layer and a menu virtual layer; this is only an example, and other names such as first virtual layer and second virtual layer could also be used, which is not limited here.
The state virtual layer is used for displaying a state type window, and the menu virtual layer is used for displaying a menu type window. Of course, the state virtual layer may also display a window of a non-state type, and the menu virtual layer may also display a window of a non-menu type, which is not limited to this, and may display any type of window.
The state virtual layer may be located above the menu virtual layer, that is, the menu virtual layer is a bottom virtual layer, and the state virtual layer is an upper virtual layer. The state virtual layer may also be located below the menu virtual layer, that is, the menu virtual layer is an upper virtual layer, and the state virtual layer is a bottom virtual layer.
The state virtual layer and the menu virtual layer can be mutually independent, namely the creation process of the state virtual layer is not related to the creation process of the menu virtual layer, and after the creation of the state virtual layer and the menu virtual layer is completed, the state virtual layer and the menu virtual layer are displayed on a user interface. When the state virtual layer and the menu virtual layer are displayed, the state virtual layer may be located on the menu virtual layer.
When the state virtual layer and the menu virtual layer are created, a dynamic memory allocation manner may be adopted, for example, a memory 1 is allocated for the state virtual layer, and a memory 2 is allocated for the menu virtual layer, where the memory 1 is used to store contents related to the state virtual layer, the contents are used to create and display the state virtual layer, and the memory 2 is used to store contents related to the menu virtual layer, and the contents are used to create and display the menu virtual layer.
After all the virtual layers are created, fusion processing of the virtual layers is required, and the fusion of the virtual layers is performed from bottom to top and is finally displayed on a screen. For example, when the state virtual layer is located on the menu virtual layer, after the creation of the state virtual layer and the menu virtual layer is completed, the menu virtual layer is fused first, then the state virtual layer is fused on the menu virtual layer, that is, the fusion is performed from bottom to top, the fused virtual layer is obtained, and finally the fused virtual layer is displayed on the user interface.
When the state virtual layer and the menu virtual layer are created, a state virtual layer management thread can be started, the state virtual layer is created through the state virtual layer management thread, and a window and a control are created on the state virtual layer through the state virtual layer management thread. And a menu virtual layer management thread can be started, a menu virtual layer is created through the menu virtual layer management thread, and a window and a control are created on the menu virtual layer through the menu virtual layer management thread.
Second, after the state virtual layer and the menu virtual layer are created, windows can be created on each virtual layer, and window attributes such as size and background can be set for each window. Each window can also be given a window attribute indicating that it supports the key type and/or the touch type. For example, a window created on the state virtual layer may be set to support the key type but not the touch type, and a window created on the menu virtual layer may be set to support the touch type but not the key type. Of course, this is only an example, and the window attributes of windows on the state virtual layer and the menu virtual layer can be set arbitrarily.
For example, assuming that windows on the state virtual layer support the key type and windows on the menu virtual layer support the touch type: if a window supports the key type but not the touch type, the window is created on the state virtual layer rather than on the menu virtual layer; if a window supports the touch type but not the key type, the window is created on the menu virtual layer.
Then, after a window is created, controls can be created in the window. Controls are classified by type, each control type has its own control attributes, and the control attributes can be set freely. Each control has a control identifier that is unique within its window, and the control's attributes are set through this identifier, so each control identifier is associated with exactly one control, which simplifies control management within the window.
Based on the creation of virtual layers, windows, and controls, the management of the embedded device is completed; the implementation framework of the user interface is simple, its hierarchy is clear, and it is easy for an upper-layer caller to build a user interface.
In a possible implementation, regarding the creation of virtual layers: an embedded device generally has only one actual display layer, and the page display of a single layer is limited, since each area of the user interface can show only one piece of content, which makes it difficult to implement a good graphical user interface. Virtual layers, which are later fused into the single actual layer, are therefore introduced.
Referring to fig. 3, for the building process of the virtual layer, the creating process may include:
step 301, determining the number M of virtual layers to be created, and creating M virtual layers.
Illustratively, M is a positive integer greater than 1, each virtual layer corresponds to a layer identifier and a layer attribute, and the layer identifiers corresponding to the M virtual layers are determined based on the display order of the M virtual layers.
When the embedded device is started, a virtual layer may be created first, for example, assuming that M virtual layers need to be created, the embedded device may create M virtual layers. For each virtual layer, when the virtual layer is created, a layer identifier and a layer attribute need to be allocated to the virtual layer.
For each virtual layer, the layer attributes of the virtual layer may include, but are not limited to: the type of the virtual layer, such as a state virtual layer, a menu virtual layer, a background virtual layer, an adjustment virtual layer, a filling virtual layer, a character virtual layer, a shape virtual layer, and the like; the shape of the virtual layer, such as a circular, rectangular, or triangular virtual layer; the size of the virtual layer, such as the diameter of a circular virtual layer or the length and height of a rectangular virtual layer; and the coordinates of the center point of the virtual layer.
Of course, the above are only a few examples of layer attributes, and no limitation is made to the layer attributes. For a plurality of virtual layers, each virtual layer is independent, and layer attributes of different virtual layers may be the same or different, that is, layer attributes may be set for each virtual layer.
For each virtual layer, the virtual layer corresponds to a layer identifier, and the layer identifier has uniqueness, that is, the layer identifiers of different virtual layers are different. For example, when the virtual layer a, the virtual layer b, and the virtual layer c need to be created, assuming that the display order of the virtual layer a is 1, indicating that the virtual layer a is located at the bottommost layer, the display order of the virtual layer b is 2, indicating that the virtual layer b is located at the middle layer, the display order of the virtual layer c is 3, indicating that the virtual layer c is located at the topmost layer, then the layer identifier of the virtual layer a is 0, the layer identifier of the virtual layer b is 1, and the layer identifier of the virtual layer c is 2, that is, the layer identifiers corresponding to the virtual layers sequentially increase according to the display order from the bottommost layer to the topmost layer. That is, the layer identifiers corresponding to the virtual layers decrease in sequence according to the display sequence from the uppermost layer to the bottommost layer.
Of course, the layer identifiers corresponding to the virtual layers may also decrease sequentially according to the display order from the bottommost layer to the topmost layer, that is, the layer identifiers corresponding to the virtual layers increase sequentially according to the display order from the topmost layer to the bottommost layer, and the layer identifiers corresponding to the virtual layers are not limited.
In summary, when a first virtual layer is created, the layer identifier of the first virtual layer is 0, when a second virtual layer is created, the layer identifier of the second virtual layer is 1, when a third virtual layer is created, the layer identifier of the third virtual layer is 2, and so on, the layer identifiers sequentially increase. The creation process of the virtual layer determines which virtual layer is located on the top of the user interface when the user interface is displayed, for example, the first virtual layer is located on the bottom, and the last virtual layer is located on the top.
For example, when each virtual layer is created, a management thread may be started for the virtual layer, and the management thread may create the virtual layer in a non-blocking manner, so as to improve the operating efficiency of the embedded device. The management thread is used for creating a virtual layer and creating a window and a control on the virtual layer, for example, for processing a window message and refreshing and drawing the control. The management threads of each virtual layer are independent, and the virtual layers are independent, so that the stability of a graphical user interface is improved.
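As a sketch of the per-layer management thread mentioned above, each virtual layer could be given its own POSIX thread that creates the layer's windows and controls and then services window messages and control redraws. The layer_thread and start_layer_thread names are hypothetical, and the virtual_layer_t type comes from the earlier illustrative sketch.

    #include <pthread.h>

    /* One management thread per virtual layer: it creates the layer's
     * windows and controls and then loops on the layer's message queue,
     * refreshing and drawing controls.  Layers stay independent because
     * each has its own thread. */
    static void *layer_thread(void *arg)
    {
        virtual_layer_t *layer = arg;
        /* ... create windows/controls on this layer, then process its
         *     window messages and redraw its controls ... */
        (void)layer;
        return NULL;
    }

    static int start_layer_thread(virtual_layer_t *layer, pthread_t *tid)
    {
        /* Non-blocking from the caller's point of view: pthread_create
         * returns immediately and the thread runs concurrently. */
        return pthread_create(tid, NULL, layer_thread, layer);
    }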
For example, the virtual layer may have layer attributes supporting a key type and/or a touch type, that is, the layer attributes of the virtual layer may further include an operation type supported by the virtual layer, where the operation type is the key type and/or the touch type. Or, the virtual layer may not have layer attributes supporting the key type and/or the touch type, that is, the layer attributes of the virtual layer do not include the operation type supported by the virtual layer.
In a possible implementation manner, if the layer attribute of the virtual layer includes an operation type supported by the virtual layer, 2 virtual layers may be created, and the 2 virtual layers are respectively recorded as a virtual layer a and a virtual layer b, where the layer attribute of the virtual layer a may include the operation type supported by the virtual layer a, the operation type is a key type, and the layer attribute of the virtual layer b may include the operation type supported by the virtual layer b, and the operation type is a touch type. On this basis, when a window is created on the virtual layer, if the operation type supported by the window is the key type, the window may be created on the virtual layer a, and if the operation type supported by the window is the touch type, the window may be created on the virtual layer b.
In another possible implementation manner, if the layer attributes of the virtual layer include operation types supported by the virtual layer, 3 virtual layers may be created, and the 3 virtual layers are respectively recorded as a virtual layer a, a virtual layer b, and a virtual layer c, where the layer attributes of the virtual layer a may include the operation types supported by the virtual layer a, the operation types are key types, the layer attributes of the virtual layer b may include the operation types supported by the virtual layer b, the operation types are touch types, and the layer attributes of the virtual layer c may include the operation types supported by the virtual layer c, and the operation types are key types and touch types.
On this basis, when a window needs to be created on the virtual layer, if the operation type supported by the window is the key type, the window may be created on the virtual layer a, if the operation type supported by the window is the touch type, the window may be created on the virtual layer b, and if the operation type supported by the window is the key type and the touch type, the window may be created on the virtual layer c.
Of course, in practical applications, four or more virtual layers may also be created, which is not limited here. Among the four or more virtual layers, the operation type supported by the first virtual layer is the key type, the operation type supported by the second virtual layer is the touch type, and the operation types supported by the third virtual layer are the key type and the touch type. For the remaining virtual layers, the supported operation type may be the key type, the touch type, or both the key type and the touch type, which is not limited here.
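The routing of a newly created window to a layer, as described for the two-layer and three-layer setups above, might look like the following sketch. The layer_for_window helper is an illustrative assumption that reuses the hypothetical virtual_layer_t descriptor, not a function defined by the patent.

    /* Pick the virtual layer for a new window based on the operation
     * types the window should support: prefer a layer whose attribute
     * matches exactly (key-only, touch-only, or both), otherwise fall
     * back to any layer whose attribute covers the request. */
    static virtual_layer_t *layer_for_window(virtual_layer_t *layers, int m,
                                             op_type_t window_ops)
    {
        for (int i = 0; i < m; i++)
            if (layers[i].supported == window_ops)
                return &layers[i];
        for (int i = 0; i < m; i++)
            if ((layers[i].supported & window_ops) == window_ops)
                return &layers[i];
        return NULL;    /* no suitable layer */
    }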
Step 302, determining a display sequence of the M virtual layers based on layer identifiers corresponding to the M virtual layers. For example, after M virtual layers are created, each virtual layer has a layer identifier, and the layer identifiers corresponding to the M virtual layers are determined based on the display order of the M virtual layers, so the display order of the M virtual layers can be determined based on the layer identifiers corresponding to the M virtual layers.
For example, if the layer identifier of the virtual layer a is 0, the layer identifier of the virtual layer b is 1, and the layer identifier of the virtual layer c is 2, the display sequence of all the virtual layers from the bottommost layer to the topmost layer is the virtual layer a, the virtual layer b, and the virtual layer c in sequence, that is, the virtual layer corresponding to the smallest layer identifier is located at the bottommost layer, and the virtual layer corresponding to the largest layer identifier is located at the topmost layer.
Step 303, fusing the M virtual layers based on the display order of the M virtual layers to obtain a fused layer, where the fused layer may include the M virtual layers, and the M virtual layers are arranged according to the display order. For example, the M virtual layers are sequentially fused according to the display order from the bottommost layer to the topmost layer, so as to obtain a fused layer. Assuming that the display sequence from the bottommost layer to the topmost layer is a virtual layer a, a virtual layer b and a virtual layer c in sequence, the virtual layer a can be obtained first, the virtual layer b is fused on the basis of the virtual layer a, namely the virtual layer b is located on the virtual layer a, and the virtual layer c is fused on the basis of the virtual layer a and the virtual layer b, namely the virtual layer c is located on the virtual layer b.
The layer obtained after fusing the virtual layer c is the fused layer; the fused layer comprises the virtual layer a, the virtual layer b and the virtual layer c, the virtual layer a is located at the bottommost layer, the virtual layer b is located above the virtual layer a, and the virtual layer c is located above the virtual layer b, that is, the virtual layer c is located at the topmost layer.
For example, the virtual layer may be created by dynamically applying for a memory, that is, dynamically applying for a memory buffer for each virtual layer, and storing the content related to the virtual layer through the memory buffer. For example, a memory buffer 1 is dynamically applied for a virtual layer a, the memory buffer 1 is used for storing contents related to the virtual layer a, the contents are used for creating and displaying the virtual layer a, a memory buffer 2 is dynamically applied for the virtual layer b, the memory buffer 2 is used for storing contents related to the virtual layer b, the contents are used for creating and displaying the virtual layer b, a memory buffer 3 is dynamically applied for the virtual layer c, and the memory buffer 3 is used for storing contents related to the virtual layer c, and the contents are used for creating and displaying the virtual layer c.
Based on this, when the M virtual layers are fused to obtain a fused layer, it is actually a fusion process of the memory buffer, for example, the contents of the memory buffer 1, the memory buffer 2, and the memory buffer 3 are fused, and the fused content corresponds to the fused layer, that is, the layer to be displayed.
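For illustration only, the following C sketch shows one possible realization of steps 302 and 303: each layer owns a dynamically allocated memory buffer, the layers are ordered by their layer identifiers (smallest identifier at the bottom), and the buffers are fused bottom-to-top into a single buffer that corresponds to the fused layer. The screen size, the 32-bit ARGB pixel format, the "non-transparent pixel covers the pixel below" rule, and all names are assumptions of this illustration, not details of the present application; a real implementation might alpha-blend instead.

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    #define SCREEN_W 320
    #define SCREEN_H 240
    #define PIXELS   ((size_t)SCREEN_W * SCREEN_H)

    /* Hypothetical layer record: identifier plus a dynamically allocated
     * memory buffer holding the layer content (32-bit ARGB pixels). */
    typedef struct {
        int       layer_id;
        uint32_t *buffer;   /* PIXELS entries */
    } virtual_layer_t;

    /* Order layers by identifier: smallest identifier = bottommost layer. */
    static int compare_by_id(const void *lhs, const void *rhs)
    {
        const virtual_layer_t *a = lhs;
        const virtual_layer_t *b = rhs;
        return (a->layer_id > b->layer_id) - (a->layer_id < b->layer_id);
    }

    /* Fuse M layers into one output buffer: draw from bottom to top, letting a
     * non-transparent pixel of an upper layer cover the pixel below it. */
    static void fuse_layers(virtual_layer_t *layers, size_t m, uint32_t *fused)
    {
        qsort(layers, m, sizeof(layers[0]), compare_by_id);
        memset(fused, 0, PIXELS * sizeof(uint32_t));

        for (size_t i = 0; i < m; i++) {            /* bottom to top */
            for (size_t p = 0; p < PIXELS; p++) {
                uint32_t px = layers[i].buffer[p];
                if ((px >> 24) != 0)                /* alpha != 0: visible */
                    fused[p] = px;
            }
        }
    }

    int main(void)
    {
        enum { M = 3 };
        virtual_layer_t layers[M];
        for (int i = 0; i < M; i++) {
            layers[i].layer_id = i;                            /* a = 0, b = 1, c = 2 */
            layers[i].buffer   = calloc(PIXELS, sizeof(uint32_t));
        }

        uint32_t *fused = calloc(PIXELS, sizeof(uint32_t));
        fuse_layers(layers, M, fused);   /* fused layer to be shown on the user interface */

        free(fused);
        for (int i = 0; i < M; i++)
            free(layers[i].buffer);
        return 0;
    }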
And step 304, displaying the fused layer on a user interface of the embedded device.
For example, because the user interface only displays one actual layer, the multiple virtual layers may be fused to obtain a fused layer, and the fused layer is displayed on the user interface of the embedded device.
In one possible embodiment, see FIG. 4, which is a schematic diagram of the window construction process.
Step 401, determining a virtual layer to which a window to be created belongs.
For example, a window is a window displayed on a user interface, different windows on the user interface may display different contents, and the window needs to be created on a virtual layer, so when creating the window, it may be selected on which virtual layer the window is created, that is, the window belongs to this virtual layer.
For example, if a window a1 needs to be created on the virtual layer a, when the window a1 is created, it is determined that the virtual layer to which the window a1 belongs is the virtual layer a. If a window b1 needs to be created on the virtual layer b, when the window b1 is created, it is determined that the virtual layer to which the window b1 belongs is the virtual layer b, and so on.
Step 402, determining window attributes corresponding to the window to be created.
Illustratively, the window attribute includes the operation type supported by the window, and the operation type is the key type and/or the touch type. If the operation type is the key type, it indicates that the window supports the key type, and the control on the window is allowed to be operated through a key operation. If the operation type is the touch type, it indicates that the window supports the touch type, and the control on the window is allowed to be operated through a touch operation. If the operation types are the key type and the touch type, it indicates that the window supports both the key type and the touch type, and the control on the window is allowed to be operated through a key operation and is also allowed to be operated through a touch operation.
For example, when a window is created, the operation type supported by the window may be specified in advance, and therefore, the operation type supported by the window may be determined, so as to determine the window attribute corresponding to the window.
For example, if the layer attribute of the virtual layer includes the operation type supported by the virtual layer, when determining the virtual layer to which the window to be created belongs, the virtual layer to which the window belongs may also be determined based on the window attribute corresponding to the window. For example, assume that the virtual layer a supports the key type, the virtual layer b supports the touch type, and the virtual layer c supports the key type and the touch type. Then, if the window attribute corresponding to the window to be created indicates that the window supports the key type, it is determined that the virtual layer to which the window belongs is the virtual layer a; if the window attribute indicates that the window supports the touch type, it is determined that the virtual layer to which the window belongs is the virtual layer b; and if the window attribute indicates that the window supports the key type and the touch type, it is determined that the virtual layer to which the window belongs is the virtual layer c.
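For illustration only, a minimal C sketch of this layer selection is given below; the enumeration values and the function name are assumptions of this illustration and follow the three-layer example above (layer a = key, layer b = touch, layer c = key and touch).

    /* Hypothetical operation-type bitmask and layer identifiers. */
    typedef enum {
        OP_TYPE_KEY   = 1u << 0,
        OP_TYPE_TOUCH = 1u << 1
    } op_type_t;

    typedef enum { LAYER_A, LAYER_B, LAYER_C } layer_choice_t;

    /* Choose the virtual layer a window belongs to from the operation types
     * the window supports (its window attribute),
     * e.g. choose_layer_for_window(OP_TYPE_KEY) == LAYER_A. */
    static layer_choice_t choose_layer_for_window(unsigned supported)
    {
        if (supported == (OP_TYPE_KEY | OP_TYPE_TOUCH))
            return LAYER_C;   /* window supports the key type and the touch type */
        if (supported == OP_TYPE_TOUCH)
            return LAYER_B;   /* window supports the touch type */
        return LAYER_A;       /* window supports the key type */
    }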
Illustratively, in addition to the operation type supported by the window, the window attributes of the window may include, but are not limited to, at least one of the following: a window size, a window background, and a window rendering effect, which are not limited herein; the user experience of the graphical user interface can be effectively improved by properly setting the window attributes.
For example, if the layer attribute of the virtual layer includes the operation type supported by the virtual layer, the window attribute corresponding to the window to be created may also be determined based on the layer attribute of the virtual layer to which the window to be created belongs. For example, if the layer attribute indicates that the operation type supported by the virtual layer is the key type, the window attribute may indicate that the operation type supported by the window is the key type; if the layer attribute indicates that the operation type supported by the virtual layer is the touch type, the window attribute may indicate that the operation type supported by the window is the touch type; and if the layer attribute indicates that the operation types supported by the virtual layer are the key type and the touch type, the window attribute may indicate that the operation type supported by the window is the key type and/or the touch type.
For example, in order to make the operation type supported by a window be the key type and/or the touch type, each window has a unique window callback function belonging to the window, and the window callback function is used for processing the message events of the window, where the message events may be key message events and touch message events. If the window supports the key type, the window callback function of the window may be set to process key message events. If the window supports the touch type, the window callback function of the window may be set to process touch message events. If the window supports the key type and the touch type, the window callback function of the window may be set to process both key message events and touch message events.
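For illustration only, the following C sketch shows one possible shape of such a per-window callback, together with a delivery helper that forwards a message event to the window only when the window has been set up for that kind of event; the message types, the window structure, and the printf placeholders are assumptions of this illustration, not details of the present application.

    #include <stdio.h>

    #define OP_KEY   (1u << 0)
    #define OP_TOUCH (1u << 1)

    /* Hypothetical message events handled by a window callback. */
    typedef enum { MSG_KEY, MSG_TOUCH, MSG_REFRESH } msg_type_t;

    typedef struct window {
        unsigned supported_ops;                                  /* OP_KEY and/or OP_TOUCH */
        void (*callback)(struct window *w, msg_type_t msg, int arg);
    } window_t;

    /* Example callback for a window a1 that supports both key and touch. */
    static void window_a1_callback(window_t *w, msg_type_t msg, int arg)
    {
        (void)w;
        switch (msg) {
        case MSG_KEY:     printf("window a1: key event, key code %d\n", arg);   break;
        case MSG_TOUCH:   printf("window a1: touch event, point id %d\n", arg); break;
        case MSG_REFRESH: printf("window a1: redraw\n");                        break;
        }
    }

    /* Deliver a message event to the window only if the window supports it. */
    static void deliver_message(window_t *w, msg_type_t msg, int arg)
    {
        if ((msg == MSG_KEY   && !(w->supported_ops & OP_KEY)) ||
            (msg == MSG_TOUCH && !(w->supported_ops & OP_TOUCH)))
            return;                                  /* event type not supported */
        w->callback(w, msg, arg);
    }

    int main(void)
    {
        window_t a1 = { OP_KEY | OP_TOUCH, window_a1_callback };
        deliver_message(&a1, MSG_KEY, 42);
        deliver_message(&a1, MSG_TOUCH, 1);
        return 0;
    }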
Step 403, creating the window on the virtual layer based on the window attribute, where the window corresponds to the window attribute, and the window attribute may at least include an operation type supported by the window.
For example, if a window a1 needs to be created on the virtual layer a, a window a1 is created on the virtual layer a based on the window attribute of the window a1, the window attribute corresponding to the window a1 includes an operation type supported by the window a1, and the operation type indicates that the window a1 supports a key type and/or a touch type.
In one possible embodiment, see fig. 5, which is a schematic diagram of a control building process.
Step 501, determining a window to which a control to be created belongs.
For example, a control is a control displayed on a window of a user interface, different windows may display different controls, and the control needs to be created on the window, so when creating the control, it is possible to select which window the control is created in, that is, the control belongs to this window. For example, if a control a11 needs to be created on the window a1, when the control a11 is created, the window to which the control a11 belongs is determined to be the window a 1.
Step 502, determining a control type corresponding to a control to be created.
For example, for each created window, controls of different control types may be created on the window, and the control types may include, but are not limited to, at least one of the following types: a button type, a text type, an information type, a progress bar type, and a Graphics Interchange Format (GIF) type. Of course, the above are just a few examples of control types, and the control types are not limited herein. When a control is created, the control type to which the control belongs can be set, so that the control type corresponding to the control to be created can be determined.
Step 503, creating a control matched with the control type on the window, where the control has a control attribute and a control identifier, and the control identifier has uniqueness (i.e., uniqueness with respect to the window).
For example, assuming that the control type corresponding to the control a11 is the button type, a control a11 of the button type is created on the window a1, and the control a11 corresponds to a control attribute and a control identifier. The control attribute corresponding to the control a11 may be any attribute of the control, which is not limited herein. Regarding the control identifier corresponding to the control a11, the control identifier is unique with respect to the window a1, that is, for all the controls created on the window a1, the control identifiers of different controls are different. Of course, the control identifier of a control created on the window a1 may or may not be the same as the control identifier of a control created on another window.
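For illustration only, the following C sketch shows one possible way to create a control of a given control type on a window and to assign it a control identifier that is unique with respect to that window (here a simple per-window counter); the fixed-size control array and all names are assumptions of this illustration, not details of the present application.

    #include <stdlib.h>

    /* Hypothetical control types, mirroring the examples above. */
    typedef enum {
        CTRL_BUTTON, CTRL_TEXT, CTRL_INFO, CTRL_PROGRESS_BAR, CTRL_GIF
    } control_type_t;

    typedef struct {
        control_type_t type;
        int            control_id;   /* unique with respect to its window */
    } control_t;

    typedef struct {
        int        next_control_id;  /* per-window identifier counter */
        control_t *controls[32];
        int        control_count;
    } window_t;

    /* Create a control of the given type on the window and assign it the next
     * identifier that is unique for this window. */
    static control_t *create_control(window_t *win, control_type_t type)
    {
        if (win->control_count >= 32)
            return NULL;

        control_t *ctrl = malloc(sizeof(*ctrl));
        if (ctrl == NULL)
            return NULL;

        ctrl->type       = type;
        ctrl->control_id = win->next_control_id++;   /* unique per window */
        win->controls[win->control_count++] = ctrl;
        return ctrl;
    }

    int main(void)
    {
        window_t a1 = { .next_control_id = 0, .control_count = 0 };
        control_t *a11 = create_control(&a1, CTRL_BUTTON);   /* identifier 0 */
        control_t *a12 = create_control(&a1, CTRL_TEXT);     /* identifier 1 */
        free(a11);
        free(a12);
        return 0;
    }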
Illustratively, after a control has been created on a window, when the window receives a refresh message, the virtual layer corresponding to the window performs drawing processing on the control, and the result is finally displayed on the screen, thereby realizing the drawing of the control.
In summary, a virtual layer may be created first, then a window is created on the virtual layer, then a control is created on the window, when the window receives a refresh message, the virtual layer performs control drawing, and a plurality of virtual layers are finally displayed on the user interface of the embedded device by using a layer fusion method.
In a possible implementation manner, after the embedded device displays the user interface, a target control displayed on a window of the user interface may be operated. The embedded device may receive an operation command for the target control on the user interface and determine the operation type of the operation command, where the operation type is the key type or the touch type. The embedded device then determines the window attributes of the target window to which the target control belongs, where the window attributes of the target window include the operation types supported by the target window, and the operation types are the key type and/or the touch type. On this basis, if the operation type of the operation command is the key type and the target window supports the key type, the target control is triggered to execute the operation command, that is, for the target control, the control of the embedded device can be realized through a key operation. If the operation type of the operation command is the touch type and the target window supports the touch type, the target control is triggered to execute the operation command, that is, for the target control, the control of the embedded device can be realized through a touch operation.
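For illustration only, the following C sketch summarizes this dispatch: the operation type of the received operation command is compared with the operation types supported by the target window to which the target control belongs, and the target control is triggered only when the types match; all names and the printf placeholders are assumptions of this illustration, not details of the present application.

    #include <stdio.h>

    /* Hypothetical operation types of an operation command. */
    typedef enum { OP_KEY = 1u << 0, OP_TOUCH = 1u << 1 } op_type_t;

    typedef struct {
        const char *name;
        unsigned    supported_ops;   /* operation types supported by the window */
    } target_window_t;

    typedef struct {
        const char      *name;
        target_window_t *window;     /* target window the control belongs to */
    } target_control_t;

    static void execute_command(target_control_t *ctrl)
    {
        printf("control %s executes the operation command\n", ctrl->name);
    }

    /* Trigger the control only if the target window supports the operation
     * type (key or touch) of the received operation command. */
    static void handle_operation_command(target_control_t *ctrl, op_type_t cmd_type)
    {
        if (ctrl->window->supported_ops & cmd_type)
            execute_command(ctrl);
        else
            printf("operation type not supported by window %s, command ignored\n",
                   ctrl->window->name);
    }

    int main(void)
    {
        target_window_t  win_a1   = { "a1", OP_KEY | OP_TOUCH };
        target_control_t ctrl_a11 = { "a11", &win_a1 };

        handle_operation_command(&ctrl_a11, OP_KEY);    /* executed */
        handle_operation_command(&ctrl_a11, OP_TOUCH);  /* executed */
        return 0;
    }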
In summary, for the embedded device, the control of the embedded device can be realized through the key operation, and the control of the embedded device can also be realized through the touch operation.
Based on the same application concept as the method, an embodiment of the present application provides a management apparatus for an embedded device, which is applied to an embedded device. Referring to fig. 6, which is a schematic structural diagram of the apparatus, if the embedded device is an embedded device supporting an operation of a first operation type, the apparatus may include:
a receiving module 61, configured to receive an operation command for a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control; a determining module 62, configured to determine an operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type; the processing module 63 is configured to determine whether the operation types supported by the target window to which the target control belongs include a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers; and if so, triggering the target control to execute the operation command.
In a possible implementation manner, if the operation command is an operation command generated when a user performs a first operation, the operation type is a first operation type; the processing module 63 is further configured to: judging whether the operation types supported by the target window to which the target control belongs comprise a first operation type or not; and if so, triggering the target control to execute the operation command.
In a possible implementation manner, if the first operation is a key operation, the second operation is a touch operation, the first operation type is a key type, and the second operation type is a touch type; or if the first operation is a touch operation, the second operation is a key operation, the first operation type is a touch type, and the second operation type is a key type.
In a possible embodiment, the device further comprises (not shown in the figures):
the creating module is used for determining the number M of virtual layers to be created and creating M virtual layers; wherein M can be a positive integer larger than 1, each virtual layer corresponds to a layer identifier and a layer attribute, and the layer identifiers corresponding to the M virtual layers are determined based on the display order of the M virtual layers; determining a display sequence of the M virtual layers based on layer identifiers corresponding to the M virtual layers, and fusing the M virtual layers based on the display sequence to obtain a fused layer, wherein the fused layer comprises the M virtual layers, and the M virtual layers are arranged according to the display sequence; and displaying the fused layer on a user interface of the embedded equipment.
Exemplarily, the creating module is further configured to determine a virtual layer to which a window to be created belongs; determining window attributes corresponding to a window to be created; creating the window on the virtual layer based on the window attribute, wherein the window corresponds to the window attribute, the window attribute at least comprises an operation type supported by the window, and the operation type supported by the window comprises a first operation type and/or a second operation type; the window attributes further include at least one of: window size, window background, window rendering effect.
Illustratively, the creating module is specifically configured to, when determining the window attribute corresponding to the window to be created: determining window attributes corresponding to the window to be created based on the layer attributes of the virtual layer to which the window to be created belongs; if the layer attribute indicates that the virtual layer supports a first operation type, the window attribute indicates that a window supports the first operation type; if the layer attribute represents that the virtual layer supports a second operation type, the window attribute represents that a window supports the second operation type; and if the layer attribute indicates that the virtual layer supports a first operation type and a second operation type, the window attribute indicates that the window supports the first operation type and/or the second operation type.
Based on the same application concept as the method, an embedded device is provided in the embodiment of the present application, and as shown in fig. 7, the embedded device may include: a processor 71 and a machine-readable storage medium 72, the machine-readable storage medium 72 storing machine-executable instructions executable by the processor 71; the processor 71 is configured to execute machine-executable instructions to implement the methods disclosed in the above embodiments.
For example, the processor 71 is configured to execute machine-executable instructions to perform the following steps:
if the embedded device is an embedded device supporting operation of a first operation type, then:
receiving an operation command aiming at a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers;
and if so, triggering the target control to execute the operation command.
Based on the same application concept as the method, embodiments of the present application further provide a machine-readable storage medium, where a plurality of computer instructions are stored on the machine-readable storage medium, and when the computer instructions are executed by a processor, the management method for the embedded device disclosed in the above example of the present application can be implemented.
The machine-readable storage medium may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions, data, and the like. For example, the machine-readable storage medium may be: a RAM (Random Access Memory), a volatile memory, a non-volatile memory, a flash memory, a storage drive (e.g., a hard disk drive), a solid state drive, any type of storage disk (e.g., an optical disk, a DVD, etc.), or a similar storage medium, or a combination thereof.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email messaging device, game console, tablet computer, wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Furthermore, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A management method of an embedded device is applied to the embedded device, and if the embedded device is the embedded device supporting operation of a first operation type, the method comprises the following steps:
receiving an operation command aiming at a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers;
and if so, triggering the target control to execute the operation command.
2. The method according to claim 1, wherein if the operation command is an operation command generated when a user performs a first operation, the operation type is a first operation type;
after determining the operation type of the operation command, the method further includes:
judging whether the operation types supported by the target window to which the target control belongs comprise a first operation type or not;
and if so, triggering the target control to execute the operation command.
3. The method according to claim 1 or 2,
if the first operation is a key operation, the second operation is a touch operation, the first operation type is a key type, and the second operation type is a touch type; or,
if the first operation is a touch operation, the second operation is a key operation, the first operation type is a touch type, and the second operation type is a key type.
4. The method according to claim 1 or 2,
before receiving an operation command for a target control on a user interface, the method further comprises:
determining the number M of virtual layers to be created, and creating M virtual layers; the method comprises the following steps that M is a positive integer larger than 1, each virtual layer corresponds to a layer identifier and a layer attribute, and the layer identifiers corresponding to the M virtual layers are determined based on the display sequence of the M virtual layers;
determining a display sequence of the M virtual layers based on layer identifiers corresponding to the M virtual layers, and fusing the M virtual layers based on the display sequence to obtain fused layers, wherein the fused layers comprise the M virtual layers, and the M virtual layers are arranged according to the display sequence;
and displaying the fused layer on a user interface of the embedded equipment.
5. The method according to claim 1 or 2,
before receiving an operation command for a target control on a user interface, the method further comprises:
determining a virtual layer to which a window to be created belongs;
determining window attributes corresponding to a window to be created;
creating the window on the virtual layer based on the window attribute, wherein the window corresponds to the window attribute, the window attribute at least comprises an operation type supported by the window, and the operation type supported by the window comprises a first operation type and/or a second operation type; wherein the window attributes further include at least one of: window size, window background, window rendering effect.
6. The method of claim 5,
the determining of the window attribute corresponding to the window to be created includes: determining window attributes corresponding to the window to be created based on the layer attributes of the virtual layer to which the window to be created belongs;
if the layer attribute indicates that the virtual layer supports a first operation type, the window attribute indicates that a window supports the first operation type; if the layer attribute represents that the virtual layer supports a second operation type, the window attribute represents that a window supports the second operation type; and if the layer attribute indicates that the virtual layer supports a first operation type and a second operation type, the window attribute indicates that the window supports the first operation type and/or the second operation type.
7. The method according to claim 1 or 2,
before receiving an operation command for a target control on a user interface, the method further comprises:
determining a window to which a control to be created belongs;
determining a control type corresponding to a control to be created;
creating a control matched with the control type on the window, wherein the control corresponds to a control attribute and a control identifier, and the control identifier has uniqueness; wherein the control type is one of the following types: button type, text type, information type, progress bar type, graphic interchange format type.
8. A management device for an embedded device is applied to the embedded device, and if the embedded device is an embedded device supporting an operation of a first operation type, the management device comprises:
the receiving module is used for receiving an operation command aiming at a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
the determining module is used for determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
the processing module is used for judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers; and if so, triggering the target control to execute the operation command.
9. The apparatus according to claim 8, wherein the operation type is a first operation type if the operation command is an operation command generated when a user performs a first operation; the processing module is further configured to: judging whether the operation types supported by the target window to which the target control belongs comprise a first operation type or not; if yes, triggering the target control to execute the operation command;
if the first operation is a key operation, the second operation is a touch operation, the first operation type is a key type, and the second operation type is a touch type; or,
if the first operation is a touch operation, the second operation is a key operation, the first operation type is a touch type, and the second operation type is a key type.
10. An embedded device, comprising: a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor; the processor is configured to execute machine executable instructions to perform the steps of:
if the embedded device is an embedded device supporting an operation of a first operation type, then:
receiving an operation command aiming at a target control on a user interface; the user interface comprises at least two virtual layers, each virtual layer comprises at least one window, and each window comprises at least one control;
determining the operation type of the operation command; if the operation command is an operation command generated when a user executes a second operation, the operation type is a second operation type;
judging whether the operation types supported by the target window to which the target control belongs comprise a second operation type; if the target window is located in the first virtual layer, the operation types supported by the target window include a second operation type; if the target window is located in a second virtual layer, the operation type supported by the target window comprises a first operation type, and the first operation type is different from the second operation type; the first virtual layer is a virtual layer supporting a second operation type in the at least two virtual layers, and the second virtual layer is a virtual layer supporting a first operation type in the at least two virtual layers;
and if so, triggering the target control to execute the operation command.
CN202110496449.1A 2021-05-07 2021-05-07 Management method, device and equipment of embedded equipment Active CN113220180B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110496449.1A CN113220180B (en) 2021-05-07 2021-05-07 Management method, device and equipment of embedded equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110496449.1A CN113220180B (en) 2021-05-07 2021-05-07 Management method, device and equipment of embedded equipment

Publications (2)

Publication Number Publication Date
CN113220180A true CN113220180A (en) 2021-08-06
CN113220180B CN113220180B (en) 2023-04-07

Family

ID=77091686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110496449.1A Active CN113220180B (en) 2021-05-07 2021-05-07 Management method, device and equipment of embedded equipment

Country Status (1)

Country Link
CN (1) CN113220180B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101763214A (en) * 2009-12-30 2010-06-30 宇龙计算机通信科技(深圳)有限公司 Mobile terminal display page zoom method, system and mobile terminal
CN102682182A (en) * 2011-03-09 2012-09-19 上海思穆电子科技有限公司 Multiscreen independent-operated vehicle information system with novel architecture
EP2629190A1 (en) * 2012-02-20 2013-08-21 Samsung Electronics Co., Ltd. Supporting touch input and key input in an electronic device
CN104793788A (en) * 2015-03-31 2015-07-22 小米科技有限责任公司 Key operating method and device
CN111190565A (en) * 2020-04-13 2020-05-22 延锋伟世通电子科技(南京)有限公司 Multi-screen interaction system and method based on single host and single system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101763214A (en) * 2009-12-30 2010-06-30 宇龙计算机通信科技(深圳)有限公司 Mobile terminal display page zoom method, system and mobile terminal
CN102682182A (en) * 2011-03-09 2012-09-19 上海思穆电子科技有限公司 Multiscreen independent-operated vehicle information system with novel architecture
EP2629190A1 (en) * 2012-02-20 2013-08-21 Samsung Electronics Co., Ltd. Supporting touch input and key input in an electronic device
WO2013125789A1 (en) * 2012-02-20 2013-08-29 Samsung Electronics Co., Ltd. Electronic apparatus, method for controlling the same, and computer-readable storage medium
CN104793788A (en) * 2015-03-31 2015-07-22 小米科技有限责任公司 Key operating method and device
CN111190565A (en) * 2020-04-13 2020-05-22 延锋伟世通电子科技(南京)有限公司 Multi-screen interaction system and method based on single host and single system

Also Published As

Publication number Publication date
CN113220180B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
US11809693B2 (en) Operating method for multiple windows and electronic device supporting the same
US11023097B2 (en) Mobile terminal and message-based conversation operation method for grouping messages
US20140013271A1 (en) Prioritization of multitasking applications in a mobile device interface
US9176747B2 (en) User-application interface
WO2019041779A1 (en) Terminal interface switching, moving and gesture processing method and device and terminal
CN111651116A (en) Split screen interaction method, electronic equipment and computer storage medium
KR20160005609A (en) Method for displaying graphic user interface and electronic device supporting the same
EP2682850A1 (en) Prioritization of multitasking applications in a mobile device interface
US11354021B2 (en) Method, device, terminal and storage medium for displaying icons
CN108319410A (en) Method and apparatus for controlling the menu in media apparatus
WO2020151446A1 (en) Method and device for setting mode of monitoring system
KR20120138618A (en) Method and apparatus for operating multi tasking in a mobile device
EP4209870A1 (en) Split-screen display control method and apparatus, and electronic device and storage medium
WO2019242543A1 (en) Method and apparatus for generating group avatar
US20080143673A1 (en) Method and Apparatus For Moving Cursor Using Numerical Keys
US20200028961A1 (en) Switching presentations of representations of objects at a user interface
CN113220180B (en) Management method, device and equipment of embedded equipment
CN117555459A (en) Application group processing method and device, storage medium and electronic equipment
US20180136789A1 (en) Sender-initiated control of information display within multiple-partition user interface
US9146651B1 (en) Displaying multiple applications on limited capability devices
EP2304581A1 (en) Self-management of local resources allocated remotely
US11054968B2 (en) Method and a device for managing a plurality of messages simultaneously
US10481791B2 (en) Magnified input panels
WO2022111695A1 (en) Input method keyboard display method and apparatus, and terminal device
JP6536666B2 (en) Drawing control device, control program therefor, and drawing control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant