CN117407109A - Interface display method, device, equipment and storage medium - Google Patents

Interface display method, device, equipment and storage medium

Info

Publication number
CN117407109A
CN117407109A
Authority
CN
China
Prior art keywords
human
layer
computer interface
interface
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311439625.3A
Other languages
Chinese (zh)
Inventor
刘定虎 (Liu Dinghu)
何平 (He Ping)
冷飞 (Leng Fei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecarx Hubei Tech Co Ltd
Original Assignee
Ecarx Hubei Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecarx Hubei Tech Co Ltd filed Critical Ecarx Hubei Tech Co Ltd
Priority claimed from CN202311439625.3A
Publication of CN117407109A
Legal status: Pending

Classifications

    • G06F 9/451: Execution arrangements for user interfaces (G Physics; G06 Computing, calculating or counting; G06F Electric digital data processing; G06F 9/00 Arrangements for program control; G06F 9/44 Arrangements for executing specific programs)
    • G06F 9/4406: Loading of operating system (same G06F 9/44 branch, via G06F 9/4401 Bootstrapping)

Abstract

The invention discloses an interface display method, device, equipment and storage medium. The method comprises the following steps: outputting a startup animation and determining a human-computer interface layer, the layer being invisible; starting a human-computer interface; after the output of the startup animation is finished, setting the human-computer interface layer to be visible to obtain a target human-computer interface layer; and outputting the human-computer interface by using the target human-computer interface layer. According to this technical scheme, while the startup animation is being output, the human-computer interface is hidden by the invisible human-computer interface layer, and after the startup animation is output, the human-computer interface is revealed by making its layer visible. The startup timing of the startup animation and the human-computer interface therefore need not be coordinated, the phenomenon of a black screen or a flickering screen is avoided, and the startup animation is smoothly connected to the human-computer interface.

Description

Interface display method, device, equipment and storage medium
Technical Field
The present invention relates to the field of intelligent terminals, and in particular, to an interface display method, apparatus, device, and storage medium.
Background
Intelligent electronic devices, such as smart phones, smart televisions and automotive intelligent cockpits, play an important role in modern society and have a profound effect on life, work, the economy and social development. The graphics display module is a very important part of an intelligent device, and its startup display speed and fluency directly influence the user experience, product acceptance and competitiveness.
Currently, in an intelligent device with a Linux operating system, the preconditions for a graphics application HMI (Human Machine Interface) to display are that the Linux graphics driver DRM (Direct Rendering Manager) has been loaded successfully and that a basic window management system, Wayland or X11, has started successfully. Loading the driver and running the window management service take a certain amount of time, so to improve the user experience a startup animation is usually played when the device boots; for example, a smart phone displays the manufacturer's trademark at power-on before entering the user operation interface. The startup animation generally only needs to display a few pictures or a short video; it does not depend on the window system, requires few resources, and its display output is efficient and fast, so it can be shown as soon as the kernel begins to load or immediately after the kernel has loaded.
However, when a startup animation is used, it must be ensured that it does not conflict with the timing of the HMI startup. Although the timing of the startup animation and the HMI display can be tightly controlled to avoid display conflicts between them, starting the HMI only after the startup animation ends leaves a blank period between the two, during which the device shows a black screen.
Disclosure of Invention
The invention provides an interface display method, device, equipment and storage medium, which are used to solve the problem of the hand-over between the startup animation and the HMI display.
In a first aspect, the present invention provides an interface display method, including:
outputting a starting-up animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible;
starting a human-computer interface, and setting the human-computer interface layer to be visible after the starting-up animation is output, so as to obtain a target human-computer interface layer;
and outputting the human-computer interface by utilizing the target human-computer interface layer.
In a second aspect, the present invention provides an interface display device, comprising:
the interface layer determining module is used for outputting a startup animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible;
the target layer determining module is used for starting a human-computer interface, and setting the human-computer interface layer to be visible after the output of the starting-up animation is finished, so as to obtain a target human-computer interface layer;
and the interface output module is used for outputting the human-computer interface by utilizing the target human-computer interface layer.
In a third aspect, the present invention provides an electronic device comprising:
at least one processor;
and a memory communicatively coupled to the at least one processor;
wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the interface display method of the first aspect described above.
In a fourth aspect, the present invention provides a computer readable storage medium storing computer instructions for causing a processor to execute the interface display method of the first aspect.
The interface display scheme provided by the invention outputs the startup animation and determines a human-computer interface layer that is invisible, starts the human-computer interface, sets the human-computer interface layer to be visible after the output of the startup animation is finished to obtain a target human-computer interface layer, and outputs the human-computer interface by using the target human-computer interface layer. With this technical scheme, while the startup animation is being output, the human-computer interface is hidden by the invisible human-computer interface layer, and after the startup animation is output, the human-computer interface is revealed by making its layer visible, so the startup timing of the startup animation and the human-computer interface need not be coordinated, the phenomenon of a black screen or a flickering screen is avoided, and the startup animation is smoothly connected to the human-computer interface.
It should be understood that the description in this section is not intended to identify key or critical features of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of an interface display method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a layer structure according to a first embodiment of the present invention;
FIG. 3 is a flowchart of a screen display according to a first embodiment of the present invention;
FIG. 4 is a flowchart of an interface display method according to a second embodiment of the present invention;
fig. 5 is a schematic structural diagram of an interface display device according to a third embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description, the claims, and the above figures are used to distinguish similar objects and are not necessarily used to describe a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or described herein. In the description of the present invention, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, that A and B both exist, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of an interface display method according to an embodiment of the present invention. The method may be applied to displaying a human-computer interface together with a startup animation and may be performed by an interface display device, which may be implemented in hardware and/or software and configured in an electronic device, such as a vehicle; the electronic device may consist of one physical entity or of two or more physical entities.
As shown in fig. 1, the interface display method provided in the first embodiment of the present invention specifically includes the following steps:
s101, outputting a startup animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible.
Specifically, the startup animation is generally displayed (i.e. output) directly through the FB (frame buffer) driver, while the human-computer interface is displayed through the Wayland/Weston display framework. The two display modes are independent of each other, but there is only one physical display path, so when the startup animation and the human-computer interface are displayed simultaneously a resource-competition problem occurs and the screen flickers. In this embodiment, the startup animation may be displayed first after booting, and then whether an invisible human-computer interface layer has already been created is determined according to the layer identifier. Fig. 2 is a schematic diagram of a layer structure including an infotainment screen 21, infotainment layers 22 and infotainment surfaces 23. As shown in fig. 2, in the Weston infotainment mode the displayed screen may be abstracted as an infotainment screen 21 (denoted iv screen); a plurality of infotainment layers 22 (denoted iv layers) may be created on each screen, and a plurality of infotainment surfaces 23 (denoted iv surfaces) may be placed on each layer. A visibility attribute can be added to the data structure of a Weston layer, and the human-computer interface layer is made invisible by setting the value of this attribute to zero. In addition, the operation that sets the visibility attribute can be packaged into a Weston client program so that other programs can conveniently call it.
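The screen/layer/surface hierarchy and the visibility attribute described above can be sketched with a minimal, hypothetical model. The class and method names below (IviScreen, IviLayer, set_visibility) are illustrative only and are not the Weston API:

```python
class IviSurface:
    """A drawable handed to the compositor (an iv surface)."""
    def __init__(self, surface_id):
        self.surface_id = surface_id

class IviLayer:
    """An iv layer: carries a visibility attribute and holds surfaces."""
    def __init__(self, layer_id, visible=False):
        self.layer_id = layer_id
        self.visible = visible      # zero/False: the layer is not composited
        self.surfaces = []

    def add_surface(self, surface):
        self.surfaces.append(surface)

    def set_visibility(self, value):
        self.visible = bool(value)

class IviScreen:
    """An iv screen: the physical display, holding a stack of layers."""
    def __init__(self):
        self.layers = []

    def add_layer(self, layer):
        self.layers.append(layer)

    def composited_surfaces(self):
        # Only surfaces on visible layers reach the display output.
        return [s for layer in self.layers if layer.visible
                for s in layer.surfaces]
```

In such a model, a surface placed on an invisible layer is simply skipped during composition, which is what keeps the HMI hidden while the startup animation owns the display path.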
S102, starting a human-computer interface, and setting the human-computer interface layer to be visible after the output of the startup animation is finished, so as to obtain a target human-computer interface layer.
In this embodiment, the man-machine interface may be started after the startup animation is displayed, or may be started while the startup animation is displayed, but the man-machine interface is not displayed at this time because the man-machine interface layer is not visible. And after the startup animation is displayed, the human-computer interface layer can be set to be visible immediately, so that the target human-computer interface layer is obtained.
S103, outputting the human-computer interface by utilizing the target human-computer interface layer.
In this embodiment, by using the visible target human-machine interface layer, display of the human-machine interface can be achieved.
According to the interface display method provided by the embodiment of the invention, the startup animation is output and a human-computer interface layer is determined, the layer being invisible; the human-computer interface is started; the human-computer interface layer is set to be visible after the output of the startup animation is finished, obtaining a target human-computer interface layer; and the human-computer interface is output by using the target human-computer interface layer. With this technical scheme, while the startup animation is being output, the human-computer interface is hidden by the invisible human-computer interface layer, and after the startup animation is output, the human-computer interface is revealed by making its layer visible, so the startup timing of the startup animation and the human-computer interface need not be coordinated, the phenomenon of a black screen or a flickering screen is avoided, and the startup animation is smoothly connected to the human-computer interface.
Optionally, before determining the human-computer interface layer, the method further includes: judging, according to the layer identification, whether a target layer corresponding to the human-computer interface exists. Determining the human-computer interface layer then includes: if a target layer corresponding to the human-computer interface exists, setting the target layer to be invisible to obtain the human-computer interface layer; if not, creating a human-computer interface layer. The advantage of this arrangement is that judging whether the target layer already exists, and merely hiding it when it does, avoids the layer redundancy caused by repeatedly creating layers and further ensures that the target layer is invisible.
Specifically, fig. 3 is a flowchart of screen display. As shown in fig. 3, after the in-vehicle system (kernel) is powered on and begins to start, the startup animation is sent directly to the screen through the FB channel, and then the Weston service, which manages the window system, starts. The Weston service may determine, according to the layer identifiers of the existing layers, whether a target layer corresponding to the human-computer interface exists; if so, it determines whether the target layer is visible and, if visible, sets it to be invisible. If the target layer does not exist, a human-computer interface layer needs to be created.
Further, the creating a human-computer interface layer includes: and calling a Weston toolkit in the Weston service to create an initial layer, and setting the initial layer to be invisible to obtain a human-computer interface layer. The advantage of this arrangement is that the initial layer arrangement can be quickly implemented by using the Weston service.
Specifically, as shown in fig. 3, a layer (of a human-computer interface) may be created, and the specific process includes: and calling a Weston toolkit in the Weston service to create an initial layer, and setting the initial layer to be invisible to obtain a human-computer interface layer.
Further, after the man-machine interface is started, the method further comprises: if the human-machine interface layer is a newly created layer, binding the human-machine interface with the layer identification of the human-machine interface layer, and completing the rendering preparation task of the human-machine interface. The benefit of this arrangement is that by binding the layer identifications and preparing for rendering, it is ensured that subsequent HMIs can be quickly displayed.
Specifically, as shown in fig. 3, after the HMI is started, the layer identifier of the newly created layer may be bound to the human-machine interface, and a rendering preparation task of the human-machine interface is completed. If the human-computer interface layer is not the newly created layer, the rendering preparation task of the human-computer interface can be directly completed.
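The find-or-create-and-bind sequence described in this example can be sketched as follows. This is a hedged illustration with assumed helper names and an assumed layer identifier, not the Weston or ivi layer-management API:

```python
HMI_LAYER_ID = 1000  # assumed identifier for the human-computer interface layer

def prepare_hmi_layer(existing_layers):
    """Return (layer, newly_created) with the layer guaranteed invisible."""
    layer = existing_layers.get(HMI_LAYER_ID)
    if layer is not None:
        layer["visible"] = False           # reuse the existing layer: just hide it
        return layer, False
    # Create the layer invisible; keying by identifier avoids duplicate creation.
    layer = {"id": HMI_LAYER_ID, "visible": False, "bound_hmi": None}
    existing_layers[HMI_LAYER_ID] = layer
    return layer, True

def start_hmi(layer, newly_created, hmi_name="hmi-app"):
    """Start the HMI; bind it to a newly created layer, then prepare rendering."""
    if newly_created:
        layer["bound_hmi"] = hmi_name      # bind HMI to the layer identifier
    return "render-ready"                  # rendering preparation task completed
```

Reusing an existing (hidden) layer rather than always creating one is what the judging step above buys: no redundant layers, and the layer is invisible in either branch.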
Example two
Fig. 4 is a flowchart of an interface display method provided by the second embodiment of the present invention, and the technical solution of the embodiment of the present invention is further optimized based on the above-mentioned alternative technical solutions, and a specific manner of displaying the human-machine interface and the startup animation is provided.
Optionally, after the outputting of the boot animation is finished, setting the human-machine interface layer to be visible includes: and after the output of the starting-up animation is finished, if the release of the resources occupied by the starting-up animation is determined to be finished, setting the human-computer interface layer to be visible through an initializing system interface. The advantage of this arrangement is that the man-machine interface layer can be quickly set visible by means of the initialisation system interface after the output of the boot animation has ended.
Optionally, after the outputting the human-machine interface by using the target human-machine interface layer, the method further includes: if the current system is determined to enter a second preset mode from a current first preset mode, setting the target human-computer interface layer to be invisible, obtaining a first layer, and outputting the human-computer interface by utilizing the first layer, wherein the first preset mode is different from the second preset mode. The advantage of this arrangement is that if the current system enters the preset mode, the human-machine interface is not required to be closed, and the human-machine interface can be quickly hidden by setting the target human-machine interface layer invisible.
Optionally, after the outputting the human-computer interface by using the first layer, the method further includes: if it is determined that the current system enters the first preset mode from the second preset mode, setting the first layer to be visible to obtain a second layer, and outputting the human-computer interface by using the second layer. The advantage of this arrangement is that when the current system exits the preset mode, the human-computer interface does not need to be restarted: setting the first layer to be visible quickly wakes the human-computer interface, realizes fast switching of the display interface, and avoids the screen flicker caused by conflicting display output.
As shown in fig. 4, the interface display method provided in the second embodiment of the present invention specifically includes the following steps:
s201, outputting a startup animation.
S202, judging whether a target layer corresponding to the human-computer interface exists according to the layer identification; if so, execute step S205; if not, execute step S203.
S203, calling a Weston toolkit in the Weston service to create an initial layer, and setting the initial layer to be invisible to obtain a human-computer interface layer.
S204, starting the human-computer interface, binding the human-computer interface with the layer identification of the human-computer interface layer, completing the rendering preparation task of the human-computer interface, and executing step S206.
S205, setting the target layer as invisible, obtaining a human-computer interface layer, and completing the rendering preparation task of the human-computer interface.
S206, after the output of the startup animation is finished, if the release of the resources occupied by the startup animation is determined to be finished, the human-computer interface layer is set to be visible through an initialization system interface.
Specifically, as shown in fig. 3, the HMI is in a sleep state after being started and waits to be awakened; in this state it does not occupy CPU (Central Processing Unit) or GPU (Graphics Processing Unit) resources. As shown in fig. 3, after the output of the startup animation is finished, if it is determined that the resources occupied by the startup animation have been released, an instruction may be sent through the initialization system interface to set the HMI layer visible, and the HMI is awakened.
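The hand-over at the end of the startup animation can be sketched as a small guard. The state fields for animation completion and resource release are assumptions made for illustration:

```python
def finish_boot_animation(state):
    """Mark the startup animation as finished and its FB resources released."""
    state["animation_done"] = True
    state["fb_resources_released"] = True   # the FB display path is now free

def try_show_hmi(state, hmi_layer):
    """Called via the initialization-system hook once the animation reports completion.

    The layer only becomes visible after the animation's resources are
    released, so the two display paths never compete for the screen.
    """
    if state.get("animation_done") and state.get("fb_resources_released"):
        hmi_layer["visible"] = True         # target human-computer interface layer
        state["hmi_state"] = "awake"        # wake the sleeping HMI
        return True
    return False
```

Guarding on resource release before flipping visibility is what prevents the resource-competition flicker described in the background section.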
S207, outputting the human-computer interface by utilizing the target human-computer interface layer.
Specifically, as shown in fig. 3, the Weston service normally synthesizes and displays the target human-machine interface layer, and the HMI is displayed in the screen.
S208, if the current system is determined to enter a second preset mode from the current first preset mode, setting the target human-machine interface layer to be invisible, obtaining a first layer, and outputting the human-machine interface by using the first layer.
Wherein the first preset mode is different from the second preset mode.
For example, if the vehicle is temporarily parked, as shown in fig. 3, the current system may enter the second preset mode (e.g., a rest mode) from the first preset mode (e.g., a normal driving mode). Then, as shown in fig. 3, the Weston service may be used to set the target human-computer interface layer to be invisible, and the human-computer interface is hidden by this invisible layer (the first layer), which is no longer composited for display. At this point, as shown in fig. 3, the HMI enters a sleep state and waits to be awakened.
S209, if the current system enters the first preset mode from the second preset mode, setting the first layer to be visible, obtaining a second layer, and outputting a man-machine interface by using the second layer.
For example, as shown in fig. 3, if the current system exits the second preset mode and enters the first preset mode, the invisible first layer may be set to be visible by using the Weston service, and the HMI is awakened. Then, as shown in fig. 3, after the HMI wakes up, the Weston service composites the second layer normally and outputs it to the screen.
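The mode-switch behaviour of S208 and S209 can be sketched as follows. The mode names are placeholders for the first and second preset modes in the text, and a plain dictionary stands in for the Weston layer object:

```python
def on_mode_change(current_mode, new_mode, hmi_layer):
    """Switch display modes by toggling layer visibility only.

    The HMI process is never closed or restarted; hiding and waking it
    are both just visibility changes on its layer.
    """
    if current_mode == "normal" and new_mode == "rest":
        hmi_layer["visible"] = False   # hide the HMI; the process keeps running
    elif current_mode == "rest" and new_mode == "normal":
        hmi_layer["visible"] = True    # wake the HMI; no restart needed
    return new_mode
```

Because neither transition touches the HMI process itself, switching modes is as fast as one visibility update plus a recomposite, which is the quick-switching benefit claimed above.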
According to the interface display method provided by the embodiment of the invention, after the output of the startup animation is finished, the human-computer interface layer can be quickly set to be visible through the initialization system interface. After the human-computer interface is displayed by using this layer, if the current system enters the preset mode, the human-computer interface can be quickly hidden by setting the target human-computer interface layer to be invisible, without closing the human-computer interface; if the current system exits the preset mode, the human-computer interface can be quickly awakened by setting the first layer to be visible, without restarting it. Fast switching of the display interface is thus realized, and the screen flicker caused by conflicting display output is avoided.
Example III
Fig. 5 is a schematic structural diagram of an interface display device according to a third embodiment of the present invention. As shown in fig. 5, the apparatus includes: an interface layer determination module 301, a target layer determination module 302, and an interface output module 303, wherein:
the interface layer determining module is used for outputting a startup animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible;
the target layer determining module is used for starting a human-computer interface, and setting the human-computer interface layer to be visible after the output of the starting-up animation is finished, so as to obtain a target human-computer interface layer;
and the interface output module is used for outputting the human-computer interface by utilizing the target human-computer interface layer.
According to the interface display device provided by the embodiment of the invention, while the startup animation is being output, the human-computer interface is hidden by the invisible human-computer interface layer, and after the startup animation is output, the human-computer interface is revealed by making its layer visible, so the startup timing of the startup animation and the human-computer interface need not be coordinated, the phenomenon of a black screen or a flickering screen is avoided, and the startup animation is smoothly connected to the human-computer interface.
Optionally, the target layer determining module includes:
and the first layer setting unit is used for setting the human-computer interface layer to be visible through an initializing system interface if the release of the resources occupied by the startup animation is determined to be completed after the startup animation is output.
Optionally, the apparatus further comprises:
and the first output module is used for setting the target human-computer interface layer to be invisible if the current system is determined to enter a second preset mode from a current first preset mode after the human-computer interface is output by using the target human-computer interface layer, so as to obtain a first layer, and outputting the human-computer interface by using the first layer, wherein the first preset mode is different from the second preset mode.
Optionally, the apparatus further comprises:
and the second output module is used for setting the first image layer to be visible after the human-computer interface is output by using the first image layer, and outputting the human-computer interface by using the second image layer if the current system is determined to enter the first preset mode from the second preset mode.
Optionally, the apparatus further comprises:
and the judging module is used for judging whether a target layer corresponding to the human-computer interface exists or not according to the layer identification before the human-computer interface layer is determined.
Optionally, the interface layer determining module includes:
the second layer setting unit is used for setting the target layer to be invisible to obtain the human-computer interface layer if the information returned by the judging module is that a target layer corresponding to the human-computer interface exists;
and the layer creation unit is used for creating the human-computer interface layer if the information returned by the judging module is that no target layer corresponding to the human-computer interface exists.
Optionally, the creating a human-computer interface layer includes: and calling a Weston toolkit in the Weston service to create an initial layer, and setting the initial layer to be invisible to obtain a human-computer interface layer.
Optionally, the apparatus further comprises:
and the rendering preparation module is used for binding the human-computer interface with the layer mark of the human-computer interface layer and completing the rendering preparation task of the human-computer interface if the human-computer interface layer is a newly created layer after the human-computer interface is started.
The interface display device provided by the embodiment of the invention can execute the interface display method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 6 shows a schematic diagram of an electronic device 40 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as car computers, laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 6, the electronic device 40 includes at least one processor 41 and a memory communicatively connected to the at least one processor 41, such as a Read Only Memory (ROM) 42 and a Random Access Memory (RAM) 43. The memory stores a computer program executable by the at least one processor, and the processor 41 may perform various suitable actions and processes according to the computer program stored in the ROM 42 or the computer program loaded from the storage unit 48 into the RAM 43. In the RAM 43, various programs and data required for the operation of the electronic device 40 may also be stored. The processor 41, the ROM 42 and the RAM 43 are connected to each other via a bus 44. An input/output (I/O) interface 45 is also connected to the bus 44.
Various components in the electronic device 40 are connected to the I/O interface 45, including: an input unit 46 such as a keyboard or a mouse; an output unit 47 such as various types of displays and speakers; a storage unit 48 such as a magnetic disk or an optical disk; and a communication unit 49 such as a network card, a modem, or a wireless communication transceiver. The communication unit 49 allows the electronic device 40 to exchange information/data with other devices via a computer network such as the Internet and/or various telecommunication networks.
The processor 41 may be any of various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the processor 41 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 41 performs the respective methods and processes described above, such as the interface display method.
In some embodiments, the interface display method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as the storage unit 48. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 40 via the ROM 42 and/or the communication unit 49. When the computer program is loaded into the RAM 43 and executed by the processor 41, one or more steps of the interface display method described above may be performed. Alternatively, in other embodiments, the processor 41 may be configured to perform the interface display method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
The electronic device provided above can be used to execute the interface display method provided by any of the embodiments, and has corresponding functions and beneficial effects.
Example five
In the context of the present invention, a computer-readable storage medium may be a tangible medium storing computer instructions which, when executed by a computer processor, cause an interface display method to be performed, the method comprising:
outputting a starting-up animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible;
starting a human-computer interface, and setting the human-computer interface layer to be visible after the starting-up animation is output, so as to obtain a target human-computer interface layer;
and outputting the human-computer interface by utilizing the target human-computer interface layer.
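The three steps recited above can be illustrated with a small state-machine sketch. All class and method names below are hypothetical, chosen only for this illustration; they do not belong to the patent's implementation or to any real compositor API.

```python
class HmiLayerController:
    """Illustrative model of the claimed sequence: the human-computer
    interface (HMI) layer stays invisible while the boot animation plays,
    and is switched to visible only after the animation has finished."""

    def __init__(self):
        self.layer_visible = False        # HMI layer is determined as invisible
        self.hmi_started = False

    def output_boot_animation(self):
        # Step 1: play the boot animation; the HMI layer is kept
        # invisible so it cannot flash over the animation.
        self.layer_visible = False

    def start_hmi(self):
        # Step 2a: the HMI may start (and render) while the animation
        # is still playing, because its layer is hidden.
        self.hmi_started = True

    def on_boot_animation_finished(self):
        # Step 2b: once the animation output ends, make the layer
        # visible, yielding the "target human-computer interface layer".
        if self.hmi_started:
            self.layer_visible = True

    def output_hmi(self):
        # Step 3: output the HMI using the (now visible) target layer.
        return "HMI displayed" if self.layer_visible else "layer hidden"


ctrl = HmiLayerController()
ctrl.output_boot_animation()
ctrl.start_hmi()
print(ctrl.output_hmi())            # layer still hidden: animation playing
ctrl.on_boot_animation_finished()
print(ctrl.output_hmi())            # target layer visible: HMI displayed
```

The point of the sequencing is that HMI startup overlaps with the animation, but the user never sees a partially rendered interface, because visibility is toggled only after the animation output is finished.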
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer-readable storage medium may be a machine-readable signal medium. More specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer-readable storage medium provided above can be used to execute the interface display method provided by any of the embodiments, and has corresponding functions and beneficial effects.
It should be noted that, in the embodiment of the interface display device, the units and modules included are divided only according to functional logic, and the division is not limited to the above as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only used to distinguish them from one another and are not used to limit the protection scope of the present invention.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made by those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to those embodiments, but may include many other equivalent forms without departing from the spirit of the invention, the scope of which is defined by the appended claims.

Claims (10)

1. An interface display method, comprising:
outputting a starting-up animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible;
starting a human-computer interface, and setting the human-computer interface layer to be visible after the starting-up animation is output, so as to obtain a target human-computer interface layer;
and outputting the human-computer interface by utilizing the target human-computer interface layer.
2. The method of claim 1, wherein setting the human-computer interface layer to be visible after the output of the starting-up animation is finished comprises:
and after the output of the starting-up animation is finished, if the release of the resources occupied by the starting-up animation is determined to be finished, setting the human-computer interface layer to be visible through an initializing system interface.
3. The method of claim 1, further comprising, after said outputting said human-computer interface with said target human-computer interface layer:
if the current system is determined to enter a second preset mode from a current first preset mode, setting the target human-computer interface layer to be invisible, obtaining a first layer, and outputting the human-computer interface by utilizing the first layer, wherein the first preset mode is different from the second preset mode.
4. The method of claim 3, further comprising, after said outputting said human-computer interface using said first layer:
if the current system is determined to enter the first preset mode from the second preset mode, setting the first layer to be visible, obtaining a second layer, and outputting the human-computer interface by using the second layer.
5. The method of claim 1, further comprising, prior to said determining a human-machine interface layer:
judging whether a target layer corresponding to the human-computer interface exists or not according to the layer identification;
wherein the determining the human-machine interface layer comprises:
if a target layer corresponding to the human-computer interface exists, setting the target layer to be invisible, and obtaining the human-computer interface layer;
if not, a human-computer interface layer is created.
6. The method of claim 5, wherein creating a human-machine interface layer comprises:
and calling a Weston toolkit in the Weston service to create an initial layer, and setting the initial layer to be invisible to obtain a human-computer interface layer.
7. The method of claim 6, further comprising, after said starting a human-computer interface:
if the human-computer interface layer is a newly created layer, binding the human-computer interface with the layer identification of the human-computer interface layer, and completing the rendering preparation task of the human-computer interface.
8. An interface display device, comprising:
the interface layer determining module is used for outputting a startup animation and determining a human-computer interface layer, wherein the human-computer interface layer is invisible;
the target layer determining module is used for starting a human-computer interface, and setting the human-computer interface layer to be visible after the output of the starting-up animation is finished, so as to obtain a target human-computer interface layer;
and the interface output module is used for outputting the human-computer interface by utilizing the target human-computer interface layer.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the interface display method of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to implement the interface display method of any one of claims 1-7 when executed.
CN202311439625.3A 2023-11-01 2023-11-01 Interface display method, device, equipment and storage medium Pending CN117407109A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311439625.3A CN117407109A (en) 2023-11-01 2023-11-01 Interface display method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117407109A true CN117407109A (en) 2024-01-16

Family

ID=89497789

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311439625.3A Pending CN117407109A (en) 2023-11-01 2023-11-01 Interface display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117407109A (en)

Similar Documents

Publication Publication Date Title
CN110968415B (en) Scheduling method and device of multi-core processor and terminal
CN112644276B (en) Screen display method, vehicle and computer storage medium
CN114936173B (en) Read-write method, device, equipment and storage medium of eMMC device
CN117407109A (en) Interface display method, device, equipment and storage medium
CN116721007A (en) Task control method, system and device, electronic equipment and storage medium
CN115373618B (en) Multi-screen display method and device, vehicle machine and storage medium
CN115756615A (en) Quick starting method, device, equipment and storage medium
CN115437709A (en) Method and device for loading application home page splash screen resources
CN113867145A (en) Application control method and device, electronic equipment and storage medium
CN112764822A (en) Operating system starting method, device, equipment and medium
CN110704157A (en) Application starting method, related device and medium
CN117827355A (en) Theme switching method and device, vehicle and medium
CN115826898B (en) Cross-screen display method, system, device, equipment and storage medium
CN113656085B (en) Meter starting method, apparatus, device, storage medium and program product
CN115686748B (en) Service request response method, device, equipment and medium under virtualization management
CN113268300B (en) Information display method and device
CN114228745B (en) Driving system module control method, device, equipment, medium, product and vehicle
CN116149755A (en) Method, device, equipment and storage medium for starting central control screen of automobile
CN115062310A (en) Vehicle-mounted application program starting method and device, electronic equipment and storage medium
CN117407111A (en) Method and device for starting graphic display service, electronic equipment and storage medium
CN117908820A (en) Multi-virtual machine screen cutting method, device, equipment and storage medium
CN115373752A (en) Service processing method, device and storage medium
CN116185396A (en) QT graphic display method, operating system, electronic equipment and storage medium
CN115562605A (en) Vehicle starting animation display method and device, electronic equipment and storage medium
CN116302187A (en) Method and device for running software system of vehicle, vehicle-mounted host, medium and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination