CN112817682A - Interface display method, electronic device and non-volatile computer readable storage medium


Info

Publication number
CN112817682A
Authority
CN
China
Prior art keywords
application scene
type
instruction
application
post
Prior art date
Legal status
Pending
Application number
CN202110193569.4A
Other languages
Chinese (zh)
Inventor
黄文涛
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110193569.4A
Publication of CN112817682A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interface display method, an electronic device, and a non-volatile computer-readable storage medium. The display method comprises the following steps: intercepting graphics API instructions of an application program; shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI and the second type of graphics API instructions corresponds to the application scene; preprocessing the UI according to the first type of graphics API instructions, and/or preprocessing the application scene according to the second type of graphics API instructions; and displaying an interface formed by the processed UI and the application scene. In the presentation method, the electronic device, and the non-volatile computer-readable storage medium, the graphics API instructions are shunted and divided into a UI part and a scene part, so that the UI and the application scene are processed independently.

Description

Interface display method, electronic device and non-volatile computer readable storage medium
Technical Field
The present application relates to the field of interface display, and more particularly, to a method for displaying an interface, an electronic device, and a non-volatile computer-readable storage medium.
Background
At present, computer application programs often need to call various graphics Application Programming Interfaces (APIs). These APIs serve the various scenes in an application program as well as the User Interface (UI) controls used within those scenes. Appropriate scene design can convey information quickly and accurately and highlight the theme, so that participants enjoy a vivid visual effect, while good UI design makes each scene more individual, so that the application program can be operated more comfortably and simply in every scene and its characteristics are fully embodied. Application programs place increasingly high requirements on the UI and the application scene, yet the UI is often bound to information in the scene such as characters, maps, and buildings, which makes it difficult to process the UI and the application scene independently.
Disclosure of Invention
The embodiments of the application provide an interface display method, an electronic device, and a non-volatile computer-readable storage medium.
The interface display method of the embodiments of the application comprises the following steps: intercepting graphics API instructions of an application program; shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI and the second type of graphics API instructions corresponds to an application scene; preprocessing the UI according to the first type of graphics API instructions, and/or preprocessing the application scene according to the second type of graphics API instructions; and displaying an interface formed by the processed UI and the application scene.
The electronic device of the embodiments of the application comprises one or more processors and a display. The one or more processors are configured to: intercept graphics API instructions of an application program; shunt the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI and the second type of graphics API instructions corresponds to an application scene; and preprocess the UI according to the first type of graphics API instructions, and/or preprocess the application scene according to the second type of graphics API instructions. The display is configured to display an interface formed by the processed UI and the application scene.
The non-volatile computer-readable storage medium of the embodiments of the application comprises a computer program that, when executed by a processor, causes the processor to perform the following display method: intercepting graphics API instructions of an application program; shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI and the second type of graphics API instructions corresponds to an application scene; preprocessing the UI according to the first type of graphics API instructions, and/or preprocessing the application scene according to the second type of graphics API instructions; and displaying an interface formed by the processed UI and the application scene.
In the presentation method, the electronic device, and the non-volatile computer-readable storage medium of the embodiments of the present application, the graphics API instructions are shunted and divided into a UI part and a scene part, so that the UI and the application scene can be processed independently.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a method of presenting an interface according to some embodiments of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 3 is a schematic diagram of a presentation method applied to an application according to some embodiments of the present application;
FIGS. 4-8 are flow charts of a presentation method according to certain embodiments of the present application;
FIG. 9 is a schematic diagram of a connection between a non-volatile computer readable storage medium and a processor according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1, an embodiment of the present application provides a method for displaying an interface, where the method includes:
01: intercepting a graphic API instruction of an application program;
02: shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to the application scene;
03: preprocessing a UI according to the first type of graphic API instruction, and/or preprocessing an application scene according to the second type of graphic API instruction; and
08: displaying an interface formed by the processed UI and the application scene.
Referring to fig. 2, the present embodiment further provides an electronic device 100, where the electronic device 100 includes one or more processors 10 and a display 30. The method for displaying the interface according to the embodiment of the present application can be applied to the electronic device 100 according to the embodiment of the present application. The one or more processors 10 can be used to execute the methods in 01, 02, and 03, and the display 30 is used to display the interface formed in 08. That is, the one or more processors 10 can be used to: intercept graphics API instructions of an application program; shunt the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI and the second type of graphics API instructions corresponds to the application scene; and preprocess the UI according to the first type of graphics API instructions, and/or preprocess the application scene according to the second type of graphics API instructions. The display 30 is used for displaying an interface formed by the processed UI and the application scene.
Referring to fig. 2, the electronic device 100 may be a terminal having an application installed therein, such as a mobile phone, a tablet computer, a smart watch, a smart helmet, and a computer. The processor 10 and the display 30 may be integrated in a terminal such as a mobile phone, a tablet computer, a smart watch, a smart helmet, and a computer. The present application is described only by taking the electronic device 100 as a mobile phone as an example, and the electronic device 100 is similar to a mobile phone when it is a terminal of another type, and will not be described in detail.
At present, the requirements on the UI and the application scene in an application program keep rising. Good UI design makes the application program more comfortable, simpler, and more user-friendly to use, while appropriate scene design can convey information quickly and accurately and highlight the theme, giving the user user-friendly operation within a vivid visual effect. However, the UI is often bound to information in the scene such as characters, maps, and buildings, so it is difficult to process the UI and the application scene independently. In the presentation method, the electronic device, and the non-volatile computer-readable storage medium of the present application, the one or more processors 10 shunt the graphics API instructions into a first type of graphics API instructions corresponding to the UI part and a second type of graphics API instructions corresponding to the scene part, and selectively preprocess the first type and the second type of graphics API instructions, so as to process the UI and the application scene independently.
Referring to fig. 3, a control layer is set in the calling phase of the graphics API, and the control layer always sits at the lowest layer of the services of the application program so that it can filter the instructions issued by the upper layers. Specifically, once the application is started and begins running, the application program is processed in real time during the running stage. In the Graphics Processing Unit (GPU) pipeline rendering process, the one or more processors 10 intercept the low-level graphics API calls issued by the application. The shunting of the graphics API serves as a preparatory stage for UI detection and is not used for the actual rendering of the interface picture; that is, the interface presentation method of the present application does not need to know the actual rendering process of the application. During the running stage of the application, the UI of the application is identified and, according to the function reflected by each called graphics API instruction, the instructions are divided into a first type of graphics API instructions (corresponding to the UI) and a second type of graphics API instructions (corresponding to the application scene), so that the processor 10 can conveniently preprocess the UI and/or the application scene independently.
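The interception and shunting performed by this control layer can be pictured with a short sketch. This is a minimal illustration only, not the implementation described in the patent: the GraphicsCall, ShuntedStreams, classify, and onIntercepted names are hypothetical, and the single boolean standing in for the classification heuristic (for example, treating a call that draws to an overlay with depth testing disabled as UI) is an assumption rather than the actual detection logic.

#include <functional>
#include <vector>

// Hypothetical record of one intercepted graphics API call.
struct GraphicsCall {
    enum class Kind { UI, Scene } kind;   // decided by the classifier below
    std::function<void()> replay;         // re-issues the original call later
};

// Streams produced by the shunting step: one for UI instructions, one for scene instructions.
struct ShuntedStreams {
    std::vector<GraphicsCall> uiCalls;    // first type: corresponds to the UI
    std::vector<GraphicsCall> sceneCalls; // second type: corresponds to the application scene
};

// Classify a call by the function it reflects. A real classifier would inspect the bound
// shaders, render target, blend and depth state, etc.; a single flag stands in here.
GraphicsCall::Kind classify(bool uiLike) {
    return uiLike ? GraphicsCall::Kind::UI : GraphicsCall::Kind::Scene;
}

// Control-layer entry point: every intercepted call is recorded and shunted instead of
// being forwarded straight to the driver, so the actual rendering path is never touched.
void onIntercepted(ShuntedStreams& out, bool uiLike, std::function<void()> replay) {
    GraphicsCall call{classify(uiLike), std::move(replay)};
    (call.kind == GraphicsCall::Kind::UI ? out.uiCalls : out.sceneCalls).push_back(std::move(call));
}

Recording a replay closure for each call keeps the original behaviour available, which matches the point above that the shunted stream prepares UI detection rather than rendering the interface picture itself.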
Referring to fig. 4, in some embodiments, the method 03: preprocessing the UI according to the first type of graphics API instruction, and/or preprocessing the application scene according to the second type of graphics API instruction, comprising:
031: performing any one of appearance beautification and function extension on the UI;
033: performing any one of real-time rendering and function extension on the application scene.
Referring to fig. 2, one or more processors 10 may be used to perform the methods in 031 and 033. That is, the one or more processors 10 may be used to perform any one of appearance beautification and function extension on the UI, and to perform any one of real-time rendering and function extension on the application scene.
In one embodiment, the preprocessing of the UI by the one or more processors 10 includes beautifying the appearance of the UI or extending its functions. Beautifying the appearance of the UI includes changing the color of UI key icons and display icons, adding decorations to the frames of the UI (such as adding shadows or rounded-corner styles to the frames), or adding a texture map to a UI control, all without affecting the picture displayed in the interface, so that the UI controls in the interface have more individuality and the user obtains a rich and vivid visual effect when using the application program.
Extending the functions of the UI includes adding a quick-setting function to a UI control in a certain scene. For example, a quick-screenshot UI control can be placed on the navigation interface of an entertainment application program, so that the screenshot interface does not have to be called for a screenshot and the user can quickly and conveniently share the desired interface without breaking immersion in the entertainment. Extending the functions of the UI may also include adding a fingerprint encryption function in some application scenarios, for example adding a fingerprint encryption setting in an application scenario involving privacy, to prevent privacy leakage when someone other than the owner uses that scenario. Extending the functions of the UI may also include adding special effects triggered by key combinations (such as a special-effect voice), for example a screen-crack effect shown in the interface when the user presses two buttons at the same time. Extending the functions of the UI may further include adding interaction with other applications; for example, by clicking a button of another application placed in the interface while watching a video, the other application can be opened and its interface displayed in part of the current interface, so that the user can selectively use the other application while continuing to watch the video. In short, the function extension of the UI can provide customized functions for actual application scenes and, combined with the Internet of Things, can also produce more personalized and differentiated products.
In another embodiment, the preprocessing of the application scene by the one or more processors 10 includes real-time rendering or function expansion. Real-time rendering of the application scene includes performing secondary rendering on data such as textures, lighting, and colors in the scene. For example, in a scene involving a large map, a filter effect, dynamic special effects, or image-quality enhancement can be added to the scene; the processor 10 re-enhances the original scene information so that the scene in the interface is more vivid, the aesthetic quality of the application scene is improved, and the rendered theme is reinforced. Function expansion of the application scene may include switching the background information of a certain scene. For example, in a game application program a battle scene may have rainy, sunny, daytime, and nighttime background information, and the user can selectively switch the background information of the battle scene, which makes an application program using the presentation method more user-friendly.
After the one or more processors 10 have shunted the graphics API instructions, the processors 10 independently preprocess the UI and/or the application scene. For example, the processor 10 may perform any one of appearance beautification and function extension on the UI while retaining the original functions of the application scene, thereby enhancing and extending the UI part and improving the appearance of the interface. As another example, the processor 10 may perform any one of real-time rendering and function extension on the application scene while keeping the original functions of the UI unchanged, thereby applying secondary rendering or a filter effect to the application scene without increasing the power consumption of the application program. As yet another example, the processor 10 may preprocess the UI and the application scene at the same time, for instance enhancing and extending the UI part and improving the appearance of the interface while also applying secondary rendering, a filter effect, and so on to the application scene, so that the application program achieves richer scene effects and functions.
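As a rough sketch of this selective preprocessing, the fragment below reuses the hypothetical GraphicsCall and ShuntedStreams types from the earlier sketch and wraps either stream with an optional extra pass; passing a null pointer for one side leaves that part untouched, matching the "and/or" in step 03. The uiDecoration and sceneFilter parameters are stand-ins for whatever beautification or filter passes an implementation actually provides.

#include <functional>

// Wrap every UI call so that a decoration pass runs right after it, and every scene call
// so that a filter pass runs after it; either wrapper may be omitted (nullptr).
void preprocess(ShuntedStreams& streams,
                const std::function<void()>* uiDecoration,   // e.g. tint, rounded corners
                const std::function<void()>* sceneFilter) {  // e.g. color filter, image-quality pass
    if (uiDecoration) {
        for (GraphicsCall& call : streams.uiCalls) {
            auto original = std::move(call.replay);
            call.replay = [original, pass = *uiDecoration] { original(); pass(); };
        }
    }
    if (sceneFilter) {
        for (GraphicsCall& call : streams.sceneCalls) {
            auto original = std::move(call.replay);
            call.replay = [original, pass = *sceneFilter] { original(); pass(); };
        }
    }
}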
Referring to fig. 5, the display method provided in the embodiment of the present application further includes:
07: and performing instruction rearrangement on the first type of graphics API instructions and the second type of graphics API instructions.
Referring to fig. 2, one or more processors 10 may be configured to perform the method of 07, i.e., one or more processors 10 may be configured to perform instruction reordering on a first type of graphics API instructions and a second type of graphics API instructions.
Specifically, the one or more processors 10 perform instruction rearrangement on the shunted first type of graphics API instructions and second type of graphics API instructions. Instruction rearrangement means that the processor 10 performs out-of-order execution optimization on the input instruction stream while observing data dependencies when reordering; that is, the processor 10 never changes the execution order of two operations that have a data dependency on each other, so the final result after instruction rearrangement is consistent with the result of sequential execution.
The one or more processors 10 rearrange the first type and the second type of graphics API instructions to optimize the algorithm and improve execution efficiency, and, combined with the use of a buffer, the processing performed after the graphics API instructions are shunted does not affect the real-time performance of the rendered image.
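A dependency-preserving rearrangement of this kind could look roughly like the sketch below, which sorts intercepted calls by a batching key while relying on a stable sort to keep the relative order of calls that share a key. It assumes that calls touching the same resource carry the same batchKey; the TaggedCall and reorder names are hypothetical and are not taken from the patent.

#include <algorithm>
#include <vector>

// Hypothetical reordering unit: equal batch keys mark calls that may depend on each other.
struct TaggedCall {
    int batchKey;       // e.g. pipeline-state or program id; equal keys share resources
    int originalIndex;  // position in the intercepted stream
};

// std::stable_sort never swaps two elements that compare equal, so two operations with a
// data dependency (same key under the assumption above) keep their original order and the
// rearranged stream produces the same final result as sequential execution.
void reorder(std::vector<TaggedCall>& calls) {
    std::stable_sort(calls.begin(), calls.end(),
                     [](const TaggedCall& a, const TaggedCall& b) { return a.batchKey < b.batchKey; });
}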
Referring to fig. 6, the display method provided in the embodiment of the present application further includes:
04: detecting the preprocessed UI;
08: displaying an interface formed by the processed UI and the application scene, comprising:
081: and when the UI is matched with the application scene, displaying an interface formed by the preprocessed UI and the application scene.
Referring to fig. 2, one or more processors 10 may be configured to perform the methods 04, 081, that is, one or more processors 10 may be configured to detect the preprocessed UI; the display 30 is used to: and when the UI and the application scene are matched, displaying an interface formed by the preprocessed UI and the application scene.
In one embodiment, the one or more processors 10 detect the UI after the preprocessing has been performed; specifically, the instructions corresponding to the preprocessed UI are filtered. For example, in a game application, a UI control related to settlement is added to the game settlement interface and is not presented in other game interfaces. The processor 10 determines the current application scene and sends only the UI instructions that match that scene to the GPU for rendering, which improves the execution efficiency of the GPU. When the interface in which the game application is running is the game settlement interface, the added settlement-related UI control is adapted to the application scene, and the display 30 displays the interface formed by the preprocessed UI (with the settlement-related UI control added) and the application scene.
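The detection step might be pictured as the following sketch, in which only the preprocessed UI additions tagged with the currently detected application scene are submitted for rendering and the rest are filtered out. The UiAddition and submitMatchingUi names are hypothetical, and representing a scene by a plain string (for example "settlement_screen") is purely for illustration.

#include <functional>
#include <string>
#include <vector>

// Hypothetical record of a preprocessed UI addition and the scene it belongs to.
struct UiAddition {
    std::string requiredScene;          // scene the addition is meant for
    std::function<void()> submitToGpu;  // issues the corresponding draw instructions
};

// Forward only the UI additions that match the current application scene to the GPU,
// so instructions that do not fit the scene are never rendered.
void submitMatchingUi(const std::vector<UiAddition>& additions, const std::string& currentScene) {
    for (const UiAddition& ui : additions) {
        if (ui.requiredScene == currentScene) {
            ui.submitToGpu();
        }
    }
}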
Referring to fig. 7, the display method provided in the embodiment of the present application further includes:
05: carrying out post-processing on the UI and the application scene after the pre-processing;
08: displaying an interface formed by the processed UI and the application scene, comprising:
083: and displaying an interface formed by the UI after the post-processing and the application scene after the post-processing.
Referring to fig. 2, one or more processors 10 may be configured to perform the methods of 05, 083, that is, one or more processors 10 may be configured to perform post-processing on the pre-processed UI and application scene; the display 30 is used to: and displaying an interface formed by the UI after the post-processing and the application scene after the post-processing.
In one embodiment, the processor 10 performs post-processing on the preprocessed UI and application scene, which includes: post-processing a preprocessed UI together with an application scene that has not been preprocessed; post-processing a UI that has not been preprocessed together with a preprocessed application scene; or post-processing a preprocessed UI together with a preprocessed application scene. The post-processing performed by the processor 10 includes integrating the preprocessed UI and the application scene, so as to prevent delay and frame dropping during GPU rendering; when the display 30 displays the interface formed by the post-processed UI and application scene, the preprocessed UI and the application scene are guaranteed to be displayed synchronously without affecting the user.
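One way to picture this integration is a small frame integrator that releases the UI part and the scene part only when both carry the same frame number, so neither can be presented a frame behind the other. This is an illustrative sketch under that assumption; the FramePart and FrameIntegrator names and the frame-number matching scheme are hypothetical rather than the mechanism used by the patent.

#include <cstdint>
#include <optional>

// Hypothetical handle to the commands produced for one half of a frame.
struct FramePart {
    std::uint64_t frameId;  // which frame these commands belong to
};

class FrameIntegrator {
public:
    void onUiReady(FramePart part)    { ui_ = part;    tryPresent(); }
    void onSceneReady(FramePart part) { scene_ = part; tryPresent(); }

private:
    void tryPresent() {
        // Present only when the UI and the scene of the same frame are both available,
        // so the preprocessed UI is always shown synchronously with its application scene.
        if (ui_ && scene_ && ui_->frameId == scene_->frameId) {
            // present(*ui_, *scene_);  // hand both parts to the display together (omitted)
            ui_.reset();
            scene_.reset();
        }
    }
    std::optional<FramePart> ui_;
    std::optional<FramePart> scene_;
};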
Referring to fig. 8, the display method provided in the embodiment of the present application further includes:
06: detecting the UI after post-processing;
08: displaying an interface formed by the processed UI and the application scene, comprising:
085: and when the UI is matched with the application scene, displaying an interface formed by the post-processed UI and the post-processed application scene.
Referring to fig. 2, the one or more processors 10 may be configured to perform the methods of 06 and 085, that is, the one or more processors 10 may be configured to detect the post-processed UI; the display 30 is used to: when the UI matches the application scene, display an interface formed by the post-processed UI and the post-processed application scene.
In one embodiment, the one or more processors 10 detect the UI after the post-processing has been performed and, similarly, filter the instructions corresponding to the post-processed UI. For example, in a game application the processor 10 preprocesses the UI by adding a settlement-related UI control to the game settlement interface, and this UI control does not appear in other game interfaces. After the processor 10 integrates the preprocessed UI and the application scene, the processor 10 determines the current application scene and sends only the UI instructions that match that scene to the GPU for rendering, which improves the execution efficiency of the GPU. When the interface in which the game application is running is the game settlement interface, the added settlement-related UI control is adapted to the application scene, and the display 30 displays the interface formed by the post-processed UI and the post-processed application scene.
In the presentation method, the electronic device, and the non-volatile computer-readable storage medium provided by the application, the processor 10 shunts the graphics API instructions into a first type of graphics API instructions corresponding to the UI part and a second type of graphics API instructions corresponding to the application scene part, without needing to know the actual rendering process of the application program. The processor 10 selectively preprocesses the first and second types of graphics API instructions and rearranges the shunted instructions, which improves the execution efficiency of the GPU; by using a buffer, the UI detection does not affect the real-time performance of the rendered image, and the user experience is improved.
Referring to fig. 9, the present application provides a non-volatile computer-readable storage medium 200 comprising a computer program 201. The computer program 201, when executed by the one or more processors 10, causes the one or more processors 10 and the display 30 to perform the presentation methods of 01, 02, 03, 031, 033, 04, 05, 06, 07, 081, 083, and 085.
For example, the computer program 201, when executed by the one or more processors 10, causes the processors 10 to perform the method of presenting:
01: intercepting a graphic API instruction of an application program;
02: shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to the application scene;
03: preprocessing a UI according to the first type of graphic API instruction, and/or preprocessing an application scene according to the second type of graphic API instruction; and
08: displaying an interface formed by the processed UI and the application scene.
As another example, the computer program 201, when executed by the one or more processors 10, causes the processors 10 to perform the presentation method as follows:
01: intercepting a graphic API instruction of an application program;
02: shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to the application scene;
031: performing any one of appearance beautification and function extension on the UI;
033: performing any one of real-time rendering and function extension on the application scene;
04: detecting the preprocessed UI;
07: performing instruction rearrangement on the first type of graphics API instructions and the second type of graphics API instructions;
081: and when the UI is matched with the application scene, displaying an interface formed by the preprocessed UI and the application scene.
As another example, the computer program 201, when executed by the one or more processors 10, causes the processors 10 to perform the method of presenting:
01: intercepting a graphic API instruction of an application program;
02: shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to the application scene;
031: performing any one of appearance beautification and function extension on the UI;
033: performing any one of real-time rendering and function extension on the application scene;
04: detecting the preprocessed UI;
05: carrying out post-processing on the UI and the application scene after the pre-processing;
07: performing instruction rearrangement on the first type of graphics API instructions and the second type of graphics API instructions;
083: and displaying an interface formed by the UI after the post-processing and the application scene after the post-processing.
Also for example, the computer program 201, when executed by the one or more processors 10, causes the processors 10 to perform the presentation method as follows:
01: intercepting a graphic API instruction of an application program;
02: shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to the application scene;
031: performing any one of appearance beautification and function extension on the UI;
033: performing any one of real-time rendering and function extension on the application scene;
05: carrying out post-processing on the UI and the application scene after the pre-processing;
06: detecting the UI after post-processing;
07: performing instruction rearrangement on the first type of graphics API instructions and the second type of graphics API instructions;
085: and when the UI is matched with the application scene, displaying an interface formed by the post-processed UI and the post-processed application scene.
In the description herein, references to the description of the terms "certain embodiments," "one example," "exemplary," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (13)

1. A method for displaying an interface is characterized by comprising the following steps:
intercepting a graphic API instruction of an application program;
shunting the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to an application scene;
preprocessing the UI according to the first type of graphic API instruction, and/or preprocessing the application scene according to the second type of graphic API instruction; and
displaying an interface formed by the processed UI and the application scene.
2. Presentation method according to claim 1,
preprocessing the UI comprises the following steps: performing any one of appearance beautification and function extension on the UI;
preprocessing the application scene comprises the following steps: and performing any one of real-time rendering and function extension on the application scene.
3. Presentation method according to claim 1, characterized in that the presentation method further comprises:
and performing instruction rearrangement on the first type of graphics API instructions and the second type of graphics API instructions.
4. Presentation method according to claim 1, characterized in that the presentation method further comprises: detecting the preprocessed UI;
the displaying an interface formed by the processed UI and the application scene comprises:
and when the UI is matched with the application scene, displaying an interface formed by the preprocessed UI and the application scene.
5. Presentation method according to claim 1, characterized in that the presentation method further comprises: post-processing the UI and the application scene;
the displaying an interface formed by the processed UI and the application scene comprises:
and displaying an interface formed by the UI after the post-processing and the application scene after the post-processing.
6. The presentation method as claimed in claim 5, wherein the presentation method further comprises: detecting the UI after post-processing;
the displaying the interface formed by the post-processed UI and the post-processed application scene includes:
and when the UI is matched with the application scene, displaying an interface formed by the post-processed UI and the post-processed application scene.
7. An electronic device, comprising:
one or more processors configured to: intercept graphics API instructions of an application program; shunt the graphics API instructions according to the function reflected by each instruction to obtain a first type of graphics API instructions and a second type of graphics API instructions, wherein the first type of graphics API instructions corresponds to the UI, and the second type of graphics API instructions corresponds to an application scene; and preprocess the UI according to the first type of graphics API instructions, and/or preprocess the application scene according to the second type of graphics API instructions; and
a display for displaying an interface formed by the processed UI and the application scene.
8. The electronic device of claim 7, wherein one or more of the processors are further configured to:
perform any one of appearance beautification and function extension on the UI; and
perform any one of real-time rendering and function extension on the application scene.
9. The electronic device of claim 7, wherein one or more of the processors are further configured to:
perform instruction rearrangement on the first type of graphics API instructions and the second type of graphics API instructions.
10. The electronic device according to claim 7, wherein the one or more processors are further configured to detect the preprocessed UI; the display is further configured to: when the UI matches the application scene, display an interface formed by the preprocessed UI and the application scene.
11. The electronic device of claim 7, wherein one or more of the processors are further configured to: post-process the UI and the application scene;
the display is further used for displaying an interface formed by the post-processed UI and the post-processed application scene.
12. The electronic device according to claim 7, wherein one or more of the processors are further configured to detect the UI after post-processing;
and the display is also used for displaying an interface formed by the post-processed UI and the post-processed application scene when the UI is matched with the application scene.
13. A non-volatile computer-readable storage medium having one or more computer programs stored thereon which, when executed by one or more processors, implement the presentation method of any one of claims 1 to 6.
CN202110193569.4A 2021-02-20 2021-02-20 Interface display method, electronic device and non-volatile computer readable storage medium Pending CN112817682A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110193569.4A CN112817682A (en) 2021-02-20 2021-02-20 Interface display method, electronic device and non-volatile computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110193569.4A CN112817682A (en) 2021-02-20 2021-02-20 Interface display method, electronic device and non-volatile computer readable storage medium

Publications (1)

Publication Number Publication Date
CN112817682A 2021-05-18

Family

ID=75864351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110193569.4A Pending CN112817682A (en) 2021-02-20 2021-02-20 Interface display method, electronic device and non-volatile computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112817682A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018000619A1 (en) * 2016-06-29 2018-01-04 乐视控股(北京)有限公司 Data display method, device, electronic device and virtual reality device
CN108568112A (en) * 2018-04-20 2018-09-25 网易(杭州)网络有限公司 A kind of generation method of scene of game, device and electronic equipment
CN111045586A (en) * 2018-10-12 2020-04-21 上海博泰悦臻电子设备制造有限公司 Interface switching method based on three-dimensional scene, vehicle-mounted equipment and vehicle
CN111294502A (en) * 2018-12-07 2020-06-16 中国移动通信集团终端有限公司 Photographing method, device with photographing function, equipment and storage medium
CN111803945A (en) * 2020-07-23 2020-10-23 网易(杭州)网络有限公司 Interface rendering method and device, electronic equipment and storage medium
CN111803940A (en) * 2020-01-14 2020-10-23 厦门雅基软件有限公司 Game processing method and device, electronic equipment and computer-readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination