CN109885373B - Rendering method and device of user interface - Google Patents

Rendering method and device of user interface

Info

Publication number
CN109885373B
CN109885373B (application CN201910146557.9A)
Authority
CN
China
Prior art keywords
display
screen
mode
notch
display area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910146557.9A
Other languages
Chinese (zh)
Other versions
CN109885373A (en)
Inventor
郝竹明
张硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910146557.9A priority Critical patent/CN109885373B/en
Publication of CN109885373A publication Critical patent/CN109885373A/en
Application granted granted Critical
Publication of CN109885373B publication Critical patent/CN109885373B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a method for rendering a user interface. Before the UI of a running object is displayed, the type of the display screen used to display the user interface is determined. If the display screen is a special-shaped screen, the screen display mode set for the running object is determined, and once that mode is identified as a target mode, the display area corresponding to the running object on the display screen is modified to conform to the target mode by calling the display edge interface corresponding to that mode. Because the shape of the modified display area already satisfies the display shape required by the target mode, the UI of the running object can be rendered at its original relative positions without being cut off or occluded. Since the display area itself is modified to match the screen display mode, the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes, which reduces the processing load of rendering the UI in such display areas.

Description

Rendering method and device of user interface
Technical Field
The present invention relates to the field of data processing, and in particular, to a method and an apparatus for rendering a user interface.
Background
With the development of display technology, the display screens of intelligent terminals have become increasingly diverse, and display screens with differently shaped display areas, such as special-shaped screens, have appeared. A special-shaped screen is a display screen whose display area has irregular edges. The notch screen now common on smartphones is one example: a camera, sensors and other components are placed at the top of the front face, so a small non-display area sits at the top of the screen while the regions on both sides of it remain display areas.
An intelligent terminal can run programs or applications that interact with a user through a User Interface (UI) shown on the display screen. Because the display area of a special-shaped screen is irregular, the UI must not be occluded by the non-display area, which would make interaction difficult or impossible.
The approach currently adopted is that, before a program or application displays its UI on the display screen, the rendering position of the UI relative to the display area is modified: every UI element is moved toward the center of the display screen before rendering, so that UI elements originally located at the edge of the screen are not occluded by the special-shaped screen.
However, this approach requires altering the rendering position of the UI relative to the display area every time the UI is presented, which increases the processing burden on the system.
Disclosure of Invention
To solve the above technical problem, embodiments of the present invention provide a method and an apparatus for rendering a user interface that no longer require the relative rendering positions of the UI to be adjusted for display areas of different shapes, thereby reducing the processing load of rendering a UI in such display areas.
The embodiment of the invention discloses the following technical scheme:
in a first aspect, an embodiment of the present invention provides a method for rendering a user interface, where the method includes:
determining the type of a display screen, wherein the display screen is used for displaying a user interface of a running object;
if the display screen is a special-shaped screen, determining a screen display mode for the running object;
if the screen display mode is a target mode, modifying the display area corresponding to the running object on the display screen to conform to the target mode by calling a display edge interface corresponding to the target mode;
rendering the user interface within the modified display area.
In a second aspect, an embodiment of the present invention provides an apparatus for rendering a user interface, where the apparatus includes a first determining unit, a second determining unit, a modifying unit, and a rendering unit:
the first determining unit is used for determining the type of a display screen, and the display screen is used for displaying a user interface of an operation object;
the second determining unit is configured to determine a screen display mode for the operation object if the first determining unit determines that the display screen is an irregular screen;
the modifying unit is configured to modify, if the second determining unit determines that the screen display mode is the target mode, a display area corresponding to the running object on the display screen to conform to the target mode by calling a display edge interface corresponding to the target mode;
the rendering unit is used for rendering the user interface in the modified display area.
In a third aspect, an embodiment of the present invention provides a rendering apparatus for a user interface, where the apparatus includes a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the rendering method of the user interface of the first aspect according to instructions in the program code.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium for storing program code, where the program code is used to execute the rendering method of the user interface according to the first aspect.
According to the technical scheme, before the UI of the running object is displayed, the type of the display screen used to display the user interface is determined. If the display screen is a special-shaped screen, the screen display mode for the running object is determined, and once the mode is identified as a target mode, the display area corresponding to the running object on the display screen is modified to conform to the target mode by calling the display edge interface corresponding to that mode. In other words, the shape of the modified display area satisfies the display shape required by the target mode, so the UI of the running object rendered at its original relative positions is neither cut off nor occluded. Because a display area matching the screen display mode is obtained by this modification, the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes when rendering, which reduces the processing load of rendering the UI in such display areas. Moreover, compared with a layout obtained by uniformly moving rendering positions toward the center of the display screen, the layout rendered in this way better matches the shape of the display area in the current screen display mode, improving the user experience.
Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Evidently, the drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is an illustration of a game interface representation resulting from rendering a user interface in a conventional manner;
fig. 2 is an exemplary diagram of an application scenario of a rendering method of a user interface according to an embodiment of the present invention;
fig. 3 is a flowchart of a rendering method of a user interface according to an embodiment of the present invention;
fig. 4a is an exemplary diagram of the display area in the show-notch mode according to an embodiment of the present invention;
fig. 4b is an exemplary diagram of the display area in the hide-notch mode according to an embodiment of the present invention;
fig. 5a is a game interface diagram obtained after rendering the UI in the show-notch mode according to an embodiment of the present invention;
fig. 5b is a game interface diagram obtained after rendering the UI in the hide-notch mode according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating an example of a period from startup to shutdown of a runtime object according to an embodiment of the present invention;
fig. 7 is a view layout of an intelligent terminal according to an embodiment of the present invention;
fig. 8 is a flowchart of a rendering method of a user interface according to an embodiment of the present invention;
FIG. 9a is a block diagram of a rendering apparatus of a user interface according to an embodiment of the present invention;
FIG. 9b is a block diagram of a rendering apparatus of a user interface according to an embodiment of the present invention;
FIG. 9c is a block diagram of a rendering apparatus of a user interface according to an embodiment of the present invention;
fig. 10 is a block diagram of a rendering apparatus for a user interface according to an embodiment of the present invention;
fig. 11 is a block diagram of a server according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described below with reference to the accompanying drawings.
In a conventional user interface rendering method, because a special-shaped screen usually turns part of the edge of the display screen into a non-display area, the rendering position of the UI relative to the display area has to be changed before every display, with all UI elements moved toward the center of the display screen. The resulting layout therefore rarely matches the shape of the display area in the current screen display mode. In addition, since there may be many UI elements on the display screen, moving their rendering positions means every UI element, especially every one near the screen edge, must be handled individually, which is tedious, error-prone, and increases the processing load of the system.
As shown in fig. 1, fig. 1 illustrates a game interface containing multiple UI elements, shown at 101, 102, …, 107. As can be seen from fig. 1, the display screen of the smartphone showing the game interface is a notch screen, and the notch is shown at 108. To keep the notch from occluding the UI in different screen display modes, the rendering positions of all UI elements are moved toward the center of the display screen, so that rendering near the screen edge is avoided as far as possible. Because every rendering position is uniformly moved toward the center, the display areas on both sides of the notch are never used for the UI even though they exist, and the rendered layout does not match the shape of the display area in the current screen display mode.
In addition, uniformly moving the rendering positions of the UI toward the center of the display screen is an upper-layer operation: every UI element has to be handled, which is cumbersome and easy to miss. Moreover, because the rendering positions are shifted, the position at which a UI element is displayed in the upper layer can deviate from the position the underlying layer associates with it. When the user operates such a UI element, the underlying layer may not respond because of this deviation, making it difficult for the user to interact with the running object through the UI.
To solve the foregoing technical problem, an embodiment of the present invention provides a method for rendering a user interface. Before the UI of a running object is displayed, if the display screen used to display the user interface is determined to be a special-shaped screen and the screen display mode for the running object is determined to be a target mode, the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes. Instead, the display edge interface corresponding to the target mode is called to modify the display area corresponding to the running object on the display screen so that it conforms to the target mode, and the UI of the running object rendered at its original relative positions is neither cut off nor occluded.
The rendering method of the user interface provided by the embodiment of the invention can be applied to an intelligent terminal, and the intelligent terminal can be an intelligent mobile phone, a tablet computer and the like. As shown in fig. 2, a program or an application may be run on the intelligent terminal 201, and the running object is the program or the application run on the intelligent terminal 201.
Taking a game as an example of the running object, before a UI of the running object needs to be displayed, the intelligent terminal 201 determines the type of the display screen used to display the UI. The UI is the medium through which the running object and the user interact, and the user interacts with the running object by operating the UI. The UI of the running object shown on the display screen may comprise several elements through which the user interacts in different ways; as shown in fig. 2, 101, 102, …, 107 are different UI elements, where the user may, for example, view the game map through 101 and control character movement through 102.
If the display screen of the intelligent terminal 201 is the special-shaped screen, the intelligent terminal further determines a screen display mode for the operation object, and after the screen display mode is determined to be the target mode, the display area corresponding to the operation object on the display screen is modified to conform to the target mode by calling the display edge interface corresponding to the target mode. One display area that is modified to conform to the target mode is shown at 202 in FIG. 2.
A special-shaped screen is a display screen whose display area has irregular edges. To place a camera, sensors and similar components on the front face of a smartphone, part of the edge of the display screen is usually set aside as a non-display area that holds those components. For example, the notch screen and the water-drop screen of a smartphone are both special-shaped screens. As another example, a non-display area may be set at the bottom of the smartphone and hold a sensor used for fingerprint unlocking and the like.
After the display area conforming to the target mode is obtained through this modification, the UI is rendered within the modified display area; the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes, which reduces the processing load of rendering the UI in such display areas. When the intelligent terminal 201 renders the UI on its display screen using the method provided by the embodiment of the invention, the resulting layout is shown at 203 in fig. 2. Compared with a layout obtained by uniformly moving rendering positions toward the center of the display screen, this layout better matches the shape of the display area in the current screen display mode and improves the user experience.
Next, a method for rendering a user interface according to an embodiment of the present invention will be described in detail with reference to the drawings.
Referring to fig. 3, fig. 3 shows a flow chart of a method of rendering a user interface, the method comprising:
s301, determining the type of the display screen.
The display screen is used for displaying a user interface of the running object.
A display screen may be of several types, for example a special-shaped screen or a regular screen. Because the display area of a special-shaped screen is irregular, the UI must not be occluded by the non-display area, which would make interaction difficult or impossible. Therefore, before rendering the UI, the type of the display screen is determined first, so that S302-S304 can be performed when the display screen is a special-shaped screen and the rendered UI is not occluded by the non-display area. A minimal detection sketch follows.
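As an illustration only, the following sketch shows one way the screen-type check of S301 might look on Android. Only the DisplayCutout query is part of the public Android API (API level 28 and above); the class name ScreenTypeDetector and the vendor system property in the fallback branch are assumptions, and real special-shaped screens may require other vendor-specific checks.

```java
import android.app.Activity;
import android.os.Build;
import android.view.DisplayCutout;
import android.view.WindowInsets;

public final class ScreenTypeDetector {

    /** Returns true if the display hosting this activity has a cutout (notch). */
    public static boolean isNotchScreen(Activity activity) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
            WindowInsets insets = activity.getWindow().getDecorView().getRootWindowInsets();
            if (insets != null) {
                DisplayCutout cutout = insets.getDisplayCutout();
                if (cutout != null) {
                    return true;
                }
            }
        }
        // Fallback for older vendor builds; the property name is hypothetical here.
        return "1".equals(getSystemProperty("ro.miui.notch"));
    }

    private static String getSystemProperty(String key) {
        try {
            Class<?> sysProps = Class.forName("android.os.SystemProperties");
            return (String) sysProps.getMethod("get", String.class).invoke(null, key);
        } catch (Exception e) {
            return "";
        }
    }
}
```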
S302, if the display screen is a special-shaped screen, determining a screen display mode for the running object.
When the display screen is a special-shaped screen, there may be several screen display modes, and the screen display mode for the running object can be changed in the settings of the intelligent terminal.
Taking a notch screen as an example of the special-shaped screen, the screen display modes may include a show-notch mode and a hide-notch mode, and the screen display mode for the running object can be switched between the two through the settings of the intelligent terminal.
S303, if the screen display mode is the target mode, modifying the corresponding display area of the running object on the display screen to accord with the target mode by calling a display edge interface corresponding to the target mode.
Each screen display mode has a display area that matches it. By modifying the display area to conform to the determined target mode, the UI of the running object can subsequently be rendered at its original relative positions without being cut off or occluded.
It is understood that special-shaped screens may include several kinds, such as notch screens and water-drop screens. In one implementation, if the special-shaped screen is a notch screen, its screen display modes include a show-notch mode and a hide-notch mode, and the target mode is either the show-notch mode or the hide-notch mode.
The display area corresponding to the running object on the display screen differs between target modes. Taking a notch screen as an example, if the target mode is the show-notch mode, the display area corresponding to the running object keeps the notch row. Referring to fig. 4a, the region in the dashed box is the notch row; it contains display areas (the white regions in the dashed box) and a non-display area (the black region in the dashed box). Keeping the notch row means its display areas can still be used to show the UI, so the display area in the show-notch mode is the white region in fig. 4a.
If the target mode is the hide-notch mode, the display area corresponding to the running object excludes the notch row. Referring to fig. 4b, the white region in 401 is the display area used in the show-notch mode; hiding the notch row means its display areas are no longer used to show the UI, so the display area in the hide-notch mode is the white region shown at 402 in fig. 4b.
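For comparison only: on stock Android (API level 28 and above), the public window attribute layoutInDisplayCutoutMode draws a similar distinction between letting content use the notch row and keeping content out of it. The sketch below is an analogue under that assumption, not the vendor display edge interface described in this embodiment.

```java
import android.app.Activity;
import android.os.Build;
import android.view.WindowManager;

public final class CutoutModeHelper {

    /** Let window content extend into the notch row (analogue of the show-notch display area). */
    public static void useNotchArea(Activity activity) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
            WindowManager.LayoutParams lp = activity.getWindow().getAttributes();
            lp.layoutInDisplayCutoutMode =
                    WindowManager.LayoutParams.LAYOUT_IN_DISPLAY_CUTOUT_MODE_SHORT_EDGES;
            activity.getWindow().setAttributes(lp);
        }
    }

    /** Keep window content out of the notch row (analogue of the hide-notch display area). */
    public static void avoidNotchArea(Activity activity) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
            WindowManager.LayoutParams lp = activity.getWindow().getAttributes();
            lp.layoutInDisplayCutoutMode =
                    WindowManager.LayoutParams.LAYOUT_IN_DISPLAY_CUTOUT_MODE_NEVER;
            activity.getWindow().setAttributes(lp);
        }
    }
}
```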
It should be noted that intelligent terminals come in many terminal types, and in a given target mode different terminal types expose different display edge interfaces; the display edge interface called in S303 should be the one corresponding to the terminal type of the intelligent terminal. Therefore, in a possible implementation, the method further includes determining the terminal type of the intelligent terminal configured with the display screen, and then determining the corresponding display edge interface according to that terminal type.
In general, intelligent terminals of different brands correspond to different terminal types, so terminals of different brands call different display edge interfaces to modify the display area. For example, when the intelligent terminal is of brand A and the target mode is the show-notch mode, the corresponding display edge interface is the one invoked by calling the addExtraFlags() function; when the target mode is the hide-notch mode, the corresponding display edge interface is the one invoked by calling the clearExtraFlags() function. If the terminal type is brand B, the display edge interface corresponding to brand B is called instead.
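A minimal sketch of how such a display edge interface might be reached: addExtraFlags() and clearExtraFlags() are named here only for brand-A terminals and are not part of the public Android SDK, so reflection on the window object is one plausible route. The numeric flag value is an assumption for illustration; on other terminal types the reflective lookup simply fails and the interface for that terminal type would be used instead.

```java
import android.view.Window;
import java.lang.reflect.Method;

public final class VendorNotchInterface {

    // The flag value is an assumption for illustration; the patent only names the
    // functions addExtraFlags()/clearExtraFlags() for brand-A terminals.
    private static final int FLAG_NOTCH_SUPPORT = 0x00000100;

    /** Show-notch mode: enlarge the display area so the regions beside the notch are usable. */
    public static void enterShowNotchMode(Window window) {
        invokeExtraFlags(window, "addExtraFlags");
    }

    /** Hide-notch mode: shrink the display area so the notch row is no longer used. */
    public static void enterHideNotchMode(Window window) {
        invokeExtraFlags(window, "clearExtraFlags");
    }

    private static void invokeExtraFlags(Window window, String methodName) {
        try {
            Method m = window.getClass().getMethod(methodName, int.class);
            m.invoke(window, FLAG_NOTCH_SUPPORT);
        } catch (Exception e) {
            // Not a brand-A build: the display edge interface of the detected
            // terminal type would be called here instead.
        }
    }
}
```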
It should be noted that the step of determining the terminal type of the intelligent terminal configured with the display screen may be performed before the step of determining the screen display mode for the operation object, may be performed after the step of determining the screen display mode for the operation object, and may also be performed simultaneously with the step of determining the screen display mode for the operation object, which is not limited in this embodiment.
S304, rendering the user interface in the modified display area.
In this embodiment, the UI of the running object can be rendered at its original relative positions. Compared with a layout obtained by uniformly moving rendering positions toward the center of the display screen, the layout rendered this way better matches the shape of the display area in the current screen display mode, improving the user experience.
As shown in fig. 5a, fig. 5a is the game interface obtained after rendering the UI in the show-notch mode. Compared with the game interface obtained in the conventional manner in fig. 1, the interface obtained with the user interface rendering method provided by the embodiment of the invention can render the UI elements shown at 101 and 102 in the display areas on both sides of the notch, instead of moving their rendering positions toward the center of the display screen as in the conventional manner, which leaves those areas unused; the UI is still kept clear of the notch itself. The layout rendered by the method provided by the embodiment of the invention therefore better matches the shape of the display area in the current screen display mode.
Correspondingly, the game interface obtained after rendering the UI in the hide-notch mode is shown in fig. 5b.
According to the technical scheme, before the UI of the running object is displayed, the type of the display screen used to display the user interface is determined. If the display screen is a special-shaped screen, the screen display mode for the running object is determined, and once the mode is identified as a target mode, the display area corresponding to the running object on the display screen is modified to conform to the target mode by calling the display edge interface corresponding to that mode. In other words, the shape of the modified display area satisfies the display shape required by the target mode, so the UI of the running object rendered at its original relative positions is neither cut off nor occluded. Because a display area matching the screen display mode is obtained by this modification, the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes when rendering, which reduces the processing load of rendering the UI in such display areas.
It should be noted that, in the embodiment of the present invention, modifying the display area through steps S301 to S304 and rendering the UI in the modified display area is performed only under certain conditions rather than arbitrarily; that is, the intelligent terminal performs steps S301 to S304 when the running object meets a preset condition.
It can be understood that, since the purpose of modifying the display area is to ensure that the rendered UI of the running object is not occluded by the non-display area of the special-shaped screen, steps S301 to S304 can be triggered whenever the UI of the running object needs to be rendered; in other words, the preset condition may be that the UI of the running object needs to be rendered.
Over the whole period from the start to the close of the running object, there are several occasions on which its UI needs to be rendered. When the running object is started, its UI must be rendered, and at this point steps S301 to S304 may be triggered so that the rendered UI is not occluded.
Referring to fig. 6, the whole period of the running object from start to close runs from launching its activity, where the onCreate() function is called when the activity starts, ……, to closing its activity, where the onDestroy() function is called when the activity is closed. When the intelligent terminal detects that the running object calls the onCreate() function, the running object is starting, and steps S301-S304 are triggered.
When the running object is switched from the system background of the intelligent terminal to which the display screen belongs back to the system foreground, its UI needs to be rendered again. For example, if the running object is a game and the user switches out of the game to the terminal settings and changes a setting (for example, switching the notch screen from hide-notch mode to show-notch mode), then switches back to the game, the UI of the game must be rendered again; at this point steps S301 to S304 may be triggered so that the rendered UI is not occluded.
When the operation object is switched to the system foreground from the system background of the intelligent terminal to which the display screen belongs, the function onWindowFocusChanged () is called, so that when the intelligent terminal detects that the operation object calls the function onWindowFocusChanged (), the steps of S301-S304 can be triggered and executed.
When the running object completes the hot update, the UI of the updated running object may be changed, and in order to avoid the rendered UI being blocked, the steps of S301 to S304 may be triggered to be executed.
Android attributes are added to the XML of the running object, and some of these attributes may change as a result of a hot update of the running object. When a Configuration attribute changes, the running object calls the onConfigurationChanged() function, so when the intelligent terminal detects that the running object calls onConfigurationChanged(), steps S301-S304 can be triggered.
Therefore, in a possible implementation, the preset condition may be that the running object is started, that the running object is switched from the system background of the intelligent terminal to which the display screen belongs to the system foreground, or that the running object has completed a hot update.
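Purely to illustrate where these three triggers sit in an Android activity, the sketch below wires them to a single hypothetical helper; the class and method names are assumptions, and the helper body stands in for steps S301-S304.

```java
import android.app.Activity;
import android.content.res.Configuration;
import android.os.Bundle;

// Hypothetical activity; applyDisplayAreaForCurrentMode() stands in for S301-S304.
public class GameActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Startup: the UI of the running object is about to be rendered for the first time.
        applyDisplayAreaForCurrentMode();
    }

    @Override
    public void onWindowFocusChanged(boolean hasFocus) {
        super.onWindowFocusChanged(hasFocus);
        if (hasFocus) {
            // Back to the foreground: the user may have changed the notch setting meanwhile.
            applyDisplayAreaForCurrentMode();
        }
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
        // A declared configuration change, for example after a hot update, was applied.
        applyDisplayAreaForCurrentMode();
    }

    private void applyDisplayAreaForCurrentMode() {
        // S301-S304: determine the screen type, read the screen display mode, call the
        // matching display edge interface, then let the UI render at its original positions.
    }
}
```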
It should be noted that the key of the method provided by this embodiment is to modify the display area, and next, how to modify the corresponding display area of the running object on the display screen to conform to the target mode in S303 will be described.
Because the root view corresponding to the running object defines the shape of the display area corresponding to the running object on the screen, and the purpose of modifying the display area is to give it the display shape required by the target mode, in a possible implementation the root view corresponding to the running object is modified by calling the display edge interface corresponding to the target mode, so that the display area corresponding to the running object on the display screen takes on the display shape of the target mode.
Taking a game as the running object, the view layout of the intelligent terminal is shown in fig. 7: the main activity (MainActivity) of the game contains the root view (RootView) of the game, and the RootView contains a Unity view (UnityPlayerView), Unity being the game engine and the UnityPlayerView the view associated with it. If, when calling the display edge interface corresponding to the target mode, only the UnityPlayerView were modified, the camera settings of the game UI would also have to be adjusted so that the logic layer (touch handling) still matched the rendered view, which introduces unnecessary complications. In this embodiment the root view corresponding to the running object is therefore modified by calling the display edge interface corresponding to the target mode, which completes the modification of the display area directly and keeps the operation simple.
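A sketch of this idea under the same assumptions as the earlier snippets: the target mode is applied once at the window level, so the game's RootView and the UnityPlayerView inside it are re-measured together and the camera and touch mapping need no separate adjustment. The helper names are hypothetical.

```java
import android.app.Activity;
import android.view.View;

// Hypothetical helper: the mode is applied once at the window level, so every child
// view of the content root (including the UnityPlayerView) inherits the modified area.
public final class RootViewModifier {

    public static void applyTargetMode(Activity activity, boolean showNotch) {
        if (showNotch) {
            VendorNotchInterface.enterShowNotchMode(activity.getWindow());
        } else {
            VendorNotchInterface.enterHideNotchMode(activity.getWindow());
        }
        // android.R.id.content is the parent of the game's RootView; one layout pass
        // re-measures the whole subtree, so the game camera and touch mapping stay aligned.
        View contentRoot = activity.findViewById(android.R.id.content);
        if (contentRoot != null) {
            contentRoot.requestLayout();
        }
    }
}
```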
Next, the rendering method of the user interface provided by the embodiment of the present invention is described with reference to a specific application scenario. In this scenario the running object is a game: when the game is started, or when the onWindowFocusChanged() function is called because the game is switched from the system background of the intelligent terminal to the system foreground, the UI of the game needs to be rendered, and the intelligent terminal is triggered to execute the rendering method of the user interface provided by the embodiment of the invention. The terminal type of the intelligent terminal is brand A.
In this case, the rendering method of the user interface is shown in fig. 8 and includes:
s801, starting the game.
The onCreate() function is called when the game's activity is launched; when the game calls the onCreate() function or the onWindowFocusChanged() function, execution of S802 is triggered.
S802, determining whether the type of the display screen is a notch screen; if so, executing S803, and if not, executing S807.
And S803, determining the terminal type of the intelligent terminal configured with the display screen.
S804, determining whether the screen display mode for the game is the show-notch mode; if so, executing S805, and if not, executing S806.
S805, calling the display edge interface through the addExtraFlags() function, and modifying the display area corresponding to the game on the display screen to conform to the show-notch mode.
S806, calling the display edge interface through the clearExtraFlags() function, and modifying the display area corresponding to the game on the display screen to conform to the hide-notch mode.
S807, entering the game.
……
And S808, quitting the game.
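The flow S801-S808 can be strung together in one place. The sketch below reuses the hypothetical helpers from the earlier snippets and is illustrative only; how the show-notch or hide-notch setting is read back from the terminal is device-specific and left as a stub.

```java
import android.app.Activity;
import android.os.Build;

// A compact illustration of the S801-S808 flow, reusing the hypothetical helpers
// from the earlier sketches.
public final class NotchFlow {

    public static void onGameStartedOrResumed(Activity game) {
        // S802: is the display screen a notch screen?
        if (!ScreenTypeDetector.isNotchScreen(game)) {
            return; // S807: enter the game with the display area unchanged.
        }
        // S803: the terminal type (e.g., Build.MANUFACTURER) decides which
        // display edge interface is available; the per-brand dispatch is elided here.
        String terminalType = Build.MANUFACTURER;

        // S804: read the screen display mode set for the game.
        boolean showNotch = readShowNotchSetting(game, terminalType);

        // S805 / S806: call the matching display edge interface and modify the display area.
        RootViewModifier.applyTargetMode(game, showNotch);
        // S807: enter the game; the UI then renders at its original relative positions.
    }

    private static boolean readShowNotchSetting(Activity game, String terminalType) {
        // Stub: how the mode is read depends on the terminal type and system settings.
        return true;
    }
}
```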
According to the technical scheme, before the UI of the running object is displayed, the type of the display screen used to display the user interface is determined. If the display screen is a special-shaped screen, the screen display mode for the running object is determined, and once the mode is identified as a target mode, the display area corresponding to the running object on the display screen is modified to conform to the target mode by calling the display edge interface corresponding to that mode. In other words, the shape of the modified display area satisfies the display shape required by the target mode, so the UI of the running object rendered at its original relative positions is neither cut off nor occluded. Because a display area matching the screen display mode is obtained by this modification, the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes when rendering, which reduces the processing load of rendering the UI in such display areas. Moreover, compared with a layout obtained by uniformly moving rendering positions toward the center of the display screen, the layout rendered in this way better matches the shape of the display area in the current screen display mode, improving the user experience.
Based on the rendering method of the user interface provided in the foregoing embodiment, this embodiment further provides a rendering apparatus of the user interface, referring to fig. 9a, the apparatus includes a first determining unit 901, a second determining unit 902, a modifying unit 903, and a rendering unit 904:
the first determining unit 901 is configured to determine a type of a display screen, where the display screen is used to display a user interface of an operating object;
the second determining unit 902 is configured to determine a screen display mode for the operation object if the first determining unit determines that the display screen is an irregular screen;
the modifying unit 903 is configured to modify, if the second determining unit determines that the screen display mode is the target mode, a display area corresponding to the running object on the display screen to conform to the target mode by calling a display edge interface corresponding to the target mode;
the rendering unit 904 is configured to render the user interface in the modified display area.
In one implementation, referring to fig. 9b, the apparatus further comprises a third determining unit 905:
the third determining unit 905 is configured to trigger the first determining unit 901 to execute the step of determining the type of the display screen if it is determined that the running object meets the preset condition.
In one implementation, the preset condition includes any one of:
when the running object is started;
when the running object is switched to a system foreground from a system background of the intelligent terminal to which the display screen belongs;
when the running object completes the hot update.
In an implementation manner, the modifying unit 903 is configured to modify the root view corresponding to the running object by calling a display edge interface corresponding to the target mode, and modify a display area of the running object on the display screen to conform to the target mode.
In one implementation, referring to fig. 9c, the apparatus further comprises a fourth determining unit 906 and a fifth determining unit 907:
the fourth determining unit 906 is configured to determine a terminal type of the intelligent terminal configured with the display screen;
the fifth determining unit 907 is configured to determine a corresponding display edge interface according to the terminal type.
Fig. 9c is only an exemplary structure of a rendering apparatus of a user interface, and the fourth determining unit 906 and the fifth determining unit 907 may be located in other positions, which is not limited in this embodiment.
In one implementation, if the special-shaped screen is a notch screen, the target mode includes a show-notch mode or a hide-notch mode.
In one implementation, if the target mode is the show-notch mode, the display area corresponding to the running object on the display screen is a display area that keeps the notch row;
and if the target mode is the hide-notch mode, the display area corresponding to the running object on the display screen is a display area that excludes the notch row.
According to the technical scheme, before the UI of the running object is displayed, the type of the display screen used to display the user interface is determined. If the display screen is a special-shaped screen, the screen display mode for the running object is determined, and once the mode is identified as a target mode, the display area corresponding to the running object on the display screen is modified to conform to the target mode by calling the display edge interface corresponding to that mode. In other words, the shape of the modified display area satisfies the display shape required by the target mode, so the UI of the running object rendered at its original relative positions is neither cut off nor occluded. Because a display area matching the screen display mode is obtained by this modification, the relative rendering positions of the UI no longer need to be adjusted for display areas of different shapes when rendering, which reduces the processing load of rendering the UI in such display areas. Moreover, compared with a layout obtained by uniformly moving rendering positions toward the center of the display screen, the layout rendered in this way better matches the shape of the display area in the current screen display mode, improving the user experience.
An embodiment of the present invention further provides a rendering device for a user interface, which is described below with reference to the drawings. Referring to fig. 10, an embodiment of the present invention provides a rendering device 1000 for a user interface. The device 1000 may also be a terminal device, and the terminal device may be any intelligent terminal, including a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Point of Sales (POS) terminal, or a vehicle-mounted computer; the following takes a mobile phone as an example:
fig. 10 is a block diagram showing a partial structure of a cellular phone related to a terminal device provided in an embodiment of the present invention. Referring to fig. 10, the cellular phone includes: radio Frequency (RF) circuit 1010, memory 1020, input unit 1030, display unit 1040, sensor 1050, audio circuit 1060, wireless fidelity (WiFi) module 1070, processor 1080, and power source 1090. Those skilled in the art will appreciate that the handset configuration shown in fig. 10 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 10:
RF circuit 1010 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, after receiving downlink information from a base station it passes that information to processor 1080 for processing, and it transmits uplink data to the base station. In general, RF circuit 1010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1010 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 1020 can be used for storing software programs and modules, and the processor 1080 executes various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 1020. The memory 1020 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1020 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 1030 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1030 may include a touch panel 1031 and other input devices 1032. The touch panel 1031, also referred to as a touch screen, may collect touch operations by a user (e.g., operations by a user on or near the touch panel 1031 using any suitable object or accessory such as a finger, a stylus, etc.) and drive corresponding connection devices according to a preset program. Alternatively, the touch panel 1031 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1080, and can receive and execute commands sent by the processor 1080. In addition, the touch panel 1031 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1030 may include other input devices 1032 in addition to the touch panel 1031. In particular, other input devices 1032 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, a joystick, or the like.
The display unit 1040 may be used to display information input by a user or information provided to the user and various menus of the cellular phone. The Display unit 1040 may include a Display panel 1041, and optionally, the Display panel 1041 may be configured in a form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 1031 can cover the display panel 1041, and when the touch panel 1031 detects a touch operation on or near the touch panel 1031, the touch operation is transmitted to the processor 1080 to determine the type of the touch event, and then the processor 1080 provides a corresponding visual output on the display panel 1041 according to the type of the touch event. Although in fig. 10, the touch panel 1031 and the display panel 1041 are two separate components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1031 and the display panel 1041 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1050, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1041 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1041 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuitry 1060, speaker 1061, microphone 1062 may provide an audio interface between the user and the handset. The audio circuit 1060 can transmit the electrical signal converted from the received audio data to the speaker 1061, and the electrical signal is converted into a sound signal by the speaker 1061 and output; on the other hand, the microphone 1062 converts the collected sound signal into an electrical signal, which is received by the audio circuit 1060 and converted into audio data, which is then processed by the audio data output processor 1080 and then sent to, for example, another cellular phone via the RF circuit 1010, or output to the memory 1020 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help the user to send and receive e-mail, browse web pages, access streaming media, etc. through the WiFi module 1070, which provides wireless broadband internet access for the user. Although fig. 10 shows the WiFi module 1070, it is understood that it does not belong to the essential constitution of the handset, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1080 is a control center of the mobile phone, connects various parts of the whole mobile phone by using various interfaces and lines, and executes various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1020 and calling data stored in the memory 1020, thereby integrally monitoring the mobile phone. Optionally, processor 1080 may include one or more processing units; preferably, the processor 1080 may integrate an application processor, which handles primarily the operating system, user interfaces, applications, etc., and a modem processor, which handles primarily the wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 1080.
The handset also includes a power source 1090 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 1080 via a power management system to manage charging, discharging, and power consumption via the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment, the processor 1080 included in the terminal device further has the following functions:
determining the type of a display screen, wherein the display screen is used for displaying a user interface of a running object;
if the display screen is the special-shaped screen, determining a screen display mode aiming at the operation object;
if the screen display mode is a target mode, modifying a display area corresponding to the running object on the display screen to be in accordance with the target mode by calling a display edge interface corresponding to the target mode;
rendering the user interface within the modified display area.
Referring to fig. 11, an embodiment of the present invention provides a server, which may vary considerably in configuration or performance and may include one or more Central Processing Units (CPUs) 1122 (e.g., one or more processors), a memory 1132, and one or more storage media 1130 (e.g., one or more mass storage devices) storing applications 1142 or data 1144. The memory 1132 and the storage media 1130 may provide transient or persistent storage. The program stored on a storage medium 1130 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Still further, the central processor 1122 may be configured to communicate with the storage medium 1130 and execute, on the server 1100, the series of instruction operations in the storage medium 1130.
The server 1100 may also include one or more power supplies 1126, one or more wired or wireless network interfaces 1150, one or more input-output interfaces 1158, and/or one or more operating systems 1141, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 11.
The terms "first," "second," "third," "fourth," and the like in the description of the embodiments of the invention and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It is to be understood that, in the present invention, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present invention may be implemented in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A method of rendering a user interface, the method comprising:
when a running object meets a preset condition, determining the type of a display screen, wherein the display screen is used for displaying a user interface of the running object, and the preset condition is any one of the following: the running object has completed starting and the user interface of the running object is to be rendered; the running object is switched from the system background of the intelligent terminal to which the display screen belongs to the system foreground and the user interface of the running object is to be rendered; and the running object has completed a hot update and the user interface of the running object is to be rendered;
if the display screen is a special-shaped screen, determining a screen display mode for the running object;
if the screen display mode is a target mode, modifying a root view corresponding to the running object by calling a display edge interface corresponding to the target mode, so as to modify the display area corresponding to the running object on the display screen to conform to the target mode, wherein the root view corresponding to the running object defines the shape of the display area corresponding to the running object on the screen, and the shape of the modified display area meets the display shape characteristics required by the target mode, so that the user interface of the running object is rendered according to its original relative positions; if the special-shaped screen is a bang screen, the target mode comprises a display bang mode or a hidden bang mode; when the target mode is the display bang mode, the corresponding display edge interface is the interface called through an addExtraFlags() function, and when the target mode is the hidden bang mode, the corresponding display edge interface is the interface called through a clearExtraFlags() function (an illustrative sketch of these interface calls is given after the claims);
rendering the user interface within the modified display area.
2. The method of claim 1, further comprising:
determining the terminal type of the intelligent terminal configured with the display screen;
and determining a corresponding display edge interface according to the terminal type.
3. The method according to claim 1, wherein if the target mode is the display bang mode, the display area corresponding to the running object on the display screen is a display area that reserves the bang portion;
and if the target mode is the hidden bang mode, the display area corresponding to the running object on the display screen is a display area in which the bang portion is hidden.
4. An apparatus for rendering a user interface, the apparatus comprising a first determining unit, a second determining unit, a modifying unit, and a rendering unit:
the first determining unit is configured to determine the type of a display screen when a running object meets a preset condition, wherein the display screen is used to display a user interface of the running object, and the preset condition is any one of the following: the running object has completed starting and the user interface of the running object is to be rendered; the running object is switched from the system background of the intelligent terminal to which the display screen belongs to the system foreground and the user interface of the running object is to be rendered; and the running object has completed a hot update and the user interface of the running object is to be rendered;
the second determining unit is configured to determine a screen display mode for the running object if the first determining unit determines that the display screen is a special-shaped screen;
the modifying unit is configured to, if the second determining unit determines that the screen display mode is a target mode, modify a root view corresponding to the running object by calling a display edge interface corresponding to the target mode, so as to modify the display area corresponding to the running object on the display screen to conform to the target mode, wherein the root view corresponding to the running object defines the shape of the display area corresponding to the running object on the screen, and the shape of the modified display area meets the display shape characteristics required by the target mode, so that the user interface of the running object is rendered according to its original relative positions; if the special-shaped screen is a bang screen, the target mode comprises a display bang mode or a hidden bang mode; when the target mode is the display bang mode, the corresponding display edge interface is the interface called through an addExtraFlags() function, and when the target mode is the hidden bang mode, the corresponding display edge interface is the interface called through a clearExtraFlags() function;
the rendering unit is used for rendering the user interface in the modified display area.
5. The apparatus of claim 4, further comprising a fourth determination unit and a fifth determination unit:
the fourth determining unit is used for determining the terminal type of the intelligent terminal configured with the display screen;
and the fifth determining unit is used for determining a corresponding display edge interface according to the terminal type.
6. The apparatus according to claim 4, wherein if the target mode is the display bang mode, the display area corresponding to the running object on the display screen is a display area that reserves the bang portion;
and if the target mode is the hidden bang mode, the display area corresponding to the running object on the display screen is a display area in which the bang portion is hidden.
7. A rendering device for a user interface, the device comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of rendering a user interface according to any one of claims 1 to 3 in accordance with instructions in the program code.
8. A computer-readable storage medium for storing program code, the program code being used for performing the method of rendering a user interface according to any one of claims 1 to 3.
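
Illustrative implementation note. The following is a minimal sketch, in Java for Android, of the call sequence recited in claims 1 and 2. The function names addExtraFlags() and clearExtraFlags() are taken from claim 1; the wrapper class name com.vendor.view.LayoutParamsEx, the flag value 0x00010000 and the reflective call path are assumptions made only for illustration, since the concrete display edge interface depends on the terminal type determined according to claim 2.

import android.app.Activity;
import android.view.Window;
import android.view.WindowManager;

public final class NotchDisplayHelper {

    // Assumed vendor-specific flag meaning "the display area may extend into the bang portion".
    private static final int FLAG_NOTCH_SUPPORT = 0x00010000;

    private NotchDisplayHelper() {}

    // Claim 1: modify the root view's layout attributes so that the display area conforms
    // to the target mode (display bang mode or hidden bang mode).
    public static void applyTargetMode(Activity activity, boolean showBang) {
        Window window = activity.getWindow();
        WindowManager.LayoutParams lp = window.getAttributes();
        try {
            // The vendor wrapper class below is hypothetical; real terminals expose different
            // classes, which is why claim 2 first determines the terminal type and then
            // selects the matching display edge interface.
            Class<?> paramsEx = Class.forName("com.vendor.view.LayoutParamsEx");
            Object ex = paramsEx.getConstructor(WindowManager.LayoutParams.class)
                                .newInstance(lp);
            String method = showBang ? "addExtraFlags" : "clearExtraFlags";
            paramsEx.getMethod(method, int.class).invoke(ex, FLAG_NOTCH_SUPPORT);
            window.setAttributes(lp);
        } catch (ReflectiveOperationException e) {
            // The assumed interface is absent: treat the screen as a regular screen and
            // keep the default display area unchanged.
        }
    }
}

Under these assumptions, a caller would invoke applyTargetMode(activity, true) for the display bang mode or applyTargetMode(activity, false) for the hidden bang mode, and only then render the user interface within the modified display area, as recited in the last step of claim 1.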
CN201910146557.9A 2019-02-27 2019-02-27 Rendering method and device of user interface Active CN109885373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910146557.9A CN109885373B (en) 2019-02-27 2019-02-27 Rendering method and device of user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910146557.9A CN109885373B (en) 2019-02-27 2019-02-27 Rendering method and device of user interface

Publications (2)

Publication Number Publication Date
CN109885373A CN109885373A (en) 2019-06-14
CN109885373B (en) 2021-11-23

Family

ID=66929663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910146557.9A Active CN109885373B (en) 2019-02-27 2019-02-27 Rendering method and device of user interface

Country Status (1)

Country Link
CN (1) CN109885373B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112748894A (en) * 2019-10-30 2021-05-04 北京小米移动软件有限公司 Hole digging screen display method and device
CN111459227A (en) * 2020-03-31 2020-07-28 联想(北京)有限公司 Electronic device and display control method thereof
CN114168031B (en) * 2022-02-11 2023-03-31 荣耀终端有限公司 Display optimization method and device for hole digging screen and storage medium
CN115328592B (en) * 2022-07-08 2023-12-29 华为技术有限公司 Display method and related device
CN115361468B (en) * 2022-10-21 2023-02-28 荣耀终端有限公司 Display optimization method and device during screen rotation and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182043A (en) * 2018-01-19 2018-06-19 维沃移动通信有限公司 A kind of method for information display and mobile terminal
CN108536498A (en) * 2017-12-29 2018-09-14 广东欧珀移动通信有限公司 Electronic device, the control method of chat interface and Related product
CN109032445A (en) * 2018-07-16 2018-12-18 维沃移动通信有限公司 A kind of control method for screen display and terminal device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9013510B2 (en) * 2011-07-29 2015-04-21 Google Inc. Systems and methods for rendering user interface elements in accordance with a device type
CN106250080A (en) * 2016-07-29 2016-12-21 腾讯科技(深圳)有限公司 Method for displaying image and device
CN108536366A (en) * 2018-03-28 2018-09-14 维沃移动通信有限公司 A kind of application window method of adjustment and terminal
CN108519848A (en) * 2018-03-30 2018-09-11 联想(北京)有限公司 A kind of display control method and electronic equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108536498A (en) * 2017-12-29 2018-09-14 广东欧珀移动通信有限公司 Electronic device, the control method of chat interface and Related product
CN108182043A (en) * 2018-01-19 2018-06-19 维沃移动通信有限公司 A kind of method for information display and mobile terminal
CN109032445A (en) * 2018-07-16 2018-12-18 维沃移动通信有限公司 A kind of control method for screen display and terminal device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Android compatibility solution for the Huawei phone bang (Liuhai) screen; 奥特曼超人Dujinyang; https://blog.csdn.net/djy1992/article/details/80683575; 2018-06-13; pages 1-10 *

Also Published As

Publication number Publication date
CN109885373A (en) 2019-06-14

Similar Documents

Publication Publication Date Title
CN109885373B (en) Rendering method and device of user interface
CN106851010B (en) Interference processing method and terminal for target application
WO2015172704A1 (en) To-be-shared interface processing method, and terminal
CN109871164B (en) Message sending method and terminal equipment
CN109947327B (en) Interface viewing method, wearable device and computer-readable storage medium
CN107066268B (en) Display position switching method and device for widget application
CN109407948B (en) Interface display method and mobile terminal
CN108958593B (en) Method for determining communication object and mobile terminal
CN111092990A (en) Application program sharing method and electronic equipment
CN107967153B (en) Application program management method and mobile terminal
WO2020181956A1 (en) Method for displaying application identifier, and terminal apparatus
CN111127595A (en) Image processing method and electronic device
CN107357651B (en) Application acceleration method and device and terminal
CN110898424B (en) Display control method and electronic equipment
CN110167006B (en) Method for controlling application program to use SIM card and terminal equipment
CN109933267B (en) Method for controlling terminal equipment and terminal equipment
CN109126127B (en) Game control method, dual-screen mobile terminal and computer-readable storage medium
CN110769303A (en) Playing control method and device and mobile terminal
CN112691367B (en) Data processing method and related device
CN107193551B (en) Method and device for generating image frame
CN111399715B (en) Interface display method and electronic equipment
CN110908757B (en) Method and related device for displaying media content
CN111176529B (en) Key display method and electronic equipment
CN110083205B (en) Page switching method, wearable device and computer-readable storage medium
CN110865743A (en) Task management method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant