CN112328130B - Display processing method and electronic equipment - Google Patents


Info

Publication number
CN112328130B
CN112328130B (application CN202010990210.5A)
Authority
CN
China
Prior art keywords
window
target
area
windows
drawing instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010990210.5A
Other languages
Chinese (zh)
Other versions
CN112328130A (en)
Inventor
姚鑫
李杰纯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202010990210.5A priority Critical patent/CN112328130B/en
Publication of CN112328130A publication Critical patent/CN112328130A/en
Application granted granted Critical
Publication of CN112328130B publication Critical patent/CN112328130B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces


Abstract

The application relates to the technical field of terminal devices and provides a display processing method and an electronic device. The display processing method includes: dividing the display area of the electronic device into blocks to obtain a plurality of block areas; detecting that the electronic device displays a plurality of windows, and acquiring attribute information of the plurality of windows; determining the stacking relation of the plurality of windows according to the attribute information; if it is determined from the stacking relation that the plurality of windows are displayed with occlusion, obtaining the drawing instruction of each window before the window is submitted for rendering; determining the completely-shielded windows and the non-completely-shielded windows among the plurality of windows according to the stacking relation, and discarding the drawing instructions of the completely-shielded windows; matching a corresponding drawing instruction for each block area to obtain the processed drawing instruction of each non-completely-shielded window; and displaying according to the processed drawing instruction of each non-completely-shielded window. According to the embodiments of the application, the drawing instructions are simplified and the display workload is reduced.

Description

Display processing method and electronic equipment
Technical Field
The present application relates to the field of terminal device technologies, and in particular, to a display processing method and an electronic device.
Background
With the rapid development of multi-device cooperation and distributed scenarios, displaying several windows simultaneously has become a common trend on terminal devices. When multiple windows are displayed at the same time, current graphics display systems, such as the Android graphics display system, render each window separately and then composite the windows pixel by pixel. When the display areas of the windows partially overlap, each window is rendered in an independent thread, and no thread knows whether its window is occluded by other windows; as a result, every window is rendered in full, and overlapping and/or clipping is applied only at display time. Rendering the complete interface of every window simultaneously increases the load and wastes resources.
Disclosure of Invention
The embodiments of the present application provide a display processing method and an electronic device, which can mitigate the technical problem that simultaneously rendering the complete interface of every window increases the load and wastes system resources.
In a first aspect, an embodiment of the present application provides a display processing method, which is applied to an electronic device, and the display processing method includes:
dividing the display area of the electronic device into blocks to obtain a plurality of block areas;
detecting that the electronic device displays a plurality of windows, and acquiring attribute information of the plurality of windows;
determining the stacking relation of the plurality of windows according to the attribute information of the plurality of windows;
if it is determined, according to the stacking relation of the plurality of windows, that the plurality of windows are displayed with occlusion, obtaining the drawing instruction of each window before the window is submitted for rendering;
determining the completely-shielded windows and the non-completely-shielded windows among the plurality of windows according to the stacking relation of the plurality of windows, and discarding the drawing instructions of the completely-shielded windows;
matching a corresponding drawing instruction for each block area to obtain the processed drawing instruction of each non-completely-shielded window;
and displaying according to the processed drawing instruction of each non-completely-shielded window.
The embodiment of the first aspect applies to scenes in which multiple windows are displayed simultaneously. On the one hand, simplifying the drawing instructions reduces the workload of graphics display and therefore the load on the system, lowering power consumption and improving performance; on the other hand, because a corresponding drawing instruction is matched for each block area separately, the method can be applied to electronic devices with different computing power.
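The culling step at the heart of the first aspect can be sketched in a few lines of Python. This is an illustrative model only, not the claimed implementation: the `Window` record, its field names, and the single-occluder test are assumptions made for the example. Windows are modeled as axis-aligned rectangles, and a smaller Z value means the window is displayed nearer the top, matching the Z-order convention defined later in this description.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Window:
    name: str
    x: int; y: int; w: int; h: int    # position and size, in pixels
    z: int                            # smaller z is displayed nearer the top
    transparency: float = 0.0         # 0 means non-transparent

    def rect(self):
        return (self.x, self.y, self.x + self.w, self.y + self.h)

def covers(top, bottom):
    """True if non-transparent window `top` completely covers `bottom`."""
    tx0, ty0, tx1, ty1 = top.rect()
    bx0, by0, bx1, by1 = bottom.rect()
    return (top.transparency == 0.0 and
            tx0 <= bx0 and ty0 <= by0 and tx1 >= bx1 and ty1 >= by1)

def cull_fully_occluded(windows):
    """Split windows into (completely shielded, not completely shielded).
    For simplicity, only a single covering window is tested; occlusion by
    a union of several upper windows is omitted in this sketch."""
    shielded, visible = [], []
    for w in windows:
        above = [o for o in windows if o.z < w.z]
        if any(covers(o, w) for o in above):
            shielded.append(w)
        else:
            visible.append(w)
    return shielded, visible
```

The drawing instructions of the windows in the first list would simply be discarded before rendering is submitted.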
In a possible implementation manner of the first aspect, dividing the display area of the electronic device into blocks to obtain a plurality of block areas includes:
dividing the display area of the display screen into blocks according to one or more of the resolution of the display screen of the electronic device, the size of the display screen, the total number of displayed windows, and the computing power of the electronic device, to obtain a plurality of block areas.
In this implementation manner, the display screen of the electronic device can be divided into blocks based on different parameters, so the method suits different application scenarios, which improves its generality.
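A block division driven by such parameters might look as follows. The 2x2 starting grid, the resolution and window-count thresholds, and the `compute_power` scale are all invented for illustration; the application does not specify concrete values.

```python
def divide_into_blocks(width, height, num_windows=1, compute_power=1.0):
    """Divide a width x height display area into a grid of block areas.

    The grid starts at 2x2 and grows (up to 8x8) as the resolution,
    window count, or available computing power increases -- an
    illustrative heuristic only.
    """
    grid = 2
    if width * height > 1920 * 1080 or num_windows > 4:
        grid = 4
    if compute_power > 2.0:
        grid = min(grid * 2, 8)
    bw, bh = width // grid, height // grid
    blocks = []
    for row in range(grid):
        for col in range(grid):
            # each block is (x0, y0, x1, y1); the last row/column absorbs
            # any remainder so the blocks exactly tile the display area
            x1 = width if col == grid - 1 else (col + 1) * bw
            y1 = height if row == grid - 1 else (row + 1) * bh
            blocks.append((col * bw, row * bh, x1, y1))
    return blocks
```

A lower-powered device would choose a coarser grid (fewer, larger block areas), trading finer-grained instruction culling for fewer computation tasks.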
In one possible implementation manner of the first aspect, the attribute information includes one or more of a position, a size, a transparency, and a Z-axis order of the window.
In a possible implementation manner of the first aspect, matching a corresponding drawing instruction for each block area to obtain the processed drawing instruction of each non-completely-shielded window includes:
starting the computation task of each block area in turn, where each computation task matches a corresponding drawing instruction for its block area according to the stacking relation of the non-completely-shielded windows in that block area, to obtain the processed drawing instruction of each non-completely-shielded window; or, alternatively,
starting a plurality of computation tasks at the same time, where each computation task matches a corresponding drawing instruction for one block area according to the stacking relation of the non-completely-shielded windows in that block area, until every block area has been matched with its corresponding drawing instruction, to obtain the processed drawing instruction of each non-completely-shielded window.
In this implementation manner, corresponding drawing instructions can be matched for all the block areas either by starting the computation tasks of the block areas in turn or by starting them simultaneously, so the method can be applied to electronic devices with different computing power. When a plurality of computation tasks are started simultaneously, data processing efficiency improves greatly and the time consumed by display is reduced.
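The two task-launching strategies can be contrasted in a short sketch. `match_block` below is a hypothetical stand-in that merely records which windows intersect each block area, since the real matching logic depends on the drawing instructions themselves.

```python
from collections import namedtuple
from concurrent.futures import ThreadPoolExecutor

Win = namedtuple("Win", "name x y w h")   # illustrative window record

def match_block(block, windows):
    """Stand-in for matching drawing instructions inside one block area:
    it simply lists the windows whose display area intersects the block."""
    x0, y0, x1, y1 = block
    return [win.name for win in windows
            if win.x < x1 and win.x + win.w > x0
            and win.y < y1 and win.y + win.h > y0]

def match_sequential(blocks, windows):
    # start the computation task of each block area in turn
    return [match_block(b, windows) for b in blocks]

def match_parallel(blocks, windows, workers=4):
    # start several computation tasks at the same time, one per block area
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda b: match_block(b, windows), blocks))
```

`ThreadPoolExecutor` is just one convenient way to start several computation tasks at once in Python; on a real device the tasks would more plausibly be jobs scheduled on render threads. Both strategies produce the same result, only the degree of concurrency differs.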
In a possible implementation manner of the first aspect, each computation task matching a corresponding drawing instruction for one block area according to the stacking relation of the non-completely-shielded windows in the block area includes:
determining the target windows that fall into the same block area, and acquiring the target drawing instruction of each target window in the block area, where the target windows are the non-completely-shielded windows falling into the same block area, and a target drawing instruction is a drawing instruction of any one target window in the block area;
and determining whether to reserve, delete, or modify the target drawing instruction of each target window in the block area according to the stacking relation of the target windows in the block area.
This implementation manner provides a convenient way to simplify the drawing instructions, making the method easy to implement.
In a possible implementation manner of the first aspect, determining whether to reserve, delete, or modify the target drawing instruction of each target window in the block area according to the stacking relation of the target windows in the block area includes:
if the area of any target window that falls into the block area belongs completely to the shielding area of that target window, deleting the target drawing instruction of the target window in the block area;
if the area of any target window that falls into the block area belongs completely to the non-shielding area of that target window, reserving the target drawing instruction of the target window in the block area;
and if part of the area of any target window that falls into the block area belongs to the shielding area of that target window and part belongs to its non-shielding area, modifying the target drawing instruction of the target window in the block area.
This implementation manner provides a convenient way to obtain the processed drawing instruction, making the method easy to implement.
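The three cases can be expressed as a small decision function. The rectangle representation and the single-rectangle model of the shielding area are simplifications assumed for this example; a real shielding area may be a union of regions.

```python
def intersect(a, b):
    """Intersection of two (x0, y0, x1, y1) rectangles, or None."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

def classify_in_block(window_rect, shielded_rect, block):
    """Decide what to do with a target window's drawing instruction in one
    block area, following the three cases in the text.  `shielded_rect`
    is the window's shielding area modeled as a single rectangle (or None
    when the window is not shielded at all)."""
    part = intersect(window_rect, block)
    if part is None:
        return "not in block"
    occ = intersect(part, shielded_rect) if shielded_rect else None
    if occ == part:
        return "delete"        # the part in the block is fully shielded
    if occ is None:
        return "reserve"       # the part in the block is fully visible
    return "modify"            # partly shielded, partly visible
```

The "modify" case is where the later implementation manners refine the decision per drawing instruction.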
In a possible implementation manner of the first aspect, determining whether to reserve, delete, or modify the target drawing instruction of each target window in the block area according to the stacking relation of the target windows in the block area includes:
if any target drawing instruction of any target window falls completely into the block area, determining whether to reserve, delete, or modify the target drawing instruction according to the stacking relation of the target window in the block area;
and if any target drawing instruction of any target window falls into a plurality of block areas including the block area, taking the plurality of block areas into which it falls as a first area, and determining whether to reserve, delete, or modify the target drawing instruction according to the stacking relation of the target window in the first area.
This implementation manner provides a convenient way to obtain the processed drawing instruction, making the method easy to implement.
In a possible implementation manner of the first aspect, if any target drawing instruction of any target window falls completely into the block area, determining whether to reserve, delete, or modify the target drawing instruction according to the stacking relation of the target window in the block area includes:
if the target drawing instruction falls completely into the block area and the target window is at the uppermost layer of the block area, reserving the target drawing instruction;
if the target drawing instruction falls completely into the block area, the target window is not at the uppermost layer of the block area, and the other target windows above the target window in the block area are all transparent windows, reserving the target drawing instruction;
if the target drawing instruction falls completely into the block area, the target window is not at the uppermost layer of the block area, another target window above the target window in the block area is a non-transparent window, and the target drawing instruction is completely shielded, deleting the target drawing instruction;
if the target drawing instruction falls completely into the block area, the target window is not at the uppermost layer of the block area, another target window above the target window in the block area is a non-transparent window, and the target drawing instruction is regularly shielded, modifying the display area of the target drawing instruction;
and if the target drawing instruction falls completely into the block area, the target window is not at the uppermost layer of the block area, another target window above the target window in the block area is a non-transparent window, and the target drawing instruction is irregularly shielded, finding the minimum modifiable area of the non-shielded area of the target drawing instruction in the block area; if the minimum modifiable area is found, modifying the display area of the target drawing instruction to the minimum modifiable area, and if it is not found, reserving the target drawing instruction.
This implementation manner provides a convenient way to obtain the processed drawing instruction, making the method easy to implement.
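These rules can be sketched as follows, with occlusion restricted to a single non-transparent rectangle and "regular shielding" modeled as an occluder that cuts cleanly across one axis of the instruction's display area. Both restrictions are assumptions of this example, not of the application.

```python
def intersect(a, b):
    """Intersection of two (x0, y0, x1, y1) rectangles, or None."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

def subtract_visible(instr, occ):
    """Visible remainder of `instr` after removing opaque rectangle `occ`.
    Returns (rect_or_None, regular), where `regular` is True when the
    remainder is itself a single rectangle (regular shielding)."""
    ix0, iy0, ix1, iy1 = instr
    o = intersect(instr, occ)
    if o is None:
        return instr, True
    if o == instr:
        return None, True                       # completely shielded
    ox0, oy0, ox1, oy1 = o
    if ox0 <= ix0 and ox1 >= ix1:               # occluder spans full width
        if oy0 <= iy0:
            return (ix0, oy1, ix1, iy1), True   # keep bottom strip
        if oy1 >= iy1:
            return (ix0, iy0, ix1, oy0), True   # keep top strip
    if oy0 <= iy0 and oy1 >= iy1:               # occluder spans full height
        if ox0 <= ix0:
            return (ox1, iy0, ix1, iy1), True   # keep right strip
        if ox1 >= ix1:
            return (ix0, iy0, ox0, iy1), True   # keep left strip
    # irregular shielding: here the bounding box of the visible area is the
    # whole instruction rectangle, so no smaller modifiable area exists
    return instr, False

def decide_instruction(instr, upper_windows):
    """Decide the fate of one target drawing instruction.  `upper_windows`
    is a list of (rect, transparency) pairs for target windows above this
    one; only the first overlapping non-transparent window is considered."""
    opaque = [r for r, t in upper_windows
              if t == 0.0 and intersect(instr, r) is not None]
    if not opaque:
        return ("reserve", instr)      # uppermost, or only transparent above
    visible, regular = subtract_visible(instr, opaque[0])
    if visible is None:
        return ("delete", None)        # completely shielded
    if regular:
        return ("modify", visible)     # regularly shielded
    return ("reserve", instr)          # irregular, no smaller rectangle found
```

An occluder that crosses the middle of the instruction (leaving two disjoint strips) falls through to the irregular case in this sketch, which again is a simplification.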
In a possible implementation manner of the first aspect, determining whether to reserve, delete, or modify the target drawing instruction according to the stacking relation of the target window in the first area includes:
if the target window is at the uppermost layer of the first area, reserving the target drawing instruction;
if the target window is not at the uppermost layer of the first area and the other target windows above it in the first area are all transparent windows, reserving the target drawing instruction;
if the target window is not at the uppermost layer of the first area, another target window above it in the first area is a non-transparent window, and the target drawing instruction is completely shielded, deleting the target drawing instruction;
if the target window is not at the uppermost layer of the first area, another target window above it in the first area is a non-transparent window, and the target drawing instruction is regularly shielded, modifying the display area of the target drawing instruction;
and if the target window is not at the uppermost layer of the first area, another target window above it in the first area is a non-transparent window, and the target drawing instruction is irregularly shielded, finding the minimum modifiable area of the non-shielded area of the target drawing instruction in the first area; if the minimum modifiable area is found, modifying the display area of the target drawing instruction to the minimum modifiable area, and if it is not found, reserving the target drawing instruction.
This implementation manner provides a convenient way to obtain the processed drawing instruction, making the method easy to implement.
In one possible implementation form of the first aspect, the minimum modifiable area comprises a minimum modifiable rectangular area.
In a possible implementation manner of the first aspect, after determining the stacking relation of the plurality of windows according to the attribute information of the plurality of windows, the method further includes:
if it is determined, according to the stacking relation of the plurality of windows, that the plurality of windows are not displayed with occlusion, displaying according to the drawing instructions of the plurality of windows.
In a second aspect, there is provided a display processing apparatus configured to an electronic device, the display processing apparatus corresponding to the display processing method provided in the first aspect, the display processing apparatus including:
the dividing module is used for dividing the display area of the electronic equipment into blocks to obtain a plurality of block areas;
the detection module is used for detecting that the electronic equipment displays a plurality of windows and acquiring attribute information of the windows;
the determining module is used for determining the stacking relation of the windows according to the attribute information of the windows;
the obtaining module is used for obtaining the drawing instruction of each window before the window is submitted for rendering if it is determined, according to the stacking relation of the plurality of windows, that the plurality of windows are displayed with occlusion;
the deleting module is used for determining the completely-shielded windows and the non-completely-shielded windows among the plurality of windows according to the stacking relation of the plurality of windows, and discarding the drawing instructions of the completely-shielded windows;
the matching module is used for matching a corresponding drawing instruction for each blocking area to obtain a processed drawing instruction of each non-completely-shielded window;
and the display module is used for displaying according to the processed drawing instruction of each non-completely-occluded window.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program, so that the electronic device implements the display processing method according to any one of the first aspect and possible implementation manners of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the display processing method according to any one of the first aspect and possible implementation manners of the first aspect is implemented.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the display processing method described in any one of the foregoing first aspect and possible implementation manners of the first aspect.
It will be appreciated that, for the advantageous effects of the second to fifth aspects, reference may be made to the description of the first aspect above.
Drawings
FIG. 1A is a schematic view of a scene displayed simultaneously in multiple windows according to an embodiment of the present application;
FIG. 1B is a schematic diagram of an occlusion region of a window A in the scene shown in FIG. 1A in the present application;
FIG. 1C is a schematic diagram of a non-occluded area of a window A in the scene shown in FIG. 1A in the present application;
FIG. 2 is a schematic view of another scenario of multi-window simultaneous display according to an embodiment of the present application;
fig. 3 is a flowchart of a display processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 5 is a block diagram of a software structure of an electronic device according to an embodiment of the present application;
FIG. 6 is a first application scenario with multiple windows simultaneously displayed according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a display processing method according to an embodiment of the present application;
FIG. 8 is a block division diagram of a display area according to an embodiment of the present application;
FIG. 9A is a block diagram of a display area provided in an embodiment of the present application;
FIG. 9B is a block diagram of a display area provided in an embodiment of the present application;
FIG. 10 is a schematic diagram of an overlay display of a window C and a window D in the scenario illustrated in FIG. 6 of the present application;
FIG. 11 is a schematic illustration of the display of window B in the scenario illustrated in FIG. 6 of the present application;
FIG. 12 is a block diagram illustrating a display area in the scene shown in FIG. 6 according to the present application;
FIG. 13 is a flowchart illustrating matching of corresponding rendering instructions to each tile region according to an embodiment of the present application;
fig. 14A is a second application scenario of the display processing method according to an embodiment of the present application;
fig. 14B is a second application scenario of the display processing method according to an embodiment of the present application;
fig. 15A is a third application scenario of a display processing method according to an embodiment of the present application;
fig. 15B is a third application scenario of the display processing method according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise.
It should also be understood that in the embodiments of the present application, "a plurality" and "one or more" mean one, two or more; "and/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, a and/or B, may represent: a alone, both A and B, and B alone, where A, B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting".
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to better understand the technical solution of the present application, several important terms related to the present application will be introduced.
Window or picture layer
Each application may correspond to one or more graphical interfaces, each of which may be referred to as a layer (surface), or window (window). For example, fig. 1A is a schematic display diagram of a multi-window simultaneous display. The three different areas filled with different lines in fig. 1A correspond to three different windows, namely window a, window B and window C, respectively. There is an overlap between the three windows. In the schematic diagram shown in fig. 1A, window B partially covers window a, and window C partially covers both window a and window B.
Size of the layer
The size of a layer refers to its display area and reflects how large an area the layer occupies when displayed on the display screen. The size of a layer is usually determined by two parameters, its width and height, both in pixels (px). Multiplying the width by the height gives the size of the layer, that is, the number of pixels it occupies.
Position of the layers
The position of the layer refers to the display position of the layer on the display screen. In general, the start position of the layer is represented by the coordinates of the upper left corner or the lower left corner of the layer. The coordinates of the top left or bottom left corner of the layer are typically pixel coordinates.
It should be noted that once the width, height, and position of a layer are determined, the area of the display screen in which the layer is displayed (including the number of occupied pixels and their coordinates) is also determined.
Transparency of layer
The transparency of a layer refers to its degree of transparency and affects how the layer blends with other layers when they overlap. So that it does not completely shield the layers below it, an upper layer can be given a certain transparency, allowing the superimposed display effect of several layers to be seen at once.
Transparency is usually expressed as a percentage and ranges from 0 to 100%. A window with a transparency of 0 is a non-transparent window, and a window with a transparency greater than 0 is a transparent window.
With continued reference to the schematic diagram shown in fig. 1A, window B has a transparency of 0% and belongs to a non-transparent window. The transparency of the window C is 50%, and the window C belongs to a transparent window.
Z-axis sequence (Z-order)
The Z-axis order is called Z-order for short. A Z-axis is defined perpendicular to the plane of the display screen, and the front-to-back display order of the layers is determined by their coordinates on this Z-axis; this order is the Z-order. The smaller the Z-order of a layer, the nearer the front it is displayed, that is, the higher it is in the stack; the larger the Z-order of a layer, the nearer the back it is displayed, that is, the lower it is in the stack.
In an application scenario in which multiple windows are displayed simultaneously, the windows may overlap. A non-transparent window displayed in front may then partially or completely occlude a window displayed behind it. When the graphics display system displays multiple windows, it can determine from the width, height, position, and Z-order of each window whether the windows overlap on the display screen, and if they do, it can determine the occluded area and the non-occluded area of each window.
The occlusion region of a window is the part of the window occluded by a non-transparent window with a smaller Z-order. The occluded region cannot be seen by the user, and thus may also be called the visually invisible area or the invisible area. The occlusion region of a window is its intersection with a non-transparent window whose Z-order is smaller.
The non-occlusion area of a window is the area that remains after removing the parts occluded by non-transparent windows with a smaller Z-order. The non-occluded area can be seen by the user, and thus may also be called the visually visible area or the visible area. The non-occlusion area of a window is the window's area minus its occlusion region.
With continued reference to the schematic diagram shown in FIG. 1A, window C has a Z-order of 0, window B has a Z-order of 1, and window A has a Z-order of 2. Window C is displayed on the display screen at the forefront. Window a is displayed most back on the display and window B is displayed between window a and window C. The graphical display system controls the display of the three windows a, B and C on the display screen by maintaining a Z-order of the three windows.
The transparency of window C is 50%, so window C is a transparent window; the transparency of window B is 0%, so window B is a non-transparent window. Window A is not occluded by the transparent window C but is partially occluded by the non-transparent window B. The occlusion region of window A is the intersection of window A and window B, as shown in FIG. 1B, and the non-occlusion region of window A is the remaining region of window A excluding its intersection with window B, as shown in FIG. 1C. Window B is not occluded by the transparent window C, so it has no occlusion region, i.e., its occlusion region is 0. The transparent window C is the frontmost window and obviously has no occlusion region.
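The occlusion determination described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it assumes axis-aligned rectangular windows, and the coordinates given for windows A, B and C are hypothetical values chosen to mimic the layout of FIG. 1A.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Window:
    name: str
    x: int               # left edge on the display screen
    y: int               # top edge on the display screen
    width: int
    height: int
    z: int               # smaller Z-axis order = displayed further forward
    transparency: float  # 0.0 = non-transparent; > 0.0 = transparent

def intersection(a: Window, b: Window) -> Optional[Tuple[int, int, int, int]]:
    """Return the overlap of two windows as (x, y, w, h), or None if disjoint."""
    left, top = max(a.x, b.x), max(a.y, b.y)
    right = min(a.x + a.width, b.x + b.width)
    bottom = min(a.y + a.height, b.y + b.height)
    if right <= left or bottom <= top:
        return None
    return (left, top, right - left, bottom - top)

def occlusion_regions(windows, target):
    """Rectangles of `target` hidden by non-transparent windows in front of it."""
    return [r for w in windows
            if w.z < target.z and w.transparency == 0.0
            and (r := intersection(target, w)) is not None]

# Hypothetical geometry mimicking FIG. 1A: C frontmost (Z=0, 50% transparent),
# B in the middle (Z=1, non-transparent), A rearmost (Z=2).
c = Window("C", 0, 0, 100, 60, z=0, transparency=0.5)
b = Window("B", 40, 20, 100, 60, z=1, transparency=0.0)
a = Window("A", 80, 40, 100, 60, z=2, transparency=0.0)

print(occlusion_regions([a, b, c], a))  # A ∩ B only; transparent C does not occlude
print(occlusion_regions([a, b, c], b))  # empty: only transparent C is in front of B
```

Window A's occlusion region comes out as its intersection with B, while B and C have empty occlusion regions, matching the analysis of FIG. 1B and FIG. 1C above.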
Resolution of display screen
The resolution of the display screen refers to the number of pixels in the horizontal and vertical directions, in units of px; it determines how many pixels the display screen of the electronic device can display. For example, a resolution of 160 × 128 means that there are 160 pixels in the horizontal direction and 128 pixels in the vertical direction.
For display screens of the same physical size, when the resolution is low, for example 640 × 480, few pixels are displayed on the screen and each pixel is large; when the resolution is high, for example 1600 × 1200, many pixels are displayed and each pixel is small.
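The relationship between resolution and pixel size can be made concrete with a pixel-density calculation. The 10-inch diagonal below is an assumed value for illustration; only the two resolutions come from the text above.

```python
import math

def pixels_per_inch(res_w: int, res_h: int, diagonal_inches: float) -> float:
    """Pixel density implied by a resolution on a screen of the given diagonal."""
    diagonal_px = math.hypot(res_w, res_h)  # pixel count along the diagonal
    return diagonal_px / diagonal_inches

# Hypothetical 10-inch display at the two resolutions mentioned above:
low = pixels_per_inch(640, 480, 10.0)     # fewer, larger pixels
high = pixels_per_inch(1600, 1200, 10.0)  # more, smaller pixels
print(round(low), round(high))  # 80 200
```

At the same screen size, the higher resolution packs 2.5 times as many pixels per inch, so each individual pixel is correspondingly smaller.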
To explain the technical solutions of the present application, the following description is given by way of specific examples.
As device performance improves and application functions are continuously enhanced, application scenarios such as gaming, live streaming, video calls, meetings and video playback become increasingly rich, and more and more applications support multi-window modes such as split-screen mode, freeform mode and picture-in-picture.
The graphic display system supports displaying multiple layers or windows, but because each window is drawn and rendered in an independent process, the process rendering a window does not know whether that window is occluded by other windows. Each window is therefore rendered in full on its own, and the windows are superimposed and/or clipped only at display time. Simultaneously rendering the complete interface of every window increases the load and wastes resources.
As a non-limiting example, the Android graphics display system framework is used below for illustration.
An Android interface is usually formed by superimposing a plurality of layers, such as a navigation bar, a status bar, a foreground application and a floating window. For example, as shown in FIG. 2, an Android interface is formed by superimposing layer D, layer E, layer F and layer G, where layer D is a window of a first application, layer E is a window of a second application, layer F is a window of a third application, and layer G is a window of a fourth application.
Displaying an interface in the Android system goes through stages such as interface drawing, interface rendering, and composition for display. FIG. 3 is a schematic diagram illustrating the drawing process of an Android interface in the Android system. This example is described with an Android interface that includes four simultaneously displayed application interfaces, each including one layer or window.
As shown in FIG. 3, the multi-window display processing flow mainly includes: after a vertical synchronization (VSYNC) signal arrives, a central processing unit (CPU) draws the display content of each application interface to generate drawing instructions, converts the drawing instructions into rendering instructions, such as OpenGL instructions, and submits the rendering instructions to a graphics processing unit (GPU). The GPU performs transformation, composition and rendering, and submits the rendering result to a frame buffer. When the next VSYNC signal arrives, the video controller reads the data of the frame buffer line by line according to the VSYNC signal and, after possible digital-to-analog conversion, transmits the data to the display screen for display.
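The per-VSYNC flow above can be sketched as a highly simplified simulation. Every stage here is a stub and every function name is hypothetical; the real work spans the CPU, the GPU and the video controller, and the real instructions are OpenGL calls rather than strings.

```python
def cpu_draw(app: str) -> list:
    """CPU stage: generate drawing instructions for one application interface."""
    return [f"draw({app})"]

def to_render_instructions(draw_cmds: list) -> list:
    """Convert drawing instructions into rendering (OpenGL-style) instructions."""
    return [c.replace("draw", "gl_draw") for c in draw_cmds]

def gpu_render(render_cmds: list) -> str:
    """GPU stage: transform, compose and render into frame-buffer contents."""
    return "pixels<" + ";".join(render_cmds) + ">"

def on_vsync(app_interfaces: list) -> list:
    """One VSYNC period: each window is drawn and rendered independently,
    and the results go to the frame buffer, read out on the next VSYNC."""
    frame_buffer = []
    for app in app_interfaces:
        draw_cmds = cpu_draw(app)                       # draw
        render_cmds = to_render_instructions(draw_cmds)  # convert
        frame_buffer.append(gpu_render(render_cmds))     # render and submit
    return frame_buffer

print(on_vsync(["navigation_bar", "status_bar", "foreground_app", "floating_window"]))
```

Note how each of the four interfaces passes through the full draw-convert-render chain on its own; this independence is exactly what causes the wasted work on occluded regions that the method of this application addresses.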
It can be seen that each window is drawn, rendered and otherwise processed in an independent process, and each window is rendered completely before being superimposed with the other windows. Simultaneously rendering the complete interface of every window increases the load and wastes resources.
Therefore, an embodiment of the present application provides a display processing method, applied to a scenario in which multiple windows are displayed simultaneously, that reduces the workload of graphic display, thereby reducing the system load, lowering power consumption and improving performance.
The display processing method provided by the embodiments of the present application can be applied to electronic devices including, but not limited to, a mobile phone, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a tablet computer, a smart speaker, a set top box (STB) or a television. The embodiments of the present application place no limitation on the specific type of the electronic device.
In some embodiments of the present application, the electronic device comprises a mobile phone, a tablet computer, a personal computer, or the like.
Fig. 4 shows a schematic structural diagram of the electronic device 100, taking a mobile phone as an example.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C, to collect sound signals, reduce noise, identify sound sources, implement directional recording, and the like.
The headphone interface 170D is used to connect a wired headset. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing a short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
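The pressure-dependent dispatch in the short message example can be sketched as follows. The threshold value, the function name and the instruction names are all hypothetical; the patent only specifies that intensities below and at-or-above a first pressure threshold map to two different instructions.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized touch intensity threshold

def dispatch_sms_icon_touch(intensity: float) -> str:
    """Map the intensity of a touch on the short message icon to an instruction."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"       # light touch: open the message list
    return "create_new_short_message"     # firm touch: compose a new message

print(dispatch_sms_icon_touch(0.2))  # view_short_message
print(dispatch_sms_icon_touch(0.9))  # create_new_short_message
```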
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatically unlocking when the flip is opened can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode, and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone block vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the bone block vibrated by the vocal part, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be brought into or out of contact with the electronic device 100 by inserting it into, or pulling it out of, the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention uses an Android system with a layered architecture as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 5 is a block diagram of the software configuration of the electronic apparatus 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 5, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 5, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications of download completion, message alerts, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. Examples include prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, and flashing an indicator light.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part comprises functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following illustrates application scenarios and implementation flows of the embodiments of the present application by way of non-limiting examples.
First application scenario
Fig. 6 is a schematic diagram of a first application scenario. The first application scenario is one in which a notebook computer displays multiple windows at the same time. As shown in fig. 6, window A is a system window, window B is a window of a news application projected from a mobile phone onto the notebook computer, window C is a window of a video application opened on the notebook computer, and window D is a window of an instant messaging application opened on the notebook computer.
An embodiment of the present application provides a display processing method, which can be applied to the application scenario shown in fig. 6; in this case, the display processing method can be executed by the notebook computer. First, the window attribute information of each foreground window, namely window A, window B, window C, and window D, such as the size, position, transparency, and Z-axis order of each window on the system side, is acquired, and the number of foreground windows and the stacking relationship of the four windows are determined. When it is detected that the notebook computer is in a multi-window overlapping display scenario, the display area of the display screen of the notebook computer is divided into blocks, and, according to the division result, the drawing instructions corresponding to the visually invisible part of each window in each block area are calculated. The drawing instructions corresponding to the invisible parts are then modified and/or deleted, the remaining drawing instructions after modification and/or deletion are converted into rendering instructions, and the rendering instructions are submitted to the GPU. The GPU only needs to render according to the rendering instructions converted from the remaining drawing instructions of each window, thereby reducing the GPU load.
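A minimal, hypothetical sketch of the culling step described above: a drawing instruction whose bounding box touches only block areas that are occluded for its window can be dropped before conversion to rendering instructions. All function names and data structures here are assumptions for illustration, not the patent's actual implementation.

```python
def blocks_touched(bounds, block_w, block_h, cols):
    """Return the set of block indices (row * cols + col) that a drawing
    instruction's bounding box (x, y, w, h) overlaps."""
    x, y, w, h = bounds
    c0, c1 = x // block_w, (x + w - 1) // block_w
    r0, r1 = y // block_h, (y + h - 1) // block_h
    return {r * cols + c for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)}

def cull_instructions(instructions, occluded_blocks, block_w, block_h, cols):
    """Keep only the drawing instructions that touch at least one block area
    not occluded for this window; the rest need never reach the GPU."""
    return [ins for ins in instructions
            if not blocks_touched(ins["bounds"], block_w, block_h, cols) <= occluded_blocks]
```

For a 1600 × 900 display divided into a 4 × 3 grid (block size 400 × 300), an instruction drawn entirely inside an occluded block is discarded, while one that spills into a visible block is kept.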
It should be noted that, in some other embodiments, the step of dividing the display area of the display screen of the notebook computer into blocks may also be performed at the beginning.
It should be noted that, in other embodiments, the step of dividing the display area of the notebook computer into blocks need not be performed each time the display processing method is executed, and this step may be a preprocessing step. For example, after the first time of partitioning the display area of the notebook computer, the partition result may be stored for later calling.
Specifically, as an embodiment of the present application, as shown in fig. 7, the display processing method includes steps S710 to S760, which are described in detail as follows.
And S710, carrying out block division on the display area to obtain a plurality of block areas.
There are five implementation manners for dividing the display area into blocks, described below. The display area of the display screen may be partitioned according to any of the following implementations or examples of those implementations. It should be noted that the following implementations and examples are only exemplary descriptions and are not used to limit the protection scope of the present application. Indeed, various modifications, combinations, substitutions, or alterations may be contemplated based on the implementations or examples set forth in the specification without departing from the application.
First implementation
In a first implementation manner, the resolution of the display screen is obtained, and the display area of the display screen is divided into blocks according to the resolution to obtain a plurality of block areas.
As a non-limiting example of the first implementation manner, the resolution of the display screen in the notebook computer is obtained, for example, 1600 × 900, and the display area of the display screen is divided into blocks according to the resolution. The system is preset with a division rule, and the display area of the display screen is divided into blocks according to the set division rule. The preset dividing rule can be set by default of the system or can be set by user self-definition. The partitioning rule may include the size of the partitioned area and/or the total number of partitioned areas, and the like.
For example, according to a first preset rule, the display area is divided into 72 block areas distributed in a checkerboard manner, as shown in fig. 8. The 72 block areas form a 9 × 8 checkerboard distribution, which represents a horizontal division into 9 block areas and a vertical division into 8 block areas. As shown in fig. 8, the 9 columns divided in the horizontal direction are sequentially labeled 0 to 8, i.e., the 1st to 9th columns are sequentially labeled 0 to 8; the 8 rows divided in the vertical direction are sequentially labeled 0 to 7, i.e., the 1st to 8th rows are sequentially labeled 0 to 7. Each block area can be numbered by combining the labels of its row and column: for example, the block area corresponding to the 2nd row and the 3rd column is numbered 12; for another example, the block area corresponding to the 7th row and the 9th column is numbered 68. The display screen is divided equally in the vertical and horizontal directions, so each block area has the same size, and the number of pixels occupied by each block area is approximately 178 × 112.
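The numbering convention and block-size arithmetic in this example can be sketched as follows; the function names are assumptions for illustration.

```python
def block_number(row_label, col_label):
    """Number a block area by combining its row label and column label,
    e.g. row label 1, column label 2 -> 12; row label 6, column label 8 -> 68."""
    return row_label * 10 + col_label

def block_pixels(res_w, res_h, cols, rows):
    """Approximate pixel dimensions of one block area in an evenly divided
    cols x rows grid over a res_w x res_h display."""
    return res_w / cols, res_h / rows
```

For a 1600 × 900 display and a 9 × 8 grid, each block area is 1600/9 ≈ 178 by 900/8 ≈ 112 pixels, matching the approximate figure given above.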
For another example, according to a second preset rule, the rule is divided into 8 × 6 block areas distributed in a checkerboard manner, the size of each block area is the same, and the number of pixels occupied by each block area is 200 × 150.
For another example, according to a third preset rule, the display area is divided into 12 block areas, so that the number of pixels occupied by each block area is exactly an integer. For example, the 12 block areas are distributed in a 4 × 3 checkerboard manner, equally divided into 4 block areas in the horizontal direction and 3 block areas in the vertical direction, and the number of pixels occupied by each block area is 400 × 300.
As other non-limiting examples of the first implementation, the sizes of the plurality of block areas may differ. For example, the display screen may not be equally divided in the vertical direction; or it may not be equally divided in the horizontal direction; or it may be equally divided in neither the vertical nor the horizontal direction.
As another non-limiting example of the first implementation, the block division may be performed randomly in a horizontal direction or a vertical direction of the display screen, resulting in a plurality of block areas. More generally, the whole display screen can be divided into a plurality of block areas randomly.
Second implementation
In the second implementation mode, the size of the display screen is obtained, and the display area of the display screen is divided into blocks according to the size to obtain a plurality of block areas.
The system is preset with a division rule, and the display area of the display screen is divided into blocks according to the set division rule. The preset dividing rule can be set by default of the system or can be set by user self-definition. The partitioning rule may include the size of the partitioned area and/or the total number of partitioned areas, and the like.
The size of the display screen is generally referred to as the length of the diagonal, and is typically in inches or centimeters.
For example, the display screen has dimensions of 14 inches, 14 inches being equal to 35.56 centimeters. If the aspect ratio of the display screen is 16:9, the length is 30.99 cm and the width is 17.43 cm.
As a non-limiting example of the second implementation manner, the size of the display screen in the notebook computer is obtained, for example, 14 inches, and the display area of the display screen is divided into blocks according to the size.
For example, the division rule partitions the display area into 72 block areas in a checkerboard distribution; with continued reference to fig. 8, this is a 9 × 8 checkerboard distribution, which represents a horizontal division into 9 block areas and a vertical division into 8 block areas. As shown in fig. 8, the display screen is divided equally in the vertical and horizontal directions, each block area has the same size, and the size of each block area is approximately 3.44 cm × 2.18 cm. When the resolution of the display screen is 1600 × 900, conversion shows that one centimeter corresponds to approximately 52 pixels; since the size of each block area is approximately 3.44 cm × 2.18 cm, the number of pixels in each block area is correspondingly approximately 178 × 112.
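The centimeter arithmetic in this example can be checked with a short sketch; the function names are illustrative assumptions.

```python
import math

def physical_dims_cm(diagonal_inch, aspect_w=16, aspect_h=9):
    """Length and width in cm of a display from its diagonal and aspect ratio."""
    diag_cm = diagonal_inch * 2.54  # 14 inches -> 35.56 cm
    scale = diag_cm / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

def block_physical_cm(diagonal_inch, cols, rows):
    """Physical size of one block area for an evenly divided cols x rows grid."""
    length, width = physical_dims_cm(diagonal_inch)
    return length / cols, width / rows
```

For a 14-inch 16:9 screen this yields roughly 30.99 cm × 17.43 cm, a 9 × 8 block area of about 3.44 cm × 2.18 cm, and, at 1600 × 900, about 52 pixels per centimeter (1600 / 30.99 ≈ 51.6).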
For another example, the partition rule is divided into 8 × 6 partitioned areas distributed in a checkerboard manner, each partitioned area has the same size, and each partitioned area has a size of approximately 3.87 centimeters × 2.90 centimeters.
As another non-limiting example of the second implementation, the display screens of electronic devices are classified into several screen types according to size. For example, they may be classified into three screen types: small screen, medium screen, and large screen; or into two screen types: small screen and large screen; or into four screen types: small screen, medium screen, large screen, and extra-large screen, etc. A division rule for the block division of the display area is determined for each screen type. After the screen type of the display screen to be divided is determined, the display area of that display screen is divided into blocks according to the division rule corresponding to the screen type.
For example, as shown in the following table one, the display screens with different sizes may be divided into three screen types, and the different screen types may correspond to different division rules.
Table 1
[Table 1 is reproduced only as an image in the original patent; it maps display-screen size ranges to screen types and to the corresponding division rules.]
In the application scenario shown in fig. 6, in which multiple windows are simultaneously displayed on the notebook computer, if the size of the display screen of the notebook computer to be divided is 14.4 inches, then according to Table 1 the 14.4-inch display screen belongs to the large screen type, and the division rule corresponding to this screen type is an equally divided 9 × 8 checkerboard. The display screen of the notebook computer is equally divided in the horizontal and vertical directions according to a 9 × 8 checkerboard distribution, and is thus divided into 72 equally sized block areas.
In other application scenarios, for example an application scenario in which multiple windows are simultaneously displayed on a mobile phone, the size of the display screen of the mobile phone to be divided is 6 inches. According to Table 1, the 6-inch display screen belongs to the small screen type, and the division rule corresponding to this screen type is an equally divided 2 × 3 checkerboard. As shown in fig. 9A, the display screen of the mobile phone is equally divided in the horizontal and vertical directions according to a 2 × 3 checkerboard distribution, and is thus divided into 6 equally sized block areas.
It should be understood that, as shown in fig. 9A, in the vertical screen display state of the mobile phone, 2 × 3 chessboard division is performed in the horizontal direction and the vertical direction. If the mobile phone is in the landscape display state, as shown in fig. 9B, the chessboard division is performed by 3 × 2 in the horizontal direction and the vertical direction.
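A minimal sketch of such a screen-type lookup, with the portrait/landscape transposition, might look as follows. Since Table 1 is reproduced only as an image, the size thresholds and the medium-screen grid below are assumptions; only the 6-inch → 2 × 3 and 14.4-inch → 9 × 8 mappings are stated in the text.

```python
# Assumed thresholds and grids: only small -> 2x3 and large -> 9x8 are
# confirmed by the text; the boundaries and the medium grid are hypothetical.
SCREEN_TYPE_RULES = [
    (7.0, "small", (2, 3)),
    (12.0, "medium", (6, 5)),
    (float("inf"), "large", (9, 8)),
]

def division_rule(diagonal_inch, landscape=False):
    """Return (horizontal, vertical) block counts for a screen size; in
    landscape orientation the grid is transposed (e.g. 2x3 becomes 3x2)."""
    for max_diag, _screen_type, (cols, rows) in SCREEN_TYPE_RULES:
        if diagonal_inch <= max_diag:
            return (rows, cols) if landscape else (cols, rows)
```

With these assumed thresholds, a 6-inch phone in portrait gets a 2 × 3 grid and in landscape a 3 × 2 grid, while a 14.4-inch laptop gets 9 × 8, as in the examples above.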
As other non-limiting examples of the second implementation, the sizes of the plurality of block areas may differ. For example, the display screen may not be equally divided in the vertical direction; or it may not be equally divided in the horizontal direction; or it may be equally divided in neither the vertical nor the horizontal direction.
As another non-limiting example of the second implementation, the block division may be performed randomly in a horizontal direction or a vertical direction of the display screen, resulting in a plurality of block areas. More generally, the whole display screen can be divided into a plurality of block areas randomly.
Third implementation
And in the third implementation mode, the resolution ratio of the display screen and the size of the display screen are obtained, and the display area of the display screen is divided into blocks according to the resolution ratio and the size to obtain a plurality of block areas.
In a third implementation, the partitioning is performed in consideration of both resolution and size.
As a non-limiting example of the third implementation, the higher the resolution of the display screen and the larger its size, the larger the total number of block areas may be, i.e., a relatively fine division can be performed; conversely, the lower the resolution of the display screen and the smaller its size, the smaller the total number of block areas may be, i.e., a relatively coarse division can be performed.
As another non-limiting example of the third implementation, the smaller the size, the less the total number of blocking areas may be, i.e. a relatively coarse division may be made, with the same resolution; conversely, the larger the size, the more the total number of partitioned areas may be, i.e., relatively fine partitioning may be performed.
As another non-limiting example of the third implementation, the lower the resolution, the less the total number of blocked regions may be, i.e., a relatively coarse division may be made, with the same size; conversely, the higher the resolution, the more the total number of blocking areas can be, i.e., relatively fine division can be made.
As another non-limiting example of the third implementation, the display screen of the electronic device is divided into several screen types according to the size of the display screen. And determining the resolution level according to the resolution of the display screen aiming at the display screen of each screen type. Different screen types and different resolution levels are provided with respective corresponding division rules. And after determining the screen type of the display screen to be divided, determining the resolution grade, and dividing the display area of the display screen to be divided into blocks according to the division rule corresponding to the screen type and the resolution grade.
For example, referring to the following table two, the display screens of different sizes may be classified into three screen types, i.e., a small screen, a medium screen, and a large screen. The same screen type can be divided into three resolution levels, namely low, medium and high according to the resolution. And setting respective corresponding division rules for different screen types and different resolution levels.
Table 2
[Table 2 is reproduced only as an image in the original patent; it maps combinations of screen type and resolution level to their corresponding division rules.]
In the application scenario shown in fig. 6, in which multiple windows are simultaneously displayed on the notebook computer, suppose the size of the display screen of the notebook computer to be divided is 18 inches and the resolution is 1600 × 900. According to Table 2, the 18-inch display screen is of the large screen type, and for the large screen type the resolution 1600 × 900 is a low resolution. The division rule corresponding to the notebook computer display screen is therefore an equally divided 7 × 6 checkerboard. The display area of the notebook computer display screen is equally divided in the horizontal and vertical directions according to a 7 × 6 checkerboard distribution, and is thus divided into 42 equally sized block areas.
In other application scenarios, for example an application scenario in which multiple windows are simultaneously displayed on a mobile phone, the size of the display screen of the mobile phone to be divided is 6 inches and the resolution is 720 × 1280. According to Table 2, the 6-inch display screen is of the small screen type, and for the small screen type the resolution 720 × 1280 is a high resolution. The division rule corresponding to the mobile phone display screen is therefore an equally divided 3 × 4 checkerboard. The display area of the mobile phone display screen is equally divided in the horizontal and vertical directions, and is thus divided into 12 equally sized block areas.
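These two examples suggest a (screen type, resolution level) lookup, sketched below. Only the two entries given in the text are confirmed; since Table 2 itself is reproduced only as an image, the fallback default here is a hypothetical placeholder.

```python
# Only the two entries stated in the text are confirmed; any other
# (type, level) combination falls back to a hypothetical default grid.
DIVISION_RULES = {
    ("small", "high"): (3, 4),  # 6-inch phone at 720x1280 -> 12 block areas
    ("large", "low"): (7, 6),   # 18-inch laptop at 1600x900 -> 42 block areas
}

def grid_for(screen_type, resolution_level, default=(4, 3)):
    """Look up the (horizontal, vertical) division rule for a screen."""
    return DIVISION_RULES.get((screen_type, resolution_level), default)

def total_blocks(cols, rows):
    """Total number of block areas produced by a cols x rows grid."""
    return cols * rows
```

Applying the two confirmed entries reproduces the 42-block laptop division and the 12-block phone division described above.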
Fourth implementation
In a fourth implementation manner, when the display screen is partitioned, the partition is performed according to the resolution and/or the size of the display screen. The partitioning may also be performed based on the total number of display windows, and/or based on the computational power of the electronic device itself.
When the blocking division is performed according to the total number of the display windows, step S710 may be performed after step S720, that is, after it is determined that a plurality of windows are displayed, the blocking division of the display area of the display screen is performed. It should be understood that the execution order of each step should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
As a non-limiting example of the fourth implementation, the partitioning into blocks is performed according to the total number of windows displayed. The greater the total number of windows displayed, the fewer the total number of blocking areas may be, i.e., a relatively coarse division may be made; conversely, the fewer the total number of windows displayed, the more the total number of blocking areas may be, i.e., relatively fine divisions may be made.
As another non-limiting example of the fourth implementation, the partitioning is performed according to the computational power of the electronic device itself. The weaker the calculation force of the electronic equipment is, the less the total number of the blocking areas can be, namely, the relatively rough division can be carried out; conversely, the stronger the calculation power of the electronic device is, the more the total number of the blocking areas can be, i.e., the relatively more detailed division can be performed.
As another non-limiting example of the fourth implementation, the partitioning is performed in consideration of both the total number of displayed windows and the computing power of the electronic device itself. The larger the total number of displayed windows and the weaker the computing power of the electronic device, the smaller the total number of block areas may be, i.e., a relatively coarse division can be performed; conversely, the smaller the total number of displayed windows and the stronger the computing power of the electronic device, the larger the total number of block areas may be, i.e., a relatively fine division can be performed.
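One way to encode these heuristics is to scale a base grid by the window count and a device compute score. The scaling formula below is purely illustrative and not taken from the patent.

```python
def adjust_grid(base_cols, base_rows, num_windows, compute_score):
    """Coarsen the grid when many windows are shown or the device is weak.
    compute_score >= 1.0 denotes a strong device; the formula is illustrative."""
    factor = min(1.0, compute_score / max(1, num_windows) ** 0.5)
    return max(1, round(base_cols * factor)), max(1, round(base_rows * factor))
```

More displayed windows, or a lower compute score, both shrink the grid toward a coarser division; a strong device with few windows keeps the full base grid.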
Fifth implementation
In a fifth implementation manner, in addition to considering the resolution and/or the size of the display screen, a dividing rule for dividing the blocks may be determined by taking the total number of the displayed windows and/or the computing power of the electronic device into consideration. In a fifth implementation manner, due to comprehensive consideration of multidimensional factors, the adaptability of the embodiment of the application can be improved, and the application can be implemented in different application scenarios.
As a non-limiting example of the fifth implementation, the larger the total number of windows displayed and/or the weaker the computing power of the electronic device, the fewer the total number of blocked regions may be, i.e. a relatively coarse division may be made, given the same resolution and/or size; conversely, the less the total number of windows displayed and/or the more computationally intensive the electronic device, the more the total number of blocked regions may be, i.e., relatively more detailed partitioning may be performed.
As another non-limiting example of the fifth implementation, in the case that the resolution, the size, and/or the total number of windows are the same, the weaker the computation power of the electronic device, the less the total number of blocking areas may be, i.e., a relatively coarse division may be made; conversely, the stronger the calculation power of the electronic device is, the more the total number of the blocking areas can be, i.e., the relatively more detailed division can be performed.
S720, acquiring the position, size, transparency and Z-axis sequence of each displayed window, and determining the stacking relation of each window.
When it is detected that the electronic device displays a plurality of windows, attribute information of the displayed windows is acquired. Windows include, but are not limited to, windows of applications and/or windows of the system. Windows of applications include, but are not limited to, windows of terminal applications or web applications. Terminal applications include, but are not limited to, native applications of the system as well as third-party applications.
The method comprises the steps of obtaining window attribute information such as the position, the size, the transparency and the Z-axis sequence of each window displayed on a foreground side, and then determining the stacking relation of each window according to the attribute information such as the position, the size, the transparency and the Z-axis sequence of each window.
In some embodiments, the stacking relation of the windows may simply indicate whether each window is occluded. Alternatively, in other embodiments, the stacking relation of the windows includes, but is not limited to: the front-to-back display order of the windows, whether each window is occluded, the occluded area and/or non-occluded area of each occluded window, and the like.
If the window displayed in front is a transparent window, that is, a window whose transparency is not 0, the covered window displayed behind it is not occluded and no occluded area exists. In some implementations, the non-occluded, or visible, area of the covered window can be marked.
If the window displayed in front is an opaque window, that is, a window whose transparency is 0, the window covered behind it is occluded and an occluded area exists. In some implementations, the occluded and non-occluded areas of the covered window can be marked separately; alternatively, only the occluded areas or only the non-occluded areas may be marked.
If the entire window is covered by other windows that are opaque, the entire window is occluded and the entire window may be marked as occluded or invisible.
In some implementations, the non-occluded area, i.e., the visible area, of each window is calculated in a coordinate manner based on the Z-axis order, size, position, and transparency of each window, and the coordinate range of the visible area is recorded.
In some implementations, apart from the frontmost window, whether each window is occluded, and the occluded and non-occluded areas of each occluded window, may be determined window by window in Z-axis order, starting from the second-frontmost window. In other implementations, whether each window is occluded may be calculated sequentially in some other order, and the occluded and non-occluded areas of each occluded window determined. Alternatively, whether each window is occluded may be calculated in parallel, and the occluded and non-occluded areas of each occluded window determined.
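The per-window walk just described can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: rectangles are (x0, y0, x1, y1) tuples, a smaller `z` means displayed further forward, and `transparency` 0 denotes an opaque window, matching the convention used in Table 3.

```python
def intersect(a, b):
    """Intersection of two rects (x0, y0, x1, y1); None if disjoint."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

def occluded_areas(windows):
    """For each window except the frontmost, collect the rectangles in
    which it is covered by opaque windows displayed in front of it.
    `windows` maps a name to {'rect': ..., 'z': ..., 'transparency': ...}.
    """
    order = sorted(windows, key=lambda name: windows[name]['z'])
    result = {}
    for pos, name in enumerate(order[1:], start=1):
        covers = []
        for front in order[:pos]:               # windows displayed in front
            if windows[front]['transparency'] != 0:
                continue                        # transparent windows do not occlude
            ov = intersect(windows[name]['rect'], windows[front]['rect'])
            if ov is not None:
                covers.append(ov)
        result[name] = covers
    return result
```

With the four windows of Table 3, this walk starts at the second-frontmost window (window C, z=1) and works backwards to window A, which is what the sequential implementation above describes.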
As a non-limiting example, in the application scenario in which multiple windows are simultaneously displayed on the notebook computer shown in fig. 6, the stacking relation between the four windows may be determined in turn according to the position, size, transparency, and Z-axis order of each window. The stacking relation may include whether each window is occluded, and may further include the non-occluded area and/or occluded area of each occluded window, and the like.
Specifically, in the application scenario shown in fig. 6, the window attribute information obtained for window A, window B, window C, and window D is shown in Table 3 below.
Table 3
Window | Window A | Window B | Window C | Window D
Size | w1*h1 | w2*h2 | w3*h3 | w4*h4
Position | (x1,y1) | (x2,y2) | (x3,y3) | (x4,y4)
Z-axis order | z1=3 | z2=2 | z3=1 | z4=0
Transparency | 0% | 0% | 0% | 0%
In the example shown in Table 3, the size of each window is expressed as the number of pixels it occupies in the horizontal and vertical directions, and the position of each window is expressed as the pixel coordinates of its lower left corner.
wi represents the number of pixels occupied by the ith window in the horizontal direction, and hi represents the number of pixels occupied by the ith window in the vertical direction. Wherein i takes on values of 1,2,3 and 4.
(xj, yj) represents the bottom left corner pixel coordinate of the jth window. Wherein j has values of 1,2,3 and 4.
zk denotes the Z-axis order of the kth window. Wherein k is 1,2,3 and 4.
The 1 st window is window a, the 2 nd window is window B, the 3 rd window is window C, and the 4 th window is window D.
From the window attribute information of window A, window B, window C, and window D, it is determined that window D is displayed frontmost and is not occluded; window C is next and is partially occluded by the opaque window D; window B is partially occluded by the opaque window D; and window A, displayed at the bottom layer, is partially occluded by the opaque windows B, C, and D.
Taking window C as an example, whether window C is occluded, and its occluded and non-occluded areas, are described below. As shown in FIG. 10, the lower-left coordinates (x3, y3) of window C are (600,200), the size of window C is 650 × 400, the Z-axis order z3 is 1, and the transparency is 0%. The window with a smaller Z-axis order than window C, i.e., displayed in front of window C, is window D, whose Z-axis order is 0. The lower-left coordinates (x4, y4) of window D are (200,50), the size of window D is 550 × 500, and the transparency is 0%. If the window coordinates of window C and window D overlap, the part of window C that overlaps window D is occluded. Whether the window coordinates of window C and window D overlap can be calculated from attribute information such as the size and position of windows C and D. The overlapping area between window C and window D is the rectangular area with coordinates (600,200) and (750,550) as vertices, that is, the horizontal coordinate range of the overlapping area is 600 to 750 and the vertical coordinate range is 200 to 550. Therefore, the occluded area of window C is this overlapping area, and the remaining area of window C's window coordinates outside the overlapping area is its non-occluded area.
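The overlap derived for window C above can be checked numerically. The sketch below is only a restatement of the arithmetic in that paragraph, using (x0, y0, x1, y1) rectangles built from each window's lower-left corner and size.

```python
def rect_of(lower_left, size):
    """Build an (x0, y0, x1, y1) rectangle from lower-left corner + size."""
    (x, y), (w, h) = lower_left, size
    return (x, y, x + w, y + h)

c = rect_of((600, 200), (650, 400))   # window C: x in 600..1250, y in 200..600
d = rect_of((200, 50), (550, 500))    # window D: x in 200..750,  y in 50..550

# The overlap is the intersection of the coordinate ranges on each axis.
overlap = (max(c[0], d[0]), max(c[1], d[1]), min(c[2], d[2]), min(c[3], d[3]))
print(overlap)  # (600, 200, 750, 550): window C's occluded area
```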
Following the example of window C, whether the other windows are occluded, and the occluded and/or non-occluded areas of each occluded window, can be determined by analogy. Continuing with fig. 6, for example, window B is partially occluded by the opaque window D, so an occluded area exists; the position and size of the occluded area and/or non-occluded area of window B are determined from attribute information such as the size and position of windows B and D. For another example, window A is occluded by the opaque windows B, C, and D, so an occluded area exists; the position and size of the occluded area and/or non-occluded area of window A can be determined from attribute information such as the size and position of windows A, B, C, and D.
It should be understood that in other implementations, the size and position of the window may also be represented by coordinates in the lower left and upper right corners of the window, possibly due to differences in the graphical display system; alternatively, the size and position of the window may also be represented by coordinates of the top left and bottom right corners of the window. The present application does not specifically limit the specific representation of each attribute of the window.
As a non-limiting example, the size and position of a rectangular window are represented by a set of numbers (0, 0, 1080, 1765), where (0,0) represents the coordinates of the lower left corner of the rectangular window and (1080,1765) represents the coordinates of its upper right corner. From this set of numbers, the upper, lower, left, and right boundaries of the rectangular window can be determined, i.e., all window coordinates of the rectangular window are determined: the horizontal coordinate range of the window coordinates is 0 to 1080, and the vertical coordinate range is 0 to 1765. Therefore, whether overlapping display exists in a multi-window display scene can be determined by judging whether an intersection exists between the window coordinates of different windows, the intersection between the window coordinates of two windows being the overlapping area between those two windows.
In step S720, the stacking relationship of the windows in the multi-window display scene is determined, so that whether the occlusion display occurs can be determined according to the stacking relationship.
When it is determined that an occluded display occurs, meaning that the display content of some windows is occluded by other windows, there is an occluded area, i.e., a visually invisible area. Thus, before each window is submitted for GPU rendering, all drawing instructions of each window are fetched, and the subsequent steps are continued. That is, when it is determined that the occlusion display occurs, steps S730 to S760 are performed.
When it is determined that no occluded display occurs, each window is rendered independently as a complete window and transmitted to the display screen for display. That is, when it is determined that occluded display does not occur, step S770 is executed to display the windows according to the drawing instructions of each window. For example, in some implementations, the display processing flow shown in FIG. 3 may be employed.
S730, if it is determined from the stacking relation of the windows that occluded display occurs, acquiring all drawing instructions of each window before the windows are submitted for GPU rendering.
Occluded display means that at least one window has an occluded area, i.e., a visually invisible area. In the case that two windows are displayed in an overlapping manner, when the window displayed in front is an opaque window, the window displayed behind is occluded. Therefore, in the embodiment of the application, whether occluded display occurs is determined according to the stacking relation of the windows, and if occluded display occurs, all drawing instructions of each window are acquired before the windows are submitted for GPU rendering.
As a non-limiting example, the description continues with the application scenario shown in fig. 6 as an example. Fig. 11 shows a window B of a news application in the application scenario shown in fig. 6. As shown in fig. 11, the area outlined by the black box 111 in fig. 11 is a news listing display area 111 in the window B of the news application. An example of a drawing instruction corresponding to the news listing display area 111 is as follows:
DrawRenderNode (FeedeRecycleView 0xb2cf1c00) # control type and object Address #
(ClipRect 100,100,550,580) # display area shape and size #
As can be seen, the drawing instruction includes information such as the control type, the object address, and the shape and size of the display area. The object address generally includes resources and the like. In the above example of a drawing instruction, the news listing display area is a rectangular area whose lower left corner is at coordinates (100,100) and whose upper right corner is at (550,580).
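To make the display-area field concrete, the snippet below pulls the rectangle out of a dumped ClipRect line like the one above. The textual format is assumed from this example only; real instruction streams are structured objects, not text.

```python
import re

def parse_clip_rect(line):
    """Extract (x0, y0, x1, y1) from a '(ClipRect x0,y0,x1,y1)' dump line;
    returns None if the line is not a ClipRect entry."""
    m = re.search(r'ClipRect\s+(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)', line)
    return tuple(int(g) for g in m.groups()) if m else None

rect = parse_clip_rect('(ClipRect 100,100,550,580)')
print(rect)  # (100, 100, 550, 580)
```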
S740, determining the completely-occluded windows according to the stacking relation of the windows, and removing the drawing instructions of the completely-occluded windows.
The stacking relation of the windows may include whether each window is occluded, and may also include the occluded and non-occluded areas of each occluded window. Therefore, completely-occluded windows and non-completely-occluded windows can be determined from the stacking relation.
A window that is completely occluded may also be referred to as a completely-occluded window, i.e., a window whose entire area is completely covered by other windows displayed in front of it. The whole of a completely-occluded window is an occluded area. In the application scenario shown in fig. 6, there are no completely-occluded windows.
A window that is not completely occluded may also be referred to as a non-completely-occluded window, i.e., a window whose entire area is not completely covered by other windows displayed in front of it.
For a completely-occluded window, all of its drawing instructions are removed or deleted and are not submitted to the GPU for rendering. This reduces the rendered content and saves system resources.
S750, matching corresponding drawing instructions to each block area, according to the block areas into which the display screen is divided, to obtain the processed drawing instructions of the non-completely-occluded windows.
In step S740, after the completely-occluded windows and the non-completely-occluded windows are determined, all drawing instructions of the completely-occluded windows are removed or deleted; in step S750, the drawing instructions of the non-completely-occluded windows are processed to obtain the processed drawing instructions of each non-completely-occluded window. A non-completely-occluded window, i.e., a window whose entire area is not completely occluded by other windows displayed in front of it, includes both an occluded area and a non-occluded area.
In some implementations, the computation tasks of the block areas may be started one after another, and each computation task matches corresponding drawing instructions for one block area according to the stacking relation of the non-completely-occluded windows within that block area, so as to obtain the processed drawing instructions of each non-completely-occluded window.
In other implementations, multiple computation tasks may be started simultaneously, each matching corresponding drawing instructions for one block area according to the stacking relation of the non-completely-occluded windows within that block area. In this way, the drawing instructions corresponding to multiple block areas can be calculated at the same time, improving display processing efficiency.
If only a rough stacking relation was calculated in step S720, i.e., only completely-occluded and non-completely-occluded windows were distinguished, then in step S750 the stacking relation of the non-completely-occluded windows falling into each block area still needs to be calculated, i.e., the transparency, Z-axis order, occluded area, and non-occluded area of each non-completely-occluded window are determined. If the stacking relation was calculated in detail in step S720, i.e., not only were completely-occluded and non-completely-occluded windows distinguished, but the transparency, Z-axis order, occluded area, non-occluded area, and the like of each non-completely-occluded window were also determined, then in step S750 the stacking relation calculated in step S720 may be used to determine the stacking relation of the non-completely-occluded windows falling into each block area. The stacking relation of a non-completely-occluded window may include its transparency, Z-axis order, occluded area, non-occluded area, and the like. The drawing instructions of the non-completely-occluded windows falling into each block area are then processed according to the stacking relation within that block area, to obtain the processed drawing instructions of each non-completely-occluded window.
As a non-limiting example, continuing with the application scenario shown in fig. 6 as an example, as shown in fig. 12, the display screen is divided into 9 × 8 block areas for illustration. For a certain computation task of a certain block area, as shown in fig. 13, the computation task may include the following processes:
It is determined that multiple non-completely-occluded windows are stacked within the block area; the drawing instructions of each non-completely-occluded window in the block area are then obtained, and each drawing instruction is retained, deleted, or modified according to the stacking relation of the non-completely-occluded windows within the block area.
According to the size and position of each non-completely-occluded window and the size and position of the block area, it can be determined which non-completely-occluded windows fall into the block area; these may be called target windows. The drawing instructions of each target window within the block area, which may be called target drawing instructions, are then obtained. Each target drawing instruction of each target window is processed in a corresponding manner, i.e., retained, deleted, or modified according to whether the target drawing instruction falls completely within the block area and the stacking relation of the target windows within the block area. The analysis below distinguishes two cases, case a and case b, according to whether the target drawing instruction falls completely within the block area.
Case a: one target drawing instruction of one target window is completely in one block area
a. If a certain target drawing instruction of a certain target window is completely in the partitioned area, the target drawing instruction is determined to be reserved, deleted or modified according to the stacking relation.
Specifically, the case a includes four cases a1 to a4.
a1, if the target window is at the top layer, i.e., displayed frontmost, the target drawing instruction is retained.
In this case, within the block area, the frontmost window is obviously not occluded, so the drawing instructions of the frontmost window that fall completely within the block area need to be retained.
As a non-limiting example, the description continues with the application scenario shown in fig. 6 as an example. As shown in fig. 12, the display screen is divided into 9 × 8 block areas, and each block area is numbered according to the combination number of rows and columns, for example, a block area with a number of 22 refers to a block area corresponding to the 3 rd row and the 3 rd column.
For the block area numbered 22, the target windows falling within block area 22 include window A, window B, and window D. The target drawing instructions of window A that fall completely within block area 22 are acquired, as are those of window B and those of window D. Within block area 22, window D is displayed frontmost, so if a target drawing instruction of window D falls completely within block area 22, that target drawing instruction is retained.
a2, if the target window is at a lower layer and the other target windows above it are transparent windows, the target drawing instruction is retained.
In this case, within the block area, a window displayed behind (or below) cannot be occluded by a transparent window displayed in front of (or above) it, so the drawing instructions of the window displayed behind that fall completely within the block area need to be retained.
a3, if the target window is at a lower layer, the other target windows above it are opaque, and the target drawing instruction is completely occluded, the target drawing instruction is deleted.
In this case, in a certain blocking area, a window displayed later (or displayed lower) may be blocked by a non-transparent window displayed earlier (or displayed upper), and therefore, when a target drawing instruction is completely blocked, a drawing instruction of the window displayed later that completely falls into the blocking area may be deleted. That is, when a certain target window is completely blocked by other non-transparent windows displayed in the front in a certain block region, the target drawing instructions in which the target window completely falls in the block region are deleted.
For example, continuing with fig. 12, for the block area numbered 22, the target windows falling within block area 22 include window A, window B, and window D. The target drawing instructions of window A that fall completely within block area 22 are acquired, as are those of window B and those of window D.
Within block area 22, window B is displayed behind window D, which is an opaque window. The area of window B within block area 22 is completely occluded by window D. If a target drawing instruction of window B falls completely within block area 22, that target drawing instruction is deleted.
Within block area 22, window A is displayed behind windows B and D, both of which are opaque. The area of window A within block area 22 is completely occluded by windows B and D. If a target drawing instruction of window A falls completely within block area 22, that target drawing instruction is deleted.
a4, if the target window is at a lower layer, the other target windows above it are opaque, and the target drawing instruction is partially occluded, the target drawing instruction is modified or retained.
In this case, in a certain block area, a window displayed later (or displayed lower) may be blocked by a non-transparent window displayed earlier (or displayed upper), so that when the target drawing instruction is partially blocked, the drawing instruction of the window displayed later which completely falls into the block area may be modified or retained. That is, when a target window is partially occluded in a certain block region by other non-transparent windows displayed in front, the target drawing instructions in which the target window completely falls in the block region are modified or retained.
Specifically, the case a4 may include two cases, a4.1 and a4.2.
a4.1, if the target drawing instruction of the target window is regularly occluded, the display area of the target drawing instruction is modified to the visible area, or non-occluded area.
A target drawing instruction of a target window being regularly occluded includes the following cases: the occluded area of the target drawing instruction within the block area is a rectangle; or the visible area, or non-occluded area, of the target window within the block area has a regular shape, such as a rectangle. In these cases, the display area of the target drawing instruction is modified to the visible area or non-occluded area. It should be understood that the visible area may have other shapes; depending on the expressive capability of the graphics display system actually used, any shape that the system can express may be used.
For example, continuing with fig. 12, for the block area numbered 30, the target windows falling within block area 30 include window A and window B. The target drawing instructions of window A that fall completely within block area 30 are acquired, as are those of window B.
Within block area 30, window A is displayed behind window B, which is an opaque window. The area of window A within block area 30 is partially occluded by window B, and the visible area of window A within block area 30 is a rectangular area. If a target drawing instruction of window A falls completely within block area 30, the display area of that target drawing instruction is modified to that rectangular area. For example, if the visible area of window A within block area 30 is the rectangular area (0,410,100,450), the display area of the target drawing instruction is modified from the original (ClipRect 0,410,150,450) to (ClipRect 0,410,100,450).
a4.2, if the target drawing instruction of the target window is irregularly occluded, the smallest modifiable rectangular area of the non-occluded area is sought; if one exists, the display area of the target drawing instruction is modified to that rectangular area; if not, the target drawing instruction is retained.
A target drawing instruction of a target window being irregularly occluded includes the following cases: the occluded area of the target drawing instruction within the block area is a circle or a sector; or the visible area, or non-occluded area, of the target window within the block area has an irregular, e.g., non-rectangular, shape. In these cases, the smallest modifiable rectangular area of the non-occluded area is sought; if one exists, the display area of the target drawing instruction is modified to that rectangular area; if not, the target drawing instruction is retained. The smallest modifiable rectangular area may be the smallest of the rectangles whose four vertices fall on the boundary of the non-occluded area. It should be understood that the smallest modifiable rectangular area is used here as an example; in other implementations the shape may differ, and depending on the expressive capability of the graphics display system actually used, any shape that the system can express may be adopted.
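Cases a1 through a4.2 amount to a retain/delete/modify decision per instruction. The sketch below is a simplified, hypothetical version of that decision: it considers one opaque occluder rectangle at a time and treats only full-height strip occlusion as "regular" (leaving one rectangular visible strip), falling back to retaining the instruction otherwise; the embodiment's smallest-modifiable-rectangle search is not reproduced.

```python
def process_instruction(draw_rect, occluders):
    """Decide the fate of a target drawing instruction whose display area
    `draw_rect` (x0, y0, x1, y1) lies entirely in one block area.
    `occluders`: opaque areas of target windows displayed in front,
    clipped to the block area. Returns 'keep', 'delete', or
    ('modify', new_rect)."""
    for occ in occluders:
        ix0, iy0 = max(draw_rect[0], occ[0]), max(draw_rect[1], occ[1])
        ix1, iy1 = min(draw_rect[2], occ[2]), min(draw_rect[3], occ[3])
        if ix0 >= ix1 or iy0 >= iy1:
            continue                      # no overlap: cases a1 / a2
        if (ix0, iy0, ix1, iy1) == draw_rect:
            return 'delete'               # completely occluded: case a3
        # Partially occluded (case a4): regular if the occluder spans the
        # full height, leaving one rectangular visible strip (case a4.1).
        if iy0 <= draw_rect[1] and iy1 >= draw_rect[3]:
            if ix0 > draw_rect[0]:        # occluder covers the right part
                return ('modify', (draw_rect[0], draw_rect[1], ix0, draw_rect[3]))
            if ix1 < draw_rect[2]:        # occluder covers the left part
                return ('modify', (ix1, draw_rect[1], draw_rect[2], draw_rect[3]))
        return 'keep'                     # irregular occlusion: retain (a4.2 fallback)
    return 'keep'
```

With the a4.1 numbers above, an instruction with display area (0,410,150,450) occluded on its right side is modified to (0,410,100,450).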
Case b: one target drawing instruction of one target window falls into several block areas
b. If a target drawing instruction of a target window falls into several block areas, those block areas are treated together as one large block area for calculation, and the drawing instruction is retained, deleted, or modified by a method similar to that described above.
If a target drawing instruction of a target window falls into several block areas, the several block areas are merged or combined into a large area, and the merged or combined large area can be referred to as a first area.
For example, if it is determined that the target window is at the top layer, i.e., displayed frontmost, in the first area, or that the target window is not at the top layer in the first area but the other windows covering it in the first area are transparent windows, it means that the target drawing instruction is not occluded, and the target drawing instruction of the target window is retained.
If it is determined that the target drawing instruction is completely occluded in the first area, or that the target window is completely occluded in the first area, the target drawing instruction of the target window is deleted. For example, if the target window is not at the top layer in the first area, the other target windows above it are opaque, and the target drawing instruction is completely occluded, the target drawing instruction is deleted.
If it is determined that the target drawing instruction of the target window is partially and regularly occluded in the first area, the display area of the target drawing instruction is modified to the visible area, or non-occluded area. For example, if the target window is not at the top layer in the first area, the other target windows above it are opaque, and the target drawing instruction is regularly occluded, the display area of the target drawing instruction is modified.
If it is determined that the target drawing instruction of the target window is partially and irregularly occluded in the first area, the smallest modifiable rectangular area of the non-occluded area is sought; if one exists, the display area of the target drawing instruction is modified to that rectangular area; if not, the target drawing instruction is retained. For example, if the target window is not at the top layer in the first area, the other target windows above it are opaque, and the target drawing instruction is irregularly occluded, the smallest modifiable area of the non-occluded portion of the target drawing instruction in the first area is sought; if it is found, the display area of the target drawing instruction is modified to that smallest modifiable area; if it is not found, the target drawing instruction is retained.
As a non-limiting example, continuing to refer to fig. 12, the news listing display area 111 in window B of the news application corresponds to one drawing instruction, and that drawing instruction falls into the following 20 block areas, numbered: 21, 22, 23, 24; 31, 32, 33, 34; 41, 42, 43, 44; 51, 52, 53, 54; 61, 62, 63, 64. These 20 block areas are combined into one large area, i.e., the first area on the display screen corresponding to the news listing display area 111 shown in fig. 12. In the first area, window D regularly occludes window B, and the visible area of window B within the first area is a rectangle. The display area of the drawing instruction corresponding to the news listing display area 111 is modified to that visible area. For example, if the visible area of window B within the first area is a rectangle whose position and size are (100,100,200,580), the display area of the drawing instruction corresponding to the news listing display area 111 is modified from the original (ClipRect 100,100,550,580) to (ClipRect 100,100,200,580).
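The merging of block areas into a "first area" can be sketched as a bounding rectangle over the tiles' row/column indices. The id scheme (id = 10*row + col, zero-based) follows the fig. 12 numbering described above; the 100 × 100 pixel tile size in the usage example is an assumption for illustration.

```python
def merge_tiles(tile_ids, tile_w, tile_h):
    """Combine block areas into one large 'first area': the bounding
    rectangle (x0, y0, x1, y1) of the tiles, in tile-grid coordinates.
    Tile ids encode row and column as id = 10 * row + col (zero-based)."""
    rows = [tid // 10 for tid in tile_ids]
    cols = [tid % 10 for tid in tile_ids]
    return (min(cols) * tile_w, min(rows) * tile_h,
            (max(cols) + 1) * tile_w, (max(rows) + 1) * tile_h)

# The 20 block areas listed above: rows 2..6, columns 1..4.
tiles = [10 * r + c for r in range(2, 7) for c in range(1, 5)]
first_area = merge_tiles(tiles, 100, 100)
print(first_area)  # (100, 200, 500, 700)
```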
S760, displaying according to the processed drawing instructions of each non-completely-occluded window.
In step S750, corresponding drawing instructions are matched to each block area, yielding the processed drawing instructions of each non-completely-occluded window.
In some implementations, the processed drawing instructions of each non-completely-occluded window are converted into rendering instructions, which are submitted to a GPU or a display subsystem (MDSS). The GPU or MDSS then sends the rendered content to the display screen or display device for display.
In other implementations, the processed drawing instructions of each non-completely-occluded window are converted into rendering instructions, such as OpenGL instructions, and the rendering instructions are submitted to a GPU or MDSS. Rendering is then performed by the GPU or MDSS, which submits the rendering results to a frame buffer. Finally, when the next VSYNC signal arrives, the video controller of the electronic device reads the frame buffer data line by line according to the VSYNC signal and transmits it, possibly after digital-to-analog conversion, to the display screen or display device for display.
According to the embodiment of the application, in a scenario where multiple windows are displayed simultaneously, simplifying the drawing instructions reduces the graphics display workload, thereby reducing the system load, lowering power consumption, and improving performance.
Second application scenario
Fig. 14A and 14B are schematic diagrams of a second application scenario, an application scenario of a mobile phone. In this application scenario, as shown in fig. 14A, the mobile phone opens a browser application and displays a first window 141 of the browser application. As shown in fig. 14B, with the browser application open, an instant messaging application is opened, and the mobile phone simultaneously displays the first window 141 of the browser application and a second window 142 of the instant messaging application. The second window 142 of the instant messaging application occludes a portion of the first window 141 of the browser application.
When the mobile phone receives an operation of the user opening the browser application, window attribute information of the first window 141 of the browser application, such as the size, position and transparency, is obtained through the drawing instructions of the browser application. The number of windows and the window stacking relationship of the current system are then obtained according to the window attribute information of the first window 141.
It is determined from the number of windows and the window stacking relationship that only the first window 141 needs to be displayed, and that the first window 141 is displayed foremost without being occluded. All drawing instructions of the first window 141 are retained. The drawing instructions of the first window 141 are converted into rendering instructions and then submitted to the GPU for rendering; the GPU sends the rendering result to the display screen for display, and the display screen of the mobile phone displays the first window 141 of the browser application, as shown in fig. 14A.
When the mobile phone receives an operation of the user opening the instant messaging application, window attribute information of the second window 142 of the instant messaging application, such as the size, position, transparency and Z-axis order, is obtained through the drawing instructions of the instant messaging application. The window attribute information of the first window 141 of the browser application, such as the size, position, transparency and Z-axis order, is obtained through the drawing instructions of the browser application. Then, the window stacking relationship of the current system is obtained according to the window attribute information of the first window 141 and the second window 142.
According to the window stacking relationship, it is determined that the second window 142 is displayed foremost without occlusion, so all drawing instructions of the second window 142 are retained. It is also determined that the first window 141 is partially occluded by the non-transparent second window 142 displayed in front of it. According to the invisible area of the first window 141, the drawing instructions of the first window 141 that fall completely into the invisible area are identified and deleted, so as to obtain the simplified drawing instructions of the first window 141.
All the drawing instructions of the second window 142 are converted into rendering instructions and submitted to the GPU for rendering; the simplified drawing instructions of the first window 141 are converted into rendering instructions and then submitted to the GPU for rendering. The GPU sends the rendering results of the two windows to the display screen for display. As shown in fig. 14B, the mobile phone displays the second window 142 on the basis of fig. 14A, with the second window 142 displayed overlapping the first window 141.
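The simplification applied to the first window 141 can be sketched as a containment test: any drawing instruction whose display area lies completely inside the invisible area is dropped. The (x, y, width, height) rectangle layout and the sample coordinates below are assumptions for illustration:

```python
def contains(outer, inner):
    """True if rectangle `inner` lies completely inside rectangle `outer` (x, y, w, h)."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

def simplify(ops, invisible_rect):
    """Drop every drawing instruction whose display area falls completely into the invisible area."""
    return [op for op in ops if not contains(invisible_rect, op)]

# Assume the second window covers (300, 400, 400, 500) of the first window.
invisible = (300, 400, 400, 500)
ops = [(0, 0, 300, 200),      # fully visible: kept
       (320, 420, 100, 100),  # completely inside the invisible area: deleted
       (250, 380, 200, 200)]  # partially occluded: kept (its clip may be modified elsewhere)
print(len(simplify(ops, invisible)))  # → 2
```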
Third application scenario
Fig. 15A and 15B are schematic diagrams of a third application scenario, an application scenario of a tablet computer. In this application scenario, as shown in fig. 15A, the tablet computer opens an instant messaging application and displays a first window 151 of the instant messaging application. As shown in fig. 15B, with the instant messaging application open, a player application is opened, and the tablet computer simultaneously displays the first window 151 of the instant messaging application and a second window 152 of the player application. The second window 152 of the player application completely occludes the first window 151 of the instant messaging application.
When the tablet computer receives an operation of the user opening the instant messaging application, window attribute information of the first window 151 of the instant messaging application, such as the size, position and transparency, is obtained through the drawing instructions of the instant messaging application. The number of windows and the window stacking relationship of the current system are then obtained according to the window attribute information of the first window 151.
It is determined from the number of windows and the window stacking relationship that only the first window 151 needs to be displayed, and that the first window 151 is displayed foremost without being occluded. All drawing instructions of the first window 151 are retained. The drawing instructions of the first window 151 are converted into rendering instructions and then submitted to the GPU for rendering; the GPU sends the rendering result to the display screen for display, and the display screen of the tablet computer displays the first window 151 of the instant messaging application, as shown in fig. 15A.
When the tablet computer receives an operation of the user opening the player application, window attribute information of the second window 152 of the player application, such as the size, position, transparency and Z-axis order, is obtained through the drawing instructions of the player application. The window attribute information of the first window 151 of the instant messaging application, such as the size, position, transparency and Z-axis order, is obtained through the drawing instructions of the instant messaging application. Then, the window stacking relationship of the current system is obtained according to the window attribute information of the first window 151 and the second window 152.
It is determined from the window stacking relationship that the second window 152 is displayed foremost, without occlusion. All drawing instructions of the second window 152 are retained. It is also determined from the window stacking relationship that the first window 151 is completely occluded by the non-transparent second window 152 displayed in front of it, so all drawing instructions of the first window 151 are deleted.
All the drawing instructions of the second window 152 are converted into rendering instructions and submitted to the GPU for rendering, while no drawing instructions of the first window 151 need to be submitted to the GPU for rendering. The GPU sends the rendering result to the display screen for display; as shown in fig. 15B, the tablet computer displays only the second window 152 of the player application.
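The full-occlusion check applied to the first window 151 can be sketched as follows; the rectangle layout, window sizes and function name are illustrative assumptions:

```python
def fully_occluded(window_rect, front_windows):
    """True if some non-transparent window displayed in front covers the whole
    window rectangle. Rectangles are (x, y, width, height); each front window is
    a (rect, is_transparent) pair."""
    wx, wy, ww, wh = window_rect
    for (fx, fy, fw, fh), transparent in front_windows:
        if transparent:
            continue  # a transparent window in front does not occlude
        if fx <= wx and fy <= wy and wx + ww <= fx + fw and wy + wh <= fy + fh:
            return True
    return False

first_window = (0, 0, 1200, 800)
second_window = ((0, 0, 1200, 800), False)  # opaque player window covering the screen
# When this returns True, all drawing instructions of the first window are deleted
# and nothing is submitted to the GPU for it.
print(fully_occluded(first_window, [second_window]))  # → True
```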
In each refresh of the display, the system acquires the window attribute information of the current foreground windows, and further acquires the number of windows and the window stacking relationship of the current system according to the window attribute information, wherein the window stacking relationship includes whether each window is occluded, and the visible area and/or invisible area of each occluded window. Then, the drawing instructions of each window are processed according to the window stacking relationship to obtain the processed drawing instructions of each window. Finally, the processed drawing instructions are converted into rendering instructions and submitted to the GPU for rendering, and the GPU sends the rendering result to the display screen for display.
With reference to the second and third application scenarios and the related drawings, an embodiment of the present application provides a display processing method. In a multi-window display state, in each refresh of the display, the system obtains the window attribute information of the current windows, and then obtains the number of windows and the stacking relationship of the windows according to the window attribute information, where the stacking relationship may include whether each window is occluded, and may also include the visible area and/or invisible area of each occluded window. Then, according to the stacking relationship of the windows, the drawing instructions of each window are processed to obtain the processed drawing instructions of each window; for example, the drawing instructions that fall completely into the occluded region of a window are deleted to obtain simplified drawing instructions. Finally, the processed drawing instructions are converted into rendering instructions and submitted to the GPU for rendering, and the GPU sends the rendering result to the display screen for display.
Through this embodiment, in a scenario where multiple windows are displayed simultaneously, the graphics display workload is reduced, which in turn reduces the system load, lowering power consumption and improving performance.
It should be understood that the execution sequence of each process in the above embodiments should be determined by the function and the inherent logic thereof, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Corresponding to the display processing method described in the foregoing embodiments, an embodiment of the present application provides a display processing apparatus, where the display processing apparatus is configured on an electronic device, and each module included in the display processing apparatus may correspondingly implement each step of the display processing method.
It will be appreciated that the electronic device, in order to implement the above-described functions, comprises corresponding hardware and/or software modules for performing the respective functions. The present application can be realized in hardware, or in a combination of hardware and computer software, in conjunction with the description of the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It should be noted that, for the contents of information interaction, execution process, and the like between modules/units of the display processing apparatus, specific functions and technical effects brought by the method embodiments based on the same concept can be specifically referred to a part of the method embodiments, and details are not described here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the electronic device is enabled to implement the steps in the above method embodiments.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application provide a computer program product, which when executed on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, or a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, computer-readable media may not be electrical carrier signals or telecommunication signals, in accordance with legislation and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed electronic device and method may be implemented in other ways. For example, the above-described electronic device embodiments are merely illustrative. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (16)

1. A display processing method, applied to an electronic device, characterized by comprising the following steps:
detecting that the electronic equipment displays a plurality of windows, and acquiring attribute information of the windows;
determining the stacking relation of the windows according to the attribute information of the windows;
determining that the plurality of windows are displayed in a shielding manner, and acquiring a drawing instruction of each window before each window is submitted to rendering;
determining a completely-shielded window and a non-completely-shielded window in the plurality of windows according to the stacking relation of the plurality of windows, and eliminating the drawing instruction of the completely-shielded window;
matching a corresponding drawing instruction for each block area to obtain a processed drawing instruction of each non-completely-shielded window;
displaying according to the processed drawing instruction of each non-completely-shielded window;
the block area is obtained by dividing according to the display area of the electronic equipment.
2. The display processing method according to claim 1, wherein the block area is obtained by dividing according to a display area of the electronic device, and includes:
according to one or more of the resolution of the display screen of the electronic equipment, the size of the display screen, the total number of displayed windows and the computing capacity of the electronic equipment, the display area of the display screen is partitioned into blocks, and a plurality of partitioned areas are obtained.
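A minimal sketch of such a partition, assuming the grid shape has already been chosen from the listed factors and that block areas are plain (x, y, width, height) rectangles; the function name and parameters are assumptions for illustration:

```python
def partition(width, height, cols, rows):
    """Divide a display area of width x height pixels into a cols x rows grid of
    block areas. The number of rows/columns would in practice be derived from the
    resolution, screen size, window count and compute capability; here it is
    passed in directly. The last row/column absorbs any rounding remainder."""
    bw, bh = width // cols, height // rows
    blocks = []
    for r in range(rows):
        for c in range(cols):
            w = width - c * bw if c == cols - 1 else bw
            h = height - r * bh if r == rows - 1 else bh
            blocks.append((c * bw, r * bh, w, h))
    return blocks

blocks = partition(1080, 2340, 8, 16)  # e.g. a 1080 x 2340 screen split into 128 blocks
print(len(blocks))  # → 128
```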
3. The display processing method according to claim 1, wherein the attribute information includes one or more of a position, a size, a transparency, and a Z-axis order of a window.
4. The display processing method according to claim 1, wherein the obtaining of the processed drawing instruction of each non-completely-shielded window by matching a corresponding drawing instruction for each block area comprises:
and sequentially starting the calculation tasks of each block area, wherein each calculation task matches a corresponding drawing instruction for each block area according to the stacking relation of the non-complete shielding windows in each block area to obtain the processed drawing instruction of each non-complete shielding window.
5. The display processing method according to claim 1, wherein the obtaining of the processed drawing instruction of each non-completely-shielded window by matching a corresponding drawing instruction for each block area comprises:
and simultaneously starting a plurality of computing tasks, wherein each computing task matches a corresponding drawing instruction for a block area according to the stacking relation of the non-complete shielding windows in the block area until each block area is matched with the corresponding drawing instruction, so as to obtain the processed drawing instruction of each non-complete shielding window.
6. The display processing method according to claim 4 or 5, wherein each of the calculation tasks matches a corresponding drawing instruction for one of the block regions according to a stacking relationship of the non-complete occlusion windows in the block region, and includes:
determining target windows falling into the same blocking area, and acquiring a target drawing instruction of each target window in the blocking area, wherein the target windows are the non-completely-shielded windows falling into the same blocking area, and the target drawing instruction is a drawing instruction of any one target window in the blocking area;
and determining to reserve, delete or modify the target drawing instruction of each target window in the block area according to the stacking relation of the target windows in the block area.
7. The display processing method according to claim 6, wherein determining to reserve, delete or modify the target rendering instruction of each target window in the block area according to the stacking relationship of the target windows in the block area comprises:
if the area of any target window falling into the block area completely belongs to the shielded area of the target window, deleting the target drawing instruction of the target window in the block area;
if the area of any target window falling into the block area completely belongs to the non-shielded area of the target window, retaining the target drawing instruction of the target window in the block area;
and if part of the area of any target window falling into the block area belongs to the shielded area of the target window and part belongs to the non-shielded area of the target window, modifying the target drawing instruction of the target window in the block area.
8. The display processing method according to claim 6, wherein determining to reserve, delete or modify the target rendering instruction of each target window in the block area according to the stacking relationship of the target windows in the block area comprises:
if any target drawing instruction of any target window completely falls into the block area, determining to reserve, delete or modify the target drawing instruction according to the stacking relation of the target window in the block area;
if any target drawing instruction of any target window falls into a plurality of partitioned areas including the partitioned area, the plurality of the falling partitioned areas are used as first areas, and the target drawing instruction is determined to be reserved, deleted or modified according to the stacking relation of the target window in the first areas.
9. The method according to claim 8, wherein determining to reserve, delete or modify the target rendering instruction according to the stacking relationship of the target window in the block area if any target rendering instruction of any target window completely falls into the block area comprises:
if any target drawing instruction of any target window completely falls into the block area and the target window is positioned at the uppermost layer of the block area, the target drawing instruction is reserved;
if any target drawing instruction of any target window completely falls into the block area, the target window is not positioned on the uppermost layer in the block area, and other target windows positioned on the upper layer of the target window in the block area belong to transparent windows, the target drawing instruction is reserved;
if any target drawing instruction of any target window completely falls into the block area, the target window is not positioned on the uppermost layer in the block area, other target windows positioned on the upper layer of the target window in the block area belong to non-transparent windows, and the target drawing instruction is completely shielded, the target drawing instruction is deleted;
if any target drawing instruction of any target window completely falls into the block area, the target window is not positioned on the uppermost layer in the block area, other target windows positioned on the upper layer of the target window in the block area belong to non-transparent windows, and the target drawing instruction is regularly shielded, the display area of the target drawing instruction is modified;
if any target drawing instruction of any target window completely falls into the block area, the target window is not positioned on the uppermost layer in the block area, other target windows positioned on the upper layer of the target window in the block area belong to non-transparent windows, and the target drawing instruction is irregularly shielded, the minimum modifiable area of the non-shielded area of the target drawing instruction in the block area is found out, and if the minimum modifiable area is found, the display area of the target drawing instruction is modified to be the minimum modifiable area; if the minimum modifiable area is not found, the target rendering instruction is retained.
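The decision branches above can be condensed into a small function. This is an illustrative sketch, not the claimed implementation; the parameter names, the occlusion categories and the string return values are assumptions:

```python
def decide(on_top, uppers_transparent, occlusion, min_area_found=False):
    """Decide what to do with a target drawing instruction that falls completely
    into one block area.
    on_top:             the target window is the uppermost in the block area
    uppers_transparent: every target window above it in the block area is transparent
    occlusion:          'full', 'regular' (rectangular) or 'irregular'
    min_area_found:     for irregular occlusion, whether a minimum modifiable
                        (rectangular) area of the non-occluded part was found"""
    if on_top:
        return "retain"
    if uppers_transparent:
        return "retain"
    if occlusion == "full":
        return "delete"
    if occlusion == "regular":
        return "modify"  # shrink the display area to the visible rectangle
    if occlusion == "irregular":
        # modify to the minimum modifiable area if one exists, otherwise retain
        return "modify" if min_area_found else "retain"
    return "retain"
```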
10. The method according to claim 9, wherein determining to retain, delete or modify the target rendering instruction according to the stacking relationship of the target window in the first area comprises:
if the target window is positioned at the uppermost layer of the first area, the target drawing instruction is reserved;
if the target window is not positioned on the uppermost layer in the first area and other target windows positioned on the upper layer of the target window in the first area belong to transparent windows, the target drawing instruction is reserved;
if the target window is not positioned on the uppermost layer in the first area, other target windows positioned on the upper layer of the target window in the first area belong to non-transparent windows, and the target drawing instruction is completely shielded, deleting the target drawing instruction;
if the target window is not positioned on the uppermost layer in the first area, other target windows positioned on the upper layer of the target window in the first area belong to non-transparent windows, and the target drawing instruction is regularly shielded, modifying the display area of the target drawing instruction;
if the target window is not positioned on the uppermost layer in the first area, other target windows positioned on the upper layer of the target window in the first area belong to non-transparent windows, and the target drawing instruction is irregularly shielded, finding out the minimum modifiable area of the non-shielded area of the target drawing instruction in the first area, and if the minimum modifiable area is found, modifying the display area of the target drawing instruction into the minimum modifiable area; if the minimum modifiable area is not found, the target rendering instruction is retained.
11. The display processing method of claim 10, wherein the minimum modifiable area comprises a minimum modifiable rectangular area.
12. The display processing method according to claim 1 or 2, wherein the completely-occluded window includes a window in which the entire window is completely occluded by other windows previously displayed.
13. The display processing method according to claim 1 or 2, wherein the non-fully-occluded window includes a window in which the entire window is not fully occluded by other windows previously displayed.
14. The display processing method according to claim 1 or 2, wherein after determining the stacking relationship of the plurality of windows according to the attribute information of the plurality of windows, the method further comprises:
and if the plurality of windows are determined not to be displayed in a shielding manner according to the stacking relation of the plurality of windows, displaying according to the drawing instructions of the plurality of windows.
15. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the electronic device to implement the display processing method of any one of claims 1 to 14.
16. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the display processing method according to any one of claims 1 to 14.
CN202010990210.5A 2020-09-04 2020-09-04 Display processing method and electronic equipment Active CN112328130B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010990210.5A CN112328130B (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010924403.0A CN113791706A (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment
CN202010990210.5A CN112328130B (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010924403.0A Division CN113791706A (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN112328130A CN112328130A (en) 2021-02-05
CN112328130B true CN112328130B (en) 2021-10-01

Family

Family ID: 74304556

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010990210.5A Active CN112328130B (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment
CN202010924403.0A Pending CN113791706A (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010924403.0A Pending CN113791706A (en) 2020-09-04 2020-09-04 Display processing method and electronic equipment

Country Status (1)

Country Link
CN (2) CN112328130B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113360150B (en) * 2021-05-25 2024-04-26 广东海启星海洋科技有限公司 Multi-module data linkage display method and device
CN113259755B (en) * 2021-06-15 2021-10-12 北京新唐思创教育科技有限公司 Screen recording method, device, equipment and medium
CN113590251B (en) * 2021-08-05 2024-04-12 四川艺海智能科技有限公司 Single-screen multi-window digital interactive display system and method
CN114356475A (en) * 2021-12-16 2022-04-15 北京飞讯数码科技有限公司 Display processing method, device, equipment and storage medium
CN114296848B (en) * 2021-12-23 2024-03-19 深圳市宝视达光电有限公司 Conference all-in-one machine, screen segmentation display method and storage device
CN115550708B (en) * 2022-01-07 2023-12-19 荣耀终端有限公司 Data processing method and electronic equipment
CN114647476A (en) * 2022-03-31 2022-06-21 北京百度网讯科技有限公司 Page rendering method, device, equipment, storage medium and program
CN116700655B (en) * 2022-09-20 2024-04-02 荣耀终端有限公司 Interface display method and electronic equipment
CN117785343A (en) * 2022-09-22 2024-03-29 华为终端有限公司 Interface generation method and electronic equipment
CN115686727B (en) * 2023-01-04 2023-04-14 麒麟软件有限公司 Method for realizing synthesis rendering based on wlroots

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915349A (en) * 2012-09-27 2013-02-06 北京奇虎科技有限公司 Method for displaying webpage in browser and webpage component displayed in browser
CN105472436A (en) * 2015-11-24 2016-04-06 努比亚技术有限公司 Information shielding device and method thereof
CN110362304A (en) * 2018-03-26 2019-10-22 北京京东尚科信息技术有限公司 The method and apparatus of web displaying

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156999B (en) * 2010-02-11 2015-06-10 腾讯科技(深圳)有限公司 Method and device for generating a user interface
CN105025335B (en) * 2015-08-04 2017-11-10 合肥中科云巢科技有限公司 Method for synchronized audio and video rendering in a cloud desktop environment
CN106681583A (en) * 2016-12-02 2017-05-17 广东威创视讯科技股份有限公司 Method and system for processing displayed content in overlapping windows
CN110209444B (en) * 2019-03-20 2021-07-09 华为技术有限公司 Graph rendering method and electronic equipment


Also Published As

Publication number Publication date
CN113791706A (en) 2021-12-14
CN112328130A (en) 2021-02-05

Similar Documents

Publication Publication Date Title
CN112328130B (en) Display processing method and electronic equipment
US20220291816A1 (en) Interface display method and device
CN112130742B (en) Full screen display method and device of mobile terminal
CN110506416B (en) Method for a terminal to switch cameras, and terminal
US11669242B2 (en) Screenshot method and electronic device
CN112532869B (en) Image display method in shooting scene and electronic equipment
WO2021000881A1 (en) Screen splitting method and electronic device
CN109559270B (en) Image processing method and electronic equipment
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN112445448B (en) Flexible screen display method and electronic equipment
CN112262563B (en) Image processing method and electronic device
CN111190681A (en) Display interface adaptation method, display interface adaptation design method and electronic equipment
CN111669462B (en) Method and related device for displaying image
WO2022007862A1 (en) Image processing method, system, electronic device and computer readable storage medium
CN114089932B (en) Multi-screen display method, device, terminal equipment and storage medium
CN114115619A (en) Application program interface display method and electronic equipment
CN113986070B (en) Quick viewing method for application card and electronic equipment
CN113170037A (en) Method for shooting long exposure image and electronic equipment
CN110286975B (en) Display method of foreground elements and electronic equipment
CN110138999B (en) Certificate scanning method and device for mobile terminal
CN113986162B (en) Layer composition method, device and computer readable storage medium
CN115115679A (en) Image registration method and related equipment
CN112449101A (en) Shooting method and electronic equipment
CN114004732A (en) Image editing prompting method and device, electronic equipment and readable storage medium
CN114756184A (en) Collaborative display method, terminal device and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210425

Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Applicant after: Honor Device Co.,Ltd.

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant before: HUAWEI TECHNOLOGIES Co.,Ltd.

GR01 Patent grant