CN114356475B - Display processing method, device, equipment and storage medium

Display processing method, device, equipment and storage medium

Info

Publication number
CN114356475B
Authority
CN
China
Prior art keywords
area
window
display
target
temporary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111547335.1A
Other languages
Chinese (zh)
Other versions
CN114356475A (en)
Inventor
邵华明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Feixun Digital Technology Co ltd
Original Assignee
Beijing Feixun Digital Technology Co ltd
Filing date
Publication date
Application filed by Beijing Feixun Digital Technology Co ltd
Priority to CN202111547335.1A
Publication of CN114356475A
Application granted
Publication of CN114356475B
Legal status: Active
Anticipated expiration


Abstract

The embodiment of the invention discloses a display processing method, device, equipment and storage medium. The method comprises the following steps: acquiring a window overlay display request, and determining the display area of each window; determining a target area to be cut out by rule-based merging of the display areas of the windows; and cutting out the target area from the rendered frame corresponding to the window overlay display request. The technical scheme of the embodiment of the invention can reduce the number of times the rendered frame is cut and improve the performance of the terminal device when handling multi-window overlay display.

Description

Display processing method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of terminal devices, and in particular to a display processing method, device, equipment and storage medium.
Background
At present, when video is displayed, encoding/decoding and rendering of the video are performed on the same personal computer (PC), but the CPU in the PC cannot encode and decode multiple video streams at the same time. To solve this problem, an external audio/video terminal can be used: encoding and decoding of the multiple video streams is implemented by hardware decoding, and the rendered video frame is superimposed on the PC frame and shown on the display.
In this scheme, since the rendered video frame is always superimposed on the PC frame, when the PC frame needs to display a sub-window on top of the rendered frame, the rendered frame must be "hollowed out" so that the sub-window can be seen on the display, as shown in FIG. 1. However, hollowing out the rendered frame degrades the performance of the audio/video terminal, and the more hollowed-out regions there are, the greater the impact on its performance.
Disclosure of Invention
The embodiment of the invention provides a display processing method, device, equipment and storage medium, which reduce the number of times a rendered frame is cut and improve the performance of the terminal device when handling multi-window overlay display.
In a first aspect, an embodiment of the present invention provides a display processing method, including:
acquiring a window overlay display request, and determining the display area of each window;
determining a target area to be cut out by rule-based merging of the display areas of the windows;
and cutting out the target area from the rendered frame corresponding to the window overlay display request.
Optionally, acquiring the window overlay display request and determining the display area of each window includes:
acquiring the window overlay display request, and determining, according to the window overlay display request, the rendered frame on which the window is to be overlaid and the position coordinates of the window within the rendered frame;
and determining the display area of each window in the rendered frame according to the position coordinates.
Optionally, determining the target area to be cut out by rule-based merging of the display areas of the windows includes:
selecting a target window to be processed currently, calculating the merging result between the display area of the target window and a temporary area as a union area, and updating the temporary area according to the shape of the union area, wherein the temporary area comprises at least one candidate target area corresponding to the windows already processed;
and if an unprocessed window remains, returning to the operation of selecting the target window to be processed currently and calculating the merging result between the display area of the target window and the temporary area as a union area; otherwise, taking the candidate target areas in the temporary area as the target areas to be cut out.
Optionally, calculating the merging result between the display area of the target window and the temporary area as a union area, and updating the temporary area according to the shape of the union area, includes:
calculating the merging result between the display area of the target window and the temporary area as a complete union area, and judging whether the complete union area is rectangular;
if so, taking the complete union area as the only candidate target area in the temporary area;
otherwise, calculating the merging result between the display area of the target window and each candidate target area in the temporary area as a partial union area, and judging whether a rectangular partial union area exists;
if so, taking the rectangular partial union area as a new candidate target area, and replacing the candidate target area corresponding to that partial union area in the temporary area;
otherwise, adding the display area of the target window to the temporary area as a new candidate target area.
Optionally, the method further comprises:
in response to a window closing operation, determining the target area corresponding to the window to be closed and the associated window included in that target area;
and acquiring the upper/lower-layer display relationship between the window to be closed and the associated window, and, if the window to be closed lies below the associated window and the display area of the window to be closed contains the display area of the associated window, cutting out the target area corresponding to the display area of the associated window from the rendered frame again after the window to be closed is closed.
Optionally, after determining, in response to the window closing operation, the target area corresponding to the window to be closed and the associated window, the method further includes:
if the display area of the window to be closed and the display area of the associated window partially overlap, closing the part of the target area outside the display area of the associated window.
Optionally, after the window overlay display request is acquired, the method further includes:
determining the upper/lower-layer display relationship among the windows according to the order of the request times in the window overlay display requests;
wherein a window with an earlier request time is displayed below a window with a later request time.
In a second aspect, an embodiment of the present invention further provides a display processing apparatus, including:
an acquisition module, configured to acquire a window overlay display request and determine the display area of each window;
a determining module, configured to determine a target area to be cut out by rule-based merging of the display areas of the windows;
and a clipping module, configured to cut out the target area from the rendered frame corresponding to the window overlay display request.
In a third aspect, an embodiment of the present invention further provides a terminal device, where the device includes:
One or more processors;
Storage means for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the display processing method provided by any embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the display processing method provided by any embodiment of the present invention.
According to the technical scheme of the embodiment of the invention, the terminal device acquires a window overlay display request and determines the display area of each window; determines the target area to be cut out by rule-based merging of the display areas of the windows; and cuts out the target area from the rendered frame corresponding to the window overlay display request. By merging the display areas of the windows, the number of times the rendered frame has to be cut when handling multi-window overlay display is reduced and the performance of the terminal device is improved, which solves the prior-art problem that punching too many holes when handling multiple windows degrades terminal performance.
Drawings
FIG. 1 is a schematic diagram of an overlay window displayed on a rendered frame in the prior art;
FIG. 2a is a flow chart of a display processing method according to a first embodiment of the invention;
FIG. 2b is a schematic diagram of a window position relationship in the first embodiment of the present invention;
FIG. 2c is a schematic diagram of another window position relationship in the first embodiment of the present invention;
FIG. 2d is a schematic diagram of a further window position relationship in the first embodiment of the present invention;
FIG. 2e is a schematic diagram of a further window position relationship in the first embodiment of the present invention;
FIG. 2f is a schematic diagram of a further window position relationship in the first embodiment of the present invention;
FIG. 3 is a schematic diagram of a display processing device according to a second embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example 1
FIG. 2a is a flowchart of a display processing method according to a first embodiment of the present invention. This embodiment is applicable to the case where a plurality of overlay windows are displayed on a rendered frame. The method may be performed by a display processing apparatus, which may be implemented in hardware and/or software and is typically integrated in a terminal device that provides a video display service, for example an audio/video terminal attached to a PC and used together with the PC. As shown in FIG. 2a, the method comprises:
Step 110, acquiring a window overlay display request, and determining the display area of each window.
In this embodiment, when the PC side wants to display a window on top of the rendered frame, it may generate a window overlay display request for that window and send it to the external audio/video terminal. After receiving the window overlay display request, the audio/video terminal can determine the specific display position of the window on the rendered frame according to the window display information carried in the request.
Optionally, acquiring the window overlay display request and determining the display area of each window may include: acquiring the window overlay display request, and determining, according to the window overlay display request, the rendered frame on which the window is to be overlaid and the position coordinates of the window within the rendered frame; and determining the display area of each window in the rendered frame according to the position coordinates.
In this embodiment, the identifier of the window to be overlaid and the coordinate values corresponding to the display position of the window in the rendered frame may be extracted from the window overlay display request. A coordinate system may be established in advance for each rendered frame, and the position coordinates of a window may be the coordinate values of the vertices of the window's display area in that coordinate system. For example, in FIG. 2b, the position coordinates of window A may include the coordinate values of its four vertices, from which the specific display area of the window on the rendered frame can be determined.
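As an illustration only (the request layout and the coordinate values below are assumptions, not taken from the patent), the display area of a window can be reduced to an axis-aligned rectangle from the vertex coordinates carried in the request:

# Illustrative sketch: derive a window's display area, as an axis-aligned
# rectangle (left, top, right, bottom), from the vertex coordinates carried
# in a window overlay display request.
def display_area_from_vertices(vertices):
    """vertices: iterable of (x, y) corner coordinates in the rendered frame."""
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return (min(xs), min(ys), max(xs), max(ys))

# Example with invented coordinates for the four corners of a window:
window_a = display_area_from_vertices([(100, 100), (500, 100), (100, 400), (500, 400)])
# window_a == (100, 100, 500, 400)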
Step 120, determining the target area to be cut out by rule-based merging of the display areas of the windows.
In this embodiment, to reduce the number of cuts the rendered frame needs in order to display the plurality of overlaid windows, the display area of each window is not cut out of the rendered frame individually; instead, the number of target areas that must be cut out of the rendered frame is reduced by rule-based merging of the display areas of the plurality of windows.
Optionally, determining the target area to be cut out by rule-based merging of the display areas of the windows may include: selecting a target window to be processed currently, calculating the merging result between the display area of the target window and a temporary area as a union area, and updating the temporary area according to the shape of the union area, wherein the temporary area comprises at least one candidate target area corresponding to the windows already processed; and if an unprocessed window remains, returning to the operation of selecting the target window to be processed currently and calculating the merging result between the display area of the target window and the temporary area as a union area; otherwise, taking the candidate target areas in the temporary area as the target areas to be cut out.
In this embodiment, when multiple windows are to be displayed overlaid on the rendered frame, two windows may first be selected and the target areas needed to display both of them completely are determined, which speeds up the computation and reduces its amount. The target areas determined so far are then taken as the temporary area, the next window is selected, the target areas needed to display the temporary area together with that window are determined, and so on, until the target areas needed to display all the windows completely have been determined.
In this embodiment, the target areas needed to completely display the windows already processed are taken as the temporary area, and one of the unprocessed windows is selected as the target window. The merging result between the display area of the target window and the temporary area is calculated as a union area. Since only rectangular areas can be cut out of the rendered frame, whether the target window can be merged with the temporary area can be decided according to whether the union area is rectangular, and the target areas needed to completely display the target window together with the temporary area can then be determined.
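Since only rectangular regions can be cut out of the rendered frame, the core primitive of this merging rule is a test of whether the union of several axis-aligned rectangles is itself one rectangle. The following Python sketch is illustrative only and is not taken from the patent; rectangles are written as (left, top, right, bottom) tuples and the coordinate-compression check is my own choice of implementation.

# Sketch: test whether the union of axis-aligned rectangles (left, top, right,
# bottom) is itself a single rectangle. The union is a rectangle exactly when
# it fills its own bounding box; coordinate compression keeps the check exact.
def union_is_rectangle(rects):
    xs = sorted({v for r in rects for v in (r[0], r[2])})
    ys = sorted({v for r in rects for v in (r[1], r[3])})
    def cell_covered(x, y):
        return any(r[0] <= x < r[2] and r[1] <= y < r[3] for r in rects)
    # Every cell of the compressed grid lies inside the bounding box, so the
    # union equals the bounding box iff every cell is covered by some rectangle.
    return all(cell_covered(xs[i], ys[j])
               for i in range(len(xs) - 1)
               for j in range(len(ys) - 1))

def bounding_box(rects):
    return (min(r[0] for r in rects), min(r[1] for r in rects),
            max(r[2] for r in rects), max(r[3] for r in rects))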
Optionally, calculating the merging result between the display area of the target window and the temporary area as a union area, and updating the temporary area according to the shape of the union area, may include: calculating the merging result between the display area of the target window and the temporary area as a complete union area, and judging whether the complete union area is rectangular; if so, taking the complete union area as the only candidate target area in the temporary area; otherwise, calculating the merging result between the display area of the target window and each candidate target area in the temporary area as a partial union area, and judging whether a rectangular partial union area exists; if so, taking the rectangular partial union area as a new candidate target area and replacing the candidate target area corresponding to that partial union area in the temporary area; otherwise, adding the display area of the target window to the temporary area as a new candidate target area.
In this embodiment, when judging whether the target window can be merged with the temporary area, the union of the display area of the target window and the entire temporary area may first be taken as the complete union area, and it is judged whether the complete union area is rectangular. If so, cutting out this single rectangular complete union area from the rendered frame is enough to display the target window overlaid together with the windows already processed, so the complete union area can be taken as the only candidate target area in the temporary area. Here, a union is understood as the total area covered by the shapes that participate in the merge.
For example, as shown in FIG. 2b, if A denotes the temporary area and B denotes the display area of the target window, the complete union area of A and B is a rectangle, so at this point the temporary area still contains only the single candidate target area A. As further shown in FIG. 2c, if A denotes the temporary area and B denotes the display area of the target window, the complete union area of A and B is likewise a rectangle, so the candidate target area in the temporary area is updated from A to the complete union area of A and B.
In this embodiment, if the complete union of the display area of the target window and the entire temporary area is not rectangular, then, considering that the temporary area may contain more than one candidate target area, the union between the display area of the target window and each individual candidate target area in the temporary area may further be taken as a partial union area, and it is judged whether any partial union area is rectangular. If the partial union between the display area of the target window and one of the candidate target areas is rectangular, that partial union area may be added to the temporary area as a new candidate target area, and the candidate target area it contains may be deleted from the temporary area. If the partial unions with two or more candidate target areas are all rectangular, the change in the number of candidate target areas produced by each possible merge can be evaluated, so that the merge which reduces the number of candidate target areas is chosen.
In this embodiment, if neither the complete union area nor any partial union area between the display area of the target window and the temporary area is rectangular, the target window cannot be merged with the temporary area for the time being, and its display area must be added to the temporary area separately as a new candidate target area. For example, as shown in FIG. 2d, if A denotes one of the candidate target areas in the temporary area and B denotes the display area of the target window, B cannot be merged with A. Likewise, as shown in FIG. 2e, if A denotes the entire temporary area and B denotes the display area of the target window, B cannot be merged with A, and a separate rectangular area must be cut out for B.
In this embodiment, taking three windows to be displayed as an example, the candidate target areas corresponding to two of the windows can be treated as a whole and their relationship with the third window judged. If there is a containment relationship, cutting out at most two rectangular areas is enough to ensure that the windows are displayed; if the regions overlap and can just form one rectangle, at most two rectangular areas likewise need to be cut out. If the windows are related as shown in FIG. 2f, only one rectangular area needs to be cut out to ensure that all the windows are displayed.
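Building on the rectangle test above, the update rule of this embodiment can be sketched as follows. This is an illustrative assumption rather than the patent's actual implementation; it reuses union_is_rectangle and bounding_box from the previous sketch and, where several candidate areas could be merged, simply takes the first rather than evaluating which merge reduces the candidate count most.

def update_temporary_area(temporary, window_rect):
    """temporary: list of candidate target rectangles for the windows processed so far."""
    # Case 1: the window together with the entire temporary area forms one rectangle
    # (the "complete union area"), which becomes the only candidate target area.
    if union_is_rectangle(temporary + [window_rect]):
        return [bounding_box(temporary + [window_rect])]
    # Case 2: the window forms a rectangle with one existing candidate target area
    # (a rectangular "partial union area"), which replaces that candidate.
    for i, candidate in enumerate(temporary):
        if union_is_rectangle([candidate, window_rect]):
            merged = bounding_box([candidate, window_rect])
            return temporary[:i] + [merged] + temporary[i + 1:]
    # Case 3: the window cannot be merged yet; a separate rectangle is cut for it.
    return temporary + [window_rect]

def target_areas(window_rects):
    temporary = []
    for rect in window_rects:          # one pass over the windows to be overlaid
        temporary = update_temporary_area(temporary, rect)
    return temporary                   # rectangles to cut out of the rendered frame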
Step 130, cutting out the target area from the rendered frame corresponding to the window overlay display request.
In this embodiment, after the target areas are determined, the vertex coordinates of each target area may be determined from the position coordinates of the windows it contains, so that the specific position of each target area in the rendered frame is known, and each target area is then cut out of the rendered frame so that all the windows are displayed.
Optionally, the method may further include: in response to a window closing operation, determining the target area corresponding to the window to be closed and the associated window included in that target area; and acquiring the upper/lower-layer display relationship between the window to be closed and the associated window, and, if the window to be closed lies below the associated window and the display area of the window to be closed contains the display area of the associated window, cutting out the target area corresponding to the display area of the associated window from the rendered frame again after the window to be closed is closed.
In this embodiment, when a window is hidden or closed, the window to be closed may be displayed in one target area together with other windows, so closing it may affect the normal display of the other windows in the same target area, and the affected windows must have their display restored. In response to the window closing operation, the window to be closed and the target area in the rendered frame in which it is displayed are determined, and the associated window included in that target area is determined from the stored information about the target area. For example, as shown in FIG. 2b, for the window A to be closed, the corresponding target area is A, and the associated window sharing that target area with A is B.
In this embodiment, for windows within the same target area, if the upper-layer window is closed, the lower-layer window can still be displayed normally and the closed region does not need to be restored; if the lower-layer window is closed, the normal display of the upper-layer window may be affected. Therefore, to judge whether the associated window can still be displayed normally after the window to be closed is closed, the upper/lower-layer display relationship between the two is obtained first. If the window to be closed lies below the associated window and its display area contains the display area of the associated window, that is, the associated window has been displayed through the rectangular area cut out for the window to be closed, then after the window to be closed is closed the associated window lacks a cut-out rectangular area and cannot be displayed normally. In that case the rectangular area corresponding to the display area of the associated window must be cut out of the rendered frame again so that the associated window is displayed normally.
For example, as shown in FIG. 2b, window A is displayed below window B, the display area of window A contains the display area of window B, and the rectangular area cut out of the rendered frame is the display area of window A. When window B is closed, since it is the upper-layer window, the lower-layer window A can still be displayed normally afterwards and the terminal does not need to restore the region of window B, which reduces the processing load of the audio/video terminal and preserves its performance. When window A is closed, since window A is the lower-layer window and its display area contains the display area of window B, that is, no rectangular area corresponding to the display area of window B has been cut out of the rendered frame separately, window B cannot be displayed normally after window A is closed, and the rectangular area corresponding to the display area of window B must be cut out of the rendered frame again so that window B is displayed normally.
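A hedged sketch of this close-time decision (the window record layout, the layer convention in which a larger index means an upper layer, and the function names are assumptions for illustration, not the patent's data structures):

def contains(outer, inner):
    return (outer[0] <= inner[0] and outer[1] <= inner[1] and
            outer[2] >= inner[2] and outer[3] >= inner[3])

def regions_to_recut(closing, associated):
    """closing/associated: dicts like {"rect": (l, t, r, b), "layer": n}; larger layer = upper."""
    # Upper-layer window closed: the lower window keeps its cut-out rectangle.
    if closing["layer"] > associated["layer"]:
        return []
    # Lower-layer window closed and its area contained the associated window's
    # area: the associated window loses its cut-out, so its display area must
    # be cut out of the rendered frame again.
    if contains(closing["rect"], associated["rect"]):
        return [associated["rect"]]
    return []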
Optionally, after the window overlay display request is acquired, the method may further include: determining the upper/lower-layer display relationship among the windows according to the order of the request times in the window overlay display requests, wherein a window with an earlier request time is displayed below a window with a later request time.
In this embodiment, the upper/lower-layer display relationship among the windows is determined according to the times at which the windows requested display. After receiving a window overlay display request, the terminal can extract the request time of the window from it, and determine and store the layer relationship of the windows according to the rule that a window with an earlier request time is placed on the lower layer and a window with a later request time on the upper layer.
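A minimal illustration of this ordering rule (the request structure and field names are assumed, not taken from the patent):

def stacking_order(requests):
    """requests: list of dicts like {"window_id": ..., "request_time": ...}."""
    # Earlier request time -> lower layer; the last element is the top-most window.
    ordered = sorted(requests, key=lambda r: r["request_time"])
    return [r["window_id"] for r in ordered]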
Optionally, after determining, in response to the window closing operation, the target area corresponding to the window to be closed and the associated window, the method may further include: if the display area of the window to be closed and the display area of the associated window partially overlap, closing the part of the target area outside the display area of the associated window.
In this embodiment, after the target area corresponding to the window to be closed and the associated window have been determined, if the window to be closed and the associated window sharing the target area are found to partially overlap, for example as shown in FIG. 2c, then when window A is requested to be closed, the size of the rectangular area needed to display the associated window B normally can be determined from the vertex coordinates of window B, and the part of the target area outside window B's display area is taken as the region to be closed. Window B is thus still displayed normally, the rendered frame does not need to be cut again, and the performance impact on the terminal device is reduced.
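One way to realize this partial-overlap case is ordinary rectangle subtraction: restore the part of the target area outside the associated window's display area. Splitting the leftover into at most four rectangles is my own choice for illustration; the patent only states that the region outside the associated window is closed.

def subtract_rect(target, keep):
    """Return rectangles covering target minus keep; both are (l, t, r, b) tuples."""
    l, t, r, b = target
    kl, kt = max(keep[0], l), max(keep[1], t)
    kr, kb = min(keep[2], r), min(keep[3], b)
    if kl >= kr or kt >= kb:            # no overlap: the whole target area can be restored
        return [target]
    pieces = []
    if t < kt: pieces.append((l, t, r, kt))    # strip above the kept area
    if kb < b: pieces.append((l, kb, r, b))    # strip below
    if l < kl: pieces.append((l, kt, kl, kb))  # strip to the left
    if kr < r: pieces.append((kr, kt, r, kb))  # strip to the right
    return pieces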
According to the technical scheme of the embodiment of the invention, the terminal device acquires a window overlay display request and determines the display area of each window; determines the target area to be cut out by rule-based merging of the display areas of the windows; and cuts out the target area from the rendered frame corresponding to the window overlay display request. By merging the display areas of the windows, the number of times the rendered frame has to be cut when handling multi-window overlay display is reduced and the performance of the terminal device is improved, which solves the prior-art problem that punching too many holes when handling multiple windows degrades terminal performance.
Example 2
FIG. 3 is a schematic structural diagram of a display processing apparatus according to a second embodiment of the present invention. This embodiment is applicable to the case where a plurality of overlay windows are displayed on a rendered frame. The apparatus may be implemented in hardware and/or software and is typically integrated in a terminal device that provides a video display service, for example an audio/video terminal attached to a PC and used together with the PC. As shown in FIG. 3, the apparatus includes:
an acquisition module 210, configured to acquire a window overlay display request and determine the display area of each window;
a determining module 220, configured to determine the target area to be cut out by rule-based merging of the display areas of the windows;
and a clipping module 230, configured to cut out the target area from the rendered frame corresponding to the window overlay display request.
According to the technical scheme of the embodiment of the invention, the terminal device acquires a window overlay display request and determines the display area of each window; determines the target area to be cut out by rule-based merging of the display areas of the windows; and cuts out the target area from the rendered frame corresponding to the window overlay display request. By merging the display areas of the windows, the number of times the rendered frame has to be cut when handling multi-window overlay display is reduced and the performance of the terminal device is improved, which solves the prior-art problem that punching too many holes when handling multiple windows degrades terminal performance.
Optionally, the acquisition module 210 is configured to:
acquire the window overlay display request, and determine, according to the window overlay display request, the rendered frame on which the window is to be overlaid and the position coordinates of the window within the rendered frame;
and determine the display area of each window in the rendered frame according to the position coordinates.
Optionally, the determining module 220 includes:
an updating unit, configured to select a target window to be processed currently, calculate the merging result between the display area of the target window and a temporary area as a union area, and update the temporary area according to the shape of the union area, wherein the temporary area comprises at least one candidate target area corresponding to the windows already processed;
and a judging unit, configured to, if an unprocessed window remains, return to the operation of selecting the target window to be processed currently and calculating the merging result between the display area of the target window and the temporary area as a union area, and otherwise take the candidate target areas in the temporary area as the target areas to be cut out.
Optionally, the updating unit is configured to:
calculate the merging result between the display area of the target window and the temporary area as a complete union area, and judge whether the complete union area is rectangular;
if so, take the complete union area as the only candidate target area in the temporary area;
otherwise, calculate the merging result between the display area of the target window and each candidate target area in the temporary area as a partial union area, and judge whether a rectangular partial union area exists;
if so, take the rectangular partial union area as a new candidate target area, and replace the candidate target area corresponding to that partial union area in the temporary area;
otherwise, add the display area of the target window to the temporary area as a new candidate target area.
Optionally, the apparatus further comprises:
a window closing module, configured to, in response to a window closing operation, determine the target area corresponding to the window to be closed and the associated window included in that target area;
and to acquire the upper/lower-layer display relationship between the window to be closed and the associated window, and, if the window to be closed lies below the associated window and the display area of the window to be closed contains the display area of the associated window, cut out the target area corresponding to the display area of the associated window from the rendered frame again after the window to be closed is closed.
Optionally, the window closing module is further configured to:
after determining, in response to the window closing operation, the target area corresponding to the window to be closed and the associated window, close the part of the target area outside the display area of the associated window if the display area of the window to be closed and the display area of the associated window partially overlap.
Optionally, the apparatus further comprises:
a hierarchical relationship determining module, configured to determine, after the window overlay display request is acquired, the upper/lower-layer display relationship among the windows according to the order of the request times in the window overlay display requests;
wherein a window with an earlier request time is displayed below a window with a later request time.
The display processing apparatus provided by the embodiment of the present invention can execute the display processing method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
Example 3
FIG. 4 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention. FIG. 4 shows a block diagram of an exemplary device 12 suitable for implementing embodiments of the present invention. The device 12 shown in FIG. 4 is merely an example and should not impose any limitation on the functionality or scope of use of embodiments of the present invention.
As shown in FIG. 4, device 12 takes the form of a general-purpose computing device. Components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that connects the various system components, including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard disk drive"). Although not shown in fig. 4, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
Device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with device 12, and/or any devices (e.g., network card, modem, etc.) that enable device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, device 12 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, via network adapter 20. As shown, network adapter 20 communicates with other modules of device 12 over bus 18. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with device 12, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example, implementing the display processing method provided by the embodiment of the present invention.
That is, a display processing method is implemented, comprising:
acquiring a window overlay display request, and determining the display area of each window;
determining a target area to be cut out by rule-based merging of the display areas of the windows;
and cutting out the target area from the rendered frame corresponding to the window overlay display request.
Example 4
The fourth embodiment of the present invention also discloses a computer storage medium having stored thereon a computer program which, when executed by a processor, implements a display processing method comprising:
acquiring a window overlay display request, and determining the display area of each window;
determining a target area to be cut out by rule-based merging of the display areas of the windows;
and cutting out the target area from the rendered frame corresponding to the window overlay display request.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (8)

1. A display processing method, characterized by comprising:
acquiring a window overlay display request, and determining the display area of each window;
determining a target area to be cut out by rule-based merging of the display areas of the windows;
and cutting out the target area from a rendered frame corresponding to the window overlay display request;
wherein determining the target area to be cut out by rule-based merging of the display areas of the windows comprises:
selecting a target window to be processed currently, calculating a merging result between the display area of the target window and a temporary area as a union area, and updating the temporary area according to the shape of the union area, wherein the temporary area comprises at least one candidate target area corresponding to the windows already processed;
if an unprocessed window exists, returning to the operation of selecting the target window to be processed currently and calculating the merging result between the display area of the target window and the temporary area as a union area; otherwise, taking the candidate target area in the temporary area as the target area to be cut out;
and wherein calculating the merging result between the display area of the target window and the temporary area as a union area and updating the temporary area according to the shape of the union area comprises:
calculating the merging result between the display area of the target window and the temporary area as a complete union area, and judging whether the complete union area is rectangular;
if so, taking the complete union area as the only candidate target area in the temporary area;
otherwise, calculating the merging result between the display area of the target window and each candidate target area in the temporary area as a partial union area, and judging whether a rectangular partial union area exists;
if so, taking the rectangular partial union area as a new candidate target area, and replacing the candidate target area corresponding to that partial union area in the temporary area;
otherwise, adding the display area of the target window to the temporary area as a new candidate target area.
2. The method of claim 1, wherein acquiring the window overlay display request and determining the display area of each window comprises:
acquiring the window overlay display request, and determining, according to the window overlay display request, the rendered frame on which the window is to be overlaid and the position coordinates of the window within the rendered frame;
and determining the display area of each window in the rendered frame according to the position coordinates.
3. The method of claim 1, further comprising:
in response to a window closing operation, determining a target area corresponding to a window to be closed and an associated window included in the target area;
and acquiring the upper/lower-layer display relationship between the window to be closed and the associated window, and, if the window to be closed lies below the associated window and the display area of the window to be closed contains the display area of the associated window, cutting out the target area corresponding to the display area of the associated window from the rendered frame again after the window to be closed is closed.
4. The method of claim 3, further comprising, after determining the target area corresponding to the window to be closed and the associated window in response to the window closing operation:
if the display area of the window to be closed and the display area of the associated window partially overlap, closing the part of the target area outside the display area of the associated window.
5. The method of claim 3, further comprising, after acquiring the window overlay display request:
determining the upper/lower-layer display relationship among the windows according to the order of the request times in the window overlay display requests;
wherein a window with an earlier request time is displayed below a window with a later request time.
6. A display processing apparatus, comprising:
an acquisition module, configured to acquire a window overlay display request and determine the display area of each window;
a determining module, configured to determine a target area to be cut out by rule-based merging of the display areas of the windows;
and a clipping module, configured to cut out the target area from a rendered frame corresponding to the window overlay display request;
wherein the determining module comprises:
an updating unit, configured to select a target window to be processed currently, calculate a merging result between the display area of the target window and a temporary area as a union area, and update the temporary area according to the shape of the union area, wherein the temporary area comprises at least one candidate target area corresponding to the windows already processed;
and a judging unit, configured to, if an unprocessed window exists, return to the operation of selecting the target window to be processed currently and calculating the merging result between the display area of the target window and the temporary area as a union area, and otherwise take the candidate target area in the temporary area as the target area to be cut out;
and wherein the updating unit is configured to:
calculate the merging result between the display area of the target window and the temporary area as a complete union area, and judge whether the complete union area is rectangular;
if so, take the complete union area as the only candidate target area in the temporary area;
otherwise, calculate the merging result between the display area of the target window and each candidate target area in the temporary area as a partial union area, and judge whether a rectangular partial union area exists;
if so, take the rectangular partial union area as a new candidate target area, and replace the candidate target area corresponding to that partial union area in the temporary area;
otherwise, add the display area of the target window to the temporary area as a new candidate target area.
7. A terminal device, comprising:
one or more processors; and
a storage means for storing one or more programs which,
when executed by the one or more processors, cause the one or more processors to implement the display processing method according to any one of claims 1 to 5.
8. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the display processing method according to any one of claims 1 to 5.
CN202111547335.1A 2021-12-16 Display processing method, device, equipment and storage medium Active CN114356475B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111547335.1A CN114356475B (en) 2021-12-16 Display processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114356475A (en) 2022-04-15
CN114356475B (en) 2024-06-04


Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01250130A (en) * 1988-03-30 1989-10-05 Toshiba Corp Multi-window display device
JPH0784742A (en) * 1993-09-09 1995-03-31 Toshiba Corp Window displaying method
JPH07140953A (en) * 1993-06-17 1995-06-02 Mitsubishi Electric Corp Image display device
JPH0778711B2 (en) * 1986-01-29 1995-08-23 株式会社日立製作所 Multi-window display control method
JPH1011263A (en) * 1996-06-20 1998-01-16 Sharp Corp Multiwindow system
JP2002116903A (en) * 2000-10-06 2002-04-19 Hitachi Ltd Plural screen display method
CN1645320A (en) * 2005-01-31 2005-07-27 浙江大学 Method for determining window shearing relation in grahpic user interface
CN102065336A (en) * 2010-07-21 2011-05-18 深圳创维数字技术股份有限公司 Digital television receiver and method for determining multistage window shearing relation of digital television receiver
CN103426419A (en) * 2012-05-23 2013-12-04 腾讯科技(深圳)有限公司 Method and device for refreshing areas
CN105005430A (en) * 2015-07-17 2015-10-28 深圳市金立通信设备有限公司 Window display method and terminal
CN106681583A (en) * 2016-12-02 2017-05-17 广东威创视讯科技股份有限公司 Method and system for processing displayed content in overlapping windows
CN107124651A (en) * 2017-04-12 2017-09-01 青岛海信电器股份有限公司 Window display method and device
CN108205456A (en) * 2017-12-28 2018-06-26 北京奇虎科技有限公司 Window rendering intent, equipment and the storage medium of a kind of striding course
CN111147770A (en) * 2019-12-18 2020-05-12 广州市保伦电子有限公司 Multi-channel video window overlapping display method, electronic equipment and storage medium
CN112306588A (en) * 2019-07-26 2021-02-02 浙江宇视科技有限公司 Window laminating method, device, equipment and medium
CN112328130A (en) * 2020-09-04 2021-02-05 华为技术有限公司 Display processing method and electronic equipment
CN113360045A (en) * 2021-04-23 2021-09-07 数坤(北京)网络科技股份有限公司 Medical image processing and displaying method, processing device, display device and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant