CN117666988A - Window rendering method, device, equipment and storage medium - Google Patents


Publication number
CN117666988A
CN117666988A (application CN202211041788.1A)
Authority
CN
China
Prior art keywords
window
rendered
sub
position parameter
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211041788.1A
Other languages
Chinese (zh)
Inventor
李娜芬
李斌
罗程
梁百怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202211041788.1A priority Critical patent/CN117666988A/en
Publication of CN117666988A publication Critical patent/CN117666988A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a window rendering method, apparatus, device, and storage medium for rendering a sub-window and the main-window controls separately and independently without adding extra windows. The method includes the following steps: acquiring a position-parameter set of the main-window controls on a display interface and a first position parameter of a sub-window on the display interface, where the sub-window is a window loaded on the basis of the main window; obtaining, according to the position-parameter set and the first position parameter, a to-be-rendered region of the sub-window in the display interface; determining the to-be-rendered content of the sub-window according to the to-be-rendered region; and rendering the to-be-rendered content in the to-be-rendered region to obtain a picture image of the sub-window on the display interface. The method and apparatus are applied in the computer field.

Description

Window rendering method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer graphics, and in particular to a window rendering method, apparatus, device, and storage medium.
Background
On the Windows operating system, the region occupied by a child window cannot simultaneously display the content of its parent window, and it is hardly possible for parent-window content in the same region to be drawn over the child window. In the conversation window of social software, some message bubbles need to be implemented as child windows; message bubbles scroll along with the conversation content, and other interface elements may need to be displayed on top of some of the bubbles. This creates a problem: the parent window carries other control elements, and when a parent-window control needs to be displayed above a child-window message, the message's child window instead covers the parent-window control, so the product appears abnormal.
A window rendering method that prevents a child window from covering parent-window content is therefore strongly needed.
Disclosure of Invention
Embodiments of the present application provide a window rendering method, apparatus, device, and storage medium for rendering a sub-window and the main-window controls separately and independently without adding extra windows.
In view of this, one aspect of the present application provides a window rendering method, including: acquiring a position-parameter set of the main-window controls on a display interface and a first position parameter of a sub-window on the display interface, where the sub-window is a window loaded on the basis of the main window; obtaining, according to the position-parameter set and the first position parameter, a to-be-rendered region of the sub-window in the display interface; determining the to-be-rendered content of the sub-window according to the to-be-rendered region; and rendering the to-be-rendered content in the to-be-rendered region to obtain a picture image of the sub-window on the display interface.
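Purely as an illustrative sketch (not the patent's implementation), assuming every position parameter is an axis-aligned rectangle `(left, top, right, bottom)` in display-interface coordinates, the steps of the method can be outlined as follows; all names and values are hypothetical:

```python
def overlap(a, b):
    """Intersection of two rectangles (left, top, right, bottom); None if disjoint."""
    left, top = max(a[0], b[0]), max(a[1], b[1])
    right, bottom = min(a[2], b[2]), min(a[3], b[3])
    return (left, top, right, bottom) if left < right and top < bottom else None

def render_child(control_rects, child_rect, draw):
    """Gather the overlaps of the sub-window with every main-window control
    (steps 1-2); the to-be-rendered region is the sub-window rectangle minus
    these holes, which the `draw` callback is expected to skip (steps 3-4)."""
    holes = [o for c in control_rects if (o := overlap(c, child_rect)) is not None]
    draw(child_rect, holes)
    return holes

# Hypothetical main-window-control and sub-window rectangles
holes = render_child([(0, 0, 50, 50), (90, 90, 200, 200)],
                     (40, 40, 120, 120),
                     lambda rect, holes: None)
print(holes)  # [(40, 40, 50, 50), (90, 90, 120, 120)]
```

The detailed embodiments below refine how the holes are computed and how the content for the remaining region is obtained.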
Another aspect of the present application provides a window rendering apparatus, including: an acquisition module configured to acquire a position-parameter set of the main-window controls on a display interface and a first position parameter of a sub-window on the display interface, where the sub-window is a window loaded on the basis of the main window;
a processing module configured to determine, according to the position-parameter set and the first position parameter, the to-be-rendered region of the sub-window in the display interface, and to determine the to-be-rendered content of the sub-window according to the to-be-rendered region;
and a rendering module configured to render the to-be-rendered content in the to-be-rendered region to obtain a picture image of the sub-window on the display interface.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the window rendering apparatus further includes a building module for building a base class of the main window control; the processing module is further configured to monitor, based on the base class, whether a position parameter of the main window control changes relative to a position parameter of the sub-window, and if so, trigger to obtain a position parameter set of the main window control on the display interface and a first position parameter of the sub-window on the display interface.
In one possible design, in another implementation manner of another aspect of the embodiments of the present application, the processing module is specifically configured to obtain an intersection area set of each location parameter in the location parameter set and the first location parameter; and removing the intersection region set from the region indicated by the first position parameter to obtain the region to be rendered of the sub-window on the display interface.
In a possible design, in another implementation manner of another aspect of the embodiments of the present application, the processing module is specifically configured to determine a first intersection region according to a second position parameter in the position-parameter set and the first position parameter, where the second position parameter is the position parameter corresponding to a first control among the main-window controls; remove the first intersection region from the region indicated by the first position parameter to obtain a first intermediate to-be-rendered region of the sub-window on the display interface; determine a second intersection region according to a third position parameter in the position-parameter set and the position parameter of the first intermediate to-be-rendered region, where the third position parameter is the position parameter corresponding to a second control among the main-window controls; remove the second intersection region from the first intermediate to-be-rendered region to obtain a second intermediate to-be-rendered region; and traverse the position parameters in the position-parameter set in this way to obtain the to-be-rendered region.
In one possible design, in another implementation manner of another aspect of the embodiments of the present application, the processing module is specifically configured to obtain a target area corresponding to the area to be rendered in the frame image displayed by the sub-window at the last time; and determining the rendering content of the target area in the picture image displayed last time as the content to be rendered.
In another implementation manner of another aspect of the embodiments of the present application, the processing module is specifically configured to intercept, from the last displayed frame image of the sub-window, the target rendering content as the content to be rendered according to the region to be rendered, where a size of the region corresponding to the target rendering content is equal to a size of the region to be rendered.
In one possible design, in another implementation manner of another aspect of the embodiments of the present application, the main window is a chat window of social software, the sub-window is a code presentation window based on chat window presentation, and the main window control at least includes an input control, a message control, and an avatar control of the chat window.
Another aspect of the present application provides a computer device comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing programs;
the processor is configured to execute the program in the memory and to perform the methods of the above aspects according to the instructions in the program code;
the bus system is used to connect the memory and the processor to communicate the memory and the processor.
Another aspect of the present application provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the methods of the above aspects.
In another aspect of the present application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the methods provided in the above aspects.
From the above technical solutions, the embodiments of the present application have the following advantages: the overlapping region of the sub-window and the main window is determined according to the position parameters of the main-window controls and the position parameter of the sub-window, the to-be-rendered region of the sub-window is determined according to the overlapping region, and finally the sub-window content is rendered in the to-be-rendered region. In this way, merely by adjusting the rendering region of the sub-window, the sub-window and the main-window controls can be rendered separately and independently without adding any extra window.
Drawings
FIG. 1 is a schematic diagram of a composition of a computer device to which the window rendering method, apparatus, device and storage medium of the present application are applied;
fig. 2 is a schematic diagram of an architecture of a communication system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of one embodiment of a window rendering method according to an embodiment of the present application;
FIGS. 4a to 4d are schematic views illustrating positions between a sub window and a main window according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interface in which the position of a control in a main window is changed in an embodiment of the present application;
FIG. 6 is a schematic diagram of an interface in which a position of a sub-window changes relative to a main window according to an embodiment of the present application;
FIG. 7 is a flowchart of determining a sub-window to-be-rendered area according to an embodiment of the present application;
FIG. 8 is another flow chart of determining a sub-window to be rendered area according to an embodiment of the present application;
FIG. 9 is a schematic flow chart of determining contents to be rendered of a child window according to an embodiment of the present application;
FIG. 10 is another flow chart of determining contents to be rendered of a child window according to an embodiment of the present application;
FIG. 11 is another flow chart of determining contents to be rendered of a child window according to an embodiment of the present application;
FIG. 12 is another flow chart of determining contents to be rendered of a child window according to an embodiment of the present application;
FIG. 13 is a schematic view of an embodiment of a window rendering apparatus according to an embodiment of the present application;
FIG. 13a is a schematic diagram of another embodiment of a window rendering apparatus according to an embodiment of the present application;
FIG. 14 is a schematic view of another embodiment of a window rendering apparatus according to an embodiment of the present application;
fig. 15 is a schematic view of another embodiment of a window rendering apparatus according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a window rendering method, device, equipment and storage medium, which are used for rendering a child window and a main window control respectively and independently without adding additional windows.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be capable of operation in sequences other than those illustrated or described herein, for example. Furthermore, the terms "comprises," "comprising," and "includes" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
On the Windows operating system, the region occupied by a child window cannot simultaneously display the content of its parent window, and it is hardly possible for parent-window content in the same region to be drawn over the child window. In the conversation window of social software, some message bubbles need to be implemented as child windows; message bubbles scroll along with the conversation content, and other interface elements may need to be displayed on top of some of the bubbles. This creates a problem: the parent window carries other control elements, and when a parent-window control needs to be displayed above a child-window message, the message's child window instead covers the parent-window control, so the product appears abnormal. A window rendering method that prevents a child window from covering parent-window content is therefore strongly needed. To solve this technical problem, the present application provides the following technical solution: acquiring a position-parameter set of the main-window controls on a display interface and a first position parameter of a sub-window on the display interface, where the sub-window is a window displayed on the basis of the main window; obtaining, according to the position-parameter set and the first position parameter, a to-be-rendered region of the sub-window in the display interface; determining the to-be-rendered content of the sub-window according to the to-be-rendered region; and rendering the to-be-rendered content in the to-be-rendered region to obtain a picture image of the sub-window on the display interface. In this way, merely by adjusting the rendering region of the sub-window, the sub-window and the main-window controls can be rendered separately and independently without adding any extra window.
The method, apparatus, device, and storage medium are applicable to any computer device, through which the number of rendered frames is controlled and the rendering of the picture is performed. For example, the computer device may be a terminal running social software, such as a mobile phone, tablet, or notebook computer.
Fig. 1 is a schematic diagram showing a composition structure of a computer device to which the window rendering method, apparatus, device and storage medium of the present application are applied. In fig. 1, the computer device may include: a processor 101, a memory 102, a communication interface 103, a display 104, an input unit 105 and a communication bus 106.
The processor 101, the memory 102, the communication interface 103, the display 104, and the input unit 105 all perform communication with each other via the communication bus 106.
In this application, the processor 101 may include a graphics processing unit (GPU) 1012 in the graphics card, which may be used to implement the graphics-data processing associated with rendering picture images and the like in embodiments of the present application.
The processor 101 further includes a central processing unit (CPU) 1011 to control the number of rendered frames and to carry out the main data-processing work of the computer device; the central processing unit may, however, be replaced by an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a field-programmable gate array (FPGA), or another programmable logic device.
The memory 102 is used to store one or more programs, which may include program code including computer operating instructions. The memory may comprise high-speed RAM memory or may further comprise non-volatile memory, such as at least one disk memory.
The communication interface 103 may be an interface of a communication module, such as an interface of a GSM module.
The display 104 may be used to present a window interface and to display rendered picture images in that window interface; it may also display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of any combination of graphics, text, pictures, and the like. The display may include a display panel, for example a panel configured as a liquid crystal display, an organic light-emitting diode display, or the like. Further, the display may include a touch display panel capable of capturing touch events.
Optionally, the computer device may further comprise an input unit 105, which input unit 105 may be used for receiving input of user-entered information such as characters, numbers, etc., and generating signal inputs related to user settings and function control. The input unit may include, but is not limited to, one or more of a physical keyboard, a mouse, a joystick, etc.
Of course, the computer device structure shown in fig. 1 does not limit the computer device; in practical applications the computer device may include more or fewer components than those shown in fig. 1, or combine some of them.
The method provided in the application may also be applied to a communication system shown in fig. 2, please refer to fig. 2, fig. 2 is a schematic diagram of an architecture of the communication system in the embodiment of the application, as shown in fig. 2, the communication system includes a server and a terminal device, and a client (a client related to window rendering) is deployed on the terminal device, where the client may run on the terminal device in a browser manner, may also run on the terminal device in a stand-alone Application (APP) manner, and the specific presentation form of the client is not limited herein. The server related to the application can be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, and can also be a cloud server for providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDNs), basic cloud computing services such as big data and artificial intelligence platforms. The terminal device may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a palm computer, a personal computer, a smart television, a smart watch, a vehicle-mounted device, a wearable device, and the like. The terminal device and the server may be directly or indirectly connected through wired or wireless communication, which is not limited herein. The number of servers and terminal devices is not limited either. The scheme provided by the application can be independently completed by the terminal equipment, can be independently completed by the server, and can be completed by the cooperation of the terminal equipment and the server, so that the application is not particularly limited. 
In this embodiment, the rendering contents and rendering configuration related to the sub-window and the main window may be stored in a database. The Database (Database), which can be considered as an electronic filing cabinet, is a place for storing electronic files, and users can perform operations such as adding, inquiring, updating, deleting and the like on the data in the files. A "database" is a collection of data stored together in a manner that can be shared with multiple users, with as little redundancy as possible, independent of the application. The database management system (Database Management System, DBMS) is a computer software system designed for managing databases, and generally has basic functions of storage, interception, security, backup, and the like. The database management system may classify according to the database model it supports, e.g., relational, extensible markup language (Extensible Markup Language, XML); or by the type of computer supported, e.g., server cluster, mobile phone; or by classification according to the query language used, e.g. structured query language (Structured Query Language, SQL), XQuery; or by performance impact emphasis, such as maximum scale, maximum speed of operation; or other classification schemes. Regardless of the manner of classification used, some DBMSs are able to support multiple query languages across categories, for example, simultaneously.
It will be appreciated that in the specific embodiments of the present application, related data such as rendering data is referred to, and when the above embodiments of the present application are applied to specific products or technologies, user permissions or consents need to be obtained, and the collection, use and processing of related data need to comply with related laws and regulations and standards of related countries and regions.
In view of the fact that certain terms are used in this application, these terms are first described below.
Base class: in object-oriented design, the class type defined to contain the commonalities of all entities is referred to as the "base class". Base classes support inheritance, one of the most important concepts of object-oriented programming: within the hierarchy that constitutes a software system, existing classes can be reused and extended to support new functionality. That is, all information in a base class can be inherited, and new information can be added to form a new class. A programmer therefore needs to define only the parts of the new class that the existing class lacks, which greatly improves the reusability and maintainability of the software. In this construction of new classes, the newly established class is called a "subclass" or "derived class", while the inherited class containing the shared features is called the "parent class" or "base class". The derived class inherits all the members of the base class and can add data members and member functions that the base class lacks, so as to meet the requirements of describing the new object.
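A minimal sketch of how such a base class might be used for main-window controls, as described later in the embodiments (the base class exposes a position rectangle and notifies listeners when it changes, so the to-be-rendered region can be recomputed). All class and method names here (`ControlBase`, `rect`, `move_to`, `watch`, `InputBox`) are hypothetical, not taken from the patent:

```python
class ControlBase:
    """Hypothetical base class holding the commonality of all main-window
    controls: a position rectangle (left, top, right, bottom) plus a
    change notification for re-triggering the region computation."""
    def __init__(self, rect):
        self._rect = rect
        self._listeners = []

    @property
    def rect(self):
        return self._rect

    def move_to(self, rect):
        if rect != self._rect:      # notify only on an actual change
            self._rect = rect
            for callback in self._listeners:
                callback(self)

    def watch(self, callback):
        self._listeners.append(callback)

# A derived class inherits all base-class members and adds only what is new.
class InputBox(ControlBase):
    def __init__(self, rect, placeholder=""):
        super().__init__(rect)
        self.placeholder = placeholder
```

Because every control derives from the same base class, a renderer can watch one uniform interface instead of tracking each control's parameters separately.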
A main window: also referred to as a parent window. Every top-level form is a parent window that is not a child of any other form. All controls, buttons, and the like on a window are child windows of that window. A window other than a top-level form may be both a parent window and a child window: for example, a frame placed in a window is the parent of the controls inside it, while the frame itself is a child window of the window. It is like containers nested inside containers: the largest container is the window, and the contents of each container are its child windows.
With reference to the foregoing description, a window rendering method in the present application will be described below, in this embodiment, a terminal device is used as an execution body, and referring to fig. 3, an embodiment of the window rendering method in the present application includes:
301. Acquire a position-parameter set of the main-window controls on a display interface and a first position parameter of a sub-window on the display interface, where the sub-window is a window loaded on the basis of the main window.
The terminal equipment acquires a position parameter set of the main window control in real time based on the base class of the main window control, and acquires the position parameter of the sub window when the sub window is displayed. It can be understood that the base class of the main window control is constructed for the main window control when the main window is constructed, so that the position parameters of the main window control can be conveniently acquired, and the need of acquiring the change information of a plurality of parameters is avoided.
In an exemplary embodiment, the positional relationship between the main window and the sub window may be as shown in fig. 4a to 4 d. The main window is a chat main window 200, the child window is a message bubble 100 for displaying code information in the chat window, and at this time, other controls in the main window except for the message bubble for displaying code information are all main window controls. As shown in fig. 4 a-4 d, the message bubble 100 is the child window and controls 201-206 are the main window controls. In fig. 4a, there is an overlap of the left side of the child window with the main window control; in fig. 4b, there is an overlap of the right side of the child window with the main window control; in fig. 4c, there is an overlap of the upper portion of the child window with the main window control; in fig. 4d, the overlapping portion of the upper portion of the child window and the main window control is larger in area than the overlapping portion shown in fig. 4 c.
In this embodiment, the terminal device may trigger to obtain the position parameter set of the main window control and the position parameter of the sub window when the position parameter of the sub window changes relative to the position parameter of the main window control or the position parameter of at least one control in the main window control changes. As shown in fig. 5, in an exemplary scheme, an input field in the main window control is changed from a position parameter shown in (a) in fig. 5 to a position parameter shown in (b) in fig. 5, that is, an area occupied by the input field in the chat window is increased, and at this time, the position parameter of the input field is also changed, so that the terminal device may obtain a position parameter set and a position parameter of the sub window at this time based on obtaining all the position parameters of the controls 201 to 206 of the main window by using a base class. Another exemplary scenario is shown in fig. 6, where the sub-window (i.e., the message bubble exhibiting the code information) is moved upward relative to the main window by a mouse wheel or manual operation. That is, the position of the sub-window in the chat window is moved from the position shown in fig. 6 (a) to the position shown in fig. 6 (b), when the overlapping position of the sub-window and the main window is changed, the terminal device may obtain the position parameter set and the position parameter of the sub-window at this time by acquiring all the position parameters of the controls 201 to 206 of the main window based on the base class.
302. Determine the to-be-rendered area of the sub-window on the display interface according to the position-parameter set and the first position parameter.
After the terminal equipment obtains the position parameter set of the main window control and the first position parameter of the sub-window, calculating an overlapping area of the main window control and the sub-window according to the position parameter set and the first position parameter, and determining a region to be rendered of the sub-window according to the overlapping area.
Optionally, the specific manner of the terminal device in acquiring the area to be rendered may be as follows:
In a possible implementation manner, the terminal device determines the set of overlapping regions between each of the main-window controls and the sub-window according to the position-parameter set of the main-window controls and the first position parameter of the sub-window; the set of overlapping regions is then removed from the original region of the sub-window to obtain the region to be rendered. An exemplary scenario may be as shown in fig. 7, where the main-window controls include control A, control B, and control C. The position parameter of control A acquired by the terminal device is A(a1, a2, a3, a4), that of control B is B(b1, b2, b3, b4), that of control C is C(c1, c2, c3, c4), and the position parameter of the sub-window is (left, top, right, bottom). Then, intersecting control A with the original region of the sub-window gives the overlapping region resultA′(l1, t1, r1, b1) = (max(a1, left), max(a2, top), min(a3, right), min(a4, bottom)); by analogy, the overlapping region of the sub-window and control B is resultB′(l2, t2, r2, b2) = (max(b1, left), max(b2, top), min(b3, right), min(b4, bottom)), and the overlapping region of the sub-window and control C is resultC′(l3, t3, r3, b3) = (max(c1, left), max(c2, top), min(c3, right), min(c4, bottom)). Removing resultA′, resultB′, and resultC′ from the original region of the sub-window then yields the to-be-rendered region of the sub-window.
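The intersection formula above translates directly into code. A minimal sketch, with hypothetical rectangle values (not taken from the figures):

```python
def intersect(r1, r2):
    """Intersection of two axis-aligned rectangles (left, top, right, bottom),
    following the formula in the text:
    (max(a1, left), max(a2, top), min(a3, right), min(a4, bottom)).
    Returns None when the rectangles do not overlap."""
    left, top = max(r1[0], r2[0]), max(r1[1], r2[1])
    right, bottom = min(r1[2], r2[2]), min(r1[3], r2[3])
    return (left, top, right, bottom) if left < right and top < bottom else None

# Hypothetical sub-window rectangle and controls A, B, C (values invented)
child = (100, 100, 400, 300)
A = (50, 80, 150, 200)    # overlaps the sub-window's left edge
B = (380, 250, 500, 400)  # overlaps the sub-window's bottom-right corner
C = (0, 0, 40, 40)        # disjoint from the sub-window

print(intersect(A, child))  # (100, 100, 150, 200)
print(intersect(B, child))  # (380, 250, 400, 300)
print(intersect(C, child))  # None
```

Note that `max`/`min` alone can produce an inverted rectangle when the inputs are disjoint, which is why the `left < right and top < bottom` validity check is needed before treating the result as an overlap.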
In another possible implementation manner, the terminal device determines a first intersection region according to a second position parameter in the position-parameter set and the first position parameter, where the second position parameter is the position parameter corresponding to a first control among the main-window controls; removes the first intersection region from the region indicated by the first position parameter to obtain a first intermediate to-be-rendered region of the sub-window in the display interface; then determines a second intersection region according to a third position parameter in the position-parameter set and the position parameter of the first intermediate to-be-rendered region, where the third position parameter is the position parameter corresponding to a second control among the main-window controls; removes the second intersection region from the first intermediate to-be-rendered region to obtain a second intermediate to-be-rendered region; and traverses all the position parameters in the position-parameter set to obtain the to-be-rendered region. An exemplary scenario is shown in FIG. 8, where the main-window controls include control A, control B, and control C. The position parameter of control A acquired by the terminal device is A(a1, a2, a3, a4), that of control B is B(b1, b2, b3, b4), that of control C is C(c1, c2, c3, c4), and the position parameter of the sub-window is (left, top, right, bottom).
Then the intersection of control A and the original area of the sub-window is taken to obtain an overlapping region, namely resultA′(l1, t1, r1, b1) = (max(a1, left), max(a2, top), min(a3, right), min(a4, bottom)); the overlapping region is then removed from the area indicated by the first position parameter of the sub-window to obtain the first intermediate area to be rendered, resultA. Next, the overlapping region resultAB′ of resultA and control B is determined, and resultAB (namely the second intermediate area to be rendered) is obtained from resultA and resultAB′; then the overlapping region resultABC′ of resultAB and control C is determined, and resultABC (namely the region to be rendered) is obtained from resultAB and resultABC′.
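The iterative carve-out above (resultA → resultAB → resultABC) amounts to repeatedly subtracting each control rectangle from a set of rectangles. A minimal sketch under the assumption of axis-aligned (left, top, right, bottom) rectangles; the names are illustrative, not the application's actual identifiers:

```python
def subtract(rect, hole):
    """Remove `hole` from `rect`; both are (left, top, right, bottom).

    Returns up to four rectangles covering rect minus hole.
    """
    l, t, r, b = rect
    # Clip the hole to the rectangle (the intersection step).
    hl, ht = max(hole[0], l), max(hole[1], t)
    hr, hb = min(hole[2], r), min(hole[3], b)
    if hl >= hr or ht >= hb:
        return [rect]  # no overlap: nothing removed
    out = []
    if t < ht: out.append((l, t, r, ht))    # strip above the hole
    if hb < b: out.append((l, hb, r, b))    # strip below the hole
    if l < hl: out.append((l, ht, hl, hb))  # strip left of the hole
    if hr < r: out.append((hr, ht, r, hb))  # strip right of the hole
    return out


def region_to_render(child_rect, control_rects):
    """Traverse all control rectangles, carving each out of the child
    window's intermediate region, as in resultA -> resultAB -> resultABC."""
    region = [child_rect]
    for ctrl in control_rects:
        region = [piece for part in region for piece in subtract(part, ctrl)]
    return region
```

Note that removing a rectangle from a rectangle generally leaves a set of rectangles rather than a single one, which is why the intermediate region is modeled here as a list.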
303. And determining the content to be rendered of the sub window according to the region to be rendered.
After the terminal device determines the region to be rendered of the sub-window, the content to be rendered of the sub-window is obtained according to the region to be rendered. In this embodiment, when the terminal device obtains the content to be rendered, the following possible implementation manners may be adopted:
in a possible implementation manner, the terminal device obtains the target area corresponding to the region to be rendered in the picture image previously displayed by the sub-window, and determines the rendered content of that target area in the previously displayed picture image as the content to be rendered. An exemplary scenario is shown in fig. 9, where (a) in fig. 9 shows the previously displayed content of the sub-window, specifically the content shown in lines 18-38 of the code. After the position of the sub-window changes, as shown in (b) of fig. 9, the region to be rendered is the area indicated by lines 21-38 of the code, so the terminal device may acquire the content of lines 21-38 of the code as the content to be rendered.
In another possible implementation manner, the terminal device intercepts target rendering content from the previously displayed picture image of the sub-window according to the region to be rendered as the content to be rendered, where the size of the area corresponding to the target rendering content is equal to the size of the region to be rendered. An exemplary scenario is shown in fig. 10, where (a) in fig. 10 shows the previously displayed content of the sub-window, specifically the content shown in lines 18-38 of the code. After the position of the sub-window changes, as shown in (b) of fig. 10, the region to be rendered is the area indicated by lines 18-35 of the code, so the terminal device may acquire the content of lines 18-35 of the code as the content to be rendered.
In another possible implementation manner, the terminal device intercepts target rendering content at an arbitrary position from the previously displayed picture image of the sub-window according to the region to be rendered as the content to be rendered, where the size of the area corresponding to the target rendering content is equal to the size of the region to be rendered. An exemplary scenario is shown in fig. 11, where (a) in fig. 11 shows the previously displayed content of the sub-window, specifically the content shown in lines 18-38 of the code. After the position of the sub-window changes, the region to be rendered is the area indicated by lines 18-35 of the code, so the terminal device may acquire the content of lines 18-35 of the code as the content to be rendered, or the content of lines 19-36, or the content of lines 20-37 (as shown in (b) of fig. 11); that is, any fragment may be intercepted as long as the area it occupies equals that of the region to be rendered, and the specific fragment is not limited herein.
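The equal-size interception in the implementations above can be sketched by modeling the previously displayed image as a list of code lines; any contiguous slice of matching length is a valid fragment. The function name and the line-based model are assumptions for illustration:

```python
def intercept_content(last_lines, region_height, start=0):
    """Return `region_height` consecutive lines starting at `start`.

    Mirrors the flexibility described above: lines 18-35, 19-36, or 20-37
    are all valid fragments as long as the size matches the region.
    """
    if start + region_height > len(last_lines):
        raise ValueError("fragment exceeds the previously displayed content")
    return last_lines[start:start + region_height]
```

For a previously displayed image spanning lines 18-38, `intercept_content(lines, 18, start=0)` yields lines 18-35 and `start=2` yields lines 20-37, corresponding to the alternatives in fig. 11.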
In this embodiment, the terminal device may intercept the content to be rendered according to a preset rule or in response to a user operation. An exemplary scenario is shown in fig. 12: if the user has selected the content of lines 20-37 of the code in the sub-window when the position of the sub-window changes, the terminal device may preferentially intercept lines 20-37 of the code as the content to be rendered.
304. And rendering the to-be-rendered content in the to-be-rendered area to obtain a picture image of the sub-window on the display interface.
After the terminal device acquires the content to be rendered, the content to be rendered is rendered in the region to be rendered to obtain a new picture image of the sub-window on the display interface. The specific rendering result may be as shown in (b) of fig. 9, (b) of fig. 10, or (b) of fig. 11, and is not described again here.
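Step 304 amounts to copying the content to be rendered into the region to be rendered of a frame buffer. A minimal sketch, modeling the buffer as a 2D list of pixels (the names are illustrative assumptions):

```python
def render_region(framebuffer, region, content):
    """Copy `content` (a 2D pixel block) into `region` = (left, top, right, bottom)."""
    left, top, right, bottom = region
    # The content must exactly fill the region to be rendered.
    assert len(content) == bottom - top
    assert all(len(row) == right - left for row in content)
    for dy, row in enumerate(content):
        framebuffer[top + dy][left:right] = row
    return framebuffer
```

Because only the region to be rendered is written, pixels belonging to the overlapping main-window controls are left untouched, which is how the sub-window and the controls stay independently rendered.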
Referring to fig. 13, fig. 13 is a schematic diagram illustrating an embodiment of a window rendering device according to an embodiment of the present application, and the window rendering device 20 includes:
the obtaining module 201 is configured to obtain a set of position parameters of a main window control on a display interface and a first position parameter of a sub-window on the display interface, where the sub-window is a window loaded based on the main window;
A processing module 202, configured to determine, according to the set of location parameters and the first location parameter, a region to be rendered of the sub-window on the display interface; determining the content to be rendered of the sub window according to the region to be rendered;
and the rendering module 203 is configured to render the to-be-rendered content in the to-be-rendered area to obtain a picture image of the sub-window on the display interface.
In an embodiment of the application, a window rendering device is provided. By adopting the device, the overlapping area of the sub-window and the main window is determined according to the position parameters of the main window control and of the sub-window, the region to be rendered of the sub-window is determined according to the overlapping area, and finally the content of the sub-window is rendered in the region to be rendered, so that, merely by adjusting the rendering area of the sub-window, the sub-window and the main window control can each be rendered independently without adding any additional window.
Optionally, on the basis of the embodiment corresponding to fig. 13, in another embodiment of the window rendering device 20 provided in the embodiment of the present application, as shown in fig. 13a, the window rendering device further includes a building module 204, configured to build a base class of the main window control; the processing module 202 is further configured to monitor, based on the base class, whether a position parameter of the main window control changes relative to a position parameter of the sub-window, and if so, trigger to obtain a position parameter set of the main window control on the display interface and a first position parameter of the sub-window on the display interface.
In an embodiment of the application, a window rendering device is provided. By adopting the device, the base class of the main window control is constructed, so that the position parameters of the main window control and the position parameters of the sub window are obtained, the rendering area of the sub window can be adjusted according to the position parameters of the main window control and the position parameters of the sub window under the condition that the workload of software development is not increased, and the sub window and the main window control are rendered independently.
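One way to read the base-class idea above: every main-window control derives from a common base that reports geometry changes, so position monitoring needs no per-control glue code. A minimal sketch under that assumption; the class and method names are hypothetical, not the application's actual identifiers:

```python
class ControlBase:
    """Base class for main-window controls; notifies a listener on move/resize."""

    def __init__(self, rect, on_change=None):
        self._rect = rect            # (left, top, right, bottom)
        self._on_change = on_change  # callback invoked when the rect changes

    @property
    def rect(self):
        return self._rect

    def move_to(self, new_rect):
        # Only a real change triggers re-acquisition of the position parameters.
        if new_rect != self._rect:
            self._rect = new_rect
            if self._on_change:
                self._on_change(self)
```

A construction module could register the same callback on every control so that any position change triggers recomputation of the sub-window's region to be rendered.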
Optionally, in another embodiment of the window rendering apparatus 20 provided in the embodiment of the present application, based on the embodiment corresponding to fig. 13, the processing module 202 is specifically configured to obtain an intersection area set of each location parameter in the location parameter set and the first location parameter; and removing the intersection region set from the region indicated by the first position parameter to obtain the region to be rendered of the sub-window on the display interface.
In an embodiment of the application, a window rendering device is provided. By adopting the device, the intersection area of each main window control and the sub window is obtained, and then the intersection area is removed from the original area of the sub window to obtain the current area to be rendered of the sub window, so that the rendering area of the sub window is independent of the rendering area of the main window control, and further the sub window and the main window control are rendered independently.
Optionally, in another embodiment of the window rendering apparatus 20 provided in the embodiment of the present application, based on the embodiment corresponding to fig. 13, the processing module 202 is specifically configured to determine the first intersection area according to a second position parameter in the position parameter set and the first position parameter, where the second position parameter is the position parameter corresponding to the first control in the main window control; remove the first intersection area from the area indicated by the first position parameter to obtain a first intermediate area to be rendered of the sub-window on the display interface; determine a second intersection area according to a third position parameter in the position parameter set and the position parameter of the first intermediate area to be rendered, where the third position parameter is the position parameter corresponding to the second control in the main window control; remove the second intersection area from the first intermediate area to be rendered to obtain a second intermediate area to be rendered; and traverse the position parameters in the position parameter set to obtain the region to be rendered.
In an embodiment of the application, a window rendering device is provided. By adopting the device, the intersection area between each main window control and the sub-window is obtained through iteration, and then the intersection area is removed from the middle area to be rendered in the iteration process of the sub-window to obtain the current area to be rendered of the sub-window, so that the rendering area of the sub-window is independent of the rendering area of the main window control, and further the sub-window and the main window control are rendered independently.
Optionally, in another embodiment of the window rendering apparatus 20 provided in the embodiment of the present application, based on the embodiment corresponding to fig. 13, the processing module 202 is specifically configured to obtain a target area corresponding to the area to be rendered in the picture image displayed by the sub-window last time; and determining the rendering content of the target area in the picture image displayed last time as the content to be rendered.
In an embodiment of the application, a window rendering device is provided. By adopting the device, after the to-be-rendered area of the sub-window is obtained, the content of the to-be-rendered area in the original area of the sub-window is obtained as the to-be-rendered content, so that a user can conveniently obtain the changed display content.
Optionally, on the basis of the embodiment corresponding to fig. 13, in another embodiment of the window rendering apparatus 20 provided in the embodiment of the present application, the processing module 202 is specifically configured to intercept, from the previously displayed picture image of the sub-window, target rendering content as the content to be rendered according to the region to be rendered, where the size of the area corresponding to the target rendering content is equal to the size of the region to be rendered.
In an embodiment of the application, a window rendering device is provided. By adopting the device, after the region to be rendered of the sub-window is acquired, content of an area with the same size as the region to be rendered is intercepted from the originally displayed content of the sub-window as the content to be rendered, so that the content to be rendered can be acquired according to the user's needs, improving the user experience and the feasibility of the scheme.
Optionally, on the basis of the embodiment corresponding to fig. 13, in another embodiment of the window rendering apparatus 20 provided in this embodiment of the present application, the main window is a chat window of social software, the sub-window is a code display window based on chat window display, and the main window control includes at least an input control, a message control, and an avatar control of the chat window.
In an embodiment of the application, a window rendering device is provided. By adopting the device, a specific application scene is added, so that the feasibility and operability of the scheme are improved.
Referring to fig. 14, fig. 14 is a schematic diagram of a server structure provided in an embodiment of the present application. The server 300 may vary considerably in configuration or performance, and may include one or more central processing units (central processing units, CPU) 322 (e.g., one or more processors), a memory 332, and one or more storage media 330 (e.g., one or more mass storage devices) storing application programs 342 or data 344. The memory 332 and the storage medium 330 may be transitory or persistent. The program stored on the storage medium 330 may include one or more modules (not shown), each of which may include a series of instruction operations on the server. Further, the central processor 322 may be configured to communicate with the storage medium 330 and execute the series of instruction operations in the storage medium 330 on the server 300.
The server 300 may also include one or more power supplies 326, one or more wired or wireless network interfaces 350, one or more input/output interfaces 358, and/or one or more operating systems 341, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
The steps performed by the terminal device in the above-described embodiments may also be performed based on the server structure shown in fig. 14.
The window rendering device provided in the present application may be used in a terminal device. Referring to fig. 15, for convenience of explanation only the portions related to the embodiment of the present application are shown; for specific technical details not disclosed, please refer to the method portion of the embodiments of the present application. In the embodiments of the present application, the terminal device is described by taking a smart phone as an example:
fig. 15 is a block diagram showing a part of the structure of a smart phone related to a terminal device provided in an embodiment of the present application. Referring to fig. 15, the smart phone includes: radio Frequency (RF) circuitry 410, memory 420, input unit 430, display unit 440, sensor 450, audio circuitry 460, wireless fidelity (wireless fidelity, wiFi) module 470, processor 480, and power supply 490. Those skilled in the art will appreciate that the smartphone structure shown in fig. 15 is not limiting of the smartphone and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The following describes each component of the smart phone in detail with reference to fig. 15:
The RF circuit 410 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, after downlink information of the base station is received, it is processed by the processor 480, and uplink data is sent to the base station. In general, the RF circuitry 410 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (low noise amplifier, LNA), a duplexer, and the like. In addition, the RF circuitry 410 may also communicate with networks and other devices via wireless communications. The wireless communications may use any communication standard or protocol, including, but not limited to, global system for mobile communications (global system of mobile communication, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), long term evolution (long term evolution, LTE), email, short message service (short messaging service, SMS), and the like.
The memory 420 may be used to store software programs and modules, and the processor 480 may perform various functional applications and data processing of the smartphone by executing the software programs and modules stored in the memory 420. The memory 420 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like; the storage data area may store data (such as audio data, phonebooks, etc.) created according to the use of the smart phone, etc. In addition, memory 420 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The input unit 430 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the smart phone. In particular, the input unit 430 may include a touch panel 431 and other input devices 432. The touch panel 431, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch panel 431 or thereabout using any suitable object or accessory such as a finger, a stylus, etc.), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch panel 431 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device and converts it into touch point coordinates, which are then sent to the processor 480, and can receive commands from the processor 480 and execute them. In addition, the touch panel 431 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 430 may include other input devices 432 in addition to the touch panel 431. In particular, other input devices 432 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, mouse, joystick, etc.
The display unit 440 may be used to display information input by a user or information provided to the user and various menus of the smart phone. The display unit 440 may include a display panel 441; optionally, the display panel 441 may be configured in the form of a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 431 may cover the display panel 441, and when the touch panel 431 detects a touch operation on or near it, the touch operation is transmitted to the processor 480 to determine the type of the touch event, and the processor 480 then provides a corresponding visual output on the display panel 441 according to the type of the touch event. Although in fig. 15 the touch panel 431 and the display panel 441 are two separate components implementing the input and output functions of the smart phone, in some embodiments the touch panel 431 and the display panel 441 may be integrated to implement the input and output functions of the smart phone.
The smartphone may also include at least one sensor 450, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 441 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 441 and/or the backlight when the smartphone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for identifying the application of the gesture of the smart phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration identification related functions (such as pedometer and knocking), and the like; other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc. that may also be configured with the smart phone are not described in detail herein.
The audio circuit 460, speaker 461, and microphone 462 can provide an audio interface between the user and the smart phone. The audio circuit 460 may transmit the electrical signal converted from received audio data to the speaker 461, where it is converted into a sound signal and output; on the other hand, the microphone 462 converts collected sound signals into electrical signals, which are received by the audio circuit 460 and converted into audio data; the audio data are then processed by the processor 480 and transmitted via the RF circuit 410 to, for example, another smart phone, or output to the memory 420 for further processing.
WiFi belongs to a short-distance wireless transmission technology, and a smart phone can help a user to send and receive emails, browse webpages, access streaming media and the like through a WiFi module 470, so that wireless broadband Internet access is provided for the user. Although fig. 15 shows a WiFi module 470, it is understood that it does not belong to the essential constitution of a smart phone, and can be omitted entirely as required within the scope of not changing the essence of the invention.
The processor 480 is a control center of the smart phone, connects various parts of the entire smart phone using various interfaces and lines, and performs various functions and processes data of the smart phone by running or executing software programs and/or modules stored in the memory 420 and invoking data stored in the memory 420, thereby performing overall monitoring of the smart phone. Optionally, the processor 480 may include one or more processing units; alternatively, the processor 480 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 480.
The smart phone also includes a power supply 490 (e.g., a battery) for powering the various components, optionally in logical communication with the processor 480 through a power management system that performs functions such as managing charge, discharge, and power consumption.
Although not shown, the smart phone may further include a camera, a bluetooth module, etc., which will not be described herein.
The steps performed by the terminal device in the above-described embodiments may be based on the terminal device structure shown in fig. 15.
Also provided in embodiments of the present application is a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the methods as described in the foregoing embodiments.
Also provided in embodiments of the present application is a computer program product comprising a program which, when run on a computer, causes the computer to perform the methods described in the foregoing embodiments.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above embodiments are merely for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (10)

1. A method of window rendering, comprising:
acquiring a position parameter set of a main window control on a display interface and a first position parameter of a sub-window on the display interface, wherein the sub-window is a window loaded based on the main window;
determining a region to be rendered of the sub-window on the display interface according to the position parameter set and the first position parameter;
determining the content to be rendered of the sub window according to the region to be rendered;
and rendering the to-be-rendered content in the to-be-rendered area to obtain a picture image of the sub-window on the display interface.
2. The method of claim 1, wherein before the acquiring of the position parameter set of the main window control on the display interface and the first position parameter of the sub-window on the display interface, the method further comprises:
constructing a base class of the main window control;
and monitoring whether the position parameters of the main window control are changed relative to the position parameters of the sub window based on the base class, and if so, triggering and acquiring a position parameter set of the main window control on the display interface and a first position parameter of the sub window on the display interface.
3. The method of claim 1, wherein the determining the region to be rendered of the sub-window on the display interface according to the position parameter set and the first position parameter comprises:
acquiring an intersection area set of each position parameter in the position parameter set and the first position parameter;
and removing the intersection region set from the region indicated by the first position parameter to obtain the region to be rendered of the sub-window on the display interface.
4. The method of claim 1, wherein the determining the region to be rendered of the sub-window on the display interface according to the position parameter set and the first position parameter comprises:
determining a first intersection area according to a second position parameter in the position parameter set and the first position parameter, wherein the second position parameter is a position parameter corresponding to a first control in the main window control;
removing the first intersection area from the area indicated by the first position parameter to obtain a first intermediate area to be rendered of the sub-window on the display interface;
determining a second intersection area according to a third position parameter in the position parameter set and the position parameter of the first intermediate area to be rendered, wherein the third position parameter is a position parameter corresponding to a second control in the main window control;
removing the second intersection area from the first intermediate area to be rendered to obtain a second intermediate area to be rendered;
and traversing the position parameters in the position parameter set to obtain the region to be rendered.
5. The method of any of claims 1-4, wherein the determining the content to be rendered of the sub-window from the region to be rendered comprises:
acquiring a target area corresponding to the region to be rendered in a picture image last displayed by the sub-window;
and determining the rendering content of the target area in the picture image displayed last time as the content to be rendered.
6. The method of any of claims 1-4, wherein the determining the content to be rendered of the sub-window from the region to be rendered comprises:
and capturing target rendering content from the picture image displayed last time by the sub window according to the region to be rendered as the content to be rendered, wherein the size of the region corresponding to the target rendering content is equal to the size of the region to be rendered.
7. The method according to any one of claims 1 to 4, wherein the main window is a chat window of social software, the sub-window is a code presentation window presented based on the chat window, and the main window control comprises at least an input control, a message control, and an avatar control of the chat window.
8. A window rendering apparatus, comprising:
an acquisition module, used for acquiring a position parameter set of a main window control on a display interface and a first position parameter of a sub-window on the display interface, wherein the sub-window is a window loaded based on the main window;
a processing module, used for determining the region to be rendered of the sub-window on the display interface according to the position parameter set and the first position parameter, and determining the content to be rendered of the sub-window according to the region to be rendered;
and a rendering module, used for rendering the content to be rendered in the region to be rendered to obtain a picture image of the sub-window on the display interface.
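The three modules of claim 8 can be wired together as plain callables. A hypothetical sketch (the class, attribute, and method names are illustrative, not mandated by the claim):

```python
class WindowRenderingApparatus:
    """Sketch of claim 8: an acquisition module, a processing module,
    and a rendering module cooperating to draw the sub-window."""

    def __init__(self, acquisition_module, processing_module, rendering_module):
        self.acquire = acquisition_module  # returns (position parameter set, first position parameter)
        self.process = processing_module   # derives the region and the content to be rendered
        self.render = rendering_module     # draws the content into the region

    def render_sub_window(self):
        # Acquisition: position parameter set and the sub-window's first position parameter.
        param_set, first_param = self.acquire()
        # Processing: region to be rendered, then content to be rendered.
        region = self.process.region_to_render(param_set, first_param)
        content = self.process.content_to_render(region)
        # Rendering: produce the sub-window's picture image.
        return self.render(region, content)
```

Splitting the apparatus into three injected callables keeps each module independently replaceable, mirroring the claim's module-by-module decomposition.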
9. A computer device, comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing programs;
the processor is used for executing the program in the memory to perform the method of any one of claims 1 to 7 according to instructions in the program code;
and the bus system is used for connecting the memory and the processor so that the memory and the processor communicate with each other.
10. A computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN202211041788.1A 2022-08-29 2022-08-29 Window rendering method, device, equipment and storage medium Pending CN117666988A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211041788.1A CN117666988A (en) 2022-08-29 2022-08-29 Window rendering method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117666988A true CN117666988A (en) 2024-03-08

Family

ID=90083117

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211041788.1A Pending CN117666988A (en) 2022-08-29 2022-08-29 Window rendering method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117666988A (en)

Similar Documents

Publication Publication Date Title
CN106775637B (en) Page display method and device for application program
US10304461B2 (en) Remote electronic service requesting and processing method, server, and terminal
CN108039963B (en) Container configuration method and device and storage medium
CN108156508B (en) Barrage information processing method and device, mobile terminal, server and system
US10506292B2 (en) Video player calling method, apparatus, and storage medium
US20170068443A1 (en) Information Display Method and Apparatus
CN106126207B (en) Desktop information display method and device and mobile terminal
US20160292946A1 (en) Method and apparatus for collecting statistics on network information
CN104571979B (en) A kind of method and apparatus for realizing split view
WO2015003636A1 (en) Method and device for interception of page elements
WO2015067142A1 (en) Webpage display method and device
US10298590B2 (en) Application-based service providing method, apparatus, and system
CN109067981A (en) Split screen application switching method, device, storage medium and electronic equipment
CN110780793B (en) Tree menu construction method and device, electronic equipment and storage medium
CN113313804A (en) Image rendering method and device, electronic equipment and storage medium
CN105631059B (en) Data processing method, data processing device and data processing system
US20160283047A1 (en) Login interface displaying method and apparatus
CN111210496B (en) Picture decoding method, device and equipment
US20180260847A1 (en) Information display method, apparatus, and system
CN115904179A (en) Multi-screen desktop split-screen display method, device, equipment and storage medium
US20150095754A1 (en) Method and device for inputting account information
CN110196662A (en) A kind of method, apparatus, terminal and storage medium showing synchronous regime
CN105320532A (en) Interactive interface display method and device as well as terminal
CN117666988A (en) Window rendering method, device, equipment and storage medium
CN112148409A (en) Window image effect realization method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination