CN110457102B - Visual object blurring method, visual object rendering method and computing equipment

Info

Publication number: CN110457102B (granted; earlier published as CN110457102A)
Application number: CN201910683994.4A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 曹思源 (Cao Siyuan)
Assignee: Wuhan Deepin Technology Co., Ltd.
Related application: CN202210614578.0A (granted as CN114924824B)
Legal status: Active
Prior art keywords: visual object, blur, texture map, window

Classifications

    • G06F9/451: Execution arrangements for user interfaces (program control; electric digital data processing)
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object, e.g. windows or icons
    • G06F2203/04804: Transparency, e.g. transparent or translucent windows


Abstract

The invention discloses a method for blurring a visual object, executed in a computing device, where the visual object has a blur sub-object. The method comprises: obtaining the pixel values of the region where the visual object is located from the frame buffer, and generating a texture map for the blur sub-object of the visual object; blurring the texture map to obtain a background texture map; writing the background texture map back to the corresponding region of the frame buffer; drawing the visual object to generate a foreground texture map of the visual object; and Alpha-blending the background texture map with the foreground texture map, so that the rendered visual object presents a blur effect. The invention also discloses a corresponding computing device.

Description

Visual object blurring method, visual object rendering method and computing equipment
Technical Field
The present invention relates to the technical field of graphical interface display, and in particular to a visual object blurring method, a visual object rendering method and a computing device.
Background
A window manager is an application for managing windows: it controls, for example, the appearance of windows and their display positions on the screen, and provides the user with ways to operate windows. The window manager can visually provide a window with a customized title bar, which typically displays the window's icon, title, and minimize, maximize and close buttons, as well as a border. In a Linux desktop environment, the user can customize the title bar and border through X11-related techniques or in a manner specified by the window manager.
To make the desktop more attractive and easier to use, users want the background and border of the window title bar to blend with the content beneath the window, presenting a semi-transparent dynamic blur effect. When the content beneath the window changes, the background and border of the window title bar change dynamically with it.
At present, most window managers in the Linux desktop environment do not support a dynamic blur effect for the window title bar and border; the few window managers that do support dynamic blur offer poor customizability and visual quality, and can hardly meet users' needs. For example, KWin, the window manager of the KDE desktop environment, can only apply dynamic blur to the title bars of all windows; it cannot selectively blur one or a few specific windows, and it does not support dynamic blur over the full window area. Furthermore, KWin cannot customize the shape of the blurred region of a single window, nor can it blur visual elements outside windows (e.g., the window switching component, the window preview component, etc.).
Disclosure of Invention
To this end, the present invention provides a visual object blurring method, a rendering method and a computing device, in an attempt to solve, or at least alleviate, the problems described above.
According to a first aspect of the present invention, there is provided a method of blurring a visual object, executed in a computing device, the visual object having a blur sub-object, the method comprising: obtaining the pixel values of the region where the visual object is located from a frame buffer, and generating a texture map for the blur sub-object of the visual object; blurring the texture map to obtain a background texture map; writing the background texture map back to the corresponding region of the frame buffer; drawing the visual object to generate a foreground texture map of the visual object; and Alpha-blending the background texture map with the foreground texture map, so that the rendered visual object presents a blur effect.
Optionally, in the visual object blurring method according to the present invention, the visual object includes: a window, a window switching component, a window preview component, and a workspace preview component.
Optionally, in the visual object blurring method according to the present invention, the step of blurring the texture map to obtain a background texture map includes: blurring the texture map to obtain a blurred texture map; and obtaining a blur region mask of the visual object, and blending the blur region mask with the blurred texture map to obtain the background texture map.
Optionally, in the visual object blurring method according to the present invention, the pixel values include color values and transparency values of the pixels, and the texture map includes a color channel map and a transparency channel map; the step of blurring the texture map to obtain a background texture map includes: blurring the color channel map to obtain a blurred texture map; and blending the blur region mask with the transparency channel map to obtain the background texture map.
Optionally, in the visual object blurring method according to the present invention, the step of blurring the color channel map includes: convolving the color channel map with a preset convolution kernel.
Optionally, in the visual object blurring method according to the present invention, the blur region mask is a bitmap indicating the blur region, in which the pixel values of the blur region are 1; the step of blending the blur region mask with the transparency channel map includes: multiplying the pixel values at corresponding positions of the blur region mask and the transparency channel map.
According to a second aspect of the present invention, there is provided a rendering method, executed in a computing device, comprising: taking the visual objects that have issued redraw requests and the visual objects having blur sub-objects as the visual objects to be redrawn; and rendering each visual object to be redrawn in bottom-to-top order of the display hierarchy, wherein, when a visual object to be redrawn has a blur sub-object, the visual object is blurred according to the blurring method above, so that the rendered visual object presents a blur effect.
According to a third aspect of the present invention, there is provided a computing device comprising: at least one processor; and a memory storing program instructions, the program instructions comprising a window manager which, when read and executed by the processor, causes the computing device to perform the visual object blurring method and the rendering method described above.
According to a fourth aspect of the present invention, there is provided a readable storage medium storing program instructions which, when read and executed by a computing device, cause the computing device to perform the visual object blurring method and the rendering method described above.
The visual object blurring method of the present invention blurs a visual object by attaching a blur sub-object to it: a blur sub-object is set on each visual object that needs to be blurred, and not on those that do not, so that one or more visual objects can be blurred selectively and in a customized way.
With the blur sub-object in place, dynamic blurring of the visual object can be achieved through it. Specifically, the pixel values of the region where the visual object is located are read from the frame buffer and used as the texture map of the blur sub-object, and this texture map is blurred to obtain the background texture map. The background texture map is then Alpha-blended with the foreground texture map of the visual object, so that the rendered visual object presents a blur effect. Because the texture map of the blur sub-object is generated from the pixel values of the region where the visual object is located, this method can blur the entire area of the visual object, which is visually more attractive and provides a better user experience.
Further, the visual objects of the present invention can be windows, window switching components, window preview components, workspace preview components, etc., so that the various visual elements managed by the window manager can all be blurred.
Further, the blurring method of the present invention can set a blur region mask for the visual object to be blurred, and the mask can define an arbitrary shape. Clipping the blur sub-object's texture map with the blur region mask yields a background texture map of the target shape, making the blur region of a single visual object customizable.
Based on the visual object blurring method, the present invention further provides a rendering method which, when rendering each frame of the desktop image, renders visual objects that do not need blurring with a conventional method and renders visual objects that do need blurring (i.e., those with blur sub-objects) with the blurring method, so that each rendered visual object presents a customized blur effect.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly, and that the above and other objects, features and advantages of the present invention may become more readily apparent, embodiments of the invention are described below.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a computing device 100, according to one embodiment of the invention;
FIG. 2 shows a flow diagram of a rendering method 200 according to one embodiment of the invention;
FIG. 3 shows a schematic diagram of a desktop containing four windows A, B, C and D, according to one embodiment of the invention;
FIG. 4 is a diagram illustrating a window stack corresponding to the desktop shown in FIG. 3;
FIG. 5 is a diagram illustrating an Actor tree structure corresponding to the desktop shown in FIG. 3;
FIG. 6 illustrates a flow diagram of a method 600 of obscuring visual objects according to one embodiment of the invention;
FIG. 7 is a diagram illustrating a process for blurring a visual object according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
To address the problems in the prior art, the present invention provides a visual object blurring method and a visual object rendering method, so as to achieve customized blurring of visual objects, give the visual objects on the desktop a more harmonious, attractive and modern appearance, and improve the user experience.
The visual object blurring method and rendering method of the present invention are executed in a computing device. The computing device may be any device having storage and computing capabilities; for example, it may be a personal computer such as a desktop or notebook computer, a machine with a higher hardware configuration such as a workstation or server, or a mobile terminal such as a mobile phone, tablet computer or smart wearable device, but it is not limited thereto.
FIG. 1 shows a schematic diagram of a computing device 100, according to one embodiment of the invention. It should be noted that the computing device 100 shown in fig. 1 is only an example, and in practice, the computing device for implementing the blurring and rendering method for visual objects of the present invention may be any type of device, and the hardware configuration thereof may be the same as or different from that of the computing device 100 shown in fig. 1. In practice, the computing device implementing the blurring and rendering method of the present invention may add or delete hardware components of the computing device 100 shown in fig. 1, and the present invention is not limited to the specific hardware configuration of the computing device.
As shown in FIG. 1, in a basic configuration 102, a computing device 100 typically includes a system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.
Depending on the desired configuration, the processor 104 may be any type of processor, including but not limited to: a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof. The processor 104 may include one or more levels of cache, such as a level one cache 110 and a level two cache 112, a processor core 114, and registers 116. The example processor core 114 may include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP core), or any combination thereof. An example memory controller 118 may be used with the processor 104, or in some implementations the memory controller 118 may be an internal part of the processor 104.
Depending on the desired configuration, system memory 106 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 106 may include an operating system 120, one or more applications 122, and program data 124. In some implementations, the application 122 can be arranged to execute instructions on an operating system with the program data 124 by the one or more processors 104.
Computing device 100 may also include an interface bus 140 that facilitates communication from various interface devices (e.g., output devices 142, peripheral interfaces 144, and communication devices 146) to the basic configuration 102 via the bus/interface controller 130. The example output device 142 includes a graphics processing unit 148 and an audio processing unit 150. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 152. Example peripheral interfaces 144 may include a serial interface controller 154 and a parallel interface controller 156, which may be configured to facilitate communication with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 158. An example communication device 146 may include a network controller 160, which may be arranged to facilitate communications with one or more other computing devices 162 over a network communication link via one or more communication ports 164.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, or program modules, and may include any information delivery media, such as carrier waves or other transport mechanisms, in a modulated data signal. A "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or direct-wired connection, and various wireless media such as acoustic, radio frequency (RF), microwave, infrared (IR), or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
In the computing device 100 according to the invention, the applications 122 include a window manager 128. The window manager 128 is an application for managing windows: it controls, for example, the appearance of windows and their display positions on the screen, and provides the user with ways to operate windows. In an embodiment of the present invention, the window manager 128 contains instructions for executing the rendering method 200 and the blurring method 600 of the present invention; these instructions can direct the processor 104, when rendering each frame of the desktop image, to render visual objects that do not need blurring with a conventional method and to render visual objects that do need blurring with the blurring method 600 of the present invention, so that each rendered visual object presents a customized blur effect.
It should be noted that although the window manager 128 executes the rendering method 200 and the blurring method 600, the present invention does not limit how the window manager 128 is developed. The window manager 128 may be a completely new window manager developed from scratch, or an improvement of an existing window manager such as Mutter in the GNOME desktop environment or KWin in the KDE desktop environment; that is, some code is modified and added in the existing window manager so that it supports the rendering method 200 and the blurring method 600 of the present invention.
According to one embodiment, the window manager 128 is an improvement of the Mutter window manager. Mutter is implemented on top of Clutter, a graphics library for hardware-accelerated user interfaces used to create fast, visually rich graphical user interfaces. Clutter in turn relies on cogl, a library that wraps OpenGL; to provide maximum compatibility, cogl exposes relatively limited functionality, especially where OpenGL extensions are required. Therefore, in order to enable the Mutter window manager to implement the rendering method 200 and the blurring method 600 of the present invention, the present invention makes the following improvements to Mutter:
1. cogl is modified to support OpenGL ES 2.0 functionality such as glCopyTexSubImage2D. Specifically, an API texture_2d_copy_sub_image, which copies the contents (pixel values) of a specified region of the frame buffer into the currently bound texture, may be added to cogl's texture-handling driver vtable (CoglDriverVtable).
In addition, the processing functions required by the present invention, such as blur functions and Alpha blending functions, are defined in the texture virtual function table cogl_texture_pixmap_x11_vtable (which includes a set of general-purpose OpenGL texture processing functions).
With texture_2d_copy_sub_image in place and the processing functions defined, cogl can take the already-drawn content from the current frame buffer as a texture map and then blur, composite, etc. that texture map, providing the functional basis for executing the rendering method 200 and the blurring method 600 of the present invention.
2. Clutter is modified so that the visual objects that need to be blurred are redrawn every frame.
Clutter is designed around the Actor as its basic unit: each visual object corresponds to one Actor. The visual objects include, for example but without limitation, windows and the window switching components, window preview components, workspace preview components, etc. provided by the window manager. Actors form a hierarchy: the top-level Actor is called Stage, and each Actor may have any number of child Actors.
Specifically, in order to enable Clutter to support the rendering method 200 and the blurring method 600 of the present invention, a special Actor class, called MetaBlurActor (the blur class), may be defined in Clutter. In addition, a visual object property _NET_WM_DEEPIN_BLUR_REGION_MASK (the blur region mask) is defined; its value is a bitmap indicating the blur region of the visual object, in which the pixel values of the blur region are 1.
Based on the blur class MetaBlurActor, a MetaBlurActor child object (hereinafter, the blur sub-object) can be set on any visual object Actor that needs to be blurred. This marks the visual object, so that the window manager can identify all visual objects that need blurring during the drawing stage of each frame and ensure they are redrawn every frame. Dynamic blurring of the corresponding visual object Actor is then achieved through the blur sub-object MetaBlurActor. The specific steps for blurring a visual object through its blur sub-object are detailed in the blurring method 600 below.
Based on the _NET_WM_DEEPIN_BLUR_REGION_MASK property, a blur region mask can be set for the visual object to be blurred, and the mask can define an arbitrary shape. Clipping the blur sub-object's texture map with the blur region mask yields a background texture map of the target shape, making the blur region of a single visual object customizable. The specific application of the blur region mask is described in detail below in the blurring method 600.
The above modifications to cogl and Clutter provide the basic functional support for the Mutter window manager to perform the rendering method 200 and the blurring method 600 of the present invention. That is, on the basis of these modifications, program instructions for executing the rendering method 200 and the blurring method 600 may be written, so that the Mutter window manager can execute the rendering method 200 and the blurring method 600 of the present invention.
The above describes the improvement process of the window manager 128 taking the Mutter window manager as an example. However, those skilled in the art will understand that the window manager 128 implementing the present invention may also be an improvement of an existing window manager other than Mutter, or a completely new window manager developed from scratch; the present invention does not limit how the window manager 128 is developed.
FIG. 2 shows a flow diagram of a rendering method 200 according to one embodiment of the invention. Method 200 is performed in a computing device (e.g., computing device 100 described above), and in particular, method 200 is performed by a window manager in the computing device. It should be noted that the rendering method 200 is used for rendering a frame of desktop image, and may be executed in a loop according to a preset frame rate. For example, if the frame rate is set to 60Hz, 60 frames of desktop images can be refreshed every second, and each frame of desktop image is rendered by the method 200.
As shown in fig. 2, the method 200 begins at step S210.
In step S210, the visual object that has sent the redraw request and the visual object having the blurred sub-object are taken as the visual objects to be redrawn.
According to one embodiment, the visual object is an object having a visual display effect within the window display area of the desktop (i.e., the area of the screen occupied by the maximized window). The visual objects include, for example, but are not limited to, windows, window switching components provided by a window manager, window preview components, workspace preview components, and the like.
FIG. 3 shows a schematic diagram of a desktop containing four windows A, B, C and D, each of which is a visual object. Note that, to keep the drawing and the description simple, FIG. 3 shows only window visual objects; other types of visual objects, such as the window switching component and the window preview component, are not shown.
The window manager usually manages the opened windows with a window stack: the currently active window (i.e., the window the user is currently operating) sits at the top of the stack, and the other windows are ordered by when they were last operated, so the more recently the user operated a window, the closer it is to the top of the stack. Multiple windows can be displayed on the desktop in a stacked fashion, with lower windows occluded and covered by upper ones. The window stack thus also represents the display hierarchy of the windows: the closer a window is to the top of the stack, the higher its display level. The window at the top of the stack is the currently active window; it has the highest display level, its content is displayed completely, and it is not occluded by any other window. In the desktop shown in FIG. 3, window C is the window the user is currently operating, i.e., the currently active window.
FIG. 4 shows the window stack corresponding to the desktop of FIG. 3. As shown in FIG. 4, the currently active window C is at the top of the stack, and the other windows A, B and D are arranged below it in the order in which the user last operated them.
In the window manager of the Mutter architecture, actors are employed to manage visual objects, each visual object corresponding to an Actor. The actors have a hierarchical relationship, and the name of the Actor at the top layer is Stage, which corresponds to the whole window display area. Each Actor may have any number of child actors.
FIG. 5 is a schematic diagram of the Actor tree corresponding to the window display area shown in FIG. 3. As shown in FIG. 5, windows A to D correspond to Actor_A to Actor_D, all of which are child Actors of the top-level Stage. In the desktop shown in FIG. 3, windows C and D are the visual objects that need to be blurred; accordingly, blur sub-objects MetaBlurActor_C and MetaBlurActor_D are set on Actor_C and Actor_D respectively, to achieve dynamic blurring of windows C and D.
The display state of the visual objects on the desktop can change along with the interaction of the user. When a user operates a visual object, such as dragging a window, adjusting the size of the window, switching a currently active window (changing the display level of the window), displaying/hiding a window preview component, and the like, the visual object correspondingly issues a redrawing request to request the window manager to redraw the visual object, so that the visual object presents a visual effect corresponding to the user operation.
According to one embodiment, a visual object's Actor may initiate a redraw request by sending a queue_redraw (or queue_redraw_with_clip) message to the window manager. The redraw request includes clip information, i.e., information about the region to be redrawn. The region to be redrawn is a rectangle, and its information may take the form (x, y, width, height), where x and y are the horizontal and vertical coordinates of the pixel at the upper-left corner of the rectangle, and width and height are its width and height. After receiving a redraw request containing clip information from a visual object, the window manager redraws the region to be redrawn of that visual object in the subsequent step S220. It should be noted that redrawing a child Actor may also cause its parent Actor to be redrawn.
In addition to the visual objects that have sent a redraw request, the window manager also identifies, during the drawing phase of each frame, all visual object Actors that have a blur sub-object (MetaBlurActor), and treats these visual objects as visual objects to be redrawn as well.
For example, in the desktop shown in fig. 3, when a user switches window A to be the currently active window, the Actor of window A initiates a redraw request based on the user operation, so window A is a visual object to be redrawn. In addition, window C has the blur sub-object MetaBlurActor_C and window D has the blur sub-object MetaBlurActor_D; that is, both windows C and D are visual objects to be blurred, so both are also visual objects to be redrawn.
After the visual objects to be redrawn are determined in step S210, step S220 is executed.
In step S220, the visual objects to be redrawn are rendered in bottom-to-top order of the display hierarchy. When a visual object to be redrawn has a blur sub-object, it is blurred using the visual object blurring method 600 of the present invention, so that the rendered visual object exhibits a blur effect.
The display hierarchy represents the stacking order of the visual objects in the window display area: the higher a visual object's display level, the less it is occluded by other visual objects and the more completely its content is shown. When the visual object is a window, the display level can be represented by a window stack, with the window at the top of the stack (i.e., the currently active window) having the highest display level.
According to an embodiment, during the rendering of step S220, each visual object has a corresponding off-screen frame buffer (offscreen framebuffer) that stores the visual object's material map, which holds the pixel values of the visual object's pixels. Each visual object can be drawn independently, with the drawing result written into its own material map. The material maps of the visual objects are then written into the corresponding positions of the frame buffer (framebuffer) in bottom-to-top order of the display hierarchy, completing the rendering of the current frame.
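The final compositing pass described above — copying each material map into the frame buffer bottom-to-top, so that upper windows overwrite lower ones where they overlap — can be sketched in pure Python. The function and field names are assumptions; a real window manager would perform this on the GPU:

```python
def composite(frame, layers):
    """frame: 2-D list of pixel values (the frame buffer).
    layers: bottom-to-top list of (x, y, material_map) entries, where
    material_map is a 2-D list. Later (upper) layers overwrite earlier ones."""
    for x, y, mat in layers:
        for dy, row in enumerate(mat):
            for dx, px in enumerate(row):
                frame[y + dy][x + dx] = px
    return frame

# Two 2x2 "windows" on a 3x4 frame; the upper layer is listed last.
frame = [[0] * 4 for _ in range(3)]
layers = [(0, 0, [[1, 1], [1, 1]]), (1, 1, [[2, 2], [2, 2]])]
result = composite(frame, layers)
```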
In step S220, the visual objects to be redrawn fall into two types: visual objects that do not need blurring (i.e., those without a blur sub-object) and visual objects that need blurring (i.e., those with a blur sub-object). The two types are drawn by different methods.
Visual objects that do not need blurring have no strict drawing order among themselves and can be drawn independently and in parallel.
Visual objects that need blurring, however, require dynamic blurring of their background, so their drawing must be completed in bottom-to-top order of the display hierarchy; that is, the drawing of such a visual object can start only after the visual objects below it (which form its background) have been completely drawn. During drawing, the visual object is blurred according to method 600 described below, so that it visually presents a blur effect.
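One way to express this scheduling constraint: non-blurred visual objects may draw in parallel, while each blurred visual object acts as a barrier that must wait until everything below it has been composited. A minimal sketch with assumed names (the patent does not prescribe this exact batching):

```python
def draw_order(actors):
    """actors: list of (name, needs_blur) pairs, ordered bottom-to-top.
    Returns a list of batches: actors within one batch may be drawn in
    parallel; batches must complete strictly in sequence, because a blurred
    actor samples the already-drawn pixels below it as its background."""
    batches, current = [], []
    for name, needs_blur in actors:
        if needs_blur:
            if current:
                batches.append(current)  # everything below must finish first
                current = []
            batches.append([name])       # the blurred actor draws alone
        else:
            current.append(name)         # no ordering constraint among these
    if current:
        batches.append(current)
    return batches

# The stack of fig. 3: windows C and D have blur sub-objects.
stack = [("Actor_A", False), ("Actor_B", False), ("Actor_C", True), ("Actor_D", True)]
```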
FIG. 6 illustrates a flow diagram of a method 600 for blurring a visual object according to one embodiment of the present invention. Method 600 is performed in a computing device (e.g., the computing device 100 described above); specifically, method 600 is performed by the window manager in the computing device to blur a single visual object. As shown in fig. 6, the method 600 begins at step S610.
In step S610, the pixel values of the region where the visual object is located are obtained from the frame buffer, and a material map of the visual object's blur sub-object is generated.
The frame buffer (framebuffer) stores the pixel values of all pixels on the screen; based on these stored pixel values, the corresponding image can be displayed on the screen.
According to one embodiment, in step S610, the region where the visual object is located, i.e., the screen region covered by the visual object, is determined. The pixel values of all pixels in that screen region are then obtained from the frame buffer and stored into the off-screen frame buffer (offscreen framebuffer) of the visual object's blur sub-object (MetaBlurActor) to generate the blur sub-object's material map.
According to an embodiment, the pixel values include the color values (R, G, B channel values) and transparency values (A channel values) of the pixels. Accordingly, the generated material map of the blur sub-object includes a color channel map (the R, G, B channel maps), which stores the color values of the pixels, and a transparency channel map (the A channel map), which stores the transparency values of the pixels.
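Splitting packed RGBA pixels into the four per-channel maps can be sketched as follows (a pure-Python illustration; in practice the material maps live in GPU textures):

```python
def split_channels(pixels):
    """pixels: 2-D list of (R, G, B, A) tuples.
    Returns a dict of four 2-D channel maps, one value per pixel per channel."""
    maps = {"R": [], "G": [], "B": [], "A": []}
    for row in pixels:
        for ch, i in (("R", 0), ("G", 1), ("B", 2), ("A", 3)):
            maps[ch].append([px[i] for px in row])
    return maps

# One row of two pixels: opaque red, half-transparent green.
maps = split_channels([[(255, 0, 0, 255), (0, 255, 0, 128)]])
```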
Subsequently, in step S620, the material map is blurred to obtain a background material map.
According to one embodiment, step S620 comprises two processing steps: blurring and cropping.
In the blurring step, the material map is blurred to obtain a blurred material map. Specifically, the color channel maps in the material map are blurred, for example by convolving them with a preset convolution kernel (the R, G, B channel maps are convolved separately), to obtain the blurred material map. The blurred material map comprises the blurred R, G, B channel maps.
It should be noted that the present invention does not limit the specific algorithm used to blur the color channel maps; the algorithm may be, for example, a Gaussian blur algorithm, a mean blur algorithm, or a median filtering algorithm, but is not limited thereto. Different blurring algorithms may use convolution kernels of different sizes and with different weights at each position.
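As an illustration of the convolution step, the following sketch applies a 3×3 mean-blur kernel (one of the algorithms named above) to a single channel map, clamping samples at the edges. The edge handling is an assumption; the patent does not specify it:

```python
def convolve_channel(channel, kernel):
    """channel: 2-D list of values; kernel: 2-D list of weights (odd size).
    Samples outside the channel are clamped to the nearest valid pixel."""
    h, w = len(channel), len(channel[0])
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    sy = min(max(y + ky - oy, 0), h - 1)  # clamp at the edges
                    sx = min(max(x + kx - ox, 0), w - 1)
                    acc += channel[sy][sx] * kernel[ky][kx]
            out[y][x] = acc
    return out

MEAN_3X3 = [[1 / 9] * 3 for _ in range(3)]

# A uniform channel stays uniform; an isolated bright pixel spreads out.
flat = [[90] * 3 for _ in range(3)]
blurred = convolve_channel(flat, MEAN_3X3)
spike = convolve_channel([[0, 0, 0], [0, 9, 0], [0, 0, 0]], MEAN_3X3)
```

A Gaussian blur would use the same loop with a different set of kernel weights.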
After the blurring step, the cropping step is executed. In the cropping step, the blur region mask of the visual object is obtained and blended with the blurred material map to obtain the background material map. Specifically, the blur region mask is blended with the transparency channel map (A channel map) of the material map. The background material map comprises the blended transparency channel map and the blurred material map.
As mentioned previously, the blur region mask is an attribute of the visual object. For a visual object that needs blurring, the blur region mask is a bitmap indicating the visual object's blur region; the blur region can be defined in any shape, and the pixel values inside the blur region are 1. According to an embodiment, an AND operation is performed on the pixel values at corresponding positions of the blur region mask and the transparency channel map, cutting out the zero-valued region and yielding a background material map with the target shape. In this way, the blur region of the visual object is customizable.
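Interpreting the AND operation as keeping the transparency value where the mask is 1 and zeroing it elsewhere, the cropping step can be sketched as:

```python
def apply_mask(alpha_map, mask):
    """alpha_map: 2-D list of A-channel values (0..255).
    mask: 2-D bitmap, 1 inside the blur region, 0 outside.
    Alpha survives only inside the mask, carving out the target shape."""
    return [
        [a if m == 1 else 0 for a, m in zip(arow, mrow)]
        for arow, mrow in zip(alpha_map, mask)
    ]

alpha = [[255, 255], [255, 128]]
mask = [[1, 0], [1, 1]]   # e.g. a corner cut away at the top-right
cropped = apply_mask(alpha, mask)
```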
Subsequently, in step S630, the background material map is written back to the corresponding region of the frame buffer. The "corresponding region" is the region where the visual object is located, as determined in step S610.
Subsequently, in step S640, the visual object is drawn to generate a foreground material map of the visual object.
In step S640, the visual object is drawn to obtain the pixel values of each of its pixels, and these pixel values are written into the visual object's off-screen frame buffer (offscreen framebuffer) to generate the visual object's foreground material map.
Subsequently, in step S650, the background material map and the foreground material map are Alpha-blended, so that the rendered visual object presents a blur effect.
In step S650, the visual object reads the background material map from the frame buffer, Alpha-blends it with the foreground material map stored in the visual object's off-screen frame buffer, and writes the blended material map back to the corresponding region of the frame buffer. In this way, the visual object appears blurred in the desktop image rendered from the data in the framebuffer.
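The Alpha blending referred to here is the standard "over" operation: each foreground pixel is weighted by its alpha and the background pixel by the remainder. A per-pixel sketch (the choice of output alpha is an assumption, since the patent does not specify it):

```python
def alpha_blend(fg, bg):
    """fg: foreground (R, G, B, A) pixel; bg: background (R, G, B, A) pixel;
    all channels in 0..255. Returns the blended pixel."""
    a = fg[3] / 255.0
    rgb = tuple(round(fg[i] * a + bg[i] * (1 - a)) for i in range(3))
    return rgb + (max(fg[3], bg[3]),)  # a simple choice for the output alpha

# A half-transparent white window pixel over a blurred dark background pixel:
pixel = alpha_blend((255, 255, 255, 128), (40, 40, 40, 255))
```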
Fig. 7 shows a schematic illustration of blurring window C of fig. 3. As shown in fig. 7, the frame buffer stores the RGBA channel values of each pixel of the screen.
In step S1, the screen region 710 covered by window C (Actor_C) is determined, and the pixel values of the pixels of region 710 in the frame buffer are copied into the off-screen frame buffer (offscreen framebuffer) of window C's blur sub-object (MetaBlurActor_C) to generate the material map of MetaBlurActor_C. The material map includes a color channel map (i.e., the R, G, B channel maps) and a transparency channel map (i.e., the A channel map).
Subsequently, in step S2, the material map of the blur sub-object MetaBlurActor_C is blurred and cropped. Specifically:
First, the color channel maps are blurred: a preset convolution kernel (for example, a Gaussian kernel) is used to convolve the R, G, B channel maps of the material map separately, generating a blurred material map comprising the three blurred R, G, B channel maps.
Then, the transparency channel map is cropped: an AND operation is performed between window C's blur region mask (shape mask) and the A channel map, i.e., on the pixel values at corresponding positions of the mask and the A channel map, yielding a cropped A channel map whose shape is defined by window C's blur region mask.
Through the blurring and cropping steps, the background material map of the blur sub-object MetaBlurActor_C is obtained. The background material map comprises the three blurred R, G, B channel maps (i.e., the blurred material map) and the cropped A channel map.
Subsequently, in step S3, the background material map is written back to region 710 of the frame buffer.
Subsequently, in step S4, window C is drawn to obtain the pixel value of each of its pixels, and the pixel values are written into window C's off-screen frame buffer (offscreen framebuffer) to generate window C's foreground material map.
Subsequently, in step S5, window C (Actor_C) reads the background material map from region 710 of the frame buffer and Alpha-blends the foreground material map with it.
Subsequently, in step S6, window C (Actor_C) writes the blended material map back into region 710 of the frame buffer. Thus, window C appears blurred in the next frame of the desktop image rendered from the frame buffer.
A11: a computing device, comprising:
at least one processor; and
a memory storing program instructions, the program instructions comprising a window manager;
the window manager, when read and executed by the processor, causes the computing device to perform the method of obscuring visual objects of any of claims 1-7 and the method of rendering of any of claims 8-10.
A12: a readable storage medium storing program instructions which, when read and executed by a computing device, cause the computing device to perform the method of blurring visual objects according to any one of claims 1-7 and the method of rendering according to any one of claims 8-10.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, USB flash drives, floppy disks, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the blurring method and the rendering method of the invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, readable media may comprise readable storage media and communication media. Readable storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with examples of this invention. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Additionally, some of the embodiments are described herein as a method or combination of method elements that can be implemented by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense with respect to the scope of the invention, as defined in the appended claims.

Claims (12)

1. A method of blurring a visual object, executed in a computing device, the visual object having a blur sub-object, the method comprising:
acquiring the pixel values of the region where the visual object is located from a frame buffer, and generating a material map of the blur sub-object of the visual object;
blurring the material map to obtain a background material map;
writing the background material map back to the corresponding region of the frame buffer;
drawing the visual object to obtain the pixel values of the pixels of the visual object, and writing the pixel values into an off-screen frame buffer of the visual object to generate a foreground material map of the visual object;
acquiring the background material map from the frame buffer, and Alpha-blending the background material map with the foreground material map;
and writing the blended material map back to the frame buffer, so that the rendered visual object presents a blur effect.
2. The method of claim 1, wherein the visual object comprises one of:
a window, a window switching component, a window preview component, and a workspace preview component.
3. The method of claim 1, wherein blurring the material map to obtain a background material map comprises:
blurring the material map to obtain a blurred material map;
and acquiring a blur region mask of the visual object, and blending the blur region mask with the blurred material map to obtain the background material map.
4. The method of claim 3, wherein the pixel values comprise color values and transparency values of pixels, and the material map comprises a color channel map and a transparency channel map;
and blurring the material map to obtain a background material map comprises:
blurring the color channel map to obtain a blurred material map;
and blending the blur region mask with the transparency channel map to obtain the background material map.
5. The method of claim 4, wherein blurring the color channel map comprises: convolving the color channel map with a preset convolution kernel.
6. The method of claim 4, wherein the blur region mask is a bitmap indicating the blur region, and the pixel values inside the blur region are 1;
and blending the blur region mask with the transparency channel map comprises: performing an AND operation on the pixel values at corresponding positions of the blur region mask and the transparency channel map.
7. The method of any of claims 1-6, performed by a window manager.
8. A rendering method, executed in a computing device, comprising:
taking visual objects that have issued a redraw request and visual objects having a blur sub-object as visual objects to be redrawn;
rendering the visual objects to be redrawn in bottom-to-top order of the display hierarchy, wherein, when a visual object to be redrawn has a blur sub-object, the visual object is blurred according to the method of any one of claims 1 to 7, so that the rendered visual object presents a blur effect.
9. The method of claim 8, wherein the redraw request includes region information of the visual object to be redrawn.
10. The method of claim 8 or 9, performed by a window manager.
11. A computing device, comprising:
at least one processor; and
a memory storing program instructions, the program instructions comprising a window manager;
the window manager, when read and executed by the processor, causes the computing device to perform the method of obscuring a visual object of any of claims 1-7 or the method of rendering of any of claims 8-10.
12. A readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the method of blurring visual objects according to any one of claims 1-7 or the method of rendering according to any one of claims 8-10.
CN201910683994.4A 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing equipment Active CN110457102B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910683994.4A CN110457102B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing equipment
CN202210614578.0A CN114924824B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910683994.4A CN110457102B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210614578.0A Division CN114924824B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing device

Publications (2)

Publication Number Publication Date
CN110457102A CN110457102A (en) 2019-11-15
CN110457102B true CN110457102B (en) 2022-07-08

Family

ID=68483598

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210614578.0A Active CN114924824B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing device
CN201910683994.4A Active CN110457102B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210614578.0A Active CN114924824B (en) 2019-07-26 2019-07-26 Visual object blurring method, visual object rendering method and computing device

Country Status (1)

Country Link
CN (2) CN114924824B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114924824B (en) * 2019-07-26 2023-11-14 武汉深之度科技有限公司 Visual object blurring method, visual object rendering method and computing device
CN111242838B (en) * 2020-01-09 2022-06-03 腾讯科技(深圳)有限公司 Blurred image rendering method and device, storage medium and electronic device
CN112199537A (en) * 2020-09-18 2021-01-08 杭州安恒信息技术股份有限公司 Visual image updating method and device, electronic device and storage medium
CN113791861B (en) * 2021-11-16 2022-03-01 北京鲸鲮信息系统技术有限公司 Window fuzzy transparent display method and device, electronic equipment and storage medium
CN117710502A (en) * 2023-12-12 2024-03-15 摩尔线程智能科技(北京)有限责任公司 Rendering method, rendering device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104103045A (en) * 2014-07-09 2014-10-15 广东欧珀移动通信有限公司 Gaussian blur processing method and system for terminal
CN105404438A (en) * 2014-08-13 2016-03-16 小米科技有限责任公司 Background fuzzy method and apparatus and terminal device
CN106570847A (en) * 2016-10-24 2017-04-19 广州酷狗计算机科技有限公司 Image processing method and image processing device
CN106993134A (en) * 2017-03-31 2017-07-28 努比亚技术有限公司 A kind of video generation device and method, terminal

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6348919B1 (en) * 1995-12-18 2002-02-19 3Dlabs Inc, Ltd. Graphics system with optimized use of unified local and frame buffers
US8495514B1 (en) * 2005-06-02 2013-07-23 Oracle America, Inc. Transparency assisted window focus and selection
US7970206B2 (en) * 2006-12-13 2011-06-28 Adobe Systems Incorporated Method and system for dynamic, luminance-based color contrasting in a region of interest in a graphic image
WO2016015255A1 (en) * 2014-07-30 2016-02-04 华为技术有限公司 Method and device for setting background of ui control and terminal
CN104156922A (en) * 2014-08-12 2014-11-19 广州市久邦数码科技有限公司 Image processing method and image processing system
CN105407261A (en) * 2014-08-15 2016-03-16 索尼公司 Image processing device and method, and electronic equipment
CN105138317B (en) * 2015-07-24 2018-11-13 安一恒通(北京)科技有限公司 Window display processing method and device for terminal device
US11055830B2 (en) * 2016-10-31 2021-07-06 Victoria Link Limited Rendering process and system
CN114924824B (en) * 2019-07-26 2023-11-14 武汉深之度科技有限公司 Visual object blurring method, visual object rendering method and computing device


Also Published As

Publication number Publication date
CN110457102A (en) 2019-11-15
CN114924824B (en) 2023-11-14
CN114924824A (en) 2022-08-19

Similar Documents

Publication Publication Date Title
CN110457102B (en) Visual object blurring method, visual object rendering method and computing equipment
US9710883B2 (en) Flexible control in resizing of visual displays
US9373308B2 (en) Multi-viewport display of multi-resolution hierarchical image
US10157593B2 (en) Cross-platform rendering engine
US9142044B2 (en) Apparatus, systems and methods for layout of scene graphs using node bounding areas
US8629886B2 (en) Layer combination in a surface composition system
US20190196774A1 (en) Manipulating shared screen content
US8723887B2 (en) Methods and systems for per pixel alpha-blending of a parent window and a portion of a background image
US7911465B2 (en) Techniques for displaying information for collection hierarchies
WO2017166210A1 (en) Method for processing application program and mobile device
US8856682B2 (en) Displaying a user interface in a dedicated display area
EP1462936A2 (en) Visual and scene graph interfaces
KR20060105421A (en) Dynamic window anatomy
US8423883B1 (en) Systems and methods of creating and editing electronic content including multiple types of graphics
US8358876B1 (en) System and method for content aware in place translations in images
US10169307B2 (en) Method and system for the use of adjustment handles to facilitate dynamic layout editing
JPH1039850A (en) Method and system for realizing light transmissive window
JP4742051B2 (en) Spatial and temporal motion blur effect generation method
CN104038807A (en) Layer mixing method and device based on open graphics library (OpenGL)
EP1316064B1 (en) Scaling images
US20130063482A1 (en) Application programming interface for a bitmap composition engine
CN111625237B (en) Character vision deformation method, system and medium
Hoddie et al. Drawing Graphics with Poco
CN116301506A (en) Content display method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant