CN105404438A - Background fuzzy method and apparatus and terminal device - Google Patents


Info

Publication number
CN105404438A
CN105404438A (application CN201410405987.5A)
Authority
CN
China
Prior art keywords
window
memory
image
background
windows
Prior art date
Legal status
Granted
Application number
CN201410405987.5A
Other languages
Chinese (zh)
Other versions
CN105404438B (en)
Inventor
朱才
李伟星
王亚辉
Current Assignee
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201410405987.5A
Publication of CN105404438A
Application granted
Publication of CN105404438B
Legal status: Active


Classifications

  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a background blurring method and apparatus and a terminal device. The method comprises: obtaining a plurality of windows contained in an application; determining a background window and a foreground window among the windows; blurring the background window; and superimposing the foreground window onto the blurred background window to obtain the user interface of the application, and displaying that user interface. Because the background window is blurred, the contrast between the background and the foreground of the user interface is increased; the foreground stands out and is more readable, so that foreground information can be presented to users clearly and effectively, improving the user experience.

Description

Background blurring method and device and terminal equipment
Technical Field
The present disclosure relates to the technical field of terminal devices, and in particular, to a background blurring method and apparatus, and a terminal device.
Background
As terminal devices grow more intelligent, they host more and more applications (APPs), and users enjoy the convenience this intelligence brings. At the same time, user-interface display effects are becoming richer. A display interface is usually formed by overlapping a plurality of windows, including foreground windows and background windows, and with the development of image processing technology both are rendered in ever finer detail.
Users generally care most about foreground information. At present, however, the background and foreground of a user interface are displayed with the same effect: the contrast between them is very small, the foreground does not stand out, its readability is poor, and foreground information cannot be presented to the user clearly and effectively, resulting in a poor user experience.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a background blurring method, a background blurring apparatus, and a terminal device.
According to a first aspect of embodiments of the present disclosure, there is provided a background blurring method, including:
acquiring a plurality of windows included in an application program;
determining a background window and a foreground window from the plurality of windows;
blurring the background window; and
superimposing the foreground window onto the blurred background window to obtain a user interface of the application program, and displaying the user interface of the application program.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the determining a background window and a foreground window from the multiple windows includes:
detecting whether the plurality of windows carry first identification information or not, determining the window carrying the first identification information as a background window, and determining the window not carrying the first identification information as a foreground window; or,
and detecting whether the plurality of windows carry second identification information, and, if one window is detected to carry the second identification information, determining the windows behind that window in the window sequence formed by the plurality of windows as background windows, and determining the other windows as foreground windows.
With reference to the first aspect, in a second possible implementation manner of the first aspect, the step of performing a blurring process on the background window includes:
drawing the background window to a first memory;
and drawing the image in the first memory into a second memory after blurring it in the x direction, and then drawing the image in the second memory back into the first memory after blurring it in the y direction.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the step of blurring the image in the first memory in the x direction includes:
sequentially acquiring each pixel point on the image in the first memory;
executing for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the x direction; and after the obtained color values of the pixels are weighted and averaged, obtaining the color value of the current pixel.
With reference to the second possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the step of blurring the image in the second memory in the y direction includes:
sequentially acquiring each pixel point on the image in the second memory;
executing for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the y direction; and after the obtained color values of the pixels are weighted and averaged, obtaining the color value of the current pixel.
With reference to the second possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, after the drawing the background window into the first memory, before the step of drawing the image in the first memory into the second memory after the x-direction blurring processing, the method further includes:
reducing the image in the first memory by a set reduction factor;
the step of rendering the image in the first memory into a second memory after the image in the first memory is subjected to the blurring processing in the x direction includes:
and performing blurring processing on the reduced image in the first memory in the x direction and then drawing the image in a second memory.
With reference to the fifth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, the step of overlapping the foreground window and the blurred background window includes:
drawing the foreground window into a frame buffer; and
drawing the blurred background window into the frame buffer at a set magnification factor, wherein the set reduction factor is equal to the set magnification factor.
According to a second aspect of embodiments of the present disclosure, there is provided a background blurring apparatus including:
an acquisition unit configured to acquire a plurality of windows included in an application program;
a determining unit configured to determine a background window and a foreground window from the plurality of windows;
the processing unit is used for carrying out fuzzy processing on the background window;
and the display unit is used for overlapping the foreground window and the background window subjected to the fuzzy processing to obtain a user interface of the application program and displaying the user interface of the application program.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the determining unit includes:
the first detection subunit is configured to detect whether the multiple windows carry first identification information, determine a window carrying the first identification information as a background window, and determine a window not carrying the first identification information as a foreground window; or,
and the second detection subunit is configured to detect whether the multiple windows carry second identification information, determine, if it is detected that one window carries the second identification information, a window subsequent to the one window in a window sequence formed by the multiple windows as a background window, and determine, as a foreground window, other windows in the multiple windows except the background window.
With reference to the second aspect, in a second possible implementation manner of the second aspect, the processing unit includes:
a first drawing subunit configured to draw the background window to a first memory; and
a second drawing subunit configured to draw the image in the first memory into a second memory after blurring it in the x direction, and to draw the image in the second memory back into the first memory after blurring it in the y direction.
With reference to the second possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the second drawing subunit includes:
the first obtaining subunit is used for sequentially obtaining each pixel point on the image in the first memory;
a first execution subunit, configured to execute, for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the x direction; and after the obtained color values of the pixels are weighted and averaged, obtaining the color value of the current pixel.
With reference to the second possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the second drawing subunit includes:
the second obtaining subunit is used for sequentially obtaining each pixel point on the image in the second memory;
a second execution subunit, configured to execute, for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the y direction; and after the obtained color values of the pixels are weighted and averaged, obtaining the color value of the current pixel.
With reference to the second possible implementation manner of the second aspect, in a fifth possible implementation manner of the second aspect, the processing unit further includes:
a reduction subunit, configured to reduce the image in the first memory by a set reduction factor;
the second drawing subunit is further configured to perform blurring processing on the reduced image in the first memory in the x direction, and then draw the image in the second memory.
With reference to the fifth possible implementation manner of the second aspect, in a sixth possible implementation manner of the second aspect, the display unit includes:
a third drawing subunit configured to draw the foreground window into a frame buffer, and to draw the blurred background window into the frame buffer at a set magnification factor, wherein the set reduction factor is equal to the set magnification factor.
According to a third aspect of the embodiments of the present disclosure, there is provided a terminal device, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a plurality of windows included in an application program;
determining a background window and a foreground window from the plurality of windows;
blurring the background window; and
superimposing the foreground window onto the blurred background window to obtain a user interface of the application program, and displaying the user interface of the application program.
The technical solution provided by the embodiments of the disclosure can have the following beneficial effects: in the process of drawing the user interface of an application program, the terminal device adds a step of blurring the background window, then superimposes the foreground window onto the blurred background window to obtain the user interface of the application program, and displays it. Blurring the background window increases the contrast between the background and the foreground of the user interface, so the foreground stands out, is more readable, and foreground information is presented to the user clearly and effectively, improving the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart illustrating a background blurring method according to an example embodiment.
FIG. 2 is a flow chart illustrating another background blurring method according to an example embodiment.
FIG. 3 is a block diagram illustrating a background blurring apparatus according to an example embodiment.
FIG. 4 is a block diagram illustrating a first type of processing unit in accordance with an example embodiment.
FIG. 5 is a block diagram illustrating a second type of processing unit, according to an example embodiment.
FIG. 6 illustrates a block diagram of a terminal device in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a background blurring method according to an exemplary embodiment. As shown in Fig. 1, the method is used in a terminal device and includes the following steps.
In step S11, a plurality of windows included in the application program are acquired.
Generally, a plurality of application programs are installed on a terminal device. When a user wants to open an application program or select one of its function options, and the terminal device has a touch screen, the user can tap the application's icon directly or with a stylus; if the terminal device is connected to external devices such as a mouse or keyboard, those devices can be used to click the icon.
After the terminal device detects the click, it needs to display the user interface of the corresponding application program; if the application program has background blurring enabled, the user interface needs to achieve the background-blur effect. Since the application generally comprises a plurality of windows, the terminal device first acquires these windows, each of which can be understood as a picture.
In step S12, a background window and a foreground window are determined from the plurality of windows.
The windows of an application program generally include foreground windows and background windows, each category containing at least one window. After the terminal device obtains the windows included in the application program, it needs to determine which are foreground windows and which are background windows.
In step S13, the background window is subjected to a blurring process.
In step S14, the foreground window and the background window subjected to the blurring process are superimposed to obtain a user interface of the application, and the user interface of the application is displayed.
In the related art, when the terminal device displays the user interface of an application program and background blurring is not needed, the foreground and background windows can be superimposed directly after S12 to obtain the user interface. In this embodiment, because the background needs to be blurred, S13 is added after S12: in the process of drawing the user interface, the terminal device blurs the background window, then superimposes the foreground window onto the blurred background window to obtain the user interface of the application program, and displays it.
Optionally, determining the background window and the foreground window from the plurality of windows in S12 can be implemented in either of the following two ways.
in the first mode, whether the plurality of windows carry first identification information is detected, the window carrying the first identification information is determined as a background window, and the window not carrying the first identification information is determined as a foreground window.
The application program can use the first identification information to distinguish background windows from foreground windows among the plurality of windows: the first identification information is added to each background window and omitted from foreground windows. Correspondingly, the terminal device can determine the foreground and background windows by detecting whether each window carries the first identification information.
The first identification information may be set according to actual needs; it may be, for example, selected letters, numbers, or an existing identifier.
In the second mode, it is detected whether the plurality of windows carry second identification information; if one window is detected to carry the second identification information, the windows behind that window in the window sequence formed by the plurality of windows are determined as background windows, and the other windows are determined as foreground windows.
The application program may instead distinguish the background and foreground windows with the second identification information: it is added to a single window, the windows behind that window in the window sequence formed by the plurality of windows are background windows, and the remaining windows are foreground windows. Correspondingly, the terminal device can determine the foreground and background windows by detecting whether any window carries the second identification information.
The second identification information may likewise be set according to actual needs; it may be, for example, selected letters, numbers, or an existing identifier.
Only two methods for determining the background and foreground windows from a plurality of windows are listed above; of course, many other methods may be adopted, and they are not described in detail here.
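As an illustration, the two detection modes described above can be sketched as follows. `Window`, `has_first_id`, and `has_second_id` are hypothetical names; the patent does not fix the concrete form of the identification information.

```python
from dataclasses import dataclass

@dataclass
class Window:
    name: str
    has_first_id: bool = False   # mode 1: per-window background marker
    has_second_id: bool = False  # mode 2: "blur everything behind me" marker

def split_by_first_id(windows):
    """Mode 1: every window carrying the first identifier is background."""
    foreground = [w for w in windows if not w.has_first_id]
    background = [w for w in windows if w.has_first_id]
    return foreground, background

def split_by_second_id(windows):
    """Mode 2: windows behind the marked window (later in the z-ordered
    sequence) are background; the marked window and those in front of it
    remain foreground."""
    for i, w in enumerate(windows):
        if w.has_second_id:
            return windows[: i + 1], windows[i + 1 :]
    return list(windows), []  # no marker found: everything is foreground
```

With five windows and the marker on the third, mode 2 yields windows 1-3 as foreground and 4-5 as background, matching the example given later in the description.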
Optionally, the step of blurring the background window in S13 includes:
drawing the background window to a first memory;
and after the image in the first memory is blurred in the x direction, it is drawn into a second memory; then the image in the second memory is blurred in the y direction and drawn back into the first memory.
A pixel is the smallest unit of an image; an image is formed by pixels arranged in sequence and can be traversed in the x direction and the y direction. The background window can therefore first be drawn into the selected first memory, and the image in the first memory then blurred in the x direction and in the y direction, which realizes the blurring of the background window. The second memory may be selected according to actual needs; it may be the same as or different from the first memory in size and location.
The step of blurring the image in the first memory in the x direction may include: sequentially acquiring each pixel point on the image in the first memory; executing for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the x direction; and after the obtained color values of the pixels are weighted and averaged, the color value of the current pixel is obtained.
Each pixel of the image in the first memory can be acquired in turn, treated as the current pixel, and its color value recomputed. For example, the color values of 5 pixels around the current pixel in the x direction may be weighted-averaged; the weights may be determined by distance from the current pixel, larger for nearby pixels and smaller for distant ones. The weighted average becomes the new color value of the current pixel. Recomputing the color value of every pixel in this way blurs the image in the first memory, and the processed image is stored in the second memory to facilitate the subsequent blurring in the y direction.
The step of blurring the image in the second memory in the y direction includes: sequentially acquiring each pixel point on the image in the second memory; executing for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the y direction; and after the obtained color values of the pixels are weighted and averaged, the color value of the current pixel is obtained.
The method for blurring the image in the second memory in the y direction may refer to the method for blurring the image in the first memory in the x direction.
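A minimal sketch of the separable two-pass blur described above, assuming a grayscale image stored as nested lists. The patent only requires a weighted average over pixels within a set distance; the symmetric binomial kernel and the border handling below are assumptions.

```python
def blur_1d(image, kernel, horizontal):
    """Weighted-average blur of a 2-D grayscale image along one axis.
    `kernel` is a symmetric list of weights centred on the current pixel;
    weights falling outside the image are dropped and the rest renormalized."""
    h, w = len(image), len(image[0])
    r = len(kernel) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, weight_sum = 0.0, 0.0
            for k, wt in enumerate(kernel):
                off = k - r
                xx = x + off if horizontal else x
                yy = y if horizontal else y + off
                if 0 <= xx < w and 0 <= yy < h:  # clamp at the borders
                    total += wt * image[yy][xx]
                    weight_sum += wt
            out[y][x] = total / weight_sum
    return out

def blur_background(image, kernel=(1, 4, 6, 4, 1)):
    # x pass into the "second memory", then y pass back into the "first memory"
    tmp = blur_1d(image, kernel, horizontal=True)
    return blur_1d(tmp, kernel, horizontal=False)
```

Running the two 1-D passes in sequence is equivalent to one 2-D weighted average but touches far fewer neighbours per pixel, which is the efficiency point the description makes later.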
Optionally, after the background window is drawn into the first memory and before the image in the first memory is blurred in the x direction and drawn into the second memory, the method further includes: reducing the image in the first memory by a set reduction factor.
After the image in the first memory is subjected to the blurring processing in the x direction, the image is drawn into a second memory, and the method comprises the following steps:
and performing blurring processing on the reduced image in the first memory in the x direction and then drawing the image in the second memory.
After the background window is drawn into the first memory, a reduction factor may be set to shrink the image in the first memory, and the reduced image is then blurred in the x direction. The reduction factor may be set according to actual needs, for example 1.5, 2, or 2.5.
Shrinking the image in the first memory lowers its resolution, so fewer pixels need to be processed and the blurring can be performed more conveniently.
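The optional pre-blur reduction might be sketched as below. Nearest-neighbour sampling is an assumption; the patent does not specify a resampling method.

```python
def downscale(image, factor):
    """Reduce a 2-D image by `factor` using nearest-neighbour sampling.
    A factor of 2 keeps every second pixel, quartering the work the
    subsequent blur passes must do."""
    h, w = len(image), len(image[0])
    nh, nw = max(1, int(h / factor)), max(1, int(w / factor))
    return [[image[int(y * factor)][int(x * factor)] for x in range(nw)]
            for y in range(nh)]
```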
Optionally, the step of superimposing the foreground window and the blurred background window in S14 includes:
drawing the foreground window into a frame buffer; and
drawing the blurred background window into the frame buffer at a set magnification factor, wherein the set reduction factor is equal to the set magnification factor.
In this embodiment, since the image in the first memory was reduced by the set reduction factor, the blurred background window must be drawn into the frame buffer at the set magnification factor to ensure it can be displayed full screen, thereby realizing the background blur of the user interface.
The set magnification factor can be chosen according to actual needs, for example 1, 2, 3, 4, 5, or 6.5; the set reduction factor needs to equal the set magnification factor so that full-screen display of the background window is guaranteed.
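The matching enlargement step might look as follows; again, nearest-neighbour sampling is an assumption. Pairing this with an equal reduction factor restores the image to its original full-screen dimensions.

```python
def upscale(image, factor):
    """Enlarge a 2-D image by `factor` with nearest-neighbour sampling.
    Applying upscale(downscale(img, f), f) returns an image of the
    original size, which is why the two factors must be equal."""
    h, w = len(image), len(image[0])
    nh, nw = int(h * factor), int(w * factor)
    return [[image[min(h - 1, int(y / factor))][min(w - 1, int(x / factor))]
             for x in range(nw)] for y in range(nh)]
```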
Fig. 2 is a flowchart illustrating another background blurring method for a user interface according to an exemplary embodiment. As shown in Fig. 2, the method is used in a terminal device. Assuming the terminal device runs the Android system, the SurfaceFlinger module is responsible for displaying the user interface of an application program, and the execution subject of the method may be the SurfaceFlinger module. The method includes the following steps.
In step S21, a plurality of windows included in the application program are acquired.
In step S22, it is detected whether the plurality of windows carry the second identification information.
In step S23, if one window is detected to carry the second identification information, the windows following that window in the window sequence formed by the plurality of windows are determined as background windows, and the other windows are determined as foreground windows.
Assume that the application includes 5 windows forming a window queue numbered 1, 2, 3, 4, 5, and that the 3rd window carries second identification information such as BLUR_BEHIND.
After detecting that the 3rd window carries BLUR_BEHIND, SurfaceFlinger determines that the 1st, 2nd, and 3rd windows are foreground windows and the 4th and 5th windows are background windows.
In step S24, the background window is drawn to the first memory.
Drawing the background window into the first memory may be implemented using OpenGL (the Open Graphics Library).
In step S25, the image in the first memory is reduced by the set reduction factor.
In step S26, sequentially acquiring each pixel point on the image in the first memory; executing for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the x direction; after the color values of the acquired pixel points are weighted and averaged, the color value of the current pixel point is obtained; and drawing the processed image in the first memory to a second memory.
In step S27, sequentially acquiring each pixel point on the image in the second memory; executing for each pixel: acquiring a pixel point which is a set distance away from a current pixel point in the y direction; after the color values of the acquired pixel points are weighted and averaged, the color value of the current pixel point is obtained; and drawing the processed image in the second memory to the first memory.
Performing the blurring independently in the x direction and then the y direction reduces the complexity of the blur, saving processing time and improving processing efficiency.
In step S28, the foreground window is drawn into the frame buffer, and the background window in the first memory is drawn into the frame buffer at the set magnification factor, so as to obtain the user interface of the application program, which is then displayed.
The background blurring of the user interface is thus realized by first shrinking the image in the first memory, blurring it in the x and y directions, and then magnifying it onto the display screen. The SurfaceFlinger module already draws the windows of an application program into the frame buffer to composite them onto the display screen; this embodiment merely blurs the background window inside SurfaceFlinger before the windows are drawn into the frame buffer. In other words, only one blurring pass is added, so the impact on terminal-device performance is small, time is saved compared with the related art, and the user experience is improved.
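The overall ordering of steps S24-S28 can be sketched as below. The drawing and compositing calls are stand-ins passed as functions (in reality this work happens inside SurfaceFlinger via OpenGL); only the sequence of steps is taken from the text, and the background is appended first so the foreground ends up on top.

```python
def compose_ui(windows, is_background, shrink, blur, enlarge, factor):
    """Sequence sketch: background windows are shrunk (S25), blurred in
    two passes (S26-S27), re-enlarged by the same factor, and drawn into
    the frame buffer together with the unblurred foreground (S28)."""
    framebuffer = []
    for w in windows:
        if is_background(w):
            small = shrink(w, factor)                     # S25
            blurred = blur(small)                         # S26-S27
            framebuffer.append(enlarge(blurred, factor))  # S28: magnify back
    for w in windows:
        if not is_background(w):
            framebuffer.append(w)  # foreground drawn without blurring
    return framebuffer
```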
Fig. 3 is a block diagram illustrating a background blurring apparatus for a user interface according to an example embodiment. Referring to Fig. 3, the apparatus includes an acquisition unit 31, a determining unit 32, a processing unit 33, and a display unit 34.
The acquisition unit 31 is configured to acquire a plurality of windows included in an application program.
The determining unit 32 is configured to determine a background window and a foreground window from a plurality of windows.
The processing unit 33 is configured to blur the background window.
The display unit 34 is configured to superimpose the foreground window onto the blurred background window to obtain the user interface of the application and to display it.
In this scheme, a step of blurring the background window is added while the terminal device draws the user interface of an application program; the foreground window is then superimposed onto the blurred background window to obtain the user interface of the application program, which is displayed. This increases the contrast between the background and the foreground, making the foreground stand out and improving its readability and the user experience.
Optionally, the determining unit 32 includes a first detecting subunit or a second detecting subunit.
The first detection subunit is configured to detect whether the plurality of windows carry first identification information, determine a window carrying the first identification information as a background window, and determine a window not carrying the first identification information as a foreground window.
The second detecting subunit is configured to detect whether the plurality of windows carry second identification information and, if one window is detected to carry the second identification information, to determine the windows following that window in the window sequence formed by the plurality of windows as background windows and the remaining windows as foreground windows.
FIG. 4 is a block diagram illustrating a processing unit in accordance with an exemplary embodiment. Referring to fig. 4, the processing unit 33 includes a first drawing sub-unit 331 and a second drawing sub-unit 332. Wherein:
the first drawing subunit 331 is configured to draw the background window to the first memory.
The second drawing subunit 332 is configured to blur the image in the first memory in the x direction and draw the result into the second memory, and then blur the image in the second memory in the y direction and draw the result back into the first memory.
Optionally, the second drawing sub-unit 332 includes a first obtaining sub-unit and a first executing sub-unit.
The first obtaining subunit is configured to sequentially obtain each pixel point on the image in the first memory.
The first execution subunit is configured to execute, for each pixel point: acquire the pixel points located at set distances from the current pixel point in the x direction, and compute a weighted average of the acquired color values to obtain the color value of the current pixel point.
Optionally, the second drawing sub-unit 332 includes a second obtaining sub-unit and a second executing sub-unit.
The second obtaining subunit is configured to sequentially obtain each pixel point on the image in the second memory.
The second execution subunit is configured to execute, for each pixel point: acquire the pixel points located at set distances from the current pixel point in the y direction, and compute a weighted average of the acquired color values to obtain the color value of the current pixel point.
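The two execution subunits together form a separable two-pass blur: each pass averages neighbors along one axis only. A minimal sketch, assuming grayscale images stored as lists of rows and example offsets and binomial weights (the disclosure does not specify the set distances or weighting):

```python
# Two-pass separable blur: blur rows (x direction, first memory -> second
# memory), then columns (y direction, second memory -> first memory).
# Offsets and weights below are illustrative choices, not from the patent.

OFFSETS = (-2, -1, 0, 1, 2)   # set distances from the current pixel point
WEIGHTS = (1, 4, 6, 4, 1)     # binomial weights; normalized by their sum

def blur_x(src):
    h, w = len(src), len(src[0])
    dst = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = weight_sum = 0
            for off, wt in zip(OFFSETS, WEIGHTS):
                nx = min(max(x + off, 0), w - 1)  # clamp at the image edge
                total += src[y][nx] * wt
                weight_sum += wt
            dst[y][x] = total // weight_sum       # weighted average
    return dst

def blur_y(src):
    # Blurring columns == transpose, blur rows, transpose back.
    transposed = [list(col) for col in zip(*src)]
    return [list(col) for col in zip(*blur_x(transposed))]

def blur(image):
    return blur_y(blur_x(image))
```

Because the kernel is separable, the cost per pixel is proportional to the kernel width rather than its area, which is the usual motivation for splitting the blur into x and y passes.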
FIG. 5 is a block diagram illustrating a processing unit in accordance with an exemplary embodiment. Referring to fig. 5, the processing unit 33 further includes a reduction subunit 333.
The reduction subunit 333 is configured to reduce the image in the first memory by a set reduction factor.
The second drawing subunit 332 is further configured to blur the reduced image in the first memory in the x direction and then draw the result into the second memory.
Optionally, the display unit 34 includes a third drawing subunit.
The third drawing subunit is configured to draw the foreground window into the frame buffer, and to draw the blurred background window into the frame buffer at a set magnification factor, wherein the set reduction factor is equal to the set magnification factor.
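A sketch of this reduce-then-magnify pairing, under illustrative assumptions (nearest-neighbor scaling, a frame buffer modeled as a 2D list, and `None` marking transparent foreground pixels; none of these choices are specified by the disclosure):

```python
# Reducing the background before blurring makes the blur cheaper, and
# magnifying it back by the same factor strengthens the apparent blur.

FACTOR = 2  # the set reduction factor equals the set magnification factor

def downscale(img, factor=FACTOR):
    """Reduce by keeping every factor-th pixel (nearest neighbor)."""
    return [row[::factor] for row in img[::factor]]

def upscale(img, factor=FACTOR):
    """Magnify by repeating each pixel factor times in both directions."""
    return [[v for v in row for _ in range(factor)]
            for row in img for _ in range(factor)]

def compose(foreground, small_background, factor=FACTOR):
    """Draw the magnified background into a frame-buffer-like grid, then
    overwrite it with opaque foreground pixels (None = transparent)."""
    frame = upscale(small_background, factor)
    for y, row in enumerate(foreground):
        for x, v in enumerate(row):
            if v is not None:
                frame[y][x] = v
    return frame
```

Since the reduction and magnification factors match, the background returns to the foreground's resolution before compositing, so the two layers align pixel for pixel.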
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 6 is a block diagram illustrating an apparatus 800 for background blurring according to an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 6, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls the overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 may include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 806 provides power to the various components of the device 800. The power component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer-readable storage medium having instructions therein which, when executed by a processor of a terminal device, enable the terminal device to perform a background blurring method, the method comprising:
acquiring a plurality of windows included in an application program;
determining a background window and a foreground window from the plurality of windows;
blurring the background window;
and superimposing the foreground window and the blurred background window to obtain a user interface of the application program, and displaying the user interface of the application program.
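Read as a whole, these are the four steps of the claimed flow. A minimal end-to-end sketch, where the `is_background`, `blur`, and `superimpose` callables are hypothetical stand-ins supplied by the caller rather than anything specified by this disclosure:

```python
# Obtain the application's windows, split them into background and
# foreground, blur only the background, and superimpose the two layers
# to form the user interface.

def build_user_interface(windows, is_background, blur, superimpose):
    background = [w for w in windows if is_background(w)]
    foreground = [w for w in windows if not is_background(w)]
    blurred = [blur(w) for w in background]
    return superimpose(foreground, blurred)
```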
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (15)

1. A background blurring method, comprising:
acquiring a plurality of windows included in an application program;
determining a background window and a foreground window from the plurality of windows;
blurring the background window;
and superimposing the foreground window and the blurred background window to obtain a user interface of the application program, and displaying the user interface of the application program.
2. The method of claim 1, wherein the step of determining a background window and a foreground window from the plurality of windows comprises:
detecting whether each of the plurality of windows carries first identification information, determining a window carrying the first identification information as a background window, and determining a window not carrying the first identification information as a foreground window; or,
detecting whether the plurality of windows carry second identification information, and if one window is detected to carry the second identification information, determining each window that follows the one window in a window sequence formed by the plurality of windows as a background window, and determining the other windows among the plurality of windows as foreground windows.
3. The method of claim 1, wherein the step of blurring the background window comprises:
drawing the background window to a first memory;
blurring the image in the first memory in the x direction and drawing the result into a second memory, and then blurring the image in the second memory in the y direction and drawing the result back into the first memory.
4. The method of claim 3, wherein blurring the image in the first memory in the x-direction comprises:
sequentially acquiring each pixel point on the image in the first memory;
executing, for each pixel point: acquiring the pixel points located at set distances from the current pixel point in the x direction; and computing a weighted average of the acquired color values to obtain the color value of the current pixel point.
5. The method of claim 3, wherein the step of blurring the image in the second memory in the y-direction comprises:
sequentially acquiring each pixel point on the image in the second memory;
executing, for each pixel point: acquiring the pixel points located at set distances from the current pixel point in the y direction; and computing a weighted average of the acquired color values to obtain the color value of the current pixel point.
6. The method of claim 3, wherein after the step of drawing the background window into the first memory and before the step of blurring the image in the first memory in the x direction and drawing it into the second memory, the method further comprises:
reducing the image in the first memory by a set reduction factor;
wherein the step of blurring the image in the first memory in the x direction and drawing it into the second memory comprises:
blurring the reduced image in the first memory in the x direction and then drawing the result into the second memory.
7. The method of claim 6, wherein the step of superimposing the foreground window and the blurred background window comprises:
drawing the foreground window into a frame buffer; and
drawing the blurred background window into the frame buffer at a set magnification factor, wherein the set reduction factor is equal to the set magnification factor.
8. A background blurring apparatus, comprising:
an acquisition unit configured to acquire a plurality of windows included in an application program;
a determining unit configured to determine a background window and a foreground window from the plurality of windows;
a processing unit configured to blur the background window;
and a display unit configured to superimpose the foreground window and the blurred background window to obtain a user interface of the application program, and to display the user interface of the application program.
9. The apparatus of claim 8, wherein the determining unit comprises:
a first detection subunit configured to detect whether each of the plurality of windows carries first identification information, determine a window carrying the first identification information as a background window, and determine a window not carrying the first identification information as a foreground window; or,
a second detection subunit configured to detect whether the plurality of windows carry second identification information, and if one window is detected to carry the second identification information, determine each window that follows the one window in a window sequence formed by the plurality of windows as a background window, and determine the other windows among the plurality of windows as foreground windows.
10. The apparatus of claim 8, wherein the processing unit comprises:
a first drawing subunit configured to draw the background window into a first memory;
and a second drawing subunit configured to blur the image in the first memory in the x direction and draw the result into a second memory, and then blur the image in the second memory in the y direction and draw the result back into the first memory.
11. The apparatus of claim 10, wherein the second rendering subunit comprises:
a first acquisition subunit configured to sequentially acquire each pixel point of the image in the first memory;
and a first execution subunit configured to execute, for each pixel point: acquiring the pixel points located at set distances from the current pixel point in the x direction; and computing a weighted average of the acquired color values to obtain the color value of the current pixel point.
12. The apparatus of claim 10, wherein the second rendering subunit comprises:
a second acquisition subunit configured to sequentially acquire each pixel point of the image in the second memory;
and a second execution subunit configured to execute, for each pixel point: acquiring the pixel points located at set distances from the current pixel point in the y direction; and computing a weighted average of the acquired color values to obtain the color value of the current pixel point.
13. The apparatus of claim 10, wherein the processing unit further comprises:
a reduction subunit, configured to reduce the image in the first memory by a set reduction factor;
the second drawing subunit is further configured to blur the reduced image in the first memory in the x direction and then draw the result into the second memory.
14. The apparatus of claim 13, wherein the display unit comprises:
a third drawing subunit configured to draw the foreground window into a frame buffer, and to draw the blurred background window into the frame buffer at a set magnification factor, wherein the set reduction factor is equal to the set magnification factor.
15. A terminal device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a plurality of windows included in an application program;
determining a background window and a foreground window from the plurality of windows;
blurring the background window;
and superimposing the foreground window and the blurred background window to obtain a user interface of the application program, and displaying the user interface of the application program.
CN201410405987.5A 2014-08-13 2014-08-13 Blurred background method, apparatus and terminal device Active CN105404438B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410405987.5A CN105404438B (en) 2014-08-13 2014-08-13 Blurred background method, apparatus and terminal device

Publications (2)

Publication Number Publication Date
CN105404438A true CN105404438A (en) 2016-03-16
CN105404438B CN105404438B (en) 2019-10-15

Family

ID=55469954

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410405987.5A Active CN105404438B (en) 2014-08-13 2014-08-13 Blurred background method, apparatus and terminal device

Country Status (1)

Country Link
CN (1) CN105404438B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102890602A (en) * 2012-09-17 2013-01-23 福建星网视易信息系统有限公司 Object highlighting method and display device
CN103246430A (en) * 2013-04-24 2013-08-14 深圳市同洲电子股份有限公司 Terminal and method for managing multiple windows
US20140189606A1 (en) * 2013-01-02 2014-07-03 Canonical Limited User interface for a computing device
CN103927086A (en) * 2014-04-21 2014-07-16 深圳市中兴移动通信有限公司 Wallpaper processing method and system and mobile terminal

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250127A (en) * 2016-07-26 2016-12-21 深圳天珑无线科技有限公司 A kind of blurred background processing method and terminal
CN106504220B (en) * 2016-08-19 2019-07-23 华为机器有限公司 A kind of image processing method and device
CN106504220A (en) * 2016-08-19 2017-03-15 华为机器有限公司 A kind of image processing method and device
US11729514B2 (en) 2016-08-19 2023-08-15 Huawei Technologies Co., Ltd. Image processing method and apparatus
US11039064B2 (en) 2016-08-19 2021-06-15 Huawei Technologies Co., Ltd. Image processing method and apparatus
CN106371723A (en) * 2016-08-26 2017-02-01 维沃移动通信有限公司 Intelligent terminal-based interface processing method and intelligent terminal
CN106570847A (en) * 2016-10-24 2017-04-19 广州酷狗计算机科技有限公司 Image processing method and image processing device
CN107992242A (en) * 2017-11-29 2018-05-04 广州视源电子科技股份有限公司 Switching method, device and equipment of floating window and storage medium
CN109002750A (en) * 2017-12-11 2018-12-14 罗普特(厦门)科技集团有限公司 A kind of correlation filtering tracking based on conspicuousness detection and image segmentation
CN110457102A (en) * 2019-07-26 2019-11-15 武汉深之度科技有限公司 Blur method, rendering method and the calculating equipment of visual object
CN110457102B (en) * 2019-07-26 2022-07-08 武汉深之度科技有限公司 Visual object blurring method, visual object rendering method and computing equipment
CN114924824A (en) * 2019-07-26 2022-08-19 武汉深之度科技有限公司 Visual object blurring method, visual object rendering method and computing equipment
CN114924824B (en) * 2019-07-26 2023-11-14 武汉深之度科技有限公司 Visual object blurring method, visual object rendering method and computing device
CN111104935A (en) * 2019-11-08 2020-05-05 浙江口碑网络技术有限公司 Image acquisition method, image display method, device and equipment
CN111741366A (en) * 2020-07-16 2020-10-02 广州酷狗计算机科技有限公司 Audio playing method, device, terminal and storage medium

Also Published As

Publication number Publication date
CN105404438B (en) 2019-10-15

Similar Documents

Publication Publication Date Title
CN105404438B (en) Blurred background method, apparatus and terminal device
US11086482B2 (en) Method and device for displaying history pages in application program and computer-readable medium
EP3316105B1 (en) Instant message processing method and device
CN105955607B (en) Content sharing method and device
US20170304735A1 (en) Method and Apparatus for Performing Live Broadcast on Game
US9674395B2 (en) Methods and apparatuses for generating photograph
EP3182716A1 (en) Method and device for video display
CN105204846B (en) Display methods, device and the terminal device of video pictures in more people's videos
CN107908351B (en) Application interface display method and device and storage medium
US9591120B2 (en) Method and device for adding application badge
EP3121699A1 (en) Method and device for displaying badge of icon
CN104238890B (en) Character displaying method and device
US20190235745A1 (en) Method and device for displaying descriptive information
CN107992257B (en) Screen splitting method and device
EP3147802B1 (en) Method and apparatus for processing information
CN104636106A (en) Picture displaying method and device and terminal device
US20150116368A1 (en) Method and device for adjusting characters of application
US20190034046A1 (en) Method and device for displaying application, and storage medium
CN104461236A (en) Method and device for displaying application icons
CN112331158B (en) Terminal display adjusting method, device, equipment and storage medium
CN107491238B (en) Display method and device of push information
US11600300B2 (en) Method and device for generating dynamic image
CN105094500B (en) A kind of icon arrangement method and device
CN106919302B (en) Operation control method and device of mobile terminal
CN105955637B (en) Method and device for processing text input box

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant