CN108874482B - Image processing method and device - Google Patents


Info

Publication number
CN108874482B
Authority
CN
China
Prior art keywords: image, blur, layer, current image, processing
Prior art date
Legal status
Active
Application number
CN201810543070.XA
Other languages: Chinese (zh)
Other versions: CN108874482A (en)
Inventor
周雨田
刘伟权
龙海
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201810543070.XA
Publication of CN108874482A
Application granted
Publication of CN108874482B


Classifications

    • G06F9/451: Execution arrangements for user interfaces
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element
    • G06F3/04883: GUI interaction using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F3/14: Digital output to display device

Abstract

The present disclosure relates to an image processing method and apparatus for increasing the speed of image processing. The method is applied to a terminal device whose operating system includes an application layer, an application framework layer, and a system running layer, and comprises the following steps: the system running layer receives a user interface refresh instruction sent by the application layer via the application framework layer, where the user interface refresh instruction includes a blur parameter; the system running layer blurs the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image; the system running layer obtains respective fusion coefficients for the current image and the first blurred image according to the blur parameter; and the system running layer fuses the current image and the first blurred image according to the respective fusion coefficients to obtain an image to be displayed. The technical solution of the present disclosure can increase the speed of image processing.

Description

Image processing method and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
In the related art, terminal device vendors not only invest heavily in performance and functionality but also continuously refine the UI (user interface) display effects for a better user experience, the gradual blur effect being one such attempt. How to increase the speed of image processing during gradual blurring, however, remains a technical problem to be solved.
Disclosure of Invention
To overcome the problems in the related art, embodiments of the present disclosure provide an image processing method and apparatus for increasing the speed of image processing.
According to a first aspect of the embodiments of the present disclosure, there is provided an image processing method applied to a terminal device, where an operating system installed on the terminal device includes an application layer, an application framework layer, and a system running layer, the method including:
the system running layer receives a user interface refresh instruction sent by the application layer via the application framework layer; wherein the user interface refresh instruction includes a blur parameter;
the system running layer blurs the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image;
the system running layer obtains respective fusion coefficients for the current image and the first blurred image according to the blur parameter;
and the system running layer fuses the current image and the first blurred image according to the respective fusion coefficients to obtain an image to be displayed.
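The four steps above can be sketched end to end as follows. This is a minimal illustration, not the patented implementation: the placeholder blur and the linear mapping from blur parameter to fusion coefficients are assumptions, since the method leaves both unspecified at this level.

```python
# Minimal sketch of steps S101-S104 on a one-row image. The placeholder
# blur and the linear parameter-to-coefficient mapping are assumptions.

def blur_image(image, blur_param):
    # Placeholder "blur": average each pixel with its left neighbour,
    # repeated blur_param times. A real system would use e.g. Gaussian blur.
    for _ in range(blur_param):
        image = [(image[max(i - 1, 0)] + p) / 2 for i, p in enumerate(image)]
    return image

def fusion_coefficients(blur_param, max_param=100):
    # Assumed linear mapping: the first blurred image F' receives alpha,
    # the current image F receives (1 - alpha).
    alpha = min(blur_param, max_param) / max_param
    return alpha, 1.0 - alpha

def refresh(current_image, blur_param):
    blurred = blur_image(list(current_image), blur_param)   # step S102
    a_blur, a_cur = fusion_coefficients(blur_param)         # step S103
    return [a_blur * b + a_cur * c                          # step S104
            for c, b in zip(current_image, blurred)]

display_image = refresh([100.0, 200.0], blur_param=50)
```

With a blur parameter of 50, the blurred and original rows are blended half and half, so the output sits between the two inputs.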
In one embodiment, the user interface refresh instruction may further include a blur flag bit; before the system running layer blurs the current image displayed by the terminal device according to the blur parameter to obtain the first blurred image, the method may further include:
determining that the blur flag bit indicates that the current image is to be blurred.
In one embodiment, the user interface refresh instruction may further include a blur mode identifier;
the blurring, by the system running layer, of the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image may include:
the system running layer obtains the corresponding blur mode according to the blur mode identifier and the correspondence between blur mode identifiers and blur modes;
and the system running layer blurs the current image according to the blur parameter and the obtained blur mode to obtain the first blurred image.
In one embodiment, before the system running layer receives the user interface refresh instruction sent by the application layer via the application framework layer, the method may further include:
acquiring the average brightness of the current image;
and obtaining the corresponding blur mode identifier according to the average brightness and the correspondence between brightness and blur mode identifiers.
In one embodiment, the obtaining, by the system running layer, of the respective fusion coefficients of the current image and the first blurred image according to the blur parameter may include:
the system running layer obtains the respective fusion coefficients of the current image and the first blurred image according to the blur parameter and the blur mode.
In an embodiment, the blurring, by the system running layer, of the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image may include:
reducing the current image to obtain a compressed image;
blurring the compressed image to obtain a second blurred image;
and enlarging the second blurred image to obtain the first blurred image; wherein the size of the first blurred image is the same as the size of the current image.
In one embodiment, blurring the compressed image to obtain the second blurred image may include:
blurring the compressed image according to a preset blur radius to obtain the second blurred image.
In one embodiment, the current image is located in a first layer, and when a second layer exists above the first layer, the blur parameter may change as the area of the first layer covered by the second layer changes.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus applied to a terminal device, where an operating system installed on the terminal device includes an application layer, an application framework layer, and a system running layer, the apparatus including:
a receiving module configured to receive, at the system running layer, a user interface refresh instruction sent by the application layer via the application framework layer; wherein the user interface refresh instruction includes a blur parameter;
a processing module configured to blur, at the system running layer, the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image;
a first obtaining module configured to obtain, at the system running layer, the respective fusion coefficients of the current image and the first blurred image according to the blur parameter;
and a fusion module configured to fuse, at the system running layer, the current image and the first blurred image according to the respective fusion coefficients to obtain an image to be displayed.
In one embodiment, the user interface refresh instruction further includes a blur flag bit; the apparatus may further include:
a determination module configured to determine that the blur flag bit indicates that the current image is to be blurred.
In one embodiment, the user interface refresh instruction further includes a blur mode identifier; the processing module may include:
a first obtaining submodule configured to obtain, at the system running layer, the corresponding blur mode according to the blur mode identifier and the correspondence between blur mode identifiers and blur modes;
and a second obtaining submodule configured to blur, at the system running layer, the current image according to the blur parameter and the obtained blur mode to obtain the first blurred image.
In one embodiment, the apparatus may further include:
a second obtaining module configured to obtain the average brightness of the current image;
and a third obtaining module configured to obtain the corresponding blur mode identifier according to the average brightness and the correspondence between brightness and blur mode identifiers.
In an embodiment, the first obtaining module may be further configured to obtain, at the system running layer, the respective fusion coefficients of the current image and the first blurred image according to the blur parameter and the blur mode.
In one embodiment, the processing module may include:
a first processing submodule configured to reduce the current image to obtain a compressed image;
a second processing submodule configured to blur the compressed image to obtain a second blurred image;
and a third processing submodule configured to enlarge the second blurred image to obtain the first blurred image; wherein the size of the first blurred image is the same as the size of the current image.
In an embodiment, the second processing submodule may be further configured to blur the compressed image according to a preset blur radius to obtain the second blurred image.
In one embodiment, the current image is located in a first layer, and when a second layer exists above the first layer, the blur parameter may change as the area of the first layer covered by the second layer changes.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect described above.
The technical solutions provided by the embodiments of the present disclosure can have the following beneficial effects: when the application layer receives a user interface refresh instruction, it sends the instruction to the system running layer via the application framework layer; the system running layer blurs the current image displayed by the terminal device according to the blur parameter in the instruction to obtain a first blurred image, obtains the respective fusion coefficients of the current image and the first blurred image according to the blur parameter, and finally fuses the current image and the first blurred image according to the respective fusion coefficients to obtain the image to be displayed. Because the system running layer directly or indirectly holds the current image displayed by the terminal device, the current image can be acquired quickly; and because both the acquisition of the current image and the computation of the image to be displayed are completed within the system running layer, the processing speed is high, so the image processing speed can be improved and time consumption reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 2 illustrates a Gaussian blur template with a blur radius of 1, according to an example embodiment.
Figs. 3 to 5 are application scenario diagrams illustrating an image processing method according to an exemplary embodiment.
Figs. 6 and 7 are flowcharts illustrating an image processing method according to another exemplary embodiment.
Fig. 8 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment.
Fig. 9 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
Fig. 10 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
Fig. 11 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
Fig. 12 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
Fig. 13 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present invention; rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
In the related art, vendors of terminal devices running the Android operating system not only invest heavily in performance and functionality but also continuously refine the UI (user interface) display effects for a better user experience, the gradual blur effect being one such attempt.
For example, one method of achieving a gradual blur effect is to capture the current screen, blur the screenshot, and produce the gradual change by continuously adjusting the transparency. However, because a screenshot must be taken before blurring can begin, the process is time-consuming: it can only handle static scenes, and scenes with a dynamically changing background, such as live wallpaper or video playback, cannot achieve a real-time gradual blur effect.
The embodiment of the disclosure provides an image processing method and an image processing device, which can solve the technical problem and improve the speed of image processing.
Fig. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment. The method may be applied to a terminal device, such as a smartphone or a tablet computer (PAD), whose installed operating system, such as the Android operating system, may include an application layer, an Application Framework layer, and a system running layer. As shown in fig. 1, the image processing method includes the following steps S101 to S104:
in step S101, the system running layer receives a user interface refresh instruction sent by the application layer via the application framework layer; wherein, the user interface refreshing instruction comprises a fuzzy parameter.
In one embodiment, when an application program (APP) located in the application layer of the operating system of the terminal device detects a user interface refresh instruction, it may send the instruction to the system running layer via the application framework layer. In one embodiment, the application layer may send the user interface refresh instruction to the SurfaceFlinger layer in the system running layer via the application framework layer.
In one exemplary embodiment, the terminal device is provided with a touch panel through which gestures performed by a user to control the terminal device can be detected. When the terminal device detects a gesture, for example a gesture of pulling down the status bar, a user interface refresh instruction may be sent to the SurfaceFlinger layer through the status bar application program of the application layer.
In one embodiment, a blur parameter may be included in the user interface refresh instruction. The system running layer may blur the current image displayed by the terminal device according to the blur parameter. The blur parameter can change over time or with the operation the user performs on the terminal device, so as to produce a gradual blur effect.
In one embodiment, the current image is located in a first layer, and when a second layer exists above the first layer, the blur parameter may change as the area of the first layer covered by the second layer changes. Continuing the exemplary embodiment above, the blur parameter may identify the area of the current image covered by the status bar. For example, the value of the blur parameter may be the height by which the status bar has been pulled down. As the user performs a downward sliding gesture from the top of the touch panel of the terminal device, the height of the status bar gradually increases and the value of the blur parameter gradually increases with it.
In one embodiment, the user interface refresh instruction may further include a blur flag bit, whose value indicates whether the current image is to be blurred. For example, the blur flag bit may take the value 1 or 0: a value of 1 indicates that the current image is to be blurred, and a value of 0 indicates that it is not.
In one embodiment, the user interface refresh instruction may further include a blur mode identifier, which indicates the blur mode used to blur the current image. In one exemplary embodiment, the blur modes may include a first blur mode, a second blur mode, and a third blur mode. When the average brightness of the current image is greater than a first preset brightness A1, the system running layer adopts the first blur mode: it first reduces the brightness of the current image, for example into the range between the first preset brightness A1 and a second preset brightness A2, where A1 is greater than A2, and then blurs the current image according to the blur parameter. When the average brightness of the current image is within the range from the first preset brightness to the second preset brightness, the second blur mode is adopted: the brightness of the current image is not adjusted, and the current image is blurred directly according to the blur parameter. When the average brightness of the current image is smaller than the second preset brightness, the third blur mode is adopted: the brightness of the current image is first increased, for example into the range between the first preset brightness and the second preset brightness, and then the current image is blurred according to the blur parameter. In this way, the brightness of the image to be displayed can be adjusted according to the blur mode.
In an embodiment, before step S101, the average brightness of the current image may be obtained, and the corresponding blur mode identifier may then be obtained from the average brightness and the correspondence between brightness and blur mode identifiers.
In an exemplary embodiment, the average brightness may be obtained by averaging the brightness values of all pixels of the current image. In another exemplary embodiment, the brightness values of pixels of the current image may instead be collected by equidistant sampling, and the average brightness calculated from the sampled values. This reduces the amount of computation and improves the image processing speed.
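The equidistant-sampling shortcut can be sketched as follows; the sampling stride of 7 and the toy luminance ramp are illustrative assumptions, not values from the disclosure.

```python
# Estimate average brightness from every `stride`-th pixel luminance,
# a sketch of the equidistant-sampling approach; the stride is assumed.

def average_luminance(pixels, stride=1):
    """Average the luminance of every `stride`-th pixel."""
    sampled = pixels[::stride]
    return sum(sampled) / len(sampled)

pixels = list(range(256))               # toy 256-pixel luminance ramp
exact = average_luminance(pixels)       # uses every pixel
approx = average_luminance(pixels, 7)   # roughly 1/7 of the work
```

On this ramp the sampled estimate (126.0) stays close to the exact mean (127.5) while touching only a fraction of the pixels, which is the trade-off the paragraph above describes.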
In an exemplary embodiment, the correspondence between brightness and blur mode identifier may be as shown in Table 1. After the average brightness of the current image is obtained, the corresponding blur mode identifier can be found by looking it up in Table 1. For example, with A1 = 180 and A2 = 80, if the average brightness of the current image is 220, looking up 220 in Table 1 yields the blur mode identifier M01.
TABLE 1
Brightness    Blur mode identifier
255 ~ A1      M01
A1 ~ A2       M02
A2 ~ 0        M03
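The Table 1 lookup can be sketched as a simple threshold chain, using the worked values A1 = 180 and A2 = 80 from the example above. How values exactly equal to A1 or A2 are classified is an assumption, since the table leaves the boundaries unspecified.

```python
# Table 1 lookup: map average brightness to a blur mode identifier.
# A1/A2 are the example thresholds; boundary handling is assumed.

A1, A2 = 180, 80

def blur_mode_identifier(avg_brightness):
    if avg_brightness > A1:
        return "M01"   # bright image: first blur mode (darken, then blur)
    if avg_brightness >= A2:
        return "M02"   # mid-range: second blur mode (blur directly)
    return "M03"       # dark image: third blur mode (brighten, then blur)

mode = blur_mode_identifier(220)   # the worked example above
```

For the worked example, an average brightness of 220 falls in the 255 ~ A1 row and yields M01.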
In step S102, the system running layer blurs the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image.
In an embodiment, the system running layer may reduce the current image to obtain a compressed image, blur the compressed image according to the blur parameter to obtain a second blurred image, and enlarge the second blurred image to obtain the first blurred image; wherein the size of the first blurred image is the same as the size of the current image. Because the blurring is performed on the reduced, compressed image, the amount of computation and the load on the terminal device are reduced, and the image processing speed is increased.
In an embodiment, the system running layer may blur the compressed image according to a preset blur radius using a preset blur algorithm to obtain the second blurred image. In an exemplary embodiment, the size of the current image is 1080 × 2160, and the compressed image may be blurred using a Gaussian blur algorithm with a blur radius of 1 to obtain the second blurred image; the Gaussian blur template with a blur radius of 1 is shown in fig. 2. In practice, the preset blur radius and the blur algorithm may be chosen by the designer as a balance between the performance of the terminal device and the desired blur effect, so that both the time cost of image processing and the blur quality are taken into account. This allows the method to be applied even on mobile terminal devices with limited performance.
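The reduce → blur → enlarge pipeline can be sketched as below. The 3 × 3 kernel [[1, 2, 1], [2, 4, 2], [1, 2, 1]] / 16 is the standard radius-1 Gaussian template and is assumed to match fig. 2; the 2× downscale factor and nearest-neighbour enlargement are also illustrative choices not fixed by the disclosure.

```python
# Sketch of reduce -> Gaussian blur (radius 1) -> enlarge on a tiny
# single-channel image; kernel weights and scale factor are assumptions.

KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]   # weights sum to 16

def downscale2(img):
    """Halve each dimension by averaging 2x2 pixel blocks."""
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w // 2)] for y in range(h // 2)]

def gaussian_blur(img):
    """Convolve with the radius-1 Gaussian template, clamping at edges."""
    h, w = len(img), len(img[0])
    def px(y, x):
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]
    return [[sum(KERNEL[dy][dx] * px(y + dy - 1, x + dx - 1)
                 for dy in range(3) for dx in range(3)) / 16
             for x in range(w)] for y in range(h)]

def upscale2(img):
    """Nearest-neighbour enlarge back to the original size."""
    return [[img[y // 2][x // 2] for x in range(2 * len(img[0]))]
            for y in range(2 * len(img))]

current = [[10, 10, 90, 90] for _ in range(4)]    # toy 4x4 image
first_blurred = upscale2(gaussian_blur(downscale2(current)))
```

Because the convolution runs on the quarter-size compressed image, the work per frame drops by roughly a factor of four, which is the saving the paragraph above describes.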
In an embodiment, when the user interface refresh instruction further includes a blur flag bit, it is determined before step S102 whether to blur the current image according to the blur flag bit, and step S102 is performed when the blur flag bit is determined to indicate that the current image is to be blurred.
In one embodiment, when the user interface refresh instruction further includes a blur mode identifier, the system running layer obtains the corresponding blur mode according to the blur mode identifier and the correspondence between blur mode identifiers and blur modes, and then blurs the current image according to the blur parameter and the obtained blur mode to obtain the first blurred image. The way the system running layer obtains the corresponding blur mode from the blur mode identifier is similar to the way the corresponding blur mode identifier is obtained from the average brightness, and is not repeated here.
In step S103, the system running layer obtains the respective fusion coefficients of the current image and the first blurred image according to the blur parameter.
In one embodiment, the system running layer may obtain the fusion coefficient α% of the first blurred image according to the blur parameter, the current image then receiving the complementary fusion coefficient (1 - α%).
In an embodiment, when the user interface refresh instruction further includes a blur mode identifier, the system running layer may obtain the respective fusion coefficients of the current image and the first blurred image according to the blur parameter and the blur mode.
In step S104, the system running layer fuses the current image and the first blurred image according to their respective fusion coefficients to obtain the image to be displayed.
In an embodiment, the system running layer may fuse the current image and the first blurred image according to their respective fusion coefficients using the following formula (1) to obtain the image to be displayed:
G=α%F'+(1-α%)F (1)
where G is the image to be displayed, F' is the first blurred image, and F is the current image.
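Formula (1) applied per pixel can be sketched as follows; the pixel values and the value of α are illustrative only.

```python
# Per-pixel fusion per formula (1): G = alpha * F' + (1 - alpha) * F,
# where F is the current image and F' is the first blurred image.

def fuse(F, F_blur, alpha):
    """Blend current image F with first blurred image F' by formula (1)."""
    return [alpha * fp + (1 - alpha) * f for f, fp in zip(F, F_blur)]

F = [0.0, 100.0, 200.0]    # current image (toy one-row example)
Fp = [50.0, 50.0, 50.0]    # first blurred image
G = fuse(F, Fp, alpha=0.25)
```

At α = 0 the output is the unmodified current image, and at α = 1 it is the fully blurred image, so sweeping α produces the gradual transition between the two.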
In one embodiment, after obtaining the image to be displayed, the system running layer may send it to the display of the terminal device for display. In an exemplary embodiment, the SurfaceFlinger layer may receive the user interface refresh instruction, compute the image to be displayed according to the instruction, and send it to the display of the terminal device for display.
In an exemplary scenario, as shown in figs. 3 to 5, when the terminal device 31 detects that the external object 22 performs a gesture of pulling down the status bar 34 on the touch panel (not shown), a user interface refresh instruction may be sent to the SurfaceFlinger layer through the status bar application program of the application layer. The user interface refresh instruction includes a blur flag bit, a blur parameter, and a blur mode identifier. After determining from the blur flag bit that the current image 32 is to be blurred, the SurfaceFlinger layer determines the blur mode from the blur mode identifier and blurs the current image according to the blur mode and the blur parameter to obtain a first blurred image, where the blur parameter is the height by which the status bar 34 has been pulled down. Since the pulled-down height of the status bar 34 changes as the pull-down gesture proceeds, the degree of blur of the resulting first blurred image also changes with the pulled-down height. The SurfaceFlinger layer obtains the respective fusion coefficients of the current image 32 and the first blurred image according to the blur parameter and the blur mode, and fuses the current image 32 and the first blurred image according to the respective fusion coefficients to obtain the image to be displayed. Because the degree of blur of the first blurred image changes with the pulled-down height of the status bar 34, the degree of blur of the image to be displayed changes with it as well. As shown in fig. 4, when the status bar 34 is pulled down to a height h1, the terminal device displays the image to be displayed 35, and when the status bar 34 is pulled down to a height h2, the terminal device displays the image to be displayed 36, where h2 is greater than h1 and the degree of blur of image 36 is greater than that of image 35. That is, as the status bar 34 is pulled down, the degree of blur of the displayed image gradually changes with the pull-down height, achieving a gradual blur display effect.
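The gradual-blur behaviour in this scenario amounts to deriving the fusion weight of the blurred image from the pull-down height; the maximum height and the linear mapping below are assumptions for illustration, not values from the disclosure.

```python
# Map the status-bar pull-down height (the blur parameter in this
# scenario) to the fusion coefficient of the first blurred image.
# MAX_HEIGHT and the linear mapping are illustrative assumptions.

MAX_HEIGHT = 2160   # assumed full pull-down height, in pixels

def blur_weight(pulldown_height):
    """Fusion coefficient of the first blurred image, in [0, 1]."""
    return min(max(pulldown_height, 0), MAX_HEIGHT) / MAX_HEIGHT

h1, h2 = 540, 1620                         # h2 > h1, as in fig. 4
w1, w2 = blur_weight(h1), blur_weight(h2)  # so w2 > w1
```

A larger pull-down height yields a larger weight for the blurred image, which is exactly why the image displayed at h2 looks more blurred than the one displayed at h1.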
In this embodiment, when a user interface refresh instruction is received, the application layer sends it to the system running layer via the application framework layer. The system running layer blurs the current image displayed by the terminal device according to the blur parameter in the instruction to obtain a first blurred image, obtains respective fusion coefficients for the current image and the first blurred image according to the blur parameter, and finally fuses the two images according to these coefficients to obtain an image to be displayed. Because the system running layer directly or indirectly holds the current image displayed by the terminal device, the current image can be acquired quickly, and both the acquisition of the current image and the generation of the image to be displayed are completed within the system running layer. Consequently fewer computing resources are consumed, processing is fast, and the overall image processing speed is improved.
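The final fusion step is a per-pixel weighted combination of the current image and the first blurred image. A minimal NumPy sketch, under the assumption (not fixed by the disclosure) that the two respective fusion coefficients are the complementary weights `1 - alpha` and `alpha`:

```python
import numpy as np

def fuse(current, blurred, alpha):
    """Fuse the current image with the first blurred image using the
    respective coefficients (1 - alpha) and alpha. Treating the two
    coefficients as complementary weights is an assumption."""
    out = (1.0 - alpha) * current.astype(np.float32) \
          + alpha * blurred.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

# alpha = 0 reproduces the current image; alpha = 1 the fully blurred one
current = np.full((2, 2, 3), 200, dtype=np.uint8)
blurred = np.full((2, 2, 3), 100, dtype=np.uint8)
assert fuse(current, blurred, 0.5)[0, 0, 0] == 150  # midway blend
```

Varying `alpha` with the blur parameter is what lets the displayed image transition smoothly between the sharp and blurred versions.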
Because of this high processing speed, the image processing method provided by the embodiments of the present disclosure can be applied not only to a static background picture but also, in real time, to dynamically changing scenes such as video playback or live wallpaper, displaying a correspondingly blurred picture in step with the dynamic background. In other words, a real-time gradual-blur effect can be realized for both static and dynamic background pictures.
Figs. 6 to 7 are flowcharts illustrating an image processing method according to another exemplary embodiment. In this embodiment, the user interface refresh instruction further includes a blur mode identifier. On the basis of the embodiment shown in fig. 1, this embodiment further includes the following steps S601 to S602 before step S101, and step S102 includes the following steps S701 to S702:
in step S601, the average brightness of the current image is acquired.
In step S602, a corresponding blur mode identifier is obtained according to the average brightness and the correspondence between the brightness and the blur mode identifier.
In step S701, the system running layer obtains a corresponding blur mode according to the blur mode identifier and the correspondence between blur mode identifiers and blur modes.
In step S702, the system running layer performs a blurring process on the current image according to the blurring parameter and the obtained blurring mode to obtain a first blurred image.
In an exemplary embodiment, the average brightness may be obtained by averaging the brightness values of all pixels of the current image. In another exemplary embodiment, the brightness values of only some pixels of the current image may be acquired by equidistant sampling, and the average brightness calculated from the sampled values. This saves computation and improves the image processing speed.
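The equidistant-sampling variant can be sketched in a few lines of NumPy; the step size below is an assumed parameter, since the disclosure does not specify the sampling interval:

```python
import numpy as np

def average_brightness(image, step=4):
    """Estimate the average brightness from every `step`-th pixel in
    each direction (equidistant sampling); step=1 degenerates to the
    exact average over all pixels."""
    return float(image[::step, ::step].mean())

# On a smooth gradient the sampled estimate stays close to the exact mean
img = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
exact = average_brightness(img, step=1)   # average over all pixels
approx = average_brightness(img, step=4)  # average over 1/16 of the pixels
assert abs(exact - approx) < 2.0
```

With `step=4` only one sixteenth of the pixels are read, which is where the computational saving mentioned above comes from.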
In one exemplary embodiment, the blur modes may include the first blur mode, the second blur mode, and the third blur mode described above. The correspondence between brightness and blur mode identifiers may be as shown in table 1 above. After the average brightness of the current image is obtained, the corresponding blur mode identifier can be obtained by looking up table 1.
In an exemplary embodiment, the system running layer may obtain the corresponding blur mode according to the blur mode identifier and the correspondence between blur mode identifiers and blur modes, and blur the current image according to the blur parameter and the obtained blur mode to obtain a first blurred image. Looking up the blur mode from its identifier is analogous to looking up the identifier from the average brightness as described above, and is not repeated here.
In an exemplary embodiment, when the average brightness of the current image is greater than a first preset brightness a1, the first blur mode is adopted: the brightness of the current image is first reduced, for example into the range from the first preset brightness a1 down to a second preset brightness a2, and the current image is then blurred according to the blur parameter. When the average brightness of the current image lies within the range from the first preset brightness to the second preset brightness, the second blur mode is adopted: the brightness of the current image is not adjusted, and the image is blurred directly according to the blur parameter. When the average brightness of the current image is smaller than the second preset brightness, the third blur mode is adopted: the brightness of the current image is first increased, for example into the range from the first preset brightness to the second preset brightness, and the image is then blurred according to the blur parameter. In this way, the brightness of the image to be displayed can be adjusted according to the blur mode, improving the visual effect.
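The three-way choice above reduces to a threshold comparison. A sketch with placeholder thresholds — the disclosure leaves the actual preset brightness values open; only a1 > a2 and the comparison order are taken from the text:

```python
def select_blur_mode(avg_brightness, a1=180.0, a2=60.0):
    """Select a blur mode from the average brightness of the current
    image; a1 and a2 are assumed placeholder thresholds with a1 > a2."""
    if avg_brightness > a1:
        return 1   # first blur mode: lower the brightness, then blur
    if avg_brightness >= a2:
        return 2   # second blur mode: blur without brightness change
    return 3       # third blur mode: raise the brightness, then blur

assert select_blur_mode(220.0) == 1   # bright image is darkened first
assert select_blur_mode(120.0) == 2   # mid-range brightness left as is
assert select_blur_mode(30.0) == 3    # dark image is brightened first
```

A simple comparison like this is cheap enough to run on every refresh instruction without affecting the processing speed emphasized earlier.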
In this embodiment, the blur mode for performing the blur processing may be determined according to the average brightness of the current image displayed by the terminal device, so as to adjust the brightness of the image to be displayed, and improve the visual effect.
Fig. 8 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. The image processing apparatus provided by the embodiment of the present disclosure is applied to a terminal device, an operating system carried by the terminal device includes an application layer, an application framework layer and a system running layer, and the apparatus includes:
a sending module 81 configured to receive, by the system running layer, a user interface refresh instruction sent by the application layer via the application framework layer; wherein the user interface refresh instruction includes a blur parameter;
the processing module 82 is configured to perform blurring processing on the current image displayed by the terminal device according to the blurring parameter through the system running layer to obtain a first blurred image;
a first obtaining module 83, configured to obtain, by the system running layer, a respective fusion coefficient of the current image and the first blurred image according to the blur parameter;
and a fusion module 84 configured to fuse the current image and the first blurred image according to respective fusion coefficients through the system running layer to obtain an image to be displayed.
In an embodiment, the current image is located in a first layer, and when a second layer exists above the first layer, the blur parameter changes with a change in an area of the first layer covered by the second layer.
Fig. 9 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment. In the embodiment of the disclosure, the user interface refresh instruction further includes a blur flag bit; the apparatus further comprises:
a determining module 85 configured to determine that the blur flag indicates to blur the current image.
Fig. 10 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment. In the embodiment of the present disclosure, the user interface refresh instruction further includes a blur mode identifier; the processing module 82 may include:
a first obtaining sub-module 821 configured to obtain, by the system running layer, a corresponding blur mode according to the blur mode identifier and the correspondence between blur mode identifiers and blur modes;
and the second obtaining submodule 822 is configured to perform blurring processing on the current image according to the blurring parameter and the obtained blurring mode through the system running layer to obtain a first blurred image.
In this embodiment, the first obtaining module 83 may be further configured to obtain, by the system running layer, a fusion coefficient of the current image and the first blurred image according to the blur parameter and the blur mode.
Fig. 11 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment. In the embodiment of the present disclosure, the apparatus may further include:
a second obtaining module configured to obtain an average brightness of the current image;
and a third obtaining module configured to obtain a corresponding blur mode identifier according to the average brightness and the correspondence between brightness and blur mode identifiers.
Fig. 12 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment. In an embodiment of the present disclosure, the processing module 82 may include:
a first processing sub-module 823, configured to perform reduction processing on the current image to obtain a compressed image;
a second processing submodule 824, configured to perform blurring processing on the compressed image, to obtain a second blurred image;
a third processing submodule 825 configured to perform amplification processing on the second blurred image to obtain the first blurred image; wherein a size of the first blurred image is the same as a size of the current image.
In an embodiment, the second processing sub-module 824 may be further configured to perform a blurring process on the compressed image according to a preset blurring radius, so as to obtain the second blurred image.
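The reduce-blur-amplify pipeline of sub-modules 823 to 825 can be sketched as follows; the nearest-neighbour resampling, the scale factor, and the separable box blur are assumed stand-ins, as the disclosure does not prescribe the concrete reduction, blurring, or amplification operations:

```python
import numpy as np

def blur_via_downscale(image, scale=4, radius=2):
    """Reduce the (grayscale) image, blur the small copy, then enlarge
    it back to the original size; blurring at low resolution is what
    makes the pipeline cheap."""
    h, w = image.shape
    small = image[::scale, ::scale].astype(np.float32)    # reduction -> compressed image
    k = 2 * radius + 1
    kernel = np.ones(k) / k                               # preset blur radius -> box kernel
    for axis in (0, 1):                                   # separable blur -> second blurred image
        small = np.apply_along_axis(np.convolve, axis, small, kernel, mode="same")
    big = np.repeat(np.repeat(small, scale, axis=0), scale, axis=1)  # amplification
    return np.clip(big[:h, :w], 0, 255).astype(np.uint8)  # first blurred image, same size as input

out = blur_via_downscale(np.full((64, 64), 100, dtype=np.uint8))
assert out.shape == (64, 64)  # same size as the current image, per sub-module 825
```

With `scale=4`, the blur itself touches only a sixteenth of the pixels, which is why performing the blur on the compressed image rather than the full-resolution current image saves computation.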
Regarding the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method, and will not be elaborated here.
Fig. 13 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. For example, apparatus 1300 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and so forth.
Referring to fig. 13, the apparatus 1300 may include one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an input/output (I/O) interface 1312, a sensor component 1314, and a communication component 1316.
The processing component 1302 generally controls overall operation of the device 1300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing element 1302 may include one or more processors 1320 to execute instructions to perform all or part of the steps of the method described above. Further, the processing component 1302 can include one or more modules that facilitate interaction between the processing component 1302 and other components. For example, the processing component 1302 may include a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support operation at the device 1300. Examples of such data include instructions for any application or method operating on device 1300, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1304 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply component 1306 provides power to the various components of device 1300. Power supply components 1306 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for apparatus 1300.
The multimedia component 1308 includes a screen between the device 1300 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1308 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the back-facing camera may receive external multimedia data when the device 1300 is in an operational mode, such as a capture mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1300 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1304 or transmitted via the communication component 1316. In some embodiments, the audio component 1310 also includes a speaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1314 includes one or more sensors for providing various aspects of state assessment for the device 1300. For example, the sensor assembly 1314 may detect an open/closed state of the device 1300 and the relative positioning of components, such as the display and keypad of the device 1300. The sensor assembly 1314 may also detect a change in position of the device 1300 or of a component of the device 1300, the presence or absence of user contact with the device 1300, the orientation or acceleration/deceleration of the device 1300, and a change in temperature of the device 1300. The sensor assembly 1314 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1316 is configured to facilitate communications between the apparatus 1300 and other devices in a wired or wireless manner. The apparatus 1300 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1316 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1316 also includes a Near Field Communications (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 1304 comprising instructions, executable by the processor 1320 of the apparatus 1300 to perform the method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (18)

1. An image processing method is applied to a terminal device, wherein an operating system loaded by the terminal device comprises an application layer, an application framework layer and a system running layer, and the method comprises the following steps:
the system running layer receives a user interface refresh instruction sent by the application layer via the application framework layer; wherein the user interface refresh instruction comprises a blur parameter;
the system running layer performs blurring processing on the current image displayed by the terminal device according to the blur parameter to obtain a first blurred image; wherein the system running layer holds the current image displayed by the terminal device;
the system running layer obtains respective fusion coefficients of the current image and the first blurred image according to the blur parameter;
and the system running layer fuses the current image and the first blurred image according to the respective fusion coefficients to obtain an image to be displayed.
2. The method of claim 1, wherein the user interface refresh instruction further comprises a blur flag bit; before the system running layer performs blurring processing on the current image displayed by the terminal device according to the blur parameter to obtain the first blurred image, the method further comprises:
determining that the blur flag bit indicates that blurring processing is to be performed on the current image.
3. The method of claim 1, wherein the user interface refresh instruction further comprises a blur mode identifier; the performing, by the system running layer, blurring processing on the current image displayed by the terminal device according to the blur parameter to obtain the first blurred image comprises:
the system running layer obtains a corresponding blur mode according to the blur mode identifier and a correspondence between blur mode identifiers and blur modes;
and the system running layer performs blurring processing on the current image according to the blur parameter and the obtained blur mode to obtain the first blurred image.
4. The method of claim 3, wherein before the system running layer receives the user interface refresh instruction sent by the application layer via the application framework layer, the method further comprises:
acquiring an average brightness of the current image;
and obtaining the corresponding blur mode identifier according to the average brightness and a correspondence between brightness and blur mode identifiers.
5. The method according to claim 3, wherein the obtaining, by the system running layer, the respective fusion coefficients of the current image and the first blurred image according to the blur parameter comprises:
the system running layer obtains the respective fusion coefficients of the current image and the first blurred image according to the blur parameter and the blur mode.
6. The method according to claim 1, wherein the performing, by the system running layer, blurring processing on the current image displayed by the terminal device according to the blur parameter to obtain the first blurred image comprises:
performing reduction processing on the current image to obtain a compressed image;
performing blurring processing on the compressed image to obtain a second blurred image;
and performing amplification processing on the second blurred image to obtain the first blurred image; wherein a size of the first blurred image is the same as a size of the current image.
7. The method of claim 6, wherein the performing blurring processing on the compressed image to obtain the second blurred image comprises:
performing blurring processing on the compressed image according to a preset blur radius to obtain the second blurred image.
8. The method according to claim 1, wherein the current image is located in a first layer, and when a second layer exists above the first layer, the blurring parameter changes as an area of the first layer covered by the second layer changes.
9. An image processing apparatus, applied to a terminal device, wherein an operating system installed on the terminal device includes an application layer, an application framework layer, and a system running layer, the apparatus comprising:
a sending module configured to receive, by the system running layer, a user interface refresh instruction sent by the application layer via the application framework layer; wherein the user interface refresh instruction comprises a blur parameter;
a processing module configured to perform, by the system running layer, blurring processing on a current image displayed by the terminal device according to the blur parameter to obtain a first blurred image; wherein the system running layer holds the current image displayed by the terminal device;
a first obtaining module configured to obtain, by the system running layer, respective fusion coefficients of the current image and the first blurred image according to the blur parameter;
and a fusion module configured to fuse, by the system running layer, the current image and the first blurred image according to the respective fusion coefficients to obtain an image to be displayed.
10. The apparatus of claim 9, wherein the user interface refresh instruction further comprises a blur flag bit; the apparatus further comprises:
a determining module configured to determine that the blur flag bit indicates that blurring processing is to be performed on the current image.
11. The apparatus according to claim 9, wherein the user interface refresh instruction further comprises a blur mode identifier; the processing module comprises:
a first obtaining sub-module configured to obtain, by the system running layer, a corresponding blur mode according to the blur mode identifier and a correspondence between blur mode identifiers and blur modes;
and a second obtaining sub-module configured to perform, by the system running layer, blurring processing on the current image according to the blur parameter and the obtained blur mode to obtain the first blurred image.
12. The apparatus of claim 11, further comprising:
a second obtaining module configured to obtain an average brightness of the current image;
and a third obtaining module configured to obtain the corresponding blur mode identifier according to the average brightness and a correspondence between brightness and blur mode identifiers.
13. The apparatus according to claim 11, wherein the first obtaining module is further configured to obtain, by the system running layer, the respective fusion coefficients of the current image and the first blurred image according to the blur parameter and the blur mode.
14. The apparatus of claim 9, wherein the processing module comprises:
the first processing submodule is configured to perform reduction processing on the current image to obtain a compressed image;
a second processing sub-module configured to perform blurring processing on the compressed image to obtain a second blurred image;
a third processing submodule configured to perform amplification processing on the second blurred image to obtain the first blurred image; wherein a size of the first blurred image is the same as a size of the current image.
15. The apparatus according to claim 14, wherein the second processing sub-module is further configured to blur the compressed image according to a preset blur radius, resulting in the second blurred image.
16. The apparatus according to claim 9, wherein the current image is located in a first layer, and when a second layer exists above the first layer, the blurring parameter changes as an area of the first layer covered by the second layer changes.
17. An image processing apparatus, characterized in that the apparatus comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 8.
CN201810543070.XA 2018-05-30 2018-05-30 Image processing method and device Active CN108874482B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810543070.XA CN108874482B (en) 2018-05-30 2018-05-30 Image processing method and device


Publications (2)

Publication Number Publication Date
CN108874482A CN108874482A (en) 2018-11-23
CN108874482B true CN108874482B (en) 2021-06-15

Family

ID=64336715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810543070.XA Active CN108874482B (en) 2018-05-30 2018-05-30 Image processing method and device

Country Status (1)

Country Link
CN (1) CN108874482B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110879739A (en) * 2019-11-27 2020-03-13 广东欢太科技有限公司 Display method and display device of notification bar

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1467987A (en) * 2002-06-27 2004-01-14 ŵ��ʿ�ֻ���ʽ���� Image processing method, image processing program and memory media storing the program
CN101169867A (en) * 2007-12-04 2008-04-30 北京中星微电子有限公司 Image dividing method, image processing apparatus and system
CN103164868A (en) * 2011-12-09 2013-06-19 金耀有限公司 Method and device for generating image with depth-of-field (DOF) effect
CN104361558A (en) * 2014-10-24 2015-02-18 小米科技有限责任公司 Image processing method, device and equipment


Also Published As

Publication number Publication date
CN108874482A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
CN107330868B (en) Picture processing method and device
CN107888984B (en) Short video playing method and device
CN107341777B (en) Picture processing method and device
CN108462833B (en) Photographing method, photographing device and computer-readable storage medium
CN108200279B (en) Backlight adjusting method, device and equipment
CN106775235B (en) Screen wallpaper display method and device
CN105389083B (en) Large font implementation method and device and intelligent terminal
CN112331158B (en) Terminal display adjusting method, device, equipment and storage medium
CN107730443B (en) Image processing method and device and user equipment
CN107437269B (en) Method and device for processing picture
CN107563957B (en) Eye image processing method and device
CN111614910B (en) File generation method and device, electronic equipment and storage medium
CN108874482B (en) Image processing method and device
CN111261081B (en) Screen display method and device
CN109102779B (en) Backlight adjusting method and device
CN107122356B (en) Method and device for displaying face value and electronic equipment
CN114442792A (en) Method and device for adjusting operating frequency of processor and storage medium
CN110069310B (en) Method and device for switching desktop wallpaper and storage medium
CN112037719B (en) Screen brightness adjusting method and device
CN110312117B (en) Data refreshing method and device
CN115134516A (en) Shooting method and device
CN108769513B (en) Camera photographing method and device
CN107783704B (en) Picture effect adjusting method and device and terminal
CN111538447A (en) Information display method, device, equipment and storage medium
CN112019680A (en) Screen brightness adjusting method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant