CN115705620A - Image blurring method and device, electronic equipment and computer-readable storage medium - Google Patents

Info

Publication number: CN115705620A
Authority: CN (China)
Prior art keywords: blurring, image, light source, pixel, pixel point
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202110902442.5A
Other languages: Chinese (zh)
Inventors: 李鹏, 任世强, 刘阳兴
Current assignee: Wuhan TCL Group Industrial Research Institute Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Wuhan TCL Group Industrial Research Institute Co Ltd
Application filed by Wuhan TCL Group Industrial Research Institute Co Ltd
Priority to: CN202110902442.5A
Publication of: CN115705620A

Landscapes

  • Image Processing (AREA)
Abstract

The embodiment of the application provides an image blurring method and device, an electronic device, and a computer-readable storage medium. The method includes the following steps: obtaining a Y-channel image from a target YUV image; determining a blurring weight coefficient for each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image; and performing blurring processing on the target YUV image according to the blurring weight coefficients to obtain a target blurred image. By separately extracting the Y-channel image and using the pixel value of each of its pixel points to blur the image, the method achieves a good blurring effect for spot blurring and other luminance-related blurring, and effectively reduces the amount of computation during blurring.

Description

Image blurring method and device, electronic equipment and computer-readable storage medium
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to an image blurring method and device, electronic equipment and a computer-readable storage medium.
Background
Image blurring is the process of blurring the background region of an image so as to highlight the subject region. It is now widely used in shooting terminals such as mobile phones and cameras, allowing blurred images to be captured directly.
At present, a shooting terminal generally needs to convert the captured image into RGB format, which occupies more memory, before performing blurring. However, some special blurring requirements, such as spot blurring, usually require blurring the regions of the image whose luminance meets a specific condition. Because an RGB image cannot directly represent the luminance of each pixel, the bright regions cannot be located accurately, so the blurring effect is not ideal for spot blurring or other luminance-related blurring.
Disclosure of Invention
The embodiment of the application provides an image blurring method and device, an electronic device, and a computer-readable storage medium, aiming to solve the technical problem that the effect of existing blurring methods is not ideal.
In one aspect, an embodiment of the present application provides an image blurring method, including:
obtaining a Y-channel image from a target YUV image;
determining a blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image;
and performing blurring processing on the target YUV image according to the blurring weight coefficient to obtain a target blurring image.
On the other hand, an embodiment of the present application further provides an image blurring device, including:
the obtaining module is used for obtaining a Y-channel image from the target YUV image;
the determining module is used for determining a blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image;
and the blurring module is used for blurring the target YUV image according to the blurring weight coefficient to obtain a target blurring image.
On the other hand, an embodiment of the present application further provides an electronic device, where the electronic device includes a processor, a memory, and an image blurring program stored in the memory and capable of being executed on the processor, and the processor executes the image blurring program to implement the steps in the image blurring method.
On the other hand, an embodiment of the present application further provides a computer-readable storage medium, on which an image blurring program is stored, where the image blurring program is executed by a processor to implement the steps in the image blurring method.
Compared with the prior art, the image blurring method provided by the application uses the pixel value of each pixel point in the Y-channel image to blur the image, that is, blurring is performed directly in the YUV color space. On the one hand, the pixel value of each pixel point of the Y-channel image directly represents the brightness value of the corresponding pixel point in the image, which gives a good blurring effect for spot blurring or other luminance-related blurring; on the other hand, separately extracting the Y-channel image effectively reduces the amount of computation during blurring.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a scene schematic diagram of an image blurring method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a first embodiment of an image blurring method provided in an embodiment of the present application;
FIG. 3 is a flowchart illustrating an image blurring method according to a second embodiment of the present disclosure;
FIG. 4 is a flowchart illustrating a third exemplary embodiment of an image blurring method provided in an embodiment of the present application;
FIG. 5 is a flowchart illustrating a fourth exemplary embodiment of an image blurring method provided in an embodiment of the present application;
FIG. 6 is a flowchart illustrating a fifth exemplary embodiment of an image blurring method provided in an embodiment of the present application;
fig. 7 is a schematic flowchart of a sixth embodiment of an image blurring method provided in an embodiment of the present application;
fig. 8 is a schematic flowchart of a seventh embodiment in the image blurring method provided in the embodiment of the present application;
FIG. 9 is a block diagram illustrating an exemplary embodiment of an image blurring apparatus provided in an exemplary embodiment of the present application;
fig. 10 is a schematic structural diagram of an embodiment of an electronic device provided in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the scope of protection of the present application.
In the embodiments of the present application, the word "exemplary" means "serving as an example, instance, or illustration". Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the application. In the following description, details are set forth for the purpose of explanation. It will be apparent to one of ordinary skill in the art that the application may be practiced without these specific details. In other instances, well-known structures and processes are not shown in detail to avoid obscuring the description with unnecessary detail. Thus, the application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Embodiments of the present application provide an image blurring method, an image blurring device, an electronic device, and a computer-readable storage medium, which are respectively described in detail below.
The image blurring method in the embodiment of the application is applied to an image blurring device, the image blurring device is arranged on an electronic device, the electronic device comprises a memory, a processor and an image blurring program which is stored in the memory and can be run on the processor, and the processor executes the image blurring program, so that the steps in the image blurring method are realized.
As shown in fig. 1, fig. 1 is a schematic view of an image blurring scene according to an embodiment of the present application. The scene includes an image blurring device 100 and a shooting device 200. The shooting device 200 is mainly used for shooting YUV images, and the image blurring device 100 runs the image blurring program corresponding to the image blurring method to perform the image blurring steps.
The image blurring device 100 in the embodiment of the present application is mainly used for: obtaining a Y-channel image from a target YUV image; determining a blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image; and performing blurring processing on the target YUV image according to the blurring weight coefficients to obtain a target blurred image.
It should be noted that the schematic view of the image blurring scene shown in fig. 1 is only an example; the scene is described to explain the technical solution of the embodiment of the present application more clearly, and does not limit the technical solution provided by the embodiment of the present application.
Based on the scene of image blurring, an embodiment of an image blurring method is provided.
As shown in fig. 2, fig. 2 is a schematic flowchart of a first embodiment of an image blurring method provided in an embodiment of the present application, an execution subject of the image blurring method is an electronic device, and the image blurring method in this embodiment includes steps 201 to 203:
and 201, providing a Y-channel image in the target YUV image.
In this embodiment, the target YUV image generally refers to an image directly captured by a camera. Taking a smartphone as an example, when the user selects blurred shooting, the image directly captured by the camera on the smartphone is the target YUV image; it is input into the image blurring device and processed by the image blurring method provided by the present application to obtain the blurred photograph the user requires. In other scenarios, the target YUV image may of course be obtained in any other manner; for example, it may be converted from a provided initial image in another color format according to existing format conversion rules.
In this embodiment, a YUV image generally includes three channels, that is, each pixel in the YUV image is represented by the three components Y, U, and V. Therefore, extracting the Y component of each pixel separately yields the Y-channel image of the target YUV image.
In this embodiment, the Y component of a YUV image represents the luminance value of a pixel, while the U and V components together represent its chrominance. That is, the Y-channel image obtained by the image blurring device can be understood as the luminance image of the target YUV image, and the pixel value of each pixel point in the Y-channel image is the luminance value of the corresponding pixel point in the target YUV image.
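The Y-plane extraction described above can be sketched as follows; the array layouts (an interleaved H×W×3 array and a planar I420 byte buffer) and the function names are illustrative assumptions, not part of the patent:

```python
import numpy as np

def extract_y_channel(yuv: np.ndarray) -> np.ndarray:
    """Return the Y (luma) plane of an H x W x 3 interleaved YUV array."""
    return yuv[..., 0]

def extract_y_from_i420(buf: np.ndarray, h: int, w: int) -> np.ndarray:
    """In a planar I420 buffer the first h*w bytes are the full-resolution
    Y plane; the quarter-resolution U and V planes follow it."""
    return buf[: h * w].reshape(h, w)

yuv = np.zeros((4, 4, 3), dtype=np.uint8)
yuv[..., 0] = 200  # luma component of every pixel
y = extract_y_channel(yuv)
print(y.shape)  # (4, 4)
```

Real camera buffers are usually planar (NV21, I420, ...), so in practice the Y plane is already contiguous and no per-pixel unpacking is needed.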
202, determining a blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image.
In this embodiment, for blurring operations related to the brightness values of the pixels in the image, the image blurring device generally sets the blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image, setting a higher blurring weight coefficient for pixel points that meet the blurring requirement. Taking spot blurring as an example, spot blurring refers to blurring the light source regions in an image: the image blurring device determines, from the pixel value of each pixel point in the Y-channel image, whether the pixel point lies in a light source region, and if so, sets a higher blurring weight coefficient for it to guarantee its blurring effect, thereby achieving spot blurring overall. Specific implementation flows of spot blurring are given in subsequent fig. 3 to 5 and the accompanying explanations.
It should be noted that blurring is usually applied to the background region of an image, and common spot blurring mainly blurs the light source regions that lie outside the subject region, i.e., within the background. As a preferred embodiment of the present application, when setting the blurring weight coefficient of each pixel point in the target YUV image, the image blurring device identifies the background region according to the depth value corresponding to each pixel point and completes the setting of the blurring weight coefficients in combination with the background region; for details, refer to fig. 6 and its explanation below.
203, performing blurring processing on the target YUV image according to the blurring weight coefficients to obtain a target blurred image.
In this embodiment, as described above, the blurring weight coefficients are set based on the pixel value of each pixel in the Y-channel image, i.e., the brightness value of each pixel in the target YUV image. Therefore, after the image blurring device adjusts the pixel value of each pixel in the target YUV image using its blurring weight coefficient, the resulting blurred image is clearly associated with the brightness values. Taking spot blurring as an example, since the blurring weight coefficients of the pixel points in the light source regions were set higher in the preceding steps, the light source regions in the final blurred image show an obvious blurring effect.
In this embodiment, in general, the pixel value of each pixel in the target YUV image is processed using the blurring weight coefficient of each pixel to obtain the blurred image. Specifically, the pixel values of the pixel points in the neighborhood of a pixel point are weighted by their blurring weight coefficients and summed to obtain the adjusted pixel value of that pixel point. Because the blurring weight coefficient of a pixel point in a light source region is higher than that of a pixel point in a non-light-source region, the pixel values of pixel points near a light source region are influenced more strongly by the light source pixels, which produces the blurring effect of the light source region.
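The weighted neighborhood sum described in this paragraph might look like the following sketch; the 3×3 neighborhood, the normalization by the weight sum, and the example weights 1 and 5 are illustrative assumptions:

```python
import numpy as np

def weighted_blur(y: np.ndarray, weights: np.ndarray, radius: int = 1) -> np.ndarray:
    """Blur by a weighted average over a (2*radius+1)^2 neighborhood.

    `weights` holds the per-pixel blurring weight coefficients; bright
    (light-source) pixels with larger weights dominate the average, so
    their brightness bleeds into neighboring pixels.
    """
    h, w = y.shape
    pad_y = np.pad(y.astype(np.float64), radius, mode="edge")
    pad_w = np.pad(weights.astype(np.float64), radius, mode="edge")
    num = np.zeros((h, w))
    den = np.zeros((h, w))
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            num += pad_w[dy:dy + h, dx:dx + w] * pad_y[dy:dy + h, dx:dx + w]
            den += pad_w[dy:dy + h, dx:dx + w]
    return num / den

y = np.full((5, 5), 50.0)
y[2, 2] = 250.0                          # one bright light-source pixel
wts = np.ones((5, 5)); wts[2, 2] = 5.0   # higher blurring weight inside the light source
out = weighted_blur(y, wts)
print(round(out[2, 1], 1))  # 126.9 -- pulled up from 50 toward the light source
```

Because the light-source pixel carries weight 5 while its neighbors carry weight 1, its neighbors' adjusted values are pulled strongly toward the light-source brightness, which is exactly the spot-bleeding effect the text describes.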
In this embodiment, it should be noted that blurring the image means blurring the entire YUV image, that is, blurring both the Y-channel image and the UV-channel image according to the blurring weight coefficient of each pixel point. However, the UV-channel image is usually 1/4 the size of the Y-channel image. Therefore, after obtaining the blurring weight coefficient of each pixel in the YUV image from the Y-channel image, the blurring weight coefficients need to be downsampled when processing the UV-channel image, so that they correspond one-to-one to the pixels of the UV-channel image. The specific sampling manner is not limited here.
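Since the text leaves the sampling manner open, here is one hedged possibility for 4:2:0 data, averaging each 2×2 block of weights so the weight map matches the quarter-area UV plane:

```python
import numpy as np

def downsample_weights(weights: np.ndarray) -> np.ndarray:
    """Downsample a full-resolution blurring-weight map to UV resolution.

    YUV 4:2:0 stores one U and one V sample per 2x2 block of Y samples,
    so the UV plane has 1/4 the area of the Y plane. Averaging each 2x2
    block is only one choice; taking every other sample would also work.
    """
    h, w = weights.shape
    return weights.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

wts = np.ones((4, 4))
wts[0, 0] = 5.0  # one light-source pixel's weight
uv_wts = downsample_weights(wts)
print(uv_wts.shape)  # (2, 2)
```

After downsampling, each UV sample has exactly one weight, satisfying the one-to-one correspondence the paragraph requires.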
Further, in order to improve the blurring effect, another implementation manner for blurring to obtain a blurred image is also provided in this embodiment, specifically, for different pixel points, the neighborhood ranges of the selected neighborhood pixel points are different, and please refer to subsequent fig. 7 and the contents explained in the following description.
As a further preferred embodiment of the present application, after blurring the image, the image blurring device further determines the regions where the depth value changes abruptly according to the variation of the depth value of each pixel point in the YUV image, and then smooths these regions in the blurred image to obtain a blurred image with a better blurring effect; refer to fig. 8 and its explanation below.
Compared with the prior art, the image blurring method provided by the application blurs the image using the pixel value of each pixel point in the Y-channel image, that is, blurring is performed directly in the YUV color space. On the one hand, the pixel value of each pixel point of the Y-channel image directly represents the brightness value of the corresponding pixel point in the image, which gives a good blurring effect for spot blurring or other luminance-related blurring; on the other hand, separately extracting the Y-channel image effectively reduces the amount of computation during blurring.
As shown in fig. 3, fig. 3 is a schematic flowchart of a second embodiment of an image blurring method provided in the embodiment of the present application.
Spot blurring is the most common blurring operation related to the brightness values of pixel points; it is widely applied in mobile phones, cameras, and other shooting terminals, and thus has broad application scenarios. Therefore, taking spot blurring as an example, this embodiment provides a specific implementation of determining the blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image, including steps 301 to 303:
and 301, determining a light source area in the target YUV image according to the pixel value of each pixel point in the Y-channel image and a preset light source segmentation threshold.
In this embodiment, it can be understood that spot blurring is mainly performed on the light source regions in the image. Because the brightness of a pixel point in a light source region is high, its corresponding pixel value in the Y-channel image is also very high. The image blurring device can therefore determine the light source regions in the target YUV image by comparing the pixel value of each pixel point in the Y-channel image with a preset light source segmentation threshold: if the pixel value of a pixel point is higher than the segmentation threshold, its brightness is high and it is likely to lie in a light source region. Preferably, experimental data show that setting the light source segmentation threshold to 250 separates the pixel points inside light source regions well from those outside.
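The threshold test described above is a simple comparison; in this sketch the value 250 follows the experimental figure quoted in the text, while the function name is an assumption:

```python
import numpy as np

LIGHT_SOURCE_THRESHOLD = 250  # value reported as working well in the text

def light_source_mask(y: np.ndarray, thresh: int = LIGHT_SOURCE_THRESHOLD) -> np.ndarray:
    """Mark pixels whose luma strictly exceeds the segmentation threshold."""
    return y > thresh

y = np.array([[10, 251], [249, 255]], dtype=np.uint8)
mask = light_source_mask(y)
print(mask.tolist())  # [[False, True], [False, True]]
```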
In combination with the foregoing, the image blurring device may determine the light source regions in the target YUV image directly from the relation between the pixel value of each pixel point in the Y-channel image and the preset light source segmentation threshold, that is, treat every pixel point whose pixel value exceeds the threshold as a light source pixel. However, the image may contain noise points, or pixel points whose brightness is high although they are not light sources. As a preferred alternative, the present application therefore further proposes to first segment candidate light source contours using the light source segmentation threshold, and then judge whether each contour is a true light source contour according to the ratio of its area to the area of its minimum bounding rectangle, so as to determine the light source regions more accurately; for details, refer to fig. 4 and its explanation.
302, setting the blurring weight coefficient of each pixel point outside the light source region.
In this embodiment, for convenience of description, a region other than the light source region in the target YUV image is hereinafter simply referred to as a non-light source region.
In this embodiment, the setting of the blurring weight coefficient of each pixel point in the non-light source region and the subsequent setting of the blurring weight coefficient of each pixel point in the light source region in the target YUV image may be performed separately or in association, and the specific setting rule will be described in the subsequent step 303.
303, setting the blurring weight coefficient of each pixel point in the light source region.
Wherein the blurring weight coefficient of each pixel point in the light source area is larger than that of each pixel point outside the light source area; or, the average value of the blurring weight coefficients of the pixels in the light source region in the preset proportion is greater than the average value of the blurring weight coefficients of all the pixels outside the light source region, and the preset proportion may be nine tenths or eight tenths, which is not limited herein.
In this embodiment, after the image blurring device determines the light source region and the non-light source region in the target YUV image according to the pixel value of each pixel and the preset light source segmentation threshold, it needs to set the blurring weight coefficient of each pixel in the light source region and the blurring weight coefficient of each pixel in the non-light source region, respectively. There are many specific setting rules, however, no matter what rule is used for setting, the blurring weight coefficient of each pixel point in the light source region needs to be greater than the blurring weight coefficient of each pixel point in the non-light source region, so as to ensure that the subsequent light spot blurring effect can be normally realized.
As a feasible scheme, the image blurring device may uniformly set the blurring weight coefficient of each pixel point in the non-light source region as a first blurring weight coefficient, and uniformly set the blurring weight coefficient of each pixel point in the light source region as a second blurring weight coefficient, so that the second blurring weight coefficient is greater than the first blurring weight coefficient. For example, the image blurring device may set the blurring weight coefficient of each pixel in the non-light source region to 1, and set the blurring weight coefficient of each pixel in the light source region to 5.
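A minimal sketch of this uniform-weight scheme, using the example constants 1 and 5 from the text (the helper name and the boolean-mask representation are assumptions):

```python
import numpy as np

def uniform_weights(light_mask: np.ndarray,
                    non_source_w: float = 1.0,
                    source_w: float = 5.0) -> np.ndarray:
    """Assign one constant blurring weight coefficient per region:
    `source_w` inside the light source region, `non_source_w` outside."""
    return np.where(light_mask, source_w, non_source_w)

mask = np.array([[False, True], [False, False]])
print(uniform_weights(mask).tolist())  # [[1.0, 5.0], [1.0, 1.0]]
```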
As another feasible scheme, the blurring weight coefficient of each pixel point in the light source region is set according to the blurring weight coefficient of each pixel point outside the light source region and a correction coefficient related to the brightness value of each pixel point in the light source region, and the brightness of the final blurring light spot can be controlled. At this time, please refer to the following fig. 5 and the explanation thereof for the specific steps.
This embodiment provides a specific implementation of setting the blurring weight coefficient of each pixel point according to the pixel value of each pixel point in the Y-channel image during spot blurring. That is, when the blurring weight coefficients are set using the method provided by this embodiment, the effect of spot blurring is finally achieved, and the method can be widely applied in the blurred-shooting programs of mobile phones, cameras, and other shooting terminals.
As shown in fig. 4, fig. 4 is a schematic flowchart of a third embodiment of an image blurring method provided in the embodiment of the present application.
During spot blurring there is interference from noise points, and the brightness values of some non-light-source objects are also high when they reflect light. Directly judging whether a pixel point is a light source pixel from its brightness value and the preset light source segmentation threshold is therefore prone to misjudgment. This embodiment of the application provides another technical solution that can extract the light source regions accurately, including steps 401 to 404:
401, segmenting the Y-channel image according to a size relationship between a pixel value of each pixel point in the Y-channel image and a preset light source segmentation threshold value to obtain at least one candidate light source region in the Y-channel image; and the pixel value of each pixel point in the candidate light source area is greater than the light source segmentation threshold.
In this embodiment, the image blurring device segments the Y-channel image using the light source segmentation threshold to obtain candidate light source regions. Compared with judging each pixel point individually against the threshold, the candidate light source regions take more of the overall features of the image into account.
And 402, determining a minimum bounding rectangle area corresponding to each candidate light source area.
In this embodiment, in general, the image blurring device extracts the contour of each candidate light source region and then determines its minimum bounding rectangle, that is, the smallest rectangle that can contain the contour. The specific way of determining the minimum bounding rectangle is not limited here; for example, the bounding rectangles containing the contour at each orientation may be determined, and the one with the smallest area, i.e., containing the fewest pixel points, selected as the minimum bounding rectangle.
403, respectively obtaining the number of pixel points of each candidate light source region and the corresponding minimum circumscribed rectangular region.
In this embodiment, in a general case, the area of the image region may be determined by using a point counting method, that is, the number of pixels in each image region is calculated. Therefore, after the minimum circumscribed rectangular region is determined, the number of pixel points in each candidate light source region and the number of pixel points in the corresponding minimum circumscribed rectangular region are respectively obtained.
And 404, respectively calculating the ratio of the number of the pixel points of each candidate light source area to the number of the pixel points of the corresponding minimum circumscribed rectangular area to obtain the light source occupation ratio of each candidate light source area, and determining the candidate light source area with the light source occupation ratio higher than a preset occupation ratio threshold value as the light source area in the target YUV image.
In this embodiment, it can be understood that the light sources appearing in an image are generally point or line light sources. The light source regions they form are regular, the brightness values of their interior pixel points are high, and non-light-source areas generally do not occur inside them; that is, the light source region occupies a large portion of its minimum bounding rectangle, so the ratio of the number of its pixel points to the number of pixel points in the rectangle is high. Irregular light source regions, such as crescent-shaped ones, or uneven regions containing many fine non-light-source areas, are more likely to be light reflected from irregular objects rather than direct light sources, and they occupy a smaller portion of their minimum bounding rectangle. Therefore, the image blurring device calculates, for each candidate light source region, the ratio of its number of pixel points to that of its minimum bounding rectangle to obtain its light source proportion, and compares this proportion with a preset proportion threshold: if the proportion is higher than the threshold, the candidate region is more likely to be a true light source region. Preferably, experiments show that presetting the proportion threshold to 0.4 identifies the real light source regions among the candidates well.
This embodiment provides a method for determining the light source regions from the pixel values of an image. The image is first segmented with a threshold to obtain candidate light source regions that contain overall features; then, based on the properties of the light source regions formed by common point and line light sources, the ratio of each candidate region's area to that of its minimum bounding rectangle, i.e., the ratio of their pixel counts, is used to remove pseudo light source regions, such as irregular or uneven regions more likely formed by reflections off irregular objects. The light source regions finally obtained are thus located more accurately, misjudgments are reduced, and the spot blurring effect is improved.
As shown in fig. 5, fig. 5 is a schematic flowchart of a fourth embodiment of an image blurring method provided in the embodiment of the present application.
Considering that directly setting the blurring weight coefficients of the pixel points in the light source region and the non-light source region to fixed constants makes it difficult to adjust the brightness of the finally obtained blurred light spots, this embodiment further provides a method for setting the blurring weight coefficient of each pixel point in the light source region based on the blurring weight coefficient of the pixel points in the non-light source region, including steps 501-502:
501, calculating a blurring weight correction coefficient of each pixel point in the light source region according to a difference value between a pixel value of each pixel point in the light source region and a light source segmentation threshold value.
In this embodiment, instead of directly setting the blurring weight coefficient of the pixel points in the light source region to a constant greater than 1, the image blurring device sets the blurring weight correction coefficient of each pixel point by using the difference between its luminance value and the preset light source segmentation threshold. That is, different correction coefficients are set for different luminance values in the light source region.
502, setting the blurring weight coefficient of each pixel point in the light source area according to the blurring weight correction coefficient of each pixel point in the light source area and the blurring weight coefficient of each pixel point outside the light source area.
In this embodiment, the blurring weight coefficient of each pixel outside the light source area may be preset to 1, which is not limited herein.
In this embodiment, in a normal case, if the blurring weight correction coefficient is a positive number, the blurring weight coefficient of each pixel point in the light source region may be the sum of the blurring weight correction coefficient and the blurring weight base coefficient; if the blurring weight correction coefficient is a number greater than 1, the blurring weight coefficient of each pixel point in the light source region may also be the product of the two. Specifically, a preferred calculation formula of the blurring weight coefficient of each pixel point in the light source region is as follows:
W_{i,j} = 1 + 5 * exp(-((Y_4_{i,j} - 255) / var)^2)
where W_{i,j} is the blurring weight coefficient of pixel point I_{i,j} in the image, Y_4_{i,j} is the pixel value of pixel point I_{i,j} in the Y-channel image, and var is the luminance coefficient for controlling the light spot; preferably, var = 25.
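As a sketch only: the preferred formula above, read with a negative exponent so that the weight peaks at full brightness (Y = 255) and decays toward the base coefficient 1, can be evaluated per pixel with NumPy. The function name and test values are illustrative assumptions:

```python
import numpy as np

def light_source_weights(y_channel: np.ndarray, var: float = 25.0) -> np.ndarray:
    # W = 1 + 5 * exp(-((Y - 255) / var)^2): equals 6 at full
    # brightness and decays toward the base coefficient 1 for
    # darker pixels; var controls how quickly the spot weight falls off.
    y = y_channel.astype(np.float64)
    return 1.0 + 5.0 * np.exp(-(((y - 255.0) / var) ** 2))

# brighter pixels receive strictly larger blurring weights
w = light_source_weights(np.array([[255, 230, 128]]))
```

In practice this would only be applied to pixels inside the detected light source region, with the remaining pixels keeping the base coefficient 1.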
Compared with the technical scheme of setting the blurring weight coefficients of the pixel points inside and outside the light source region to two different fixed constants, setting the blurring weight coefficient of each pixel point in the light source region according to its brightness value and the blurring weight coefficient of the pixel points outside the light source region makes it convenient to control the brightness of the blurred light spots and improves the light spot blurring effect.
As shown in fig. 6, fig. 6 is a schematic flowchart of a fifth embodiment of an image blurring method provided in the embodiment of the present application.
Considering that image blurring is usually performed mainly on the background area, and light spot blurring likewise mainly targets the light sources in the background area, this embodiment further provides a technical solution for segmenting the continuous background region of the image by using the depth value of each pixel point, including steps 601-603:
601, obtaining the depth value corresponding to each pixel point in the Y channel image and the focus depth value of the Y channel image.
In this embodiment, in a normal case, the depth value corresponding to each pixel point in the captured YUV image and the focus depth value may be determined from a depth image captured at the same time by a capture device having a depth acquisition function. That is to say, if the depth values are to be used, the capture device needs a depth acquisition capability; for example, the device may capture the YUV image to be blurred and the corresponding depth image by means of a binocular lens or a TOF lens, where the pixel value of each pixel point in the depth image is the depth value of the corresponding pixel point in the YUV image. Of course, obtaining the depth value corresponding to each pixel point of the YUV image and the focus depth value in other ways is also feasible.
602, determining a background area in the target YUV image according to the depth value corresponding to each pixel point in the Y-channel image and the focus depth value.
In this embodiment, it can be understood that the region at the focus is usually the main (subject) region. Therefore, the image blurring device can quickly distinguish the background region and the main region in the YUV image according to the difference between the depth value corresponding to each pixel point in the Y-channel image and the focus depth value. In general, a pixel point whose depth value differs from the focus depth value by more than a preset depth tolerance threshold may be regarded as a pixel point in the background region, and a pixel point whose difference is less than or equal to the preset depth tolerance threshold may be regarded as a pixel point in the main region.
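The background/main split of this step can be sketched as a simple threshold on the depth difference; a minimal NumPy illustration, with the function name assumed and the tolerance default of 3 taken from the text below:

```python
import numpy as np

def background_mask(depth: np.ndarray, focus_depth: float, tol: float = 3.0) -> np.ndarray:
    # Pixels whose depth differs from the focus depth by more than the
    # tolerance threshold are background; the rest form the main region.
    return np.abs(depth - focus_depth) > tol
```

The returned boolean array plays the role of the Mask map mentioned below.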
Preferably, considering that different degrees of blurring may be applied to different pixel points according to their depth values, for example, the larger the difference between the depth value and the focus depth value, the higher the blurring degree, a blurring radius map may be determined from the depth value of each pixel point and the focus depth value: the blurring radius of each pixel point increases as the difference between its depth value and the focus depth value increases, and when this difference is smaller than the depth tolerance threshold, the corresponding blurring radius is 0. Specifically, the calculation rule of the blurring radius map is as follows:
r_{i,j} = L * |D_{i,j} - D_f| / D_max, if |D_{i,j} - D_f| ≥ Δd; r_{i,j} = 0, otherwise
where r_{i,j} is the blurring radius corresponding to pixel point I_{i,j}, L is a preset blurring strength coefficient, D_{i,j} is the depth value corresponding to pixel point I_{i,j}, D_f is the focus depth value, D_max is the maximum of the depth values corresponding to the pixel points, and Δd is the depth tolerance threshold, usually preset to 3.
In this embodiment, it can be seen that the blurring radius reflects the difference between the depth value of each pixel point and the focus depth value. In general, the pixel points with r_{i,j} > 0 may be determined as the pixel points of the background area. Of course, the pixel points whose r_{i,j} is greater than a given positive number may also be determined as the pixel points of the background area; taking 1 as the threshold for example, only the pixel points whose difference between the depth value and the focus depth value is greater than D_max/L are then regarded as pixel points of the background area. However, no matter which threshold is selected, the background area is always associated with the depth values of the pixel points.
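Assuming the calculation rule reads r = L·|D − D_f|/D_max outside the depth tolerance and 0 inside it, as the surrounding description suggests, the blurring radius map can be sketched as follows; the strength value used in the example is arbitrary:

```python
import numpy as np

def blur_radius_map(depth: np.ndarray, focus_depth: float, d_max: float,
                    strength: float = 20.0, tol: float = 3.0) -> np.ndarray:
    # r = L * |D - D_f| / D_max where the depth difference is at least
    # the tolerance threshold; near-focus pixels get radius 0 (no blur).
    diff = np.abs(depth - focus_depth)
    r = strength * diff / d_max
    r[diff < tol] = 0.0
    return r
```

With this map, `r > 0` directly yields the background mask discussed above, and the per-pixel radius later controls the neighborhood size in the blurring step.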
It should be noted that, in general, the background area is represented in a Mask map Mask in a computer, and the definition of the Mask map belongs to the common knowledge of those skilled in the art, and the present invention is not described herein again.
603, determining the blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the background region.
In this embodiment, it can be understood that the light spot blurring is likewise performed on the light sources in the background region. That is, after the background region is determined by using the depth values, the image blurring device determines the light source region in the target YUV image according to the pixel value of each pixel point in the background region and the preset light source segmentation threshold, i.e., the light source regions are searched for only within the background region. At this time, in the steps shown in fig. 3 to 5, the processing on the Y-channel image also needs to be adaptively adjusted to process only the background region of the Y-channel image.
Of course, it should be emphasized that, in addition to searching for light source regions only within the background region, in the subsequent blurring process only the pixel points in the background region are blurred; specifically, step 204 is adaptively adjusted to: performing blurring processing on the background region in the target YUV image according to the blurring weight coefficient to obtain the target blurring image.
In this embodiment, the background region of the image is segmented by introducing the depth value, and then the light source region in the background region is searched and blurred, so that the light spot blurring effect for the background region can be realized, and the actual requirements are met.
As shown in fig. 7, fig. 7 is a schematic flowchart of a sixth embodiment of an image blurring method provided in the embodiment of the present application.
In order to further improve the blurring effect, in this embodiment, neighborhood ranges of different sizes are adopted for different pixel points, including steps 701 to 704:
701, acquiring a target blurring radius parameter corresponding to a target pixel point in a background area in the Y-channel image.
In this embodiment, the target blurring radius parameter may be calculated by the blurring radius calculation formula provided in step 602. It can be seen that the blurring radius parameter is related to the difference between the depth value of the target pixel point and the focus depth value: the larger the difference, i.e., the farther the target pixel point is from the focus, the larger its blurring radius parameter. That is, a light source region farther away is blurred to a higher degree and yields a larger blurred light spot, while a light source region closer to the main region is blurred to a lower degree and yields a smaller light spot, which expresses an obvious layered sense of blurring. When the difference between the depth value of the target pixel point and the focus depth value is smaller than the threshold, the blurring radius parameter is 0, i.e., the non-background area is not blurred.
702, determining neighborhood pixel points corresponding to the target pixel point according to the target blurring radius parameter.
In this embodiment, after determining the target blurring radius parameter of the target pixel point, the image blurring device takes as neighborhood pixel points those pixel points whose abscissa difference and ordinate difference from the target pixel point are both smaller than the target blurring radius parameter; that is, the neighborhood pixel points are all the pixel points in the square region whose abscissa ranges over (i - r, i + r) and ordinate ranges over (j - r, j + r), where i and j are the abscissa and ordinate of the target pixel point, and r is the target blurring radius parameter corresponding to the target pixel point.
703, performing weighted summation on the pixel values of the neighborhood pixel points according to the blurring weight coefficient corresponding to each neighborhood pixel point to obtain the blurring pixel value corresponding to the target pixel point.
In this embodiment, the image blurring device may take the blurring weight coefficient of each neighborhood pixel point as a weighting coefficient and perform weighted summation on the pixel values of the neighborhood pixel points to obtain the blurring pixel value of the target pixel point. However, considering that the blurring pixel value should also normally lie in the range 0 to 255, the blurring weight coefficients of the neighborhood pixel points generally need to be normalized: the blurring weight coefficient of each neighborhood pixel point is divided by the sum of all the blurring weight coefficients to obtain normalized blurring weight coefficients, and then the pixel value of each neighborhood pixel point is multiplied by its normalized blurring weight coefficient and the products are summed, so that a blurring pixel value within the range 0 to 255 is obtained.
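A direct, unoptimized sketch of steps 701-703 — per-pixel neighborhood selection followed by a normalized weighted mean. All names are illustrative, and a real implementation would vectorize this loop:

```python
import numpy as np

def weighted_blur(channel: np.ndarray, weights: np.ndarray,
                  radius_map: np.ndarray) -> np.ndarray:
    # For each pixel with radius r > 0, replace its value with the
    # normalized weighted mean of its (2r+1) x (2r+1) neighborhood,
    # clipped to the image borders.
    h, w = channel.shape
    out = channel.astype(np.float64).copy()
    for i in range(h):
        for j in range(w):
            r = int(radius_map[i, j])
            if r <= 0:
                continue  # in-focus pixels keep their original value
            y0, y1 = max(0, i - r), min(h, i + r + 1)
            x0, x1 = max(0, j - r), min(w, j + r + 1)
            win_w = weights[y0:y1, x0:x1].astype(np.float64)
            win_v = channel[y0:y1, x0:x1].astype(np.float64)
            # dividing by the weight sum normalizes the coefficients,
            # keeping the result inside the original 0..255 range
            out[i, j] = (win_w * win_v).sum() / win_w.sum()
    return out
```

The same routine applies unchanged to the U and V channels, as noted below for the UV-channel image.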
704, adjusting the pixel value of each pixel point in the target YUV image to the blurring pixel value corresponding to the pixel point, so as to obtain the target blurring image.
In this embodiment, the image blurring device adjusts the pixel value of each pixel point in the YUV image to the blurring pixel value corresponding to that pixel point, whereby the blurred image can be obtained. The specific calculation formula is as follows:
Blu_Y_2_{i,j} = ( Σ_{(m,n)} W_{m,n} * Y_2_{m,n} ) / ( Σ_{(m,n)} W_{m,n} ), the sums taken over the neighborhood pixel points (m, n) of pixel point I_{i,j}
where Blu_Y_2 is the blurred image of the Y-channel image, Y_2 denotes the Y-channel image, W is the blurring weight coefficient (which can also be understood as a blurring weight coefficient map), and (m, n) indexes the value corresponding to pixel point I_{m,n} in each image.
For the blurring process of the UV channel image, the specific calculation formula is the same as above, except that the Y channel image is replaced with the UV channel image.
The blurring mode provided by this embodiment introduces a blurring radius parameter for each pixel point, that is, different neighborhood ranges are selected for pixel points at different positions. Specifically, the blurring radius parameter is positively correlated with the difference between the depth value of the pixel point and the focus depth value: the larger the difference, the larger the blurring radius parameter. In other words, the deeper the background in the image, the larger the blurred light spot, while a light source region close to the main region yields a smaller blurred light spot. This highlights the sense of depth of the blurred image and further improves the blurring effect.
As shown in fig. 8, fig. 8 is a schematic flowchart of a seventh embodiment of an image blurring method provided in the embodiment of the present application.
Considering that, in the process of blurring an image, a sudden change of depth values may cause an obvious fault (discontinuity) in the blurring result, in order to improve the blurring effect, this embodiment further provides a technical scheme of determining an abrupt change region according to the depth value corresponding to each pixel point and smoothing that region in the obtained blurred image, including steps 801 to 803:
801, performing blurring processing on the target YUV image according to the blurring weight coefficient to obtain an initial blurring image corresponding to the target YUV image.
In this embodiment, the target YUV image is processed according to the method shown in fig. 7, and the obtained blurred image is an initial blurred image which may have abrupt changes, that is, the blurred image may be discontinuous due to abrupt changes of depth values.
802, determining an abrupt change region in the initial blurring image according to the depth value corresponding to each pixel point in the Y-channel image.
In this embodiment, the depth value corresponding to each pixel point may refer to the foregoing step 601 and the content of the explanation thereof, and may be determined by a depth image captured by a capturing device having a depth acquisition function at the same time in a normal case, or may be obtained in other manners.
In this embodiment, the abrupt change region generally refers to a region where the depth values of pixel points change suddenly; thus, the abrupt change region in the initial blurred image can be determined according to the depth value corresponding to each pixel point in the Y-channel image. Specifically, the second-order gradient of the depth value of each pixel point may be calculated, for example by using the Laplacian operator; of course, other second-order gradient operators may also be used. When the second-order gradient of the depth value of a pixel point is higher than a certain threshold, the pixel point belongs to the abrupt change region.
Further, considering that the above blurring radius parameter is also associated with the depth values in the image, the abrupt change region can likewise be determined by solving the second-order gradient of the blurring radius parameter of each pixel point, which is essentially the same as solving the second-order gradient of the depth values. It should be noted, however, that since the blurring radius parameter of every pixel point in the main region is 0, no abrupt change exists there; that is, the second-order gradient of the blurring radius parameter effectively determines only the abrupt change regions within the background region, and can therefore better locate them.
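The second-order-gradient test can be sketched with a discrete Laplacian in plain NumPy; the threshold here is a hypothetical value, and in practice an operator such as OpenCV's `Laplacian` would typically be used:

```python
import numpy as np

def abrupt_region(depth: np.ndarray, grad_threshold: float = 1.0) -> np.ndarray:
    # Discrete Laplacian (second-order gradient) of the depth map;
    # pixels where its magnitude exceeds the threshold form the
    # abrupt change region (borders are left unmarked in this sketch).
    d = depth.astype(np.float64)
    lap = np.zeros_like(d)
    lap[1:-1, 1:-1] = (d[:-2, 1:-1] + d[2:, 1:-1] +
                       d[1:-1, :-2] + d[1:-1, 2:] - 4.0 * d[1:-1, 1:-1])
    return np.abs(lap) > grad_threshold
```

Applying the same function to the blurring radius map instead of the raw depth map restricts the detected abrupt changes to the background region, as discussed above.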
803, smoothing the abrupt change region to obtain a smoothed blurred image.
In this embodiment, the smoothed blurred image is the target blurred image.
In this embodiment, a sudden change region in the image, that is, a region where the depth value changes suddenly, is identified by using the depth value corresponding to each pixel point, and the sudden change region in the blurred image is smoothed correspondingly, so that a smoothed blurred image can be obtained.
In order to better implement the image blurring method in the embodiment of the present application, based on the image blurring method, a schematic structural diagram of an image blurring device is further provided in the embodiment of the present application, as shown in fig. 9, the image blurring device includes:
a providing module 901, configured to provide a Y-channel image in a target YUV image;
a determining module 902, configured to determine a blurring weight coefficient of each pixel in the target YUV image according to a pixel value of each pixel in the Y-channel image;
and a blurring module 903, configured to perform blurring processing on the target YUV image according to the blurring weight coefficient, so as to obtain a target blurring image.
In some embodiments of the present application, the determining module includes:
the light source area segmentation submodule is used for determining a light source area in a target YUV image according to the pixel value of each pixel point in the Y-channel image and a preset light source segmentation threshold;
the first blurring weight coefficient setting submodule is used for setting blurring weight coefficients of all pixel points outside the light source area;
and the second blurring weight coefficient setting submodule is used for setting the blurring weight coefficient of each pixel point in the light source area.
In some embodiments of the present application, the light source region segmentation submodule includes:
the light source region segmentation unit is used for segmenting the Y-channel image according to the size relationship between the pixel value of each pixel point in the Y-channel image and a preset light source segmentation threshold value to obtain at least one candidate light source region in the Y-channel image; the pixel value of each pixel point in the candidate light source area is larger than a light source segmentation threshold;
the external rectangular area determining unit is used for determining the minimum external rectangular area corresponding to each candidate light source area;
the pixel number obtaining unit is used for respectively obtaining the pixel number of each candidate light source area and the corresponding minimum circumscribed rectangular area;
and the light source region segmentation unit is used for respectively calculating the ratio of the number of the pixels of each candidate light source region to the number of the pixels of the corresponding minimum circumscribed rectangular region to obtain the light source occupation ratio of each candidate light source region, and determining the candidate light source region with the light source occupation ratio higher than a preset occupation ratio threshold value as the light source region in the target YUV image.
In some embodiments of the present application, the second blurring weight coefficient setting sub-module includes:
the correction coefficient determining unit is used for calculating a blurring weight correction coefficient of each pixel point in the light source region according to the difference value between the pixel value of each pixel point in the light source region and the light source segmentation threshold value;
and the light source area internal coefficient setting unit is used for setting the blurring weight coefficient of each pixel point in the light source area according to the blurring weight correction coefficient of each pixel point in the light source area and the blurring weight coefficient of each pixel point outside the light source area.
In some embodiments of the present application, the determining module includes:
the depth value acquisition submodule is used for acquiring the depth value corresponding to each pixel point in the Y-channel image and the focus depth value of the Y-channel image;
the background area segmentation submodule is used for determining a background area in the target YUV image according to the depth value corresponding to each pixel point in the Y-channel image and the focus depth value;
and the background blurring weight coefficient setting submodule is used for determining the blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the background area.
In some embodiments of the present application, the blurring module includes:
the radius parameter acquisition submodule is used for acquiring a target blurring radius parameter corresponding to a target pixel point in the background area in the Y-channel image;
the neighborhood pixel point determining submodule is used for determining the neighborhood pixel points corresponding to the target pixel point according to the target blurring radius parameter; wherein the abscissa difference and the ordinate difference between each neighborhood pixel point and the target pixel point are smaller than the target blurring radius parameter;
the blurring pixel value calculation submodule is used for performing weighted summation on the pixel values of the neighborhood pixel points according to the blurring weight coefficient corresponding to each neighborhood pixel point to obtain the blurring pixel value corresponding to the target pixel point;
and the blurring pixel value adjusting submodule is used for adjusting the pixel value of each pixel point in the target YUV image to the blurring pixel value corresponding to the pixel point to obtain the target blurring image.
In some embodiments of the present application, the blurring module includes:
the initial blurring image determining submodule is used for performing blurring processing on the target YUV image according to the blurring weight coefficient to obtain an initial blurring image corresponding to the target YUV image;
the abrupt change region determining submodule is used for determining an abrupt change region in the initial blurring image according to the depth value corresponding to each pixel point in the Y-channel image;
and the smoothing submodule is used for smoothing the abrupt change region to obtain a smoothed blurring image, where the smoothed blurring image is the target blurring image.
An embodiment of the present invention further provides an electronic device, as shown in fig. 10, fig. 10 is a schematic structural diagram of an embodiment of the electronic device provided in the embodiment of the present application, where the electronic device includes a memory, a processor, and an image blurring program stored in the memory and capable of running on the processor, and when the processor executes the image blurring program, the steps in the image blurring method in any embodiment are implemented.
Specifically, the electronic device may include components such as a processor 1001 with one or more processing cores, a memory 1002 with one or more storage media, a power supply 1003, and an input unit 1004. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 10 does not constitute a limitation of the electronic device, and the device may include more or fewer components than those shown, combine some components, or arrange the components differently. Wherein:
the processor 1001 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 1002 and calling data stored in the memory 1002, thereby integrally monitoring the electronic device. Optionally, processor 1001 may include one or more processing cores; preferably, the processor 1001 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1001.
The memory 1002 may be used to store software programs and modules, and the processor 1001 executes various functional applications and data processing by running the software programs and modules stored in the memory 1002. The memory 1002 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the electronic device, and the like. Further, the memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 1002 may also include a memory controller to provide the processor 1001 with access to the memory 1002.
The electronic device further includes a power source 1003 for supplying power to each component, and preferably, the power source 1003 may be logically connected to the processor 1001 through a power management system, so that functions of managing charging, discharging, power consumption, and the like are implemented through the power management system. The power source 1003 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The electronic device may further include an input unit 1004, and the input unit 1004 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the electronic device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 1001 in the electronic device loads the executable file corresponding to the process of one or more application programs into the memory 1002 according to the following instructions, and the processor 1001 runs the application programs stored in the memory 1002, thereby implementing any of the steps in the image blurring method provided by the embodiment of the present invention.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a storage medium and loaded and executed by a processor.
To this end, an embodiment of the present invention provides a computer storage medium, which may include: read Only Memory (ROM), random Access Memory (RAM), magnetic or optical disks, and the like. The computer readable storage medium stores an image blurring program, and when executed by the processor, the image blurring program implements the steps of any of the image blurring methods provided by the embodiments of the present invention.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed descriptions of other embodiments, which are not described herein again.
In specific implementation, each unit or structure may be implemented as an independent entity, or may be combined arbitrarily to be implemented as the same entity or several entities, and specific implementation of each unit or structure may refer to the foregoing method embodiment, which is not described herein again.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments, and details are not described herein again.
The image blurring method provided by the embodiment of the present application has been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for those skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as a limitation to the present invention.

Claims (10)

1. An image blurring method, comprising:
providing a Y-channel image in a target YUV image;
determining a blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image;
and performing virtualization processing on the target YUV image according to the virtualization weight coefficient to obtain a target virtualized image.
2. The method of claim 1, wherein determining the blurring weight coefficient for each pixel in the target YUV image according to the pixel value of each pixel in the Y-channel image comprises:
determining a light source area in the target YUV image according to the pixel value of each pixel point in the Y-channel image and a preset light source segmentation threshold;
setting a blurring weight coefficient of each pixel point outside the light source area;
and setting the blurring weight coefficient of each pixel point in the light source area.
3. The method according to claim 2, wherein the determining a light source region in the target YUV image according to the pixel value of each pixel point in the Y-channel image and a preset light source segmentation threshold comprises:
segmenting the Y-channel image according to the magnitude relation between the pixel value of each pixel point in the Y-channel image and the preset light source segmentation threshold to obtain at least one candidate light source region in the Y-channel image; wherein the pixel value of each pixel point in a candidate light source region is greater than the light source segmentation threshold;
determining a minimum circumscribed rectangular area corresponding to each candidate light source area;
respectively obtaining the number of pixel points of each candidate light source region and the corresponding minimum circumscribed rectangular region;
and respectively calculating the ratio of the number of pixel points in each candidate light source region to the number of pixel points in the corresponding minimum circumscribed rectangular region to obtain the light source occupation ratio of each candidate light source region, and determining a candidate light source region whose light source occupation ratio is higher than a preset occupation ratio threshold as a light source region in the target YUV image.
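As a non-authoritative illustration of the segmentation in claim 3, the following sketch thresholds a Y-channel image, labels 4-connected candidate regions, and keeps only those regions whose pixel count divided by the pixel count of their minimum circumscribed rectangle exceeds a ratio threshold. The parameter values (`light_thresh`, `ratio_thresh`) are assumptions for illustration and are not specified by the patent.

```python
from collections import deque

import numpy as np


def light_source_regions(y_channel, light_thresh=220, ratio_thresh=0.6):
    """Return candidate light-source regions whose occupation ratio
    (region pixel count / bounding-rect pixel count) exceeds ratio_thresh."""
    mask = y_channel > light_thresh
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=np.int32)
    regions = []
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                # BFS flood fill of one 4-connected candidate region
                next_label += 1
                labels[sy, sx] = next_label
                queue = deque([(sy, sx)])
                pts = []
                while queue:
                    y, x = queue.popleft()
                    pts.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
                ys = [p[0] for p in pts]
                xs = [p[1] for p in pts]
                # minimum circumscribed (axis-aligned bounding) rectangle
                rect_area = (max(ys) - min(ys) + 1) * (max(xs) - min(xs) + 1)
                if len(pts) / rect_area > ratio_thresh:
                    regions.append(pts)
    return regions
```

A compact, nearly rectangular bright blob (such as a defocused light spot) passes the ratio test, while a sparse bright shape (such as a thin cross or line) fills little of its bounding rectangle and is rejected.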
4. The method of claim 2, wherein the setting the blurring weight coefficient of each pixel in the light source region comprises:
calculating a blurring weight correction coefficient of each pixel point in the light source region according to a difference value between the pixel value of each pixel point in the light source region and the light source segmentation threshold value;
and setting the blurring weight coefficient of each pixel point in the light source area according to the blurring weight correction coefficient of each pixel point in the light source area and the blurring weight coefficient of each pixel point outside the light source area.
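The weight correction of claim 4 could, for example, map the amount by which a pixel exceeds the light source segmentation threshold to a multiplicative boost of the base blurring weight, so that brighter light-source pixels contribute more strongly to the bokeh spot. The linear form and the `gain` value below are hypothetical; the patent only requires that the correction be derived from the difference between the pixel value and the threshold.

```python
import numpy as np


def light_source_weights(y_vals, light_thresh=220, base_weight=1.0, gain=0.05):
    """Blurring weights inside the light source region (illustrative).

    correction = 1 + gain * max(0, Y - threshold), applied to the base
    weight used for pixels outside the light source region.
    """
    excess = np.clip(y_vals.astype(np.float32) - light_thresh, 0.0, None)
    return base_weight * (1.0 + gain * excess)
```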
5. The method of claim 1, wherein determining the blurring weight coefficient of each pixel in the target YUV image according to the pixel value of each pixel in the Y-channel image comprises:
acquiring depth values corresponding to all pixel points in the Y-channel image and a focus depth value of the Y-channel image;
determining a background area in the target YUV image according to the depth value corresponding to each pixel point in the Y-channel image and the focus depth value;
and determining the blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the background region.
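A minimal sketch of the depth-based background selection in claim 5, assuming the simple criterion that a pixel belongs to the background when its depth differs from the focus depth by more than a tolerance; `depth_tol` is a hypothetical parameter, and the patent does not fix the comparison rule.

```python
import numpy as np


def background_mask(depth, focus_depth, depth_tol=5.0):
    """Boolean mask of background pixels: depth far from the focus plane."""
    return np.abs(depth.astype(np.float64) - focus_depth) > depth_tol
```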
6. The method according to any one of claims 1 to 5, wherein the blurring the target YUV image according to the blurring weight coefficient to obtain a target blurring image comprises:
acquiring a target blurring radius parameter corresponding to a target pixel point in a background area in the Y-channel image;
determining neighborhood pixel points corresponding to the target pixel point according to the target blurring radius parameter; wherein the difference between the abscissa or ordinate of a neighborhood pixel point and that of the target pixel point is smaller than the target blurring radius parameter;
according to the blurring weight coefficient corresponding to the neighborhood pixel point, carrying out weighted summation on the pixel values of the neighborhood pixel points to obtain a blurring pixel value corresponding to the target pixel point;
and adjusting the pixel value of each pixel point in the target YUV image to the blurring pixel value corresponding to that pixel point, to obtain a target blurring image.
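The weighted-summation step of claim 6 can be sketched as follows: the neighborhood is interpreted here as a square window whose row and column offsets are both smaller in magnitude than the blurring radius parameter, and the blurring pixel value is the weight-normalized sum of the neighborhood pixel values. The normalization by the weight sum is an assumption for illustration.

```python
import numpy as np


def blur_pixel(y_img, weights, ty, tx, radius):
    """Blurring pixel value at (ty, tx): weighted average over the
    neighborhood where |dy| < radius and |dx| < radius, clipped to the image."""
    h, w = y_img.shape
    y0, y1 = max(0, ty - radius + 1), min(h, ty + radius)
    x0, x1 = max(0, tx - radius + 1), min(w, tx + radius)
    patch = y_img[y0:y1, x0:x1].astype(np.float64)
    wpatch = weights[y0:y1, x0:x1].astype(np.float64)
    return float((patch * wpatch).sum() / wpatch.sum())
```

With uniform weights this degenerates to a plain box average; non-uniform weights (e.g. boosted light-source weights) make bright spots dominate the result, producing the bokeh effect.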
7. The method according to any one of claims 1 to 5, wherein the blurring the target YUV image according to the blurring weight coefficient to obtain a target blurring image comprises:
performing blurring processing on the target YUV image according to the blurring weight coefficient to obtain an initial blurring image corresponding to the target YUV image;
determining an abrupt-change region in the initial blurring image according to the depth value corresponding to each pixel point in the Y-channel image;
and smoothing the abrupt-change region to obtain a smoothed blurring image, wherein the smoothed blurring image is the target blurring image.
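Claim 7's smoothing step can be sketched by flagging pixels with a large depth gradient as the abrupt-change region and box-filtering the blurred image only at those pixels. The gradient criterion, the threshold `grad_thresh`, and the kernel size `k` are all assumptions for illustration; the patent does not specify how the abrupt-change region is detected or smoothed.

```python
import numpy as np


def smooth_abrupt_regions(blurred, depth, grad_thresh=10.0, k=3):
    """Box-filter the blurred image only where the depth gradient is large."""
    gy, gx = np.gradient(depth.astype(np.float64))
    abrupt = np.hypot(gy, gx) > grad_thresh  # hypothetical detection rule
    out = blurred.astype(np.float64).copy()
    pad = k // 2
    padded = np.pad(out, pad, mode="edge")
    for y, x in zip(*np.nonzero(abrupt)):
        # replace each flagged pixel with its k x k neighborhood mean
        out[y, x] = padded[y:y + k, x:x + k].mean()
    return out
```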
8. An image blurring apparatus, comprising:
a providing module, configured to provide a Y-channel image in a target YUV image;
The determining module is used for determining a blurring weight coefficient of each pixel point in the target YUV image according to the pixel value of each pixel point in the Y-channel image;
and the blurring module is used for blurring the target YUV image according to the blurring weight coefficient to obtain a target blurring image.
9. An electronic device comprising a processor, a memory, and an image blurring program stored in the memory and executable on the processor, wherein the processor executes the image blurring program to implement the steps in the image blurring method according to any one of claims 1 to 7.
10. A computer-readable storage medium having an image blurring program stored thereon, the image blurring program being executed by a processor to implement the steps in the image blurring method according to any one of claims 1 to 7.
CN202110902442.5A 2021-08-06 2021-08-06 Image blurring method and device, electronic equipment and computer-readable storage medium Pending CN115705620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110902442.5A CN115705620A (en) 2021-08-06 2021-08-06 Image blurring method and device, electronic equipment and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110902442.5A CN115705620A (en) 2021-08-06 2021-08-06 Image blurring method and device, electronic equipment and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN115705620A true CN115705620A (en) 2023-02-17

Family

ID=85179082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110902442.5A Pending CN115705620A (en) 2021-08-06 2021-08-06 Image blurring method and device, electronic equipment and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN115705620A (en)

Similar Documents

Publication Publication Date Title
CN108550101B (en) Image processing method, device and storage medium
CN109272459B (en) Image processing method, image processing device, storage medium and electronic equipment
US11410277B2 (en) Method and device for blurring image background, storage medium and electronic apparatus
US10165248B2 (en) Optimization method of image depth information and image processing apparatus
US8358812B2 (en) Image Preprocessing
US20160171706A1 (en) Image segmentation using color & depth information
CN107103619B (en) Method, device and system for processing hair texture direction
CN105243371A (en) Human face beauty degree detection method and system and shooting terminal
CN112541868B (en) Image processing method, device, computer equipment and storage medium
CN111368587B (en) Scene detection method, device, terminal equipment and computer readable storage medium
CN113132695B (en) Lens shading correction method and device and electronic equipment
CN111161136B (en) Image blurring method, image blurring device, equipment and storage device
CN112308797A (en) Corner detection method and device, electronic equipment and readable storage medium
CN115082350A (en) Stroboscopic image processing method and device, electronic device and readable storage medium
CN111563517A (en) Image processing method, image processing device, electronic equipment and storage medium
JP2013182330A (en) Image processor and image processing method
CN109672829B (en) Image brightness adjusting method and device, storage medium and terminal
CN110689565B (en) Depth map determination method and device and electronic equipment
CN115705620A (en) Image blurring method and device, electronic equipment and computer-readable storage medium
CN114596210A (en) Noise estimation method, device, terminal equipment and computer readable storage medium
CN112488933A (en) Video detail enhancement method and device, mobile terminal and storage medium
CN112085002A (en) Portrait segmentation method, portrait segmentation device, storage medium and electronic equipment
CN115861084A (en) Image blurring method and device, electronic equipment and computer-readable storage medium
CN112040121B (en) Focusing method and device, storage medium and terminal
US20230290019A1 (en) Perspective method for physical whiteboard and generation method for virtual whiteboard

Legal Events

Date Code Title Description
PB01 Publication