CN114626992A - Image blurring method and terminal equipment - Google Patents

Image blurring method and terminal equipment

Info

Publication number
CN114626992A
CN114626992A CN202011432957.5A
Authority
CN
China
Prior art keywords
image
filtering
processed
blurring
kernel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011432957.5A
Other languages
Chinese (zh)
Inventor
李鹏 (Li Peng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan TCL Group Industrial Research Institute Co Ltd
Original Assignee
Wuhan TCL Group Industrial Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan TCL Group Industrial Research Institute Co Ltd filed Critical Wuhan TCL Group Industrial Research Institute Co Ltd
Priority to CN202011432957.5A priority Critical patent/CN114626992A/en
Publication of CN114626992A publication Critical patent/CN114626992A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G06T7/41 Analysis of texture based on statistical description of texture
    • G06T7/44 Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application belongs to the technical field of image processing, and provides an image blurring method and a terminal device. The method comprises the following steps: acquiring an image to be processed and a disparity map of the image to be processed, the image to be processed being fused from at least two images; determining the filter kernel radius corresponding to each pixel point in the image to be processed according to the disparity map; acquiring an image blurring instruction, and generating an image filter kernel according to the filter kernel radius corresponding to each pixel point and the light spot shape setting information contained in the image blurring instruction; and filtering the image to be processed according to the image filter kernel to obtain a target blurred image. The embodiment of the application solves the problems that light spots cannot be user-defined and that image blurring consumes a long time.

Description

Image blurring method and terminal equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image blurring method and a terminal device.
Background
In the photographing function of dual-camera mobile phones, the background blurring function, which simulates the large-aperture imaging of a single-lens reflex camera, is increasingly sought after, and the light spot effect is one of the important highlights of this function. In the prior art, there are mainly two ways of realizing background blurring with editable light spots for a captured image: one way is to accelerate the blurring by using a separable blurring kernel that carries the light spot effect; the other way is to design blurring kernels of various shapes to directly perform the blurring effect with light spots.
Both editable light spot blurring modes have certain defects. The separable blurring kernel can only produce light spots of specific shapes, which limits the diversification requirements of users; blurring with a designed blurring kernel is a two-dimensional operation that is time consuming, while sampling-based blurring used to reduce the time produces noise artifacts.
Disclosure of Invention
In view of this, embodiments of the present invention provide an image blurring method and a terminal device to solve the problems that a light spot cannot be customized and image blurring takes a long time.
A first aspect of an embodiment of the present invention provides an image blurring method, including:
acquiring an image to be processed and a disparity map of the image to be processed; the image to be processed is fused by at least two images;
determining the radius of a filtering kernel corresponding to each pixel point in the image to be processed according to the disparity map;
acquiring an image blurring instruction, and generating an image filtering kernel according to the radius of the filtering kernel corresponding to each pixel point and light spot shape setting information contained in the image blurring instruction;
and filtering the image to be processed according to the image filter kernel to obtain a target blurred image.
In an implementation example, the filtering of the image to be processed according to the image filter kernel to obtain a target blurred image includes:
decomposing the image filtering kernel into a plurality of sub-filtering kernels;
and respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result after the sub-filtering kernels are superposed, and obtaining a target blurred image based on all the filtering results.
In one implementation example, before the decomposing the image filtering kernel into several sub-filtering kernels, the method further includes:
setting the light spot brightness weight of the image filtering kernel according to the brightness value of each pixel point in the image to be processed, and obtaining a weight coefficient corresponding to each pixel point in the image to be processed according to the set light spot brightness weight; the weight coefficient is used for adjusting the brightness of each pixel point in the image to be processed.
In an implementation example, the respectively inputting the to-be-processed image into each of the sub-filtering kernels to obtain a filtering result obtained by superimposing the sub-filtering kernels, and obtaining the target blurred image based on all the filtering results includes:
respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result after the sub-filtering kernels are superposed, and obtaining a filtering image based on all the filtering results;
and adjusting the light spot brightness of the filtered image according to the weight coefficient to obtain a target blurred image.
In one embodiment, the target blurred image is a blurred image whose light spots have a shape corresponding to the light spot shape setting information.
In an implementation example, the acquiring the to-be-processed image and the disparity map of the to-be-processed image includes:
and calculating the image according to a depth estimation algorithm to obtain a disparity map of the image to be processed.
In an implementation example, the determining, according to the disparity map, a filtering kernel radius corresponding to each pixel point in the image to be processed includes:
taking the focus position of the disparity map as the center of a preset value frame, and determining the disparity median value located within the preset value frame in the disparity map as the focus disparity value;
and determining the filter kernel radius corresponding to each pixel point in the image to be processed according to the maximum disparity value in the disparity map, a preset blurring radius, the focus disparity value, and the disparity value corresponding to each pixel point in the image to be processed.
In one example, the decomposing the image filtering kernel into a number of sub-filtering kernels includes:
singular value decomposition is carried out on the image filtering kernel to obtain a plurality of singular values;
selecting N singular values meeting preset conditions as target singular values, and constructing N sub-filter kernels according to the eigenvectors corresponding to the target singular values; each of the feature vectors includes two vectors in horizontal and vertical directions; the N sub-filter kernels have horizontal and vertical directions; and N is a positive integer greater than 0.
In an implementation example, the preset condition is that the singular values are arranged in a queue in descending numerical order and the first N singular values in the queue are selected.
In one implementation example, each image fusing the images to be processed is captured by a camera of the peripheral device.
A second aspect of an embodiment of the present invention provides a device for blurring an image with a spot, including:
the disparity map acquisition module is used for acquiring an image to be processed and a disparity map of the image to be processed; the image to be processed is fused by at least two images;
the filtering kernel radius determining module is used for determining the filtering kernel radius corresponding to each pixel point in the image to be processed according to the disparity map;
the image filtering kernel generating module is used for acquiring an image blurring instruction and generating an image filtering kernel according to the filtering kernel radius corresponding to each pixel point and light spot shape setting information contained in the image blurring instruction;
and the target blurred image generation module is used for filtering the image to be processed according to the image filter kernel to obtain a target blurred image.
A third aspect of embodiments of the present invention provides a computer-readable storage medium, which stores a computer program that, when executed by a processor, implements the steps of the method of the first aspect.
A fourth aspect of an embodiment of the present invention provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the image blurring method of the first aspect when executing the computer program.
The embodiment of the invention provides an image blurring method and a terminal device: an image to be processed, fused from at least two images, and a disparity map of the image to be processed are obtained; the filter kernel radius corresponding to each pixel point in the image to be processed is determined according to the disparity map, with the blurred and non-blurred regions of the image to be processed distinguished according to the focal position in the disparity map; and an image blurring instruction is acquired, and an image filter kernel is generated according to the light spot shape setting information contained in the image blurring instruction and the filter kernel radius corresponding to each pixel point. Because the light spot shape setting information in the image blurring instruction can be customized by the user, and the image filter kernel is generated according to this information together with the per-pixel filter kernel radius, filtering the image to be processed produces the user-defined light spot shape, realizing an editable light spot shape function. Finally, the image to be processed is filtered according to the image filter kernel to obtain a target blurred image, i.e. a target blurred image with the user-defined light spot shape.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of an image blurring method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a peripheral system according to an embodiment of the present invention;
FIG. 3 is a sample diagram of a spot shape provided by an embodiment of the invention;
FIG. 4 is a schematic diagram of filtering an image to be processed by a plurality of sub-filtering kernels according to an embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating an effect of a blurred image with light spots according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an image light spot blurring apparatus according to a second embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be clearly described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this invention and the above-described drawings are intended to cover non-exclusive inclusions. For example, a process, method, or system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Furthermore, the terms "first," "second," and "third," etc. are used to distinguish between different objects and are not used to describe a particular order.
Example one
Fig. 1 is a schematic flow chart of an image blurring method according to an embodiment of the present invention. The embodiment can be applied to application scenarios of blurring an image to be processed and editing the shape of the light spots in the image, wherein the image to be processed is obtained by fusing images shot by at least two cameras. The method can be executed by an image blurring device, which may be a control device or a terminal such as a tablet, a PC or a server; in the embodiment of the present application, a device for light spot blurring of an image is taken as the execution subject. The method specifically includes the following steps:
because the prior art is adopted to carry out blurring processing on the image and only the facula with the preset shape can be made by the separable blurring kernel when the shape of the facula in the image is edited, the requirement of a user on the change of the shape of the facula is limited; the blurring kernel is a two-dimensional blurring operation, which takes a long time to process an image, and if a blurring process based on sampling is applied to an image to shorten the time, noise artifacts are generated. In order to solve the technical problem, in the embodiment of the application, an image filtering kernel is generated according to the spot shape setting information contained in the image blurring instruction and the radius of the filtering kernel corresponding to each pixel point, and a spot shape defined by a user is generated by filtering the image to be processed, so that the function of spot shape editable is realized; and decomposing the image filtering kernel into a plurality of sub-filtering kernels in the horizontal and vertical directions, and then respectively inputting the image to be processed into each sub-filtering kernel, so that the plurality of sub-filtering kernels obtained after decomposition can simultaneously filter the image to be processed in the horizontal and vertical directions, and the processing time of light spot editing and image blurring is shortened.
S110, acquiring an image to be processed and a disparity map of the image to be processed; the image to be processed is fused by at least two images.
Because an existing peripheral system (such as a mobile terminal) is provided with two or more cameras, the display image shot by the peripheral system is obtained by fusing the images shot by these cameras. In order to realize the background blurring function simulating large-aperture imaging of a single-lens reflex camera, the displayed image can be blurred with light spots. When the light spot blurring processing of the image is executed, an image to be processed is acquired, the image to be processed being fused from at least two images.
In one implementation, each image may be captured by a camera of the peripheral device. And acquiring an image to be processed, wherein the image to be processed is a display image formed by fusing at least two images, namely at least two images obtained by shooting through at least two cameras of the peripheral system are fused. Because the image to be processed is obtained by fusing at least two images shot by at least two cameras, a parallax image of the image to be processed is obtained according to the at least two images shot by the at least two cameras, so as to perform blurring processing of the light spots on the image.
Further, when the peripheral system has three or more than three cameras, three or more than three images shot by the multiple cameras in the peripheral system are fused to generate a display image, and the image to be processed is obtained.
Specifically, a structural framework diagram of the peripheral system is shown in Fig. 2. Image fusion can be divided into two types. In the first type, depth-of-field information is obtained from the distance between an object and the lens, or from the focal length, in the image captured by each camera, and subsequent steps such as 3D reconstruction, image segmentation or background blurring can then be carried out to complete the image fusion. In the second type, different images formed by two or more different cameras are fused to obtain more object detail information. The different cameras in the peripheral system may be RGB (red, green and blue) lenses, Mono (black-and-white) cameras, and so on; a black-and-white camera can capture more details, i.e. it offers a higher effective resolution. A wide-angle camera and a telephoto camera can also be adopted, and an HDR effect can be realized by fusing the images obtained from the wide-angle and telephoto cameras, or from cameras with different exposure parameters. For the first type, in order to obtain depth information by triangulation, the physical distance between the cameras in the peripheral system should be as large as possible; for the second type, in order to fuse the images, the two or more images need to be aligned, so the fields of view of the cameras in the peripheral system should overlap as much as possible.
In one implementation example, the process of acquiring the disparity map of the image to be processed is as follows: and calculating the sub-images according to a depth estimation algorithm to obtain a disparity map of the image to be processed.
Specifically, the sub-images obtained by shooting by the cameras fusing the image to be processed can be calculated through a depth estimation algorithm, so that a disparity map with a disparity value of each pixel point in the image to be processed is obtained. For example, if the image to be processed is obtained by shooting by an external system having a main camera and a sub-camera, the image shot by the main camera and the image shot by the sub-camera in the external system are calculated according to a depth estimation algorithm to obtain a disparity map of the image to be processed.
Specifically, when the peripheral system is a dual-camera system, pixels of the same scene imaged under the two cameras have a position deviation. Since the two binocular cameras are usually placed horizontally, this positional deviation is generally reflected in the horizontal direction. For example, if a point X in the scene images at the x coordinate in the left camera, then its image in the right camera is at the (x + d) coordinate; d is the value at the x coordinate point in the disparity map of the image to be processed. A depth map refers to the distance of each point in the scene from the camera. For binocular imaging, the disparity map and the depth map are equivalent to a certain extent, and the depth map can be converted into the disparity map according to the relevant parameters of the two cameras in the peripheral system, thereby obtaining the disparity map of the image to be processed.
Further, when the peripheral system is a three-camera or multi-camera system, pixels of the same scene imaged under each camera have a position deviation. Since the plurality of cameras are generally horizontally disposed, the positional deviation is generally reflected in the horizontal direction. For example, if a point in the scene images at the x coordinate in the first camera, then its image in the second camera is at the (x + d_1) coordinate, and so on; its image in the n-th camera is at the (x + d_{n-1}) coordinate, where n is a positive integer greater than or equal to three. The value at the x coordinate point in the disparity map of the image to be processed comprises d_1 to d_{n-1}. The depth map refers to the distance of each point in the scene from the camera. For multi-view imaging, the disparity map and the depth map are equivalent to some extent, and the depth map can be converted into the disparity map according to the relevant parameters of the plurality of cameras in the peripheral system, thereby obtaining the disparity map of the image to be processed.
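The depth-disparity equivalence described above can be made concrete: for horizontally placed cameras, stereo geometry gives d = f·B/Z, where f is the focal length in pixels, B the baseline between the cameras, and Z the depth. A minimal sketch (the parameter values below are illustrative assumptions, not taken from the patent):

```python
def depth_to_disparity(depth_m, focal_px, baseline_m):
    """Convert depth (meters) to horizontal disparity (pixels): d = f * B / Z."""
    return focal_px * baseline_m / depth_m

def disparity_to_depth(disp_px, focal_px, baseline_m):
    """Inverse conversion: Z = f * B / d."""
    return focal_px * baseline_m / disp_px

# Illustrative dual-camera parameters (assumed):
f = 1000.0   # focal length in pixels
B = 0.02     # 2 cm baseline
d = depth_to_disparity(2.0, f, B)   # an object 2 m away
print(d)                            # 10.0 pixels of horizontal shift
```

Note the inverse relationship: nearer objects produce larger disparities, which is why the disparity map can stand in for the depth map when sizing the blur.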
And S120, determining the radius of a filtering kernel corresponding to each pixel point in the image to be processed according to the disparity map.
In the image blurring process, the image is usually processed by using a filter kernel. After the image to be processed and the disparity map of the image to be processed are obtained, a blurring region and a non-blurring region of the image to be processed can be distinguished according to the focal position in the disparity map, and the filtering kernel radius corresponding to each pixel in the image to be processed is determined according to the disparity map. Optionally, the focal position in the disparity map may be a user-defined focal position.
In an implementation example, the specific process of determining the filtering kernel radius corresponding to each pixel point in the image to be processed according to the disparity map includes steps 11 to 12:
step 11, taking the focus position of the disparity map as the center of a preset value frame, and determining a disparity median value in the preset value frame in the disparity map as a focus disparity value;
after obtaining a disparity map of an image to be processed, determining the focus position of the disparity map; and taking the focus position of the disparity map as the central point of a preset value-taking frame, and placing the preset value-taking frame on the disparity map. And obtaining a plurality of parallax values positioned in a preset value frame in the parallax map, and taking the median of the obtained plurality of parallax values as a focus parallax value. Optionally, the size range of the preset value frame is min (w, h)/c; wherein w represents the width of a preset value frame and is smaller than the width of the disparity map; h represents the height of a preset value frame, which is smaller than the height of the disparity map; c is defined to be any integer ranging from 16 to 24.
And step 12, determining the radius of a filter kernel corresponding to each pixel point in the image to be processed according to the maximum parallax value, a preset blurring radius, the focus parallax value and the parallax value corresponding to each pixel point in the image to be processed in the parallax map.
After the focus parallax value is obtained according to the obtained parallax image, the parallax value corresponding to each pixel point in the image to be processed and the maximum parallax value in the parallax image can be obtained by searching the parallax image. And calculating the maximum parallax value, the preset blurring radius, the focus parallax value and the parallax value corresponding to each pixel point in the image to be processed according to a calculation formula to obtain the radius of a filter kernel corresponding to each pixel point in the image to be processed. Optionally, the preset virtualization radius may be customized according to a user requirement. Specifically, the calculation formula is as follows:
r_{i,j} = C · |d_{i,j} − d_focus| / d_max
where C is the preset blurring radius; r_{i,j} is the filter kernel radius corresponding to the pixel point at coordinate position (i, j) in the image to be processed; d_{i,j} is the disparity value of the pixel point at coordinate position (i, j) in the image to be processed, i.e. the value at coordinate position (i, j) in the disparity map; d_focus is the focus disparity value; and d_max is the maximum disparity value in the disparity map. The filter kernel radius corresponding to each pixel point in the image to be processed is obtained by calculation according to the above formula.
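Assuming the radius formula takes the form r_{i,j} = C · |d_{i,j} − d_focus| / d_max (the exact arrangement of terms is inferred from the variable list, since the equation is rendered as an image in the source), the radius map can be computed in one vectorized step:

```python
import numpy as np

def kernel_radii(disp, d_focus, C=15):
    """Per-pixel filter kernel radius from the disparity map.

    Assumed form: r[i, j] = C * |d[i, j] - d_focus| / d_max, where C is the
    preset blurring radius and d_max the maximum disparity in the map.
    """
    d_max = disp.max()
    return np.rint(C * np.abs(disp - d_focus) / d_max).astype(int)

disp = np.array([[0.0, 10.0, 20.0],
                 [0.0, 10.0, 20.0]])
r = kernel_radii(disp, d_focus=10.0)
print(r)   # the in-focus column (disparity == d_focus) gets radius 0
```

Pixels whose disparity equals the focus disparity get radius 0 (no blur), and the radius grows toward the preset blurring radius C as a pixel's disparity moves away from the focus disparity.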
S130, acquiring an image blurring instruction, and generating an image filtering kernel according to the radius of the filtering kernel corresponding to each pixel point and light spot shape setting information contained in the image blurring instruction.
The method comprises the steps that a spot blurring device of an image obtains an image blurring instruction, wherein the image blurring instruction comprises spot shape setting information set by a user in a self-defined mode; the spot shape setting information describes a user-defined spot shape. And generating an image filtering kernel by combining the light spot shape setting information contained in the image blurring instruction and the radius of the filtering kernel corresponding to each pixel point, so that the generated filtering kernel can filter the image to be processed to generate a user-defined light spot shape, and the function of editable light spot shape is realized. As shown in fig. 3, which is a sample diagram of the spot shape, the spot shape customized by the user may be a heart shape, a hexagon shape, a circle shape, a pentagram shape, or the like.
For example, if the light spot shape described by the light spot shape setting information contained in the image blurring instruction is a heart, the calculation formula for generating the image filter kernel according to the light spot shape setting information and the filter kernel radius corresponding to each pixel point is: K(m, n) = 1 if (x² + y² − 0.5)³ − x²·y³ < 0, subject to −r ≤ m, n ≤ r, x = m/r, y = n/r. Here K(m, n) denotes the element in the m-th row and n-th column of the filter kernel matrix, r denotes the filter kernel radius, and (x, y) denotes the coordinate position of any pixel point in the image to be processed, r corresponding to the filter kernel radius of the pixel point located at the (x, y) position in the image to be processed. If the light spot shape described by the light spot shape setting information contained in the image blurring instruction is a circle, the calculation formula for generating the image filter kernel is: K(m, n) = 1 if x² + y² ≤ r². When the light spot shape described by the light spot shape setting information is another shape, the calculation formula of the image filter kernel is changed correspondingly according to the function expression of that light spot shape. Optionally, the generated image filter kernel may be defined as a square, the size of the image filter kernel being (2·r + 1, 2·r + 1).
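A sketch of the spot-shaped kernel construction, assuming the heart condition (x² + y² − 0.5)³ − x²·y³ < 0 with x = m/r, y = n/r, and interpreting the circle condition in pixel units as m² + n² ≤ r²:

```python
import numpy as np

def heart_kernel(r):
    """Binary (2r+1) x (2r+1) kernel: 1 where (x^2+y^2-0.5)^3 - x^2*y^3 < 0,
    with x = m/r, y = n/r, as in the heart-shape formula above."""
    m = np.arange(-r, r + 1)
    x, y = np.meshgrid(m / r, m / r)
    return ((x**2 + y**2 - 0.5)**3 - x**2 * y**3 < 0).astype(float)

def circle_kernel(r):
    """Binary kernel for a circular spot: 1 where m^2 + n^2 <= r^2."""
    m = np.arange(-r, r + 1)
    mm, nn = np.meshgrid(m, m)
    return (mm**2 + nn**2 <= r**2).astype(float)

k = heart_kernel(8)
print(k.shape)          # (17, 17), i.e. (2r+1, 2r+1)
```

Normalizing the kernel so its elements sum to 1 (not shown) keeps the overall image brightness unchanged after filtering; the binary mask alone determines the spot shape.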
And S140, filtering the image to be processed according to the image filter kernel to obtain a target blurred image.
The light spot shape setting information in the image blurring instruction can be defined by a user, an image filtering kernel is generated according to the light spot shape setting information and the radius of the filtering kernel corresponding to each pixel point, and then the image to be processed is filtered according to the image filtering kernel to obtain a target blurring image, so that the target blurring image with the user-defined light spot shape is obtained.
In one embodiment, to increase the processing speed of image blurring while ensuring the light spot effect, the image filter kernel may be decomposed into several sub-filter kernels. The specific process of filtering the image to be processed according to the image filter kernel to obtain the target blurred image includes the following steps 21 to 22:
and step 21, decomposing the image filtering kernel into a plurality of sub-filtering kernels.
After the image filtering kernel is generated, in order to shorten the processing time of blurring the image by the image filtering kernel, the image filtering kernel can be decomposed into a plurality of sub-filtering kernels, so that the image is simultaneously subjected to filtering processing by the plurality of sub-filtering kernels obtained by decomposition, and the operation time is shortened.
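The decomposition and superposition can be sketched with NumPy alone: SVD splits the 2-D kernel into rank-1 sub-kernels, each of which is applied as a vertical 1-D filter followed by a horizontal 1-D filter, and the filtered results are summed (a sketch of the idea, not the patent's exact implementation):

```python
import numpy as np

def svd_sub_kernels(K, n):
    """Split kernel K into n rank-1 sub-kernels, each a (vertical, horizontal)
    pair of 1-D filters built from the n largest singular values."""
    U, s, Vt = np.linalg.svd(K)
    return [(np.sqrt(s[i]) * U[:, i], np.sqrt(s[i]) * Vt[i, :]) for i in range(n)]

def filter_separable(img, subs):
    """Apply each sub-kernel as two 1-D convolutions and superpose the results."""
    out = np.zeros_like(img, dtype=float)
    for v, hker in subs:
        tmp = np.apply_along_axis(lambda col: np.convolve(col, v, mode='same'), 0, img)
        out += np.apply_along_axis(lambda row: np.convolve(row, hker, mode='same'), 1, tmp)
    return out

K = np.outer([1.0, 2.0, 1.0], [1.0, 2.0, 1.0])      # a separable 3x3 kernel
v, hker = svd_sub_kernels(K, 1)[0]
assert np.allclose(np.outer(v, hker), K)            # rank-1 term reconstructs K
```

With all singular values kept, the superposed rank-1 results reproduce 2-D filtering exactly; truncating to the N largest singular values, as described in the following embodiments, trades a little accuracy for a large reduction in operation time, since two length-(2r+1) convolutions replace one (2r+1)² convolution per sub-kernel.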
In one embodiment, to ensure that light spots appear in the highlighted portions after image blurring, light spot brightness weight amplification is performed on the image filter kernel based on the brightness of the image to be processed. Before decomposing the image filter kernel into several sub-filter kernels, the method further includes:
setting the light spot brightness weight of the image filtering kernel according to the brightness value of each pixel point in the image to be processed, and obtaining a weight coefficient corresponding to each pixel point in the image to be processed according to the set light spot brightness weight; the weight coefficient is used for adjusting the brightness of each pixel point in the image to be processed.
Specifically, the brightness value corresponding to each pixel point in the image to be processed is obtained, and light spot brightness weight amplification is performed on the image filtering kernel based on the brightness of the image to be processed, so as to obtain the weight coefficient corresponding to each pixel point in the image to be processed. The weight is computed from the mean channel brightness of each pixel [the full weight formula appears only as an image in the source and is not reproduced here]:

mean(I(p, q)) = (I_r(p, q) + I_g(p, q) + I_b(p, q)) / 3

where W(p, q) is the weight coefficient value corresponding to the pixel point located at the (p, q) position in the image to be processed, and α is the spot brightness gain, generally set so that 1 < α < 3. Alternatively, α may be set to 1.7 and σ may be set to 196.
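Since the exact weight formula survives only as an un-rendered image, the sketch below assumes one plausible form: a Gaussian ramp in mean brightness that boosts near-white pixels by up to a factor of α. The function name and the assumed formula are illustrative, not the patent's.

```python
import numpy as np

def spot_weight(img_rgb, alpha=1.7, sigma=196.0):
    """Per-pixel spot-brightness weight (assumed form, not the patent's exact
    formula): dark pixels get a weight near 1, and pixels whose mean channel
    brightness approaches 255 get a weight approaching alpha."""
    mean = img_rgb.astype(np.float64).mean(axis=2)  # (I_r + I_g + I_b) / 3
    return 1.0 + (alpha - 1.0) * np.exp(-((255.0 - mean) ** 2) / sigma**2)
```

Whatever its exact shape, the weight only needs to be monotone in brightness so that highlight regions are amplified into visible spots after filtering.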
In one embodiment, the image filter kernel may be decomposed into a plurality of sub-filter kernels by a singular value decomposition method, and the specific process includes steps 31 to 32:
step 31, performing singular value decomposition on the image filtering kernel to obtain a plurality of singular values;
Singular Value Decomposition (SVD) is performed on the generated image filtering kernel, i.e. on the filtering kernel matrix K, to obtain a plurality of singular values. Specifically, the singular value decomposition of the filtering kernel matrix K gives

K = Σ_{i=1}^{R} σ_i · u_i · v_i^T

where σ_i is a scalar (the i-th singular value), u_i is the i-th column vector of the unit orthogonal matrix U, v_i^T is the i-th row vector of the unit orthogonal matrix V^T, and R denotes the rank of the filtering kernel matrix K.
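This decomposition can be checked numerically; the snippet below (an illustrative sketch, not patent code) takes a small circular kernel apart with NumPy's SVD and rebuilds it from its rank-1 terms.

```python
import numpy as np

# Build a small circular spot kernel (r = 4, so a 9 x 9 matrix).
r = 4
m, n = np.meshgrid(np.arange(-r, r + 1), np.arange(-r, r + 1), indexing="ij")
K = (m**2 + n**2 <= r**2).astype(np.float64)

# K = sum_i S[i] * outer(U[:, i], Vt[i, :]) -- the decomposition in the text.
U, S, Vt = np.linalg.svd(K)
rank = int(np.sum(S > 1e-10))
K_rebuilt = sum(S[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(S)))
```

Only `rank` of the terms are non-negligible, which is what makes the truncation in step 32 cheap.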
Step 32, selecting N singular values meeting preset conditions as target singular values, and constructing N sub-filter kernels according to eigenvectors corresponding to the target singular values; each of the feature vectors includes two vectors in horizontal and vertical directions; the N sub-filter kernels have horizontal and vertical directions; and N is a positive integer greater than 0.
Specifically, without singular value decomposition of the image filtering kernel, the light spot blurring processing of the image to be processed I according to the weight coefficient W and the generated image filtering kernel K would directly yield the light spot blurred image Î [the direct calculation formula appears only as an image in the source and is not reproduced here]. Because

K = Σ_{i=1}^{R} σ_i · u_i · v_i^T,

the filtering of the image to be processed I with K can be decomposed into R pairs of one-dimensional filtering operations in the horizontal and vertical directions. Accordingly, N singular values satisfying the preset conditions are selected as target singular values, and N sub-filter kernels are constructed according to the eigenvectors corresponding to the target singular values, each eigenvector including the two vectors u_i and v_i in the horizontal and vertical directions. As shown in fig. 4, when the value of N is 3, the 3 singular values satisfying the preset condition are selected as target singular values, and 3 sub-filter kernels are constructed from the horizontal-direction vectors u_1', u_2' and u_3' and the vertical-direction vectors v_1', v_2' and v_3' corresponding to each target singular value; the image to be processed I is respectively input into the 3 sub-filter kernels for filtering in the horizontal and vertical directions, and the filtering results output by the 3 sub-filter kernels are superposed to obtain the blurred image with light spots Î.
In an implementation example, the preset condition is that the singular values are arranged into a queue in descending order of magnitude, and the first N singular values in the queue are selected; that is, the N largest of the obtained singular values are chosen.
Specifically, the calculation formula for determining the number N of the selected singular values is

ε = ||K − K̂_N||_F / ||K||_F = sqrt( Σ_{i=N+1}^{R} σ_i² ) / ||K||_F

where K̂_N is the rank-N approximation of the image filtering kernel matrix K obtained from the first N largest singular values and their eigenvectors, and ||·||_F denotes the Frobenius norm of a matrix (here applied to the image filtering kernel matrix K). In this embodiment, the value of N needs to satisfy the condition ε ≤ 0.1. The numerator runs over the singular values σ_{N+1}, σ_{N+2}, …, σ_R that remain after the first N largest of the R singular values obtained from the singular value decomposition of the image filtering kernel matrix K are removed.
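The rank-selection rule ε ≤ 0.1 can be sketched directly from the singular values, since the Frobenius norm of K equals the l2 norm of its singular-value vector (the function name below is illustrative):

```python
import numpy as np

def choose_rank(K, eps=0.1):
    """Smallest N whose rank-N truncation keeps the relative Frobenius error
    sqrt(sum of discarded s_i^2) / ||K||_F at or below eps."""
    s = np.linalg.svd(K, compute_uv=False)
    norm = np.sqrt(np.sum(s**2))  # ||K||_F
    for N in range(1, len(s) + 1):
        if np.sqrt(np.sum(s[N:] ** 2)) / norm <= eps:
            return N
    return len(s)
```

A kernel dominated by one singular value yields N = 1 (one horizontal-plus-vertical pass), while an identity-like kernel forces N up to the full rank.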
And step 22, respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result after the sub-filtering kernels are superposed, and obtaining a target blurring image based on all the filtering results.
After the image filtering kernel is decomposed into a plurality of sub-filtering kernels in the horizontal direction and the vertical direction, the image to be processed is respectively input into each sub-filtering kernel, so that the image to be processed can be simultaneously filtered in the horizontal direction and the vertical direction by the plurality of sub-filtering kernels obtained after decomposition, and the processing time of light spot editing and image blurring is shortened; and finally, superposing the filtering results output by the sub-filtering kernels to obtain a target blurring image, and improving the processing speed of image blurring while ensuring the light spot effect.
In one embodiment, in order to ensure that the bright parts of the image can still produce light spots after blurring, the filtered image obtained after filtering by each sub-filtering kernel needs to undergo light spot brightness amplification according to the calculated weight coefficients. The specific process of respectively inputting the image to be processed into each sub-filtering kernel to obtain the filtering result after the sub-filtering kernels are superposed, and obtaining the target blurred image based on all the filtering results, includes steps 41 to 42:
step 41, inputting the image to be processed into each sub-filtering kernel respectively to obtain a filtering result obtained after the sub-filtering kernels are superposed, and obtaining a filtering image based on all the filtering results;
specifically, the image I to be processed is respectively input into N constructed sub-filter kernels, and filtering in the horizontal direction and the vertical direction is carried out in each sub-filter kernel at the same time, so as to obtain a filtering result after the sub-filter kernels are superposed; and superposing the filtering results output by the N sub-filtering kernels to obtain a filtering image.
And 42, adjusting the light spot brightness of the filtered image according to the weight coefficient to obtain a target blurred image.
The light spot brightness of the filtered image is adjusted according to the weight coefficient W corresponding to each pixel point in the image to be processed; that is, the blurred image with light spots Î is obtained by element-wise (per-pixel) multiplication of the weight coefficient W with the filtered image. In order to ensure that the highlight parts can still produce light spots after the image is blurred, this adjustment amplifies the light spot brightness of the filtered image according to the weight coefficient W corresponding to each pixel point. Fig. 5 schematically shows the effect of the obtained blurred image with light spots. The obtained target blurred image is a background-blurred image whose light spots have the shape corresponding to the light spot shape setting information in the image blurring instruction.
The image blurring method provided by the embodiment of the invention comprises the steps of obtaining an image to be processed which is formed by fusing at least two images and a disparity map of the image to be processed; determining the radius of a filter kernel corresponding to each pixel point in the image to be processed according to the disparity map, distinguishing a virtual region and a non-virtual region of the image to be processed according to the position of the focal point in the disparity map, and determining the radius of the filter kernel corresponding to each pixel in the image to be processed; and acquiring an image blurring instruction, and generating an image filtering kernel according to the light spot shape setting information contained in the image blurring instruction and the radius of the filtering kernel corresponding to each pixel point. The light spot shape setting information in the image blurring instruction can be customized by a user, and the image filtering kernel is generated according to the light spot shape setting information and the radius of the filtering kernel corresponding to each pixel point, so that the light spot shape customized by the user can be generated by filtering the image to be processed, and the function of editable light spot shape is realized. 
After the image filtering kernel is decomposed into a plurality of sub filtering kernels in the horizontal direction and the vertical direction, the image to be processed is respectively input into each sub filtering kernel, so that the plurality of sub filtering kernels obtained after decomposition can simultaneously filter the image to be processed in the horizontal direction and the vertical direction, and the processing time of light spot editing and image blurring is shortened; and finally, superposing the filtering results output by the sub-filtering kernels to obtain a target blurring image with a user-defined light spot shape, and improving the processing speed of image blurring while ensuring the light spot effect.
Example two
Fig. 6 shows a device for blurring an image with light spots according to the second embodiment of the present invention. On the basis of the first embodiment, the embodiment of the present invention further provides a device 6 for blurring an image with light spots, the device including:
a disparity map obtaining module 601, configured to obtain a to-be-processed image and a disparity map of the to-be-processed image; the image to be processed is fused by at least two images;
a filtering kernel radius determining module 602, configured to determine, according to the disparity map, a filtering kernel radius corresponding to each pixel point in the image to be processed;
an image filtering kernel generating module 603, configured to obtain an image blurring instruction, and generate an image filtering kernel according to the filtering kernel radius corresponding to each pixel point and light spot shape setting information included in the image blurring instruction;
the target blurred image generating module 604 is configured to perform filtering processing on the image to be processed according to the image filtering core, so as to obtain a target blurred image.
In one example implementation, the target-blurred image generation module 604 includes:
the image filtering kernel decomposition unit is used for decomposing the image filtering kernel into a plurality of sub filtering kernels;
and the blurring image generation unit with the light spots is used for respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result after the sub-filtering kernels are superposed, and obtaining a target blurring image based on all the filtering results.
In one embodiment, the device 6 for blurring an image with light spots further includes:
the weight coefficient calculation module is used for setting the light spot brightness weight of the image filtering kernel according to the brightness value of each pixel point in the image to be processed, and obtaining the weight coefficient corresponding to each pixel point in the image to be processed according to the set light spot brightness weight; the weight coefficient is used for adjusting the brightness of each pixel point in the image to be processed so as to obtain the target blurred image.
In one embodiment, the speckle blurring image generating unit includes:
the filtering result superposition subunit is used for respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result obtained after the sub-filtering kernels are superposed, and obtaining a filtering image based on all the filtering results;
and the light spot brightness adjusting subunit is used for amplifying the light spot brightness of the filtering image according to the weight coefficient to obtain a target blurring image.
In one implementation example, the filtering kernel radius determination module 602 includes:
the focus parallax value determining unit is used for determining a parallax median value positioned in a preset value frame in the parallax image as a focus parallax value by taking the focus position of the parallax image as the center of the preset value frame;
and the filtering kernel radius determining unit is used for determining the filtering kernel radius corresponding to each pixel point in the image to be processed according to the maximum parallax value, the preset blurring radius, the focus parallax value and the parallax value corresponding to each pixel point in the image to be processed in the parallax map.
In one example, the image filtering kernel decomposition unit includes:
the singular value decomposition subunit is used for performing singular value decomposition on the image filtering kernel to obtain a plurality of singular values;
the sub-filtering kernel constructing subunit is used for selecting N singular values meeting preset conditions as target singular values and constructing N sub-filtering kernels according to the eigenvectors corresponding to the target singular values; each of the feature vectors includes two vectors in horizontal and vertical directions; the N sub-filter kernels have horizontal and vertical directions; and N is a positive integer greater than 0.
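The filtering kernel radius determining unit above names its inputs (maximum disparity value, preset blurring radius, focus disparity value, per-pixel disparity) but this passage does not give the exact formula, so the sketch below assumes a simple linear mapping: the radius grows with the disparity distance from the focus plane and reaches the preset radius at the extreme. The function name and the formula are assumptions, not the patent's.

```python
import numpy as np

def kernel_radii(disparity, d_focus, r_max):
    """Assumed per-pixel filter-kernel radius: proportional to the disparity
    distance from the focus plane, normalized by the largest such distance
    so that the preset blurring radius r_max is reached at the extreme."""
    d = disparity.astype(np.float64)
    dist = np.abs(d - d_focus)
    scale = max(dist.max(), 1e-9)  # avoid division by zero on flat maps
    return np.round(r_max * dist / scale).astype(int)
```

Pixels at the focus disparity get radius 0 (kept sharp), which is what separates the non-blurred region from the blurred background.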
The device for blurring the image with the light spot provided by the embodiment of the invention is used for acquiring a to-be-processed image fused by at least two images and a parallax image of the to-be-processed image; determining the radius of a filter kernel corresponding to each pixel point in the image to be processed according to the disparity map, distinguishing a virtual region and a non-virtual region of the image to be processed according to the position of the focal point in the disparity map, and determining the radius of the filter kernel corresponding to each pixel in the image to be processed; and acquiring an image blurring instruction, and generating an image filtering kernel according to the spot shape setting information contained in the image blurring instruction and the radius of the filtering kernel corresponding to each pixel point. The light spot shape setting information in the image blurring instruction can be customized by a user, and the image filtering kernel is generated according to the light spot shape setting information and the radius of the filtering kernel corresponding to each pixel point, so that the light spot shape customized by the user can be generated by filtering the image to be processed, and the function of editable light spot shape is realized. 
After the image filtering kernel is decomposed into a plurality of sub-filtering kernels in the horizontal direction and the vertical direction, the image to be processed is respectively input into each sub-filtering kernel, so that the image to be processed can be simultaneously filtered in the horizontal direction and the vertical direction by the plurality of sub-filtering kernels obtained after decomposition, and the processing time of light spot editing and image blurring is shortened; and finally, superposing the filtering results output by the sub-filtering kernels to obtain a target blurring image with a user-defined light spot shape, and improving the processing speed of image blurring while ensuring the light spot effect.
EXAMPLE III
Fig. 7 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention. The terminal device includes: a processor 71, a memory 72 and a computer program 73, such as a program for an image blurring method, stored in said memory 72 and executable on said processor 71. The processor 71 executes the computer program 73 to implement the steps in the above-mentioned image blurring method embodiment, such as the steps S110 to S150 shown in fig. 1.
Illustratively, the computer program 73 may be partitioned into one or more modules that are stored in the memory 72 and executed by the processor 71 to accomplish the present application. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 73 in the terminal device. For example, the computer program 73 may be divided into a disparity map obtaining module, a filter kernel radius determining module, an image filter kernel generating module, and a target blurred image generating module, and each module has the following specific functions:
the disparity map acquisition module is used for acquiring an image to be processed and a disparity map of the image to be processed; the image to be processed is fused by at least two images;
the filtering kernel radius determining module is used for determining the filtering kernel radius corresponding to each pixel point in the image to be processed according to the disparity map;
the image filtering kernel generating module is used for acquiring an image blurring instruction and generating an image filtering kernel according to the filtering kernel radius corresponding to each pixel point and light spot shape setting information contained in the image blurring instruction;
a target blurring image generation module, configured to perform filtering processing on the image to be processed according to the image filtering kernel to obtain a target blurring image.
The terminal device may include, but is not limited to, a processor 71, a memory 72, and a computer program 73 stored in the memory 72. It will be understood by those skilled in the art that fig. 7 is merely an example of a terminal device, and does not constitute a limitation of the device for blurring the image with the light spot, and may include more or less components than those shown, or combine some components, or different components, for example, the device for blurring the image with the light spot may further include an input-output device, a network access device, a bus, etc.
The Processor 71 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 72 may be an internal storage unit of the image blurring device with light spot, such as a hard disk or a memory of the image blurring device with light spot. The memory 72 may also be an external storage device, such as a plug-in hard disk provided on a device for blurring image spots, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like. Further, the memory 72 may also include both an internal storage unit with a spot blurring apparatus of an image and an external storage device. The memory 72 is used for storing the computer program and other programs and data required for the image blurring method. The memory 72 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic diskette, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier wave signal, a telecommunications signal, a software distribution medium, etc. It should be noted that the content of the computer readable medium may be subject to appropriate increase or decrease as required by legislation and patent practice in various jurisdictions; for example, in some jurisdictions, computer readable media do not include electrical carrier signals and telecommunications signals, as required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein.

Claims (13)

1. An image blurring method, comprising:
acquiring an image to be processed and a disparity map of the image to be processed; the image to be processed is fused by at least two images;
determining the radius of a filtering kernel corresponding to each pixel point in the image to be processed according to the disparity map;
acquiring an image blurring instruction, and generating an image filtering kernel according to the radius of the filtering kernel corresponding to each pixel point and light spot shape setting information contained in the image blurring instruction;
and filtering the image to be processed according to the image filtering kernel to obtain a target blurring image.
2. The image blurring method as claimed in claim 1, wherein said filtering the image to be processed according to the image filtering kernel to obtain a target blurring image, includes:
decomposing the image filtering kernel into a plurality of sub-filtering kernels;
and respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result after the sub-filtering kernels are superposed, and obtaining a target blurring image based on all the filtering results.
3. An image blurring method as claimed in claim 2, further comprising, before said decomposing said image filtering kernel into a number of sub-filtering kernels:
setting the light spot brightness weight of the image filtering kernel according to the brightness value of each pixel point in the image to be processed, and obtaining a weight coefficient corresponding to each pixel point in the image to be processed according to the set light spot brightness weight; the weight coefficient is used for adjusting the brightness of each pixel point in the image to be processed.
4. The image blurring method according to claim 3, wherein the inputting the image to be processed into each of the sub-filtering kernels respectively to obtain a filtering result obtained by superimposing the sub-filtering kernels, and obtaining the target blurring image based on all the filtering results comprises:
respectively inputting the image to be processed into each sub-filtering kernel to obtain a filtering result after the sub-filtering kernels are superposed, and obtaining a filtering image based on all the filtering results;
and adjusting the light spot brightness of the filtered image according to the weight coefficient to obtain a target blurred image.
5. The image blurring method according to any one of claims 1 to 4, wherein the target blurring image is a blurring image of a spot having a shape corresponding to the spot shape setting information.
6. The image blurring method as claimed in any one of claims 1 to 4, wherein the obtaining of the image to be processed and the disparity map of the image to be processed comprises:
and calculating the image according to a depth estimation algorithm to obtain a disparity map of the image to be processed.
7. The image blurring method according to any one of claims 1 to 4, wherein the determining, according to the disparity map, a filtering kernel radius corresponding to each pixel point in the image to be processed includes:
taking the focus position of the disparity map as the center of a preset value frame, and determining a disparity median value positioned in the preset value frame in the disparity map as a focus disparity value;
and determining the radius of a filter kernel corresponding to each pixel point in the image to be processed according to the maximum parallax value, a preset blurring radius, the focus parallax value and the parallax value corresponding to each pixel point in the image to be processed in the parallax map.
8. The image blurring method as claimed in claim 2, wherein said decomposing the image filter kernel into a number of sub-filter kernels comprises:
singular value decomposition is carried out on the image filtering kernel to obtain a plurality of singular values;
selecting N singular values meeting preset conditions as target singular values, and constructing N sub-filter kernels according to the eigenvectors corresponding to the target singular values; each of the feature vectors includes two vectors in horizontal and vertical directions; the N sub-filter kernels have horizontal and vertical directions; and N is a positive integer greater than 0.
9. An image blurring method as claimed in claim 8, wherein the preset condition is that the singular values are arranged into a queue in descending order of numerical value, and the first N singular values in the queue are selected.
10. An image blurring method as claimed in claim 1, wherein each of said images fused with said image to be processed is captured by an external camera.
11. An image blurring apparatus, comprising:
the disparity map acquisition module is used for acquiring an image to be processed and a disparity map of the image to be processed; the image to be processed is fused by at least two images;
the filter kernel radius determining module is used for determining the filter kernel radius corresponding to each pixel in the image to be processed according to the disparity map;
the image filter kernel generating module is used for acquiring an image blurring instruction and generating an image filter kernel according to the filter kernel radius corresponding to each pixel and the spot shape setting information contained in the image blurring instruction;
and the target blurred image generating module is used for filtering the image to be processed according to the image filter kernel to obtain a target blurred image.
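The four modules of claim 11 form a simple pipeline: disparity map in, per-pixel kernel radius, kernel generation, spatially varying filtering. A minimal end-to-end sketch, with a box kernel standing in for the configurable spot-shape kernel and a linear radius mapping (both are assumptions; all names are illustrative):

```python
import numpy as np

def blur_pipeline(image, disparity, focus_disparity, max_radius):
    """Toy version of the claim-11 pipeline: per-pixel radius from the
    disparity map, then a variable-radius box blur (spot shape assumed square)."""
    h, w = image.shape[:2]
    out = np.empty_like(image, dtype=np.float32)
    # Module 2: per-pixel filter kernel radius from the disparity map
    max_d = disparity.max()
    radius = (np.abs(disparity - focus_disparity) / max(max_d, 1) * max_radius)
    radius = radius.astype(int)
    # Modules 3-4: build a kernel per pixel and filter (brute force for clarity)
    for y in range(h):
        for x in range(w):
            r = radius[y, x]
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean(axis=(0, 1))
    return out
```

In-focus pixels (radius 0) pass through unchanged, while out-of-focus pixels are averaged over a neighbourhood that grows with their disparity offset; a production version would use the separable sub-kernels of claim 8 instead of this brute-force loop.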
12. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image blurring method according to any one of claims 1 to 10.
13. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the image blurring method according to any one of claims 1 to 10 when executing the computer program.
CN202011432957.5A 2020-12-09 2020-12-09 Image blurring method and terminal equipment Pending CN114626992A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011432957.5A CN114626992A (en) 2020-12-09 2020-12-09 Image blurring method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011432957.5A CN114626992A (en) 2020-12-09 2020-12-09 Image blurring method and terminal equipment

Publications (1)

Publication Number Publication Date
CN114626992A true CN114626992A (en) 2022-06-14

Family

ID=81894926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011432957.5A Pending CN114626992A (en) 2020-12-09 2020-12-09 Image blurring method and terminal equipment

Country Status (1)

Country Link
CN (1) CN114626992A (en)

Similar Documents

Publication Publication Date Title
US10645368B1 (en) Method and apparatus for estimating depth of field information
EP3134868B1 (en) Generation and use of a 3d radon image
CN111353948B (en) Image noise reduction method, device and equipment
US9367896B2 (en) System and method for single-frame based super resolution interpolation for digital cameras
CN110324532B (en) Image blurring method and device, storage medium and electronic equipment
CN108024054A (en) Image processing method, device and equipment
US11282176B2 (en) Image refocusing
CN109151329A (en) Photographic method, device, terminal and computer readable storage medium
TWI777098B (en) Method, apparatus and electronic device for image processing and storage medium thereof
CN111131688B (en) Image processing method and device and mobile terminal
JPWO2019065260A1 (en) Information processing equipment, information processing methods, programs, and interchangeable lenses
CN109214996A (en) A kind of image processing method and device
CN114627034A (en) Image enhancement method, training method of image enhancement model and related equipment
CN111105370B (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
CN108122218B (en) Image fusion method and device based on color space
EP3113475B1 (en) An apparatus and a method for modifying colours of a focal stack of a scene according to a colour palette
CN112700376A (en) Image moire removing method and device, terminal device and storage medium
CN110838088B (en) Multi-frame noise reduction method and device based on deep learning and terminal equipment
CN113159229A (en) Image fusion method, electronic equipment and related product
CN106454101A (en) Image processing method and terminal
CN114626992A (en) Image blurring method and terminal equipment
CN113014811A (en) Image processing apparatus, image processing method, image processing device, and storage medium
CN115086558B (en) Focusing method, image pickup apparatus, terminal apparatus, and storage medium
CN111835968B (en) Image definition restoration method and device and image shooting method and device
CN116012431A (en) Depth image acquisition method, apparatus and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination