CN114693508A - Image processing method, device, terminal equipment and computer readable storage medium - Google Patents


Info

Publication number
CN114693508A
CN114693508A (application CN202011630812.6A)
Authority
CN
China
Prior art keywords
image
channel
foreground
processed
noise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011630812.6A
Other languages
Chinese (zh)
Inventor
李鹏
刘阳兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan TCL Group Industrial Research Institute Co Ltd
Original Assignee
Wuhan TCL Group Industrial Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan TCL Group Industrial Research Institute Co Ltd filed Critical Wuhan TCL Group Industrial Research Institute Co Ltd
Priority to CN202011630812.6A priority Critical patent/CN114693508A/en
Publication of CN114693508A publication Critical patent/CN114693508A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The application is applicable to the technical field of image processing, and provides an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium. The method comprises the following steps: performing image processing on an image to be processed to obtain a mask and a blurred image; obtaining, according to the mask, a foreground Y-channel map of the image to be processed and a background Y-channel map of the blurred image; generating a Y-channel noise map from the foreground Y-channel map; and superposing the Y-channel noise map onto the background Y-channel map to obtain a blurring effect image. Because the blurring effect image is obtained by superposing the Y-channel noise map onto the background Y-channel map of the blurred image, the noise superposition is performed on a single channel, which reduces the computation and processing time of the blurring process and improves blurring efficiency.

Description

Image processing method, image processing device, terminal equipment and computer readable storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium.
Background
In the shooting function of existing terminal devices, an image captured in portrait mode is usually a blurring effect picture with a clear subject and a blurred background.
Captured images often contain noise, and after background blurring the noise consistency of the blurring effect picture is easily degraded, making the blurring effect look unrealistic.
To solve this problem, a noise superposition method is usually used to improve the noise consistency of the blurring effect picture. Existing noise superposition methods generally acquire the original image to be blurred, collect noise signals in it, periodically extend those noise signals, and superpose them onto the blurred image to obtain the final blurring effect picture.
However, such methods have poor applicability: the superposition involves a large amount of computation and a long blurring time, which increases the power consumption of the terminal device and reduces blurring efficiency.
Disclosure of Invention
The embodiments of the present application provide an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium, which can solve the problems of existing noise superposition methods: poor applicability, heavy computation, long processing time, and low blurring efficiency.
In a first aspect, an embodiment of the present application provides an image processing method, including:
carrying out image processing on an image to be processed to obtain a mask and a blurred image;
respectively obtaining a foreground Y-channel image of the image to be processed and a background Y-channel image of the blurred image according to the mask;
generating a Y-channel noise point diagram according to the foreground Y-channel diagram;
and superposing the Y-channel noise point diagram and the background Y-channel diagram to obtain a blurring effect image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the processing module is used for processing the image to be processed to obtain a mask and a blurred image;
the acquisition module is used for respectively obtaining a foreground Y-channel image of the image to be processed and a background Y-channel image of the blurred image according to the mask;
the generation module is used for generating a Y-channel noise point diagram according to the foreground Y-channel diagram;
and the superposition processing module is used for superposing the Y-channel noise point diagram and the background Y-channel diagram to obtain a blurring effect image.
In a third aspect, an embodiment of the present application provides a terminal device, where the terminal device includes a memory, a processor, and an image processing program stored in the memory and executable on the processor, and the processor implements the image processing method according to the first aspect when executing the image processing program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which an image processing program is stored, and when executed by a processor, the image processing program implements the image processing method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a terminal device, causes the terminal device to execute the image processing method of the first aspect.
In the embodiments of the present application, a mask of the image to be processed is obtained through calculation; a foreground Y-channel map of the image to be processed and a background Y-channel map of the blurred image are obtained according to the mask; a corresponding Y-channel noise map is generated from the foreground Y-channel map; and the Y-channel noise map is superposed onto the background Y-channel map to obtain a blurring effect image.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 2(a) is a schematic diagram of an image containing noisy details provided by an embodiment of the present application;
FIG. 2(b) is a schematic diagram of an image containing no noisy details provided by another embodiment of the present application;
FIG. 3 is a diagram illustrating a blurring effect without noise superposition according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a blurring effect image according to another embodiment of the present application;
fig. 5 is a schematic diagram of a Y-channel noise plot provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The image processing method provided by the embodiment of the application can be applied to terminal equipment such as a mobile phone, a tablet computer, a notebook computer and the like, and the embodiment of the application does not limit the specific type of the terminal equipment.
Fig. 1 shows a schematic flow chart of an image processing method provided by the present application, which can be applied to the above-mentioned mobile phone by way of example and not limitation.
S101: perform image processing on the image to be processed to obtain a mask and a blurred image.
In a specific application, the image to be processed is captured by a camera of the current terminal. The image to be processed is the original image on which image processing is to be performed. In this embodiment, the image to be processed is set to a preset format, for example YUV, or RGB according to requirements, so that the method is suitable for terminal devices with different camera types.
In a specific application, a binocular disparity map corresponding to the image to be processed is acquired together with the position information of a target focus in the binocular disparity map; the mask of the image to be processed is then calculated from the binocular disparity map and the position information of the target focus. Meanwhile, the image to be processed is blurred according to the binocular disparity map and the position information of the target focus to obtain the blurred image. The mask (MASK) is used to distinguish the foreground and background of an image. The manner of acquiring the position information of the target focus can be set according to the actual situation; in this embodiment, a point selected by the user in the binocular disparity map is taken as the target focus, and the corresponding position information is obtained.
As shown in fig. 2(a), a schematic of an image containing noisy details is provided;
as shown in fig. 2(b), a schematic of an image containing no noisy details is provided.
As can be seen by comparing fig. 2(a) and fig. 2(b), the foreground and background noise in the image containing noise details is highly consistent, so the transition between the blurred background and the foreground edges is natural, and the blurring effect is closer to a real optical blurring effect.
S102: obtain, according to the mask, a foreground Y-channel map of the image to be processed and a background Y-channel map of the blurred image.
In a specific application, the foreground and background of the image to be processed and of the blurred image are separated according to the mask of the image to be processed, yielding the foreground Y-channel map of the image to be processed and the background Y-channel map of the blurred image, respectively.
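As an illustration of this splitting step, the sketch below assumes 8-bit Y (luma) planes and a binary mask in which 1 marks foreground; the function name and the zero-fill convention for masked-out pixels are our own assumptions, not part of the patent.

```python
import numpy as np

def split_y_channels(image_y, blurred_y, mask):
    """Split Y (luma) planes into a foreground map and a background map.

    mask == 1 marks foreground pixels of the image to be processed;
    mask == 0 marks background pixels taken from the blurred image.
    Masked-out pixels are zero-filled here (an assumption).
    """
    foreground_y = np.where(mask == 1, image_y, 0)    # foreground Y of the original
    background_y = np.where(mask == 0, blurred_y, 0)  # background Y of the blurred image
    return foreground_y, background_y
```

The two returned planes feed the noise-generation step (foreground) and the superposition step (background), respectively.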
S103: generate a Y-channel noise map from the foreground Y-channel map.
In a specific application, a noise distribution parameter of the foreground Y-channel map of the image to be processed is calculated from the pixel values of the pixels in the image to be processed, and a corresponding Y-channel noise map is generated from this noise distribution parameter. The noise distribution parameters of the foreground Y-channel map comprise the mean and the variance of the pixels in the foreground Y-channel map; a Y-channel noise map carrying Gaussian noise and white noise is generated from this mean and variance.
S104: superpose the Y-channel noise map onto the background Y-channel map to obtain a blurring effect image.
In a specific application, the pixel values of the pixels in the Y-channel noise map are superposed onto the pixel values of the pixels in the background Y-channel map of the blurred image, so as to update the pixel values of the background Y-channel pixels in the blurred image and thereby obtain the blurring effect image.
In a specific application, the pixels Bokeh_Y_B in the background Y-channel map of the blurred image can be determined according to the mask, and the noise in the Y-channel noise map is superposed onto the background Y-channel map of the blurred image according to a preset formula (formula 1) to obtain the blurring effect image.
Bokeh_Y_B_i = Bokeh_Y_B_i + p · Noise_i    (formula 1)
wherein: Bokeh_Y_B_i represents the pixel value of the i-th pixel in the background Y-channel map of the blurred image, Noise_i represents the i-th pixel in the Y-channel noise map, and p follows a binomial probability distribution and is randomly chosen as 0 or 1.
As shown in fig. 3, a schematic diagram of a blurring effect graph without noise superposition is provided;
as shown in fig. 4, a schematic diagram of a blurring effect image is provided.
Directly blurring the image to be processed yields a blurring effect picture as shown in fig. 3; a Y-channel noise map is then obtained by calculation, and noise superposition is performed between this Y-channel noise map and the background Y-channel map of the blurring effect picture of fig. 3, yielding the superposed blurring effect image shown in fig. 4.
In one embodiment, step S101 includes:
s1021, acquiring a binocular disparity map corresponding to the image to be processed and position information of a target focus;
s1022, calculating the image to be processed according to the binocular disparity map and the position information to obtain a mask;
and S1023, blurring the image to be processed according to the binocular disparity map and the position information to obtain a blurred image.
In a specific application, the binocular disparity map corresponding to the image to be processed is acquired, along with the position information of the target focus selected by the user in the binocular disparity map. The image to be processed is blurred according to the position information of the target focus and the binocular disparity map to obtain the blurred image; meanwhile, the disparity value of the target focus is calculated with the position of the target focus as the center in the binocular disparity map, and the mask of the image to be processed is then calculated from the disparity value of the target focus.
In one embodiment, step S1022 includes:
taking the position corresponding to the position information as a center, acquiring the median of all pixel points in a preset area in the binocular disparity map, and taking the median as the disparity value of the target focus;
calculating the radius of a fuzzy kernel of each pixel point in the image to be processed;
performing fuzzy processing on each pixel point in the image to be processed according to the radius of the fuzzy kernel to obtain a fuzzy image to be processed;
and obtaining a mask according to the pixel value of each pixel point in the blurred image to be processed.
In a specific application, a preset area is established with the position corresponding to the position information of the target focus in the binocular disparity map as its center point, and the median of the pixel values of all pixels in the preset area is taken as the disparity value of the target focus. The blur kernel radius of each pixel in the image to be processed is then calculated from the disparity value of the target focus, the disparity value of each pixel in the binocular disparity map, and the maximum disparity value among all pixels of the binocular disparity map. Each pixel is blurred according to its blur kernel radius to obtain the pixel values of the blurred image to be processed. Whether each pixel belongs to the foreground is judged by whether its blur kernel radius is 0 (a pixel with blur kernel radius 0 is set as foreground, and a pixel with a non-zero blur kernel radius as background), thereby obtaining the mask of the image to be processed. The preset area can be set according to actual requirements; for example, its size is set equal to 1/8 of the image to be processed.
In a specific application, the blur kernel radius r_{i,j} of each pixel in the image to be processed can be calculated by formula (2):
r_{i,j} = 0, if |d_{i,j} − d_focus| ≤ Δd;  r_{i,j} = R · |d_{i,j} − d_focus| / d_max, otherwise    (formula 2)
wherein: R represents the maximum blur radius (which can be set according to data input by the user), d_{i,j} represents the disparity value of pixel (i, j) in the binocular disparity map, d_focus represents the disparity value of the target focus, d_max represents the maximum disparity value among all pixels of the binocular disparity map, and Δd represents the disparity dynamic range within which a pixel at the target focus can be regarded as foreground, which can be set according to the actual situation. Experiments show that the calculated mask is most accurate when Δd = 0.03 · d_max, so in this embodiment Δd is set to 0.03 · d_max.
In one embodiment, step S103 includes:
s1041, carrying out high-pass filtering processing on the foreground Y-channel image to obtain a filtering foreground Y-channel image;
s1042, calculating a pixel difference value of the foreground Y-channel image and the filtering foreground Y-channel image to obtain a noise point image of the foreground Y-channel;
s1043, calculating to obtain a noise distribution parameter of the foreground Y-channel noise point diagram;
and S1044, generating a corresponding Y-channel noise point diagram according to the noise distribution parameters.
In a specific application, the foreground Y-channel map of the image to be processed is high-pass filtered to obtain a filtered foreground Y-channel map, and the pixel difference between each pixel in the foreground Y-channel map and the corresponding pixel in the filtered foreground Y-channel map is calculated to obtain the corresponding foreground Y-channel noise map; the pixel value of each pixel in the foreground Y-channel noise map is this pixel difference. The noise distribution parameters of the foreground Y-channel noise map are then calculated from the pixel values of all its pixels, and a corresponding Y-channel noise map is generated from these noise distribution parameters. The noise distribution parameters include, but are not limited to, the mean and the variance.
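The filter-and-subtract noise estimate of steps S1041–S1042 can be sketched as follows. The patent does not specify the filter kernel, so a simple box filter stands in for the filtering step, and all names are hypothetical.

```python
import numpy as np

def foreground_noise_map(foreground_y, ksize=3):
    """Noise map of the foreground Y channel as the difference between the
    foreground Y plane and a filtered copy (steps S1041-S1042). A box
    filter stands in for the unspecified filtering step."""
    pad = ksize // 2
    padded = np.pad(foreground_y.astype(np.float64), pad, mode="edge")
    h, w = foreground_y.shape
    smoothed = np.zeros((h, w), dtype=np.float64)
    for dy in range(ksize):                       # accumulate the box-filter window
        for dx in range(ksize):
            smoothed += padded[dy:dy + h, dx:dx + w]
    smoothed /= ksize * ksize
    return foreground_y.astype(np.float64) - smoothed  # residual = noise estimate
```

The residual isolates the high-frequency content (grain) of the foreground, whose statistics drive the noise synthesis below.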
In a specific application, since noise is generally produced by a large number of random disturbances, by the central limit theorem the distribution of noise in the foreground Y-channel noise map can be assumed to follow a Gaussian distribution. The mean and variance of the foreground Y-channel noise map can therefore be calculated by formula (3) and formula (4):
υ = (1/N) Σ_{i=1}^{N} p_i    (formula 3)
σ² = (1/N) Σ_{i=1}^{N} (p_i − υ)²    (formula 4)
wherein: υ denotes the mean, σ² denotes the variance, p_i denotes the pixel value of the i-th pixel in the foreground Y-channel noise map, and N denotes the number of pixels in the foreground Y-channel noise map.
In a specific application, the pixel values of the noise in the Y-channel noise map can be obtained from the mean and variance of the pixels in the foreground Y-channel noise map by formula (5) and formula (6), and the Y-channel noise map generated accordingly:
Noise_i = 2 · MAX · (N(x_i; υ, σ²) − 0.5)    (formula 5)
N(x_i; υ, σ²) = (1 / (σ√(2π))) ∫_{−∞}^{x_i} exp(−(t − υ)² / (2σ²)) dt    (formula 6)
wherein: Noise_i represents the pixel value of a noise point in the Y-channel noise map, x_i represents a random number taking any value in [−255, 255], N(·) denotes the Gaussian cumulative distribution function with mean υ and variance σ², and MAX represents the maximum noise value of the Y-channel noise map, which can be set according to specific requirements. In this embodiment, MAX is set to 8, since the generated Y-channel noise map was found to be most effective at this value.
It can be understood that the size of the Y-channel noise map generated according to formulas (5) and (6) above is equal to 1/2 of the size of the image to be processed. Therefore, the generated Y-channel noise map needs to be enlarged to the size of the image to be processed by a nearest-neighbor interpolation (nearest_neighbor) algorithm, in which the gray value of each transformed pixel is set equal to the gray value of the nearest input pixel.
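Steps S1043–S1044 plus the nearest-neighbor enlargement can be sketched as below. Reading N(·) in formula (5) as the Gaussian cumulative distribution function is our assumption (formula (6) is rendered as an illegible image in this copy), and all names are hypothetical.

```python
import math
import numpy as np

def generate_y_noise(shape, mean, var, max_noise=8, rng=None):
    """Formulas (5)/(6) as we read them: draw x_i uniformly from
    [-255, 255], pass it through the Gaussian CDF N(x; mean, var), and
    scale the result to [-MAX, MAX]. The CDF reading of N(.) is an
    assumption."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.uniform(-255.0, 255.0, size=shape)
    sigma = math.sqrt(var)
    erf = np.vectorize(math.erf)
    cdf = 0.5 * (1.0 + erf((x - mean) / (sigma * math.sqrt(2.0))))
    return 2.0 * max_noise * (cdf - 0.5)          # formula (5)

def upscale_nearest(noise, out_shape):
    """Enlarge the half-size noise map to image size by nearest neighbour."""
    h, w = noise.shape
    rows = np.arange(out_shape[0]) * h // out_shape[0]
    cols = np.arange(out_shape[1]) * w // out_shape[1]
    return noise[np.ix_(rows, cols)]
```

With the embodiment's MAX = 8, every generated noise value lies in [−8, 8] before superposition onto the background Y channel.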
As shown in fig. 5, a schematic diagram of a Y-channel noise plot is provided.
Fig. 5 is a Y-channel noise point diagram carrying gaussian noise points and white noise points generated according to noise distribution parameters of pixels in the foreground Y-channel diagram.
In one embodiment, step S1043 includes:
calculating to obtain a noise distribution parameter of the foreground Y-channel noise point diagram according to the pixel value and the pixel number of each pixel point in the foreground Y-channel noise point diagram; wherein the noise distribution parameters include a mean and a variance.
In specific application, the mean value and the variance of the foreground Y-channel noise point diagram are respectively calculated according to the pixel value of each pixel point in the foreground Y-channel noise point diagram and the number of the pixel points in the foreground Y-channel noise point diagram.
In one embodiment, step S1044 includes:
and calculating to obtain the pixel value of each noise point in the Y channel according to the mean value and the variance, and generating a corresponding Y channel noise point diagram according to the pixel values.
In specific application, the mean value and the variance of pixel points in the foreground Y-channel noise point diagram are calculated according to a preset formula to obtain the pixel value of each noise point in a Y channel, and a corresponding Y-channel noise point diagram is generated according to the pixel values of all the noise points. The preset formulas are the formula (5) and the formula (6).
In the embodiments of the present application, a mask of the image to be processed is obtained through calculation; a foreground Y-channel map of the image to be processed and a background Y-channel map of the blurred image are obtained according to the mask; a corresponding Y-channel noise map is generated from the foreground Y-channel map; and the Y-channel noise map is superposed onto the background Y-channel map to obtain a blurring effect image. This realizes a single-channel noise superposition operation on the image to be processed, reduces the computation and processing time of the blurring process, and improves blurring efficiency.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 6 shows a block diagram of an image processing apparatus provided in an embodiment of the present application, corresponding to the image processing method of the above embodiment, and only shows portions related to the embodiment of the present application for convenience of explanation.
Referring to fig. 6, the image processing apparatus 100 includes:
the processing module 101 is configured to perform image processing on an image to be processed to obtain a mask and a blurred image;
an obtaining module 102, configured to obtain a foreground Y-channel image of the image to be processed and a background Y-channel image of the blurred image according to the mask;
the generating module 103 is configured to generate a Y channel noise point diagram according to the foreground Y channel diagram;
and the superposition processing module 104 is configured to superpose the Y-channel noise point diagram and the background Y-channel diagram to obtain a blurring effect image.
In one embodiment, the processing module 101 includes:
the first acquisition unit is used for acquiring a binocular disparity map corresponding to an image to be processed and position information of a target focus;
the first calculation unit is used for calculating the image to be processed according to the binocular disparity map and the position information to obtain a mask;
and the blurring processing unit is used for blurring the image to be processed according to the binocular disparity map and the position information to obtain a blurring image.
In one embodiment, a first computing unit includes:
the first obtaining subunit is used for obtaining the median of all pixel points in a preset area in the binocular disparity map by taking the position corresponding to the position information as the center, and taking the median as the disparity value of the target focus;
the first calculating subunit is used for calculating the fuzzy kernel radius of each pixel point in the image to be processed;
the fuzzy processing subunit is used for carrying out fuzzy processing on each pixel point in the image to be processed according to the radius of the fuzzy kernel to obtain a fuzzy image to be processed;
and the second obtaining subunit is used for obtaining a mask according to the pixel value of each pixel point in the blurred image to be processed.
In one embodiment, the generating module 103 includes:
the filtering processing unit is used for carrying out high-pass filtering processing on the foreground Y-channel image to obtain a filtering foreground Y-channel image;
the second calculation unit is used for calculating the pixel difference value of the foreground Y-channel image and the filtering foreground Y-channel image to obtain a noise point image of the foreground Y-channel;
the third calculating unit is used for calculating and obtaining the noise distribution parameters of the foreground Y-channel noise point diagram;
and the generating unit is used for generating a corresponding Y-channel noise point diagram according to the noise distribution parameters.
In one embodiment, the third computing unit includes:
the second calculating subunit is used for calculating a noise distribution parameter of the foreground Y-channel noise point diagram according to the pixel value and the pixel number of each pixel point in the foreground Y-channel noise point diagram; wherein the noise distribution parameters include a mean and a variance.
In one embodiment, a generation unit includes:
and the generating subunit is used for calculating the pixel value of each noise point in the Y channel according to the mean and the variance, and generating a corresponding Y-channel noise map according to the pixel values.
The method obtains a mask of the image to be processed by calculation, obtains a foreground Y-channel image of the image to be processed and a background Y-channel image of the blurred image according to the mask, generates a corresponding Y-channel noise map from the foreground Y-channel image, and superimposes the Y-channel noise map on the background Y-channel image to obtain a blurring effect image.
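As a concrete illustration of this overall flow, the mask-based Y-channel split and the final superposition might look as follows. This is a hedged sketch under assumed conventions (binary 0/1 mask, 8-bit luma clipped to [0, 255]), not the patent's definitive implementation.

```python
import numpy as np

def split_by_mask(y_to_process, y_blurred, mask):
    """Mask selects foreground pixels from the source Y channel and
    background pixels from the blurred Y channel (zeros elsewhere)."""
    foreground_y = np.where(mask == 1, y_to_process, 0)
    background_y = np.where(mask == 0, y_blurred, 0)
    return foreground_y, background_y

def composite_bokeh(noise_map_y, background_y):
    """Superimpose the generated Y-channel noise map on the background
    Y channel, clipping back into the valid 8-bit luma range."""
    out = background_y.astype(np.float64) + noise_map_y
    return np.clip(out, 0, 255).astype(np.uint8)
```

The point of the superposition is that the blurred background regains noise statistics consistent with the sharp foreground, so the composite does not look artificially smooth.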
It should be noted that the information interaction and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application; for their specific functions and technical effects, reference may be made to the method embodiments, and details are not repeated here.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment includes: at least one processor 70 (only one is shown in fig. 7), a memory 71, and an image processing program 72 stored in the memory 71 and executable on the at least one processor 70; when the processor 70 executes the image processing program 72, the steps in any of the image processing method embodiments described above are implemented.
The terminal device 7 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, the processor 70 and the memory 71. Those skilled in the art will appreciate that fig. 7 is only an example of the terminal device 7 and does not constitute a limitation thereon; the terminal device 7 may include more or fewer components than shown, combine certain components, or use different components, and may further include, for example, input/output devices, network access devices, and the like.
The processor 70 may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 71 may, in some embodiments, be an internal storage unit of the terminal device 7, such as a hard disk or memory of the terminal device 7. In other embodiments, the memory 71 may also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the terminal device 7. Further, the memory 71 may include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of an image processing program. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the above functional units and modules is illustrated. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for ease of distinguishing them from one another and are not intended to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
The embodiment of the present application further provides a computer-readable storage medium, where an image processing program is stored, and when the image processing program is executed by a processor, the image processing program implements the steps that can be implemented in the above method embodiments.
An embodiment of the present application further provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the method embodiments described above may be implemented by an image processing program, which may be stored in a computer-readable storage medium; when the image processing program is executed by a processor, the steps of the method embodiments described above are implemented. The image processing program includes computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative: the division of modules or units is merely a logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application, and shall be included within the protection scope of the present application.

Claims (10)

1. An image processing method, comprising:
performing image processing on an image to be processed to obtain a mask and a blurred image;
respectively obtaining a foreground Y-channel image of the image to be processed and a background Y-channel image of the blurred image according to the mask;
generating a Y-channel noise map according to the foreground Y-channel image;
and superimposing the Y-channel noise map on the background Y-channel image to obtain a blurring effect image.
2. The method of claim 1, wherein performing image processing on the image to be processed to obtain a mask and a blurred image comprises:
acquiring a binocular disparity map corresponding to an image to be processed and position information of a target focus;
calculating the image to be processed according to the binocular disparity map and the position information to obtain a mask;
and performing blurring processing on the image to be processed according to the binocular disparity map and the position information to obtain a blurred image.
3. The method of claim 2, wherein the calculating the image to be processed according to the binocular disparity map and the position information to obtain a mask comprises:
taking the position corresponding to the position information as a center, acquiring a median value of all pixel points in a preset area in the binocular disparity map, and taking the median value as a disparity value of the target focus;
calculating a blur kernel radius of each pixel point in the image to be processed;
performing blur processing on each pixel point in the image to be processed according to the blur kernel radius to obtain a blurred image to be processed;
and obtaining a mask according to the pixel value of each pixel point in the blurred image to be processed.
4. The method of any one of claims 1-3, wherein generating the Y-channel noise map according to the foreground Y-channel image comprises:
performing high-pass filtering on the foreground Y-channel image to obtain a filtered foreground Y-channel image;
calculating the pixel difference between the foreground Y-channel image and the filtered foreground Y-channel image to obtain a foreground Y-channel noise map;
calculating noise distribution parameters of the foreground Y-channel noise map;
and generating a corresponding Y-channel noise map according to the noise distribution parameters.
5. The method of claim 4, wherein calculating the noise distribution parameters of the foreground Y-channel noise map comprises:
calculating the noise distribution parameters of the foreground Y-channel noise map according to the pixel value of each pixel point in the foreground Y-channel noise map and the number of pixel points; wherein the noise distribution parameters include a mean and a variance.
6. The method of claim 5, wherein generating the corresponding Y-channel noise map according to the noise distribution parameters comprises:
calculating the pixel value of each noise point in the Y channel according to the mean and the variance, and generating the corresponding Y-channel noise map according to the pixel values.
7. An image processing apparatus characterized by comprising:
the processing module is used for performing image processing on the image to be processed to obtain a mask and a blurred image;
the acquisition module is used for respectively obtaining a foreground Y-channel image of the image to be processed and a background Y-channel image of the blurred image according to the mask;
the generating module is used for generating a Y-channel noise map according to the foreground Y-channel image;
and the superimposing module is used for superimposing the Y-channel noise map on the background Y-channel image to obtain a blurring effect image.
8. The apparatus of claim 7, wherein the processing module comprises:
the first acquisition unit is used for acquiring a binocular disparity map corresponding to an image to be processed and position information of a target focus;
the first calculation unit is used for calculating the image to be processed according to the binocular disparity map and the position information to obtain a mask;
and the blurring processing unit is used for performing blurring processing on the image to be processed according to the binocular disparity map and the position information to obtain a blurred image.
9. A terminal device, characterized in that the terminal device comprises a memory, a processor and an image processing program stored in the memory and executable on the processor, the processor implementing the method according to any one of claims 1 to 6 when executing the image processing program.
10. A computer-readable storage medium, characterized in that it stores an image processing program which, when executed by a processor, implements the method of any one of claims 1 to 6.
CN202011630812.6A 2020-12-30 2020-12-30 Image processing method, device, terminal equipment and computer readable storage medium Pending CN114693508A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011630812.6A CN114693508A (en) 2020-12-30 2020-12-30 Image processing method, device, terminal equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN114693508A true CN114693508A (en) 2022-07-01

Family

ID=82134478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011630812.6A Pending CN114693508A (en) 2020-12-30 2020-12-30 Image processing method, device, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114693508A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination