CN110874814B - Image processing method, image processing device and terminal equipment - Google Patents

Image processing method, image processing device and terminal equipment

Info

Publication number
CN110874814B
Authority
CN
China
Prior art keywords
image
position information
foreground
processed
pixel
Prior art date
Legal status
Active
Application number
CN201811013836.XA
Other languages
Chinese (zh)
Other versions
CN110874814A (en)
Inventor
张弓
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811013836.XA priority Critical patent/CN110874814B/en
Publication of CN110874814A publication Critical patent/CN110874814A/en
Application granted granted Critical
Publication of CN110874814B publication Critical patent/CN110874814B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T3/04
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration by the use of local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Abstract

The application provides an image processing method, an image processing device and terminal equipment. The method comprises the following steps: acquiring an image to be processed; detecting a foreground in the image to be processed to obtain position information of the foreground; performing an expansion operation on the image area indicated by the position information in the image to be processed to obtain corrected position information; and blurring the areas of the image to be processed other than the image area indicated by the corrected position information to obtain a processed image. The method and the device make the blurred image look more natural.

Description

Image processing method, image processing device and terminal equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal device, and a computer readable storage medium.
Background
Existing terminal devices (such as mobile phones and tablet computers) can capture images with a background blurring effect. The conventional background blurring method first detects the foreground area and the background area in an image to be processed, and then blurs the background area, so as to obtain a blurred image with a sharp foreground area and a blurred background area.
However, in an image blurred by the conventional background blurring method, the foreground area is relatively sharp while the background area is relatively blurred, so the junction between the foreground area and the background area appears abrupt and the blurred image looks unnatural. For example, when the foreground is a portrait, in the blurred image obtained by the conventional background blurring method the image area on one side of the edge contour of the portrait is sharp while the image area on the other side is blurred, so the edge area of the portrait in the blurred image looks unnatural.
Disclosure of Invention
In view of the above, the present application provides an image processing method, an image processing apparatus, a terminal device and a computer readable storage medium, which can alleviate the technical problem that images blurred by the conventional background blurring method look unnatural.
The first aspect of the present application provides an image processing method, including:
acquiring an image to be processed;
detecting the foreground in the image to be processed to obtain the position information of the foreground;
performing expansion operation on an image area indicated by the position information in the image to be processed to obtain corrected position information;
and blurring the region except the image region indicated by the correction position information in the image to be processed to obtain a processed image.
A second aspect of the present application provides an image processing apparatus, comprising:
the image acquisition module is used for acquiring an image to be processed;
the foreground detection module is used for detecting the foreground in the image to be processed and obtaining the position information of the foreground;
the expansion operation module is used for carrying out expansion operation on the image area indicated by the position information in the image to be processed to obtain corrected position information;
and the blurring processing module is used for blurring the areas except the image area indicated by the correction position information in the image to be processed to obtain a processed image.
A third aspect of the present application provides a terminal device comprising a memory, a processor and a computer program stored in said memory and executable on said processor, said processor implementing the steps of the method of the first aspect as described above when said computer program is executed.
A fourth aspect of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, performs the steps of the method of the first aspect described above.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
From the foregoing, the present application provides an image processing method. Firstly, an image to be processed is acquired, for example an image captured by the user with a mobile phone camera. Secondly, the foreground in the image to be processed is detected to obtain the position information of the foreground, and the image area where the foreground is located is determined through this position information. Then, an expansion operation is performed on the image area indicated by the position information in the image to be processed, that is, on the image area where the foreground is located, so as to obtain corrected position information; the image area indicated by the corrected position information covers the image area where the foreground is located. Finally, the region of the image to be processed other than the image region indicated by the corrected position information is blurred to obtain the processed image. Therefore, when the image to be processed is blurred, only the area outside the image area indicated by the corrected position information is blurred. Because the corrected position information is obtained by expanding the image area where the foreground is located, the image area it indicates covers the foreground area, so the phenomenon in which the image area on one side of the foreground contour is sharp while the image area on the other side is blurred does not occur in the processed image, and the junction between the foreground area and the background area is no longer abrupt. Hence, the technical solution provided by the present application can, to a certain extent, solve the technical problem that images blurred by the conventional background blurring method look unnatural.
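As an illustrative sketch of this flow, the steps above can be composed roughly as follows in Python, assuming OpenCV and NumPy are available and that detect_foreground() is a hypothetical helper returning a binary foreground mask; the specification does not prescribe any particular library, kernel size or dilation margin, so those values are assumptions.

```python
import cv2
import numpy as np

def blur_background(image, detect_foreground, blur_ksize=15, dilate_px=10):
    """Blur everything outside a dilated foreground mask (illustrative only)."""
    # Step S102: detect the foreground; detect_foreground() is assumed to
    # return a mask with 255 for foreground pixels and 0 elsewhere.
    mask = detect_foreground(image)

    # Step S103: expansion (dilation) so the corrected region covers the
    # foreground and its edge; dilate_px is an assumed margin.
    kernel = np.ones((dilate_px, dilate_px), np.uint8)
    corrected = cv2.dilate(mask, kernel)

    # Step S104: blur the whole image, then restore the original pixels
    # inside the corrected (dilated) region.
    blurred = cv2.GaussianBlur(image, (blur_ksize, blur_ksize), 0)
    return np.where(corrected[..., None] == 255, image, blurred)
```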
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic implementation flow chart of an image processing method according to a first embodiment of the present application;
FIG. 2 is a schematic diagram of detecting foreground position information based on depth information according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a training process implementation flow of a full convolutional neural network model according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a training process of a full convolutional neural network model according to an embodiment of the present application;
fig. 5 is a schematic flow chart of an implementation of step S103 provided in the first embodiment of the present application;
FIG. 6 is a schematic diagram of an expansion operation according to an embodiment of the present disclosure;
fig. 7 is a schematic implementation flow chart of another image processing method according to the second embodiment of the present application;
fig. 8 is a schematic diagram of a method for determining correction position information according to a second embodiment of the present application;
Fig. 9 is a schematic structural view of an image processing apparatus according to a third embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The image processing method provided by the embodiment of the application may be applicable to a terminal device, and exemplary terminal devices include, but are not limited to: smart phones, tablet computers, learning machines, smart wearable devices, etc.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if a [described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting the [described condition or event]" or "in response to detecting the [described condition or event]".
In addition, in the description of the present application, the terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
In order to illustrate the technical solutions described above, the following description is made by specific embodiments.
Example 1
Referring to fig. 1, an image processing method according to a first embodiment of the present application includes:
in step S101, an image to be processed is acquired;
in the embodiment of the application, an image to be processed is first acquired. The image to be processed may be an image captured by the user with the terminal device, for example an image taken after the user opens the camera application on a mobile phone and presses the shutter button; or a frame of the preview stream collected by the camera after the user opens the camera or camcorder in the terminal device, for example a preview frame collected after the user opens the camera application on a mobile phone; or an image received through another application, for example an image sent by a contact and received by the user in WeChat; or an image downloaded from the Internet, for example an image the user downloads in a browser over a carrier network; or a frame of a video, for example one frame of a television show the user is watching. The source of the image to be processed is not limited here.
In step S102, detecting a foreground in the image to be processed, to obtain position information of the foreground;
in this embodiment of the present application, after the image to be processed is obtained, the blurring of the image to be processed is implemented through the subsequent steps S102 to S104. Therefore, before step S102, it may first be detected whether the user has input an instruction for blurring the image to be processed; if such an instruction is detected, step S102 and the subsequent steps are executed.
In addition, in the embodiment of the present application, if the image to be processed acquired in the step S101 is an image acquired by a camera in the terminal device, the step S102 may be:
firstly, depth information of the image to be processed is acquired. The depth information may be obtained based on structured light, a TOF (Time of Flight) camera, a binocular camera or similar methods, and indicates the distance of each pixel point in the image to be processed from the terminal device. Secondly, according to the depth information, the pixel points of the image to be processed whose distance from the terminal device is smaller than a preset distance are determined. Then, the position information of each pixel point whose distance from the terminal device is smaller than the preset distance is determined as the position information of the foreground in the image to be processed. As shown in fig. 2, assume that the image 201 is an image to be processed captured by the terminal device 200. In step S102, depth information of the image 201 is first acquired; the depth information indicates the distance of each pixel point in the image 201 from the terminal device 200, for example that pixel point A is 2 meters from the terminal device 200 and pixel point B is 1.8 meters from the terminal device 200. Secondly, according to the depth information of the image 201, all pixel points whose distance from the terminal device 200 is smaller than the preset distance are determined, and the image area formed by these pixel points is the foreground area of the image 201; if the preset distance is 2.2 meters, and the depth information indicates that pixel point A is 2 meters and pixel point B is 1.8 meters from the terminal device 200, then pixel points A and B belong to the pixel points forming the foreground of the image 201. Then, the position information of each pixel point whose distance from the terminal device 200 is smaller than the preset distance is determined as the position information of the foreground in the image 201. As shown in fig. 2, assume that pixel points A and B are the finally determined pixel points whose distance from the terminal device 200 is smaller than the preset distance; the position information of pixel point A is (1000, 1000), i.e. pixel point A is located 1000 pixels from the first pixel point O in the upper left corner in the X direction and 1000 pixels from point O in the Y direction, and the position information of pixel point B is (1800, 2300), i.e. pixel point B is located 1800 pixels from point O in the X direction and 2300 pixels from point O in the Y direction; the position information of the foreground in the image 201 is then (1000, 1000) and (1800, 2300). In addition, in the embodiment of the present application, the position information of a pixel point may be expressed by the number of pixel points from the reference point, or by the physical distance from the reference point; for example, the position information of pixel point A in fig. 2 may also be expressed as (3 cm, 3 cm), i.e. pixel point A is located 3 cm from point O in the X direction and 3 cm from point O in the Y direction.
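A minimal sketch of this depth-threshold detection, assuming the depth map is already aligned with the image and expressed in meters; the 2.2 m threshold mirrors the example above, and the function and parameter names are illustrative.

```python
import numpy as np

def foreground_from_depth(depth_map, preset_distance_m=2.2):
    """Return a foreground mask and the pixel positions closer than the preset distance."""
    mask = depth_map < preset_distance_m               # True where the pixel is near the device
    ys, xs = np.nonzero(mask)                          # row/column indices of foreground pixels
    positions = list(zip(xs.tolist(), ys.tolist()))    # (X, Y) positions, as in fig. 2
    return mask, positions
```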
In addition, in the step S102, the method for detecting the foreground in the image to be processed, so as to obtain the position information of the foreground may further be:
and detecting the foreground in the image to be processed by using a trained full convolution neural network (Fully Convolutional Network, FCN) model, so as to obtain the position information of the foreground in the image to be processed. Compared with a CNN (Convolutional Neural Network) model, an FCN model has no fully connected layers, so it can obtain the category information of each pixel point in an input image and is therefore often used in the field of image segmentation.
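A possible sketch of FCN-based foreground detection is shown below. The specification does not name a concrete network, so torchvision's pretrained fcn_resnet50 is used here only as a stand-in, with the VOC "person" class treated as the foreground; all names and the class index are assumptions for illustration.

```python
import torch
from torchvision import transforms
from torchvision.models.segmentation import fcn_resnet50

model = fcn_resnet50(pretrained=True).eval()
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def detect_foreground_fcn(pil_image, foreground_class=15):   # 15 = "person" in VOC
    """Return a binary mask (255 = foreground) predicted by the FCN."""
    x = preprocess(pil_image).unsqueeze(0)        # (1, 3, H, W)
    with torch.no_grad():
        logits = model(x)["out"]                  # (1, num_classes, H, W) per-pixel scores
    pred = logits.argmax(dim=1)[0]                # per-pixel class labels, (H, W)
    return (pred == foreground_class).byte().numpy() * 255
```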
Illustratively, the training process of the above-mentioned trained FCN model may be as shown in fig. 3, and includes steps S301-S304:
in step S301, each sample image and foreground position information corresponding to each sample image are obtained in advance;
in this step, each sample image may be selected from a sample database, and the foreground position information corresponding to each sample image may be obtained. For example, as shown in fig. 4, an initial FCN model 401 is trained using sample image 1, sample image 2 and sample image 3, where foreground position information A corresponds to sample image 1, foreground position information B corresponds to sample image 2, and foreground position information C corresponds to sample image 3. Each piece of foreground position information is composed of the position information of each pixel point in the foreground region of the corresponding sample image; as shown in fig. 4, foreground position information A is composed of the position information of each pixel point in the foreground region X (shaded portion) of sample image 1. In addition, in order to enable the trained FCN model to identify the foreground position in an input image more accurately, as many sample images as possible may be acquired in this step S301.
In step S302, each sample image is input into an initial FCN model, so that the initial FCN model detects position information of a foreground in each sample image;
in the embodiment of the present application, an initial FCN model for detecting a foreground position in an input image is first established, and then each sample image obtained in step S301 is input into the initial FCN model, so that the initial FCN model outputs position information of a foreground in each sample image. As shown in fig. 4, the initial FCN model 401 outputs the position information A1 of the foreground in the sample image 1, the position information B1 of the foreground in the sample image 2, and the position information C1 of the foreground in the sample image 3.
In step S303, determining the detection accuracy of the initial FCN model according to the foreground position information corresponding to each sample image obtained in advance and the position information of the foreground in each sample image detected by the initial FCN model;
in this embodiment of the present application, according to the position information of the foreground output by the initial FCN model for a certain sample image and the foreground position information of that sample image acquired in advance in step S301, it is judged whether the initial FCN model accurately detects the foreground position of that sample image. All sample images are traversed in this way, the proportion of sample images whose foreground position is detected accurately among all sample images is calculated, and this proportion is determined as the detection accuracy of the initial FCN model.
As shown in fig. 4, whether the initial FCN model 401 accurately detects the foreground position in sample image 1 may be determined according to the foreground position information A1 output by the initial FCN model 401 for sample image 1 and the foreground position information A acquired in step S301. A specific method for judging whether the foreground position in sample image 1 is detected accurately may be: determining the number of pixel points in the intersection of foreground position information A1 and foreground position information A and the number of pixel points in their union; if the ratio of the number of pixel points in the intersection of A1 and A to the number of pixel points in the union of A1 and A is greater than a preset ratio, the FCN model 401 is considered to have accurately detected the foreground position in sample image 1. Alternatively, the judgment may be made as follows: according to foreground position information A and foreground position information A1, calculating the intersection-over-union of the area of the image region determined by foreground position information A in sample image 1 and the area of the image region determined by foreground position information A1 in sample image 1; if this area intersection-over-union is greater than a preset value, the FCN model 401 is considered to have accurately detected the foreground position in sample image 1. The method for judging whether the FCN model accurately detects the foreground position in a sample image is not limited here. Finally, the remaining sample images, namely sample image 2 and sample image 3, are traversed, and the proportion of sample images whose foreground position is accurately detected by the initial FCN model 401 among all sample images is calculated, thereby obtaining the detection accuracy of the initial FCN model 401.
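A minimal sketch of this accuracy calculation, assuming the predicted and ground-truth foreground positions are available as binary masks and using an assumed intersection-over-union threshold of 0.5 (the specification only speaks of a "preset ratio").

```python
import numpy as np

def is_detection_correct(pred_mask, gt_mask, iou_threshold=0.5):
    """True if the predicted and ground-truth foreground masks overlap enough."""
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return union > 0 and intersection / union > iou_threshold

def detection_accuracy(pred_masks, gt_masks, iou_threshold=0.5):
    """Proportion of sample images whose foreground position is detected accurately."""
    correct = sum(is_detection_correct(p, g, iou_threshold)
                  for p, g in zip(pred_masks, gt_masks))
    return correct / len(pred_masks)
```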
In step S304, continuously adjusting each parameter of the current FCN model, and continuously detecting the foreground position in each sample image through the FCN model after parameter adjustment, until the detection accuracy of the FCN model after parameter adjustment is greater than the preset accuracy, using the current FCN model as the FCN model after training;
in general, the initial FCN model cannot accurately detect the foreground position in the sample images, so the parameters of the initial FCN model need to be adjusted. Each sample image acquired in step S301 is then input into the parameter-adjusted FCN model again so that it continues to detect the foreground position in each sample image, and the detection accuracy of the parameter-adjusted FCN model is obtained again. The parameters of the current FCN model are continuously adjusted in this way until the detection accuracy of the current FCN model is greater than the preset accuracy, and the current FCN model is then used as the trained FCN model. Common methods for adjusting the parameters include the stochastic gradient descent algorithm (Stochastic Gradient Descent, SGD), momentum updates (Momentum update), and the like; the method used for adjusting the parameters is not limited here.
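A hedged sketch of such a training loop, using PyTorch SGD with momentum as one possible parameter-adjustment method; the data loader, the loss function, the target accuracy and the evaluate_detection_accuracy() helper (which could be built from the detection_accuracy sketch above) are assumptions, not details given in the specification.

```python
import torch
import torch.nn as nn

def train_fcn(model, loader, evaluate_detection_accuracy,
              target_accuracy=0.95, lr=1e-3, max_epochs=100):
    """Adjust model parameters until the detection accuracy exceeds the target."""
    criterion = nn.CrossEntropyLoss()                  # per-pixel classification loss
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for _ in range(max_epochs):
        model.train()
        for images, gt_masks in loader:                # gt_masks: per-pixel class labels
            optimizer.zero_grad()
            logits = model(images)["out"]              # (N, num_classes, H, W)
            loss = criterion(logits, gt_masks)
            loss.backward()
            optimizer.step()                           # parameter adjustment (SGD + momentum)
        if evaluate_detection_accuracy(model, loader) > target_accuracy:
            break                                      # accurate enough: stop training
    return model
```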
In step S103, performing an expansion operation on an image area indicated by the position information of the foreground in the image to be processed, to obtain corrected position information;
in the embodiment of the present application, the specific implementation procedure of the step S103 may be as shown in fig. 5, including steps S501 to S503:
in step S501, setting a pixel value of each pixel point in the image area indicated by the position information of the foreground in the image to be processed as a first pixel value, and setting the rest pixel values in the image to be processed as a second pixel value, so as to obtain a binarized image;
as shown in fig. 6, assume that the image 601 is the image to be processed acquired in step S101, and the shaded area in the image 601 is the image area indicated by the position information of the foreground detected in step S102, that is, the foreground area of the image 601. In step S501, the pixel value of each pixel point in the foreground area of the image 601 may be set to a uniform first pixel value, for example 255, and the pixel values of the remaining pixel points of the image 601 may be set to a uniform second pixel value, for example 0, so as to obtain a binarized image 602.
In step S502, performing an expansion operation on an image area having a pixel value of the first pixel value in the binarized image to obtain a corrected binarized image;
Specifically, the specific implementation procedure of this step S502 may be: firstly, defining a first traversing frame, wherein the first traversing frame comprises one or more reference points, can be a rectangular frame, a round frame and the like, and has a size smaller than the image size of the binary image; and secondly, traversing all the pixel points of the binarized image by using the first traversing frame, if the image area covered by the first traversing frame contains the pixel points with the first pixel values, modifying the pixel values at the reference points in the first traversing frame into the first pixel values, and if the pixel values of all the pixel points in the image area covered by the first traversing frame are all the second pixel values, not modifying the pixel values of all the pixel points in the first traversing frame.
For a clearer description of this step, referring to fig. 6, a 3×3 rectangular frame 603 (i.e. the first traversing frame) is defined as shown in fig. 6, where the element in the second row and second column of the rectangular frame 603 is the reference point of the rectangular frame. First, the rectangular frame 603 is placed over the upper left corner area of the binarized image 602; the pixel values of the pixel points it covers there are all the second pixel value, so the pixel values of the pixel points covered by the rectangular frame 603 are not modified. Next, the rectangular frame is moved to the right by one pixel point, and it is again judged whether the image area covered by the rectangular frame 603 contains a pixel point with the first pixel value; if yes, the pixel value at the reference point of the rectangular frame 603 is modified to the first pixel value, and if no, the pixel values of the pixel points in the rectangular frame 603 are not modified. The frame continues to move to the right (or downward) one pixel point at a time until the whole binarized image 602 has been traversed, so as to obtain the corrected binarized image. Those skilled in the art will readily appreciate that the image area of the corrected binarized image having the first pixel value is larger than the image area of the original binarized image having the first pixel value.
In step S503, position information of each pixel point having the pixel value of the corrected binarized image that is the first pixel value is determined as corrected position information.
After the corrected binarized image is obtained, the corrected position information can be determined. As those skilled in the art will readily understand, in the image to be processed, the image area indicated by the corrected position information covers the foreground area determined in step S102.
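A minimal sketch of steps S501 to S503, assuming OpenCV is available; a 3×3 structuring element plays the role of the first traversing frame with its centre as the reference point, and cv2.dilate performs the traversal described above. The function and parameter names are illustrative.

```python
import cv2
import numpy as np

def expand_foreground(image_shape, foreground_positions):
    h, w = image_shape[:2]
    # S501: binarize - foreground pixels get the first pixel value (255),
    # all remaining pixels get the second pixel value (0).
    binarized = np.zeros((h, w), np.uint8)
    for x, y in foreground_positions:
        binarized[y, x] = 255

    # S502: expansion operation on the 255-valued area with a 3x3 frame
    # whose centre acts as the reference point.
    kernel = np.ones((3, 3), np.uint8)
    corrected = cv2.dilate(binarized, kernel)

    # S503: positions of the pixels whose value is 255 in the corrected image.
    ys, xs = np.nonzero(corrected == 255)
    corrected_positions = list(zip(xs.tolist(), ys.tolist()))
    return corrected, corrected_positions
```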
In step S104, blurring processing is performed on the region of the image to be processed except the image region indicated by the correction position information, so as to obtain a processed image;
in the embodiment of the present application, after the corrected position information is obtained, blurring processing is performed on the image area of the image to be processed acquired in step S101 other than the image area indicated by the corrected position information. In this embodiment, step S104 may further apply different degrees of blurring according to the distance between different image areas in the image to be processed and the foreground area (this distance is not a real-world distance, but the straight-line length, within the image, between the image area and the foreground area); for example, an area whose straight-line distance to the foreground area is short may be blurred to a smaller degree, and an area whose straight-line distance to the foreground area is long may be blurred to a larger degree. As shown in fig. 6, any point L is selected from the foreground area, any point M is selected from image area U, and any point N is selected from image area V; the straight-line distance a between point M and point L is calculated and used as the distance between image area U and the foreground area, and the straight-line distance b between point N and point L is calculated and used as the distance between image area V and the foreground area. If a = 5 cm and b = 3 cm, image area U may be blurred to a higher degree and image area V to a lower degree, so that background areas closer to the foreground area in the blurred image remain sharper and the blurred image looks more natural.
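A hedged sketch of such distance-graded blurring, assuming OpenCV is available. The distance of each background pixel to the nearest foreground pixel is computed with a distance transform, and the mild and strong blur kernel sizes (9 and 31) are illustrative choices rather than values from the specification.

```python
import cv2
import numpy as np

def graded_background_blur(image, corrected_mask, strong_ksize=31, mild_ksize=9):
    """Blur the background; pixels farther from the foreground are blurred more."""
    # Distance of every background pixel to the nearest foreground (255) pixel.
    dist = cv2.distanceTransform(255 - corrected_mask, cv2.DIST_L2, 5)
    alpha = (dist / (dist.max() + 1e-6))[..., None]   # 0 near foreground, 1 far away

    mild = cv2.GaussianBlur(image, (mild_ksize, mild_ksize), 0)
    strong = cv2.GaussianBlur(image, (strong_ksize, strong_ksize), 0)

    # Blend: close to the foreground keep a mild blur, far away a strong blur,
    # and keep the original pixels inside the corrected foreground region.
    blended = (1 - alpha) * mild + alpha * strong
    result = np.where(corrected_mask[..., None] == 255, image, blended)
    return result.astype(np.uint8)
```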
In the first embodiment of the present application, when the image to be processed is blurred, only the area other than the image area indicated by the corrected position information is blurred. Because the corrected position information is obtained by performing an expansion operation on the image area where the foreground is located, the image area indicated by the corrected position information covers the image area where the foreground is located. Consequently, the phenomenon in which the image area on one side of the foreground contour is sharp while the image area on the other side is blurred does not occur in the blurred image, so the junction between the foreground area and the background area in the processed image is no longer abrupt, and the technical problem that images blurred by the conventional background blurring method look unnatural is alleviated to a certain extent.
Example two
In a second embodiment of the present application, referring to fig. 7, another image processing method is provided, including:
in step S701, an image to be processed is acquired;
in step S702, detecting a foreground in the image to be processed, to obtain position information of the foreground;
in the second embodiment of the present application, steps S701 to S702 are the same as steps S101 to S102 in the first embodiment; refer to the description in the first embodiment, and details are not repeated here.
In step S703, traversing all pixels of the image to be processed by using a predefined second traversing frame, and determining whether the image area covered by the second traversing frame includes both a foreground area and a background area according to the position information of the foreground in the image to be processed, where the size of the second traversing frame is smaller than the image size of the image to be processed;
in step S704, if yes, determining all the image areas selected in the second traversal frame as correction areas;
in step S705, a union of the position information of the pixel points in each correction area and the position information of the foreground determined in accordance with step S702 is taken, and all the position information in the union is determined as correction position information;
in the first embodiment of the present application, a specific method for determining corrected position information is provided (see fig. 5), and the steps S703 to S705 are another method for determining corrected position information provided in the second embodiment of the present application.
In order to describe more clearly the second method for determining corrected position information provided in the second embodiment of the present application, reference is made to fig. 8. As shown in fig. 8, a 3×3 rectangular frame 802 is defined, where the rectangular frame 802 is the second traversing frame (the second traversing frame may also be a circular frame or another shape). First, the rectangular frame 802 is placed over the upper left corner area of the image 801 to be processed; all image areas it covers there are background areas, so in this case the rectangular frame 802 simply continues to move. The rectangular frame 802 is moved to the right by one pixel point, and it is again judged whether the image area covered by the rectangular frame 802 contains both a foreground area and a background area; if yes, the image area currently covered by the rectangular frame 802 is determined as a correction area; if no, the rectangular frame 802 continues to move to the right (or downward) by one pixel point, until the whole image 801 to be processed has been traversed and each correction area has been obtained. Then the union of each correction area and the foreground area determined in step S702 is taken, and the position information of all pixel points in the union is determined as the corrected position information.
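A minimal sketch of this window-based correction, with the foreground position information supplied as a boolean mask for simplicity; the window size of 3 mirrors the example above, and the function name is illustrative.

```python
import numpy as np

def corrected_positions_by_window(foreground_mask, win=3):
    """Union of the foreground with every window that covers both foreground and background."""
    h, w = foreground_mask.shape
    fg = foreground_mask.astype(bool)
    corrected = fg.copy()
    for y in range(h - win + 1):
        for x in range(w - win + 1):
            patch = fg[y:y + win, x:x + win]
            if patch.any() and not patch.all():      # covers both foreground and background
                corrected[y:y + win, x:x + win] = True
    ys, xs = np.nonzero(corrected)
    return list(zip(xs.tolist(), ys.tolist()))
```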
In step S706, blurring processing is performed on the region of the image to be processed except the image region indicated by the correction position information, so as to obtain a processed image;
in the second embodiment of the present application, the step S706 is the same as the step S104 in the first embodiment, and the description of the first embodiment can be referred to, and the details are not repeated here.
Compared with the first embodiment, this embodiment of the present application provides another method for determining the corrected position information. In general, before an expansion operation is performed on a certain region of an image, the image is first binarized and the binarized image is temporarily stored, so that subsequent operations are performed on the binarized image. The expansion method provided in the second embodiment of the present application, however, avoids binarizing the image, so no additional binarized image needs to be temporarily stored when performing the expansion operation; storage space can therefore be saved to a certain extent when executing the technical solution of the second embodiment. In addition, as in the first embodiment, the junction between the foreground region and the background region in the blurred image is no longer abrupt, so the technical problem that images blurred by the conventional background blurring method look unnatural is alleviated to a certain extent.
It should be understood that the sequence numbers of the steps in the first and second method embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Example III
An image processing apparatus according to a third embodiment of the present application is provided, and for convenience of explanation, only a portion related to the present application is shown, as shown in fig. 9, an image processing apparatus 900 includes:
an image acquisition module 901, configured to acquire an image to be processed;
the foreground detection module 902 is configured to detect a foreground in the image to be processed, and obtain location information of the foreground;
the expansion operation module 903 is configured to perform expansion operation on an image area indicated by the position information in the image to be processed, so as to obtain corrected position information;
and a blurring processing module 904, configured to perform blurring processing on an area of the image to be processed except for the image area indicated by the correction position information, so as to obtain a processed image.
Optionally, the foreground detection module 902 is specifically configured to:
when detecting that a user inputs an instruction for indicating blurring processing of the image to be processed, detecting a foreground in the image to be processed to obtain position information of the foreground.
Optionally, the expansion operation module 903 includes:
a binarization unit, configured to set a pixel value of each pixel point in the image area indicated by the position information in the image to be processed as a first pixel value, and set a pixel value of each pixel point in the image to be processed except for the image area indicated by the position information as a second pixel value, so as to obtain a binarized image;
an expansion unit, configured to perform expansion operation on an image area with a pixel value being the first pixel value in the binarized image, so as to obtain a corrected binarized image;
and a correction position determining unit configured to determine, as correction position information, position information of each pixel point having the pixel value of the correction binarized image as the first pixel value.
Optionally, the image to be processed is an image acquired by the terminal equipment;
accordingly, the foreground detection module 902 includes:
a depth acquisition unit, configured to acquire depth information of the image to be processed;
a foreground determining unit, configured to determine, according to the depth information, each pixel point having a distance from the terminal device smaller than a preset distance;
and a foreground position determining unit configured to determine, as the position information of the foreground, position information of each pixel point having a distance from the terminal device smaller than the preset distance.
Optionally, the foreground detection module 902 is specifically configured to:
and detecting the foreground in the image to be processed by using the trained full convolution neural network model to obtain the position information of the foreground.
Optionally, the full convolutional neural network model is trained by a training module, and the training module includes:
the sample acquisition unit is used for acquiring each sample image and foreground position information corresponding to each sample image in advance;
the position detection unit is used for inputting each sample image into an initial full convolution neural network model so that the initial full convolution neural network model detects the position information of the foreground in each sample image;
the accuracy rate determining unit is used for determining the detection accuracy rate of the initial full-convolution neural network model according to the foreground position information corresponding to each sample image acquired in advance and the position information of the foreground in each sample image detected by the initial full-convolution neural network model;
and the parameter adjusting unit is used for continuously adjusting each parameter of the current full convolution neural network model until the detection accuracy of the full convolution neural network model after parameter adjustment is greater than the preset accuracy.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
Example IV
Fig. 10 is a schematic diagram of a terminal device provided in a fourth embodiment of the present application. As shown in fig. 10, the terminal device 10 of this embodiment includes: a processor 100, a memory 101, and a computer program 102 stored in the memory 101 and executable on the processor 100. The steps of the various method embodiments described above, such as steps S101 to S104 shown in fig. 1, are implemented when the processor 100 executes the computer program 102. Alternatively, the processor 100 may implement the functions of the modules/units in the above-described apparatus embodiments when executing the computer program 102, for example, the functions of the modules 901 to 904 shown in fig. 9.
Illustratively, the computer program 102 may be partitioned into one or more modules/units that are stored in the memory 101 and executed by the processor 100 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program 102 in the terminal device 10. For example, the computer program 102 may be divided into an image acquisition module, a foreground detection module, an expansion operation module, and a blurring processing module, where each module specifically functions as follows:
Acquiring an image to be processed;
detecting the foreground in the image to be processed to obtain the position information of the foreground;
performing expansion operation on an image area indicated by the position information in the image to be processed to obtain corrected position information;
and blurring the region except the image region indicated by the correction position information in the image to be processed to obtain a processed image.
The terminal device may include, but is not limited to, a processor 100, a memory 101. It will be appreciated by those skilled in the art that fig. 10 is merely an example of the terminal device 10 and is not intended to limit the terminal device 10, and may include more or fewer components than shown, or may combine certain components, or different components, such as the terminal device described above may also include input-output devices, network access devices, buses, etc.
The processor 100 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 101 may be an internal storage unit of the terminal device 10, for example, a hard disk or a memory of the terminal device 10. The memory 101 may be an external storage device of the terminal device 10, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided in the terminal device 10. Further, the memory 101 may also include both an internal storage unit and an external storage device of the terminal device 10. The memory 101 is used for storing the computer program and other programs and data required for the terminal device. The above-described memory 101 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the description of each embodiment has its own emphasis. For parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units described above is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the above computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each method embodiment described above. The computer program comprises computer program code, and the computer program code can be in a source code form, an object code form, an executable file or some intermediate form and the like. The computer readable medium may include: any entity or device capable of carrying the computer program code described above, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium described above can be appropriately increased or decreased according to the requirements of the jurisdiction's legislation and the patent practice, for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals according to the legislation and the patent practice.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. An image processing method, comprising:
acquiring an image to be processed;
detecting a foreground in the image to be processed to obtain position information of the foreground;
performing expansion operation on an image area indicated by the position information in the image to be processed to obtain corrected position information;
blurring processing is carried out on the areas except for the image area indicated by the correction position information in the image to be processed, so that a processed image is obtained;
performing expansion operation on the image area indicated by the position information in the image to be processed to obtain corrected position information, including:
Setting a pixel value of each pixel point in an image area indicated by the position information in the image to be processed as a first pixel value, and setting a pixel value of each pixel point except the image area indicated by the position information in the image to be processed as a second pixel value, so as to obtain a binarized image;
performing expansion operation on an image area with the pixel value of the binary image being the first pixel value to obtain a corrected binary image;
determining position information of each pixel point with a pixel value of the first pixel value in the corrected binarized image as corrected position information;
when performing expansion operation on the binary image, traversing all pixel points of the binary image through a first traversing frame, and if the image area covered by the first traversing frame contains pixel points with first pixel values, modifying the pixel values at the reference points in the first traversing frame into first pixel values; the first traversal frame includes one or more reference points, and a size of the first traversal frame is smaller than an image size of the binarized image.
2. The image processing method according to claim 1, wherein the detecting the foreground in the image to be processed to obtain the position information of the foreground includes:
When detecting that a user inputs an instruction for indicating blurring processing of the image to be processed, detecting a foreground in the image to be processed, and obtaining position information of the foreground.
3. The image processing method according to claim 1 or 2, wherein the image to be processed is an image acquired by a terminal device;
correspondingly, the detecting the foreground in the image to be processed to obtain the position information of the foreground includes:
acquiring depth information of the image to be processed;
according to the depth information, determining each pixel point with the distance from the terminal equipment smaller than a preset distance;
and determining the position information of each pixel point with the distance smaller than the preset distance from the terminal equipment as the position information of the foreground.
4. The image processing method according to claim 1 or 2, wherein the detecting the foreground in the image to be processed to obtain the position information of the foreground includes:
and detecting the foreground in the image to be processed by using the trained full convolution neural network model to obtain the position information of the foreground.
5. The image processing method according to claim 4, wherein the training process of the fully convolutional neural network model comprises:
acquiring, in advance, foreground position information corresponding to each sample image;
inputting each sample image into an initial fully convolutional neural network model, so that the initial fully convolutional neural network model detects the position information of the foreground in each sample image;
determining a detection accuracy of the initial fully convolutional neural network model according to the foreground position information obtained in advance for each sample image and the position information of the foreground detected by the initial fully convolutional neural network model in each sample image;
and iteratively adjusting the parameters of the current fully convolutional neural network model and detecting the position of the foreground in each sample image with the parameter-adjusted model, until the detection accuracy of the parameter-adjusted model is greater than a preset accuracy, and determining the current fully convolutional neural network model as the trained fully convolutional neural network model.
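A PyTorch-style sketch of the training loop of claim 5; the optimizer, loss function, and pixel-accuracy metric are assumptions, since the claim specifies only that parameters are adjusted until the detection accuracy exceeds a preset accuracy.

```python
# Illustrative sketch only: one reading of claim 5's training procedure.
# The model, data loader, and metric are placeholders; the patent does not
# specify a framework, loss function, or accuracy definition.
import torch
import torch.nn as nn

def train_fcn(model, loader, preset_accuracy=0.95, lr=1e-3, max_epochs=100):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.BCEWithLogitsLoss()              # per-pixel foreground/background loss
    for epoch in range(max_epochs):
        correct, total = 0, 0
        for images, masks in loader:                # masks: pre-acquired foreground positions (float 0/1)
            optimizer.zero_grad()
            logits = model(images)
            loss = criterion(logits, masks)
            loss.backward()
            optimizer.step()                        # adjust the model parameters
            preds = torch.sigmoid(logits) > 0.5
            correct += (preds == masks.bool()).sum().item()
            total += masks.numel()
        accuracy = correct / total                  # detection accuracy over the sample images
        if accuracy > preset_accuracy:              # stop once it exceeds the preset accuracy
            break
    return model
```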
6. An image processing apparatus, comprising:
the image acquisition module is used for acquiring an image to be processed;
the foreground detection module is used for detecting the foreground in the image to be processed and obtaining the position information of the foreground;
the expansion operation module is used for performing an expansion operation on the image area indicated by the position information in the image to be processed to obtain corrected position information;
the blurring processing module is used for blurring the areas of the image to be processed other than the image area indicated by the corrected position information to obtain a processed image;
the expansion operation module comprises:
a binarization unit, configured to set a pixel value of each pixel point in the image area indicated by the position information in the image to be processed as a first pixel value, and set a pixel value of each pixel point in the image to be processed except for the image area indicated by the position information as a second pixel value, so as to obtain a binarized image;
an expansion unit, configured to perform expansion operation on an image area with a pixel value being the first pixel value in the binarized image, so as to obtain a corrected binarized image;
a corrected position determining unit, configured to determine, as the corrected position information, position information of each pixel point in the corrected binarized image whose pixel value is the first pixel value;
wherein, when the expansion operation is performed on the binarized image, all pixel points of the binarized image are traversed with a first traversal frame, and if the image area covered by the first traversal frame contains a pixel point with the first pixel value, the pixel value at each reference point in the first traversal frame is modified to the first pixel value; the first traversal frame includes one or more reference points, and a size of the first traversal frame is smaller than an image size of the binarized image.
7. The image processing apparatus according to claim 6, wherein the foreground detection module is specifically configured to:
when it is detected that a user inputs an instruction for blurring the image to be processed, detect the foreground in the image to be processed to obtain the position information of the foreground.
8. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when the computer program is executed.
9. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 5.
CN201811013836.XA 2018-08-31 2018-08-31 Image processing method, image processing device and terminal equipment Active CN110874814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811013836.XA CN110874814B (en) 2018-08-31 2018-08-31 Image processing method, image processing device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811013836.XA CN110874814B (en) 2018-08-31 2018-08-31 Image processing method, image processing device and terminal equipment

Publications (2)

Publication Number Publication Date
CN110874814A CN110874814A (en) 2020-03-10
CN110874814B true CN110874814B (en) 2023-07-28

Family

ID=69715274

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811013836.XA Active CN110874814B (en) 2018-08-31 2018-08-31 Image processing method, image processing device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110874814B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862852A (en) * 2021-02-24 2021-05-28 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN113808154A (en) * 2021-08-02 2021-12-17 惠州Tcl移动通信有限公司 Video image processing method and device, terminal equipment and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102087707A (en) * 2009-12-03 2011-06-08 索尼株式会社 Image processing equipment and image processing method
CN107038681A (en) * 2017-05-31 2017-08-11 广东欧珀移动通信有限公司 Image weakening method, device, computer-readable recording medium and computer equipment

Also Published As

Publication number Publication date
CN110874814A (en) 2020-03-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant