CN107704798B - Image blurring method and device, computer readable storage medium and computer device


Info

Publication number
CN107704798B
CN107704798B CN201710676169.2A CN201710676169A
Authority
CN
China
Prior art keywords
blurring
image
face
area
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710676169.2A
Other languages
Chinese (zh)
Other versions
CN107704798A (en)
Inventor
丁佳铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710676169.2A priority Critical patent/CN107704798B/en
Publication of CN107704798A publication Critical patent/CN107704798A/en
Priority to PCT/CN2018/099403 priority patent/WO2019029573A1/en
Application granted granted Critical
Publication of CN107704798B publication Critical patent/CN107704798B/en

Classifications

    • G06T5/77
    • G06T5/90
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Abstract

The invention relates to an image blurring method, an image blurring device, a computer readable storage medium and computer equipment. The method comprises the following steps: acquiring an image to be processed; detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region; and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength. The image blurring method, the image blurring device, the computer-readable storage medium and the computer device can improve the accuracy of image processing.

Description

Image blurring method and device, computer readable storage medium and computer device
Technical Field
The present invention relates to the field of computer technologies, and in particular, to an image blurring method and apparatus, a computer-readable storage medium, and a computer device.
Background
Nowadays, photography occupies an ever larger place in people's lives. In particular, with the development of intelligent terminals, once these terminals gained photographing functions, photographing applications became far more widespread. At the same time, whether in personal life or commercial use, the requirements on photographing quality and user experience keep rising.
However, photographed scenes are often complicated and changeable. To adapt the captured picture to such scenes and to highlight the photographed subject and convey a sense of layering, a common processing method is to keep the subject sharp and blur the region outside it. Blurring processing blurs the region other than the subject so that the subject stands out. The traditional blurring method recognizes the subject in an image and then directly blurs the region outside the subject to a fixed degree, so that the background and the subject are displayed distinctly.
Disclosure of Invention
The embodiment of the application provides an image blurring method and device, a computer readable storage medium and computer equipment, which can improve the accuracy of image processing.
A method of image blurring, the method comprising:
acquiring an image to be processed;
detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region;
and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength.
An image blurring device, the device comprising:
the image acquisition module is used for acquiring an image to be processed;
the information acquisition module is used for detecting a face area in the image to be processed and acquiring physical distance information corresponding to the face area;
and the background blurring module is used for acquiring background blurring strength according to the physical distance information and blurring a background area in the image to be processed according to the background blurring strength.
One or more non-transitory computer-readable storage media embodying computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of:
acquiring an image to be processed;
detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region;
and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength.
A computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
acquiring an image to be processed;
detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region;
and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength.
According to the image blurring method and device, the computer-readable storage medium and the computer device, the face region in the image to be processed is detected, the blurring strength of the background region is obtained according to the physical distance information of the face region, and the background region is then blurred according to that strength. The physical distance information reflects the distance between the face and the lens, and different distances yield different blurring strengths, so the blurring processing is more accurate.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram showing an internal structure of an electronic apparatus according to an embodiment;
FIG. 2 is a diagram illustrating an internal architecture of a server according to an embodiment;
FIG. 3 is a flow diagram of a method for image blurring in one embodiment;
FIG. 4 is a flow chart of an image blurring method in another embodiment;
FIG. 5 is a schematic diagram of obtaining physical distance information in one embodiment;
FIG. 6 is a schematic diagram illustrating an exemplary embodiment of an image blurring apparatus;
FIG. 7 is a schematic diagram of an image blurring apparatus according to another embodiment;
FIG. 8 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present invention. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in fig. 1, the electronic device includes a processor, a non-volatile storage medium, an internal memory, a network interface, a display screen and an input device, which are connected by a system bus. The non-volatile storage medium of the electronic device stores an operating system and computer readable instructions. The computer readable instructions, when executed by the processor, implement an image blurring method. The processor provides computing and control capability and supports the operation of the whole electronic device. The internal memory in the electronic device provides an environment for executing the computer-readable instructions in the non-volatile storage medium. The network interface is used for network communication with the server, such as sending an image blurring request to the server and receiving a blurred image returned by the server. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a trackball or a touch pad arranged on the housing of the electronic device, or an external keyboard, touch pad or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. Those skilled in the art will appreciate that the architecture shown in fig. 1 is a block diagram of only part of the architecture associated with the present application and does not limit the electronic devices to which the present application may be applied; a particular electronic device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Fig. 2 is a schematic diagram of the internal structure of the server in one embodiment. As shown in fig. 2, the server includes a processor, a non-volatile storage medium, an internal memory and a network interface connected through a system bus. The non-volatile storage medium of the server stores an operating system and computer readable instructions. The computer readable instructions, when executed by the processor, implement an image blurring method. The processor of the server provides computing and control capability and supports the operation of the whole server. The network interface of the server is used for communicating with an external terminal through a network connection, such as receiving an image blurring request sent by the terminal and returning a blurred image to the terminal. The server may be implemented as a stand-alone server or as a server cluster consisting of a plurality of servers. Those skilled in the art will appreciate that the architecture shown in fig. 2 is a block diagram of only part of the architecture associated with the present application and does not limit the servers to which the present application applies; a particular server may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
FIG. 3 is a flow diagram of a method for image blurring in one embodiment. As shown in fig. 3, the image blurring method includes steps 302 to 306, wherein:
step 302, acquiring an image to be processed.
In the embodiment provided by the application, the image to be processed refers to an image which needs to be subjected to blurring processing, and can be acquired through an image acquisition device. The image capturing device refers to a device for capturing an image, and for example, the image capturing device may be a camera, a camera on a mobile terminal, a video camera, or the like. After receiving the image blurring instruction, the user terminal may directly perform blurring processing on the image to be processed at the user terminal, or may initiate an image blurring request to the server to perform blurring processing on the image to be processed at the server.
It is understood that the image blurring instruction may be input by the user or automatically triggered by the user terminal. For example, a user inputs a photographing instruction through a user terminal, and after detecting the photographing instruction, the mobile terminal acquires an image to be processed through a camera. And then automatically triggering to generate an image blurring instruction, and performing blurring processing on the image to be processed. The photographing instruction may be triggered by a physical key or a touch screen operation of the mobile terminal, or may be a voice instruction or the like.
And 304, detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region.
In one embodiment, the face region refers to a region where a face is located in the image to be processed, and the physical distance information refers to a related parameter indicating a physical distance from the image acquisition device to an object corresponding to each pixel point in the image to be processed. The physical distance information corresponding to the face region refers to related parameters of the physical distance between the image acquisition device and the face.
Specifically, the feature points in the image to be processed may be identified first, and then the feature points are extracted and matched with the preset face model, and if the extracted feature points are matched with the preset face model, the region where the extracted feature points are located is used as the face region.
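The feature-point matching above is not tied to a specific detector. As a rough illustration, the sketch below uses OpenCV's bundled Haar cascade as a stand-in for the "preset face model"; this choice of detector, and the parameter values, are assumptions of the sketch, not prescribed by the patent.

```python
# Illustrative only: OpenCV's stock Haar cascade stands in for the
# "preset face model"; the patent does not name a concrete detector.
import cv2

def detect_face_regions(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Each detection is an (x, y, w, h) rectangle in pixel coordinates.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```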
In the embodiment provided by the application, the image to be processed is composed of a plurality of pixels, each pixel has corresponding physical distance information, and the physical distance information represents the corresponding physical distance from the object represented by the pixel to the image acquisition device.
It can be understood that a plurality of face regions may exist in the image to be processed, and after the face regions in the image to be processed are detected, the region areas corresponding to the face regions can be obtained; and acquiring the physical distance information corresponding to the face region with the largest region area. And acquiring background blurring strength according to the physical distance information corresponding to the face region with the largest region area. The area of the region refers to the area size corresponding to the face region, and the area of the region may be represented by the number of pixels included in the face region, or may be represented by the ratio of the size of the region occupied by the face region to the size of the image to be processed.
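A small sketch of the "largest face drives the strength" rule just described; the helper name and the optional area-ratio output are illustrative assumptions.

```python
# Pick the face rectangle with the largest pixel area; optionally also report
# its area as a fraction of the whole image, as the text suggests.
def largest_face(faces, image_shape=None):
    x, y, w, h = max(faces, key=lambda r: r[2] * r[3])
    ratio = None
    if image_shape is not None:
        ratio = (w * h) / float(image_shape[0] * image_shape[1])
    return (x, y, w, h), ratio
```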
Generally, when the physical distance of an object is acquired there is an effective distance range: the physical distance information of objects within this range can be acquired accurately, while objects beyond it cannot be measured accurately. The effective distance range depends on the hardware, and different hardware may correspond to different value ranges. Therefore, physical distance information within the effective distance range can be represented by an accurate value, and physical distance information beyond the effective distance range can be represented by a fixed value.
That is, only face regions within the effective distance range may be detected, and face regions beyond the effective distance range may not be obtained. Step 304 may include: detecting a face region within a preset distance range in the image to be processed, and acquiring physical distance information corresponding to the face region. The preset distance range may be, but is not limited to, the value range of effective physical distance information.
And step 306, acquiring background blurring strength according to the physical distance information, and blurring the background area in the image to be processed according to the background blurring strength.
In the embodiment provided by the present invention, the image is blurred according to a blurring strength, and different blurring strengths correspond to different degrees of blurring. The background region may refer to the region other than the face region or the portrait region in the image to be processed. The portrait region refers to the region where the whole portrait in the image to be processed is located.
The background blurring strength is a parameter indicating the degree to which the background region is blurred. The background blurring strength is acquired according to the physical distance information of the face region, and the background region is blurred according to it, so the resulting blur changes with the actual physical distance from the face to the image acquisition device. Generally, the larger the physical distance information, the smaller the background blurring strength and the lighter the blurring of the background region; the smaller the physical distance information, the greater the background blurring strength and the heavier the blurring of the background region.
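A minimal sketch of this inverse relation follows. The linear mapping, distance bounds and Gaussian-kernel sizes are assumptions (the patent only states that strength decreases as distance grows), and the background mask is assumed to be available from the later steps of the embodiment.

```python
import cv2
import numpy as np

def background_blur_strength(face_distance_m, near=0.3, far=3.0,
                             max_kernel=31, min_kernel=3):
    # Closer face -> larger kernel (stronger blur); farther face -> smaller kernel.
    t = float(np.clip((face_distance_m - near) / (far - near), 0.0, 1.0))
    k = int(round(max_kernel - t * (max_kernel - min_kernel)))
    return k if k % 2 == 1 else k + 1  # Gaussian kernel sizes must be odd

def blur_background(image_bgr, background_mask_u8, kernel_size):
    # Blur the whole frame, then copy blurred pixels back only where the
    # mask marks background.
    blurred = cv2.GaussianBlur(image_bgr, (kernel_size, kernel_size), 0)
    mask = background_mask_u8.astype(bool)
    out = image_bgr.copy()
    out[mask] = blurred[mask]
    return out
```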
The image blurring method comprises the steps of firstly detecting a face area in an image to be processed, obtaining blurring strength of a background area according to physical distance information of the face area, and blurring the background area according to the blurring strength. The physical distance information can reflect the distance between the face and the lens, the blurring strength obtained at different distances is different, and the blurring degree is changed along with the change of the physical distance information, so that the blurring effect can adapt to different shooting scenes, and the blurring process is more accurate.
FIG. 4 is a flowchart of an image blurring method in another embodiment. As shown in fig. 4, the image blurring method includes steps 402 to 410, wherein:
step 402, acquiring an image to be processed.
In the embodiments provided in the present application, the image to be processed may be directly obtained locally or on a server. Specifically, after receiving the image blurring instruction, the user terminal may directly obtain the corresponding image to be processed according to the image storage address and the image identifier included in the image blurring instruction. The image storage address may be local to the user terminal or may be on the server. After the image to be processed is acquired, the image to be processed may be subjected to blurring locally, or may be subjected to blurring on a server.
Step 404, detecting the face regions in the image to be processed, and acquiring the physical distance information corresponding to each face region in the image to be processed.
In the embodiment provided by the invention, the image acquisition device can be provided with two cameras, and the information of the physical distance between the image acquisition device and an object is measured through the two cameras. Specifically, images of an object are respectively shot through a first camera and a second camera; acquiring a first included angle and a second included angle according to the image, wherein the first included angle is an included angle between a horizontal line from the first camera to the object and a horizontal line from the first camera to the second camera, and the second included angle is an included angle between a horizontal line from the second camera to the object and a horizontal line from the second camera to the first camera; and acquiring physical distance information between the image acquisition device and the object according to the first included angle, the second included angle and the distance between the first camera and the second camera.
FIG. 5 is a schematic diagram of obtaining physical distance information in one embodiment. As shown in fig. 5, the first camera 502 and the second camera 504 respectively capture images of the object 506; a first included angle a1 and a second included angle a2 can be obtained from the images, and the physical distance D between the object 506 and the horizontal line connecting the first camera 502 and the second camera 504 can then be obtained from the first included angle a1, the second included angle a2 and the distance T between the first camera 502 and the second camera 504.
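A short sketch of this triangulation, assuming the object lies in front of the baseline and that a1 and a2 are measured against the camera-to-camera line as in FIG. 5; the guard against degenerate angles is an added assumption.

```python
import math

def object_distance(a1_deg, a2_deg, baseline_t):
    # D = T / (cot(a1) + cot(a2)): the perpendicular distance from the object
    # to the line joining the two cameras.
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    denom = 1.0 / math.tan(a1) + 1.0 / math.tan(a2)
    if denom <= 0:
        raise ValueError("angles do not locate a point in front of the cameras")
    return baseline_t / denom
```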
It can be understood that the same scene often contains a plurality of human faces, and therefore the image to be processed also contains a plurality of human face regions. Each face region in the image to be processed is extracted, and physical distance information corresponding to the face region is obtained. Generally, when an image is acquired for a certain scene, a depth map corresponding to the scene may be acquired at the same time. The acquired depth map is in one-to-one correspondence with the image, and the values in the depth map represent physical distance information of corresponding pixels in the image. That is to say, the corresponding depth map may be acquired while the image to be processed is acquired, and after the face region in the image to be processed is detected, the corresponding physical distance information may be acquired in the depth map according to the pixel coordinates in the face region.
In one embodiment, the face region includes a plurality of pixels, and each pixel has corresponding physical distance information. After the physical distance information corresponding to each pixel in the face region is obtained, the average of the physical distance information of all pixels in the face region may be used, or the physical distance information of a particular pixel may be used, to represent the physical distance information corresponding to the face region. For example, the physical distance information corresponding to the center pixel of the face region may be taken to represent that of the face region.
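Both aggregation options mentioned above are easy to state over a depth map aligned with the image; the function names and the rectangle convention are illustrative, not from the patent.

```python
import numpy as np

def face_distance_mean(depth_map, face_rect):
    # Average depth over all pixels of the face rectangle (x, y, w, h).
    x, y, w, h = face_rect
    return float(np.mean(depth_map[y:y + h, x:x + w]))

def face_distance_center(depth_map, face_rect):
    # Depth of the face rectangle's center pixel.
    x, y, w, h = face_rect
    return float(depth_map[y + h // 2, x + w // 2])
```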
And 406, acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength.
In one embodiment, the portrait is considered to be in the same vertical plane as the face, and the physical distance from the portrait to the image capture device is in the same range as the physical distance from the face to the image capture device. Therefore, after the physical distance information and the face region are acquired, the portrait region in the image to be processed can be acquired according to the physical distance corresponding to the face region, and then the background region can be determined in the image to be processed according to the portrait region.
Specifically, a face region in the image to be processed is detected, a portrait distance range is obtained according to physical distance information corresponding to the face region, a portrait region in the image to be processed can be obtained according to the portrait distance range, and then a background region is obtained according to the portrait region. The portrait distance range refers to a value range of physical distance information corresponding to a portrait area in an image to be processed. The physical distance from the image acquisition device to the face and the physical distance from the image acquisition device to the portrait can be regarded as equal, after the face area is detected, the physical distance information corresponding to the face area is obtained, the range of the physical distance information corresponding to the portrait area can be determined according to the physical distance information corresponding to the face area, the physical distance information in the range is regarded as the physical distance information corresponding to the portrait area, and the physical distance information outside the range is regarded as the physical distance information of the background area.
Further, step 406 may be preceded by: acquiring a portrait distance range according to the physical distance information corresponding to the face region, and acquiring an image region in the image to be processed according to the physical distance information in the portrait distance range; and acquiring color information of the image area, and acquiring a background area except the portrait area in the image to be processed according to the color information.
The image region extracted according to the portrait distance range is the region occupied by whatever lies within the same physical distance range as the face. If other objects stand beside the person, the extracted image region may therefore contain content other than the portrait. In that case, the portrait region can be further extracted according to the color information of the image region.
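A minimal sketch of the depth-based split, before any color refinement: pixels whose depth lies within a tolerance band around the face distance are kept as portrait candidates, and everything else is marked as background. The tolerance value is an assumption; the patent only speaks of a "portrait distance range".

```python
import numpy as np

def background_mask_from_depth(depth_map, face_distance, tolerance=0.5):
    # 255 marks background pixels, 0 marks portrait-candidate pixels.
    portrait_candidate = np.abs(depth_map - face_distance) <= tolerance
    return np.where(portrait_candidate, 0, 255).astype(np.uint8)
```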
In the embodiments provided by the present invention, the color information refers to parameters used to represent the colors of an image; for example, it may include the hue, saturation and brightness of the colors in the image. The hue of a color is an angular measure ranging from 0 to 360 degrees, counted counterclockwise starting from red: red is 0 degrees, green is 120 degrees, and blue is 240 degrees. Saturation indicates how close a color is to the pure spectral color; generally, the higher the saturation, the more vivid the color, and the lower the saturation, the duller the color. Brightness indicates how light or dark the color is.
Different objects often have different color characteristics, i.e., the color information they present in an image differs. For example, trees are green, the sky is blue, the ground is yellow, and so on. The portrait region and the background region outside it may therefore be separated according to the color information in the image region.
Specifically, color components of an image area are acquired, and an area, within a preset range, of the color components in the image area is extracted as a portrait area. The color component refers to an image component generated by converting an image to be processed into an image with a certain color dimension, for example, the color component may refer to an RGB color component, a CMY color component, an HSV color component, and the like of the image, and it is understood that the RGB color component, the CMY color component, and the HSV color component may be converted into each other.
In one embodiment, HSV color components of the image area are obtained, and the area, within the preset range, of the HSV color components in the image area is extracted to serve as a portrait area. The HSV color components respectively refer to hue (H), saturation (S) and lightness (V) components of an image, a preset range is set for the three components respectively, and areas of the three components in the image area within the preset range are extracted to be used as portrait areas.
For example, the portrait area is obtained through HSV color components, specifically, the HSV color components of the image area are obtained, and an area satisfying the conditions that the H value is 20-25, the S value is 10-50 and the V value is 50-85 in the image area is obtained and serves as the portrait area.
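An illustrative refinement of the candidate region using the thresholds quoted above (H 20-25, S 10-50, V 50-85). Note that OpenCV stores H in 0-179 and S and V in 0-255, so in practice the thresholds would need rescaling to that representation; they are kept verbatim here only to mirror the example.

```python
import cv2

def refine_portrait_mask(image_bgr, candidate_mask_u8):
    # candidate_mask_u8: 255 where the depth-based step marked a portrait
    # candidate (i.e., the inverse of the background mask).
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    in_range = cv2.inRange(hsv, (20, 10, 50), (25, 50, 85))
    return cv2.bitwise_and(in_range, candidate_mask_u8)
```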
In one embodiment, step 406 may comprise: acquiring the area corresponding to each face area, acquiring background blurring strength according to the physical distance information and the area, and blurring the background area in the image to be processed according to the background blurring strength.
If a plurality of face regions are acquired, each face region has corresponding physical distance information, and the background blurring strength is acquired according to the acquired physical distance information. Further, the area corresponding to each face area may be obtained first, and the background blurring strength may be obtained according to the area and the physical distance information. For example, after a plurality of face regions are acquired, the background blurring strength is acquired according to the physical distance information corresponding to the face region with the largest or smallest region area. Or acquiring physical distance information corresponding to each face region, and acquiring background blurring strength according to an average value of the physical distance information corresponding to each face region.
In one embodiment, the physical distance information and the background blurring strength have a corresponding relationship, and after the physical distance information is acquired, the background blurring strength can be acquired according to the physical distance information and the corresponding relationship. And blurring the background area according to the background blurring strength.
And 408, acquiring portrait blurring strength corresponding to each face area in the image to be processed according to the physical distance information corresponding to each face area.
In an embodiment, after the plurality of face regions are acquired, blurring processing may be continuously performed on the portrait region corresponding to the face regions, and a portrait blurring strength indicating a degree of blurring processing performed on the portrait region is acquired according to the physical distance information corresponding to the face regions.
And step 410, blurring the portrait area corresponding to the face area according to the portrait blurring strength.
Further, acquiring a region area corresponding to the face region, taking the face region with the largest region area as a basic region, and taking the face region except the basic region as a face blurring region; acquiring portrait blurring strength corresponding to the face blurring area according to the physical distance information corresponding to the basic area and the face blurring area; and blurring the portrait area corresponding to the face blurring area according to the portrait blurring strength. And acquiring background blurring strength according to the physical distance information corresponding to the basic area.
That is, the face region is divided into a basic region and a face blurring region according to the region area, and the basic region and the face blurring region are subjected to blurring processing in different degrees. For example, the portrait area corresponding to the basic area is not blurred, and the portrait area corresponding to the face blurring area needs to be blurred. And acquiring portrait blurring strength corresponding to the face blurring area based on the physical distance information corresponding to the face area.
For example, suppose that the image to be processed includes face regions A, B and C, whose corresponding physical distance information is D_a, D_b and D_c. If region A has the largest area, region A is taken as the basic region, and regions B and C are taken as face blurring regions. The physical distance information corresponding to region A has a corresponding relationship with the background blurring strength, so once that information is acquired the background blurring strength can be obtained. Let the background blurring strength be X, and let the blurring strengths of the portrait regions corresponding to regions B and C be X_b and X_c respectively; then X_b and X_c can be calculated by the following formula:
[Formula image BDA0001374349820000101 in the original filing: the expression giving X_b and X_c in terms of X, D_a, D_b and D_c; not reproduced in the text.]
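Because the expression itself is an image in the filing and is not reproduced here, the sketch below is only one plausible reading consistent with the surrounding text (zero extra blur for a face in the base face's plane, growing toward the background strength X for faces much farther away); it is an assumption, not the patent's formula.

```python
def portrait_blur_strength(x_background, d_base, d_face):
    # Hypothetical scaling: faces in the same plane as the base face get no
    # extra blur, faces farther from that plane are blurred progressively more,
    # capped at the background blurring strength X.
    rel = abs(1.0 - d_base / float(d_face))
    return min(x_background, x_background * rel)
```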
the image blurring method comprises the steps of firstly detecting a face area in an image to be processed, obtaining blurring strength of a background area according to physical distance information of the face area, and blurring the background area according to the blurring strength. The physical distance information can reflect the distance between the face and the lens, the blurring strength obtained at different distances is different, and the blurring degree is changed along with the change of the physical distance information, so that the blurring effect can adapt to different shooting scenes, and the blurring process is more accurate. Meanwhile, the human face area is divided into a basic area and a human face blurring area, different blurring processing is carried out on different human face areas, and the blurring processing accuracy is further improved.
FIG. 6 is a block diagram of an image blurring apparatus according to an embodiment. The image blurring apparatus 600 includes an image obtaining module 602, an information obtaining module 604, and a background blurring module 606. Wherein:
an image obtaining module 602, configured to obtain an image to be processed.
An information obtaining module 604, configured to detect a face region in the image to be processed, and obtain physical distance information corresponding to the face region.
A background blurring module 606, configured to obtain a background blurring strength according to the physical distance information, and perform blurring processing on a background area in the image to be processed according to the background blurring strength.
The image blurring device firstly detects a face region in an image to be processed, obtains blurring strength of a background region according to physical distance information of the face region, and then performs blurring processing on the background region according to the blurring strength. The physical distance information can reflect the distance between the face and the lens, the blurring strength obtained at different distances is different, and the blurring degree is changed along with the change of the physical distance information, so that the blurring effect can adapt to different shooting scenes, and the blurring process is more accurate.
Fig. 7 is a schematic structural diagram of an image blurring device in another embodiment. The image blurring apparatus 700 includes an image obtaining module 702, an information obtaining module 704, a background blurring module 706, a region obtaining module 708, an intensity obtaining module 710, and a portrait blurring module 712. Wherein:
an image obtaining module 702, configured to obtain an image to be processed.
An information obtaining module 704, configured to detect face regions in the image to be processed, and obtain physical distance information corresponding to each face region in the image to be processed.
And a background blurring module 706, configured to obtain a background blurring strength according to the physical distance information, and perform blurring processing on a background area in the image to be processed according to the background blurring strength.
The region obtaining module 708 is configured to obtain region areas corresponding to the face regions, use the face region with the largest region area as a basic region, and use the face regions other than the basic region as face blurring regions.
The intensity obtaining module 710 is configured to obtain a portrait blurring intensity corresponding to the face blurring region according to the physical distance information corresponding to the basic region and the face blurring region.
And a portrait blurring module 712, configured to perform blurring processing on a portrait area corresponding to the face blurring area according to the portrait blurring strength.
The image blurring device firstly detects a face region in an image to be processed, obtains blurring strength of a background region according to physical distance information of the face region, and then performs blurring processing on the background region according to the blurring strength. The physical distance information can reflect the distance between the face and the lens, the blurring strength obtained at different distances is different, and the blurring degree is changed along with the change of the physical distance information, so that the blurring effect can adapt to different shooting scenes, and the blurring process is more accurate. Meanwhile, the human face area is divided into a basic area and a human face blurring area, different blurring processing is carried out on different human face areas, and the blurring processing accuracy is further improved.
In another embodiment, the information obtaining module 704 is further configured to detect a face region in the image to be processed, and obtain physical distance information corresponding to the face region.
In the embodiment provided by the present application, the background blurring module 706 is further configured to obtain a region area corresponding to each face region, obtain a background blurring strength according to the physical distance information and the region area, and perform blurring processing on the background region in the image to be processed according to the background blurring strength.
In an embodiment, the intensity obtaining module 710 is further configured to obtain the portrait blurring intensity corresponding to each face region in the image to be processed according to the physical distance information corresponding to each face region.
In one embodiment, the portrait blurring module 712 is configured to perform blurring processing on a portrait area corresponding to the face area according to the portrait blurring strength.
The division of the modules in the image blurring device is only for illustration, and in other embodiments, the image blurring device may be divided into different modules as needed to complete all or part of the functions of the image blurring device.
The embodiment of the invention also provides a computer readable storage medium. One or more non-transitory computer-readable storage media embodying computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of:
acquiring an image to be processed;
detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region;
and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength.
In one embodiment, the detecting the face region in the image to be processed, and the obtaining the physical distance information corresponding to the face region, performed by the processor, includes:
and detecting the face regions in the image to be processed, and acquiring physical distance information corresponding to each face region in the image to be processed.
In other embodiments provided by the present application, the obtaining, by the processor, a background blurring strength according to the physical distance information, and blurring a background region in the image to be processed according to the background blurring strength includes:
acquiring the area corresponding to each face area, acquiring background blurring strength according to the physical distance information and the area, and blurring the background area in the image to be processed according to the background blurring strength.
In another embodiment, the method performed by the processor further comprises:
acquiring portrait blurring strength corresponding to each face area in the image to be processed according to the physical distance information corresponding to each face area;
and performing blurring processing on the portrait area corresponding to the face area according to the portrait blurring strength.
In one embodiment, the method performed by the processor further comprises:
acquiring the area corresponding to each face area, taking the face area with the largest area as a basic area, and taking the face areas except the basic area as face blurring areas;
the obtaining of the portrait blurring strength corresponding to each face region in the image to be processed according to the physical distance information corresponding to each face region includes:
acquiring portrait blurring strength corresponding to the face blurring region according to the physical distance information corresponding to the basic region and the face blurring region;
the blurring processing of the portrait area corresponding to the face area according to the portrait blurring strength comprises:
and performing blurring processing on the portrait area corresponding to the face blurring area according to the portrait blurring strength.
The embodiment of the invention also provides a computer device. The computer device includes an image processing circuit, which may be implemented using hardware and/or software components, and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 8 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 8, for ease of explanation, only the aspects of the image processing technique related to embodiments of the present invention are shown.
As shown in fig. 8, the image processing circuit includes an ISP processor 840 and control logic 850. Image data captured by imaging device 810 is first processed by ISP processor 840, and ISP processor 840 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of imaging device 810. Imaging device 810 may include a camera having one or more lenses 812 and an image sensor 814. Image sensor 814 may include an array of color filters (e.g., Bayer filters), and image sensor 814 may acquire light intensity and wavelength information captured with each imaging pixel of image sensor 814 and provide a set of raw image data that may be processed by ISP processor 840. The sensor 820 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 840 based on the type of sensor 820 interface. The sensor 820 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, the image sensor 814 may also send raw image data to the sensor 820, the sensor 820 may provide the raw image data to the ISP processor 840 for processing based on the sensor 820 interface type, or the sensor 820 may store the raw image data in the image memory 830.
The ISP processor 840 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 840 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 840 may also receive pixel data from image memory 830. For example, the sensor 820 interface sends raw image data to the image memory 830, and the raw image data in the image memory 830 is then provided to the ISP processor 840 for processing. The image Memory 830 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from image sensor 814 interface or from sensor 820 interface or from image memory 830, ISP processor 840 may perform one or more image processing operations, such as temporal filtering. The image data processed by ISP processor 840 may be sent to image memory 830 for additional processing before being displayed. ISP processor 840 receives processed data from image memory 830 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 880 for viewing by a user and/or further Processing by a Graphics Processing Unit (GPU). Further, the output of ISP processor 840 may also be sent to image memory 830 and display 880 may read image data from image memory 830. In one embodiment, image memory 830 may be configured to implement one or more frame buffers. Further, the output of the ISP processor 840 may be transmitted to an encoder/decoder 870 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on the display 880 device.
The ISP processed image data may be sent to a blurring module 860 for blurring the image before being displayed. The blurring module 860 may perform blurring on the image data by obtaining a background blurring strength according to the physical distance information, blurring a background region in the image data according to the background blurring strength, and the like. After the image data is subjected to the blurring process by the blurring module 860, the blurred image data may be transmitted to the encoder/decoder 870 so as to encode/decode the image data. The encoded image data may be saved and decompressed prior to display on a display 880 device. It is understood that the image data processed by the blurring module 860 may be directly transmitted to the display 880 for display without passing through the encoder/decoder 870. The image data processed by ISP processor 840 may also be processed by encoder/decoder 870 before being processed by blurring module 860. The blurring module 860 or the encoder/decoder 870 may be a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU) in the mobile terminal.
The statistics determined by ISP processor 840 may be sent to control logic 850 unit. For example, the statistical data may include image sensor 814 statistical information such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 812 shading correction, and the like. Control logic 850 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 810 and ISP processor 840 based on the received statistical data. For example, the control parameters of imaging device 810 may include sensor 820 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 812 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 812 shading correction parameters.
The image blurring method is implemented with the image processing technique of fig. 8 by performing the following steps:
acquiring an image to be processed;
detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region;
and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength.
In one embodiment, the detecting the face region in the image to be processed and acquiring the physical distance information corresponding to the face region includes:
and detecting the face regions in the image to be processed, and acquiring physical distance information corresponding to each face region in the image to be processed.
In other embodiments provided by the present application, the obtaining a background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength includes:
acquiring the area corresponding to each face area, acquiring background blurring strength according to the physical distance information and the area, and blurring the background area in the image to be processed according to the background blurring strength.
In another embodiment, the method further comprises:
acquiring portrait blurring strength corresponding to each face area in the image to be processed according to the physical distance information corresponding to each face area;
and performing blurring processing on the portrait area corresponding to the face area according to the portrait blurring strength.
In one embodiment, the method further comprises:
acquiring the area corresponding to each face area, taking the face area with the largest area as a basic area, and taking the face areas except the basic area as face blurring areas;
the obtaining of the portrait blurring strength corresponding to each face region in the image to be processed according to the physical distance information corresponding to each face region includes:
acquiring portrait blurring strength corresponding to the face blurring region according to the physical distance information corresponding to the basic region and the face blurring region;
the blurring processing of the portrait area corresponding to the face area according to the portrait blurring strength comprises:
and performing blurring processing on the portrait area corresponding to the face blurring area according to the portrait blurring strength.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The above-mentioned embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the inventive concept, and these fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. A method of blurring an image, the method comprising:
acquiring an image to be processed;
detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region, wherein the physical distance information corresponding to the face region is a parameter related to a physical distance from an image acquisition device to a face;
and acquiring background blurring strength according to the physical distance information, and blurring a background area in the image to be processed according to the background blurring strength, wherein the background blurring strength is inversely related to the distance indicated by the physical distance information.
2. The image blurring method according to claim 1, wherein the detecting a face region in the image to be processed and acquiring physical distance information corresponding to the face region comprises:
and detecting the face regions in the image to be processed, and acquiring physical distance information corresponding to each face region in the image to be processed.
3. The image blurring method according to claim 2, wherein the obtaining a background blurring strength according to the physical distance information, and blurring a background region in the image to be processed according to the background blurring strength comprises:
acquiring the area corresponding to each face area, acquiring background blurring strength according to the physical distance information and the area, and blurring the background area in the image to be processed according to the background blurring strength.
4. The image blurring method according to claim 2, further comprising:
acquiring portrait blurring strength corresponding to each face area in the image to be processed according to the physical distance information corresponding to each face area;
and performing blurring processing on the portrait area corresponding to the face area according to the portrait blurring strength.
5. The image blurring method according to claim 4, further comprising:
acquiring the area corresponding to each face area, taking the face area with the largest area as a basic area, and taking the face areas except the basic area as face blurring areas;
the obtaining of the portrait blurring strength corresponding to each face region in the image to be processed according to the physical distance information corresponding to each face region includes:
acquiring portrait blurring strength corresponding to the face blurring region according to the physical distance information corresponding to the basic region and the face blurring region;
the blurring processing of the portrait area corresponding to the face area according to the portrait blurring strength comprises:
and performing blurring processing on the portrait area corresponding to the face blurring area according to the portrait blurring strength.
6. An image blurring apparatus, comprising:
an image acquisition module configured to acquire an image to be processed;
an information acquisition module configured to detect a face region in the image to be processed and acquire physical distance information corresponding to the face region, wherein the physical distance information corresponding to the face region is a parameter related to the physical distance between an image acquisition device and the face;
and a background blurring module configured to acquire a background blurring strength according to the physical distance information and blur a background region in the image to be processed according to the background blurring strength, wherein the background blurring strength is inversely related to the distance indicated by the physical distance information.
7. The image blurring device according to claim 6, wherein the information acquisition module is further configured to detect face regions in the image to be processed, and acquire physical distance information corresponding to each face region in the image to be processed.
8. The image blurring device according to claim 7, wherein the background blurring module is further configured to acquire a region area corresponding to each face region, acquire a background blurring strength according to the physical distance information and the region areas, and perform blurring processing on the background region in the image to be processed according to the background blurring strength.
9. The image blurring device according to claim 7, further comprising:
a strength acquisition module configured to acquire a portrait blurring strength corresponding to each face region in the image to be processed according to the physical distance information corresponding to each face region;
and a portrait blurring module configured to perform blurring processing on the portrait region corresponding to the face region according to the portrait blurring strength.
10. The image blurring device according to claim 9, further comprising:
a region acquisition module configured to acquire a region area corresponding to each face region, take the face region with the largest region area as a base region, and take the face regions other than the base region as face blurring regions;
the strength acquisition module is further configured to acquire the portrait blurring strength corresponding to the face blurring region according to the physical distance information corresponding to the base region and the face blurring region;
and the portrait blurring module is further configured to perform blurring processing on the portrait region corresponding to the face blurring region according to the portrait blurring strength.
11. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image blurring method of any one of claims 1 to 5.
12. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions that, when executed by the processor, cause the processor to perform the image blurring method as claimed in any one of claims 1 to 5.
CN201710676169.2A 2017-08-09 2017-08-09 Image blurring method and device, computer readable storage medium and computer device Active CN107704798B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710676169.2A CN107704798B (en) 2017-08-09 2017-08-09 Image blurring method and device, computer readable storage medium and computer device
PCT/CN2018/099403 WO2019029573A1 (en) 2017-08-09 2018-08-08 Image blurring method, computer-readable storage medium and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710676169.2A CN107704798B (en) 2017-08-09 2017-08-09 Image blurring method and device, computer readable storage medium and computer device

Publications (2)

Publication Number Publication Date
CN107704798A (en) 2018-02-16
CN107704798B (en) 2020-06-12

Family

ID=61170965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710676169.2A Active CN107704798B (en) 2017-08-09 2017-08-09 Image blurring method and device, computer readable storage medium and computer device

Country Status (2)

Country Link
CN (1) CN107704798B (en)
WO (1) WO2019029573A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107704798B (en) * 2017-08-09 2020-06-12 Oppo广东移动通信有限公司 Image blurring method and device, computer readable storage medium and computer device
CN110099251A (en) * 2019-04-29 2019-08-06 努比亚技术有限公司 Processing method, device and the computer readable storage medium of monitor video
CN110971827B (en) * 2019-12-09 2022-02-18 Oppo广东移动通信有限公司 Portrait mode shooting method and device, terminal equipment and storage medium
CN112217992A (en) * 2020-09-29 2021-01-12 Oppo(重庆)智能科技有限公司 Image blurring method, image blurring device, mobile terminal, and storage medium
CN113673474B (en) * 2021-08-31 2024-01-12 Oppo广东移动通信有限公司 Image processing method, device, electronic equipment and computer readable storage medium
CN115883958A (en) * 2022-11-22 2023-03-31 荣耀终端有限公司 Portrait shooting method

Citations (1)

Publication number Priority date Publication date Assignee Title
CN102843509A (en) * 2011-06-14 2012-12-26 宾得理光映像有限公司 Image processing device and image processing method

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP4752941B2 (en) * 2009-03-31 2011-08-17 カシオ計算機株式会社 Image composition apparatus and program
CN103945118B (en) * 2014-03-14 2017-06-20 华为技术有限公司 Image weakening method, device and electronic equipment
CN103973977B (en) * 2014-04-15 2018-04-27 联想(北京)有限公司 Virtualization processing method, device and the electronic equipment of a kind of preview interface
CN104333700B (en) * 2014-11-28 2017-02-22 广东欧珀移动通信有限公司 Image blurring method and image blurring device
JP6495122B2 (en) * 2015-07-02 2019-04-03 オリンパス株式会社 Imaging apparatus and image processing method
CN105389801B (en) * 2015-10-20 2018-09-21 厦门美图之家科技有限公司 Character contour setting method, character image weakening method, system and camera terminal
CN106331492B (en) * 2016-08-29 2019-04-16 Oppo广东移动通信有限公司 A kind of image processing method and terminal
CN106548185B (en) * 2016-11-25 2019-05-24 三星电子(中国)研发中心 A kind of foreground area determines method and apparatus
CN106875348B (en) * 2016-12-30 2019-10-18 成都西纬科技有限公司 A kind of heavy focus image processing method
CN106952222A (en) * 2017-03-17 2017-07-14 成都通甲优博科技有限责任公司 A kind of interactive image weakening method and device
CN107704798B (en) * 2017-08-09 2020-06-12 Oppo广东移动通信有限公司 Image blurring method and device, computer readable storage medium and computer device

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN102843509A (en) * 2011-06-14 2012-12-26 宾得理光映像有限公司 Image processing device and image processing method

Non-Patent Citations (2)

Title
Out-of-focus: Learning Depth from Image Bokeh for Robotic Perception; Eric Cristofalo et al.; arXiv:1705.01152v1; 2017-05-02; pp. 1-6 *
Background blurring display based on depth information extracted from multi-focus images; Xiao Jinsheng et al.; Acta Automatica Sinica; 2015-02-28; Vol. 41, No. 2; pp. 304-311 *

Also Published As

Publication number Publication date
WO2019029573A1 (en) 2019-02-14
CN107704798A (en) 2018-02-16

Similar Documents

Publication Publication Date Title
CN107948519B (en) Image processing method, device and equipment
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
KR102279436B1 (en) Image processing methods, devices and devices
US10757312B2 (en) Method for image-processing and mobile terminal using dual cameras
CN107481186B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN109068058B (en) Shooting control method and device in super night scene mode and electronic equipment
CN107509031B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107493432B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107395991B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
CN109348088B (en) Image noise reduction method and device, electronic equipment and computer readable storage medium
CN109685853B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107911682B (en) Image white balance processing method, device, storage medium and electronic equipment
CN107563979B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN108259770B (en) Image processing method, image processing device, storage medium and electronic equipment
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
US11233948B2 (en) Exposure control method and device, and electronic device
CN108401110B (en) Image acquisition method and device, storage medium and electronic equipment
CN107959841B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN108174173B (en) Photographing method and apparatus, computer-readable storage medium, and computer device
CN107454335B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN107682611B (en) Focusing method and device, computer readable storage medium and electronic equipment
CN107194901B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107563329B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN109584311B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18, Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

Address before: No. 18, Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

GR01 Patent grant