WO2019029573A1 - Image blurring method, computer-readable storage medium, and computer device - Google Patents

Image blurring method, computer-readable storage medium, and computer device

Info

Publication number
WO2019029573A1
WO2019029573A1 (PCT/CN2018/099403; CN2018099403W)
Authority
WO
WIPO (PCT)
Prior art keywords
area
image
face
distance information
blurring
Prior art date
Application number
PCT/CN2018/099403
Other languages
English (en)
Chinese (zh)
Inventor
丁佳铭
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2019029573A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Definitions

  • the present application relates to the field of computer technology, and in particular, to an image blurring method, apparatus, computer readable storage medium, and computer device.
  • Photographic scenes are often complex and varied.
  • To make the subject of a photograph more prominent and to convey a sense of depth, the usual processing method is to maintain the sharpness of the subject while blurring the area outside the subject area.
  • Blurring blurs the area outside the subject, making the subject more prominent.
  • The traditional blurring method first identifies the subject in the image, then directly applies a fixed degree of blurring to the area outside the subject, so that the background and the subject are displayed differently.
  • Embodiments of the present application provide an image blurring method, a computer readable storage medium, and a computer device.
  • An image blurring method comprising:
  • One or more non-transitory computer readable storage media containing computer executable instructions that, when executed by one or more processors, cause the processor to perform the following operations:
  • a computer device comprising a memory and a processor, the memory storing computer readable instructions, the instructions being executed by the processor, causing the processor to perform the following operations:
  • the image blurring method, apparatus, computer readable storage medium and computer device provided by the embodiments of the present application first detect a face region in an image to be processed, and obtain a blurring intensity of the background region according to physical distance information of the face region. Then, the background area is blurred according to the blurring intensity.
  • The physical distance information reflects the distance between the face and the lens, and the blurring strength obtained varies with that distance, which makes the blurring process more precise.
  • FIG. 1 is a schematic diagram showing the internal structure of an electronic device in an embodiment
  • FIG. 2 is a schematic diagram showing the internal structure of a server in an embodiment
  • FIG. 3 is a flow chart of an image blurring method in an embodiment
  • FIG. 5 is a schematic diagram of acquiring physical distance information in an embodiment
  • FIG. 6 is a schematic structural diagram of an image blurring device in an embodiment
  • FIG. 7 is a schematic structural diagram of an image blurring device in another embodiment
  • Figure 8 is a schematic illustration of an image processing circuit in one embodiment.
  • A first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application.
  • Both the first client and the second client are clients, but they are not the same client.
  • FIG. 1 is a schematic diagram showing the internal structure of an electronic device in an embodiment.
  • The electronic device includes a processor, a non-volatile storage medium, an internal memory, a network interface, a display screen, and an input device connected through a system bus.
  • the non-volatile storage medium of the electronic device stores an operating system and computer readable instructions.
  • the computer readable instructions are executed by a processor to implement an image blurring method.
  • the processor is used to provide computing and control capabilities to support the operation of the entire electronic device.
  • the internal memory in the electronic device provides an environment for the operation of computer readable instructions in a non-volatile storage medium.
  • the network interface is used for network communication with the server, such as sending an image blur request to the server, receiving the blurred image returned by the server, and the like.
  • The display screen of the electronic device may be a liquid crystal display or an electronic ink display screen.
  • The input device may be a touch layer covering the display screen, a button, trackball, or touchpad on the outer casing of the electronic device, or an external keyboard, trackpad, or mouse.
  • The electronic device can be a cell phone, a tablet, a personal digital assistant, or a wearable device.
  • A person skilled in the art can understand that the structure shown in FIG. 1 is only a block diagram of the part of the structure related to the solution of the present application and does not limit the electronic device to which the solution is applied; a specific electronic device may include more or fewer components than shown in the figure, combine some components, or arrange the components differently.
  • the server includes a processor, a non-volatile storage medium, an internal memory, and a network interface connected by a system bus.
  • the non-volatile storage medium of the server stores an operating system and computer readable instructions.
  • the computer readable instructions are executed by a processor to implement an image blurring method.
  • the server's processor is used to provide computing and control capabilities that support the operation of the entire server.
  • the network interface of the server is configured to communicate with an external terminal through a network connection, such as receiving an image blur request sent by the terminal, and returning the blurred image to the terminal.
  • the server can be implemented with a stand-alone server or a server cluster consisting of multiple servers.
  • FIG. 2 is only a block diagram of the part of the structure related to the solution of the present application and does not limit the server to which the solution is applied; a specific server may include more or fewer components than shown in the figures, combine some components, or arrange the components differently.
  • FIG. 3 is a flow chart of an image blurring method in one embodiment. As shown in FIG. 3, the image blurring method includes operations 302 through 306, wherein:
  • Operation 302 obtaining an image to be processed.
  • the image to be processed refers to an image that needs to be blurred, and can be collected by the image capturing device.
  • the image acquisition device refers to a device for acquiring an image.
  • The image acquisition device may be a camera, a camera on a mobile terminal, a video camera, or the like.
  • The user terminal may perform the blurring process on the image to be processed directly, or may send an image blurring request to the server so that the server performs the blurring process on the image.
  • the image blurring instruction may be input by the user, or may be automatically triggered by the user terminal.
  • the user inputs a photographing instruction through the user terminal, and after detecting the photographing instruction, the mobile terminal collects the image to be processed through the camera. Then, the image blurring instruction is automatically triggered, and the image to be processed is blurred.
  • the photographing instruction may be triggered by a physical button or a touch screen operation of the mobile terminal, or may be a voice instruction or the like.
  • Operation 304 detecting a face region in the image to be processed, and acquiring physical distance information corresponding to the face region.
  • The face area refers to the area where the face is located in the image to be processed.
  • the physical distance information refers to a related parameter indicating the physical distance between the image collection device and the object corresponding to each pixel point in the image to be processed.
  • the physical distance information corresponding to the face region refers to a related parameter of the physical distance between the image capturing device and the face.
  • Feature points in the image to be processed may first be identified, then extracted and matched against a preset face model; if the extracted feature points match the preset face model, the area where those feature points are located is the face region.
  • the image to be processed is composed of a plurality of pixels, each pixel having corresponding physical distance information indicating the physical distance of the object represented by the pixel to the image capturing device.
  • A plurality of face regions may exist in the image to be processed. After detecting the face regions, the area of each face region may be acquired; the physical distance information corresponding to the face region with the largest area is obtained, and the background blurring strength is derived from that physical distance information.
  • The area of a face region refers to its size, which may be represented by the number of pixels the region contains or by the ratio of the region's size to the size of the image to be processed.
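As a sketch of the two area measures just described (the function names are illustrative assumptions, not from the patent), the region area can be taken either as a raw pixel count over a face mask or as a ratio of the face bounding box to the whole image:

```python
# Illustrative sketch: measure a face region's area either as a raw
# pixel count (over a 0/1 mask) or as a bounding-box-to-image ratio.
def face_area_pixels(mask):
    """Count of pixels belonging to the face region in a 0/1 mask."""
    return sum(sum(row) for row in mask)

def face_area_ratio(region_w, region_h, image_w, image_h):
    """Area of the face bounding box relative to the whole image."""
    return (region_w * region_h) / (image_w * image_h)

mask = [[0, 1, 1],
        [0, 1, 1],
        [0, 0, 0]]
print(face_area_pixels(mask))        # 4
print(face_area_ratio(2, 2, 3, 3))   # 4/9, about 0.444
```

Either measure works for comparing face regions within one image; the ratio form is also comparable across images of different resolutions.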
  • Physical distance information within the effective distance range can be represented by an accurate numerical value, while physical distance information beyond the effective distance range is represented by a fixed value.
  • the operation 304 may include: detecting a face region within a preset distance range in the image to be processed, and acquiring physical distance information corresponding to the face region.
  • The preset distance range may be, but is not limited to, the value range of valid physical distance information.
  • the background blurring intensity is obtained according to the physical distance information, and the background area in the image to be processed is blurred according to the background blurring intensity.
  • Blurring processing blurs the image according to a blurring intensity; different blurring intensities produce different degrees of blur.
  • the background area may refer to an area other than the face area or the portrait area in the image to be processed.
  • the portrait area refers to the area where the entire portrait in the image to be processed is located.
  • The background blurring strength is a parameter indicating the degree to which the background area is blurred. The background blurring strength is obtained from the physical distance information of the face region, and the background region is then blurred according to that strength, so the blurring result changes with the actual physical distance from the face to the image capturing device. Generally, the larger the physical distance information, the smaller the background blurring strength and the lighter the blurring of the background area; the smaller the physical distance information, the greater the background blurring strength and the heavier the blurring of the background area.
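The inverse relationship described above can be sketched as follows. The patent does not give a concrete formula, so the mapping shape, the reference distance, and the clamping to a maximum strength are all assumptions for illustration:

```python
# Hedged sketch of the rule "larger face distance -> weaker background
# blur". The constants ref_mm and max_strength are assumptions.
def background_blur_strength(distance_mm, max_strength=10.0, ref_mm=500.0):
    """Blur strength falls off inversely with the face-to-lens distance."""
    if distance_mm <= 0:
        return max_strength
    return min(max_strength, max_strength * ref_mm / distance_mm)

print(background_blur_strength(250.0))   # closer face -> clamped to 10.0
print(background_blur_strength(1000.0))  # farther face -> 5.0
```

Any monotonically decreasing mapping (or a lookup table, per the "corresponding relationship" mentioned later) would serve the same purpose.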
  • the image blurring method firstly detects a face region in the image to be processed, and obtains a blurring intensity of the background region according to the physical distance information of the face region, and then blurs the background region according to the blurring intensity.
  • The physical distance information reflects the distance between the face and the lens, and the blurring strength obtained varies accordingly: the degree of blurring changes with the physical distance information, so the blurring effect adapts to different shooting scenes and the blurring process is more precise.
  • the image blurring method includes operations 402 through 410, wherein:
  • Operation 402 obtaining an image to be processed.
  • the image to be processed may be acquired directly on the local or server.
  • the user terminal may directly acquire the corresponding image to be processed according to the image storage address and the image identifier included in the image blurring instruction.
  • the image storage address can be local to the user terminal or it can be on the server.
  • the image to be processed may be blurred locally, or may be blurred on the server.
  • Operation 404 detecting a face area in the image to be processed, and acquiring physical distance information corresponding to each face area in the image to be processed.
  • a dual camera can be installed on the image acquisition device, and the physical distance information between the image acquisition device and the object is measured by the dual camera.
  • An image of the object is captured by the first camera and the second camera respectively, and a first angle and a second angle are obtained from the images: the first angle is the angle between the line from the first camera to the object and the horizontal line from the first camera to the second camera, and the second angle is the angle between the line from the second camera to the object and the horizontal line from the second camera to the first camera. The physical distance information between the image capturing device and the object is then obtained from the first angle, the second angle, and the distance between the two cameras.
  • Figure 5 is a schematic diagram of obtaining physical distance information in one embodiment.
  • An image of the object 506 is captured by the first camera 502 and the second camera 504 respectively; from the images, the first angle A1 and the second angle A2 can be acquired, and then, from A1, A2, and the distance T between the first camera 502 and the second camera 504, the physical distance D between the object 506 and the horizontal line from the first camera 502 to the second camera 504 can be obtained.
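The triangulation described above can be written out directly. With x the distance along the baseline from the first camera to the foot of the perpendicular dropped from the object, D = x·tan(A1) = (T − x)·tan(A2), which solves to the closed form below (the function name is an illustrative assumption):

```python
import math

# Dual-camera ranging: A1 and A2 are the angles between each camera's
# line of sight to the object and the baseline joining the cameras;
# T (baseline) is the distance between the cameras.
def object_distance(a1_deg, a2_deg, baseline):
    t1 = math.tan(math.radians(a1_deg))
    t2 = math.tan(math.radians(a2_deg))
    # D = T * tan(A1) * tan(A2) / (tan(A1) + tan(A2))
    return baseline * t1 * t2 / (t1 + t2)

# Symmetric sanity check: both angles 45 degrees, baseline 1.0 -> D = 0.5
print(object_distance(45.0, 45.0, 1.0))
```

The symmetric case is easy to verify by hand: the object sits above the midpoint of the baseline at half the baseline length.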
  • the image to be processed also includes multiple face regions.
  • Each face region in the image to be processed is extracted, and physical distance information corresponding to the face region is obtained.
  • the depth map corresponding to the scene may be acquired at the same time.
  • The obtained depth map corresponds one-to-one with the image, and the values in the depth map represent the physical distance information of the corresponding pixels in the image. That is, the depth map can be acquired at the same time as the image to be processed, and after the face region is detected, the corresponding physical distance information can be looked up in the depth map according to the pixel coordinates in the face region.
  • The face region contains a plurality of pixels. After obtaining the physical distance information for each pixel in the face region, the values for all pixels may be averaged, or the value of a particular pixel may be used to represent the face region's physical distance information; for example, the physical distance information of the central pixel of the face region may represent that of the whole region.
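Both summarization options just described are easy to sketch against a per-pixel depth map (the function names are illustrative assumptions):

```python
# Summarize a face region's distance from a per-pixel depth map either
# by averaging all pixels in the region or by sampling the central
# pixel of the face bounding box.
def mean_face_depth(depth_map, pixels):
    """Average depth over the (row, col) pixels of the face region."""
    return sum(depth_map[r][c] for r, c in pixels) / len(pixels)

def center_face_depth(depth_map, top, left, height, width):
    """Depth of the central pixel of the face bounding box."""
    return depth_map[top + height // 2][left + width // 2]

depth = [[100, 110, 120],
         [105, 115, 125],
         [110, 120, 130]]
face_pixels = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(mean_face_depth(depth, face_pixels))   # 107.5
print(center_face_depth(depth, 0, 0, 3, 3))  # 115
```

The mean is more robust to a few bad depth values; the central pixel is cheaper and usually lands on the face rather than on background leaking into the bounding box.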
  • the background blurring intensity is obtained according to the physical distance information, and the background area in the image to be processed is blurred according to the background blurring intensity.
  • the portrait is considered to be on the same vertical plane as the face, and the physical distance of the portrait to the image capture device is within the same range as the physical distance of the face to the image capture device. Therefore, after the physical distance information and the face area are acquired, the portrait area in the image to be processed can be acquired according to the physical distance corresponding to the face area, and then the background area can be determined in the image to be processed according to the portrait area.
  • the face area in the image to be processed is detected, and the range of the portrait distance is obtained according to the physical distance information corresponding to the face area, and the portrait area in the image to be processed can be obtained according to the range of the portrait distance, and then the background is obtained according to the portrait area. region.
  • The portrait distance range refers to the range of values of the physical distance information corresponding to the portrait area in the image to be processed. Since the physical distances from the image capturing device to the human face and to the rest of the human body can be regarded as equal, the physical distance information acquired for the detected face region determines the range of physical distance information corresponding to the portrait area.
  • the physical distance information in the range is considered to be the physical distance information corresponding to the portrait area, and the physical distance information outside the range is regarded as the physical distance information of the background area.
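The depth-band segmentation just described can be sketched as follows; the tolerance (half-width of the band around the face depth) is an assumption, since the patent does not specify one:

```python
# Classify pixels as portrait vs. background by whether their depth
# falls inside a tolerance band around the face depth.
def split_by_distance(depth_map, face_depth, tolerance):
    portrait, background = [], []
    for r, row in enumerate(depth_map):
        for c, d in enumerate(row):
            if abs(d - face_depth) <= tolerance:
                portrait.append((r, c))
            else:
                background.append((r, c))
    return portrait, background

depth = [[100, 100, 400],
         [105, 110, 420]]
portrait, background = split_by_distance(depth, face_depth=102, tolerance=15)
print(portrait)     # [(0, 0), (0, 1), (1, 0), (1, 1)]
print(background)   # [(0, 2), (1, 2)]
```

As the next bullets note, this band may also capture non-person objects standing at the same distance, which is why the color-based refinement follows.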
  • the method further includes: acquiring a range of the portrait distance according to the physical distance information corresponding to the face region, and acquiring the image region in the image to be processed according to the physical distance information in the range of the portrait distance; acquiring the color information of the image region, And obtaining a background area other than the portrait area in the image to be processed according to the color information.
  • The image area extracted according to the portrait distance range covers the objects in the same physical distance range as the face. If there are other objects beside the person, the extracted image area may contain objects other than the portrait; in that case, the portrait area can be further extracted according to the color information of the image area.
  • the color information refers to a related parameter used to represent the color of the image, and for example, the color information may include information such as hue, saturation, brightness, and the like of the color in the image.
  • The hue of a color is an angular measurement ranging from 0° to 360°, measured counterclockwise starting from red: red is 0°, green is 120°, and blue is 240°.
  • Saturation refers to the degree to which the color is close to the spectrum. Generally, the higher the saturation, the brighter the color; the lower the saturation, the darker the color. Brightness indicates the brightness of the color.
  • Different objects present different color information in an image; for example, trees are green, the sky is blue, and the earth is yellow. The portrait area and the background area outside it can therefore be distinguished based on the color information in the image area.
  • the color component of the image region is acquired, and an area in the image region in which the color component is within the preset range is extracted as the portrait region.
  • the color component refers to an image component generated by converting an image to be processed into an image of a certain color dimension.
  • The color component may be an RGB color component, a CMY color component, an HSV color component, or the like; it is understood that RGB, CMY, and HSV color components can be converted into one another.
  • the HSV color component of the image region is acquired, and an area of the image region in which the HSV color component is within a preset range is extracted as the portrait region.
  • the HSV color component refers to the hue (H), saturation (S), and lightness (V) components of the image, respectively, and respectively sets a preset range for the three components, and the three components in the image region are The area within the preset range is extracted as a portrait area.
  • When the HSV color component is used to obtain the portrait region, specifically, the HSV color component of the image region is acquired, and the area satisfying the condition "H value between 20 and 25, S value between 10 and 50, and V value between 50 and 85" is used as the portrait region.
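The thresholding step can be sketched directly from the ranges quoted above (H 20–25, S 10–50, V 50–85). Note the scales are as stated in the text and may differ from library conventions such as OpenCV's H in 0–179:

```python
# HSV thresholding for the portrait region, using the ranges quoted in
# the text; the scales of S and V are taken as stated, not normalized.
def in_portrait_range(h, s, v):
    return 20 <= h <= 25 and 10 <= s <= 50 and 50 <= v <= 85

def portrait_mask(hsv_pixels):
    """hsv_pixels: 2-D grid of (H, S, V) tuples -> 0/1 mask."""
    return [[1 if in_portrait_range(*px) else 0 for px in row]
            for row in hsv_pixels]

row = [(22, 30, 60), (180, 90, 90)]
print(portrait_mask([row]))   # [[1, 0]]
```

In practice such fixed skin-tone thresholds are fragile under varied lighting, which is consistent with the patent combining them with the depth-based pre-segmentation rather than using color alone.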
  • The operation 406 may include: acquiring the area corresponding to each face region, obtaining the background blur strength according to the physical distance information and the region area, and performing blurring processing on the background area in the image to be processed according to the background blur intensity.
  • each face region has corresponding physical distance information, and the background blur strength is obtained according to the acquired physical distance information.
  • the area of the area corresponding to each face area may be first acquired, and the background blur strength may be obtained according to the area area and the physical distance information.
  • the background blur strength is obtained according to the physical distance information corresponding to the face region having the largest or smallest region area. It is also possible to obtain the physical distance information corresponding to each face region, and obtain the background blur strength according to the average value of the physical distance information corresponding to each face region.
  • the physical distance information has a corresponding relationship with the background blur strength.
  • the background blur strength can be obtained according to the physical distance information and the corresponding relationship.
  • the background area is blurred according to the background blur strength.
  • Operation 408 Obtain a portrait blurring intensity corresponding to each face region in the image to be processed according to the physical distance information corresponding to each face region.
  • The portrait region corresponding to a face region may be further blurred: the portrait blurring intensity is obtained according to the physical distance information corresponding to the face region and represents the degree to which the portrait area is blurred.
  • the portrait area corresponding to the face area is blurred according to the image blurring intensity.
  • The area of each face region is obtained; the face region with the largest area is used as the base region, and the face regions other than the base region are used as face blur regions. The portrait blurring intensity corresponding to each face blur region is obtained from the physical distance information of the base region and the face blur region, and the portrait region corresponding to the face blur region is blurred according to that portrait blurring intensity.
  • the background blur strength is obtained according to the physical distance information corresponding to the basic region.
  • the face area is divided into the base area and the face blur area, and the base area and the face blur area are treated to different degrees.
  • the portrait area corresponding to the base area is not blurred, and the portrait area corresponding to the face blur area needs to be blurred.
  • the portrait blurring intensity corresponding to the face blur region is obtained.
  • The image to be processed includes three face regions A, B, and C, whose corresponding physical distance information is Da, Db, and Dc, respectively.
  • The area of region A is the largest, so region A is used as the base region, and regions B and C are used as face blur regions.
  • the physical distance information corresponding to the A area has a corresponding relationship with the background blur strength. After the physical distance information corresponding to the A area is acquired, the background blur strength can be obtained.
  • The background blurring intensity indicates how strongly the background region is blurred. Assuming the background blurring intensity is X, and the portrait blurring intensities of the portrait regions corresponding to the B and C regions are Xb and Xc respectively, then Xb and Xc can be calculated by the following formula:
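The formula itself is not reproduced in this excerpt. Purely as a hypothetical illustration of the idea that secondary faces are blurred in proportion to how far they sit from the base face, one plausible scheme scales the background intensity X by each face's relative distance offset (this is NOT the patent's formula):

```python
# Hypothetical stand-in for the elided formula: scale the background
# intensity X by how far each secondary face is from the base face,
# relative to the base face's own distance.
def portrait_blur_intensity(x_background, d_base, d_face):
    return x_background * abs(d_face - d_base) / d_base

X = 10.0
Da, Db, Dc = 500.0, 750.0, 1500.0
print(portrait_blur_intensity(X, Da, Db))  # 5.0
print(portrait_blur_intensity(X, Da, Dc))  # 20.0
```

Any scheme with the same monotonic behavior (faces at the base distance get no blur, farther offsets get more) would fit the surrounding description.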
  • the image blurring method firstly detects a face region in the image to be processed, and obtains a blurring intensity of the background region according to the physical distance information of the face region, and then blurs the background region according to the blurring intensity.
  • The physical distance information reflects the distance between the face and the lens, and the blurring strength obtained varies accordingly: the degree of blurring changes with the physical distance information, so the blurring effect adapts to different shooting scenes and the blurring process is more precise.
  • the face area is divided into the basic area and the face blurred area, and different face areas are treated differently, which further improves the accuracy of the blurring process.
  • FIG. 6 is a schematic structural diagram of an image blurring device in an embodiment.
  • The image blurring device 600 includes an image obtaining module 602, an information acquiring module 604, and a background blurring module 606, wherein:
  • the image obtaining module 602 is configured to acquire an image to be processed.
  • the information acquiring module 604 is configured to detect a face area in the image to be processed, and acquire physical distance information corresponding to the face area.
  • The background blurring module 606 is configured to obtain a background blurring strength according to the physical distance information, and to perform blurring processing on the background region in the image to be processed according to the background blurring intensity.
  • the image blurring device first detects a face region in the image to be processed, and obtains a blurring intensity of the background region according to the physical distance information of the face region, and then blurs the background region according to the blurring intensity.
  • The physical distance information reflects the distance between the face and the lens, and the blurring strength obtained varies accordingly: the degree of blurring changes with the physical distance information, so the blurring effect adapts to different shooting scenes and the blurring process is more precise.
  • FIG. 7 is a schematic structural diagram of an image blurring device in another embodiment.
  • The image blurring device 700 includes an image obtaining module 702, an information acquiring module 704, a background blurring module 706, a region obtaining module 708, an intensity obtaining module 710, and a portrait blurring module 712, wherein:
  • the image obtaining module 702 is configured to acquire an image to be processed.
  • the information acquiring module 704 is configured to detect a face region in the image to be processed, and acquire physical distance information corresponding to each face region in the image to be processed.
  • The background blurring module 706 is configured to obtain a background blurring strength according to the physical distance information, and to perform blurring processing on the background region in the image to be processed according to the background blurring intensity.
  • the area obtaining module 708 is configured to acquire an area corresponding to each face area, use a face area with the largest area as the base area, and use a face area other than the base area as the face blur area.
  • The intensity obtaining module 710 is configured to obtain the portrait blurring intensity corresponding to each face blur region according to the physical distance information corresponding to the base region and the face blur region.
  • the portrait blurring module 712 is configured to perform a blurring process on the portrait area corresponding to the face blurring area according to the portrait blurring intensity.
  • the image blurring device first detects the face region in the image to be processed, obtains the blurring intensity for the background region according to the physical distance information of the face region, and then blurs the background region according to that intensity.
  • Because the physical distance information reflects the distance between the face and the lens, the blurring intensity obtained differs for different distances.
  • The degree of blurring therefore changes with the physical distance information, so that the blurring effect can adapt to different shooting scenes and the blurring process is more precise.
  • In addition, the face regions are divided into a basic area and face blur areas, and different face regions are treated differently, which further improves the accuracy of the blurring process.
  • the information acquiring module 704 is further configured to detect a face region in the image to be processed, and acquire physical distance information corresponding to the face region.
  • the background blurring module 706 is further configured to acquire the area corresponding to each face region, obtain the background blurring strength according to the physical distance information and the area, and perform a blurring process on the background region in the image to be processed according to the background blurring strength.
  • the intensity obtaining module 710 is further configured to acquire, according to the physical distance information corresponding to the respective face regions, a portrait blurring intensity corresponding to each face region in the image to be processed.
  • the portrait blur module 712 is configured to perform a blurring process on the portrait region corresponding to the face region according to the portrait blur intensity.
  • The division of the modules in the above image blurring device is for illustrative purposes only. In other embodiments, the image blurring device may be divided into different modules as needed to complete all or part of its functions.
  • the embodiment of the present application also provides a computer readable storage medium.
  • One or more non-transitory computer readable storage media containing computer executable instructions that, when executed by one or more processors, cause the processor to:
  • the detecting, by the processor, the face area in the image to be processed, and acquiring physical distance information corresponding to the face area includes:
  • the performing, by the processor, acquiring the background blurring intensity according to the physical distance information, and blurring the background region in the image to be processed according to the background blurring intensity includes:
  • the method performed by the processor further comprises:
  • the portrait area corresponding to the face area is blurred according to the portrait blur intensity.
  • the method executed by the processor further comprises:
  • the acquiring the portrait blurring intensity corresponding to each face region in the image to be processed includes:
  • the blurring processing on the portrait area corresponding to the face area according to the image blurring intensity includes:
  • the portrait area corresponding to the face blur area is blurred according to the portrait blur intensity.
  • the embodiment of the present application further provides a computer device.
  • the above computer device includes an image processing circuit, and the image processing circuit may be implemented by hardware and/or software components, and may include various processing units defining an ISP (Image Signal Processing) pipeline.
  • Figure 8 is a schematic illustration of an image processing circuit in one embodiment. As shown in FIG. 8, for convenience of explanation, only various aspects of the image processing technique related to the embodiment of the present application are shown.
  • the image processing circuit includes an ISP processor 840 and a control logic 850.
  • the image data captured by imaging device 810 is first processed by ISP processor 840, which analyzes the image data to capture image statistics that can be used to determine and/or control one or more control parameters of imaging device 810.
  • Imaging device 810 can include a camera having one or more lenses 812 and image sensors 814.
  • Image sensor 814 can include a color filter array (such as a Bayer filter), which can capture the light intensity and wavelength information of each imaging pixel of image sensor 814 and provide a set of raw image data that can be processed by ISP processor 840.
  • a sensor 820 such as a gyroscope, can provide acquired image processing parameters (such as anti-shake parameters) to the ISP processor 840 based on the sensor 820 interface type.
  • the sensor 820 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
  • image sensor 814 can also transmit raw image data to sensor 820, which can provide the raw image data to ISP processor 840 for processing based on the sensor 820 interface type; alternatively, sensor 820 can store the raw image data in image memory 830.
  • the ISP processor 840 processes the raw image data pixel by pixel in a variety of formats.
  • Each image pixel can have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 840 can perform one or more image processing operations on the raw image data and collect statistical information about the image data. The image processing operations can be performed with the same or different bit-depth precision.
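  • One common way to process raw pixels of differing native bit depths at a uniform precision is to scale them up to a single working depth. A minimal sketch (the 16-bit target and bit-replication scheme are assumptions for illustration, not taken from this application):

```python
def raw_to_16bit(pixel, bit_depth):
    """Scale a raw pixel from its native bit depth up to 16 bits.

    Left-shifting and replicating the high bits into the vacated low
    bits maps the full input range [0, 2**bit_depth - 1] onto the full
    16-bit range, so a maximal input becomes exactly 0xFFFF.
    """
    if bit_depth not in (8, 10, 12, 14):
        raise ValueError("unsupported raw bit depth")
    shift = 16 - bit_depth
    return (pixel << shift) | (pixel >> (bit_depth - shift))
```

  • Bit replication avoids the systematic darkening a plain left shift would cause, since a plain shift can never reach the top of the 16-bit range.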
  • ISP processor 840 can also receive pixel data from image memory 830.
  • sensor 820 interface sends raw image data to image memory 830, which is then provided to ISP processor 840 for processing.
  • Image memory 830 can be part of a memory device, a storage device, or a separate dedicated memory within an electronic device, and can include DMA (Direct Memory Access) features.
  • the ISP processor 840 can perform one or more image processing operations, such as time domain filtering.
  • the image data processed by the ISP processor 840 can be sent to the image memory 830 for additional processing before being displayed.
  • the ISP processor 840 receives the processed data from the image memory 830 and performs image data processing in the original domain and in the RGB and YCbCr color spaces.
  • the processed image data can be output to display 880 for viewing by a user and/or further processed by a graphics engine or a GPU (Graphics Processing Unit). Additionally, the output of ISP processor 840 can also be sent to image memory 830, and display 880 can read image data from image memory 830.
  • image memory 830 can be configured to implement one or more frame buffers. Additionally, the output of ISP processor 840 can be sent to encoder/decoder 870 for encoding/decoding image data. The encoded image data can be saved and decompressed before being displayed on the display 880 device.
  • the image data processed by the ISP processor 840 can be sent to the blurring module 860 to blur the image before it is displayed.
  • the blurring processing of the image data by the blurring module 860 may include acquiring the background blurring intensity according to the physical distance information, and performing blurring processing on the background region in the image data according to the background blurring intensity.
  • the image data after the blurring process can be sent to the encoder/decoder 870 to encode/decode the image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 880 device. It can be understood that the image data processed by the blurring module 860 can be directly sent to the display 880 for display without passing through the encoder/decoder 870.
  • the image data processed by the ISP processor 840 may also be processed by the encoder/decoder 870 and then processed by the blurring module 860.
  • the blurring module 860 or the encoder/decoder 870 may be a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) in the mobile terminal.
  • the statistics determined by the ISP processor 840 can be sent to the control logic 850 unit.
  • the statistics may include image sensor 814 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens 812 shading correction, and the like.
  • Control logic 850 can include a processor and/or a microcontroller that executes one or more routines (such as firmware), and the one or more routines can determine control parameters of imaging device 810 and control parameters of ISP processor 840 based on the received statistical data.
  • For example, the control parameters of imaging device 810 may include sensor 820 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 812 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters.
  • the ISP control parameters may include a gain level and color correction matrix for automatic white balance and color adjustment (eg, during RGB processing), and a lens 812 shading correction parameter.
  • the detecting the face area in the image to be processed and acquiring the physical distance information corresponding to the face area includes:
  • the obtaining a background blurring intensity according to the physical distance information, and performing a blurring process on the background area in the image to be processed according to the background blurring intensity includes:
  • the method further includes:
  • the portrait area corresponding to the face area is blurred according to the portrait blur intensity.
  • the method further includes:
  • the acquiring the portrait blurring intensity corresponding to each face region in the image to be processed includes:
  • the blurring processing on the portrait area corresponding to the face area according to the image blurring intensity includes:
  • the portrait area corresponding to the face blur area is blurred according to the portrait blur intensity.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The invention concerns an image blurring method comprising: acquiring an image to be processed; detecting a face region within the image to be processed, and obtaining physical distance information corresponding to the face region; acquiring a background blurring intensity according to the physical distance information, and blurring a background region of the image to be processed according to the background blurring intensity.
PCT/CN2018/099403 2017-08-09 2018-08-08 Procédé de floutage d'image, support d'informations lisible par ordinateur et dispositif informatique WO2019029573A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710676169.2 2017-08-09
CN201710676169.2A CN107704798B (zh) 2017-08-09 2017-08-09 图像虚化方法、装置、计算机可读存储介质和计算机设备

Publications (1)

Publication Number Publication Date
WO2019029573A1 true WO2019029573A1 (fr) 2019-02-14

Family

ID=61170965

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/099403 WO2019029573A1 (fr) 2017-08-09 2018-08-08 Procédé de floutage d'image, support d'informations lisible par ordinateur et dispositif informatique

Country Status (2)

Country Link
CN (1) CN107704798B (fr)
WO (1) WO2019029573A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107704798B (zh) * 2017-08-09 2020-06-12 Oppo广东移动通信有限公司 图像虚化方法、装置、计算机可读存储介质和计算机设备
CN110099251A (zh) * 2019-04-29 2019-08-06 努比亚技术有限公司 监控视频的处理方法、装置以及计算机可读存储介质
CN110971827B (zh) * 2019-12-09 2022-02-18 Oppo广东移动通信有限公司 人像模式拍摄方法、装置、终端设备和存储介质
CN112217992A (zh) * 2020-09-29 2021-01-12 Oppo(重庆)智能科技有限公司 图像虚化方法、图像虚化装置、移动终端及存储介质
CN113673474B (zh) * 2021-08-31 2024-01-12 Oppo广东移动通信有限公司 图像处理方法、装置、电子设备及计算机可读存储介质
CN115883958A (zh) * 2022-11-22 2023-03-31 荣耀终端有限公司 一种人像拍摄方法
CN118450067A (zh) * 2023-02-06 2024-08-06 Oppo广东移动通信有限公司 视频处理方法及装置、计算机可读介质和电子设备
CN117714893A (zh) * 2023-05-17 2024-03-15 荣耀终端有限公司 一种图像虚化处理方法、装置、电子设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017017585A (ja) * 2015-07-02 2017-01-19 オリンパス株式会社 撮像装置、画像処理方法
CN106875348A (zh) * 2016-12-30 2017-06-20 成都西纬科技有限公司 一种重对焦图像处理方法
CN106952222A (zh) * 2017-03-17 2017-07-14 成都通甲优博科技有限责任公司 一种交互式图像虚化方法及装置
CN107704798A (zh) * 2017-08-09 2018-02-16 广东欧珀移动通信有限公司 图像虚化方法、装置、计算机可读存储介质和计算机设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4752941B2 (ja) * 2009-03-31 2011-08-17 カシオ計算機株式会社 画像合成装置及びプログラム
JP5760727B2 (ja) * 2011-06-14 2015-08-12 リコーイメージング株式会社 画像処理装置および画像処理方法
CN103945118B (zh) * 2014-03-14 2017-06-20 华为技术有限公司 图像虚化方法、装置及电子设备
CN103973977B (zh) * 2014-04-15 2018-04-27 联想(北京)有限公司 一种预览界面的虚化处理方法、装置及电子设备
CN104333700B (zh) * 2014-11-28 2017-02-22 广东欧珀移动通信有限公司 一种图像虚化方法和图像虚化装置
CN105389801B (zh) * 2015-10-20 2018-09-21 厦门美图之家科技有限公司 人物轮廓设置方法、人物图像虚化方法、系统及拍摄终端
CN106331492B (zh) * 2016-08-29 2019-04-16 Oppo广东移动通信有限公司 一种图像处理方法及终端
CN106548185B (zh) * 2016-11-25 2019-05-24 三星电子(中国)研发中心 一种前景区域确定方法和装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017017585A (ja) * 2015-07-02 2017-01-19 オリンパス株式会社 撮像装置、画像処理方法
CN106875348A (zh) * 2016-12-30 2017-06-20 成都西纬科技有限公司 一种重对焦图像处理方法
CN106952222A (zh) * 2017-03-17 2017-07-14 成都通甲优博科技有限责任公司 一种交互式图像虚化方法及装置
CN107704798A (zh) * 2017-08-09 2018-02-16 广东欧珀移动通信有限公司 图像虚化方法、装置、计算机可读存储介质和计算机设备

Also Published As

Publication number Publication date
CN107704798B (zh) 2020-06-12
CN107704798A (zh) 2018-02-16

Similar Documents

Publication Publication Date Title
WO2019029573A1 (fr) Procédé de floutage d'image, support d'informations lisible par ordinateur et dispositif informatique
WO2019105154A1 (fr) Procédé, appareil et dispositif de traitement d'image
WO2020038028A1 (fr) Procédé de capture d'images de nuit, appareil, dispositif électronique et un support d'informations
KR102279436B1 (ko) 이미지 처리 방법, 장치 및 기기
WO2020038074A1 (fr) Procédé et appareil de commande d'exposition, et dispositif électronique
CN107481186B (zh) 图像处理方法、装置、计算机可读存储介质和计算机设备
CN107509031B (zh) 图像处理方法、装置、移动终端及计算机可读存储介质
US10805508B2 (en) Image processing method, and device
WO2020038087A1 (fr) Procédé et appareil de commande photographique dans un mode de scène de super nuit et dispositif électronique
CN109685853B (zh) 图像处理方法、装置、电子设备和计算机可读存储介质
CN107395991B (zh) 图像合成方法、装置、计算机可读存储介质和计算机设备
CN107563979B (zh) 图像处理方法、装置、计算机可读存储介质和计算机设备
WO2019105254A1 (fr) Procédé, appareil et dispositif de traitement de flou d'arrière-plan
WO2019011154A1 (fr) Procédé et appareil de traitement d'équilibrage de blancs
WO2019105260A1 (fr) Procédé, appareil et dispositif d'obtention de profondeur de champ
US11233948B2 (en) Exposure control method and device, and electronic device
CN109559352B (zh) 摄像头标定方法、装置、电子设备和计算机可读存储介质
CN109068060B (zh) 图像处理方法和装置、终端设备、计算机可读存储介质
CN113313626A (zh) 图像处理方法、装置、电子设备及存储介质
CN107454317B (zh) 图像处理方法、装置、计算机可读存储介质和计算机设备
CN107563329B (zh) 图像处理方法、装置、计算机可读存储介质和移动终端
CN107454335B (zh) 图像处理方法、装置、计算机可读存储介质和移动终端
WO2019019890A1 (fr) Procédé de traitement d'image, équipement informatique et support de stockage lisible par ordinateur
CN109584311B (zh) 摄像头标定方法、装置、电子设备和计算机可读存储介质
CN107295261B (zh) 图像去雾处理方法、装置、存储介质和移动终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18843872

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18843872

Country of ref document: EP

Kind code of ref document: A1