WO2019105254A1 - Background blur processing method, apparatus and device - Google Patents


Info

Publication number
WO2019105254A1
Authority
WO
WIPO (PCT)
Prior art keywords
brightness
area
background
background area
foreground
Prior art date
Application number
PCT/CN2018/116230
Other languages
English (en)
Chinese (zh)
Inventor
欧阳丹
谭国辉
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2019105254A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
        • H04N23/60 Control of cameras or camera modules
            • H04N23/67 Focus control based on electronic image sensor signals
        • H04N23/70 Circuitry for compensating brightness variation in the scene
            • H04N23/71 Circuitry for evaluating the brightness variation
        • H04N23/80 Camera processing pipelines; Components thereof
        • H04N23/95 Computational photography systems, e.g. light-field imaging systems
            • H04N23/958 Computational photography systems for extended depth of field imaging
                • H04N23/959 Computational photography systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • H04N5/00 Details of television systems
        • H04N5/222 Studio circuitry; Studio devices; Studio equipment
            • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
                • H04N5/2621 Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability

Definitions

  • the present application relates to the field of image processing technologies, and in particular, to a background blur processing method, apparatus, and device.
  • In the related art, the background area of a photograph is blurred to make the subject stand out. However, the subject in the image after the blurring process may still not be prominent: for example, if the photographed subject is located between the light source and the camera, the subject may be underexposed and backlit, its brightness very low and its details unclear, so that even after the background area is blurred the subject cannot be highlighted, and the visual effect of the processed image is poor.
  • For the above problem, the present application provides a background blur processing method, apparatus, and device to solve the prior-art technical problem that, due to the low brightness of the foreground area, the foreground area is not prominent after the background area is blurred.
  • An embodiment of the present application provides a background blur processing method, including: acquiring depth of field information of a main image according to the main image acquired by a main camera and a sub image acquired by a sub camera; determining a foreground area and a background area according to the focus area of the main image and the depth of field information; adjusting a brightness difference between the foreground area and the background area according to a preset policy; and, when it is detected that the brightness difference is adjusted to meet a preset condition, blurring the background area to generate a target image.
  • Another embodiment of the present application provides a background blur processing apparatus, including: a calculation module, configured to acquire depth of field information of a main image according to the main image acquired by a main camera and a sub image acquired by a sub camera; a determination module, configured to determine a foreground area and a background area according to the focus area of the main image and the depth of field information; an adjustment module, configured to adjust a brightness difference between the foreground area and the background area according to a preset policy; and a processing module, configured to blur the background area to generate a target image when it is detected that the brightness difference is adjusted to satisfy a preset condition.
  • A further embodiment of the present application provides a computer device including a memory and a processor, wherein the memory stores computer readable instructions, and when the instructions are executed by the processor, the processor performs the background blur processing method described in the above embodiments of the present application.
  • a further embodiment of the present application provides a non-transitory computer readable storage medium having a computer program stored thereon, which when executed by a processor, implements a background blurring processing method as described in the above embodiments of the present application.
  • FIG. 1 is a flow chart of a background blurring processing method according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a principle of triangulation according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a dual camera viewing angle coverage according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a process of calculating a depth of field by a dual camera according to an embodiment of the present application.
  • FIG. 5(a) is a schematic diagram of a portrait photographed image according to an embodiment of the present application.
  • FIG. 5(b) is a schematic diagram of an image after background blur processing according to the prior art.
  • FIG. 5(c) is a schematic diagram of an image after background blur processing according to an embodiment of the present application.
  • FIG. 6 is a flowchart of a background blur processing method according to another embodiment of the present application.
  • FIG. 7 is a flowchart of a background blurring processing method according to still another embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a background blurring processing apparatus according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a background blurring processing apparatus according to another embodiment of the present application.
  • FIG. 10 is a schematic diagram of an image processing circuit according to another embodiment of the present application.
  • FIG. 1 is a flowchart of a background blur processing method according to an embodiment of the present application. As shown in FIG. 1, the method includes:
  • Step 101 Acquire depth of field information of the main image according to the main image acquired by the main camera and the sub image acquired by the sub camera.
  • The depth of field is the range of spatial depth, in front of and behind the focus area where the subject is located, within which the human eye accepts the imaging as sharp.
  • The human eye resolves depth of field mainly through binocular vision. This is the same principle by which dual cameras resolve depth of field, relying mainly on triangulation as shown in FIG. 2.
  • In FIG. 2, the imaged object is drawn, as well as the positions O_R and O_T of the two cameras, and the focal planes of the two cameras. The focal plane is at a distance f from the plane in which the two cameras are located, and the two cameras image at the focal plane position, yielding two captured images.
  • P and P' are the positions of the same object in the two different captured images: the distance from point P to the left boundary of its captured image is X_R, and the distance from point P' to the left boundary of its captured image is X_T. O_R and O_T denote the two cameras, which lie in the same plane at a distance B from each other.
  • The distance Z between the object in FIG. 2 and the plane of the two cameras satisfies

      Z = B * f / d, where d = X_R - X_T.

  • Here d is the difference between the positions of the same object in the different captured images. Since B and f are constant, the distance Z of the object can be determined from d.
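The triangulation relation above can be sketched in a few lines of Python; the function name and the sample values are illustrative assumptions, not figures from the patent.

```python
def depth_from_disparity(baseline_b: float, focal_f: float, disparity_d: float) -> float:
    """Distance Z of an object from the camera plane, given baseline B,
    focal length f, and disparity d (all in consistent units): Z = B * f / d."""
    if disparity_d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return baseline_b * focal_f / disparity_d

# A nearer object produces a larger disparity and hence a smaller Z:
near = depth_from_disparity(0.02, 0.004, 2e-05)  # about 4.0 m
far = depth_from_disparity(0.02, 0.004, 1e-05)   # about 8.0 m
```

Because B and f are fixed by the camera geometry, the entire depth map reduces to one division per pixel of the disparity map.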
  • The above formula holds for two parallel cameras. In practice, the main camera is used to capture the main image actually used for imaging, while the sub image acquired by the sub camera is mainly used to calculate the depth of field. Based on the above analysis, the field of view (FOV) of the sub camera is generally larger than that of the main camera, but even so, as shown in FIG. 3, an object that is relatively close may still fall at different positions in the images acquired by the two cameras.
  • A map of the positional differences of corresponding points is calculated from the main image acquired by the main camera and the sub image acquired by the sub camera and is represented by a disparity map, which records the displacement of the same point between the two images; the depth of field of the main image is then obtained from this disparity map.
  • Step 102 Determine a foreground area and a background area according to the focus area and the depth information of the main image.
  • The range of clear imaging in front of the focus area is the foreground depth of field, and the area corresponding to the foreground depth of field is the foreground area; the range of clear imaging behind the focus area is the background depth of field, and the area corresponding to the background depth of field is the background area. The foreground area includes the subject image of the photograph.
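As a toy illustration of this split into foreground and background (not the patent's exact algorithm), the sketch below classifies each pixel of a depth map against hypothetical near and far depth-of-field limits around the focus area:

```python
def segment_by_depth(depth_map, near_limit, far_limit):
    """Split a 2-D depth map (row-major list of lists, depths in metres)
    into (foreground_mask, background_mask): pixels nearer than near_limit
    form the foreground, pixels farther than far_limit form the background,
    and everything in between belongs to the in-focus region."""
    foreground = [[d < near_limit for d in row] for row in depth_map]
    background = [[d > far_limit for d in row] for row in depth_map]
    return foreground, background

depths = [[0.5, 2.0],
          [2.5, 8.0]]
fg, bg = segment_by_depth(depths, near_limit=1.0, far_limit=3.0)
# fg -> [[True, False], [False, False]]
# bg -> [[False, False], [False, True]]
```

A production implementation would derive the two limits from the lens parameters and focus distance rather than fixing them by hand.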
  • Step 103 Adjust a brightness difference between the foreground area and the background area according to a preset policy.
  • Step 104 When it is detected that the brightness difference is adjusted to meet the preset condition, the background area is blurred to generate a target image.
  • the background area outside the focus area of the photographed subject is blurred.
  • When the photographing scene is a backlit scene, or the subject's light source is blocked, or the photograph is taken in the dark, blurring the background area may still fail to highlight the subject, so the subject is not prominent enough and the visual effect is poor.
  • As shown in FIG. 5(a), when the subject of the photograph is a person and the photographed scene is a backlit scene, the brightness of the person image remains low even after the background area is blurred, as shown in FIG. 5(b); the person cannot be highlighted and the visual effect is poor.
  • The reason the foreground area is not prominent after the background area is blurred is mainly twofold: on the one hand, the background area may be brighter than the foreground area, so that the low-brightness foreground area is still not prominent enough after the background is blurred; on the other hand, the brightness of both the background area and the foreground area may be low, so that after blurring it is difficult for the subject image in the foreground area to stand out in an image whose overall luminance is low.
  • In this embodiment, the brightness difference between the foreground area and the background area is adjusted according to a preset policy so that the foreground area is highlighted relative to the background area; when it is detected that the brightness difference meets the preset condition, the background area is blurred to generate a target image in which the foreground area is prominent and the visual effect is good.
  • In a possible implementation, the brightness difference adjustment is triggered only in scenes where the foreground brightness is low: the brightness of the foreground area is detected, and if it is lower than a preset first threshold, the brightness difference between the foreground area and the background area is adjusted according to the preset policy. The first threshold may be calibrated from a large amount of experimental data and is used to determine whether the brightness of the foreground area is low; it may also be set according to the user's personal preference.
  • Whether distant scenery is relevant to the user's current photographing needs generally depends on its distance: for example, when the user photographs a nearby flower, the farther away the scenery is, the less relevant it is to the current shot. To further meet the user's photographing needs, the degree of blurring may also be determined according to the depth information of the foreground area and the background area.
  • Specifically, the first depth information of the foreground area and the second depth information of the background area are calculated according to the focus area and the depth information of the main image, and a base value of the degree of blurring is obtained from the first and second depth information. For example, the smaller the first depth information and the larger the second depth information, the less relevant the background is to the current photographing scene, so the base value of the degree of blurring is larger; conversely, the larger the first depth information and the smaller the second depth information, the more likely the background is relevant to the current photographing scene, so the base value is smaller. The background area is then Gaussian-blurred according to this base value to generate the target image, so that the blurring of the background area matches the current photographing requirement of the user.
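The relationship between the two depths and the base degree of blurring described above might be sketched as follows; the formula and the constant k are illustrative assumptions, not the patent's calibration.

```python
def blur_base_value(first_depth: float, second_depth: float, k: float = 1.0) -> float:
    """Base degree of blurring: grows when the foreground is near (small
    first_depth) and the background is far (large second_depth), matching
    the qualitative rule in the text. k is a hypothetical scaling constant."""
    return k * second_depth / max(first_depth, 1e-06)

# Near flower against distant scenery -> strong blur; both near -> mild blur.
strong = blur_base_value(first_depth=0.5, second_depth=10.0)  # 20.0
mild = blur_base_value(first_depth=2.0, second_depth=3.0)     # 1.5
```

In practice the base value would feed the sigma (or kernel size) of the Gaussian blur applied to the background area.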
  • The preset condition is a target brightness difference between the foreground area and the background area under the current preset strategy. This brightness difference may be calibrated from a large amount of experimental data, or set according to the user's personal preference, such that once it is reached, the foreground area can stand out after the background area is blurred.
  • In an embodiment of the present application, the preset strategy includes increasing the brightness of the foreground area when the background area is not bright. That is, when the brightness of the background area is low, the brightness of the entire image is low; even if the brightness of the background area were reduced to shrink the brightness difference between the background area and the foreground area, the foreground area might still be insufficiently highlighted in the blurred image because of its own low brightness. Therefore, in this scenario, the preset strategy is to increase the brightness of the foreground area.
  • step 103 includes:
  • Step 201 Detect the brightness of the background area.
  • Step 202 If it is determined that the brightness of the background area is less than or equal to a preset second threshold and greater than the first threshold, wherein the second threshold is greater than the first threshold, the first adjusted brightness of the foreground area is calculated according to a preset algorithm.
  • the second threshold may be calibrated according to a large amount of experimental data, or may be calibrated according to a user's personal preference, and the second threshold is used to determine whether the background area is bright.
  • Step 203 Increase the brightness of the foreground area according to the first adjusted brightness.
  • In this embodiment, the first adjusted brightness of the foreground area is calculated according to a preset algorithm, and the brightness of the foreground area is increased to the first adjusted brightness; thereby the brightness of the foreground area is raised, the brightness difference between the background area and the foreground area is reduced, and the foreground area is highlighted in the target image generated by blurring the background area.
  • the foregoing preset algorithm for adjusting the brightness of the foreground area may have different implementation manners in different application scenarios.
  • As a possible implementation, the first depth information of the foreground area and the second depth information of the background area are calculated from the focus area and the depth information of the main image, and the depth-of-field ratio of the first depth information to the second depth information is computed: the larger this ratio, the closer the foreground is to the background; the smaller the ratio, the farther the foreground is from the background. The brightness of the foreground area and the depth-of-field ratio are then combined according to a preset algorithm to obtain the first adjusted brightness of the foreground area.
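One possible shape for such a preset algorithm, sketched under the assumption that the brightness gain grows with the depth-of-field ratio; the gain rule itself is an illustration, not the patent's formula.

```python
def first_adjusted_brightness(fg_brightness: float,
                              first_depth: float,
                              second_depth: float,
                              max_brightness: float = 255.0) -> float:
    """Brighten the foreground by a gain derived from the depth-of-field
    ratio: a larger ratio (foreground and background close together) yields
    a larger gain here. The linear gain rule is a hypothetical choice."""
    ratio = first_depth / max(second_depth, 1e-06)
    gain = 1.0 + ratio
    return min(fg_brightness * gain, max_brightness)

# Foreground at 2 m, background at 4 m, foreground brightness 80:
# ratio = 0.5, gain = 1.5, adjusted brightness 120.0 (clamped at 255).
adjusted = first_adjusted_brightness(80.0, 2.0, 4.0)
```

Clamping at the maximum representable brightness keeps the adjustment from blowing out highlights in the foreground.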
  • In another embodiment of the present application, the preset strategy includes reducing the brightness of the background area when the background area is bright. That is, when the brightness of the background area is high, the brightness of the entire image is high, and the reason the foreground area is not highlighted after the background is blurred may be that the background area is too bright. By reducing the brightness of the background area, the brightness difference between the background area and the foreground area is reduced, and the foreground area can be highlighted in the blurred image. Therefore, in this scenario, the preset strategy is to reduce the brightness of the background area.
  • step 103 includes:
  • Step 301 Detect the brightness of the background area.
  • Step 302 If it is determined that the brightness of the background area is greater than the second threshold, calculate the second adjusted brightness of the background area according to a preset algorithm.
  • Step 303 Reduce the brightness of the background area according to the second adjusted brightness.
  • In this embodiment, the second adjusted brightness of the background area is calculated according to a preset algorithm, and the brightness of the background area is reduced to the second adjusted brightness; thereby the brightness of the background area is lowered, the luminance difference between the background area and the foreground area is reduced, and the foreground area is more prominent in the target image generated by blurring the background area.
  • The foregoing preset algorithm for adjusting the brightness of the background area may have different implementation manners in different application scenarios.
  • As a possible implementation, the brightness of the background area may be adjusted according to both the depth information and the brightness of the background area. In this implementation, to ensure the blur effect, the larger the depth information of the background area, i.e. the farther the background is from the currently photographed foreground, the greater the degree of brightness adjustment applied to the background area; likewise, the higher the brightness of the background area, the more easily it causes the foreground area to be insufficiently prominent, so the greater the degree of brightness adjustment applied.
  • Specifically, a preset first adjustment factor corresponding to the depth information of the background area may be queried, where, as analyzed above, the larger the depth information of the background area, the larger the value of the first adjustment factor; similarly, the higher the brightness of the background area, the larger the value of the corresponding second adjustment factor. The second adjusted brightness of the background area is then obtained by combining the brightness of the background area, the first adjustment factor, and the second adjustment factor according to a preset algorithm.
  • the preset algorithm may be different in different application scenarios.
  • As a possible implementation, the preset algorithm may include calculating the second adjusted brightness from the background brightness, the first adjustment factor, and the second adjustment factor according to their respective weight values, where the weight values can be calibrated from experimental data.
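A hedged sketch of such a weighted combination; the weight values and the subtractive form are placeholder assumptions standing in for the experimentally calibrated ones.

```python
def second_adjusted_brightness(bg_brightness: float,
                               first_factor: float,
                               second_factor: float,
                               weights=(1.0, 20.0, 20.0)) -> float:
    """Weighted combination: start from the background brightness and
    subtract contributions scaled by the depth adjustment factor and the
    brightness adjustment factor. Larger factors -> stronger dimming."""
    w_b, w_1, w_2 = weights
    adjusted = w_b * bg_brightness - w_1 * first_factor - w_2 * second_factor
    return max(adjusted, 0.0)

# A bright, distant background (both factors large) is dimmed strongly:
# 200 - 20*2.0 - 20*1.5 = 130.0
dimmed = second_adjusted_brightness(200.0, 2.0, 1.5)
```

Clamping at zero prevents the subtraction from producing a negative brightness for extreme factor values.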
  • The background blur processing method of the embodiment of the present application does not directly blur the background area, but first adjusts the brightness difference between the background area and the foreground area so that the foreground area is initially highlighted, and then blurs the background area, so that the foreground area is prominent in the blurred target image.
  • After the brightness difference is adjusted, the background area is blurred, so that the foreground area in the target image can be made more prominent.
  • In summary, the background blur processing method of the embodiment of the present application calculates the depth of field information of the main image according to the main image acquired by the main camera and the sub image acquired by the sub camera, and determines the foreground area and the background area according to the focus area and the depth information of the main image. If the brightness of the foreground area is detected to be lower than the preset first threshold, the brightness difference between the foreground area and the background area is adjusted according to the preset policy, and when the brightness difference is detected to meet the preset condition, the background area is blurred to generate a target image.
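As a caricature of steps 103 and 104 above, the toy model below treats each region as a single average brightness; the thresholds, step size, and function name are hypothetical, and depth estimation and segmentation are out of scope here.

```python
def adjust_then_decide(fg_brightness: float,
                       bg_brightness: float,
                       first_threshold: float = 60.0,
                       preset_diff: float = 30.0,
                       step: float = 5.0):
    """Toy model of steps 103-104: if the foreground is dim, raise its
    brightness until the foreground/background difference reaches the
    preset condition, then report whether the background may be blurred."""
    if fg_brightness < first_threshold:
        while fg_brightness - bg_brightness < preset_diff and fg_brightness < 255.0:
            fg_brightness = min(fg_brightness + step, 255.0)
    ok_to_blur = fg_brightness - bg_brightness >= preset_diff
    return fg_brightness, ok_to_blur

# A dim foreground (40) against a background (50) is raised to 80 so the
# 30-level difference is met, and blurring may proceed.
result = adjust_then_decide(40.0, 50.0)  # (80.0, True)
```

A real implementation would of course adjust per-pixel brightness (e.g. in the Y channel) rather than a single scalar, but the control flow is the same.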
  • FIG. 8 is a schematic structural diagram of a background blur processing device according to an embodiment of the present application.
  • The background blur processing apparatus includes a calculation module 100, a determination module 200, an adjustment module 300, and a processing module 400.
  • the calculation module 100 is configured to acquire depth information of the main image according to the main image acquired by the main camera and the sub image acquired by the sub camera.
  • the determining module 200 is configured to determine the foreground area and the background area according to the focus area and the depth information of the main image.
  • the adjustment module 300 is configured to adjust a brightness difference between the foreground area and the background area according to a preset policy.
  • FIG. 9 is a schematic structural diagram of a background blur processing apparatus according to another embodiment of the present application.
  • The adjustment module 300 includes a detecting unit 310, a calculating unit 320, and an adjusting unit 330.
  • the detecting unit 310 is configured to detect the brightness of the background area.
  • the calculating unit 320 is configured to: when it is determined that the brightness of the background area is less than or equal to a preset second threshold and greater than the first threshold, wherein the second threshold is greater than the first threshold, calculate the first adjusted brightness of the foreground area according to a preset algorithm .
  • the adjusting unit 330 is configured to increase the brightness of the foreground area according to the first adjusted brightness.
  • the processing module 400 is configured to perform a blurring process on the background area to generate a target image when the brightness difference is adjusted to meet the preset condition.
  • The division of the modules in the background blur processing apparatus is for illustrative purposes only. In other embodiments, the apparatus may be divided into different modules as needed to complete all or part of the functions of the background blur processing apparatus.
  • The background blur processing apparatus of the embodiment of the present application calculates the depth information of the main image according to the main image acquired by the main camera and the sub image acquired by the sub camera, determines the foreground area and the background area according to the focus area and the depth information of the main image, adjusts the brightness difference between the foreground area and the background area according to the preset policy, and, when the brightness difference is adjusted to meet the preset condition, blurs the background area to generate the target image.
  • The present application further provides a computer device, wherein the computer device includes an image processing circuit. The image processing circuit can be implemented by hardware and/or software components, and can include various processing units defining an ISP (Image Signal Processing) pipeline.
  • Figure 10 is a schematic illustration of an image processing circuit in one embodiment. As shown in FIG. 10, for convenience of explanation, only various aspects of the image processing technique related to the embodiment of the present application are shown.
  • the image processing circuit includes an ISP processor 640 and a control logic 650.
  • the image data captured by imaging device 610 is first processed by ISP processor 640, which analyzes the image data to capture image statistical information that can be used to determine and/or control one or more control parameters of imaging device 610.
  • Imaging device 610 can include a camera having one or more lenses 612 and image sensors 614.
  • Image sensor 614 may include a color filter array (such as a Bayer filter), and may acquire the light intensity and wavelength information captured by each imaging pixel of image sensor 614 and provide a set of raw image data that may be processed by ISP processor 640.
  • Sensor 620 can provide raw image data to ISP processor 640 based on sensor 620 interface type.
  • the sensor 620 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
  • the ISP processor 640 processes the raw image data pixel by pixel in a variety of formats.
  • each image pixel can have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 640 can perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Among them, image processing operations can be performed with the same or different bit depth precision.
  • ISP processor 640 can also receive pixel data from image memory 630. For example, raw pixel data is sent from the sensor 620 interface to image memory 630, which is then provided to ISP processor 640 for processing.
  • Image memory 630 can be part of a memory device, a storage device, or a separate dedicated memory within an electronic device, and can include DMA (Direct Memory Access) features.
  • ISP processor 640 can perform one or more image processing operations, such as time domain filtering.
  • the processed image data can be sent to image memory 630 for additional processing before being displayed.
  • Upon receiving the processed data from image memory 630, ISP processor 640 performs image data processing in the raw domain and in the RGB and YCbCr color spaces.
  • the processed image data can be output to display 670 for viewing by a user and/or further processed by a graphics engine or GPU (Graphics Processing Unit). Additionally, the output of ISP processor 640 can also be sent to image memory 630, and display 670 can read image data from image memory 630.
  • image memory 630 can be configured to implement one or more frame buffers.
  • The output of ISP processor 640 can also be sent to encoder/decoder 660 to encode/decode the image data.
  • The encoded image data can be saved, and decompressed before being displayed on the display 670.
  • Encoder/decoder 660 can be implemented by a CPU or GPU or coprocessor.
  • the statistics determined by the ISP processor 640 can be sent to the control logic 650 unit.
  • the statistics may include image sensor 614 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens 612 shading correction, and the like.
  • Control logic 650 can include a processor and/or a microcontroller that executes one or more routines (such as firmware) that determine control parameters of imaging device 610 and ISP control parameters based on the received statistical data.
  • the control parameters may include sensor 620 control parameters (eg, gain, integration time for exposure control), camera flash control parameters, lens 612 control parameters (eg, focus or zoom focal length), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), as well as lens 612 shading correction parameters.
  • When the brightness difference is adjusted to meet the preset condition, the background area is blurred to generate a target image.
  • the present application also proposes a non-transitory computer readable storage medium, which enables execution of a background blurring processing method as described in the above embodiments when instructions in the storage medium are executed by a processor .
  • FIG. 11 is a schematic diagram of an image processing circuit as a possible implementation. For ease of explanation, only the various aspects related to the embodiments of the present application are shown.
  • the image processing circuit specifically includes: a photographing unit 11 and a processing unit 12, wherein the photographing unit 11 includes a main camera 111 and a sub-camera 112.
  • the photographing unit 11 is configured to acquire a main image according to the main camera 111 and acquire a sub image according to the sub camera 112;
  • The processing unit 12 is configured to acquire depth of field information of the main image according to the main image acquired by the main camera and the sub image acquired by the sub camera, determine the foreground area and the background area according to the focus area of the main image and the depth information, and adjust the brightness difference between the foreground area and the background area according to a preset policy; when detecting that the brightness difference is adjusted to meet a preset condition, the background area is blurred to generate a target image.
  • As a possible implementation, the processing unit 12 includes an image signal processing (ISP) processor 121 configured to detect the brightness of the foreground area and, if the brightness of the foreground area is lower than a preset first threshold, adjust the brightness difference between the foreground area and the background area according to a preset policy.
  • the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
  • features defined by "first" or "second" may explicitly or implicitly include at least one such feature.
  • the meaning of "a plurality" is at least two, such as two, three, etc., unless specifically defined otherwise.
  • a "computer-readable medium” can be any apparatus that can contain, store, communicate, propagate, or transport a program for use in an instruction execution system, apparatus, or device, or in conjunction with the instruction execution system, apparatus, or device.
  • computer readable media include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
  • the computer readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or, if necessary, otherwise processing it in a suitable manner, and then stored in a computer memory.
  • portions of the application can be implemented in hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, the steps or methods can be implemented by any one or a combination of the following techniques well known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer readable storage medium.
  • the above mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like. While the embodiments of the present application have been shown and described above, it is understood that the above-described embodiments are illustrative and are not to be construed as limiting the scope of the present application; those of ordinary skill in the art may make variations, modifications, substitutions and alterations to the embodiments within the scope of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a background blurring processing method, apparatus and device, the method comprising: obtaining depth-of-field information of a main image according to the main image acquired by a main camera and a sub image acquired by a sub camera; determining a foreground area and a background area according to an in-focus area of the main image and the depth-of-field information; adjusting the brightness difference between the foreground area and the background area according to a preset strategy; and, when it is detected that the brightness difference has been adjusted to meet a preset condition, blurring the background area to generate a target image. The method thus solves the technical problem in the existing technology that the foreground area is not prominent enough after the background area is blurred: by adjusting the brightness difference between the background area and the foreground area, the foreground area is made more prominent after blurring, thereby improving the blurring effect.
PCT/CN2018/116230 2017-11-30 2018-11-19 Background blurring processing method, apparatus and device WO2019105254A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711242157.5 2017-11-30
CN201711242157.5A CN108024057B (zh) Background blurring processing method, device and apparatus

Publications (1)

Publication Number Publication Date
WO2019105254A1 true WO2019105254A1 (fr) 2019-06-06

Family

ID=62077779

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/116230 2017-11-30 2018-11-19 Background blurring processing method, apparatus and device WO2019105254A1 (fr)

Country Status (2)

Country Link
CN (1) CN108024057B (fr)
WO (1) WO2019105254A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114286004A (zh) * 2021-12-28 2022-04-05 Vivo Mobile Communication Co., Ltd. Focusing method, photographing apparatus, electronic device and medium
CN114859581A (zh) * 2022-03-24 2022-08-05 BOE Technology Group Co., Ltd. Backlight testing apparatus and backlight testing method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108024057B (zh) * 2017-11-30 2020-01-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Background blurring processing method, device and apparatus
KR102525000B1 (ko) 2018-08-08 2023-04-24 Samsung Electronics Co., Ltd. Electronic device for blurring an image obtained by combining a plurality of images based on depth information, and method of driving the electronic device
CN110198421B (zh) * 2019-06-17 2021-08-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Video processing method and related products
CN110677621B (zh) * 2019-09-03 2021-04-13 RealMe Chongqing Mobile Communications Co., Ltd. Camera calling method and apparatus, storage medium and electronic device
CN116668804B (zh) * 2023-06-14 2023-12-22 Shandong Henghui Software Co., Ltd. Video image analysis and processing method, device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10233919A (ja) * 1997-02-21 1998-09-02 Fuji Photo Film Co Ltd Image processing apparatus
CN101527773A (zh) * 2008-03-05 2009-09-09 Semiconductor Energy Laboratory Co., Ltd. Image processing method, image processing system, and computer program
CN105574866A (zh) * 2015-12-15 2016-05-11 Nubia Technology Co., Ltd. Method and apparatus for implementing image processing
CN106060423A (zh) * 2016-06-02 2016-10-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Blurred photo generation method, apparatus and mobile terminal
CN108024057A (zh) * 2017-11-30 2018-05-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Background blurring processing method, device and apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6157733A (en) * 1997-04-18 2000-12-05 At&T Corp. Integration of monocular cues to improve depth perception
JP3591575B2 (ja) * 1998-12-28 2004-11-24 Hitachi Software Engineering Co., Ltd. Image composition apparatus and image composition method
US7663689B2 (en) * 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
CN105025229B (zh) * 2015-07-30 2018-06-29 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for adjusting photo brightness and related apparatus
CN106993112B (zh) * 2017-03-09 2020-01-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Depth-of-field-based background blurring method and apparatus, and electronic apparatus
CN107357500A (zh) * 2017-06-21 2017-11-17 Nubia Technology Co., Ltd. Picture adjustment method, terminal and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10233919A (ja) * 1997-02-21 1998-09-02 Fuji Photo Film Co Ltd Image processing apparatus
CN101527773A (zh) * 2008-03-05 2009-09-09 Semiconductor Energy Laboratory Co., Ltd. Image processing method, image processing system, and computer program
CN105574866A (zh) * 2015-12-15 2016-05-11 Nubia Technology Co., Ltd. Method and apparatus for implementing image processing
CN106060423A (zh) * 2016-06-02 2016-10-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Blurred photo generation method, apparatus and mobile terminal
CN108024057A (zh) * 2017-11-30 2018-05-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Background blurring processing method, device and apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114286004A (zh) * 2021-12-28 2022-04-05 Vivo Mobile Communication Co., Ltd. Focusing method, photographing apparatus, electronic device and medium
CN114859581A (zh) * 2022-03-24 2022-08-05 BOE Technology Group Co., Ltd. Backlight testing apparatus and backlight testing method
CN114859581B (zh) * 2022-03-24 2023-10-24 BOE Technology Group Co., Ltd. Backlight testing apparatus and backlight testing method

Also Published As

Publication number Publication date
CN108024057B (zh) 2020-01-10
CN108024057A (zh) 2018-05-11

Similar Documents

Publication Publication Date Title
WO2019105262A1 (fr) Background blurring processing method, apparatus and device
US10997696B2 (en) Image processing method, apparatus and device
US10878539B2 (en) Image-processing method, apparatus and device
US10757312B2 (en) Method for image-processing and mobile terminal using dual cameras
JP6968992B2 (ja) Method for dual-camera-based imaging, mobile terminal, and storage medium
WO2019105254A1 (fr) Background blurring processing method, apparatus and device
JP7145208B2 (ja) Method and apparatus for dual-camera-based imaging, and storage medium
CN107945105B (zh) Background blurring processing method, device and apparatus
US10825146B2 (en) Method and device for image processing
US10805508B2 (en) Image processing method, and device
WO2019109805A1 (fr) Image processing method and device
WO2019105261A1 (fr) Background blurring method and apparatus, and device
WO2019011147A1 (fr) Method and apparatus for processing human face region in backlight scene
JP6999802B2 (ja) Method and apparatus for double-camera-based imaging
CN108053438B (zh) Depth of field acquisition method, apparatus and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18883505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18883505

Country of ref document: EP

Kind code of ref document: A1