WO2022213332A1 - Method for bokeh processing, electronic device and computer-readable storage medium


Info

Publication number: WO2022213332A1
Authority: WO (WIPO (PCT))
Prior art keywords: bokeh, image, size, camera parameters, map
Application number: PCT/CN2021/086031
Other languages: French (fr)
Inventors: Takuya Oi, Jun Luo
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2021/086031 (WO2022213332A1)
Priority to CN202180096376.5A (CN117178286A)
Publication of WO2022213332A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/70 - Denoising; Smoothing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 - Computational photography systems, e.g. light-field imaging systems
    • H04N23/958 - Computational photography systems for extended depth of field imaging
    • H04N23/959 - Computational photography systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20004 - Adaptive image processing
    • G06T2207/20012 - Locally adaptive
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 - Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N23/662 - Transmitting camera control signals through networks by using master/slave camera arrangements for affecting the control of camera image capture, e.g. placing the camera in a desirable condition to capture a desired image



Abstract

Disclosed is a method for bokeh processing. The method includes acquiring DSLR camera parameters including a focal length and an F-number, acquiring an image, a focus distance and a depth map which corresponds to the image, obtaining a bokeh size, for each pixel of the image, based on the DSLR camera parameters, the focus distance and the depth map to generate a bokeh size map, and performing bokeh processing on the image based on the bokeh size map.

Description

METHOD FOR BOKEH PROCESSING, ELECTRONIC DEVICE AND COMPUTER-READABLE STORAGE MEDIUM TECHNICAL FIELD
The present disclosure relates to a method for bokeh processing, in particular, to a method for accurately reproducing high quality bokeh equivalent to bokeh created by DSLR cameras, an electronic device performing the method, and a computer-readable storage medium storing a program to implement the method.
BACKGROUND
In recent years, techniques for artificially generating an image with bokeh have been widely used. In such an image, a subject such as a person should be clearly displayed, while the background, such as buildings or sky, should be blurred. For example, the bokeh size/intensity is set to 0 in the area occupied by the subject, and the bokeh sizes of the other areas increase as the distance from the subject increases.
The size/area of an image sensor in an electronic device such as a smartphone is smaller than that of a Digital Single Lens Reflex (DSLR) camera, which has a large sensor such as a 35mm sensor. Due to the small sensor size, the bokeh size generated from an image taken with the electronic device is unavoidably smaller than that of an image taken with a DSLR camera. As a result, an image with bokeh generated by the electronic device may not look natural. To improve the quality of an image with bokeh, it is known to use the depth values of a depth map to determine the bokeh size. However, even with this method, it is difficult to obtain an image with the same bokeh as an image taken with a DSLR camera.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide a method for bokeh processing and an electronic device implementing such a method.
In accordance with the present disclosure, a method for bokeh processing includes acquiring DSLR camera parameters including a focal length (f) and an F-number (A) , acquiring an image, a focus distance (D) and a depth map which corresponds to the image, obtaining a bokeh size, for each pixel of the image, based on the DSLR camera parameters, the focus distance and the depth map to generate a bokeh size map, and performing bokeh processing on the image based on the bokeh size map.
In accordance with the present disclosure, an electronic device includes a processor and a memory for storing instructions. The instructions, when executed by the processor, cause the processor to perform the method according to the present disclosure.
In accordance with the present disclosure, a computer-readable storage medium, on which a computer program is stored, is provided. The computer program is executed by a computer to implement the method according to the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings below.
FIG. 1A is a functional block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
FIG. 1B is a functional block diagram of an image signal processor in the electronic device according to an embodiment of the present disclosure.
FIG. 2 is a flowchart for generating an image with bokeh according to an embodiment of the present disclosure.
FIG. 3 is an example of a User Interface for inputting DSLR camera parameters.
FIG. 4A shows an example of an image captured by a camera module.
FIG. 4B shows an example of a depth map corresponding to the image shown in FIG. 4A.
FIG. 5 shows an example of a bokeh size map generated by using a method according to an embodiment of the present disclosure.
FIG. 6 shows an example of an image with bokeh generated by using a method according to an embodiment of the present disclosure.
FIG. 7 is a diagram for explaining the emphasis of bokeh based on the DSLR camera parameters.
FIG. 8 is a diagram for explaining the change of bokeh size in a DoF range.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory and aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
<Electronic device 100>
An electronic device 100 will be described with reference to FIG. 1A. FIG. 1A is a functional block diagram illustrating an example of a configuration of the electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 is a mobile device such as a smartphone, a tablet terminal or a mobile phone, but may be another type of electronic device equipped with one or more camera modules.
As shown in FIG. 1A, the electronic device 100 includes a stereo camera module 10, a range sensor module 20, an image signal processor 30, a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
The stereo camera module 10 includes a master camera module 11 and a slave camera module 12 to be used for binocular stereo viewing, as shown in FIG. 1A. The camera module 10 may shoot a video at a given frame rate.
The master camera module 11 includes a first lens 11a that is capable of focusing on a subject, a first image sensor 11b that detects an image inputted via the first lens 11a, and a first image sensor driver 11c that drives the first image sensor 11b, as shown in FIG. 1A.
The slave camera module 12 includes a second lens 12a that is capable of focusing on a subject, a second image sensor 12b that detects an image inputted via the second lens 12a, and a second image sensor driver 12c that drives the second image sensor 12b, as shown in FIG. 1A.
The master camera module 11 captures a master camera image. The slave camera module 12 captures a slave camera image. The master camera image and the slave camera image may be a color image such as an RGB image, or a monochrome image.
A depth map can be generated from the master camera image and the slave camera image by means of a stereo matching technique. Specifically, an amount of parallax is calculated for each pair of corresponding pixels of the stereo image (i.e., the master camera image and the slave camera image). The depth value increases as the amount of parallax increases. The depth map includes a depth value for each pixel in the image.
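As a rough sketch of this step, a disparity map of the kind described here can be computed with a block-matching stereo algorithm. The use of OpenCV's StereoBM below is an assumption of this example, not something specified in the disclosure, and it expects a rectified greyscale stereo pair:

```python
import cv2
import numpy as np

def disparity_depth_map(master_gray: np.ndarray, slave_gray: np.ndarray) -> np.ndarray:
    """Estimate a disparity-style depth map from a rectified greyscale stereo pair.

    Larger values mean larger parallax, i.e. closer objects, which matches
    the convention described above (the depth value grows with parallax).
    """
    # Block matcher; numDisparities must be a positive multiple of 16.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # compute() returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(master_gray, slave_gray).astype(np.float32) / 16.0
    return np.clip(disparity, 0.0, None)  # negative values mark unmatched pixels
```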
The range sensor module 20 captures a depth map. For example, the range sensor module 20 is a ToF camera and captures a time-of-flight depth map (a ToF depth map) by emitting pulsed light toward a subject and detecting light reflected from the subject. The ToF depth map indicates an actual distance between the electronic device 100 and the subject. Optionally, the range sensor module can be omitted.
The image signal processor (ISP) 30 controls the master camera module 11, the slave camera module 12 and the range sensor module 20. The ISP 30 also performs image processing on an image captured by the stereo camera module 10. Specifically, the ISP 30 acquires an original image from the camera module 10. The original image is either a master camera image or a slave camera image. The ISP 30 performs bokeh processing on the original image to generate an image with bokeh. The image with bokeh can be generated based on the original image and a bokeh size map which will be described later.
When an image is captured by the camera module 10, the ISP 30 acquires an autofocus area in the captured image from the stereo camera module 10. The autofocus area indicates an in-focus area. For example, the autofocus area is displayed as an AF rectangle. The autofocus area is obtained by an autofocus operation of the camera module 10.
The ISP 30 acquires a focus distance or a subject distance. The focus distance is a distance between a subject and the camera module 10. More precisely, the focus distance indicates a distance between the focused plane and a principal point of the lens 11a (12a).
The ISP 30 may calculate the focus distance based on the depth map and the autofocus area. For example, the focus distance is acquired by calculating a representative value of the depth values in the autofocus area. The representative value may be a mean value, a median value or a quartile value (e.g., the first or third quartile).
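A minimal sketch of this calculation, assuming the autofocus area is given as a pixel rectangle and the median is chosen as the representative value:

```python
import numpy as np

def focus_distance(depth_map: np.ndarray, af_rect: tuple) -> float:
    """Representative depth value inside the autofocus rectangle (x, y, w, h)."""
    x, y, w, h = af_rect
    af_depths = depth_map[y:y + h, x:x + w]
    # The median is robust against stray depth values at object boundaries.
    return float(np.median(af_depths))
```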
The GNSS module 40 measures a current position of the electronic device 100. The wireless communication module 41 performs wireless communications with the Internet. The CODEC 42 bi-directionally performs encoding and decoding, using a predetermined encoding/decoding method. The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42. The microphone 44 outputs sound data to the CODEC 42 based on inputted sound.
The display module 45 displays various information such as an image captured by the camera module 10 in real-time, a User Interface (UI) , and an image with bokeh generated by the ISP 30.
The input module 46 inputs information via a user's operation. The input module 46 is, for example, a touch panel or a keyboard. The input module 46 inputs an instruction to capture and store an image displayed on the display module 45. Further, the input module 46 inputs DSLR camera parameters (described below) selected by the user.
The IMU 47 detects the angular velocity and the acceleration of the electronic device 100. The posture of the electronic device 100 can be determined from the measurement results of the IMU 47.
The main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
The memory 49 stores data of the image, data of depth map, various camera parameters to be used in image processing, and a program which runs on the image signal processor 30 and/or the main processor 48.
Next, the ISP 30 is described in detail with reference to FIG. 1B.
The ISP 30 includes a first acquiring unit 31, a second acquiring unit 32, an obtaining unit 33 and a performing unit 34.
The first acquiring unit 31 is configured to acquire DSLR camera parameters including a focal length and an F-number. The DSLR camera parameters are camera parameters of Digital Single Lens Reflex cameras; such camera parameters include the type of image sensor and the type of lens. The DSLR camera parameters are different from the camera parameters of the camera module 10 installed on the electronic device 100.
The DSLR camera parameters include at least a focal length and an F-number. The DSLR camera parameters may further include a size and/or a resolution of a DSLR image sensor. The DSLR image sensor is an image sensor of a DSLR camera.
The second acquiring unit 32 is configured to acquire an image, a focus distance and a depth map. The depth map corresponds to the acquired image.
The obtaining unit 33 is configured to obtain a bokeh size, for each pixel of the image, based on the DSLR camera parameters, the focus distance and the depth map to generate a bokeh size map.
The performing unit 34 is configured to perform bokeh processing on the image based on the bokeh size map.
<Method for bokeh processing>
A method for bokeh processing according to an embodiment of the present disclosure will be described with reference to a flowchart shown in FIG. 2.
In the step S1, the display module 45 displays a User Interface (UI) that allows a user to input DSLR camera parameters.
FIG. 3 illustrates an example of the UI displayed by the display module 45. The user can input the DSLR camera parameters via the UI. In this example, the user can select a type of the DSLR image sensor, i.e., “645”, “35mm” or “APS-C”. The user can also select a lens type, i.e., the focal length (20mm to 100mm) and the F-number (F1.4 to F11). After selecting the DSLR camera parameters, the user taps a “Save” button to save the selected parameters in the memory 49.
Optionally, the UI that allows a user to select a resolution of the DSLR image sensor from candidates may be displayed.
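For illustration, the selected sensor type can be mapped to physical dimensions before any computation; the sizes below are typical values for these formats and are an assumption of this sketch rather than values given in the disclosure:

```python
# Typical sensor dimensions in millimetres (width, height); assumed for illustration.
DSLR_SENSOR_SIZES_MM = {
    "645": (56.0, 41.5),    # medium format
    "35mm": (36.0, 24.0),   # full frame
    "APS-C": (23.6, 15.6),
}

def sensor_area_mm2(sensor_type: str) -> float:
    """Sensor area S, used later to derive the pixel pitch."""
    width, height = DSLR_SENSOR_SIZES_MM[sensor_type]
    return width * height
```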
In the step S2, the first acquiring unit 31 of the ISP 30 acquires the DSLR camera parameters inputted by the user in the step S1. Specifically, the first acquiring unit 31 reads the inputted DSLR camera parameters from the memory 49.
In the step S3, the second acquiring unit 32 of the ISP 30 acquires an image (i.e., original image) , a focus distance, and a depth map which corresponds to the original image. Specifically, when the user takes a photo/video with the electronic device 100, the stereo camera module 10 captures a master camera image and a slave camera image. The second acquiring unit 32 acquires a master camera image or a slave camera image as the original image.
The ISP 30 generates a depth map based on the master camera image and the slave camera image by means of the stereo matching technique.
Optionally, the depth map may be acquired from the range sensor module 20.
The second acquiring unit 32 acquires an autofocus area from the camera module 10 and acquires the focus distance by calculating a representative value of depth values in the autofocus area.
Alternatively, the second acquiring unit 32 may acquire a focus distance directly from the camera module 10. In this case, the focus distance is determined by an autofocus operation of the camera module 10.
FIG. 4A shows an example of the image captured by the camera module 10. The image includes three objects S1, S2 and S3, placed on a table in the order S1, S2, S3 from the front. In the example shown in FIG. 4A, the autofocus area R is on the subject S2.
FIG. 4B shows an example of the depth map corresponding to the acquired image shown in FIG. 4A. The depth map is a greyscale image. For example, the brightness of an area decreases in the depth map as the distance from the electronic device 100 increases.
Next, in the step S4, the obtaining unit 33 of the ISP 30 calculates, for each pixel of the acquired image, a bokeh size based on the DSLR camera parameters, the focus distance and the depth map. As a result, a bokeh size map is generated.
The bokeh size may be calculated by the following equation.
C = \frac{f^{2}}{A} \cdot \frac{\left| d - D \right|}{d \, (D - f)}
where C is the bokeh size, f is the focal length, A is the F-number, D is the focus distance, and d is a depth value of a corresponding pixel in the depth map.
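A vectorized sketch of this step S4 computation, using the equation as reconstructed above (the formula is inferred from the listed variables and standard circle-of-confusion optics, so treat it as an assumption):

```python
import numpy as np

def bokeh_size_map(depth_map: np.ndarray, f: float, A: float, D: float) -> np.ndarray:
    """Per-pixel bokeh size C from the DSLR parameters and the depth map.

    f, D and the depth values are assumed to share one unit (e.g. millimetres),
    with D > f so that the denominator stays positive.
    """
    d = np.maximum(depth_map.astype(np.float64), 1e-6)  # guard against zero depth
    return (f * f / A) * np.abs(d - D) / (d * (D - f))
```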
FIG. 5 shows an example of the bokeh size map generated in the step S4. The bokeh size map is a greyscale image. The brightness of an area in the bokeh size map increases as the bokeh size of the area increases. In the example shown in FIG. 5, the brightness of the area indicating the focused object S2 is the lowest (i.e., the smallest bokeh size). On the other hand, the brightness of the area indicating the object S1 is the highest (i.e., the largest bokeh size).
In the step S5, the performing unit 34 of the ISP 30 performs bokeh processing on the original image based on the bokeh size map generated in the step S4. As a result, an image with bokeh can be obtained.
The bokeh processing may be performed by applying a smoothing filter to the original image. The smoothing filter is a filter which is generated based on the bokeh size map. For example, the smoothing filter is a Gaussian filter with a standard deviation which is calculated based on the bokeh size in the bokeh size map. The bokeh size may be converted to a standard deviation by the following equation.
\sigma = \frac{C}{2 \, pp}
where σ is the standard deviation, C is the bokeh size and pp is a pixel pitch of the DSLR image sensor.
The pixel pitch pp may be calculated by the following equation.
pp = \sqrt{\frac{S}{N_p}}
where S is the area of the DSLR image sensor and N_p is the number of pixels of the DSLR image sensor. Both S and N_p are DSLR camera parameters.
Specifically, the performing unit 34 generates a Gaussian kernel for each pixel of the image. The coefficients of the Gaussian kernel are determined based on the standard deviation σ. After that, the performing unit 34 convolves, for each pixel in the image, the Gaussian kernel with the neighborhood of the corresponding pixel.
A size of the Gaussian kernel is 3x3, 8x8 or 16x16, for example, but it is not limited to any specific size. The kernel size may be determined based on the standard deviation σ; for example, the kernel size increases as the standard deviation increases. The kernel size may also be determined based on a computing power of the ISP 30. For example, the kernel size increases as the computing power increases.
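Putting the conversion and the per-pixel convolution together, a deliberately naive sketch of the rendering loop might look as follows (a production ISP would use a much faster approximation; the σ conversion relies on the reconstructed equations above, which are assumptions):

```python
import numpy as np

def render_bokeh(image: np.ndarray, bokeh_sizes: np.ndarray,
                 sensor_area: float, n_pixels: int) -> np.ndarray:
    """Spatially varying Gaussian blur driven by the bokeh size map.

    image is an HxWx3 float array; bokeh_sizes holds C per pixel in the same
    length unit as the sensor dimensions (assumed millimetres).
    """
    pp = np.sqrt(sensor_area / n_pixels)   # pixel pitch of the DSLR image sensor
    sigmas = bokeh_sizes / (2.0 * pp)      # assumed bokeh-size-to-sigma conversion
    h, w = bokeh_sizes.shape
    out = np.empty_like(image)
    for y in range(h):
        for x in range(w):
            s = sigmas[y, x]
            if s < 0.3:                    # tiny sigma: keep the pixel sharp
                out[y, x] = image[y, x]
                continue
            r = max(1, int(3.0 * s))       # kernel radius grows with sigma
            ys = slice(max(0, y - r), min(h, y + r + 1))
            xs = slice(max(0, x - r), min(w, x + r + 1))
            yy, xx = np.mgrid[ys, xs]
            k = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2.0 * s * s))
            k /= k.sum()
            out[y, x] = (k[..., None] * image[ys, xs]).sum(axis=(0, 1))
    return out
```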
FIG. 6 shows an example of the image with bokeh generated in the step S5. The focused object S2 is clearly displayed while the objects S1 and S3 are greatly blurred.
FIG. 7 shows a situation where a user P of the electronic device 100 takes a photo or video of a subject person S and there are a flower FG in the foreground and a tree BG in the background. As shown in FIG. 7, the bokeh sizes generated based on the DSLR camera parameters according to the method described above (solid line) are greater than the bokeh sizes generated based on actual camera parameters of the electronic device 100 (dashed line) over the entire distance. Therefore, a large bokeh of the flower FG and the tree BG can be obtained.
Also, as can be seen from FIG. 7, the curve of the bokeh sizes in the vicinity of the subject person S is not symmetrical along a distance direction. That is to say, the bokeh sizes in front of the subject person S are greater than the bokeh sizes behind the subject person S. This can be understood from the equation for calculating the bokeh size described above.
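As a concrete check with assumed values (not taken from the disclosure): for f = 50mm, A = 1.4 and D = 2000mm, the reconstructed equation gives C ≈ (50²/1.4) × 500 / (1500 × 1950) ≈ 0.31mm at d = 1500mm, but only C ≈ (50²/1.4) × 500 / (2500 × 1950) ≈ 0.18mm at d = 2500mm, so a point 0.5m in front of the subject is blurred noticeably more than a point 0.5m behind it.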
It should be noted that there are various methods for bokeh processing. The bokeh processing in the step S5 is not limited to the above method as long as the bokeh size map is used.
The DSLR camera parameters may be acquired from preset parameters stored in the memory 49 in advance. In this case, it is not necessary to display the UI described above, that is to say, the step S1 is not necessary.
Optionally, at least one of the steps described above may be performed by the main processor 48.
Optionally, the bokeh size in the vicinity of the subject may be changed. Specifically, a bokeh size in a Depth of Field (DoF) may be changed to a value smaller than the calculated bokeh size. Typically, all of the bokeh sizes in the DoF are changed to 0. In this case, the obtaining unit 33 calculates the bokeh size by means of the equation above and changes the bokeh size in the DoF to a predetermined value (e.g., 0) smaller than the calculated bokeh size.
FIG. 8 shows the same situation as FIG. 7 described above. As shown in FIG. 8, bokeh sizes in the DoF are set to 0. In other words, bokeh sizes are set to 0 in a first range in front of the subject S by a distance T_f and in a second range behind the subject S by a distance T_r. Setting the bokeh sizes in the first and second ranges to 0 enables the subject person S to be displayed more clearly.
The DoF may be calculated based on the focal length (f) , the F-number (A) , the focus distance (D) and a permissible circle of confusion (δ) . Specifically, the DoF is calculated by the following equations.
DoF = T_f + T_r
T_f = \frac{\delta A D^{2}}{f^{2} + \delta A D}
T_r = \frac{\delta A D^{2}}{f^{2} - \delta A D}
where T_f is the forward depth of field, T_r is the rear depth of field, δ is the permissible circle of confusion, f is the focal length, A is the F-number and D is the focus distance.
The permissible circle of confusion δ is calculated by the following equations.
δ = Max{pp, adr}
adr = 1.22 × λ × A
where δ is the permissible circle of confusion, Max is a function which returns the greater of the two arguments, pp is a pixel pitch of the DSLR image sensor, adr is an Airy disk radius, λ is a representative wavelength (e.g., 530nm) and A is the F-number.
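A sketch of this in-DoF clamping under the equations above (the T_f and T_r formulas are reconstructed from the listed variables and the standard depth-of-field relations, so they are an assumption):

```python
import numpy as np

def clamp_bokeh_in_dof(bokeh_sizes: np.ndarray, depth_map: np.ndarray,
                       f: float, A: float, D: float, pp: float,
                       wavelength: float = 530e-6) -> np.ndarray:
    """Set bokeh sizes to 0 for depths inside the depth of field.

    f, D, pp and the depth values share one unit (assumed millimetres);
    the default wavelength is 530 nm expressed in millimetres.
    """
    adr = 1.22 * wavelength * A                 # Airy disk radius
    delta = max(pp, adr)                        # permissible circle of confusion
    # Assumes D is below the hyperfocal distance, so f*f > delta*A*D
    # and the rear depth of field stays finite and positive.
    t_f = delta * A * D * D / (f * f + delta * A * D)   # forward depth of field
    t_r = delta * A * D * D / (f * f - delta * A * D)   # rear depth of field
    in_dof = (depth_map >= D - t_f) & (depth_map <= D + t_r)
    return np.where(in_dof, 0.0, bokeh_sizes)
```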
As described above, according to an embodiment of the present disclosure, bokeh processing is performed by using the bokeh size map generated based on the DSLR camera parameters, which makes it possible to accurately reproduce high-quality bokeh equivalent to the bokeh created by DSLR cameras. In other words, it is possible to generate an image with natural and large bokeh.
For example, a smartphone with a relatively small image sensor can generate an image with large bokeh equivalent to the bokeh created by a DSLR camera with a large image sensor, even though the picture or video is taken with the smartphone.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "back" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be constructed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two” , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain  elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be implemented in hardware, software, firmware or a combination thereof. In the above embodiments, a plurality of steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, when implemented in hardware, as in another embodiment, the steps or methods may be implemented by any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
Those skilled in the art shall understand that all or part of the steps of the above exemplary methods of the present disclosure may be implemented by instructing related hardware with a program. The program may be stored in a computer-readable storage medium, and, when run on a computer, the program performs one or a combination of the steps of the method embodiments of the present disclosure.
In addition, each functional unit of the embodiments of the present disclosure may be integrated into one processing module, or each unit may exist physically on its own, or two or more units may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. When the integrated module is implemented in the form of a software functional module and is sold or used as a standalone product, it may be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, or the like, and may be transitory or non-transitory.
Although embodiments of the present disclosure have been shown and described, it will be appreciated by those skilled in the art that these embodiments are illustrative and shall not be construed as limiting the present disclosure, and that changes, modifications, alternatives and variations may be made to the embodiments without departing from the scope of the present disclosure.

Claims (20)

  1. A method for bokeh processing, comprising:
    acquiring DSLR camera parameters including a focal length and an F-number;
    acquiring an image, a focus distance and a depth map which corresponds to the image;
    obtaining a bokeh size, for each pixel of the image, based on the DSLR camera parameters, the focus distance and the depth map to generate a bokeh size map; and
    performing bokeh processing on the image based on the bokeh size map.
  2. The method of claim 1, wherein the DSLR camera parameters are acquired through a User Interface that allows a user to input the DSLR camera parameters.
  3. The method of claim 1, wherein the DSLR camera parameters are acquired from preset parameters.
  4. The method of any one of claims 1 to 3, wherein the focus distance is a representative value of depth values in an autofocus area.
  5. The method of any one of claims 1 to 4, wherein the bokeh size is calculated by equation (1):
    C = f²·|d − D| / (A·d·(D − f))   … (1)
    where C is the bokeh size, f is the focal length, A is the F-number, D is the focus distance and d is a depth value of a corresponding pixel in the depth map.
  6. The method of any one of claims 1 to 5, wherein a bokeh size in a Depth of Field (DoF) is changed to a predetermined value smaller than the calculated bokeh size.
  7. The method of claim 6, wherein the predetermined value is 0.
  8. The method of claim 6 or 7, wherein the DoF is calculated based on the focal length, the F-number, the focus distance and a permissible circle of confusion.
  9. The method of claim 8, wherein the DoF is calculated by equations (2), (3) and (4):
    DoF = T_f + T_r   … (2)
    T_f = A·δ·D² / (f² + A·δ·D)   … (3)
    T_r = A·δ·D² / (f² − A·δ·D)   … (4)
    where T_f is a forward depth of field, T_r is a rear depth of field, δ is the permissible circle of confusion, f is the focal length, A is the F-number and D is the focus distance.
  10. The method of any one of claims 1 to 9, wherein the bokeh processing is performed by applying a smoothing filter generated based on the bokeh size map to the image.
  11. The method of claim 10, wherein the smoothing filter is a Gaussian filter with a standard deviation calculated based on the bokeh size.
  12. The method of claim 11, wherein the standard deviation is calculated by equation (5):
    σ = C / (2·pp)   … (5)
    where σ is the standard deviation, C is the bokeh size and pp is a pixel pitch of a DSLR image sensor.
  13. The method of claim 11 or 12, wherein a size of the Gaussian filter is determined based on the standard deviation.
  14. An electronic device for image processing comprising:
    a first acquiring unit configured to acquire DSLR camera parameters including a focal length and an F-number;
    a second acquiring unit configured to acquire an image, a focus distance and a depth map which corresponds to the image;
    an obtaining unit configured to obtain a bokeh size, for each pixel of the image, based on the DSLR camera parameters, the focus distance and the depth map to generate a bokeh size map; and
    a performing unit configured to perform bokeh processing on the image based on the bokeh size map.
  15. The electronic device of claim 14, wherein the first acquiring unit acquires the DSLR camera parameters through a User Interface that allows a user to input the DSLR camera parameters.
  16. The electronic device of claim 14, wherein the first acquiring unit acquires the DSLR camera parameters from preset parameters.
  17. The electronic device of any one of claims 14 to 16, wherein the obtaining unit calculates the bokeh size and changes the bokeh size in a Depth of Field (DoF) to a predetermined value smaller than the calculated bokeh size.
  18. The electronic device of any one of claims 14 to 17, wherein the performing unit performs the bokeh processing by applying a smoothing filter generated based on the bokeh size map to the image.
  19. An electronic device for image processing, comprising a processor and a memory for storing instructions, wherein the instructions, when executed by the processor, cause the processor to perform the method according to any one of claims 1 to 13.
  20. A computer-readable storage medium, on which a computer program is stored, wherein the computer program is executed by a computer to implement the method according to any one of claims 1 to 13.
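
By way of illustration only, the following is a minimal Python sketch of the per-pixel bokeh-size computation of claims 1 and 5, assuming equation (1) takes the standard thin-lens circle-of-confusion form given above; the function and parameter names are hypothetical and do not appear in the disclosure.

    import numpy as np

    def bokeh_size_map(depth_m, focal_length_m, f_number, focus_dist_m):
        """Per-pixel bokeh size (circle of confusion), per equation (1).

        depth_m is an HxW depth map in meters; the result is a bokeh size
        map in meters, one value per pixel of the image.
        """
        f, A, D = focal_length_m, f_number, focus_dist_m
        d = np.maximum(depth_m, 1e-6)  # guard against zero/invalid depth
        # C = f^2 * |d - D| / (A * d * (D - f)): zero at the focus distance,
        # growing as a pixel's depth departs from it.
        return (f * f * np.abs(d - D)) / (A * d * (D - f))

For example, a 50 mm f/1.8 lens focused at 2 m gives a pixel at 4 m depth a bokeh size of 0.05² × 2 / (1.8 × 4 × 1.95) ≈ 0.36 mm on the sensor.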
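In the same illustrative spirit, a sketch of the DoF computation of claims 8 and 9 and the in-DoF clamping of claims 6 and 7, assuming equations (3) and (4) take the common approximate forward/rear depth-of-field forms given above; the helper names are again hypothetical.

    import numpy as np

    def depth_of_field(focal_length_m, f_number, focus_dist_m, coc_m):
        """Forward and rear DoF, per equations (3) and (4); DoF = T_f + T_r."""
        f, A, D, delta = focal_length_m, f_number, focus_dist_m, coc_m
        t_f = (A * delta * D * D) / (f * f + A * delta * D)  # forward (near side)
        # Beyond the hyperfocal distance (f*f <= A*delta*D) t_r is unbounded.
        t_r = (A * delta * D * D) / (f * f - A * delta * D)  # rear (far side)
        return t_f, t_r

    def clamp_in_dof(size_map, depth_m, focus_dist_m, t_f, t_r, value=0.0):
        """Claims 6 and 7: set the bokeh size inside the DoF to a small
        predetermined value (0 here), so in-focus pixels stay sharp."""
        in_dof = ((depth_m >= focus_dist_m - t_f) &
                  (depth_m <= focus_dist_m + t_r))
        out = np.asarray(size_map, dtype=float).copy()
        out[in_dof] = value
        return out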
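Finally, one possible (not the disclosed) realization of the smoothing of claims 10 to 13: the spatially varying Gaussian blur is approximated by blending a few uniformly blurred layers, σ is mapped from the bokeh size by the assumed σ = C/(2·pp) form of equation (5), and SciPy's gaussian_filter, whose kernel extent is derived from σ (echoing claim 13), stands in for the filter.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def render_bokeh(image, size_map_m, pixel_pitch_m, n_levels=8):
        """Blur an HxWx3 image with a Gaussian whose sigma follows the
        bokeh size map.

        A true per-pixel kernel is costly, so the sigma range is quantized
        into n_levels bands and each band takes its pixels from one
        uniform blur.
        """
        sigma_map = size_map_m / (2.0 * pixel_pitch_m)  # assumed eq. (5)
        img = image.astype(np.float32)
        out = img.copy()
        levels = np.linspace(0.0, float(sigma_map.max()), n_levels + 1)
        for lo, hi in zip(levels[:-1], levels[1:]):
            if hi <= 0.0:
                continue  # nothing to blur at sigma == 0
            # Blur spatially only (sigma 0 along the channel axis).
            blurred = gaussian_filter(img, sigma=(hi, hi, 0))
            mask = ((sigma_map > lo) & (sigma_map <= hi)).astype(np.float32)
            out = out * (1.0 - mask[..., None]) + blurred * mask[..., None]
        return out.astype(image.dtype)

With an assumed pixel pitch of 6 µm, the 0.36 mm bokeh size from the earlier example maps to σ ≈ 30 pixels, while pixels inside the DoF (σ = 0) pass through untouched.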

Priority Applications (2)

PCT/CN2021/086031 — priority date 2021-04-08, filing date 2021-04-08 — Method for bokeh processing, electronic device and computer-readable storage medium (WO2022213332A1)
CN202180096376.5A — priority date 2021-04-08, filing date 2021-04-08 — Method for bokeh processing, electronic device and computer-readable storage medium (CN117178286A)

Applications Claiming Priority (1)

PCT/CN2021/086031 — priority date 2021-04-08, filing date 2021-04-08 — Method for bokeh processing, electronic device and computer-readable storage medium (WO2022213332A1)

Publications (1)

WO2022213332A1 (en)

Family

ID=83544980

Family Applications (1)

PCT/CN2021/086031 — Method for bokeh processing, electronic device and computer-readable storage medium — priority date 2021-04-08, filing date 2021-04-08

Country Status (2)

CN: CN117178286A (en)
WO: WO2022213332A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
US20140152886A1 * — priority 2012-12-03, published 2014-06-05 — Canon Kabushiki Kaisha — Bokeh amplification
CN103973962A * — priority 2013-02-06, published 2014-08-06 — 聚晶半导体股份有限公司 — Image processing method and image acquisition device
US20150170400A1 * — priority 2013-12-16, published 2015-06-18 — Google Inc. — Depth map generation using bokeh detection
CN105989574A * — priority 2015-02-25, published 2016-10-05 — 光宝科技股份有限公司 — Image processing device and image field-depth processing method
US10567641B1 * — priority 2015-01-19, published 2020-02-18 — Devon Rueckner — Gaze-directed photography


Also Published As

CN117178286A (en) — published 2023-12-05

Similar Documents

Publication Title
KR102278776B1 (en) Image processing method, apparatus, and device
CN107950018B (en) Image generation method and system, and computer readable medium
EP3402180B1 (en) Blurred photo generation method and apparatus, and mobile terminal
KR102338576B1 (en) Electronic device which stores depth information associating with image in accordance with Property of depth information acquired using image and the controlling method thereof
US9973672B2 (en) Photographing for dual-lens device using photographing environment determined using depth estimation
US9544574B2 (en) Selecting camera pairs for stereoscopic imaging
TWI538512B (en) Method for adjusting focus position and electronic apparatus
US9036072B2 (en) Image processing apparatus and image processing method
WO2020151281A9 (en) Image processing method and device, electronic equipment and storage medium
JP2014168227A (en) Image processing apparatus, imaging apparatus, and image processing method
CN107295249B (en) All focus implementation
US9332195B2 (en) Image processing apparatus, imaging apparatus, and image processing method
KR20190009104A (en) Electronic Device for controlling lens focus and the controlling Method thereof
KR20130024007A (en) Image photographing device and control method thereof
CN107547789B (en) Image acquisition device and method for photographing composition thereof
US9918015B2 (en) Exposure control using depth information
US20230033956A1 (en) Estimating depth based on iris size
KR20230107255A (en) Foldable electronic device for multi-view image capture
JP6645711B2 (en) Image processing apparatus, image processing method, and program
WO2022213332A1 (en) Method for bokeh processing, electronic device and computer-readable storage medium
JP2012235257A (en) Photographing device
WO2022198525A1 (en) Method of improving stability of bokeh processing and electronic device
CN114514737A (en) Low light auto-focusing technology
WO2022188007A1 (en) Image processing method and electronic device
WO2022241728A1 (en) Image processing method, electronic device and non–transitory computer–readable media

Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 21935562; Country of ref document: EP; Kind code of ref document: A1
NENP: Non-entry into the national phase
    Ref country code: DE
122 EP: PCT application non-entry in European phase
    Ref document number: 21935562; Country of ref document: EP; Kind code of ref document: A1