WO2021142711A1 - Image processing method, device, storage medium, and electronic equipment

Image processing method, device, storage medium, and electronic equipment

Info

Publication number: WO2021142711A1
Application number: PCT/CN2020/072464
Authority: WIPO (PCT)
Prior art keywords: image, camera, lens, subject, distance
Other languages: English (en), French (fr)
Inventor: 王会朝
Applicant: Oppo Guangdong Mobile Communications Co., Ltd. (Oppo广东移动通信有限公司)
Priority: CN202080083918.0A (CN114762313B)
Priority: PCT/CN2020/072464 (WO2021142711A1)
Publication: WO2021142711A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; studio devices; studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; cameras specially adapted for the electronic generation of special effects
    • H04N 5/272 - Means for inserting a foreground image in a background image, i.e. inlay, outlay

Definitions

  • This application belongs to the field of image technology, and in particular relates to an image processing method, device, storage medium, and electronic equipment.
  • Image blurring is often used in image processing technology.
  • When performing image processing, an electronic device can blur the background of an image to obtain the effect of highlighting the subject. Images with a prominent subject and a blurred background are highly expressive.
  • The embodiments of the present application provide an image processing method, device, storage medium, and electronic equipment, which can improve the blur effect of an image.
  • An embodiment of the present application provides an image processing method applied to an electronic device. The electronic device includes at least a first camera and a second camera, and the method includes: using the first camera to acquire a first image in which the photographed subject is clearly imaged; acquiring the depth information of the first image; performing image segmentation on the first image according to the depth information to obtain a subject image, where the subject image is the image area in the first image corresponding to the photographed subject; using the second camera to acquire a second image in which the subject is out of focus; and performing image fusion processing on the subject image and the second image to obtain a target image.
  • An embodiment of the present application provides an image processing device applied to an electronic device. The electronic device includes at least a first camera and a second camera, and the device includes:
  • a first acquisition module, configured to use the first camera to acquire a first image in which the photographed subject is clearly imaged;
  • a second acquisition module, configured to acquire the depth information of the first image;
  • an image segmentation module, configured to perform image segmentation on the first image according to the depth information to obtain a subject image, where the subject image is the image area in the first image corresponding to the photographed subject;
  • a third acquisition module, configured to use the second camera to acquire a second image in which the subject is out of focus; and
  • an image fusion module, configured to perform image fusion processing on the subject image and the second image to obtain a target image.
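  • To make the division of labor among these modules concrete, the following is a minimal Python/NumPy sketch of the same five-step pipeline. It is an illustrative reconstruction, not code from the application: the depth band, toy frames, and helper names are all assumptions.

```python
import numpy as np

def segment_subject(depth_map: np.ndarray, near: float, far: float) -> np.ndarray:
    # Subject mask: pixels whose depth falls inside the subject's depth band.
    return (depth_map >= near) & (depth_map <= far)

def fuse(first_image: np.ndarray, second_image: np.ndarray,
         subject_mask: np.ndarray) -> np.ndarray:
    # Replace the subject region of the defocused frame with the sharp pixels.
    target = second_image.copy()
    target[subject_mask] = first_image[subject_mask]
    return target

# Toy data standing in for the two captures and the depth map.
h, w = 4, 4
first = np.full((h, w, 3), 200, dtype=np.uint8)   # first image: subject sharp
second = np.full((h, w, 3), 90, dtype=np.uint8)   # second image: subject defocused
depth = np.full((h, w), 5.0)
depth[1:3, 1:3] = 1.5                             # subject sits about 1.5 m away
mask = segment_subject(depth, near=1.0, far=2.0)  # image segmentation by depth
target = fuse(first, second, mask)                # target image
```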
  • An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed on a computer, the computer is caused to execute the flow in the image processing method provided by the embodiments of the present application.
  • An embodiment of the present application provides an electronic device including a memory, a processor, and at least a first camera and a second camera, where the processor executes the flow in the image processing method provided by the embodiments of the present application by calling the computer program stored in the memory.
  • FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present application.
  • FIG. 2 is another schematic flowchart of an image processing method provided by an embodiment of the present application.
  • FIGS. 3 to 5 are schematic diagrams of scenes of image processing methods provided by embodiments of the present application.
  • FIG. 6 is a schematic structural diagram of an image processing device provided by an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of an image processing circuit provided by an embodiment of the present application.
  • The execution subject of the embodiments of the present application may be an electronic device with cameras, such as a smartphone or a tablet computer.
  • FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present application.
  • The image processing method can be applied to an electronic device, and the electronic device can include at least a first camera and a second camera.
  • The flow of the image processing method may include: using the first camera to acquire a first image in which the photographed subject is clearly imaged; acquiring the depth information of the first image; performing image segmentation on the first image according to the depth information to obtain a subject image; using the second camera to acquire a second image in which the subject is out of focus; and performing image fusion processing on the subject image and the second image to obtain a target image.
  • Image blurring is often used in image processing technology.
  • When performing image processing, the electronic device can blur the background of an image to obtain the effect of highlighting the subject. Images with a prominent subject and a blurred background are highly expressive.
  • Image blurring is often used in shooting scenarios such as portrait shooting and macro shooting.
  • In related technologies, however, the blur effect is poor.
  • Taking background blurring as an example: in related technologies, an image is generally captured, the subject is then segmented from the image, and the background area outside the subject is blurred. The blurring in the related technologies is therefore simulated on the original image, which is not real and natural enough, resulting in a poor blur effect.
  • This embodiment takes an electronic device with two cameras as an example for description.
  • The electronic device may also have more than two cameras, such as three cameras or four cameras, which is not specifically limited in this embodiment.
  • The electronic device may first use the first camera to obtain a first image in which the photographing subject (such as a person) is clearly imaged. That is, the subject is in focus in the first image.
  • Then, the electronic device can obtain the depth information of the first image.
  • According to the depth information, the electronic device can segment the subject image from the first image; the subject image is the image area in the first image corresponding to the photographed subject.
  • For example, if the photographed subject is a person, the electronic device can segment the image of the human body from the first image according to the depth information.
  • The electronic device may also use the second camera to capture a second image in which the subject is out of focus. That is, the photographing subject is blurred in the second image.
  • After that, the electronic device may fuse the subject image and the second image to obtain the target image.
  • Since the subject image used for fusion is a clear image of the subject, the subject in the target image is also sharp. In the second image, the areas other than the subject are blurred, so the areas in the target image other than the subject are blurred as well.
  • In this way, the image processing method provided in the embodiments of the present application can obtain an image with a real blur effect, thereby improving the image blur effect and the image quality.
  • FIG. 2 is a schematic flowchart of another image processing method provided by an embodiment of the present application.
  • The image processing method can be applied to an electronic device, and the electronic device can include at least a first camera and a second camera.
  • Light spots can give an image a hazy feeling, so that the image has better expressive power.
  • In related technologies, an image is generally captured, and an algorithm is then applied to this image to blur the non-subject areas where lights are located, thereby generating light spots. The light spots in the related technologies are therefore all simulated on the original image, and the imaging effect of such simulated light spots is unreal and unnatural.
  • In contrast, the image processing method provided by the embodiment of the present application can obtain real and natural light spots.
  • The flow of the image processing method provided in the embodiments of the present application may include the following.
  • In 201, the electronic device uses the first camera to acquire a first image in which the photographing subject is clearly imaged, and there are lights in the shooting scene corresponding to the first image.
  • For example, the electronic device may use the first camera to obtain a first image in which the subject is clearly imaged, where there are lights in the shooting scene corresponding to the first image.
  • The first camera may be the main camera of the electronic device.
  • In 202, the electronic device determines the position where the lens of the first camera is located when the first camera shoots the first image as the first position.
  • In 203, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device drives the lens of the second camera to the sixth position and takes a third image, where, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
  • In 204, the electronic device obtains the disparity information of the first image and the third image, and calculates the depth information of the first image according to the disparity information.
  • For example, 202, 203, and 204 may include the following.
  • The electronic device may use the second camera to obtain a third image, obtain the disparity information of the first image and the third image, and calculate the depth information of the first image according to the disparity information.
  • Specifically, the electronic device may acquire the position of the lens of the first camera when the first camera takes the first image, and determine that position as the first position. After that, the electronic device can drive the lens of the second camera to the sixth position according to the preset mapping relationship between the lens position of the first camera and the lens position of the second camera, and take the third image when the lens of the second camera is at the sixth position. In the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera. After acquiring the third image, the electronic device may acquire the disparity information of the first image and the third image, and calculate the depth information of the first image according to the disparity information.
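  • The depth calculation from disparity follows standard stereo triangulation. As a hedged sketch (the focal length and baseline values below are illustrative assumptions; the application does not specify them):

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    # Stereo triangulation: depth = focal_length * baseline / disparity.
    # Zero-disparity pixels are mapped to infinity (no measurable depth).
    with np.errstate(divide="ignore"):
        return np.where(disparity_px > 0,
                        focal_length_px * baseline_m / disparity_px,
                        np.inf)

# Example: 1000 px focal length, 12 mm baseline between the two cameras.
disparity = np.array([[20.0, 10.0], [5.0, 0.0]])
depth = depth_from_disparity(disparity, focal_length_px=1000.0, baseline_m=0.012)
# 20 px of disparity -> 0.6 m; smaller disparity means a farther point.
```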
  • For example, the first camera and the second camera are cameras of the same specification.
  • When the lens of the first camera is at the first position, the electronic device can obtain the first digital-to-analog conversion code value (DAC code) corresponding to the first camera at this time. Then, the electronic device can drive the lens of the second camera to the sixth position according to the first DAC code value. It is understandable that, since the DAC code value corresponding to the sixth position and the DAC code value corresponding to the first position are the same value, the sixth position of the lens of the second camera corresponds to the first position of the lens of the first camera.
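  • In code, this lens-position mapping reduces to a lookup on DAC codes. A sketch (the calibration tables are hypothetical; for same-specification cameras, as here, the mapping degenerates to the identity):

```python
import numpy as np

# Hypothetical calibration: which DAC code of the second camera places its
# lens at the position corresponding to each DAC code of the first camera.
CALIB_FIRST = np.array([0, 256, 512, 768, 1023])
CALIB_SECOND = np.array([0, 256, 512, 768, 1023])  # same spec -> identity

def second_code_for(first_code: int) -> int:
    # Interpolate between calibration points of the preset mapping.
    return int(round(np.interp(first_code, CALIB_FIRST, CALIB_SECOND)))

# Driving the second camera's VCM with the mapped code puts its lens at the
# sixth position corresponding to the first camera's first position.
sixth_position_code = second_code_for(640)  # -> 640 for same-spec cameras
```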
  • The device that controls the focus of the lens in the electronic device is a voice coil motor (VCM).
  • The voice coil motor converts current into mechanical force, and its positioning and force control are determined by an external controller.
  • The voice coil motor in the electronic equipment has a corresponding voice coil motor driver circuit (VCM Driver IC).
  • The voice coil motor driver circuit can precisely control the moving distance and direction of the coil in the voice coil motor, thereby driving the movement of the lens to achieve the focusing effect.
  • The voice coil motor works on Ampere's law: when the coil in the voice coil motor conducts electricity, the force generated by the current pushes the lens fixed on the carrier to move, thereby changing the focus distance. The control of the focus distance of the voice coil motor is therefore actually achieved by controlling the current in the coil. Simply put, the voice coil motor driver circuit provides the source of the "current"; after the current is supplied to the coil of the voice coil motor, the magnetic field in the voice coil motor generates the force that pushes the coil (and the lens).
  • The voice coil motor driver circuit is actually a DAC circuit with a control algorithm. It converts the DAC code value containing digital position information, uploaded over the I2C bus, into the corresponding output current; the voice coil motor then converts the output current into a focus distance. Different output currents flowing through the voice coil motor produce different Ampere forces, which push the lens on the voice coil motor to different positions. Therefore, after focusing is completed, the camera stays at a clear focus position, and that clear focus position has a corresponding digital-to-analog conversion code value (DAC code).
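  • The driver's DAC stage can be pictured as a linear code-to-current conversion. A sketch under assumed values (the 10-bit code width and 120 mA full-scale current are illustrative; real VCM driver ICs each define their own):

```python
def dac_code_to_current_ma(code: int, bits: int = 10,
                           full_scale_ma: float = 120.0) -> float:
    # Linear DAC: the code selects a fraction of the full-scale coil current.
    # More current -> larger Ampere force -> the lens is pushed farther out.
    max_code = (1 << bits) - 1
    code = max(0, min(code, max_code))  # clamp to the valid code range
    return full_scale_ma * code / max_code

print(dac_code_to_current_ma(0))     # 0.0 mA   (lens at rest)
print(dac_code_to_current_ma(1023))  # 120.0 mA (lens fully extended)
```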
  • In 205, the electronic device performs image segmentation on the first image according to the depth information to obtain a subject image, where the subject image is the image area corresponding to the photographed subject in the first image.
  • For example, the electronic device may perform image segmentation on the first image according to the depth information, so as to segment the subject image from the first image.
  • The subject image is the image area in the first image corresponding to the photographed subject.
  • In 206, the electronic device detects the distance between the photographing subject and the first camera.
  • As described above, the electronic device may determine the position of the lens of the first camera when the first camera takes the first image as the first position.
  • Then, the electronic device can detect the distance between the photographing subject and the first camera.
  • If the distance between the shooting subject and the first camera is less than a preset threshold, the shooting subject can be considered close, and the flow proceeds to 207.
  • If the distance between the shooting subject and the first camera is greater than or equal to the preset threshold, the shooting subject can be considered far away, and the flow proceeds to 209.
  • For example, the electronic device may detect the distance between the photographing subject and the first camera according to the first position where the lens of the first camera is located when the first camera captures the first image.
  • When the lens is driven to different positions, it corresponds to different DAC code values.
  • When the distance between the shooting subject and the first camera differs, the lens is driven to a different position for clear imaging. Therefore, the distance between the subject and the first camera can be detected according to the position of the lens when the subject is clearly imaged.
  • For example, suppose the value range of the DAC code of the first camera is [S1, S3], and S2 is greater than S1 and less than S3.
  • The electronic device may preset that, when the current DAC code value is within the range [S1, S2], the distance between the subject and the first camera is greater than or equal to the preset threshold, that is, the subject is far away.
  • When the current DAC code value is within the range (S2, S3], the distance between the subject and the first camera is less than the preset threshold, that is, the subject is close.
  • The electronic device may also use other methods to detect the distance between the subject and the first camera, so as to determine whether the subject is close or far away.
  • For example, the electronic device can calculate the distance between the subject and the first camera based on the time difference between an emitted laser detection signal and the received reflected laser signal, and so on.
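  • The DAC-code distance check then amounts to comparing the in-focus code against the preset split point. A sketch with assumed values (S2 = 600 in a [0, 1023] code range is illustrative only):

```python
def subject_is_close(dac_code: int, s2: int = 600) -> bool:
    # Codes in [S1, S2] mean the subject is far (distance >= threshold);
    # codes in (S2, S3] mean the subject is close (distance < threshold).
    return dac_code > s2

print(subject_is_close(700))  # True  -> close subject, proceed to 207
print(subject_is_close(300))  # False -> far subject, proceed to 209
```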
  • In 207, when the distance between the photographing subject and the first camera is less than the preset threshold, the electronic device selects a corresponding lens position from a plurality of lens positions as the second position of the lens of the first camera according to the preset first strategy, where the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at the first position.
  • In 208, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device drives the lens of the second camera to the third position and takes a second image, where, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera, and there are lights in the shooting scene corresponding to the second image.
  • For example, 207 and 208 may include the following.
  • Suppose the electronic device detects that the distance between the photographing subject and the first camera is less than the preset threshold, that is, the photographing subject is close.
  • Then the electronic device may select a corresponding lens position from a plurality of lens positions as the second position of the lens of the first camera according to the preset first strategy, where the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at the first position.
  • After that, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the third position, and take a second image when the lens of the second camera is at the third position.
  • In the mapping relationship, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera. That is, when the lens of the first camera is at the second position, the DAC code value corresponding to the first camera is the same as the DAC code value corresponding to the second camera when the lens of the second camera is at the third position.
  • The preset first strategy may be to randomly select a lens position as the second position of the lens of the first camera, as long as the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at the first position.
  • Because the distance between the lens and the image sensor (of the first camera) at the second position is greater than the distance at the first position, the second image captured with the lens of the second camera at the third position clearly images scenes closer than the location of the subject. The subject is therefore out of focus, that is, blurred, in the second image captured by the second camera, and the scene in the background area of the second image is also blurred. In other words, the second image is in far-focus defocus. At this time, the lights in the shooting scene can form light spots in the second image. Since the light spots are formed naturally through the above far-focus defocus, they are real light spots.
  • Further, the distance between the lens and the image sensor when the lens of the first camera is at the second position may be greater than the distance between the lens and the image sensor when the lens of the first camera is at any other position. That is, in the first camera, the distance between the second position and the image sensor may be greater than the distance between any other lens position and the image sensor; correspondingly, the third position is the position where the lens of the second camera is driven outward to the greatest extent. In this case, the blur effect of the objects in the background area of the second image is the best, and the lights in the shooting scene can form light spots with the best blur effect in the second image.
  • In 209, when the distance between the photographing subject and the first camera is greater than or equal to the preset threshold, the electronic device selects a corresponding lens position from a plurality of lens positions according to the preset second strategy as the fourth position of the lens of the first camera.
  • The distance between the lens and the image sensor when the lens of the first camera is at the fourth position is smaller than the distance between the lens and the image sensor when the lens of the first camera is at the first position.
  • In 210, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device drives the lens of the second camera to the fifth position and takes a second image, where, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera, and there are lights in the shooting scene corresponding to the second image.
  • For example, 209 and 210 may include the following.
  • Suppose the electronic device detects that the distance between the photographing subject and the first camera is greater than or equal to the preset threshold, that is, the photographing subject is far away.
  • Then the electronic device can select a corresponding lens position from a plurality of lens positions as the fourth position of the lens of the first camera according to the preset second strategy, where the distance between the lens and the image sensor when the lens of the first camera is at the fourth position is smaller than the distance between the lens and the image sensor when the lens of the first camera is at the first position.
  • After that, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the fifth position, and take a second image when the lens of the second camera is at the fifth position.
  • In the mapping relationship, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera. That is, when the lens of the first camera is at the fourth position, the DAC code value corresponding to the first camera is the same as the DAC code value corresponding to the second camera when the lens of the second camera is at the fifth position.
  • The preset second strategy may be to randomly select a lens position as the fourth position of the lens of the first camera, as long as the distance between the lens and the image sensor when the lens of the first camera is at the fourth position is smaller than the distance between the lens and the image sensor when the lens of the first camera is at the first position.
  • Because the distance between the lens and the image sensor (of the first camera) at the fourth position is smaller than the distance at the first position, the second image captured with the lens of the second camera at the fifth position focuses on scenes farther than the subject, so the subject is out of focus, that is, blurred, in the second image captured by the second camera.
  • The foreground area in the second image is also blurred; in other words, the second image is in near-focus defocus.
  • At this time, the lights in the shooting scene can form light spots in the second image. Since the light spots are formed naturally through the above near-focus defocus, they are real light spots.
  • Further, the distance between the lens and the image sensor when the lens of the first camera is at the fourth position may be smaller than the distance between the lens and the image sensor when the lens of the first camera is at any other position. That is, in the first camera, the distance between the fourth position and the image sensor may be smaller than the distance between any other lens position and the image sensor; correspondingly, at the fifth position, the lens of the second camera is driven inward to the greatest extent. In this case, the blur effect of the objects in the foreground area of the second image is the best, and the lights in the shooting scene can form light spots with the best blur effect in the second image.
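  • Both strategies can be expressed as a single code-selection rule for the second camera. A sketch (assuming, as in this document, that a larger DAC code extends the lens farther from the sensor, and that the in-focus code lies strictly inside the code range):

```python
import random

def choose_defocus_code(first_pos_code: int, subject_close: bool,
                        code_min: int = 0, code_max: int = 1023) -> int:
    # First strategy (close subject): any code above the in-focus code,
    # i.e. lens farther from the sensor -> far-focus defocus.
    # Second strategy (far subject): any code below it -> near-focus defocus.
    # Picking the extreme (code_max / code_min) maximizes the blur.
    if subject_close:
        return random.randint(first_pos_code + 1, code_max)
    return random.randint(code_min, first_pos_code - 1)

# Same-spec cameras share DAC codes, so the chosen code can be written
# directly to the second camera's VCM driver (third/fifth position).
```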
  • In 211, the electronic device calculates a defocus coefficient according to the first image and the second image.
  • In 212, the electronic device adjusts the scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image.
  • For example, 211 and 212 may include the following.
  • The electronic device may calculate the defocus coefficient according to the first image and the second image.
  • The defocus coefficient may refer to the magnification (deformation) factor of the objects in the second image relative to the objects in the first image. Since the subject is out of focus in the second image, objects in the second image become blurred, and the blur deforms the objects, that is, makes them larger. Therefore, the objects in the second image are enlarged relative to the objects in the first image.
  • For example, if the diameter of the area formed by a certain light in the first image is 15 pixels, and the diameter of the light spot formed by that light after defocusing in the second image is 30 pixels, then the objects in the second image are magnified by a factor of 2 relative to the objects in the first image.
  • The scale of the second image can then be adjusted according to the calculated defocus coefficient to obtain the scale-adjusted second image. For example, if the objects in the second image are enlarged by a factor of 2 relative to the objects in the first image, the second image needs to be reduced to one-half of its original size before image fusion.
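  • A sketch of 211 and 212 (the nearest-neighbour subsampling keeps it dependency-free and assumes an integer coefficient; production code would use a proper resampler such as cv2.resize):

```python
import numpy as np

def defocus_coefficient(spot_diameter_first_px: float,
                        spot_diameter_second_px: float) -> float:
    # Magnification of a reference light spot between the two frames,
    # e.g. 30 px / 15 px = 2.0.
    return spot_diameter_second_px / spot_diameter_first_px

def rescale_second_image(second_image: np.ndarray, coeff: float) -> np.ndarray:
    # Shrink the defocused frame by the coefficient before fusion.
    step = int(round(coeff))
    return second_image[::step, ::step]

coeff = defocus_coefficient(15.0, 30.0)          # -> 2.0
second = np.zeros((480, 640, 3), dtype=np.uint8)
adjusted = rescale_second_image(second, coeff)   # -> shape (240, 320, 3)
```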
  • In 213, the electronic device performs image fusion processing on the subject image and the scale-adjusted second image to obtain a target image.
  • For example, after obtaining the scale-adjusted second image, the electronic device may perform image fusion processing on the subject image and the scale-adjusted second image to obtain the target image.
  • The photographing subject in the target image is clearly imaged, and the areas in the target image other than the photographing subject have a blur effect.
  • Since this embodiment can generate real and natural light spots and merge them into the target image, the light spots in the target image are also real and natural, thereby improving the imaging quality of the image.
  • Since the subject image used for fusion is a clear image of the subject, the subject in the target image is also sharp. In the second image, the areas other than the subject are blurred, so the areas in the target image other than the subject are blurred as well.
  • In this way, the image processing method provided in the embodiments of the present application can obtain an image with a real blur effect, thereby improving the image blur effect and the image quality.
  • In one embodiment, the process of the electronic device fusing the subject image and the second image to obtain the target image may include:
  • the electronic device determines the area in the second image corresponding to the photographed subject as a target area, where the image of the target area matches the subject image;
  • the electronic device replaces the image of the target area in the second image with the subject image, and determines the second image after the replacement is completed as the target image.
  • For example, the electronic device may first determine the area corresponding to the photographed subject in the second image and determine that area as the target area.
  • The image of the target area matches the subject image segmented from the first image.
  • For example, the electronic device can perform image alignment (matching) between the subject image and the second image; after the alignment, the electronic device can determine the area in the second image corresponding to the subject image as the target area.
  • After that, the electronic device can directly replace the image of the target area in the second image with the subject image, and determine the second image after the replacement is completed as the target image, thereby completing the fusion processing of the second image and the subject image.
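  • A sketch of the replacement-style fusion, using OpenCV template matching as one plausible way to do the image alignment the text mentions (the application does not prescribe a specific alignment method):

```python
import cv2
import numpy as np

def locate_target_area(second_image: np.ndarray,
                       subject_image: np.ndarray) -> tuple:
    # Align the subject image with the defocused frame via normalized
    # cross-correlation; the best match is taken as the target area.
    scores = cv2.matchTemplate(second_image, subject_image, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(scores)
    return max_loc  # (x, y) of the target area's top-left corner

def replace_fusion(second_image: np.ndarray,
                   subject_image: np.ndarray) -> np.ndarray:
    x, y = locate_target_area(second_image, subject_image)
    h, w = subject_image.shape[:2]
    target = second_image.copy()
    target[y:y + h, x:x + w] = subject_image  # direct replacement
    return target  # the target image
```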
  • In another embodiment, the process of the electronic device fusing the subject image and the second image to obtain the target image may also include:
  • the electronic device determines the area in the second image corresponding to the photographed subject as a target area, where the image of the target area matches the subject image;
  • the electronic device performs image fusion processing on the image of the target area in the second image and the subject image, and determines the image after the fusion is completed as the target image.
  • For example, the electronic device may first determine the area corresponding to the photographed subject in the second image, and determine that area as the target area.
  • The image of the target area matches the subject image segmented from the first image.
  • For example, the electronic device can perform image alignment (matching) between the subject image and the second image; after the alignment, the electronic device can determine the area in the second image corresponding to the subject image as the target area.
  • After that, the electronic device may perform pixel-level image fusion processing on the image of the target area in the second image and the subject image, and determine the image after the fusion is completed as the target image.
  • In this way, the overall fusion of the subject image and the second image is better.
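  • A sketch of the pixel-level variant, assuming the subject image has already been placed into a full-size frame aligned with the second image. Feathering the subject mask with a Gaussian blur is one common way to blend a boundary; the specific fusion rule is an assumption, since the text only says the pixels are fused:

```python
import cv2
import numpy as np

def blended_fusion(second_image: np.ndarray, subject_image: np.ndarray,
                   subject_mask: np.ndarray, feather_px: int = 15) -> np.ndarray:
    # Soften the 0/1 subject mask so sharp and defocused pixels mix
    # smoothly near the subject's boundary instead of switching abruptly.
    k = feather_px | 1  # kernel size must be odd
    alpha = cv2.GaussianBlur(subject_mask.astype(np.float32), (k, k), 0)
    alpha = alpha[..., None]  # broadcast the weight over the color channels
    fused = alpha * subject_image + (1.0 - alpha) * second_image
    return fused.astype(np.uint8)  # the target image
```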
  • In other embodiments, the second camera of the electronic device may also be a depth-sensing camera.
  • For example, the second camera may be a TOF camera or a 3D structured light camera. In that case, the electronic device may obtain the depth information of the first image from the second camera, and then segment the first image according to the depth information of the first image to obtain the subject image of the photographed subject.
  • A TOF (Time of Flight) camera mainly consists of an infrared light projector and a receiving module.
  • The projector of the TOF camera projects infrared light outward.
  • The infrared light is reflected when it encounters the object to be measured and is received by the receiving module of the TOF camera.
  • By recording the time from emission to reception of the infrared light, the depth information of the illuminated object can be calculated and 3D modeling completed. That is, the TOF camera can be used to obtain the depth information of the photographed object.
  • The basic principle of 3D structured light technology is to use a near-infrared laser to project light with certain structural characteristics onto the object to be photographed, and then collect the reflected light with a dedicated infrared camera.
  • Light with such a structure produces different image phase information depending on the depth of the different areas of the photographed object; an arithmetic unit then converts this structural change into depth information to obtain the three-dimensional structure.
  • The three-dimensional structure of the photographed object is thus obtained by optical means, and the obtained information can be applied further. That is, a 3D structured light camera uses a dot-matrix projector to project multiple light spots onto the object, the infrared camera captures a three-dimensional light image of the object, and the processing system calculates the depth information of the object.
  • The method of calculating the depth information of the first image based on the disparity information of the images captured by the first camera and the second camera has low hardware cost and simple calculation, and can quickly calculate the depth information of the first image, thereby increasing the speed of segmenting the subject image from the first image.
  • When the second camera in the electronic device is a TOF camera, since the recognition distance of a TOF camera is relatively long, the TOF camera can calculate the depth information of the first image more accurately when the subject is far from the camera, improving the accuracy of segmenting the subject image from the first image.
  • When the second camera in the electronic device is a 3D structured light camera, since the recognition accuracy of a 3D structured light camera is high, using a 3D structured light camera can improve the accuracy of calculating the depth information of the first image, thereby improving the accuracy of segmenting the subject image from the first image.
  • In some embodiments, the electronic device may also first detect whether there are lights in the shooting scene. If there are lights in the shooting scene, the image processing method provided in this application may be used to obtain an image with real light spots. For example, the electronic device can detect whether there are lights in the shooting scene by means of scene recognition, such as scene recognition based on artificial intelligence technology.
  • Alternatively, this embodiment can also detect whether there are lights in the shooting scene in the following way, and use the image processing method provided in this embodiment to obtain an image with real light spots when lights are detected: the electronic device may first use the first camera to obtain a first image in which the subject is clearly imaged. After that, the electronic device may obtain the brightness distribution information of the first image, and detect, according to the brightness distribution information, whether the number of pixels in the first image whose brightness value is greater than a preset brightness threshold is greater than a preset value.
  • If the number is greater than the preset value, it can be determined that there are lights in the shooting scene, and the image processing method provided in this embodiment is used to obtain a target image with real light spots.
  • For example, the brightness distribution information of the first image may be the brightness histogram of the first image.
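  • A sketch of the histogram check (both thresholds below are illustrative assumptions; the application only requires "a preset brightness threshold" and "a preset value"):

```python
import numpy as np

def scene_has_lights(first_image: np.ndarray,
                     brightness_threshold: int = 240,
                     min_bright_pixels: int = 500) -> bool:
    # Build the brightness histogram and count pixels brighter than the
    # preset threshold; enough of them means lights are in the scene.
    gray = first_image.mean(axis=-1) if first_image.ndim == 3 else first_image
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    return int(hist[brightness_threshold:].sum()) > min_bright_pixels
```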
  • FIGS. 3 to 5 are schematic diagrams of scenes of the image processing method provided by the embodiments of the present application.
  • Suppose the electronic device includes two cameras, a first camera and a second camera.
  • First, the electronic device can use the first camera to take a first image in which the shooting subject is clearly imaged.
  • The first image may be as shown in FIG. 3, in which the subject is the seat of a bicycle.
  • Then, the electronic device can obtain the first position where the lens of the first camera is located when the first camera shoots the first image, drive the lens of the second camera to the sixth position according to the preset mapping relationship between the lens position of the first camera and the lens position of the second camera, and take a third image when the lens of the second camera is at the sixth position.
  • In the mapping relationship, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
  • After that, the electronic device may calculate the disparity information of the first image and the third image, and use the disparity information to calculate the depth information of the first image.
  • According to the depth information, the electronic device may perform image segmentation on the first image, thereby segmenting the subject image (that is, the seat of the bicycle) from the first image.
  • Next, the electronic device can detect the distance between the photographing subject and the first camera, that is, detect whether the photographing subject is close or far. For example, in this embodiment, the electronic device detects that the subject (i.e., the seat of the bicycle) is close.
  • Then, the electronic device can select a corresponding lens position from a plurality of lens positions as the second position of the lens of the first camera, where the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position; that is, relative to the first position, the lens at the second position is farther from the image sensor, i.e., the lens extends forward. Moreover, the distance between the lens of the first camera and the image sensor at the second position is greater than the distance at any other position; the second position is thus the lens position farthest from the image sensor.
  • According to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the third position.
  • In the mapping relationship, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
  • For example, the DAC code value corresponding to the third position is the same as the DAC code value corresponding to the second position.
  • When the lens of the second camera is at the third position, the electronic device can use the second camera to take a second image.
  • The second image is an image in which the subject is out of focus.
  • The second image is shown in FIG. 4; it can be seen from FIG. 4 that the image of the bicycle seat is blurred, and the image around the bicycle is also blurred.
  • After obtaining the second image, the electronic device may calculate the defocus coefficient according to the first image and the second image, and then adjust the scale of the second image according to the calculated defocus coefficient to obtain the scale-adjusted second image.
  • After that, the electronic device may perform image fusion processing on the subject image and the scale-adjusted second image, so as to obtain the target image.
  • The photographing subject in the target image is clearly imaged, and the areas in the target image other than the photographing subject have a blur effect.
  • The target image can be as shown in FIG. 5; it can be seen from FIG. 5 that the photographed subject, the bicycle seat, is clearly imaged, and the areas other than the bicycle seat have a blur effect.
  • It can also be seen from FIGS. 3 to 5 that, in the shooting scene, there is light between the leaves and branches. To improve the expressiveness of the image, these lights can be blurred into light spots. Since the light spots in the second image are real light spots generated naturally when the subject is out of focus, the light spots in FIG. 5 after image fusion are real and natural, and the imaging quality is good.
  • In this way, the electronic device can provide an image with real blur, especially with light spots generated by real blur.
  • Since the light spots in this embodiment are collected directly by the second camera when the subject is out of focus, the light spots in this embodiment are natural and real.
  • FIG. 6 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application.
  • The image processing apparatus can be applied to an electronic device that includes at least a first camera and a second camera.
  • The image processing apparatus 300 may include: a first acquisition module 301, a second acquisition module 302, an image segmentation module 303, a third acquisition module 304, and an image fusion module 305.
  • The first acquisition module 301 is configured to use the first camera to acquire a first image in which the subject is clearly imaged.
  • The second acquisition module 302 is configured to acquire the depth information of the first image.
  • The image segmentation module 303 is configured to perform image segmentation on the first image according to the depth information to obtain a subject image, where the subject image is the image area in the first image corresponding to the photographed subject.
  • The third acquisition module 304 is configured to use the second camera to acquire a second image in which the subject is out of focus.
  • The image fusion module 305 is configured to perform image fusion processing on the subject image and the second image to obtain a target image.
  • In some embodiments, the third acquisition module 304 may be used to:
  • when the distance between the photographing subject and the first camera is less than the preset threshold, select a corresponding lens position from a plurality of lens positions as the second position of the lens of the first camera according to the preset first strategy, where the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at the first position;
  • according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, drive the lens of the second camera to the third position and take a second image.
  • In the mapping relationship, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
  • In some embodiments, the third acquisition module 304 may also be used to:
  • when the distance between the photographing subject and the first camera is greater than or equal to the preset threshold, select a corresponding lens position from a plurality of lens positions according to the preset second strategy as the fourth position of the lens of the first camera, where the distance between the lens and the image sensor when the lens of the first camera is at the fourth position is smaller than the distance between the lens and the image sensor when the lens of the first camera is at the first position;
  • according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, drive the lens of the second camera to the fifth position and take a second image.
  • In the mapping relationship, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera.
  • In some embodiments, the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at any other position.
  • In some embodiments, the distance between the lens and the image sensor when the lens of the first camera is at the fourth position is smaller than the distance between the lens and the image sensor when the lens of the first camera is at any other position.
  • In some embodiments, the image fusion module 305 may also be used to: calculate a defocus coefficient according to the first image and the second image; adjust the scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image; and perform image fusion processing on the subject image and the scale-adjusted second image to obtain the target image.
  • In some embodiments, the second acquisition module 302 may be used to: use the second camera to take a third image; acquire the disparity information of the first image and the third image; and calculate the depth information of the first image according to the disparity information.
  • When taking the third image, the second acquisition module 302 may be used to: acquire the first position where the lens of the first camera is located when the first camera takes the first image; and,
  • according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, drive the lens of the second camera to the sixth position and take a third image.
  • In the mapping relationship, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
  • In some embodiments, the second camera is a TOF camera or a 3D structured light camera.
  • In this case, the second acquisition module 302 may be used to acquire the depth information of the first image from the second camera.
  • In some embodiments, the image fusion module 305 may be used to: determine the area in the second image corresponding to the photographed subject as a target area, where the image of the target area matches the subject image; replace the image of the target area in the second image with the subject image; and determine the second image after the replacement is completed as the target image.
  • In some embodiments, the image fusion module 305 may be used to: determine the area in the second image corresponding to the photographed subject as a target area, where the image of the target area matches the subject image; perform image fusion processing on the image of the target area in the second image and the subject image; and determine the image after the fusion is completed as the target image.
  • An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed on a computer, the computer is caused to execute the flow in the image processing method provided in this embodiment.
  • An embodiment of the present application also provides an electronic device, including a memory and a processor, where the processor executes the flow in the image processing method provided in this embodiment by calling the computer program stored in the memory.
  • The above-mentioned electronic device may be a mobile terminal such as a tablet computer or a smartphone.
  • FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • The electronic device 400 may include components such as a camera module 401, a memory 402, and a processor 403. Those skilled in the art can understand that the structure of the electronic device shown in FIG. 7 does not constitute a limitation on the electronic device; it may include more or fewer components than shown in the figure, combine certain components, or use different component arrangements.
  • The camera module 401 may include at least a first camera and a second camera.
  • The memory 402 can be used to store application programs and data.
  • The application programs stored in the memory 402 contain executable code.
  • Application programs can be composed of various functional modules.
  • The processor 403 executes various functional applications and data processing by running the application programs stored in the memory 402.
  • The processor 403 is the control center of the electronic device. It uses various interfaces and lines to connect the various parts of the entire electronic device, executes the various functions of the electronic device and processes data by running or executing the application programs stored in the memory 402 and calling the data stored in the memory 402, and thereby monitors the electronic device as a whole.
  • In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402, and runs the application programs stored in the memory 402, thereby executing the flow in the image processing method provided by the embodiments of the present application.
  • The electronic device may also have components such as a touch screen, a speaker, a microphone, and a battery.
  • The touch screen can be used to display information such as images and text.
  • The touch screen can also function as an input and output unit.
  • For example, the touch screen can be used to receive input numbers, character information, or user characteristic information (such as fingerprints), and to generate optical or trackball signal input related to user settings and function control.
  • The touch screen can also be used to display information input by the user or provided to the user, as well as the various graphical user interfaces of the electronic device. These graphical user interfaces can be composed of graphics, text, icons, videos, and any combination thereof.
  • The speaker can be used to play sound signals.
  • The microphone can pick up sound signals from the surrounding environment.
  • The battery can provide power for the various components of the entire electronic device.
  • An embodiment of the present application also provides an electronic device.
  • The above-mentioned electronic device includes an image processing circuit, which may be implemented by hardware and/or software components and may include various processing units that define an image signal processing (Image Signal Processing) pipeline.
  • The image processing circuit may at least include a camera, an image signal processor (Image Signal Processor, ISP processor), control logic, an image memory, a display, and so on.
  • The camera may include at least one or more lenses and an image sensor.
  • The image sensor may include a color filter array (such as a Bayer filter).
  • The image sensor can obtain the light intensity and wavelength information captured by each imaging pixel of the image sensor, and provide a set of raw image data that can be processed by the image signal processor.
  • The image signal processor can process the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the image signal processor may perform one or more image processing operations on the raw image data and collect statistical information about the image data. The image processing operations can be performed with the same or different bit-depth accuracy.
  • The raw image data can be stored in the image memory after being processed by the image signal processor.
  • The image signal processor can also receive image data from the image memory.
  • The image memory may be a part of a memory device, a storage device, or an independent dedicated memory in the electronic device, and may include DMA (Direct Memory Access) features.
  • The image signal processor can perform one or more image processing operations, such as temporal filtering.
  • The processed image data can be sent to the image memory for additional processing before being displayed.
  • The image signal processor may also receive processed data from the image memory and perform image data processing on it in the raw domain and in the RGB and YCbCr color spaces.
  • The processed image data can be output to a display for viewing by the user and/or further processed by a graphics engine or GPU (Graphics Processing Unit).
  • The output of the image signal processor can also be sent to the image memory, and the display can read image data from the image memory.
  • The image memory may be configured to implement one or more frame buffers.
  • The statistical data determined by the image signal processor can be sent to the control logic.
  • The statistical data may include image sensor statistics such as automatic exposure, automatic white balance, automatic focus, flicker detection, black level compensation, and lens shading correction.
  • The control logic may include a processor and/or microcontroller that executes one or more routines (such as firmware).
  • The routines can determine the control parameters of the camera and the ISP control parameters based on the received statistical data.
  • The control parameters of the camera may include camera flash control parameters, lens control parameters (for example, the focal length for focusing or zooming), or a combination of these parameters.
  • The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (for example, during RGB processing).
  • FIG. 8 is a schematic diagram of the structure of the image processing circuit in this embodiment. As shown in FIG. 8, for ease of description, only the aspects of the image processing technology related to the embodiment of the present application are shown.
  • The image processing circuit may include: a first camera 510, a second camera 520, a first image signal processor 530, a second image signal processor 540, control logic 550, an image memory 560, and a display 570.
  • The first camera 510 may include one or more first lenses 511 and a first image sensor 512.
  • The second camera 520 may include one or more second lenses 521 and a second image sensor 522.
  • The first image collected by the first camera 510 is transmitted to the first image signal processor 530 for processing.
  • The first image signal processor 530 may send statistical data of the first image (such as the brightness, contrast, and color of the image) to the control logic 550.
  • The control logic 550 can determine the control parameters of the first camera 510 according to the statistical data, so that the first camera 510 can perform operations such as auto-focus and auto-exposure according to the control parameters.
  • The first image can be stored in the image memory 560 after being processed by the first image signal processor 530.
  • The first image signal processor 530 may also read the image stored in the image memory 560 for processing.
  • The first image may also be sent directly to the display 570 for display after being processed by the first image signal processor 530.
  • The display 570 can also read the image in the image memory 560 for display.
  • Similarly, the second image collected by the second camera 520 is transmitted to the second image signal processor 540 for processing.
  • The second image signal processor 540 may send statistical data of the second image (such as the brightness, contrast, and color of the image) to the control logic 550.
  • The control logic 550 can determine the control parameters of the second camera 520 according to the statistical data, so that the second camera 520 can perform operations such as auto-focus and auto-exposure according to the control parameters.
  • The second image can be stored in the image memory 560 after being processed by the second image signal processor 540.
  • The second image signal processor 540 may also read the image stored in the image memory 560 for processing.
  • The second image may also be sent directly to the display 570 for display after being processed by the second image signal processor 540.
  • The display 570 can also read the image in the image memory 560 for display.
  • In other implementations, the first image signal processor and the second image signal processor may be combined into a unified image signal processor that processes the data of the first image sensor and the second image sensor separately.
  • In addition, the electronic device may also include a CPU and a power supply module.
  • The CPU is connected to the control logic, the first image signal processor, the second image signal processor, the image memory, and the display, and the CPU is used for global control.
  • The power supply module is used to supply power to each module.
  • Take, for example, a mobile phone with a dual camera module working in certain camera modes.
  • The CPU controls the power supply module to supply power to the first camera and the second camera.
  • The image sensor in the first camera and the image sensor in the second camera are powered on, so that image collection and conversion can be realized.
  • In some camera modes, only one camera in the dual camera module works; for example, only the telephoto camera works.
  • In this case, the CPU controls the power supply module to supply power to the image sensor of the corresponding camera.
  • the electronic device may further execute: determining the position where the lens of the first camera is located when the first image is taken by the first camera as the first position;
  • when the electronic device executes the acquiring of the second image by using the second camera, it may also execute: detecting the distance between the photographing subject and the first camera; when the distance between the photographing subject and the first camera is less than a preset threshold, selecting a corresponding lens position from a plurality of lens positions as a second position of the lens of the first camera according to a preset first strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position;
  • driving the lens of the second camera to a third position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and taking a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
  • when the electronic device executes the acquiring of the second image by using the second camera, it may also execute: when the distance between the photographing subject and the first camera is greater than or equal to the preset threshold, selecting a corresponding lens position from a plurality of lens positions as a fourth position of the lens of the first camera according to a preset second strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the fourth position is less than the distance between the lens and the image sensor when the lens is at the first position; driving the lens of the second camera to a fifth position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and taking a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera.
  • the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens of the first camera and the image sensor when the lens of the first camera is at any other position.
  • the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the fourth position is smaller than the distance between the lens of the first camera and the image sensor when the lens of the first camera is at any other position.
  • the electronic device may further execute: calculating a defocus coefficient according to the first image and the second image; adjusting the scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image; the performing image fusion processing on the subject image and the second image to obtain the target image then includes: performing image fusion processing on the subject image and the scale-adjusted second image to obtain the target image.
  • when the electronic device executes the acquisition of the depth information of the first image, it may execute: using the second camera to shoot a third image; acquiring disparity information of the first image and the third image, and calculating the depth information of the first image according to the disparity information.
  • when the electronic device executes the use of the second camera to capture the third image, it may execute: acquiring the first position where the lens of the first camera is located when the first camera captures the first image; driving the lens of the second camera to a sixth position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and taking a third image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
  • the second camera may be a TOF camera or a 3D structured light camera.
  • when the electronic device executes the acquisition of the depth information of the first image, it may execute: acquiring the depth information of the first image according to the second camera.
  • when the electronic device executes the image fusion processing of the subject image and the second image to obtain the target image, it may execute: determining the area in the second image corresponding to the photographed subject as a target area, where the image of the target area matches the subject image; replacing the image of the target area in the second image with the subject image, and determining the second image after the image replacement as the target image.
  • when the electronic device executes the image fusion processing of the subject image and the second image to obtain the target image, it may execute: determining the area in the second image corresponding to the photographed subject as a target area, where the image of the target area matches the subject image; performing image fusion processing on the image of the target area in the second image and the subject image, and determining the fused image as the target image.
  • the image processing device provided in the embodiments of the application is based on the same concept as the image processing method in the above embodiments, and any method provided in the image processing method embodiments can be run on the image processing device.
  • For details of the implementation process, refer to the embodiments of the image processing method, which are not repeated here.
  • the computer program may be stored in a computer readable storage medium, such as stored in a memory, and executed by at least one processor.
  • the execution process may include the process of the embodiment of the image processing method.
  • the storage medium may be a magnetic disk, an optical disc, a read only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), etc.
  • for the image processing device of the embodiment of the present application, its functional modules may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. If an integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

This application discloses an image processing method applied to an electronic device, including: using a first camera to acquire a first image in which the photographed subject is imaged clearly; acquiring depth information of the first image; performing image segmentation on the first image to obtain a subject image; using a second camera to acquire a second image in which the photographed subject is out of focus; and performing image fusion processing on the subject image and the second image to obtain a target image.

Description

Image processing method and apparatus, storage medium, and electronic device
Technical Field
This application belongs to the field of image technology, and in particular relates to an image processing method and apparatus, a storage medium, and an electronic device.
Background
Image blurring is often used in image processing technology. For example, when performing image processing, an electronic device can blur the background of an image so as to obtain an effect of highlighting the photographed subject. An image with a prominent subject and a blurred background is highly expressive.
Summary
Embodiments of the present application provide an image processing method and apparatus, a storage medium, and an electronic device, which can improve the blurring effect of an image.
In a first aspect, an embodiment of the present application provides an image processing method applied to an electronic device, the electronic device including at least a first camera and a second camera, the method including:
using the first camera to acquire a first image in which a photographed subject is imaged clearly;
acquiring depth information of the first image;
performing image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
using the second camera to acquire a second image in which the photographed subject is out of focus;
performing image fusion processing on the subject image and the second image to obtain a target image.
In a second aspect, an embodiment of the present application provides an image processing apparatus applied to an electronic device, the electronic device including at least a first camera and a second camera, the apparatus including:
a first acquisition module configured to use the first camera to acquire a first image in which a photographed subject is imaged clearly;
a second acquisition module configured to acquire depth information of the first image;
an image segmentation module configured to perform image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
a third acquisition module configured to use the second camera to acquire a second image in which the photographed subject is out of focus;
an image fusion module configured to perform image fusion processing on the subject image and the second image to obtain a target image.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, wherein, when the computer program is executed on a computer, the computer is caused to execute the flow in the image processing method provided by the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides an electronic device including a memory, a processor, and at least a first camera and a second camera, wherein the processor is configured, by invoking a computer program stored in the memory, to execute:
using the first camera to acquire a first image in which a photographed subject is imaged clearly;
acquiring depth information of the first image;
performing image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
using the second camera to acquire a second image in which the photographed subject is out of focus;
performing image fusion processing on the subject image and the second image to obtain a target image.
Brief Description of the Drawings
The technical solutions of the present application and their beneficial effects will become apparent from the following detailed description of specific embodiments of the present application taken in conjunction with the accompanying drawings.
FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present application.
FIG. 2 is another schematic flowchart of the image processing method provided by an embodiment of the present application.
FIG. 3 to FIG. 5 are schematic diagrams of scenes of the image processing method provided by an embodiment of the present application.
FIG. 6 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application.
FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
FIG. 8 is a schematic structural diagram of an image processing circuit provided by an embodiment of the present application.
Detailed Description
Referring to the drawings, in which the same component symbols represent the same components, the principles of the present application are illustrated by implementation in an appropriate computing environment. The following description is based on the illustrated specific embodiments of the present application, which should not be regarded as limiting other specific embodiments of the present application that are not detailed herein.
It can be understood that the execution subject of the embodiments of the present application may be an electronic device with cameras, such as a smartphone or a tablet computer.
Please refer to FIG. 1, which is a schematic flowchart of an image processing method provided by an embodiment of the present application. The image processing method can be applied to an electronic device, and the electronic device can include at least a first camera and a second camera. The flow of the image processing method can include:
101: Use the first camera to acquire a first image in which the photographed subject is imaged clearly.
Image blurring is often used in image processing technology. For example, when performing image processing, an electronic device can blur the background of an image to obtain an effect of highlighting the photographed subject. An image with a prominent subject and a blurred background is highly expressive. In the related art, image blurring is often needed in shooting scenarios such as portrait shooting and macro shooting. However, in the related art, the blurring effect is poor. Taking background blurring as an example, in the related art an image is generally captured, the photographed subject is then segmented from this image, and the background area other than the photographed subject is blurred. The blurring in the related art is therefore simulated on the original image; it is not real or natural enough, which results in a poor blurring effect.
In the embodiments of the present application, an electronic device with two cameras is taken as an example for description. Of course, in other implementations the electronic device may also have more than two cameras, such as three or four cameras, which is not specifically limited in this embodiment.
For example, the electronic device may first use the first camera to acquire a first image in which the photographed subject (such as a person) is imaged clearly. That is, the photographed subject is imaged clearly in the first image.
102: Acquire depth information of the first image.
For example, after capturing the first image, the electronic device can acquire the depth information of the first image.
103: Perform image segmentation on the first image according to the depth information to obtain a subject image, the subject image being the image area in the first image corresponding to the photographed subject.
For example, after acquiring the depth information of the first image, the electronic device can segment the subject image from the first image according to the depth information; the subject image is the image of the image area in the first image corresponding to the photographed subject.
For example, if the photographed subject is a person, the electronic device can segment the image of the human body from the first image according to the depth information.
104: Use the second camera to acquire a second image in which the photographed subject is out of focus.
For example, the electronic device can also use the second camera to capture a second image in which the photographed subject is out of focus. That is, the photographed subject is blurred in the second image.
105: Perform image fusion processing on the subject image and the second image to obtain a target image.
For example, after segmenting the subject image from the first image and using the second camera to capture the second image in which the subject is out of focus, the electronic device can fuse the subject image and the second image to obtain the target image.
It can be understood that since the subject image used for fusion is a clear image of the photographed subject, the photographed subject in the target image is also imaged clearly. Moreover, since the defocus of the photographed subject in the second image also blurs the areas other than the subject, the areas other than the photographed subject in the target image are in a blurred state.
It can be understood that, in the embodiments of the present application, since the second image used for fusion is an image in which the photographed subject is out of focus, the blur in the second image is real, natural blur rather than simulated blur. Therefore, compared with the related-art solution of directly simulating a blur effect on an original image, the image processing method provided by the embodiments of the present application can obtain an image with a real blur effect, thereby improving the image blurring effect and imaging quality.
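For illustration only, the core of the above flow can be sketched in a few lines of Python with OpenCV and NumPy. This is a minimal sketch under strong assumptions: the two frames are already aligned to a common viewpoint and scale, and a simple depth threshold stands in for whatever segmentation rule an implementation actually uses. It is not the claimed implementation; the function name and the default threshold are hypothetical.

```python
import cv2
import numpy as np

def fuse_sharp_subject(first_image, second_image, depth_map, subject_depth_m=1.5):
    """Steps 103 and 105 in miniature: cut the sharp subject out of the
    first image by depth, then paste it onto the naturally defocused
    second image (steps 101, 102, and 104 are assumed already done)."""
    subject_mask = (depth_map < subject_depth_m).astype(np.uint8)  # step 103
    mask3 = cv2.merge([subject_mask] * 3)       # one channel per color plane
    return np.where(mask3 == 1, first_image, second_image)        # step 105
```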
Please refer to FIG. 2, which is another schematic flowchart of the image processing method provided by an embodiment of the present application. The image processing method can be applied to an electronic device, and the electronic device can include at least a first camera and a second camera.
When there are lights in the shooting scene, if the area where the lights are located is blurred, the lights turn into light spots. Light spots can give an image a hazy feel, making the image more expressive. However, in the related art, generally an image is captured and then an algorithm is applied to this image to blur the non-subject area where the lights are located, thereby generating light spots. Therefore, in the related art, the light spots are all simulated on the original image, and the imaging effect of such simulated light spots is unreal and unnatural.
The image processing method provided by the embodiments of the present application can obtain real and natural light spots. The flow of the image processing method provided by the embodiments of the present application can include:
201: The electronic device uses the first camera to acquire a first image in which the photographed subject is imaged clearly, and there are lights in the shooting scene corresponding to the first image.
For example, the electronic device can use the first camera to acquire a first image in which the photographed subject is clearly imaged, where there are lights in the shooting scene corresponding to the first image.
In one implementation, the first camera may be the main camera of the electronic device.
202: The electronic device determines the position of the lens of the first camera when the first camera captures the first image as a first position.
203: According to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device drives the lens of the second camera to a sixth position and captures a third image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
204: The electronic device acquires disparity information of the first image and the third image, and calculates depth information of the first image according to the disparity information.
For example, 202, 203, and 204 may include:
After capturing the first image, the electronic device can use the second camera to acquire a third image, acquire the disparity information of the first image and the third image, and calculate the depth information of the first image according to the disparity information.
For example, the electronic device can acquire the position of the lens of the first camera when the first camera captures the first image, and determine this position as the first position. Then, according to the preset mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the sixth position, and capture the third image when the lens of the second camera is at the sixth position. In the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera. After acquiring the third image, the electronic device can acquire the disparity information of the first image and the third image, and calculate the depth information of the first image according to the disparity information.
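The relation behind 204 is standard stereo triangulation: for two rectified cameras of the same specification with focal length f (in pixels) and baseline B (the distance between their optical centers), a pixel with disparity d lies at depth Z = f * B / d. A minimal sketch, assuming the disparity map has already been estimated (for example, with a block-matching stereo algorithm):

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d, evaluated elementwise over a disparity map.
    Pixels with zero or negative disparity are mapped to NaN instead of
    producing a divide-by-zero result."""
    d = np.where(disparity_px > 0, disparity_px.astype(np.float64), np.nan)
    return focal_px * baseline_m / d
```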
For example, suppose the first camera and the second camera are cameras of the same specification. When the lens of the first camera is at the first position, the electronic device can acquire the first digital-to-analog conversion code value (DAC code) corresponding to the first camera at this time. The electronic device can then drive the lens of the second camera to the sixth position according to this first DAC code value. It can be understood that since the DAC code value corresponding to the sixth position and the DAC code value corresponding to the first position are the same value, the sixth position of the lens of the second camera corresponds to the first position of the lens of the first camera.
It should be noted that the component that controls lens focusing in an electronic device is the voice coil motor (VCM). A voice coil motor converts current into mechanical force; its positioning and force control are determined by an external controller. The voice coil motor in the electronic device has a corresponding voice coil motor driver circuit (VCM Driver IC). The driver circuit can precisely control the moving distance and direction of the coil inside the voice coil motor, thereby driving the movement of the lens to achieve focusing.
The voice coil motor works on the basis of Ampère's law: when the coil in the voice coil motor conducts current, the force generated by the current pushes the lens fixed on the carrier to move, thereby changing the focus distance. It can be seen that the voice coil motor's control of the focus distance is actually realized by controlling the current in the coil. Simply put, the driver circuit provides the driving source, current; after the current is supplied to the coil of the voice coil motor, the magnetic field inside the motor generates a force that pushes the coil (and with it, the lens).
The voice coil motor driver circuit is in fact a DAC circuit with a control algorithm. It converts the DAC code value containing digital position information, transmitted over the I2C bus, into a corresponding output current (the output current corresponding to that DAC code value); the voice coil motor then converts the output current into a focus distance. Different output currents form loops through the voice coil motor and generate different Ampère forces, which push the lens mounted on the voice coil motor. Therefore, after focusing is completed, the camera stays at the position where focus is sharp, and this position has a corresponding digital-to-analog conversion code value (DAC code).
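For same-specification cameras, mirroring a lens position then amounts to writing the same DAC code to the second camera's VCM driver IC over I2C. The sketch below is hypothetical throughout: the bus object, the register layout, and the 10-bit code split are placeholders, since real VCM driver register maps are vendor-specific.

```python
class VCMDriver:
    """Hypothetical wrapper around a VCM driver IC on an I2C bus."""

    def __init__(self, bus, address):
        self.bus = bus          # hypothetical I2C bus object
        self.address = address  # 7-bit I2C address of the driver IC

    def move_to(self, dac_code):
        # A 10-bit DAC code is assumed split across two bytes; the
        # actual register layout depends on the driver IC.
        hi, lo = (dac_code >> 8) & 0x03, dac_code & 0xFF
        self.bus.write(self.address, bytes([hi, lo]))

def mirror_lens_position(first_cam_dac_code, second_cam_driver):
    # Same-spec cameras: reusing the DAC code reproduces the lens position.
    second_cam_driver.move_to(first_cam_dac_code)
```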
205: According to the depth information, the electronic device performs image segmentation on the first image to obtain a subject image, the subject image being the image area in the first image corresponding to the photographed subject.
For example, after acquiring the depth information of the first image, the electronic device can perform image segmentation on the first image according to the depth information so as to segment the subject image from the first image. The subject image is the image of the image area in the first image corresponding to the photographed subject.
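A minimal sketch of the depth-based segmentation in 205, assuming a depth map aligned with the first image. The fixed threshold and the morphological cleanup are illustrative choices, not requirements of the method:

```python
import cv2
import numpy as np

def segment_subject(first_image, depth_map, max_subject_depth_m):
    """Keep pixels closer than a depth threshold as the subject."""
    mask = (depth_map < max_subject_depth_m).astype(np.uint8)
    # Close small holes so the subject comes out as one clean region
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    subject = cv2.bitwise_and(first_image, first_image, mask=mask)
    return subject, mask
```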
206: The electronic device detects the distance between the photographed subject and the first camera.
For example, the electronic device can determine the position of the lens of the first camera when the first camera captures the first image as the first position.
Then, the electronic device can detect the distance between the photographed subject and the first camera.
If the distance between the photographed subject and the first camera is less than a preset threshold, the photographed subject can be considered to be nearby. In this case, the flow proceeds to 207.
If the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, the photographed subject can be considered to be far away. In this case, the flow proceeds to 209.
In one implementation, the electronic device can detect the distance between the photographed subject and the first camera according to the first position of the lens of the first camera when the first camera captures the first image.
For example, as described above, when the lens is driven to different positions, different DAC code values apply. And when the distance between the photographed subject and the first camera differs, the lens is driven to different positions in order to image clearly. Therefore, the distance between the photographed subject and the first camera can be detected according to the position of the lens when the photographed subject is imaged clearly.
For example, suppose the value range of the DAC code of the first camera is [S1, S3], where S2 is greater than S1 and less than S3. The electronic device can preset that when the current DAC code value falls within the range [S1, S2], the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, that is, the photographed subject is far away; and when the current DAC code value falls within the range (S2, S3], the distance between the photographed subject and the first camera is less than the preset threshold, that is, the photographed subject is nearby.
Of course, the electronic device can also detect the distance between the photographed subject and the first camera in other ways, so as to determine whether the photographed subject is near or far. For example, the electronic device can calculate the distance between the photographed subject and the first camera according to the time difference between emitting a laser detection signal outward and receiving the returned laser signal, thereby determining whether the photographed subject is near or far, and so on.
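As a sketch, the near/far decision based on the DAC code range [S1, S3] reduces to a single comparison against the calibration constant S2:

```python
def subject_is_near(current_dac_code, s2):
    """Classify near/far from the in-focus lens position (DAC code).
    Codes in [S1, S2] mean the subject is at or beyond the preset
    distance threshold (far); codes in (S2, S3] mean it is near.
    S2 is a per-module calibration constant."""
    return current_dac_code > s2
```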
207: When the distance between the photographed subject and the first camera is less than the preset threshold, the electronic device selects a corresponding lens position from a plurality of lens positions as a second position of the lens of the first camera according to a preset first strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position.
208: According to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device drives the lens of the second camera to a third position and captures a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera, and there are lights in the shooting scene corresponding to the second image.
For example, 207 and 208 may include:
The electronic device detects that the distance between the photographed subject and the first camera is less than the preset threshold, that is, the photographed subject is nearby. In this case, the electronic device can select a corresponding lens position from a plurality of lens positions as the second position of the lens of the first camera according to the preset first strategy, wherein the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position. Then, according to the preset mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the third position, and when the lens of the second camera moves to the third position, the electronic device can capture the second image. In the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera. For example, when the first camera and the second camera are cameras of the same specification, the DAC code value corresponding to the first camera when its lens is at the second position is the same as the DAC code value corresponding to the second camera when its lens is at the third position.
In some implementations, the preset first strategy may be to randomly select a lens position as the second position of the lens of the first camera, as long as the distance between the lens and the image sensor when the lens of the first camera is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position.
It should be noted that since the distance between the lens of the first camera and the image sensor (of the first camera) at the second position is greater than that at the first position, the second image captured when the lens of the second camera is at the third position clearly images scenery closer than the position of the photographed subject. As a result, the photographed subject is out of focus, that is, blurred, in the second image captured by the second camera. Meanwhile, the scenery in the background area of the second image is also blurred. In other words, the second image is far-defocused (focused nearer than the subject). At this time, light spots of the lights in the shooting scene can be formed in the second image; since these light spots are formed naturally through the above far-defocus, they are genuinely generated light spots.
In one implementation, the distance between the lens and the image sensor when the lens of the first camera is at the second position may be greater than the distance between the lens and the image sensor when the lens of the first camera is at any other position. That is, in the first camera, the distance between the above second position and the image sensor (of the first camera) may be greater than the distance between any other lens position and the image sensor (of the first camera). In other words, the third position is the position at which the lens of the second camera is driven to the outermost side. In this case, the blur effect of objects in the background area of the captured second image is the best, and the lights in the shooting scene can form light spots with the best blur effect in the second image.
209: When the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, the electronic device selects a corresponding lens position from a plurality of lens positions as a fourth position of the lens of the first camera according to a preset second strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the fourth position is less than the distance between the lens and the image sensor when the lens is at the first position.
210: According to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device drives the lens of the second camera to a fifth position and captures a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera, and there are lights in the shooting scene corresponding to the second image.
For example, 209 and 210 may include:
The electronic device detects that the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, that is, the photographed subject is far away. In this case, the electronic device can select a corresponding lens position from a plurality of lens positions as the fourth position of the lens of the first camera according to the preset second strategy, wherein the distance between the lens and the image sensor when the lens of the first camera is at the fourth position is less than the distance between the lens and the image sensor when the lens is at the first position. Then, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the fifth position, and when the lens of the second camera moves to the fifth position, the electronic device can capture the second image. In the mapping relationship between the lens position of the first camera and the lens position of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera. For example, when the first camera and the second camera are cameras of the same specification, the DAC code value corresponding to the first camera when its lens is at the fourth position is the same as the DAC code value corresponding to the second camera when its lens is at the fifth position.
In some implementations, the preset second strategy may be to randomly select a lens position as the fourth position of the lens of the first camera, as long as the distance between the lens and the image sensor when the lens of the first camera is at the fourth position is less than the distance between the lens and the image sensor when the lens is at the first position.
It should be noted that since the distance between the lens of the first camera and the image sensor (of the first camera) at the fourth position is less than that at the first position, the second image captured when the lens of the second camera is at the fifth position is focused on scenery farther away than the photographed subject. As a result, the photographed subject is out of focus, that is, blurred, in the second image captured by the second camera. Meanwhile, the foreground area in the second image is also blurred. In other words, the second image is near-defocused (focused farther than the subject). At this time, light spots of the lights in the shooting scene can be formed in the second image; since these light spots are formed naturally through the above near-defocus, they are genuinely generated light spots.
In one implementation, the distance between the lens and the image sensor when the lens of the first camera is at the fourth position may be less than the distance between the lens and the image sensor when the lens of the first camera is at any other position. That is, in the first camera, the distance between the above fourth position and the image sensor (of the first camera) may be less than the distance between any other lens position and the image sensor (of the first camera). In other words, at the fifth position the lens of the second camera is driven to the innermost side. In this case, the blur effect of objects in the foreground area of the captured second image is the best, and the lights in the shooting scene can form light spots with the best blur effect in the second image.
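Putting 207 through 210 together, the strongest-blur variant described above reduces to choosing an extreme DAC code for the first camera and mirroring it to the second camera. A sketch, where the DAC limits are assumed to come from module calibration data:

```python
def choose_defocus_code(subject_near, dac_min, dac_max):
    """Pick the lens position that maximizes defocus blur.
    Near subject: outermost position (max DAC code, lens farthest from
    the sensor), so the second camera focuses nearer than the subject.
    Far subject: innermost position (min DAC code), so it focuses
    farther than the subject. The returned code would then be written
    to the second camera's VCM driver (see the earlier driver sketch)."""
    return dac_max if subject_near else dac_min
```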
211: According to the first image and the second image, the electronic device calculates a defocus coefficient.
212: According to the defocus coefficient, the electronic device adjusts the scale of the second image to obtain a scale-adjusted second image.
For example, 211 and 212 may include:
After capturing the second image in which the subject is out of focus, the electronic device can calculate the defocus coefficient according to the first image and the second image.
It should be noted that in this embodiment, the defocus coefficient may refer to the magnification (deformation) factor of objects in the second image relative to objects in the first image. Since the photographed subject is out of focus in the second image, objects in the second image become blurred, and blur deforms objects, that is, enlarges them. Objects in the second image are therefore magnified relative to objects in the first image. For example, if the diameter of the area formed by a certain light in the first image is 15 pixels, and the diameter of the light spot formed by that light after defocusing in the second image is 30 pixels, this indicates that objects in the second image are magnified by a factor of 2 relative to objects in the first image.
In order to keep the first image and the second image at a consistent scale and thereby facilitate image fusion, this embodiment can adjust the scale of the second image according to the calculated defocus coefficient to obtain a scale-adjusted second image. For example, if objects in the second image are magnified by a factor of 2 relative to objects in the first image, the second image needs to be reduced to half of its original size before image fusion.
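A sketch of 211 and 212, assuming the defocus coefficient is estimated from the diameter of the same light measured in both images (the 15-pixel versus 30-pixel example above); automatically measuring those diameters is outside this sketch:

```python
import cv2

def rescale_defocused(second_image, spot_diam_first_px, spot_diam_second_px):
    """Undo defocus magnification before fusion."""
    coeff = spot_diam_second_px / spot_diam_first_px  # e.g. 30 / 15 = 2.0
    h, w = second_image.shape[:2]
    return cv2.resize(second_image, (round(w / coeff), round(h / coeff)),
                      interpolation=cv2.INTER_AREA)
```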
213: The electronic device performs image fusion processing on the subject image and the scale-adjusted second image to obtain a target image.
For example, after obtaining the scale-adjusted second image, the electronic device can perform image fusion processing on the subject image and the scale-adjusted second image to obtain the target image, in which the photographed subject is imaged clearly and the areas other than the photographed subject show a blur effect.
It can be understood that this embodiment can generate real and natural light spots and fuse the real light spots into the target image; the light spots in the target image are therefore also real and natural, which improves the imaging quality of the image.
It can be understood that since the subject image used for fusion is a clear image of the photographed subject, the photographed subject in the target image is also imaged clearly. Moreover, since the defocus of the photographed subject in the second image also blurs the areas other than the subject, the areas other than the photographed subject in the target image are in a blurred state.
It can be understood that, in the embodiments of the present application, since the second image used for fusion is an image in which the photographed subject is out of focus, the blur in the second image is real blur rather than simulated blur. Therefore, compared with the related-art solution of directly simulating a blur effect on an original image, the image processing method provided by the embodiments of the present application can obtain an image with a real blur effect, thereby improving the image blurring effect and imaging quality.
In some implementations, the flow in which the electronic device performs fusion processing on the subject image and the second image to obtain the target image may include:
the electronic device determines the area in the second image corresponding to the photographed subject as a target area, the image of the target area matching the subject image;
the electronic device replaces the image of the target area in the second image with the subject image, and determines the second image after the image replacement as the target image.
For example, the electronic device can first determine, from the second image, the area corresponding to the photographed subject and determine this area as the target area; the image of the target area matches the subject image segmented from the first image. For example, the electronic device can perform image alignment (matching) on the subject image and the second image, and after the alignment, determine the area in the second image corresponding to the subject image as the target area.
After the target area is determined, the electronic device can directly replace the image of the target area in the second image with the subject image, and determine the second image after the image replacement as the target image, thereby completing the fusion processing of the second image and the subject image.
It can be understood that when the fusion of the second image and the subject image is completed by direct image replacement as above, the image of the photographed subject in the resulting target image has good sharpness, because the subject portion of the target image is the very clearly imaged subject from the first image.
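A minimal sketch of the direct-replacement fusion, assuming the subject image and its mask have already been aligned and scale-adjusted to the second image's coordinates (the alignment itself, for example by feature matching, is not shown):

```python
import numpy as np

def fuse_by_replacement(second_image, subject_image, subject_mask):
    """Paste the sharp subject pixels directly into the defocused frame."""
    target = second_image.copy()
    target[subject_mask == 1] = subject_image[subject_mask == 1]
    return target
```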
Alternatively, in another implementation, the flow in which the electronic device performs fusion processing on the subject image and the second image to obtain the target image may also include:
the electronic device determines the area in the second image corresponding to the photographed subject as a target area, the image of the target area matching the subject image;
the electronic device performs image fusion processing on the image of the target area in the second image and the subject image, and determines the fused image as the target image.
For example, the electronic device can first determine, from the second image, the area corresponding to the photographed subject and determine this area as the target area; the image of the target area matches the subject image segmented from the first image. For example, the electronic device can perform image alignment (matching) on the subject image and the second image, and after the alignment, determine the area in the second image corresponding to the subject image as the target area.
After the target area is determined, the electronic device can perform image pixel fusion processing on the image of the target area in the second image and the subject image, and determine the fused image as the target image.
It can be understood that in the target image obtained by pixel fusion of the target-area image of the second image and the subject image as above, the image of the subject area blends better with the second image as a whole.
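A sketch of the pixel-fusion variant as a feathered alpha blend; the feather width is an illustrative tuning parameter rather than something the method prescribes:

```python
import cv2
import numpy as np

def fuse_by_blending(second_image, subject_image, subject_mask, feather_px=15):
    """Alpha-blend the sharp subject into the target area; a blurred mask
    softens the seam between the subject and the defocused surroundings."""
    alpha = cv2.GaussianBlur(subject_mask.astype(np.float32), (0, 0), feather_px)
    alpha = alpha[..., None]  # broadcast over the color channels
    blended = alpha * subject_image.astype(np.float32) + \
              (1.0 - alpha) * second_image.astype(np.float32)
    return blended.astype(np.uint8)
```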
In the present application, in addition to calculating the depth information of the first image by using the disparity information of the images captured by the first camera and the second camera, in other implementations the second camera of the electronic device may also be a depth camera; for example, the second camera may be a TOF camera or a 3D structured light camera. In that case, the electronic device can acquire the depth information of the first image according to the second camera, and then segment the first image according to the depth information of the first image to obtain the subject image of the photographed subject.
It should be noted that a TOF (Time of Flight) camera mainly consists of an infrared light projector and a receiving module. The projector of the TOF camera projects infrared light outward; the infrared light is reflected after hitting the measured object and is received by the receiving module of the TOF camera. By recording the time from the emission of the infrared light to its reception, the depth information of the illuminated object can be calculated and 3D modeling can be completed. That is, the depth information of the photographed object can be acquired by using the TOF camera.
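Numerically, the TOF depth is simply the speed of light times the round-trip time, divided by two, since the infrared pulse travels out and back:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(round_trip_time_s):
    """One-way distance from a measured round-trip time: d = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For example, a round trip of 10 nanoseconds corresponds to a depth of about 1.5 meters.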
The basic principle of 3D structured light technology is to project light with certain structural features onto the photographed object through a near-infrared laser, and then collect it with a dedicated infrared camera. Because light with such a structure falls on areas of the photographed object at different depths, different image phase information is collected, and an arithmetic unit then converts this change of structure into depth information, thereby obtaining the three-dimensional structure. Simply put, the three-dimensional structure of the photographed object is acquired by optical means, and the acquired information is then put to deeper use. That is, a 3D structured light camera projects multiple light spots onto the photographed object by using a dot-matrix projector, captures a three-dimensional light image of the photographed object with an infrared camera, and then calculates the depth information of the photographed object through a processing system.
It should be noted that in the present application, the approach of calculating the depth information of the first image based on the disparity information of the images captured by the first camera and the second camera has low hardware cost and simple computation, and can calculate the depth information of the first image quickly, thereby increasing the speed of segmenting the subject image from the first image. If the second camera in the electronic device is a TOF camera, then since the recognition distance of a TOF camera is relatively long, the TOF camera can calculate the depth information of the first image relatively accurately when the photographed subject is far from the camera, thereby improving the accuracy of segmenting the subject image from the first image. If the second camera in the electronic device is a 3D structured light camera, then since the recognition accuracy of a 3D structured light camera is relatively high, using it can improve the accuracy of calculating the depth information of the first image, thereby improving the accuracy of segmenting the subject image from the first image.
In another implementation, the electronic device may also first detect whether there are lights in the shooting scene; if there are, the image processing method provided by the present application can be used to obtain an image with real light spots. For example, the electronic device can detect whether there are lights in the shooting scene by means of scene recognition. For example, such scene recognition may be scene recognition implemented based on artificial intelligence technology.
In addition to detecting lights in the shooting scene through scene recognition, this embodiment can also detect them in the following way, and use the image processing method provided by this embodiment to obtain an image with real light spots when lights are detected. For example, the electronic device can first use the first camera to acquire a first image in which the photographed subject is clearly imaged. The electronic device can then acquire brightness distribution information of the first image, and detect, according to the brightness distribution information of the first image, whether the number of pixels in the first image whose brightness value is greater than a preset brightness threshold is greater than a preset value. If it is detected that the number of such pixels is greater than the preset value, it can be considered that there is an overexposed area in the first image, and this overexposed area is very likely formed by a lamp or light. In this case, it can be considered that there are lights in the shooting scene, and the image processing method provided by this embodiment needs to be used to obtain a target image with real light spots.
In one implementation, the brightness distribution information of the first image may be the brightness histogram of the first image.
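A sketch of the histogram-based light detection; both thresholds are illustrative presets, not values the document specifies:

```python
import cv2

def scene_has_lights(first_image, brightness_thresh=240, count_thresh=500):
    """Count pixels brighter than a preset threshold; if there are enough
    of them, treat the frame as containing an overexposed (light) area."""
    gray = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    return hist[brightness_thresh:].sum() > count_thresh
```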
Please refer to FIG. 3 to FIG. 5, which are schematic diagrams of scenes of the image processing method provided by the embodiments of the present application.
For example, the electronic device includes two cameras, a first camera and a second camera. When the user aims the cameras at the shooting scene and presses the shutter button, the electronic device can use the first camera to capture a first image in which the photographed subject is clearly imaged. For example, the first image may be as shown in FIG. 3, where the photographed subject is the seat of a bicycle.
Then, the electronic device can acquire the first position of the lens of the first camera when the first camera captures the first image, drive the lens of the second camera to the sixth position according to the preset mapping relationship between the lens position of the first camera and the lens position of the second camera, and capture a third image when the lens of the second camera is at the sixth position. In the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera. The electronic device can then calculate the disparity information of the first image and the third image, and use the disparity information to calculate the depth information of the first image.
After the depth information of the first image is calculated, the electronic device can perform image segmentation on the first image, thereby segmenting the subject image (that is, the bicycle seat) from the first image.
Then, the electronic device can detect the distance between the photographed subject and the first camera, that is, detect whether the photographed subject is near or far. For example, in this embodiment, the electronic device detects that the photographed subject (the bicycle seat) is nearby.
In this case, the electronic device can select a corresponding lens position from a plurality of lens positions as the second position of the lens of the first camera, wherein the distance between the lens of the first camera and the image sensor at the second position is greater than the distance between the lens and the image sensor at the first position; that is, relative to the first position, the lens at the second position is farther away from the image sensor, in other words the lens extends forward. Moreover, the distance between the lens of the first camera and the image sensor at the second position is greater than the distance between the lens and the image sensor at any other position; the second position is thus the position farthest from the image sensor.
Then, according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, the electronic device can drive the lens of the second camera to the third position; in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera. For example, when the first camera and the second camera have the same specification, the DAC code value corresponding to the third position is the same as the DAC code value corresponding to the second position.
When the lens of the second camera moves to the third position, the electronic device can use the second camera to capture the second image. It can be understood that the second image is an image in which the photographed subject is out of focus. For example, the second image is as shown in FIG. 4, from which it can be seen that the image of the bicycle seat is blurred, and the image around the bicycle is blurred as well.
After capturing the second image in which the subject is out of focus, the electronic device can calculate the defocus coefficient according to the first image and the second image. The electronic device can then adjust the scale of the second image according to the calculated defocus coefficient to obtain a scale-adjusted second image.
After that, the electronic device can perform image fusion processing on the subject image and the scale-adjusted second image to obtain the target image, in which the photographed subject is imaged clearly and the areas other than the photographed subject show a blur effect. For example, the target image may be as shown in FIG. 5, from which it can be seen that the photographed subject, the bicycle seat, is imaged clearly, while the areas other than the bicycle seat show a blur effect.
In addition, as can be seen from FIG. 3 to FIG. 5, there is light showing through between the leaves and branches in the shooting scene. To improve the expressiveness of the image, this light can be blurred into light spots. Since the light spots in the second image are real light spots naturally generated when the photographed subject is out of focus, the light spots in FIG. 5 after image fusion are real and natural light spots, and the imaging quality is good.
It can be understood that, in the embodiments of the present application, the electronic device can provide an image with real blur, and in particular with light spots generated by real blur. Compared with the related art that uses algorithms to simulate light spots, since the light spots in this embodiment are directly collected by the second camera when the photographed subject is out of focus, the light spots in this embodiment are natural and real.
In addition, simulating light spots with an algorithm requires a large amount of time and computing power for light-spot rendering. By directly collecting the light spots with the second camera, this embodiment can therefore also save the rendering time needed to generate the light spots.
Please refer to FIG. 6, which is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application. The image processing apparatus can be applied to an electronic device that includes at least a first camera and a second camera. The image processing apparatus 300 may include: a first acquisition module 301, a second acquisition module 302, an image segmentation module 303, a third acquisition module 304, and an image fusion module 305.
The first acquisition module 301 is configured to use the first camera to acquire a first image in which the photographed subject is imaged clearly.
The second acquisition module 302 is configured to acquire depth information of the first image.
The image segmentation module 303 is configured to perform image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject.
The third acquisition module 304 is configured to use the second camera to acquire a second image in which the photographed subject is out of focus.
The image fusion module 305 is configured to perform image fusion processing on the subject image and the second image to obtain a target image.
In one implementation, there are lights in the shooting scenes corresponding to the first image and the second image.
In one implementation, the third acquisition module 304 can be configured to:
determine the position of the lens of the first camera when the first camera captures the first image as a first position;
detect the distance between the photographed subject and the first camera;
when the distance between the photographed subject and the first camera is less than a preset threshold, select a corresponding lens position from a plurality of lens positions as a second position of the lens of the first camera according to a preset first strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position;
drive the lens of the second camera to a third position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and capture a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
In one implementation, the third acquisition module 304 can further be configured to:
when the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, select a corresponding lens position from a plurality of lens positions as a fourth position of the lens of the first camera according to a preset second strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the fourth position is less than the distance between the lens and the image sensor when the lens is at the first position;
drive the lens of the second camera to a fifth position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and capture a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera.
In one implementation, the distance between the lens of the first camera and the image sensor when the lens is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at any other position.
In one implementation, the distance between the lens of the first camera and the image sensor when the lens is at the fourth position is less than the distance between the lens and the image sensor when the lens of the first camera is at any other position.
In one implementation, the image fusion module 305 can further be configured to:
calculate a defocus coefficient according to the first image and the second image;
adjust the scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image;
perform image fusion processing on the subject image and the scale-adjusted second image to obtain a target image.
In one implementation, the second acquisition module 302 can be configured to:
use the second camera to capture a third image;
acquire disparity information of the first image and the third image, and calculate the depth information of the first image according to the disparity information.
In one implementation, the second acquisition module 302 can be configured to:
acquire the first position of the lens of the first camera when the first camera captures the first image;
drive the lens of the second camera to a sixth position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and capture a third image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
In one implementation, the second camera is a TOF camera or a 3D structured light camera.
In that case, the second acquisition module 302 can be configured to: acquire the depth information of the first image according to the second camera.
In one implementation, the image fusion module 305 can be configured to: determine the area in the second image corresponding to the photographed subject as a target area, the image of the target area matching the subject image; replace the image of the target area in the second image with the subject image, and determine the second image after the image replacement as the target image.
In one implementation, the image fusion module 305 can be configured to: determine the area in the second image corresponding to the photographed subject as a target area, the image of the target area matching the subject image; perform image fusion processing on the image of the target area in the second image and the subject image, and determine the fused image as the target image.
An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed on a computer, the computer is caused to execute the flow in the image processing method provided by this embodiment.
An embodiment of the present application further provides an electronic device including a memory and a processor, where the processor executes the flow in the image processing method provided by this embodiment by invoking the computer program stored in the memory.
For example, the above electronic device may be a mobile terminal such as a tablet computer or a smartphone. Please refer to FIG. 7, which is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
The electronic device 400 may include components such as a camera module 401, a memory 402, and a processor 403. Those skilled in the art can understand that the structure of the electronic device shown in FIG. 7 does not constitute a limitation on the electronic device; it may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
The camera module 401 may include at least a first camera and a second camera.
The memory 402 can be used to store application programs and data. The application programs stored in the memory 402 contain executable code. The application programs can form various functional modules. The processor 403 executes various functional applications and data processing by running the application programs stored in the memory 402.
The processor 403 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and executes the various functions of the electronic device and processes data by running or executing the application programs stored in the memory 402 and invoking the data stored in the memory 402, thereby monitoring the electronic device as a whole.
In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402, thereby executing:
using the first camera to acquire a first image in which the photographed subject is imaged clearly;
acquiring depth information of the first image;
performing image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
using the second camera to acquire a second image in which the photographed subject is out of focus;
performing image fusion processing on the subject image and the second image to obtain a target image.
In other embodiments, in addition to the camera module, the memory, and the processor, the electronic device may also have components such as a touch display screen, a speaker, a microphone, and a battery.
The touch display screen can be used to display information such as images and text, and can also serve as an input/output unit. For example, the touch display screen can be used to receive input numbers, character information, or user characteristic information (such as fingerprints), and to generate optical or trackball signal inputs related to user settings and function control. The touch display screen can also be used to display information input by the user or provided to the user, as well as the various graphical user interfaces of the electronic device, which can be composed of graphics, text, icons, video, and any combination thereof.
The speaker can be used to play sound signals. The microphone can pick up sound signals from the surrounding environment. The battery can supply power to the various components of the entire electronic device.
An embodiment of the present invention further provides an electronic device. The electronic device includes an image processing circuit, which can be implemented by hardware and/or software components and can include various processing units defining an Image Signal Processing pipeline. The image processing circuit can include at least: a camera, an image signal processor (ISP processor), a control logic unit, an image memory, and a display. The camera can include at least one or more lenses and an image sensor.
The image sensor may include a color filter array (such as a Bayer filter). The image sensor can acquire the light intensity and wavelength information captured by each imaging pixel of the image sensor, and provide a set of raw image data that can be processed by the image signal processor.
The image signal processor can process the raw image data pixel by pixel in multiple formats. For example, each image pixel can have a bit depth of 8, 10, 12, or 14 bits; the image signal processor can perform one or more image processing operations on the raw image data and collect statistical information about the image data. The image processing operations can be performed with the same or different bit-depth precision. The raw image data can be stored in the image memory after being processed by the image signal processor. The image signal processor can also receive image data from the image memory.
The image memory can be part of a memory device, a storage device, or an independent dedicated memory within the electronic device, and can include a DMA (Direct Memory Access) feature.
When receiving image data from the image memory, the image signal processor can perform one or more image processing operations, such as temporal filtering. The processed image data can be sent to the image memory for additional processing before being displayed. The image signal processor can also receive processed data from the image memory and perform image data processing on that data in the raw domain as well as in the RGB and YCbCr color spaces. The processed image data can be output to the display for viewing by the user and/or further processed by a graphics engine or GPU (Graphics Processing Unit). In addition, the output of the image signal processor can also be sent to the image memory, and the display can read the image data from the image memory. In one implementation, the image memory can be configured to implement one or more frame buffers.
The statistical data determined by the image signal processor can be sent to the control logic unit. For example, the statistical data can include statistical information of the image sensor such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens shading correction.
The control logic unit can include a processor and/or a microcontroller that executes one or more routines (such as firmware). The one or more routines can determine the control parameters of the camera and the ISP control parameters according to the received statistical data. For example, the control parameters of the camera can include camera flash control parameters, lens control parameters (for example, focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters can include gain levels and color correction matrices for auto white balance and color adjustment (for example, during RGB processing), and the like.
Please refer to FIG. 8, which is a schematic structural diagram of the image processing circuit in this embodiment. As shown in FIG. 8, for ease of description, only the aspects of the image processing technology related to the embodiments of the present invention are shown.
The image processing circuit can include: a first camera 510, a second camera 520, a first image signal processor 530, a second image signal processor 540, a control logic unit 550, an image memory 560, and a display 570. The first camera 510 can include one or more first lenses 511 and a first image sensor 512. The second camera 520 can include one or more second lenses 521 and a second image sensor 522.
The first image collected by the first camera 510 is transmitted to the first image signal processor 530 for processing. After processing the first image, the first image signal processor 530 can send statistical data of the first image (such as the brightness of the image, the contrast value of the image, the color of the image, and so on) to the control logic unit 550. The control logic unit 550 can determine the control parameters of the first camera 510 according to the statistical data, so that the first camera 510 can perform operations such as auto focus and auto exposure according to the control parameters. The first image can be stored in the image memory 560 after being processed by the first image signal processor 530. The first image signal processor 530 can also read the image stored in the image memory 560 for processing. In addition, the first image can be directly sent to the display 570 for display after being processed by the first image signal processor 530. The display 570 can also read the image in the image memory 560 for display.
The second image collected by the second camera 520 is transmitted to the second image signal processor 540 for processing. After processing the second image, the second image signal processor 540 can send statistical data of the second image (such as the brightness of the image, the contrast value of the image, the color of the image, and so on) to the control logic unit 550. The control logic unit 550 can determine the control parameters of the second camera 520 according to the statistical data, so that the second camera 520 can perform operations such as auto focus and auto exposure according to the control parameters. The second image can be stored in the image memory 560 after being processed by the second image signal processor 540. The second image signal processor 540 can also read the image stored in the image memory 560 for processing. In addition, the second image can be directly sent to the display 570 for display after being processed by the second image signal processor 540. The display 570 can also read the image in the image memory 560 for display.
In other implementations, the first image signal processor and the second image signal processor can also be combined into a unified image signal processor that separately processes the data of the first image sensor and the second image sensor.
In addition, although not shown in the figure, the electronic device can also include a CPU and a power supply module. The CPU is connected to the logic controller, the first image signal processor, the second image signal processor, the image memory, and the display; the CPU is used to implement global control. The power supply module is used to supply power to the respective modules.
Generally, for a mobile phone with a dual camera module, both cameras work in certain photographing modes. In this case, the CPU controls the power supply module to supply power to the first camera and the second camera. The image sensor in the first camera is powered on, and the image sensor in the second camera is powered on, so that image collection and conversion can be realized. In certain photographing modes, only one camera of the dual camera module may work; for example, only the telephoto camera works. In this case, the CPU controls the power supply module to supply power to the image sensor of the corresponding camera.
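As a sketch, the mode-dependent power-up logic reads as follows; the mode flags and the power-supply API are hypothetical stand-ins for the CPU's control of the power supply module:

```python
def power_up_for_mode(power, mode, first_cam, second_cam):
    """Dual-capture modes power both image sensors; single-camera modes
    power only the sensor that the mode actually uses."""
    if mode.uses_both_cameras:            # e.g. the real-bokeh flow here
        power.enable(first_cam.image_sensor)
        power.enable(second_cam.image_sensor)
    elif mode.uses_telephoto_only:        # only the telephoto camera works
        power.enable(second_cam.image_sensor)  # assuming camera 2 is telephoto
    else:
        power.enable(first_cam.image_sensor)
```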
The following is the flow of implementing the image processing method provided by this embodiment using the image processing technology in FIG. 8:
using the first camera to acquire a first image in which the photographed subject is imaged clearly;
acquiring depth information of the first image;
performing image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
using the second camera to acquire a second image in which the photographed subject is out of focus;
performing image fusion processing on the subject image and the second image to obtain a target image.
In one implementation, there are lights in the shooting scenes corresponding to the first image and the second image.
In one implementation, the electronic device may further execute: determining the position of the lens of the first camera when the first camera captures the first image as a first position.
Then, when executing the acquiring of the second image by using the second camera, the electronic device may further execute: detecting the distance between the photographed subject and the first camera; when the distance between the photographed subject and the first camera is less than a preset threshold, selecting a corresponding lens position from a plurality of lens positions as a second position of the lens of the first camera according to a preset first strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the second position is greater than the distance between the lens and the image sensor when the lens is at the first position; driving the lens of the second camera to a third position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and capturing a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
In one implementation, when executing the acquiring of the second image by using the second camera, the electronic device may further execute: when the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, selecting a corresponding lens position from a plurality of lens positions as a fourth position of the lens of the first camera according to a preset second strategy, wherein the distance between the lens of the first camera and the image sensor when the lens is at the fourth position is less than the distance between the lens and the image sensor when the lens is at the first position; driving the lens of the second camera to a fifth position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and capturing a second image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera.
In one implementation, the distance between the lens of the first camera and the image sensor when the lens is at the second position is greater than the distance between the lens and the image sensor when the lens of the first camera is at any other position.
In one implementation, the distance between the lens of the first camera and the image sensor when the lens is at the fourth position is less than the distance between the lens and the image sensor when the lens of the first camera is at any other position.
In one implementation, the electronic device may further execute: calculating a defocus coefficient according to the first image and the second image; adjusting the scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image. The performing image fusion processing on the subject image and the second image to obtain the target image then includes: performing image fusion processing on the subject image and the scale-adjusted second image to obtain the target image.
In one implementation, when executing the acquiring of the depth information of the first image, the electronic device may execute: using the second camera to capture a third image; acquiring disparity information of the first image and the third image, and calculating the depth information of the first image according to the disparity information.
In one implementation, when executing the capturing of the third image by using the second camera, the electronic device may execute: acquiring the first position of the lens of the first camera when the first camera captures the first image; driving the lens of the second camera to a sixth position according to the mapping relationship between the lens position of the first camera and the lens position of the second camera, and capturing a third image, wherein, in the mapping relationship between the lens position of the first camera and the lens position of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
In one implementation, the second camera may be a TOF camera or a 3D structured light camera.
Then, when executing the acquiring of the depth information of the first image, the electronic device may execute: acquiring the depth information of the first image according to the second camera.
In one implementation, when executing the performing of image fusion processing on the subject image and the second image to obtain the target image, the electronic device may execute: determining the area in the second image corresponding to the photographed subject as a target area, the image of the target area matching the subject image; replacing the image of the target area in the second image with the subject image, and determining the second image after the image replacement as the target image.
In one implementation, when executing the performing of image fusion processing on the subject image and the second image to obtain the target image, the electronic device may execute: determining the area in the second image corresponding to the photographed subject as a target area, the image of the target area matching the subject image; performing image fusion processing on the image of the target area in the second image and the subject image, and determining the fused image as the target image.
In the above embodiments, the description of each embodiment has its own emphasis. For parts not detailed in a certain embodiment, reference may be made to the detailed description of the image processing method above, which will not be repeated here.
The image processing apparatus provided by the embodiments of the present application is based on the same concept as the image processing method in the above embodiments; any method provided in the embodiments of the image processing method can be run on the image processing apparatus, and for details of its specific implementation process, refer to the embodiments of the image processing method, which will not be repeated here.
It should be noted that, for the image processing method of the embodiments of the present application, those of ordinary skill in the art can understand that all or part of the flow of implementing the image processing method of the embodiments of the present application can be completed by controlling the relevant hardware through a computer program. The computer program can be stored in a computer-readable storage medium, for example stored in a memory, and executed by at least one processor; the execution process can include the flow of the embodiments of the image processing method. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), or the like.
For the image processing apparatus of the embodiments of the present application, its functional modules may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The above integrated modules can be implemented in the form of hardware or in the form of software functional modules. If an integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The image processing method and apparatus, storage medium, and electronic device provided by the embodiments of the present application have been introduced in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, for those skilled in the art, there will be changes in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (20)

  1. An image processing method, applied to an electronic device, wherein the electronic device includes at least a first camera and a second camera, the method comprising:
    using the first camera to acquire a first image in which a photographed subject is imaged clearly;
    acquiring depth information of the first image;
    performing image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
    using the second camera to acquire a second image in which the photographed subject is out of focus;
    performing image fusion processing on the subject image and the second image to obtain a target image.
  2. The image processing method according to claim 1, wherein there are lights in shooting scenes corresponding to the first image and the second image.
  3. The image processing method according to claim 1, wherein the method further comprises: determining a position of a lens of the first camera when the first camera captures the first image as a first position;
    the acquiring a second image by using the second camera comprises:
    detecting a distance between the photographed subject and the first camera;
    when the distance between the photographed subject and the first camera is less than a preset threshold, selecting a corresponding lens position from a plurality of lens positions as a second position of the lens of the first camera according to a preset first strategy, wherein a distance between the lens of the first camera and an image sensor when the lens of the first camera is at the second position is greater than a distance between the lens of the first camera and the image sensor when the lens of the first camera is at the first position;
    driving a lens of the second camera to a third position according to a mapping relationship between lens positions of the first camera and lens positions of the second camera, and capturing a second image, wherein, in the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
  4. The image processing method according to claim 3, wherein the acquiring a second image by using the second camera further comprises:
    when the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, selecting a corresponding lens position from the plurality of lens positions as a fourth position of the lens of the first camera according to a preset second strategy, wherein a distance between the lens of the first camera and the image sensor when the lens of the first camera is at the fourth position is less than the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the first position;
    driving the lens of the second camera to a fifth position according to the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, and capturing a second image, wherein, in the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera.
  5. The image processing method according to claim 3, wherein the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the second position is greater than a distance between the lens of the first camera and the image sensor when the lens of the first camera is at any other position.
  6. The image processing method according to claim 4, wherein the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the fourth position is less than a distance between the lens of the first camera and the image sensor when the lens of the first camera is at any other position.
  7. The image processing method according to claim 1, wherein the method further comprises:
    calculating a defocus coefficient according to the first image and the second image;
    adjusting a scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image;
    the performing image fusion processing on the subject image and the second image to obtain a target image comprising: performing image fusion processing on the subject image and the scale-adjusted second image to obtain the target image.
  8. The image processing method according to claim 1, wherein the acquiring depth information of the first image comprises:
    using the second camera to capture a third image;
    acquiring disparity information of the first image and the third image, and calculating the depth information of the first image according to the disparity information.
  9. The image processing method according to claim 8, wherein the using the second camera to capture a third image comprises:
    acquiring a first position of a lens of the first camera when the first camera captures the first image;
    driving a lens of the second camera to a sixth position according to a mapping relationship between lens positions of the first camera and lens positions of the second camera, and capturing a third image, wherein, in the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, the first position of the lens of the first camera corresponds to the sixth position of the lens of the second camera.
  10. The image processing method according to claim 1, wherein the second camera is a TOF camera or a 3D structured light camera;
    the acquiring depth information of the first image comprises: acquiring the depth information of the first image according to the second camera.
  11. The image processing method according to claim 1, wherein the performing image fusion processing on the subject image and the second image to obtain a target image comprises:
    determining an area in the second image corresponding to the photographed subject as a target area, an image of the target area matching the subject image;
    replacing the image of the target area in the second image with the subject image, and determining the second image after the image replacement as the target image.
  12. The image processing method according to claim 1, wherein the performing image fusion processing on the subject image and the second image to obtain a target image comprises:
    determining an area in the second image corresponding to the photographed subject as a target area, an image of the target area matching the subject image;
    performing image fusion processing on the image of the target area in the second image and the subject image, and determining the fused image as the target image.
  13. An image processing apparatus, applied to an electronic device, wherein the electronic device includes at least a first camera and a second camera, the apparatus comprising:
    a first acquisition module configured to use the first camera to acquire a first image in which a photographed subject is imaged clearly;
    a second acquisition module configured to acquire depth information of the first image;
    an image segmentation module configured to perform image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
    a third acquisition module configured to use the second camera to acquire a second image in which the photographed subject is out of focus;
    an image fusion module configured to perform image fusion processing on the subject image and the second image to obtain a target image.
  14. A computer-readable storage medium on which a computer program is stored, wherein, when the computer program is executed on a computer, the computer is caused to execute the method according to claim 1.
  15. An electronic device, comprising a memory, a processor, and at least a first camera and a second camera, wherein the processor is configured, by invoking a computer program stored in the memory, to execute:
    using the first camera to acquire a first image in which a photographed subject is imaged clearly;
    acquiring depth information of the first image;
    performing image segmentation on the first image according to the depth information to obtain a subject image, the subject image being an image area in the first image corresponding to the photographed subject;
    using the second camera to acquire a second image in which the photographed subject is out of focus;
    performing image fusion processing on the subject image and the second image to obtain a target image.
  16. The electronic device according to claim 15, wherein the processor is further configured to execute:
    determining a position of a lens of the first camera when the first camera captures the first image as a first position;
    detecting a distance between the photographed subject and the first camera;
    when the distance between the photographed subject and the first camera is less than a preset threshold, selecting a corresponding lens position from a plurality of lens positions as a second position of the lens of the first camera according to a preset first strategy, wherein a distance between the lens of the first camera and an image sensor when the lens of the first camera is at the second position is greater than a distance between the lens of the first camera and the image sensor when the lens of the first camera is at the first position;
    driving a lens of the second camera to a third position according to a mapping relationship between lens positions of the first camera and lens positions of the second camera, and capturing a second image, wherein, in the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, the second position of the lens of the first camera corresponds to the third position of the lens of the second camera.
  17. The electronic device according to claim 16, wherein the processor is further configured to execute:
    when the distance between the photographed subject and the first camera is greater than or equal to the preset threshold, selecting a corresponding lens position from the plurality of lens positions as a fourth position of the lens of the first camera according to a preset second strategy, wherein a distance between the lens of the first camera and the image sensor when the lens of the first camera is at the fourth position is less than the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the first position;
    driving the lens of the second camera to a fifth position according to the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, and capturing a second image, wherein, in the mapping relationship between the lens positions of the first camera and the lens positions of the second camera, the fourth position of the lens of the first camera corresponds to the fifth position of the lens of the second camera.
  18. The electronic device according to claim 16, wherein the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the second position is greater than a distance between the lens of the first camera and the image sensor when the lens of the first camera is at any other position.
  19. The electronic device according to claim 17, wherein the distance between the lens of the first camera and the image sensor when the lens of the first camera is at the fourth position is less than a distance between the lens of the first camera and the image sensor when the lens of the first camera is at any other position.
  20. The electronic device according to claim 15, wherein the processor is further configured to execute:
    calculating a defocus coefficient according to the first image and the second image;
    adjusting a scale of the second image according to the defocus coefficient to obtain a scale-adjusted second image;
    the performing image fusion processing on the subject image and the second image to obtain a target image comprising: performing image fusion processing on the subject image and the scale-adjusted second image to obtain the target image.
PCT/CN2020/072464 2020-01-16 2020-01-16 Image processing method and apparatus, storage medium, and electronic device WO2021142711A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080083918.0A CN114762313B (zh) 2020-01-16 2020-01-16 Image processing method and apparatus, storage medium, and electronic device
PCT/CN2020/072464 WO2021142711A1 (zh) 2020-01-16 2020-01-16 Image processing method and apparatus, storage medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/072464 WO2021142711A1 (zh) 2020-01-16 2020-01-16 Image processing method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2021142711A1 true WO2021142711A1 (zh) 2021-07-22

Family

ID=76863463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/072464 WO2021142711A1 (zh) 2020-01-16 2020-01-16 Image processing method and apparatus, storage medium, and electronic device

Country Status (2)

Country Link
CN (1) CN114762313B (zh)
WO (1) WO2021142711A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107018331A * 2017-04-19 2017-08-04 努比亚技术有限公司 Dual-camera-based imaging method and mobile terminal
CN107707809A * 2017-08-17 2018-02-16 捷开通讯(深圳)有限公司 Image blurring method, mobile device, and storage apparatus
CN108600643A * 2018-07-13 2018-09-28 广州三星通信技术研究有限公司 Image shooting method and apparatus
KR20190074455A * 2017-12-20 2019-06-28 (주)이더블유비엠 Method and apparatus for refocusing a planar image, and program stored in a recording medium
CN110505406A * 2019-08-26 2019-11-26 宇龙计算机通信科技(深圳)有限公司 Background blurring method and apparatus, storage medium, and terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006140594A * 2004-11-10 2006-06-01 Pentax Corp Digital camera
CN105847664B * 2015-07-31 2019-01-29 维沃移动通信有限公司 Method and apparatus for taking photographs with a mobile terminal
CN106791416A * 2016-12-29 2017-05-31 努比亚技术有限公司 Background-blurred shooting method and terminal
CN108040207A * 2017-12-18 2018-05-15 信利光电股份有限公司 Image processing method, apparatus and device, and computer-readable storage medium
CN109151329A * 2018-11-22 2019-01-04 Oppo广东移动通信有限公司 Photographing method and apparatus, terminal, and computer-readable storage medium


Also Published As

Publication number Publication date
CN114762313B (zh) 2024-03-01
CN114762313A (zh) 2022-07-15

Similar Documents

Publication Publication Date Title
US10997696B2 (en) Image processing method, apparatus and device
US11948282B2 (en) Image processing apparatus, image processing method, and storage medium for lighting processing on image using model data
JP7145208B2 Method and apparatus for dual-camera-based imaging, and storage medium
CN108322646B Image processing method and apparatus, storage medium, and electronic device
EP3609177A1 (en) Control method, control apparatus, imaging device, and electronic device
CN111028190A Image processing method and apparatus, storage medium, and electronic device
WO2019109805A1 Image processing method and apparatus
CN111246093B Image processing method and apparatus, storage medium, and electronic device
KR102266649B1 Image processing method and apparatus
CN111246092B Image processing method and apparatus, storage medium, and electronic device
WO2019105297A1 Image blurring processing method and apparatus, mobile device, and storage medium
CN110930301B Image processing method and apparatus, storage medium, and electronic device
WO2019105298A1 Image blurring processing method and apparatus, mobile device, and storage medium
CN108156369B Image processing method and apparatus
CN107872631B Dual-camera-based image shooting method and apparatus, and mobile terminal
CN110717871A Image processing method and apparatus, storage medium, and electronic device
CN112261292B Image acquisition method, terminal, chip, and storage medium
US20220329729A1 (en) Photographing method, storage medium and electronic device
JP2010072619A Exposure calculation device and camera
CN113298735A Image processing method and apparatus, electronic device, and storage medium
JP2014179920A Imaging apparatus, control method therefor, program, and storage medium
CN106878606B Image generation method based on an electronic device, and electronic device
CN106878604B Method for image generation based on an electronic device, and electronic device
CN111212231B Image processing method and apparatus, storage medium, and electronic device
WO2021142711A1 Image processing method and apparatus, storage medium, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20913179

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20913179

Country of ref document: EP

Kind code of ref document: A1